The extra_css_urls and extra_js_urls hooks now take additional optional
parameters.
Also refactored them out of the Datasette class and into RenderMixin.
Plus improved plugin documentation to explicitly list parameters.
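As a rough sketch, a plugin using these hooks might now look like this (the
parameter names are an assumption based on the plugin documentation; pluggy
lets an implementation declare only the arguments it needs):

from datasette import hookimpl

@hookimpl
def extra_css_urls(template, database, table):
    # Hypothetical example: only add the stylesheet on table pages
    if table:
        return ["https://example.com/table-extras.css"]
    return []

@hookimpl
def extra_js_urls():
    return ["https://example.com/extras.js"]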
* table.csv?_stream=1 to download all rows - refs #266
This option causes Datasette to serve ALL rows in the table, by internally
following the _next= pagination links and serving everything out as a stream.
Also added new config option, allow_csv_stream, which can be used to disable
this feature.
* New config option max_csv_mb limiting size of CSV export
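For example (a sketch: both option names come from the entries above, and the
on/off value syntax follows the --config convention shown later in these notes):

datasette serve mydb.db --config allow_csv_stream:off --config max_csv_mb:10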
The fixtures database created by our unit tests makes for a good "live" demo
of Datasette in action.
I've improved the metadata it ships with to better support this use case.
I've also improved the mechanism for writing out fixtures: you can do this:
python tests/fixtures.py fixtures.db
To get just the fixtures database written out... or you can do this:
python tests/fixtures.py fixtures.db fixtures.json
To get metadata which you can then serve like so:
datasette fixtures.db -m fixtures.json
Refs #313
https://github.com/pytest-dev/pytest/issues/1875 made it impossible to declare
a function as a fixture multiple times, which we were doing across different
modules. The fix was to move our @pytest.fixture decorators into the
tests/fixtures.py module, so each shared fixture is declared exactly once.
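A minimal sketch of the resulting pattern (the app_client name and session
scope are illustrative):

# tests/fixtures.py - @pytest.fixture is applied exactly once, here
import pytest

@pytest.fixture(scope="session")
def app_client():
    # ... build and return the shared test client (placeholder) ...
    return object()

# tests/test_api.py - import the already-decorated fixture
from .fixtures import app_client  # noqa: F401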
Removed the --page_size= argument to datasette serve in favour of:
datasette serve --config default_page_size:50 mydb.db
Added new help section:
$ datasette --help-config
Config options:
  default_page_size            Default page size for the table view
                               (default=100)
  max_returned_rows            Maximum rows that can be returned from a table
                               or custom query (default=1000)
  sql_time_limit_ms            Time limit for a SQL query in milliseconds
                               (default=1000)
  default_facet_size           Number of values to return for requested facets
                               (default=30)
  facet_time_limit_ms          Time limit for calculating a requested facet
                               (default=200)
  facet_suggest_time_limit_ms  Time limit for calculating a suggested facet
                               (default=50)
Replaced the --max_returned_rows and --sql_time_limit_ms options to
"datasette serve" with a new --limit option, which supports a larger
list of limits.
Example usage:
datasette serve --limit max_returned_rows:1000 \
--limit sql_time_limit_ms:2500 \
--limit default_facet_size:50 \
--limit facet_time_limit_ms:1000 \
--limit facet_suggest_time_limit_ms:500
New docs: https://datasette.readthedocs.io/en/latest/limits.html
Closes #270, closes #264
Every now and then a test fails in Travis CI on Python 3.5 because it hits
the default 20ms SQL time limit.
Test fixtures now default to a 200ms time limit, and we only use the 20ms time
limit for the specific test that tests query interruption. This should make
our tests on Python 3.5 in Travis much more stable.
* New --plugins-dir=plugins/ option
This option causes Datasette to load and evaluate all of the Python files in
the specified directory and register any plugins that are defined in those
files.
This new option is available for the following commands:
datasette serve mydb.db --plugins-dir=plugins/
datasette publish now/heroku mydb.db --plugins-dir=plugins/
datasette package mydb.db --plugins-dir=plugins/
* Unit tests for --plugins-dir=plugins/
Closes #211
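For example, a file dropped into plugins/ can register a custom SQL function
using the prepare_connection plugin hook (this mirrors the example in the
plugin documentation):

# plugins/hello_world.py
from datasette import hookimpl

@hookimpl
def prepare_connection(conn):
    # Makes "select hello_world()" available in SQL queries
    conn.create_function("hello_world", 0, lambda: "Hello world!")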
You can now explicitly set which columns in a table can be used for sorting
with the _sort and _sort_desc arguments, using metadata.json:
{
    "databases": {
        "database1": {
            "tables": {
                "example_table": {
                    "sortable_columns": [
                        "height",
                        "weight"
                    ]
                }
            }
        }
    }
}
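With this in place, example_table accepts ?_sort=height and ?_sort_desc=weight,
while requests to sort by any column not in the list are rejected.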
Refs #189
If you set the source_url/license_url/source/license fields in your root
metadata, those values will now be inherited all the way down to the database
and table templates.
The title/description are NOT inherited.
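For example, a root-level block like this (values illustrative) will be
displayed on every database and table page:

{
    "source": "Example source",
    "source_url": "https://example.com/",
    "license": "CC Attribution 4.0 International",
    "license_url": "https://creativecommons.org/licenses/by/4.0/"
}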
Also added unit tests for the HTML generated by the metadata.
Refs #185