Commit graph

146 commits (dda99fc09fb0b5523948f6d481c6c051c1c7b5de)

Author SHA1 Message Date
Simon Willison 35d6ee2790
Apply black to everything, enforce via unit tests (#449)
I've run the black code formatting tool against everything:

    black tests datasette setup.py

I also added a new unit test, in tests/test_black.py, which will fail if the code does not
conform to black's exacting standards.

This unit test only runs on Python 3.6 or higher, because black itself doesn't run on 3.5.
2019-05-03 22:15:14 -04:00
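
A minimal sketch of how such a conformance test could work (illustrative only; the real tests/test_black.py may be implemented differently, and invoking black via subprocess is an assumption):

    # Hypothetical sketch, not the actual test from the repository
    import subprocess
    import sys
    from pathlib import Path

    import pytest

    code_root = Path(__file__).parent.parent


    @pytest.mark.skipif(
        sys.version_info[:2] < (3, 6), reason="black requires Python 3.6 or higher"
    )
    def test_code_is_black_formatted():
        # black --check exits non-zero if any file would be reformatted
        result = subprocess.run(
            [sys.executable, "-m", "black", "--check", "tests", "datasette", "setup.py"],
            cwd=str(code_root),
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
        )
        assert result.returncode == 0, result.stdout.decode() + result.stderr.decode()
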
Russ Garrett bf229c9bd8 Pass view_name to extra_body_script hook (#443)
At the moment it's not easy to tell whether the hook is being called
in (for example) the row or table view, as in both cases the
`database` and `table` parameters are provided.

This passes the `view_name` added in #441 to the `extra_body_script`
hook.
2019-05-03 09:12:19 -04:00
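
For example, a plugin could now vary its injected JavaScript by view. A hypothetical sketch (the exact set of parameters the hook accepts is documented in the plugin docs; the script itself is made up):

    from datasette import hookimpl


    @hookimpl
    def extra_body_script(view_name, database, table):
        # Only inject this script on the table view, not on row or database views
        if view_name == "table":
            return "console.log('table view: %s/%s');" % (database, table)
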
Russ Garrett cf406c0754 New plugin hook: register_output_renderer hook (#441)
Thanks @russss!

* Add register_output_renderer hook

This changeset refactors out the JSON renderer and then adds a hook and
dispatcher system to allow custom output renderers to be registered.

The CSV output renderer is untouched because supporting streaming
renderers through this system would be significantly more complex, and
probably not worthwhile.

We can't simply allow hooks to be called at request time because we need
a list of supported file extensions when the request is being routed in
order to resolve ambiguous database/table names. So, renderers need to
be registered at startup.

I've tried to make this API independent of Sanic's request/response
objects so that this can remain stable during the switch to ASGI. I'm
using dictionaries to keep it simple and to make adding additional
options in the future easy.

Fixes #440
2019-05-01 16:01:56 -07:00
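
A sketch of what registering a renderer through this hook could look like, based on the dictionary API described above (the callback signature and the data keys used here are assumptions, not a definitive example):

    from datasette import hookimpl


    def render_txt(args, data, view_name):
        # Return the response as a dictionary of parts, as described above
        body = "\n".join(
            " | ".join(str(value) for value in row) for row in data["rows"]
        )
        return {"body": body, "content_type": "text/plain; charset=utf-8", "status_code": 200}


    @hookimpl
    def register_output_renderer(datasette):
        # Registered at startup so the .txt extension can be resolved during routing
        return {"extension": "txt", "callback": render_txt}
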
Simon Willison d1075b8259 Cleaned up pylint warnings 2019-04-13 12:20:10 -07:00
Simon Willison 13ee3c222f Moved BaseView.absolute_url() to Datasette 2019-04-13 12:16:05 -07:00
Simon Willison 53bf875483 expand_foreign_keys() no longer uses inspect, refs #420 2019-04-06 19:56:07 -07:00
Simon Willison 29a3896fe1 .database_url(database) no longer needs inspect, refs #420 2019-03-31 16:55:38 -07:00
Simon Willison 7d0f668556 .resolve_db_name() and .execute() work without inspect
Refs #420
2019-03-31 16:51:52 -07:00
Simon Willison 0209a0a344 table_exists() now uses async SQL, refs #420 2019-03-31 11:02:22 -07:00
Simon Willison 6f6d0ff2b4
URL hashing is now off by default - closes #418
Prior to this commit Datasette would calculate the content hash of every
database and redirect to a URL containing that hash, like so:

    https://v0-27.datasette.io/fixtures => https://v0-27.datasette.io/fixtures-dd88475

This assumed that all databases were opened in immutable mode and were not
expected to change.

This will be changing as a result of #419 - so this commit takes the first step
in implementing that change by changing this default behaviour. Datasette will
now only redirect hash-free URLs to their hashed equivalents under two circumstances:

* The new `hash_urls` config option is set to true (it defaults to false).
* The user passes `?_hash=1` in the URL
2019-03-17 15:55:04 -07:00
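
To opt back in to the old behaviour, the option can presumably be set like any other config value, e.g. (database name is illustrative):

    datasette serve fixtures.db --config hash_urls:1
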
Simon Willison afe9aa3ae0 show/hide link for SQL on custom query page
Closes #415
2019-03-14 22:22:35 -07:00
Simon Willison 4462a5ab28 Show size of database file next to download link, closes #172 2019-02-05 20:58:29 -08:00
Simon Willison b5dd83981a Export option: _shape=array&_nl=on for newline-delimited JSON 2019-01-27 17:40:23 -08:00
Simon Willison 996e8822d2 Fix CSV export hidden form fields, closes #393 2019-01-02 18:43:56 -08:00
Simon Willison b7c6a9f9bd
extra_css_urls(template, database, table, datasette)
The extra_css_urls and extra_js_urls hooks now take additional optional
parameters.

Also refactored them out of the Datasette class and into RenderMixin.

Plus improved plugin documentation to explicitly list parameters.
2018-08-28 11:56:57 +01:00
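
A hypothetical plugin using the new parameters (template name and stylesheet URL are illustrative):

    from datasette import hookimpl


    @hookimpl
    def extra_css_urls(template, database, table, datasette):
        # Only load the extra stylesheet on table pages
        if template == "table.html":
            return ["https://example.com/extra-table.css"]
        return []
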
Simon Willison fbf446965b
Refactoring: renamed "name" variable to "database" 2018-08-28 11:25:13 +01:00
Simon Willison 2e836f72d9
render_cell(value, column, table, database, datasette)
The render_cell plugin hook previously was only passed value.

It is now passed (value, column, table, database, datasette).
2018-08-28 03:03:01 -07:00
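
A sketch of a plugin using the extra arguments (the column name and markup are invented; returning None falls back to the default rendering):

    import jinja2
    from datasette import hookimpl


    @hookimpl
    def render_cell(value, column, table, database):
        # Hypothetical example: bold every value in a column called "status"
        if column == "status" and value is not None:
            return jinja2.Markup(
                "<strong>{}</strong>".format(jinja2.escape(str(value)))
            )
        return None  # None means "use the default rendering"
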
Simon Willison 5cf0c6c91c
New plugin hook: extra_body_script 2018-08-28 02:02:49 -07:00
Simon Willison 1905c03364
New ds.metadata() method 2018-08-28 00:45:39 -07:00
Simon Willison aae49fef3b
Import pysqlite3 if available, closes #360 (#361) 2018-08-15 17:58:56 -07:00
Simon Willison 2189be1440
Refactor to use new datasette.config(key) method 2018-08-11 13:09:07 -07:00
Simon Willison 4ac9132240
render_cell(value) plugin hook, closes #352
New plugin hook for customizing the way cell values are rendered in HTML.

The first full example of this hook in use is https://github.com/simonw/datasette-json-html
2018-08-04 17:14:56 -07:00
Simon Willison 581b4c97ee
URLify URLs in custom SQL queries, closes #298 2018-07-23 20:56:32 -07:00
Simon Willison 700d83d8ad
?_json_infinity=1 for handling Infinity/-Infinity - fixes #332 2018-07-23 20:07:57 -07:00
Simon Willison f24b49a1a8
New force_https_urls option, refs #333 2018-07-23 08:58:29 -07:00
Simon Willison 6e37f091ed
Support title/description for canned queries, closes #342
Demo here: https://latest.datasette.io/fixtures/neighborhood_search
2018-07-15 19:33:30 -07:00
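
The title and description sit alongside the SQL for the canned query in metadata.json, roughly like this (the SQL and wording are illustrative, not necessarily what the demo uses):

    {
        "databases": {
            "fixtures": {
                "queries": {
                    "neighborhood_search": {
                        "sql": "select * from facetable where neighborhood like '%' || :text || '%'",
                        "title": "Search neighborhoods",
                        "description": "Search neighborhoods by partial name match"
                    }
                }
            }
        }
    }
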
Simon Willison 6541ce633e
Fix for row pages for tables with / in, closes #325 2018-07-07 22:21:51 -07:00
Simon Willison 71b46fd9f5
Cleaned up view constructors to accept just a datasette instance 2018-07-07 19:58:11 -07:00
Simon Willison 64c2fea8df
CSV export now respects --cors, fixes #326 2018-06-23 17:59:37 -07:00
Simon Willison 54f805dca3
Advanced export box now obeys allow_csv_stream config - refs #266 2018-06-18 08:11:11 -07:00
Simon Willison 83f4ef7ec7
Improved UI for CSV/JSON export, closes #266 2018-06-17 23:05:18 -07:00
Simon Willison fc3660cfad
Streaming mode for downloading all rows as a CSV (#315)
* table.csv?_stream=1 to download all rows - refs #266

This option causes Datasette to serve ALL rows in the table, by internally
following the _next= pagination links and serving everything out as a stream.

Also added new config option, allow_csv_stream, which can be used to disable
this feature.

* New config option max_csv_mb limiting size of CSV export
2018-06-17 20:21:02 -07:00
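
Both options should be settable via --config when starting the server, along these lines (the values and the exact boolean syntax are assumptions):

    datasette serve data.db --config allow_csv_stream:0 --config max_csv_mb:50
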
Simon Willison 0d7ba1ba67
Default to _labels=on on JSON/CSV links with foreign keys, refs #266 2018-06-17 15:56:55 -07:00
Simon Willison ed631e690b
?_labels= and ?_label=COL to expand foreign keys in JSON/CSV
These new querystring arguments can be used to request expanded foreign keys
in both JSON and CSV formats.

?_labels=on turns on expansions for ALL foreign key columns

?_label=COLUMN1&_label=COLUMN2 can be used to pick specific columns to expand

e.g. `Street_Tree_List.json?_label=qSpecies&_label=qLegalStatus`

    {
        "rowid": 233,
        "TreeID": 121240,
        "qLegalStatus": {
            "value": 2,
            "label": "Private"
        },
        "qSpecies": {
            "value": 16,
            "label": "Sycamore"
        },
        "qAddress": "91 Commonwealth Ave",
        ...
    }

The labels option also works for the HTML and CSV views.

HTML defaults to `?_labels=on`, so if you pass `?_labels=off` you can disable
foreign key expansion entirely - or you can use `?_label=COLUMN` to request
just specific columns.

If you expand labels on CSV you get additional columns in the output:

`/Street_Tree_List.csv?_label=qLegalStatus`

    rowid,TreeID,qLegalStatus,qLegalStatus_label...
    1,141565,1,Permitted Site...
    2,232565,2,Undocumented...

I also refactored the existing foreign key expansion code.

Closes #233. Refs #266.
2018-06-16 15:18:57 -07:00
Simon Willison 3a79ad98ea
Basic CSV export, refs #266
Tables and custom SQL query results can now be exported as CSV.

The easiest way to do this is to use the .csv extension, e.g.

	/test_tables/facet_cities.csv

By default this is served as Content-Type: text/plain so you can see it in
your browser. If you want to download the file (using text/csv and with an
appropriate Content-Disposition: attachment header) you can do so like this:

	/test_tables/facet_cities.csv?_dl=1

We link to the CSV and downloadable CSV URLs from the table and query pages.

The links use ?_size=max and so by default will return 1,000 rows.

Also fixes #303 - table names ending in .json or .csv are now detected and
URLs are generated that look like this instead:

	/test_tables/table%2Fwith%2Fslashes.csv?_format=csv

The ?_format= option is available for everything else too, but we link to the
.csv / .json versions in most cases because they are aesthetically pleasing.
2018-06-14 23:51:23 -07:00
Simon Willison b0a95da963
Show more useful error message for SQL interrupted, closes #142 2018-05-28 14:24:19 -07:00
Simon Willison 76d11eb768
New ?_json=colname argument for returning unescaped JSON
Also extracted docs for special JSON arguments into a new section.

Closes #31
2018-05-28 11:08:39 -07:00
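
Example (table and column names are illustrative): a column containing JSON strings can be returned as nested JSON rather than as an escaped string:

    /fixtures/facetable.json?_json=tags
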
Simon Willison 276913b748
?_shape=arrayfirst, closes #287 2018-05-26 17:32:15 -07:00
Simon Willison b463f60158
?_ttl= parameter and default_cache_ttl config
Refs #285, Closes #289
2018-05-26 15:17:33 -07:00
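
For example (values are illustrative): ?_ttl= overrides the cache TTL for a single request, while default_cache_ttl sets the server-wide default:

    /fixtures/facetable.json?_ttl=0
    datasette serve fixtures.db --config default_cache_ttl:3600
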
Simon Willison f98e62fe5a
Fix for 500 error on /db?sql=x 2018-05-25 15:08:57 -07:00
Simon Willison 81df47e8d9
Moved .execute() method from BaseView to Datasette class
Also introduced new Results() class with results.truncated, results.description, results.rows
2018-05-24 17:15:53 -07:00
Russ Garrett 58b5a37dbb Refactor inspect logic 2018-05-22 07:03:06 -07:00
Simon Willison 08f4b7658f
Show facets that timed out using new InterruptedError
If the user requests some _facet= options that do not successfully execute in
the configured facet_time_limit_ms, we now show a warning message like this:

    These facets timed out: rowid, Title

To build this I had to clean up our SQLite interrupted logic. We now raise a
custom InterruptedError exception when SQLite terminates due to exceeding a
time limit.

In implementing this I found and fixed a logic error where invalid SQL was
being generated in some cases for our faceting calculations but the resulting
sqlite3.OperationalError had been incorrectly captured and treated as a
timeout.

Refs #255
Closes #269
2018-05-17 23:11:23 -07:00
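
A simplified sketch of how that interrupt-to-exception translation can work with Python's sqlite3 module (illustrative only, not the exact Datasette implementation):

    import sqlite3
    import time


    class InterruptedError(Exception):
        """Raised when a query is cut off for exceeding its time limit."""


    def execute_with_time_limit(conn, sql, time_limit_ms):
        # A progress handler that returns a truthy value aborts the running query,
        # which surfaces as sqlite3.OperationalError("interrupted")
        deadline = time.monotonic() + time_limit_ms / 1000.0
        conn.set_progress_handler(lambda: time.monotonic() > deadline, 1000)
        try:
            return conn.execute(sql).fetchall()
        except sqlite3.OperationalError as e:
            if "interrupt" in str(e).lower():
                raise InterruptedError(sql)
            raise
        finally:
            conn.set_progress_handler(None, 1000)
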
Simon Willison cf1fe693e5 Used isort to re-order my imports 2018-05-14 00:04:23 -03:00
Simon Willison 3686385551 Ran black source formatting tool against new views/ and app.py 2018-05-14 00:04:23 -03:00
Simon Willison 1f69269fe9 Refactored views into new views/ modules, refs #256 2018-05-14 00:04:23 -03:00