Docs for CSV export, refs #266

pull/375/head
Simon Willison 2018-06-18 07:12:21 -07:00
parent 398d29c287
commit bb4a9fbf36
No known key found for this signature in database
GPG key ID: 17E2DEA2588B7F52
6 changed files with 78 additions and 6 deletions

Binary file not shown.

Size: 7.7 KiB

View file

@@ -35,6 +35,8 @@ You can optionally set a lower time limit for an individual query using the ``?_

This would set the time limit to 100ms for that specific query. This feature is useful if you are working with databases of unknown size and complexity - a query that might make perfect sense for a smaller table could take too long to execute on a table with millions of rows. By setting custom time limits you can execute queries "optimistically" - e.g. give me an exact count of rows matching this query but only if it takes less than 100ms to calculate.

+.. _config_max_returned_rows:
+
max_returned_rows
-----------------
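To make the time-limit behaviour above concrete, here is a minimal sketch in
Python that runs a query against the demo instance with a 100ms cap. It
assumes the querystring parameter truncated in the hunk header above is
``?_timelimit=``, measured in milliseconds::

    import urllib.error
    import urllib.parse
    import urllib.request

    # Ask for an exact count, but only if it can be computed within 100ms.
    params = urllib.parse.urlencode({
        "sql": "select count(*) from facetable",
        "_timelimit": "100",  # milliseconds; parameter name is an assumption
    })
    url = "https://latest.datasette.io/fixtures.json?" + params
    try:
        with urllib.request.urlopen(url) as response:
            print(response.read().decode("utf-8"))
    except urllib.error.HTTPError as err:
        # The query took longer than the cap - treat the count as unknown.
        print("query interrupted:", err.code)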
@@ -126,23 +128,27 @@ Sets the amount of memory SQLite uses for its `per-connection cache <https://www

    datasette mydatabase.db --config cache_size_kb:5000

+.. _config_allow_csv_stream:
+
allow_csv_stream
----------------

-Enables the feature where an entire table (potentially hundreds of thousands of
-rows) can be exported as a single CSV file. This is turned on by default - you
-can turn it off like this::
+Enables :ref:`the CSV export feature <csv_export>` where an entire table
+(potentially hundreds of thousands of rows) can be exported as a single CSV
+file. This is turned on by default - you can turn it off like this:

::

    datasette mydatabase.db --config allow_csv_stream:off

+.. _config_max_csv_mb:
+
max_csv_mb
----------

The maximum size of CSV that can be exported, in megabytes. Defaults to 100MB.
-You can disable the limit entirely by setting this to 0::
+You can disable the limit entirely by setting this to 0:
+
+::

    datasette mydatabase.db --config max_csv_mb:0

View file

@@ -0,0 +1,63 @@

.. _csv_export:

CSV Export
==========

Any Datasette table, view or custom SQL query can be exported as CSV.

To obtain the CSV representation of the table you are looking at, click the
"this data as CSV" link.

You can also use the advanced export form for more control over the resulting
file. The form looks like this and has the following options:

.. image:: advanced_export.png

* **download file** - instead of displaying CSV in your browser, this forces
  your browser to download the CSV to your downloads directory.
* **expand labels** - if your table has any foreign key references this option
  will cause the CSV to gain additional ``COLUMN_NAME_label`` columns with a
  label for each foreign key derived from the linked table. `In this example
  <https://latest.datasette.io/fixtures/facetable.csv?_labels=on&_size=max>`_
  the ``city_id`` column is accompanied by a ``city_id_label`` column.
* **stream all records** - by default CSV files only contain the first
  :ref:`config_max_returned_rows` records. This option will cause Datasette to
  loop through every matching record and return them as a single CSV file.

You can try that out on https://latest.datasette.io/fixtures/facetable?_size=4
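As a quick sketch of consuming these options programmatically, the following
Python uses only the standard library and the example URL from the *expand
labels* bullet above::

    import csv
    import io
    import urllib.request

    # Fetch the example export with foreign key labels expanded.
    url = "https://latest.datasette.io/fixtures/facetable.csv?_labels=on&_size=max"
    with urllib.request.urlopen(url) as response:
        text = response.read().decode("utf-8")

    # Each foreign key column is accompanied by a COLUMN_NAME_label column.
    for row in csv.DictReader(io.StringIO(text)):
        print(row["city_id"], row["city_id_label"])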
Streaming all records
---------------------

The *stream all records* option is designed to be as efficient as possible -
under the hood it takes advantage of Python 3 asyncio capabilities and
Datasette's efficient :ref:`pagination <pagination>` to stream back the full
CSV file.

Since databases can get pretty large, by default this option is capped at 100MB -
if a table returns more than 100MB of data the last line of the CSV will be a
truncation error message.

You can increase or remove this limit using the :ref:`config_max_csv_mb` config
setting. You can also disable the CSV export feature entirely using
:ref:`config_allow_csv_stream`.
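Because a streamed export can be larger than you want to hold in memory, it is
worth reading the response incrementally. A minimal sketch in Python - note
that the ``_stream=on`` parameter is an assumption here, since this diff only
describes the form checkbox::

    import urllib.request

    url = "https://latest.datasette.io/fixtures/facetable.csv?_stream=on"
    last_line = ""
    with urllib.request.urlopen(url) as response:
        for raw_line in response:  # iterate line by line, not all at once
            last_line = raw_line.decode("utf-8").rstrip("\r\n")
            # ...process each CSV row incrementally here...

    # If the export hit the max_csv_mb cap, the last line is a truncation
    # error message rather than a data row - check it before trusting the file.
    print(last_line)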
A note on URLs
--------------

The default URL for the CSV representation of a table is that table with
``.csv`` appended to it:

* https://latest.datasette.io/fixtures/facetable - HTML interface
* https://latest.datasette.io/fixtures/facetable.csv - CSV export
* https://latest.datasette.io/fixtures/facetable.json - JSON API

This pattern doesn't work for tables with names that already end in ``.csv`` or
``.json``. For those tables, you can instead use the ``_format=`` querystring
parameter:

* https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv - HTML interface
* https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=csv - CSV export
* https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=json - JSON API
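A small helper makes those URL rules concrete. This is a sketch - the
``export_url`` function is a hypothetical name, not part of Datasette::

    from urllib.parse import quote

    def export_url(base, table, fmt):
        """Build an export URL, using ?_format= for table names that
        already end in .csv or .json."""
        encoded = quote(table, safe="")  # %-encode slashes in table names
        if table.endswith((".csv", ".json")):
            return f"{base}/{encoded}?_format={fmt}"
        return f"{base}/{encoded}.{fmt}"

    print(export_url("https://latest.datasette.io/fixtures", "facetable", "csv"))
    # -> https://latest.datasette.io/fixtures/facetable.csv
    print(export_url("https://latest.datasette.io/fixtures",
                     "table/with/slashes.csv", "csv"))
    # -> https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=csv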

View file

@@ -19,6 +19,7 @@ Contents

   getting_started
   json_api
   sql_queries
+  csv_export
   facets
   full_text_search
   spatialite

View file

@ -1,5 +1,5 @@
The Datasette JSON API JSON API
====================== ========
Datasette provides a JSON API for your SQLite databases. Anything you can do Datasette provides a JSON API for your SQLite databases. Anything you can do
through the Datasette user interface can also be accessed as JSON via the API. through the Datasette user interface can also be accessed as JSON via the API.

View file

@@ -95,6 +95,8 @@ will then be able to enter them using the form fields on the canned query page

or by adding them to the URL. This means canned queries can be used to create
custom JSON APIs based on a carefully designed SQL statement.

+.. _pagination:
+
Pagination
----------
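For context on the canned-queries paragraph in this hunk, here is a sketch of
calling such a query as a JSON API. The database name, query name and
parameter are all hypothetical::

    import urllib.parse
    import urllib.request

    # A canned query "neighborhood_search" taking a "text" parameter, as it
    # might be defined in metadata.json (hypothetical names).
    params = urllib.parse.urlencode({"text": "park"})
    url = "https://example.com/mydatabase/neighborhood_search.json?" + params
    with urllib.request.urlopen(url) as response:
        print(response.read().decode("utf-8"))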