Compare commits

...

468 commits
0.64.6...main

Author SHA1 Message Date
Simon Willison 8f9509f00c
datasette, not self.ds, in internals documentation 2024-04-22 16:01:37 -07:00
Simon Willison 7d6d471dc5 Include actor in track_event async example, refs #2319 2024-04-11 18:53:07 -07:00
Simon Willison 2a08ffed5c
Async example for track_event hook
Closes #2319
2024-04-11 18:47:01 -07:00
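A minimal sketch of the async pattern these two commits document, assuming an `events` database exists and that event objects expose `.name`, `.actor` and `.properties()`:

```python
import json

from datasette import hookimpl


@hookimpl
def track_event(datasette, event):
    # Returning an inner async function lets the hook await database writes
    async def inner():
        db = datasette.get_database("events")  # hypothetical events database
        await db.execute_write(
            "insert into events (name, actor, properties) values (?, ?, ?)",
            (event.name, json.dumps(event.actor), json.dumps(event.properties())),
        )

    return inner
```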
Simon Willison 63714cb2b7 Fixed some typos spotted by Gemini Pro 1.5, closes #2318 2024-04-10 17:05:15 -07:00
Simon Willison d32176c5b8
Typo fix triggera -> triggers 2024-04-10 16:50:09 -07:00
Simon Willison 19b6a37336 z-index: 10000 on dropdown menu, closes #2311 2024-03-21 10:15:57 -07:00
Simon Willison 1edb24f124 Docs for 100 max rows in an insert, closes #2310 2024-03-19 09:15:39 -07:00
Simon Willison da68662767
datasette-enrichments is example of row_actions
Refs:
- https://github.com/simonw/datasette/issues/2299
- https://github.com/datasette/datasette-enrichments/issues/41
2024-03-17 14:40:47 -07:00
Agustin Bacigalup 67e66f36c1
Add ETag header for static responses (#2306)
* add etag to static responses

* fix RuntimeError related to static headers

* Remove unnecessary import

---------

Co-authored-by: Simon Willison <swillison@gmail.com>
2024-03-17 12:18:40 -07:00
Simon Willison 261fc8d875 Fix datetime.utcnow deprecation warning 2024-03-15 15:32:12 -07:00
Simon Willison eb8545c172 Refactor duplicate code in DatasetteClient, closes #2307 2024-03-15 15:29:03 -07:00
Simon Willison 54f5604caf Fixed cookies= httpx warning, refs #2307 2024-03-15 15:19:23 -07:00
Simon Willison 5af6837725 Fix httpx warning about app=self.app, refs #2307 2024-03-15 15:15:31 -07:00
Simon Willison 8b6f155b45 Added two things I left out of the 1.0a13 release notes
Refs #2104, #2294

Closes #2303
2024-03-12 19:19:51 -07:00
Simon Willison c92f326ed1 Release 1.0a13
#2104, #2286, #2293, #2297, #2298, #2299, #2300, #2301, #2302
2024-03-12 19:10:53 -07:00
Simon Willison feddd61789 Fix tests I broke in #2302 2024-03-12 17:01:51 -07:00
Simon Willison 9cc6f1908f Gradient on header and footer, closes #2302 2024-03-12 16:54:03 -07:00
Simon Willison e088abdb46 Refactored action menus to a shared include, closes #2301 2024-03-12 16:35:34 -07:00
Simon Willison 828ef9899f Ran blacken-docs, refs #2299 2024-03-12 16:25:25 -07:00
Simon Willison 8d456aae45 Fix spelling of displayed, refs #2299 2024-03-12 16:17:53 -07:00
Simon Willison b8711988b9 row_actions() plugin hook, closes #2299 2024-03-12 16:16:05 -07:00
Simon Willison 7339cc51de Rearrange plugin hooks page with more sections, closes #2300 2024-03-12 15:44:10 -07:00
Simon Willison 06281a0b8e Test for labels on Table/View action buttons, refs #2297 2024-03-12 14:32:48 -07:00
Simon Willison 909c85cd2b view_actions plugin hook, closes #2297 2024-03-12 14:25:28 -07:00
Simon Willison daf5ca02ca homepage_actions() plugin hook, closes #2298 2024-03-12 13:46:06 -07:00
Simon Willison 7b32d5f7d8 datasette-create-view as example of query_actions hook 2024-03-07 00:11:14 -05:00
Simon Willison 7818e8b9d1 Hide tables starting with an _, refs #2104 2024-03-07 00:03:42 -05:00
Simon Willison a395256c8c Allow-list select * from pragma_table_list()
Refs https://github.com/simonw/datasette/issues/2104#issuecomment-1982352475
2024-03-07 00:03:20 -05:00
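For illustration, the allow-listed query can be exercised end-to-end through Datasette's validation layer; a sketch assuming SQLite 3.37+ (where `pragma_table_list()` exists) and an in-memory instance:

```python
from datasette.app import Datasette


async def list_tables():
    ds = Datasette(memory=True)
    # The query goes through Datasette's SQL validation, which now
    # allow-lists "select * from pragma_table_list()"
    response = await ds.client.get(
        "/_memory.json", params={"sql": "select * from pragma_table_list()"}
    )
    return response.json()
```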
Simon Willison 090dff542b
Action menu descriptions
* Refactor tests to extract get_actions_links() helper
* Table, database and query action menu items now support optional descriptions

Closes #2294
2024-03-06 22:54:06 -05:00
Simon Willison c6e8a4a76c
margin-bottom on .page-action-menu, refs #2286 2024-03-05 19:34:57 -08:00
Simon Willison 4d24bf6b34 Don't explain an explain even in the demo, refs #2293 2024-03-05 18:14:55 -08:00
Simon Willison 5de6797d4a Better demo plugin for query_actions, refs #2293 2024-03-05 18:06:38 -08:00
Simon Willison 86335dc722 Release 1.0a12
Refs #2281, #2283, #2287, #2289
2024-02-29 14:35:28 -08:00
Simon Willison 57c1ce0e8b Reset column menu on every click, closes #2289 2024-02-29 14:25:50 -08:00
Simon Willison 6ec0081f5d
`query_actions` plugin hook
* New query_actions plugin hook, closes #2283
2024-02-27 21:55:16 -08:00
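The actions family of hooks (`query_actions` here, plus `row_actions`, `view_actions` and `homepage_actions` above) all return lists of menu items. A sketch modeled on the datasette-create-view use case, with a hypothetical `href`, relying on pluggy's rule that a plugin may accept any subset of a hook's arguments:

```python
from datasette import hookimpl


@hookimpl
def query_actions(datasette, database, sql):
    # Each action is a dict; "description" support arrived later in #2294
    return [
        {
            "href": datasette.urls.database(database) + "/-/create-view",  # hypothetical
            "label": "Create view from this SQL",
            "description": "Save this query as a database view",
        }
    ]
```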
Simon Willison f99c2f5f8c ?column_notcontains= table filter, closes #2287 2024-02-27 16:07:41 -08:00
Simon Willison c863443ea1 Documentation for derive_named_parameters()
Closes #2284

Refs https://github.com/simonw/datasette-write/issues/7#issuecomment-1967593883
2024-02-27 13:24:47 -08:00
Simon Willison dfd4ad558b
New design for table and database action menus
Closes #2281
2024-02-25 12:54:16 -08:00
Simon Willison 434123425f Release 1.0a11
Refs #2263, #2278, #2279

Closes #2280
2024-02-19 14:48:37 -08:00
Jeroen Van Goey 103b4decbd
fix (typo): Corrected spelling of 'environments' (#2268)
* fix (typo): Corrected spelling of 'environments'

* ci: add test folder to codespell workflow
2024-02-19 14:41:32 -08:00
dependabot[bot] 158d5d96e9
Bump the python-packages group with 1 update (#2269)
Bumps the python-packages group with 1 update: [black](https://github.com/psf/black).


Updates `black` from 24.1.1 to 24.2.0
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.1.1...24.2.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-19 14:23:12 -08:00
Simon Willison 28bf3a933f Applied Black, refs #2278 2024-02-19 14:22:59 -08:00
Simon Willison 26300738e3 Fixes for permissions debug page, closes #2278 2024-02-19 14:17:37 -08:00
Simon Willison 27409a7892 Fix for hook position in wide column names, refs #2263 2024-02-19 14:01:55 -08:00
Simon Willison 392ca2e24c Improvements to table column cog menu display, closes #2263
- Repositions if menu would cause a horizontal scrollbar
- Arrow tip on menu now attempts to align with cog icon on column
2024-02-19 13:40:48 -08:00
Simon Willison b36a2d8f4b Require update-row to use insert replace, closes #2279 2024-02-19 12:55:51 -08:00
Simon Willison 3856a8cb24 Consistent Permission denied:, refs #2279 2024-02-19 12:51:14 -08:00
Simon Willison 81629dbeff Upgrade GitHub Actions, including PyPI publishing 2024-02-17 21:03:41 -08:00
Simon Willison a4fa1ef3bd Release 1.0a10
Refs #2277
2024-02-17 20:56:15 -08:00
Simon Willison 10f9ba1a00 Take advantage of execute_write_fn(transaction=True)
A bunch of places no longer need to do manual transaction handling
thanks to this change. Refs #2277
2024-02-17 20:51:19 -08:00
Simon Willison 5e0e440f2c database.execute_write_fn(transaction=True) parameter, closes #2277 2024-02-17 20:28:15 -08:00
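A sketch of what the new parameter enables: the function runs on the write thread against a plain `sqlite3` connection, and with `transaction=True` Datasette wraps it so the statements commit or roll back as one unit:

```python
async def ensure_counters(db):
    def migrate(conn):
        # Both statements run inside one BEGIN/COMMIT; an exception here
        # rolls the whole batch back automatically
        conn.execute(
            "create table if not exists counters (name text primary key, value integer)"
        )
        conn.execute("insert or ignore into counters (name, value) values ('hits', 0)")

    await db.execute_write_fn(migrate, transaction=True)
```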
Simon Willison e1c80efff8 Note about activating alpha documentation versions on ReadTheDocs 2024-02-16 14:43:36 -08:00
Simon Willison 9906f937d9 Release 1.0a9
Refs #2101, #2260, #2262, #2265, #2270, #2273, #2274, #2275

Closes #2276
2024-02-16 14:36:12 -08:00
Simon Willison 3a999a85fb Fire insert-rows on /db/-/create if rows were inserted, refs #2260 2024-02-16 13:59:56 -08:00
Simon Willison 244f3ff83a Test demonstrating fix for permissions bug in #2262 2024-02-16 13:39:57 -08:00
Simon Willison 8bfa3a51c2 Consider every plugins opinion in datasette.permission_allowed()
Closes #2275, refs #2262
2024-02-16 13:29:39 -08:00
Simon Willison 232a30459b DATASETTE_TRACE_PLUGINS setting, closes #2274 2024-02-16 13:00:24 -08:00
Simon Willison 47e29e948b Better comments in permission_allowed_default() 2024-02-16 10:05:18 -08:00
Simon Willison 97de4d6362 Use transaction in delete_everything(), closes #2273 2024-02-15 21:35:49 -08:00
Simon Willison b89cac3b6a
Use MD5 usedforsecurity=False on Python 3.9 and higher to pass FIPS
Closes #2270
2024-02-13 18:23:54 -08:00
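The fix relies on the `usedforsecurity` flag added to `hashlib` in Python 3.9, which tells FIPS-enabled builds the digest is not a security primitive; a sketch of the pattern:

```python
import hashlib
import sys


def content_hash(data: bytes) -> str:
    # On FIPS-enabled systems plain hashlib.md5() can raise; flagging it as
    # not-for-security (Python 3.9+) keeps it usable as a cache key
    if sys.version_info >= (3, 9):
        return hashlib.md5(data, usedforsecurity=False).hexdigest()
    return hashlib.md5(data).hexdigest()
```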
Simon Willison 5d79974186
Call them "notable events" 2024-02-10 07:19:47 -08:00
Simon Willison 398a92cf1e Include database in name of _execute_writes thread, closes #2265 2024-02-08 20:12:31 -08:00
Simon Willison bd9ed62e5d Make ds.permission_allowed(..., default=) a keyword-only argument, refs #2262 2024-02-08 20:12:31 -08:00
Simon Willison dcd9ea3622
datasette-events-db as an example of track_events() 2024-02-08 14:14:58 -08:00
Simon Willison c62cfa6de8 Fix upsert test to detect new alter-table event 2024-02-08 13:36:17 -08:00
Simon Willison c954795f9a alter: true for row/-/update, refs #2101 2024-02-08 13:36:17 -08:00
Simon Willison 4e944c29e4 Corrected path used in test_update_row_check_permission 2024-02-08 13:36:17 -08:00
Simon Willison 528d89d1a3 alter: true support for /-/insert and /-/upsert, refs #2101 2024-02-08 13:36:17 -08:00
Simon Willison b5ccc4d608 Test for Permission denied - need alter-table 2024-02-08 13:36:17 -08:00
Simon Willison 574687834f Docs for /db/-/create alter: true option, refs #2101 2024-02-08 13:36:17 -08:00
Simon Willison 900d15bcb8 alter table support for /db/-/create API, refs #2101 2024-02-08 13:36:17 -08:00
Simon Willison 569aacd39b
Link to /en/latest/ changelog 2024-02-07 22:53:14 -08:00
Simon Willison 9989f25709 Release 1.0a8
Refs #2052, #2156, #2243, #2247, #2249, #2252, #2254, #2258
2024-02-07 08:34:05 -08:00
Simon Willison e0794ddd52 Link to annotated release notes blog post 2024-02-07 08:32:47 -08:00
Simon Willison 1e31821d9f Link to events docs from changelog 2024-02-07 08:31:26 -08:00
Simon Willison df8d1c055a
Mention JS plugins in release intro 2024-02-06 22:59:58 -08:00
Simon Willison d0089ba776 Note in changelog about datasette publish, refs #2195 2024-02-06 22:30:30 -08:00
Simon Willison c64453a4a1 Fix the date on the 1.0a8 release (due to go tomorrow)
Refs #2258
2024-02-06 22:28:22 -08:00
Simon Willison ad01f9d321
1.0a8 release notes
Closes #2243

* Changelog for jinja2_environment_from_request and plugin_hook_slots
* track_event() in changelog
* Remove Using YAML for metadata section - no longer necessary now we show YAML and JSON examples everywhere.
* Configuration via the command-line section - #2252
* JavaScript plugins in release notes, refs #2052
* /-/config in changelog, refs #2254

Refs #2052, #2156, #2243, #2247, #2249, #2252, #2254
2024-02-06 22:24:24 -08:00
Simon Willison 9ac9f0152f Migrate allow from metadata to config if necessary, closes #2249 2024-02-06 22:18:38 -08:00
Simon Willison 60c6692f68
table_config instead of table_metadata (#2257)
Table configuration that was incorrectly placed in metadata is now treated as if it was in config.

New await datasette.table_config() method.

Closes #2247
2024-02-06 21:57:09 -08:00
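A sketch of the new method from plugin code, assuming the `(database, table)` argument order:

```python
async def facet_columns(datasette, database, table):
    # table_config() replaces the old synchronous table_metadata() and reads
    # table settings from configuration rather than metadata
    config = await datasette.table_config(database, table)
    return config.get("facets", [])
```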
Simon Willison 52a1dac5d2 Test proving $env works for datasette.yml, closes #2255 2024-02-06 21:00:55 -08:00
Simon Willison f049103852 datasette.table_metadata() is now await datasette.table_config(), refs #2247 2024-02-06 17:33:18 -08:00
Simon Willison 69c6e95323 Fixed a bunch of unused imports spotted with ruff 2024-02-06 17:27:20 -08:00
Simon Willison 5d21057cf1 /-/config example, refs #2254 2024-02-06 15:22:03 -08:00
Simon Willison 5a63ecc557 Rename metadata= to table_config= in facet code, refs #2247 2024-02-06 15:03:19 -08:00
Simon Willison 1e901aa690 /-/config page, closes #2254 2024-02-06 12:33:46 -08:00
Simon Willison 85a1dfe6e0 Configuration via the command-line section
Closes #2252

Closes #2156
2024-02-05 13:43:50 -08:00
Simon Willison efc7357554 Remove Using YAML for metadata section
No longer necessary now we show YAML and JSON examples everywhere.
2024-02-05 13:01:03 -08:00
Simon Willison 503545b203 JavaScript plugins documentation, closes #2250 2024-02-05 11:47:17 -08:00
Simon Willison 7219a56d1e 3 space indent, not 2 2024-02-05 10:34:10 -08:00
Simon Willison 5ea7098e4d Fixed an unnecessary f-string 2024-02-04 10:15:21 -08:00
Simon Willison 4ea109ac4d Two spaces is aesthetically more pleasing here 2024-02-01 15:47:41 -08:00
Simon Willison 6ccef35cc9 More links between events documentation 2024-02-01 15:42:45 -08:00
Simon Willison be4f02335f Treat plugins in metadata as if they were in config, closes #2248 2024-02-01 15:33:33 -08:00
Simon Willison d4bc2b2dfc Remove fail_if_plugins_in_metadata, part of #2248 2024-02-01 14:44:16 -08:00
Simon Willison 4da581d09b Link to config reference 2024-02-01 14:40:49 -08:00
Simon Willison b466749e88 Filled out docs/configuration.rst, closes #2246 2024-01-31 20:03:19 -08:00
Simon Willison bcf7ef963f YAML/JSON examples for allow blocks 2024-01-31 19:45:05 -08:00
Simon Willison 2e4a03b2c4
Run coverage on Python 3.12
- #2245

I hoped this would run slightly faster than 3.9 but there doesn't appear to be a performance improvement.
2024-01-31 15:31:26 -08:00
Simon Willison bcc4f6bf1f
track_event() mechanism for analytics and plugins
* Closes #2240
* Documentation for event plugin hooks, refs #2240
* Include example track_event plugin in docs, refs #2240
* Tests for track_event() and register_events() hooks, refs #2240
* Initial documentation for core events, refs #2240
* Internals documentation for datasette.track_event()
2024-01-31 15:21:40 -08:00
dependabot[bot] 890615b3f2
Bump the python-packages group with 1 update (#2241)
Bumps the python-packages group with 1 update: [furo](https://github.com/pradyunsg/furo).


Updates `furo` from 2023.9.10 to 2024.1.29
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2023.09.10...2024.01.29)

---
updated-dependencies:
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-31 10:53:57 -08:00
Simon Willison 959e020297 Ran blacken-docs 2024-01-30 20:40:18 -08:00
gerrymanoim 04e8835297
Remove deprecated/unused args from setup.py (#2222) 2024-01-30 19:56:32 -08:00
Forest Gregg b8230694ff
Set link to download db to nofollow 2024-01-30 19:56:05 -08:00
Simon Willison 5c64af6936 Upgrade to latest Black, closes #2239 2024-01-30 19:55:26 -08:00
Simon Willison c3caf36af7
Template slot family of plugin hooks - top_homepage() and others
New plugin hooks:

top_homepage
top_database
top_table
top_row
top_query
top_canned_query

New datasette.utils.make_slot_function()

Closes #1191
2024-01-30 19:54:03 -08:00
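A sketch of the smallest possible slot plugin; the hook returns an HTML fragment (or an awaitable returning one) that `make_slot_function()` renders into the template's slot:

```python
from datasette import hookimpl


@hookimpl
def top_homepage(datasette, request):
    # Every top_* slot hook follows this shape, differing only in the
    # page-specific arguments it receives
    return "<p>Rendered by the top_homepage slot hook.</p>"
```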
Simon Willison 7a5adb592a Docs on temporary plugins in fixtures, closes #2234 2024-01-12 14:12:14 -08:00
Simon Willison a25bf6bea7 fmt: off to fix problem with Black, closes #2231 2024-01-10 14:12:20 -08:00
Simon Willison 0f63cb83ed
Typo fix 2024-01-10 13:08:52 -08:00
Simon Willison 7506a89be0 Docs on datasette.client for tests, closes #1830
Also covers ds.client.actor_cookie() helper
2024-01-10 13:04:34 -08:00
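A sketch of the documented testing pattern, assuming pytest-asyncio is installed and that `actor_cookie()` returns a signed value for the `ds_actor` cookie:

```python
import pytest

from datasette.app import Datasette


@pytest.mark.asyncio
async def test_homepage_as_root():
    ds = Datasette(memory=True)
    # Sign an actor dictionary for use in test requests
    cookies = {"ds_actor": ds.client.actor_cookie({"id": "root"})}
    response = await ds.client.get("/", cookies=cookies)
    assert response.status_code == 200
```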
Simon Willison 48148e66a8 Link from actors_from_ids plugin hook docs to datasette.actors_from_ids() 2024-01-10 10:42:36 -08:00
Simon Willison 2ff4d4a60a Test for ?_extra=count, refs #262 2024-01-08 13:14:25 -08:00
Simon Willison 0b2c6a7ebd Fix for ?_extra=columns bug, closes #2230
Also refs #262 - started a test suite for extras.
2024-01-08 13:12:57 -08:00
Simon Willison 1fc76fee62 1.0a8.dev1 version number
Not going to release this to PyPI but I will build my own wheel of it
2024-01-05 16:59:25 -08:00
Simon Willison c7a4706bcc
jinja2_environment_from_request() plugin hook
Closes #2225
2024-01-05 14:33:23 -08:00
Simon Willison 45b88f2056 Release notes from 0.64.6, refs #2214 2023-12-22 15:24:26 -08:00
Simon Willison 872dae1e1a Fix for CSV labels=on missing foreign key bug, closes #2214 2023-12-22 15:08:11 -08:00
Simon Willison 978249beda Removed rogue print("max_csv_mb")
Found this while working on #2214
2023-12-22 15:07:42 -08:00
Simon Willison 4284c74bc1
db.execute_isolated_fn() method (#2220)
Closes #2218
2023-12-19 10:51:03 -08:00
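A sketch of the new method: it runs the function against a fresh, dedicated connection, which suits operations like VACUUM that cannot share an open transaction:

```python
async def vacuum(db):
    def run(conn):
        # conn is a fresh sqlite3 connection opened just for this call,
        # so no other transaction can block the VACUUM
        conn.execute("vacuum")

    await db.execute_isolated_fn(run)
```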
Simon Willison 89c8ca0f3f Fix for round_trip_load() YAML error, refs #2219 2023-12-19 10:32:55 -08:00
Simon Willison 067cc75dfa
Fixed broken example links in row page documentation 2023-12-12 09:49:04 -08:00
Cameron Yick 452a587e23
JavaScript Plugin API, providing custom panels and column menu items
Thanks, Cameron Yick.

https://github.com/simonw/datasette/pull/2052

Co-authored-by: Simon Willison <swillison@gmail.com>
2023-10-12 17:00:27 -07:00
Simon Willison 4b534b89a5 Ran cog
Refs #2052
2023-10-12 16:48:22 -07:00
Simon Willison 11f7fd38a4 Fixed some rST header warnings 2023-10-12 15:05:02 -07:00
Simon Willison a4b401f470 Updated Discord link, refs #2196
This issue reminded me to use the datasette.io/discord redirect URL.
2023-10-12 14:57:04 -07:00
Alex Garcia 3d6d1e3050
Raise an exception if a "plugins" block exists in metadata.json 2023-10-12 09:20:50 -07:00
Alex Garcia 35deaabcb1
Move non-metadata configuration from metadata.yaml to datasette.yaml
* Allow and permission blocks moved to datasette.yaml
* Documentation updates, initial framework for configuration reference
2023-10-12 09:16:37 -07:00
Simon Willison 4e1188f60f Upgrade spellcheck.yml workflow 2023-10-08 09:09:45 -07:00
Simon Willison 85a41987c7 Fixed typo acepts -> accepts 2023-10-08 09:07:11 -07:00
Simon Willison d51e63d3bb Release notes for 0.64.5, refs #2197 2023-10-08 09:06:43 -07:00
Simon Willison 836b1587f0 Release notes for 1.0a7
Refs #2189
2023-09-21 15:27:27 -07:00
Simon Willison e4f868801a Use importlib_metadata for 3.9 as well, refs #2057 2023-09-21 14:58:39 -07:00
Simon Willison f130c7c0a8 Deploy with fixtures-metadata.json, refs #2194, #2195 2023-09-21 14:09:57 -07:00
Simon Willison 2da1a6acec Use importlib_metadata for Python 3.8, refs #2057 2023-09-21 13:26:13 -07:00
Simon Willison b7cf0200e2 Swap order of config and metadata options, refs #2194 2023-09-21 13:22:40 -07:00
Simon Willison 80a9cd9620 test-datasette-load-plugins now fails correctly, refs #2193 2023-09-21 12:55:50 -07:00
Simon Willison b0d0a0e5de importlib_resources for Python < 3.9, refs #2057 2023-09-21 12:42:15 -07:00
Simon Willison 947520c1fe Release notes for 0.64.4 on main 2023-09-21 12:31:32 -07:00
Simon Willison 10bc805473 Finish removing pkg_resources, closes #2057 2023-09-21 12:13:16 -07:00
dependabot[bot] 6763572948
Bump sphinx, furo, black
Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [black](https://github.com/psf/black).


Updates `sphinx` from 7.2.5 to 7.2.6
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.2.5...v7.2.6)

Updates `furo` from 2023.8.19 to 2023.9.10
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2023.08.19...2023.09.10)

Updates `black` from 23.7.0 to 23.9.1
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.7.0...23.9.1)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: python-packages
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-20 15:11:24 -07:00
Simon Willison b0e5d8afa3
Stop using parallel SQL queries for tables
Refs:
- #2189
2023-09-20 15:10:55 -07:00
Simon Willison 6ed7908580 Simplified test for #2189
This now executes two facets, in the hope that parallel facet execution
would illustrate the bug - but it did not illustrate the bug.
2023-09-18 10:44:13 -07:00
Simon Willison f56e043747 test_facet_against_in_memory_database, refs #2189
This is meant to illustrate a crashing bug but it does not trigger it.
2023-09-18 10:39:11 -07:00
Simon Willison 852f501485 Switch from pkg_resources to importlib.metadata in app.py, refs #2057 2023-09-16 09:35:18 -07:00
Simon Willison 16f0b6d822 JSON/YAML tabs on configuration docs page 2023-09-13 14:16:36 -07:00
Alex Garcia b2ec8717c3
Plugin configuration now lives in datasette.yaml/json
* Checkpoint, moving top-level plugin config to datasette.json
* Support database-level and table-level plugin configuration in datasette.yaml

Refs #2093
2023-09-13 14:06:25 -07:00
Simon Willison a4c96d01b2 Release 1.0a6
Refs #1765, #2164, #2169, #2175, #2178, #2181
2023-09-07 21:44:08 -07:00
Simon Willison b645174271
actors_from_ids plugin hook and datasette.actors_from_ids() method (#2181)
* Prototype of actors_from_ids plugin hook, refs #2180
* datasette-remote-actors example plugin, refs #2180
2023-09-07 21:23:59 -07:00
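A sketch of the hook's shape: it resolves a batch of actor IDs to actor dictionaries in one call (the name lookup here is a stand-in for a users table or a remote service, as in the datasette-remote-actors example):

```python
from datasette import hookimpl


@hookimpl
def actors_from_ids(datasette, actor_ids):
    # Return a mapping of id -> actor dictionary; may also return an awaitable
    return {
        actor_id: {"id": actor_id, "name": "User {}".format(actor_id)}
        for actor_id in actor_ids
    }
```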
Simon Willison c26370485a Label expand permission check respects cascade, closes #2178 2023-09-07 16:28:30 -07:00
Simon Willison ab040470e2 Applied blacken-docs 2023-09-07 15:57:27 -07:00
Simon Willison dbfad6d220 Foreign key label expanding respects table permissions, closes #2178 2023-09-07 15:51:09 -07:00
Simon Willison 2200abfa17 Fix for flaky test_hidden_sqlite_stat1_table, closes #2179 2023-09-07 15:49:50 -07:00
Simon Willison fbcb103c0c Added example code to database_actions hook documentation 2023-09-07 07:47:24 -07:00
dependabot[bot] e4abae3fd7
Bump Sphinx (#2166)
Bumps the python-packages group with 1 update: [sphinx](https://github.com/sphinx-doc/sphinx).

- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.2.4...v7.2.5)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-06 09:34:31 -07:00
Simon Willison e86eaaa4f3
Test against Python 3.12 preview (#2175)
https://dev.to/hugovk/help-test-python-312-beta-1508/
2023-09-06 09:16:27 -07:00
Simon Willison 05707aa16b
click-default-group>=1.2.3 (#2173)
* click-default-group>=1.2.3

Now available as a wheel:
- https://github.com/click-contrib/click-default-group/issues/21

* Fix for blacken-docs
2023-09-05 19:50:09 -07:00
Simon Willison 31d5c4ec05 Contraction - Google and Microsoft styleguides like it
I was trying out https://github.com/errata-ai/vale
2023-09-05 19:43:01 -07:00
Simon Willison fd083e37ec Docs for plugins that define more plugin hooks, closes #1765 2023-08-31 16:06:30 -07:00
Simon Willison 98ffad9aed execute-sql now implies can view instance/database, closes #2169 2023-08-31 15:46:26 -07:00
Simon Willison 9cead33fb9
OperationalError: database table is locked fix
See also:
- https://til.simonwillison.net/datasette/remember-to-commit
2023-08-31 10:46:07 -07:00
Simon Willison 4c3ef03311
Another ReST fix 2023-08-30 16:19:59 -07:00
Simon Willison 2caa53a52a
ReST fix 2023-08-30 16:19:24 -07:00
Simon Willison 6bfe104d47
DATASETTE_LOAD_PLUGINS environment variable for loading specific plugins
Closes #2164

* Load only specified plugins for DATASETTE_LOAD_PLUGINS=datasette-one,datasette-two
* Load no plugins if DATASETTE_LOAD_PLUGINS=''
* Automated tests in a Bash script for DATASETTE_LOAD_PLUGINS
2023-08-30 15:12:24 -07:00
Simon Willison 30b28c8367 Release 1.0a5
Refs #2093, #2102, #2153, #2156, #2157
2023-08-29 10:17:54 -07:00
Simon Willison bb12229794 Rename core_ to catalog_, closes #2163 2023-08-29 10:01:28 -07:00
Simon Willison 50da908213
Cascade for restricted token view-table/view-database/view-instance operations (#2154)
Closes #2102

* Permission is now a dataclass, not a namedtuple - refs https://github.com/simonw/datasette/pull/2154/#discussion_r1308087800
* datasette.get_permission() method
2023-08-29 09:32:34 -07:00
Simon Willison a1f3d75a52
Need to stick to Python 3.9 for gcloud 2023-08-28 20:46:12 -07:00
Alex Garcia 92b8bf38c0
Add new `--internal internal.db` option, deprecate legacy `_internal` database
Refs:
- #2157 
---------

Co-authored-by: Simon Willison <swillison@gmail.com>
2023-08-28 20:24:23 -07:00
dependabot[bot] d28f12092d
Bump sphinx, furo, blacken-docs dependencies (#2160)
* Bump the python-packages group with 3 updates

Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [blacken-docs](https://github.com/asottile/blacken-docs).


Updates `sphinx` from 7.1.2 to 7.2.4
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.1.2...v7.2.4)

Updates `furo` from 2023.7.26 to 2023.8.19
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2023.07.26...2023.08.19)

Updates `blacken-docs` from 1.15.0 to 1.16.0
- [Changelog](https://github.com/adamchainz/blacken-docs/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/asottile/blacken-docs/compare/1.15.0...1.16.0)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: blacken-docs
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Simon Willison <swillison@gmail.com>
2023-08-28 17:38:32 -07:00
Simon Willison 2e2825869f Test for --get --actor, refs #2153 2023-08-28 13:18:24 -07:00
Simon Willison d8351b08ed datasette --get --actor 'JSON' option, closes #2153
Refs #2154
2023-08-28 13:15:38 -07:00
Simon Willison d9aad1fd04
-s/--setting x y gets merged into datasette.yml, refs #2143, #2156
This change updates the `-s/--setting` option to `datasette serve` to allow it to be used to set arbitrarily complex nested settings in a way that is compatible with the new `-c datasette.yml` work happening in:
- #2143

It will enable things like this:
```
datasette data.db --setting plugins.datasette-ripgrep.path "/home/simon/code"
```
For the moment though it just affects [settings](https://docs.datasette.io/en/1.0a4/settings.html) - so you can do this:
```
datasette data.db --setting settings.sql_time_limit_ms 3500
```
I've also implemented a backwards compatibility mechanism, so if you use it this way (the old way):
```
datasette data.db --setting sql_time_limit_ms 3500
```
It will notice that the setting you passed is one of Datasette's core settings, and will treat that as if you said `settings.sql_time_limit_ms` instead.
2023-08-28 13:06:14 -07:00
Simon Willison 527cec66b0 utils.pairs_to_nested_config(), refs #2156, #2143 2023-08-24 11:21:15 -07:00
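A sketch of what the helper does, assuming it takes `(key, value)` pairs and expands dotted keys into nested dictionaries, matching the `-s/--setting` behavior described above:

```python
from datasette.utils import pairs_to_nested_config

config = pairs_to_nested_config(
    [
        ("settings.sql_time_limit_ms", 3500),
        ("plugins.datasette-ripgrep.path", "/home/simon/code"),
    ]
)
# Expected shape:
# {"settings": {"sql_time_limit_ms": 3500},
#  "plugins": {"datasette-ripgrep": {"path": "/home/simon/code"}}}
```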
Simon Willison bdf59eb7db No more default to 15% on labels, closes #2150 2023-08-23 11:35:42 -07:00
Simon Willison 64fd1d788e Applied Cog, refs #2143, #2149 2023-08-22 19:57:46 -07:00
Simon Willison 2ce7872e3b -c shortcut for --config - refs #2143, #2149 2023-08-22 19:33:26 -07:00
Alex Garcia 17ec309e14
Start datasette.json, re-add --config, rm settings.json
The first step in defining the new `datasette.json/yaml` configuration mechanism.

Refs #2093, #2143, #493
2023-08-22 18:26:11 -07:00
Simon Willison 01e0558825
Merge pull request from GHSA-7ch3-7pp7-7cpq
* API explorer requires view-instance permission

* Check database/table permissions on /-/api page

* Release notes for 1.0a4

Refs #2119, #2133, #2138, #2140

Refs https://github.com/simonw/datasette/security/advisories/GHSA-7ch3-7pp7-7cpq
2023-08-22 10:10:01 -07:00
Simon Willison 943df09dcc Remove all remaining "$ " prefixes from docs, closes #2140
Also document sqlite-utils create-view
2023-08-11 10:44:34 -07:00
Simon Willison 4535568f2c Fixed display of database color
Closes #2139, closes #2119
2023-08-10 22:16:19 -07:00
Simon Willison 33251d04e7 Canned query write counters demo, refs #2134 2023-08-09 17:56:27 -07:00
Simon Willison a3593c9015 on_success_message_sql, closes #2138 2023-08-09 17:32:07 -07:00
Simon Willison 4a42476bb7 datasette plugins --requirements, closes #2133 2023-08-09 15:04:16 -07:00
Simon Willison 19ab4552e2 Release 1.0a3
Closes #2135

Refs #262, #782, #1153, #1970, #2007, #2079, #2106, #2127, #2130
2023-08-09 12:13:11 -07:00
Simon Willison 90cb9ca58d JSON changes in release notes, refs #2135 2023-08-09 12:11:16 -07:00
Simon Willison 856ca68d94 Update default JSON representation docs, refs #2135 2023-08-09 12:04:40 -07:00
Simon Willison e34d09c6ec Don't include columns in query JSON, refs #2136 2023-08-09 12:01:59 -07:00
Simon Willison 8920d425f4 1.0a3 release notes, smaller changes section - refs #2135 2023-08-09 10:20:58 -07:00
Simon Willison 26be9f0445 Refactored canned query code, replaced old QueryView, closes #2114 2023-08-09 08:26:52 -07:00
Simon Willison cd57b0f712 Brought back parameter fields, closes #2132 2023-08-08 06:45:04 -07:00
Simon Willison 1377a290cd
New JSON design for query views (#2118)
* Refs #2111, closes #2110
* New Context dataclass/subclass mechanism, refs #2127
* Define QueryContext and extract get_tables() method, refs #2127
* Fix OPTIONS bug by porting DatabaseView to be a View subclass
* Expose async_view_for_class.view_class for test_routes test
* Error/truncated arguments for renderers, closes #2130
2023-08-07 18:47:39 -07:00
dependabot[bot] 5139c0886a
Bump the python-packages group with 3 updates (#2128)
Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [blacken-docs](https://github.com/asottile/blacken-docs).

Updates `sphinx` from 6.1.3 to 7.1.2
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v6.1.3...v7.1.2)

Updates `furo` from 2023.3.27 to 2023.7.26
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2023.03.27...2023.07.26)

Updates `blacken-docs` from 1.14.0 to 1.15.0
- [Changelog](https://github.com/adamchainz/blacken-docs/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/asottile/blacken-docs/compare/1.14.0...1.15.0)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: python-packages
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: blacken-docs
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-07 09:19:23 -07:00
Simon Willison adf54f5c80
Use dependabot grouped updates 2023-08-07 08:45:10 -07:00
Simon Willison 0818182399 Update cli-reference for editable change, refs #2106 2023-07-26 11:52:57 -07:00
Simon Willison 18dd88ee4d Refactored DatabaseDownload to database_download, closes #2116 2023-07-26 11:43:55 -07:00
Simon Willison dc5171eb1b Make editable work with -e '.[test]', refs #2106 2023-07-26 11:28:03 -07:00
Simon Willison 278ac91a4d datasette install -e option, closes #2106 2023-07-22 11:42:46 -07:00
dependabot[bot] 3a51ca9014
Bump black from 23.3.0 to 23.7.0 (#2099)
Bumps [black](https://github.com/psf/black) from 23.3.0 to 23.7.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.3.0...23.7.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-07-21 14:19:24 -07:00
Simon Willison 0f7192b615 One last YAML/JSON change, closes #1153 2023-07-08 13:08:09 -07:00
Simon Willison 42ca574720 Removed accidental test code I added, refs #1153 2023-07-08 12:50:22 -07:00
Simon Willison 2fd871a906 Drop support for Python 3.7, refs #2097 2023-07-08 11:40:19 -07:00
Simon Willison 45e6d370ce Install docs dependencies for tests, refs #1153 2023-07-08 11:35:15 -07:00
Simon Willison 50a6355c08 Workaround to get sphinx-build working again, refs #1153 2023-07-08 11:22:21 -07:00
Simon Willison c076fb65e0 Applied sphinx-inline-tabs to remaining examples, refs #1153 2023-07-08 11:00:08 -07:00
Simon Willison 0183e1a72d Preserve JSON key order in YAML, refs #1153 2023-07-08 10:27:36 -07:00
Simon Willison 38fcc96e67 Removed duplicate imports, refs #1153 2023-07-08 10:09:26 -07:00
Simon Willison 3b336d8071 Utility function for cog for generating YAML/JSON tabs, refs #1153 2023-07-08 09:37:47 -07:00
Simon Willison d7b21a8623 metadata.yaml now treated as default in docs
Added sphinx-inline-tabs to provide JSON and YAML tabs to show examples.

Refs #1153
2023-07-08 09:37:01 -07:00
Simon Willison 8cd60fd1d8 Homepage test now just asserts isinstance(x, int) - closes #2092 2023-06-29 08:24:09 -07:00
Simon Willison c39d600aef Fix all E741 Ambiguous variable name warnings, refs #2090 2023-06-29 08:05:24 -07:00
Simon Willison 99ba051188 Fixed spelling error, refs #2089
Also ensure codespell runs as part of just lint
2023-06-29 07:46:22 -07:00
Simon Willison 84b32b447a Justfile I use for local development
Now with codespell, refs #2089
2023-06-29 07:44:13 -07:00
Simon Willison d45a7213ed codespell>=2.5.5, also spellcheck README - refs #2089 2023-06-29 07:43:01 -07:00
dependabot[bot] ede6203618
Bump blacken-docs from 1.13.0 to 1.14.0 (#2083)
Bumps [blacken-docs](https://github.com/asottile/blacken-docs) from 1.13.0 to 1.14.0.
- [Changelog](https://github.com/adamchainz/blacken-docs/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/asottile/blacken-docs/compare/1.13.0...1.14.0)

---
updated-dependencies:
- dependency-name: blacken-docs
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-06-29 07:31:54 -07:00
Simon Willison d1d78ec0eb
Better docs for startup() hook 2023-06-23 13:06:35 -07:00
Simon Willison dda99fc09f
New View base class (#2080)
* New View base class, closes #2078
* Use new View subclass for PatternPortfolioView
2023-05-25 17:18:43 -07:00
Simon Willison b49fa446d6 --cors Access-Control-Max-Age: 3600, closes #2079 2023-05-25 15:05:58 -07:00
Simon Willison 9584879534 Rename callable.py to check_callable.py, refs #2078 2023-05-25 11:49:40 -07:00
Simon Willison 2e43a14da1 datasette.utils.check_callable(obj) - refs #2078 2023-05-25 11:35:34 -07:00
Simon Willison 49184c569c
Action: Deploy a Datasette branch preview to Vercel
Closes #2070
2023-05-09 09:24:28 -07:00
Simon Willison d3d16b5ccf
Build docs with 3.11 on ReadTheDocs
Inspired by https://github.com/simonw/sqlite-utils/issues/540
2023-05-07 11:44:27 -07:00
Simon Willison 55c526a537 Add pip as a dependency too, for Rye - refs #2065 2023-04-26 22:07:35 -07:00
Simon Willison 0b0c5cd7a9 Hopeful fix for Python 3.7 httpx failure, refs #2066 2023-04-26 21:20:38 -07:00
Simon Willison 249fcf8e3e
Add setuptools to dependencies
Refs #2065
2023-04-26 20:36:10 -07:00
Simon Willison 5890a20c37 Mention API tokens in DATASETTE_SECRET docs 2023-03-31 09:45:16 -07:00
Simon Willison 4c1e277edb Updated JSON API shape documentation, refs #262 2023-03-28 23:21:42 -07:00
dependabot[bot] 30c88e3570
Bump black from 22.12.0 to 23.3.0 (#2047)
Bumps [black](https://github.com/psf/black) from 22.12.0 to 23.3.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/22.12.0...23.3.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Simon Willison <swillison@gmail.com>
2023-03-28 23:12:05 -07:00
dependabot[bot] bbd5489dbc
Bump blacken-docs from 1.12.1 to 1.13.0 (#1992)
Bumps [blacken-docs](https://github.com/asottile/blacken-docs) from 1.12.1 to 1.13.0.
- [Release notes](https://github.com/asottile/blacken-docs/releases)
- [Changelog](https://github.com/adamchainz/blacken-docs/blob/main/HISTORY.rst)
- [Commits](https://github.com/asottile/blacken-docs/compare/v1.12.1...1.13.0)

---
updated-dependencies:
- dependency-name: blacken-docs
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-28 23:11:33 -07:00
dependabot[bot] d52402447e
Bump sphinx from 6.1.2 to 6.1.3 (#1986)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.2 to 6.1.3.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v6.1.2...v6.1.3)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-28 23:09:48 -07:00
dependabot[bot] 848a9a420d
Bump furo from 2022.12.7 to 2023.3.27 (#2046)
Bumps [furo](https://github.com/pradyunsg/furo) from 2022.12.7 to 2023.3.27.
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2022.12.07...2023.03.27)

---
updated-dependencies:
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-28 23:08:00 -07:00
Simon Willison 651b78d8e6 Redesign ?_extra=extras a bit, refs #262 2023-03-28 23:07:30 -07:00
Simon Willison c025b0180f
Drop jQuery dependency 2023-03-26 16:38:58 -07:00
Simon Willison db8cf899e2
Use block scripts instead, refs #1608 2023-03-26 16:27:58 -07:00
Simon Willison 5c1cfa451d
Link docs /latest/ to /stable/ again
Re-implementing the pattern from https://til.simonwillison.net/readthedocs/link-from-latest-to-stable

Refs #1608
2023-03-26 16:23:28 -07:00
Simon Willison 3feed1f66e Re-applied Black 2023-03-22 15:54:35 -07:00
Simon Willison d97e82df3c
?_extra= support and TableView refactor to table_view
* Implemented ?_extra= option for JSON views, refs #262
* New dependency: asyncinject
* Remove now-obsolete TableView class
2023-03-22 15:49:39 -07:00
Simon Willison 56b0758a5f 0.64 release notes, refs #2036 2023-03-08 12:52:37 -08:00
Simon Willison 25fdbe6b27 use tmpdir instead of isolated_filesystem, refs #2037
Should hopefully get tests passing for #2036 too.
2023-03-08 12:33:23 -08:00
Simon Willison bd39cb4805 Use service-specific image ID for Cloud Run deploys, refs #2036 2023-03-08 12:25:55 -08:00
Simon Willison 1ad92a1d87 datasette install -r requirements.txt, closes #2033 2023-03-06 14:27:30 -08:00
Dustin Rodrigues a53b893c46
Add Python 3.11 classifier (#2028)
Thanks, @dtrodrigues
2023-03-06 13:01:19 -08:00
Simon Willison 0b4a286914 render_cell(..., request) argument, closes #2007 2023-01-27 19:34:14 -08:00
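A sketch of a plugin using the new argument; plugins may accept any subset of the hook's arguments, and returning None falls through to Datasette's default cell rendering:

```python
from datasette import hookimpl


@hookimpl
def render_cell(value, column, request):
    # Hypothetical use: hide a column's values unless ?_reveal=1 is in the URL
    if column == "secret" and request is not None and request.args.get("_reveal") != "1":
        return "[hidden]"
    return None
```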
Simon Willison e4ebef082d
Fixed link text 2023-01-21 07:37:29 -08:00
Simon Willison 6a352e99ab
Added missing import to example 2023-01-11 11:04:11 -08:00
Simon Willison 25a612fe09 Release 0.64.1
Refs #1985, #1987
2023-01-11 10:23:49 -08:00
Simon Willison 50fd94e04f Raise ValueError if Datasette(files=) is a string, refs #1985 2023-01-11 10:13:20 -08:00
Simon Willison 2c86774179
Link to non-spam Python 3 setup instructions
Refs #1987
2023-01-11 09:59:40 -08:00
Simon Willison 8e70734043
Upgrade Sphinx, closes #1971 2023-01-09 18:02:32 -08:00
Simon Willison 4880638f13
setup-gcloud 318.0.0
Refs https://til.simonwillison.net/googlecloud/gcloud-error-workaround
2023-01-09 16:02:02 -08:00
Simon Willison 7dd671310a Release notes for 0.64, with a warning against arbitrary SQL with SpatiaLite
Refs #1409, #1771, #1979

Refs https://github.com/simonw/datasette.io/issues/132
2023-01-09 08:40:24 -08:00
Simon Willison 5e672df168 Explicitly explain allow_sql: false 2023-01-09 08:25:07 -08:00
Simon Willison 7b48664d75 Better error for --load-extensions, refs #1979 2023-01-07 15:56:03 -08:00
Simon Willison 0f7c71a86f What to do if extensions will not load, refs #1979 2023-01-07 15:49:28 -08:00
Simon Willison fee658ad05 Improved wording in allow_sql docs 2023-01-05 09:22:49 -08:00
Simon Willison c41278b46f default_allow_sql setting, closes #1409
Refs #1410
2023-01-04 16:51:26 -08:00
Simon Willison adfcec51d6 Fixed broken example links in _where= docs 2023-01-04 16:51:26 -08:00
Simon Willison deb5fcbed4
Fixed table_action example in docs 2023-01-04 10:25:04 -08:00
Simon Willison 572bdb5b80 Applied Black, refs #782 2022-12-31 19:32:07 -08:00
Simon Willison d94a3c4326
No need to link to _shape=objects any more
It's the default now. Refs #782
2022-12-31 17:42:48 -08:00
Simon Willison 3c352b7132 Applied Black, refs #782 2022-12-31 13:17:54 -08:00
Simon Willison 5bbe2bcc50 Rename filtered_table_rows_count to count, refs #782 2022-12-31 12:52:57 -08:00
Simon Willison a2dca62360 Fix for extension tests I broke, refs #782 2022-12-31 11:21:15 -08:00
Simon Willison ca07fff3e2 Pin Sphinx 5.3.0, refs #1971
Furo is not yet compatible with Sphinx 6.0
2022-12-31 11:13:56 -08:00
Simon Willison 3af313e165 Fix for Sphinx extlinks warning, closes #1972 2022-12-31 11:13:14 -08:00
Chris Holdgraf 994ce46ed4
Add favicon to documentation (#1967)
Co-authored-by: Simon Willison <swillison@gmail.com>
2022-12-31 11:00:31 -08:00
Simon Willison 8059c8a27c Fixed typo 2022-12-31 10:54:25 -08:00
Simon Willison 8aa9cf629c Store null instead of 'None' in _internal database table, closes #1970 2022-12-31 10:52:37 -08:00
Simon Willison 234230e595 Default JSON shape is now objects - refs #1914, #1709 2022-12-31 10:52:37 -08:00
Simon Willison 1fda4806d4 Small documentation tweaks 2022-12-31 10:52:37 -08:00
Simon Willison c635f6ebac Moved CORS bit to its own documentation section 2022-12-31 10:52:37 -08:00
Simon Willison 3bd05b854a -e/--expires-after in create-token docs 2022-12-31 10:52:37 -08:00
Simon Willison 677ba9dddd Fix rST warning in changelog 2022-12-31 10:52:37 -08:00
Jan Lehnardt e03aed0002 Detect server start/stop more reliably.
This is useful, especially in testing, since your test
hosts might not reliably start the server within two
seconds, so we do a definite check before progressing.

By the same token, after `kill $server_pid` wait for
the pid to be gone from the process list.

Since now the script can end prematurely, I also added
a cleanup function to make sure the temporary certs are
removed in any case.

n.b. this could also be done with the use of `trap 'fn'
ERR` but that felt like a bit too much magic for this
short a script.
2022-12-18 08:01:51 -08:00
Simon Willison a21c00b54d .select-wrapper:focus-within for accessibility, closes #1771 2022-12-17 22:28:07 -08:00
Simon Willison 23335e123b Release notes for 0.63.3
Refs #1963
2022-12-17 19:26:25 -08:00
Simon Willison a27c0a0124 Deploy docs on publish using Python 3.9
A workaround for gcloud setup, see:

https://til.simonwillison.net/googlecloud/gcloud-error-workaround

Refs #1963
2022-12-17 19:24:48 -08:00
Simon Willison 0ea139dfe5 Run new HTTPS test in CI, refs #1955 2022-12-17 18:38:26 -08:00
Simon Willison d1d369456a Move HTTPS test to a bash script
See https://github.com/simonw/datasette/issues/1955#issuecomment-1356627931
2022-12-17 18:33:07 -08:00
Simon Willison 8b73fc6b47 Put AsgiLifestyle back so server starts up again, refs #1955 2022-12-17 17:22:00 -08:00
Simon Willison 63fb750f39 Replace AsgiLifespan with AsgiRunOnFirstRequest, refs #1955 2022-12-17 14:14:34 -08:00
Simon Willison 89cffcf14c Reset _metadata_local in a couple of tests
Refs https://github.com/simonw/datasette/pull/1960#issuecomment-1356476886
2022-12-17 13:47:55 -08:00
Simon Willison 9c43b4164d Removed @pytest.mark.ds_client mark - refs #1959
I don't need it - can run 'pytest -k ds_client' instead.

See https://github.com/simonw/datasette/pull/1960#issuecomment-1355685828
2022-12-17 13:47:55 -08:00
Simon Willison 0e42444866 invoke_startup() inside ds_client fixture, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison e70974a4f1 Ran Black, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison 42a66c2f04 A bunch of remaining ds_client conversions, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison be95359a80 ds_client for test_permissions.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison ef74d0ff70 ds_client for test_internal_db.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison 4a151b15cc ds_client for test_filters.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison 30f1a0705b ds_client for test_plugins.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison b998c2793f test_facets.py using ds_client, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison bc88491cb7 ds_client for test_table_api.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison 1335bcb893 Use my own global variable instead of scope=session
Refs https://github.com/simonw/datasette/pull/1960#issuecomment-1354148139
2022-12-17 13:47:55 -08:00
Simon Willison ebd3358e49 ds_client for test_table_html.py 2022-12-17 13:47:55 -08:00
Simon Willison d94d363ec0 Don't use pytest_asyncio.fixture(scope="session") any more, refs #1959
Also got rid of the weird memory=False hack:

https://github.com/simonw/datasette/pull/1960#issuecomment-1354053151
2022-12-17 13:47:55 -08:00
Simon Willison 95900b9d02 Port app_client to ds_client for most of test_html.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison 3001eec66a ds_client for test_csv.py and test_canned_queries.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison 425ac4357f Ported app_client to ds_client where possible in test_auth.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison b077e63dc6 Ported test_api.py app_client test to ds_client, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison 5ee954e34b
Link to annotated release notes for 1.0a2 2022-12-15 17:03:37 -08:00
Simon Willison 013496862f
Try click.echo() instead
This ensures the URL is output correctly when running under Docker.

Closes #1958
2022-12-15 16:55:17 -08:00
Simon Willison 0b68996cc5 Revert "Replace AsgiLifespan with AsgiRunOnFirstRequest, refs #1955"
This reverts commit dc18f62089.
2022-12-15 13:06:45 -08:00
Simon Willison 38d28dd958 Revert "Try running every test at once, refs #1955"
This reverts commit 51ee8caa4a.
2022-12-15 13:05:33 -08:00
Simon Willison 51ee8caa4a Try running every test at once, refs #1955 2022-12-15 12:51:18 -08:00
Simon Willison dc18f62089 Replace AsgiLifespan with AsgiRunOnFirstRequest, refs #1955 2022-12-15 09:34:07 -08:00
Simon Willison e054704fb6 Added missing rST label 2022-12-14 21:38:28 -08:00
Simon Willison 6e1e815c78
It's an update-or-insert 2022-12-14 18:41:30 -08:00
Simon Willison 8b9d7fdbd8 Fixed typo in release notes, refs #1953 2022-12-14 18:02:42 -08:00
Simon Willison 8cac6ff301 Release 1.0a2
Refs #1636, #1855, #1878, #1927, #1937, #1940, #1947, #1951

Closes #1953
2022-12-14 18:01:02 -08:00
Simon Willison 9ad76d279e Applied blacken-docs, refs #1937 2022-12-14 14:49:13 -08:00
Simon Willison c094dde3ff Extra permission rules for /-/create, closes #1937 2022-12-14 12:21:18 -08:00
Simon Willison e238df3959 Handle non-initials in permission_allowed_actor_restrictions, closes #1956 2022-12-14 12:04:23 -08:00
Simon Willison 1a3dcf4943 Don't include _memory on /-/create-token, refs #1947 2022-12-13 21:19:31 -08:00
Simon Willison 420d0a0ee2 Tests for /-/create-token with restrictions, closes #1947 2022-12-13 21:13:20 -08:00
Simon Willison 6e5ab9e7b3 Note in docs about new /-/create-token features, refs #1947 2022-12-13 21:07:03 -08:00
Simon Willison d98a8effb1 UI for restricting permissions on /-/create-token, refs #1947
Also fixes test failures I introduced in #1951
2022-12-13 21:03:17 -08:00
Simon Willison fdf7c27b54 datasette.create_token() method, closes #1951 2022-12-13 18:42:01 -08:00
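A sketch of the new internals method, assuming the `expires_after` keyword mirrors the `-e/--expires-after` CLI option noted above:

```python
def make_api_token(datasette):
    # datasette.create_token() returns a signed "dstok_..." string
    return datasette.create_token("root", expires_after=3600)
```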
Simon Willison d4cc1374f4 Improved --help for create-token, refs #1947 2022-12-13 14:28:59 -08:00
Simon Willison f84acae98e Return 400 errors for ?_sort errors, closes #1950 2022-12-13 14:23:17 -08:00
dependabot[bot] d4b98d3924
Bump black from 22.10.0 to 22.12.0 (#1944)
Bumps [black](https://github.com/psf/black) from 22.10.0 to 22.12.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/22.10.0...22.12.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-12 21:23:30 -08:00
Simon Willison 45979eb723 Rename permission created by demo plugin
It was showing up as 'new-permission' on https://latest.datasette.io/-/permissions
which I thought was confusing
2022-12-12 21:21:01 -08:00
Simon Willison 34ad574bac Don't hard-code permissions in permission_allowed_actor_restrictions, refs #1855 2022-12-12 21:14:40 -08:00
Simon Willison a1a372f179 /-/actor no longer requires view-instance, refs #1945 2022-12-12 21:06:30 -08:00
Simon Willison 260fbb598e Fix some failing tests, refs #1855 2022-12-12 21:00:40 -08:00
Simon Willison 2aa2adaa8b Docs for new create-token options, refs #1855 2022-12-12 20:56:40 -08:00
Simon Willison 809fad2392 Tests for datasette create-token restrictions, refs #1855 2022-12-12 20:44:19 -08:00
Simon Willison c13dada2f8 datasette --get --token option, closes #1946, refs #1855 2022-12-12 20:36:42 -08:00
Simon Willison 14f1cc4984 Update CLI reference help, refs #1855 2022-12-12 20:21:48 -08:00
Simon Willison 98eff2cde9 Ignore spelling of alls, refs #1855 2022-12-12 20:19:17 -08:00
Simon Willison e95b490d88 Move create-token command into cli.py, refs #1855 2022-12-12 20:18:42 -08:00
Simon Willison 9cc1a7c4c8 create-token command can now create restricted tokens, refs #1855 2022-12-12 20:15:56 -08:00
Simon Willison c6a811237c /-/actor.json no longer requires view-instance, closes #1945 2022-12-12 20:11:51 -08:00
Simon Willison 3e6a208ba3 Rename 't' to 'r' in '_r' actor format, refs #1855 2022-12-12 19:27:34 -08:00
Simon Willison c5d30b58a1 Implemented metadata permissions: property, closes #1636 2022-12-12 18:40:45 -08:00
Simon Willison 8bf06a76b5
register_permissions() plugin hook (#1940)
* Docs for permissions: in metadata, refs #1636
* Refactor default_permissions.py to help with implementation of #1636
* register_permissions() plugin hook, closes #1939 - also refs #1938
* Tests for register_permissions() hook, refs #1939
* Documentation for datasette.permissions, refs #1939
* permission_allowed() falls back on Permission.default, refs #1939
* Raise StartupError on duplicate permissions
* Allow dupe permissions if exact matches
2022-12-12 18:05:54 -08:00
David Larlet e539c1c024
Typo in JSON API `Updating a row` documentation (#1930) 2022-12-08 13:12:34 -08:00
dependabot[bot] bffefc7db0
Bump furo from 2022.9.29 to 2022.12.7 (#1935)
Bumps [furo](https://github.com/pradyunsg/furo) from 2022.9.29 to 2022.12.7.
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2022.09.29...2022.12.07)

---
updated-dependencies:
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-08 13:12:07 -08:00
Simon Willison 05daa15aac Documentation for /-/create ignore/replace, closes #1927 2022-12-07 17:42:54 -08:00
Simon Willison 34cffff02a Refactor _headers() for write API tests 2022-12-07 17:39:07 -08:00
Simon Willison dee18ed8ce test_create_table_error_rows_twice_with_duplicates, refs #1927 2022-12-07 17:29:24 -08:00
Simon Willison 9342b60f14 test_create_table_error_if_pk_changed, refs #1927 2022-12-07 17:27:01 -08:00
Simon Willison 6b27537988 ignore/replace to create requires pk, refs #1927 2022-12-07 17:18:40 -08:00
Simon Willison 272982e8a6
/db/table/-/upsert API
Close #1878

Also made a few tweaks to how _r works in tokens and actors,
refs #1855 - I needed that mechanism for the tests.
2022-12-07 17:12:15 -08:00
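A sketch of calling the new endpoint over HTTP with httpx; the database and table names and the token are placeholders, and the body follows the write API's rows pattern (rows are matched on their primary keys):

```python
import httpx


def upsert_docs(base_url, token):
    response = httpx.post(
        base_url + "/data/docs/-/upsert",  # hypothetical database and table
        json={"rows": [{"id": 1, "title": "Updated title"}]},
        headers={"Authorization": "Bearer {}".format(token)},
    )
    response.raise_for_status()
    return response.json()
```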
Simon Willison 93ababe6f7 Initial attempt at insert/replace for /-/create, refs #1927 2022-12-02 23:00:18 -08:00
Simon Willison cab5b60e09
datasette-auth-passwords is another actor_from_request example 2022-12-02 08:39:52 -08:00
Simon Willison d7e5e3c9f9 Fix for todomvc permission check
Refs https://github.com/simonw/todomvc-datasette/issues/2
2022-12-01 17:38:23 -08:00
Simon Willison 27efa8c381 todomvc permissions and fixed DATASETTE_SECRET for new demo
Refs https://github.com/simonw/todomvc-datasette/issues/2
2022-12-01 17:29:44 -08:00
Simon Willison 03f247845e
datasette-ephemeral-tables>=0.2.2
Refs https://github.com/simonw/datasette-ephemeral-tables/issues/5
2022-12-01 16:37:53 -08:00
Simon Willison e2f71c6f81
Bump ephemeral limit up to 15 minutes per table
Refs #1915
2022-12-01 15:44:43 -08:00
Simon Willison 692fbfc40a Release 1.0a1
Refs #1922, #1917, #1915, #1916, #1918, #1924
2022-12-01 13:30:39 -08:00
Simon Willison f3c8da7acd Make the sign in as root button bigger on latest.datasette.io 2022-12-01 13:29:31 -08:00
Simon Willison 99da46f725 Docs for insert API ignore/replace - closes #1924 2022-11-30 18:07:48 -08:00
Simon Willison 7fde34cfcb Documentation and test for UNIQUE constraint failed, refs #1924 2022-11-30 18:05:29 -08:00
Simon Willison 9a1536b52a Move CORS headers into base class, refs #1922 2022-11-30 15:48:32 -08:00
Simon Willison 31d6a0bc5e Applied Black, refs #1922 2022-11-30 15:17:39 -08:00
Simon Willison f0fadc28dd Access-Control-Allow-Headers: Authorization, Content-Type - refs #1922 2022-11-30 15:11:18 -08:00
Simon Willison 418eb7c5c6
Try Python 3.9 for Cloud Run deploy, refs #1923 2022-11-30 14:59:17 -08:00
Simon Willison ec1dde5dd2
Try version 318.0.0 of google-github-actions/setup-gcloud
Refs #1923
2022-11-30 14:50:53 -08:00
Simon Willison 2cd7ecaa0a Apply Black, refs #1922 2022-11-30 13:54:47 -08:00
Simon Willison 6bfd71f5c6 Access-Control-Allow-Methods: GET, POST, HEAD, OPTIONS - refs #1922 2022-11-30 12:25:12 -08:00
Simon Willison 4c18730e71 Update tests to expect 200 for OPTIONS calls, refs #1922 2022-11-30 10:29:48 -08:00
Simon Willison 48725bb4ea CORS headers for write APIs, refs #1922 2022-11-30 09:27:10 -08:00
Simon Willison 4ddd77e512
No need for pkginfo pin any more
The upstream issue was fixed. Refs #1913
2022-11-29 21:25:40 -08:00
Simon Willison 8404b21556 405 method not allowed for GET to POST endpoints, closes #1916 2022-11-29 21:15:13 -08:00
Simon Willison 5518397338 Show mutable DBs first in API explorer, closes #1918 2022-11-29 21:07:51 -08:00
Simon Willison 6b47734c04 _memory database should not be mutable, closes #1917 2022-11-29 21:06:52 -08:00
Simon Willison 9f5321ff1e
latest now uses datasette-ephemeral-tables>=0.2.1
Fix for https://github.com/simonw/datasette-ephemeral-tables/issues/4
2022-11-29 20:43:27 -08:00
Simon Willison 7588d27f4a
latest.datasette.io uses datasette-ephemeral-tables>=0.2
To show the countdown timer from:
https://github.com/simonw/datasette-ephemeral-tables/issues/3

Refs #1915
2022-11-29 17:51:15 -08:00
Simon Willison 53a8e5bae5 Deploy datasette-ephemeral-tables plugin
Refs #1915
2022-11-29 15:58:25 -08:00
Simon Willison 4a0bd960e9 Pin pkginfo==1.8.3 as workaround for #1913 2022-11-29 11:57:54 -08:00
Simon Willison 07aad51176
Merge pull request #1912 from simonw/1.0-dev
Merge 1.0-dev (with initial write API) back into main
2022-11-29 11:39:36 -08:00
Simon Willison b8fc8e2cd7
Merge branch 'main' into 1.0-dev 2022-11-29 11:34:39 -08:00
Simon Willison 4d49a5a397 Release 1.0a0
Refs #1850, #1851, #1852, #1856, #1858, #1863, #1864, #1871, #1874, #1882

Closes #1891
2022-11-29 11:22:54 -08:00
Simon Willison 6bda225786 Tests for rowid and compound pk row deletion, closes #1864 2022-11-29 10:53:55 -08:00
Simon Willison 1154048f79 Compound primary key support for /db/-/create - closes #1911
Needed for tests in #1864
2022-11-29 10:47:48 -08:00
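A rough sketch of creating a table with a compound primary key through the /db/-/create endpoint, as added here (table and column names are invented for illustration):

    import httpx

    httpx.post(
        "http://localhost:8001/mydb/-/create",
        json={
            "table": "events",
            "columns": [
                {"name": "category", "type": "text"},
                {"name": "id", "type": "integer"},
                {"name": "note", "type": "text"},
            ],
            # A list of columns under "pks" requests a compound primary key
            "pks": ["category", "id"],
        },
        headers={"Authorization": "Bearer dstok_..."},
    )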
Simon Willison 484bef0d3b /db/table/pk/-/update endpoint, closes #1863 2022-11-29 10:06:19 -08:00
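And a matching sketch for the new update endpoint: the row is addressed by its primary key in the URL and the "update" key maps columns to new values (row 5 is a placeholder):

    import httpx

    httpx.post(
        "http://localhost:8001/mydb/mytable/5/-/update",
        json={"update": {"name": "Renamed"}},
        headers={"Authorization": "Bearer dstok_..."},
    )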
Simon Willison 21f8aab531 Release 0.63.2
Refs #1904, #1905
2022-11-18 16:59:05 -08:00
Simon Willison 733447d7c7 Upgrade to Python 3.11 on Heroku, refs #1905 2022-11-18 16:44:46 -08:00
Simon Willison 72ac9bf82f --generate-dir option to publish heroku, refs #1905 2022-11-18 16:34:33 -08:00
Simon Willison 5be728c2dd Pin httpx in Pyodide test, refs #1904
Should help get tests to pass for #1896 too
2022-11-18 14:52:05 -08:00
Simon Willison 0fe1619910 Pin httpx in Pyodide test, refs #1904
Should help get tests to pass for #1896 too
2022-11-18 14:50:19 -08:00
Simon Willison ee64130fa8 Refactor to use new resolve_database/table/row methods, refs #1896 2022-11-18 14:46:25 -08:00
Simon Willison c588a89f26 db.view_exists() method, needed by #1896 2022-11-18 14:16:38 -08:00
Simon Willison b29ccb59c7 Add test for db.view_names() 2022-11-18 14:13:48 -08:00
Brian Grinstead 3ecd131e57
Use DOMContentLoaded instead of load event for CodeMirror initialization. Closes #1894 (#1898) 2022-11-17 23:29:00 -08:00
Simon Willison 63f923d013 Remove min-height on CodeMirror, closes #1899 2022-11-17 23:21:00 -08:00
Simon Willison 3db37e9a21 Remove min-height on CodeMirror, closes #1899 2022-11-17 23:20:49 -08:00
Simon Willison 83a6872d1b Include views in SQL autocomplete, refs #1897 2022-11-17 18:53:48 -08:00
Simon Willison 52bf222d48 /db/-/create API endpoint, closes #1882 2022-11-17 17:24:46 -08:00
Simon Willison 98611b3da0 Include SQL schema for CodeMirror on query pages, closes #1897
Refs #1893
2022-11-17 17:24:44 -08:00
Simon Willison 22bade4562 Use table_columns context for CodeMirror schema, if available - refs #1897 2022-11-17 17:23:35 -08:00
Simon Willison 8494be07ae Prettier should ignore bundle.js file - refs #1893 2022-11-17 17:23:35 -08:00
Brian Grinstead 710be684b8 Upgrade to CodeMirror 6, add SQL autocomplete (#1893)
* Upgrade to CodeMirror 6
* Update contributing docs
* Change how resizing works
* Define a custom SQLite autocomplete dialect
* Add meta-enter to submit
* Add fixture schema for testing
2022-11-17 17:23:35 -08:00
Simon Willison b35522c6dd Updated test, refs #1890 2022-11-17 17:23:35 -08:00
Simon Willison b470ab5c41 Fix for datalist against foreign key facets
Refs https://github.com/simonw/datasette/issues/1890#issuecomment-1314850524
2022-11-17 17:23:35 -08:00
Simon Willison df2cc923c6 Applied prettier, refs #1890 2022-11-17 17:23:35 -08:00
Simon Willison e15ff2d86e datalist autocomplete for facet filters, refs #1890 2022-11-17 17:23:35 -08:00
Simon Willison 3e61a41b9b Include SQL schema for CodeMirror on query pages, closes #1897
Refs #1893
2022-11-17 17:19:37 -08:00
Simon Willison aff7a6985e Use table_columns context for CodeMirror schema, if available - refs #1897 2022-11-17 16:41:25 -08:00
Simon Willison 00e233d7a7 Prettier should ignore bundle.js file - refs #1893 2022-11-16 15:53:27 -08:00
Brian Grinstead ae11fa5887
Upgrade to CodeMirror 6, add SQL autocomplete (#1893)
* Upgrade to CodeMirror 6
* Update contributing docs
* Change how resizing works
* Define a custom SQLite autocomplete dialect
* Add meta-enter to submit
* Add fixture schema for testing
2022-11-16 15:49:06 -08:00
Simon Willison 6f610e1d94 Updated test, refs #1890 2022-11-15 19:04:24 -08:00
Simon Willison eac028d3f7 Fix for datalist against foreign key facets
Refs https://github.com/simonw/datasette/issues/1890#issuecomment-1314850524
2022-11-14 22:57:11 -08:00
Simon Willison 3652b7472a Applied prettier, refs #1890 2022-11-14 22:41:10 -08:00
Simon Willison f156bf9e6b datalist autocomplete for facet filters, refs #1890 2022-11-14 22:31:29 -08:00
Simon Willison 187d91d686 /db/-/create API endpoint, closes #1882 2022-11-14 21:57:28 -08:00
Simon Willison 518fc63224 API explorer: no error if you format JSON on empty string
Refs #1871
2022-11-13 22:06:45 -08:00
Simon Willison 575a29c424 API explorer: respect immutability, closes #1888 2022-11-13 22:01:56 -08:00
Simon Willison 264d0ab471 Renamed return_rows to return in insert API
Refs https://github.com/simonw/datasette/issues/1866#issuecomment-1313128913
2022-11-13 21:49:23 -08:00
Simon Willison 65521f03db Error for drop against immutable database, closes #1874 2022-11-13 21:40:10 -08:00
Simon Willison 612da8eae6 confirm: true mechanism for drop table API, closes #1887 2022-11-13 21:17:18 -08:00
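A hedged sketch of the confirm mechanism: calling the drop endpoint without "confirm" is expected to describe what would be deleted rather than dropping anything, and only an explicit confirm actually destroys the table (names are illustrative):

    import httpx

    httpx.post(
        "http://localhost:8001/mydb/mytable/-/drop",
        json={"confirm": True},
        headers={"Authorization": "Bearer dstok_..."},
    )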
Simon Willison db796771e2 Example links for API explorer, closes #1871 2022-11-13 20:58:45 -08:00
Simon Willison c603faac5b API explorer: persist form state in # in URL, refs #1871 2022-11-13 20:12:36 -08:00
Simon Willison ca66ea57d2 GET and POST areas toggle each other, refs #1871 2022-11-13 13:12:51 -08:00
Simon Willison 51d60d7ddf details-menu class to avoid accidental details closure
Refs https://github.com/simonw/datasette/issues/1871#issuecomment-1312821031
2022-11-13 13:06:58 -08:00
Simon Willison f832435b88 Release 0.63.1
Refs #1843, #1876, #1883
2022-11-12 12:29:59 -08:00
Simon Willison fa9cc9efaf Fix for redirects ignoring base_url, refs #1883 2022-11-12 12:29:59 -08:00
Simon Willison 26262d08f3 Test form actions use prefix, refs #1883 2022-11-12 12:29:59 -08:00
Simon Willison aacf25cf19
Improvements to API token docs, refs #1852 2022-11-05 23:54:32 -07:00
Simon Willison bcc781f4c5 Implementation and tests for _r field on actor, refs #1855
New mechanism for restricting permissions further for a given actor.

This still needs documentation. It will eventually be used by the mechanism to issue
signed API tokens that are only able to perform a subset of actions.

This also adds tests that exercise the POST /-/permissions tool, refs #1881
2022-11-03 17:12:23 -07:00
Simon Willison fb8b6b2311 Refactor _error helper function 2022-11-03 16:36:43 -07:00
Simon Willison 2355067ef5 Tests now close SQLite database connections and files explicitly, refs #1843
Also added a db.close() method to the Database class.
2022-11-03 13:36:11 -07:00
Simon Willison bb030ba46f More margin on /-/allow-debug page 2022-11-02 22:10:59 -07:00
Simon Willison c51d9246b9 Permission check testing tool, refs #1881 2022-11-02 22:10:07 -07:00
Simon Willison 9b5a73ba4c Applied Black 2022-11-02 21:46:05 -07:00
Simon Willison 719e757252 Return method not allowed error in JSON in some situations
Added this while playing with the new API explorer, refs #1871
2022-11-02 20:12:13 -07:00
Simon Willison 000eeb4464 Link to Datasette API docs from /-/api, refs #1871 2022-11-01 22:45:05 -07:00
Simon Willison 042881a522 Ran Prettier, refs #1871 2022-11-01 22:30:16 -07:00
Simon Willison 0b166befc0 API explorer can now do GET, has JSON syntax highlighting
Refs #1871
2022-11-01 17:31:22 -07:00
Simon Willison 497290beaf Handle database errors in /-/insert, refs #1866, #1873
Also improved API explorer to show HTTP status of response, refs #1871
2022-11-01 12:59:17 -07:00
Simon Willison 9bec7c38eb ignore and replace options for bulk inserts, refs #1873
Also removed the rule that you cannot include primary keys in the rows you insert.

And added validation that catches invalid parameters in the incoming JSON.

And renamed "inserted" to "rows" in the returned JSON for return_rows: true
2022-11-01 11:08:17 -07:00
Simon Willison 93a02281da Show interrupted query in resizing textarea, closes #1876 2022-11-01 10:38:24 -07:00
Simon Willison 00632ded30 Initial attempt at /db/table/row/-/delete, refs #1864 2022-10-30 16:16:00 -07:00
Simon Willison 2865d3956f /db/table/-/drop API, closes #1874 2022-10-30 15:17:21 -07:00
Simon Willison 4f16e14d7a Update cog 2022-10-30 14:53:33 -07:00
Simon Willison fedbfcc368 Neater display of output and errors in API explorer, refs #1871 2022-10-30 14:49:07 -07:00
Simon Willison 9eb9ffae3d Drop API token requirement from API explorer, refs #1871 2022-10-30 13:09:55 -07:00
Simon Willison f6bf2d8045 Initial prototype of API explorer at /-/api, refs #1871 2022-10-29 23:20:11 -07:00
Simon Willison c35859ae3d API for bulk inserts, closes #1866 2022-10-29 23:03:45 -07:00
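The bulk insert API from this commit accepts a list of rows in a single POST; a minimal sketch, with invented names and an arbitrary batch size:

    import httpx

    rows = [{"id": i, "value": "row {}".format(i)} for i in range(100)]
    httpx.post(
        "http://localhost:8001/mydb/mytable/-/insert",
        json={"rows": rows},
        headers={"Authorization": "Bearer dstok_..."},
    )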
Simon Willison c9b5f5d598 Depend on sqlite-utils>=3.30
Decided to use the most recent version in case I decide later to
use the flatten() utility function.

Refs #1850
2022-10-27 17:58:41 -07:00
Simon Willison 61171f0154 Release 0.63
Refs #1646, #1786, #1787, #1789, #1794, #1800, #1804, #1805, #1808, #1809, #1816, #1819, #1825, #1829, #1831, #1834, #1844, #1853, #1860

Closes #1869
2022-10-27 17:58:41 -07:00
Simon Willison 26af9b9c4a Release notes for 0.63, refs #1869 2022-10-27 17:58:41 -07:00
dependabot[bot] 641bc4453b Bump black from 22.8.0 to 22.10.0 (#1839)
Bumps [black](https://github.com/psf/black) from 22.8.0 to 22.10.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/22.8.0...22.10.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-10-27 17:58:41 -07:00
Forest Gregg 2ea60e12d9 Make hash and size a lazy property (#1837)
* use inspect data for hash and file size
* make hash and cached_size lazy properties
* move hash property near size
2022-10-27 17:58:41 -07:00
Simon Willison 6e788b49ed New URL design /db/table/-/insert, refs #1851 2022-10-27 13:18:05 -07:00
Simon Willison a2a5dff709 Missing tests for insert row API, refs #1851 2022-10-27 12:08:26 -07:00
Simon Willison a51608090b Slight tweak to insert row API design, refs #1851
https://github.com/simonw/datasette/issues/1851#issuecomment-1292997608
2022-10-27 12:06:18 -07:00
Simon Willison 6958e21b5c Add test for /* multi line */ comment, refs #1860 2022-10-27 11:52:06 -07:00
Simon Willison b597bb6b3e Better comment handling in SQL regex, refs #1860 2022-10-27 11:52:06 -07:00
Simon Willison 918f356120 Delete mirror-master-and-main.yml
Closes #1865
2022-10-27 11:52:06 -07:00
Simon Willison 51c436fed2 First draft of insert row write API, refs #1851 2022-10-26 20:57:02 -07:00
Simon Willison 382a871583 max_signed_tokens_ttl setting, closes #1858
Also redesigned token format to include creation time and optional duration.
2022-10-26 20:14:59 -07:00
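The redesigned token format signs the actor ID ("a") together with a creation timestamp ("t") and an optional duration ("d"), which is what lets max_signed_tokens_ttl cap a token's lifetime. A rough sketch using the Python API that the create-token command (see the CLI diff further down) wraps:

    from datasette.app import Datasette

    ds = Datasette(secret="not_a_secret")
    token = ds.create_token("root", expires_after=3600)
    # Tokens carry a dstok_ prefix; stripping it exposes the signed payload
    print(ds.unsign(token[len("dstok_"):], namespace="token"))
    # Expected shape, roughly: {"a": "root", "t": <epoch seconds>, "d": 3600}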
Simon Willison af5d5d0243 Allow leading comments on SQL queries, refs #1860 2022-10-26 20:14:59 -07:00
Simon Willison 55f860c304 Fix bug with breadcrumbs and request=None, closes #1849 2022-10-26 20:14:59 -07:00
Simon Willison c7956eed77 datasette create-token command, refs #1859 2022-10-25 21:26:12 -07:00
Simon Willison c556fad65d Try to address too many files error again, refs #1843 2022-10-25 21:25:47 -07:00
Simon Willison c36a74ece1 Try shutting down executor in tests to free up thread local SQLite connections, refs #1843 2022-10-25 21:04:39 -07:00
Simon Willison c23fa850e7 allow_signed_tokens setting, closes #1856 2022-10-25 19:55:47 -07:00
Simon Willison 0f013ff497 Mechanism to prevent tokens creating tokens, closes #1857 2022-10-25 19:43:55 -07:00
Simon Willison b29e487bc3 actor_from_request for dstok_ tokens, refs #1852 2022-10-25 19:18:41 -07:00
Simon Willison 7ab091e8ef Tests and docs for /-/create-token, refs #1852 2022-10-25 19:04:05 -07:00
Simon Willison 68ccb7578b dstok_ prefix for tokens
Refs https://github.com/simonw/datasette/issues/1852#issuecomment-1291290451
2022-10-25 18:40:07 -07:00
Simon Willison 42f8b402e6 Initial prototype of create API token page, refs #1852 2022-10-25 17:07:58 -07:00
Simon Willison f9ae92b377 Poll until servers start, refs #1854 2022-10-25 16:03:36 -07:00
Simon Willison 05b479224f Don't need pysqlite3-binary any more, refs #1853 2022-10-25 16:03:36 -07:00
Simon Willison 6d085af28c Python 3.11 in CI 2022-10-25 16:03:36 -07:00
Simon Willison 02ae1a0029 Upgrade Docker images to Python 3.11, closes #1853 2022-10-25 12:04:25 -07:00
Simon Willison 83adf55b2d Deploy one-dot-zero branch preview 2022-10-23 20:28:15 -07:00
148 changed files with 19416 additions and 4213 deletions

View file

@ -5,9 +5,7 @@ updates:
schedule:
interval: daily
time: "13:00"
open-pull-requests-limit: 10
ignore:
- dependency-name: black
versions:
- 21.4b0
- 21.4b1
groups:
python-packages:
patterns:
- "*"

View file

@ -0,0 +1,35 @@
name: Deploy a Datasette branch preview to Vercel
on:
workflow_dispatch:
inputs:
branch:
description: "Branch to deploy"
required: true
type: string
jobs:
deploy-branch-preview:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python 3.11
uses: actions/setup-python@v4
with:
python-version: "3.11"
- name: Install dependencies
run: |
pip install datasette-publish-vercel
- name: Deploy the preview
env:
VERCEL_TOKEN: ${{ secrets.BRANCH_PREVIEW_VERCEL_TOKEN }}
run: |
export BRANCH="${{ github.event.inputs.branch }}"
wget https://latest.datasette.io/fixtures.db
datasette publish vercel fixtures.db \
--branch $BRANCH \
--project "datasette-preview-$BRANCH" \
--token $VERCEL_TOKEN \
--scope datasette \
--about "Preview of $BRANCH" \
--about_url "https://github.com/simonw/datasette/tree/$BRANCH"

View file

@ -3,7 +3,8 @@ name: Deploy latest.datasette.io
on:
push:
branches:
- main
- main
- 1.0-dev
permissions:
contents: read
@ -16,8 +17,9 @@ jobs:
uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
# gcloud command breaks on higher Python versions, so stick with 3.9:
with:
python-version: "3.11"
python-version: "3.9"
- uses: actions/cache@v3
name: Configure pip caching
with:
@ -36,13 +38,19 @@ jobs:
run: |
pytest -n auto -m "not serial"
pytest -m "serial"
- name: Build fixtures.db
run: python tests/fixtures.py fixtures.db fixtures.json plugins --extra-db-filename extra_database.db
- name: Build fixtures.db and other files needed to deploy the demo
run: |-
python tests/fixtures.py \
fixtures.db \
fixtures-config.json \
fixtures-metadata.json \
plugins \
--extra-db-filename extra_database.db
- name: Build docs.db
if: ${{ github.ref == 'refs/heads/main' }}
run: |-
cd docs
sphinx-build -b xml . _build
DISABLE_SPHINX_INLINE_TABS=1 sphinx-build -b xml . _build
sphinx-to-sqlite ../docs.db _build
cd ..
- name: Set up the alternate-route demo
@ -56,25 +64,68 @@ jobs:
db.route = "alternative-route"
' > plugins/alternative_route.py
cp fixtures.db fixtures2.db
- name: And the counters writable canned query demo
run: |
cat > plugins/counters.py <<EOF
from datasette import hookimpl
@hookimpl
def startup(datasette):
db = datasette.add_memory_database("counters")
async def inner():
await db.execute_write("create table if not exists counters (name text primary key, value integer)")
await db.execute_write("insert or ignore into counters (name, value) values ('counter_a', 0)")
await db.execute_write("insert or ignore into counters (name, value) values ('counter_b', 0)")
await db.execute_write("insert or ignore into counters (name, value) values ('counter_c', 0)")
return inner
@hookimpl
def canned_queries(database):
if database == "counters":
queries = {}
for name in ("counter_a", "counter_b", "counter_c"):
queries["increment_{}".format(name)] = {
"sql": "update counters set value = value + 1 where name = '{}'".format(name),
"on_success_message_sql": "select 'Counter {name} incremented to ' || value from counters where name = '{name}'".format(name=name),
"write": True,
}
queries["decrement_{}".format(name)] = {
"sql": "update counters set value = value - 1 where name = '{}'".format(name),
"on_success_message_sql": "select 'Counter {name} decremented to ' || value from counters where name = '{name}'".format(name=name),
"write": True,
}
return queries
EOF
# - name: Make some modifications to metadata.json
# run: |
# cat fixtures.json | \
# jq '.databases |= . + {"ephemeral": {"allow": {"id": "*"}}}' | \
# jq '.plugins |= . + {"datasette-ephemeral-tables": {"table_ttl": 900}}' \
# > metadata.json
# cat metadata.json
- name: Set up Cloud Run
uses: google-github-actions/setup-gcloud@v0
with:
version: '275.0.0'
version: '318.0.0'
service_account_email: ${{ secrets.GCP_SA_EMAIL }}
service_account_key: ${{ secrets.GCP_SA_KEY }}
- name: Deploy to Cloud Run
env:
LATEST_DATASETTE_SECRET: ${{ secrets.LATEST_DATASETTE_SECRET }}
run: |-
gcloud config set run/region us-central1
gcloud config set project datasette-222320
export SUFFIX="-${GITHUB_REF#refs/heads/}"
export SUFFIX=${SUFFIX#-main}
# Replace 1.0 with one-dot-zero in SUFFIX
export SUFFIX=${SUFFIX//1.0/one-dot-zero}
datasette publish cloudrun fixtures.db fixtures2.db extra_database.db \
-m fixtures.json \
-m fixtures-metadata.json \
--plugins-dir=plugins \
--branch=$GITHUB_SHA \
--version-note=$GITHUB_SHA \
--extra-options="--setting template_debug 1 --setting trace_debug 1 --crossdb" \
--service "datasette-latest$SUFFIX"
--install 'datasette-ephemeral-tables>=0.2.2' \
--service "datasette-latest$SUFFIX" \
--secret $LATEST_DATASETTE_SECRET
- name: Deploy to docs as well (only for main)
if: ${{ github.ref == 'refs/heads/main' }}
run: |-

View file

@ -12,20 +12,15 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.7", "3.8", "3.9", "3.10", "3.11"]
python-version: ["3.8", "3.9", "3.10", "3.11", "3.12"]
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- uses: actions/cache@v3
name: Configure pip caching
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}
restore-keys: |
${{ runner.os }}-pip-
cache: pip
cache-dependency-path: setup.py
- name: Install dependencies
run: |
pip install -e '.[test]'
@ -36,47 +31,38 @@ jobs:
deploy:
runs-on: ubuntu-latest
needs: [test]
environment: release
permissions:
id-token: write
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: '3.11'
- uses: actions/cache@v3
name: Configure pip caching
with:
path: ~/.cache/pip
key: ${{ runner.os }}-publish-pip-${{ hashFiles('**/setup.py') }}
restore-keys: |
${{ runner.os }}-publish-pip-
python-version: '3.12'
cache: pip
cache-dependency-path: setup.py
- name: Install dependencies
run: |
pip install setuptools wheel twine
- name: Publish
env:
TWINE_USERNAME: __token__
TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
pip install setuptools wheel build
- name: Build
run: |
python setup.py sdist bdist_wheel
twine upload dist/*
python -m build
- name: Publish
uses: pypa/gh-action-pypi-publish@release/v1
deploy_static_docs:
runs-on: ubuntu-latest
needs: [deploy]
if: "!github.event.release.prerelease"
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v2
uses: actions/setup-python@v5
with:
python-version: '3.10'
- uses: actions/cache@v2
name: Configure pip caching
with:
path: ~/.cache/pip
key: ${{ runner.os }}-publish-pip-${{ hashFiles('**/setup.py') }}
restore-keys: |
${{ runner.os }}-publish-pip-
python-version: '3.9'
cache: pip
cache-dependency-path: setup.py
- name: Install dependencies
run: |
python -m pip install -e .[docs]
@ -84,13 +70,13 @@ jobs:
- name: Build docs.db
run: |-
cd docs
sphinx-build -b xml . _build
DISABLE_SPHINX_INLINE_TABS=1 sphinx-build -b xml . _build
sphinx-to-sqlite ../docs.db _build
cd ..
- name: Set up Cloud Run
uses: google-github-actions/setup-gcloud@v0
with:
version: '275.0.0'
version: '318.0.0'
service_account_email: ${{ secrets.GCP_SA_EMAIL }}
service_account_key: ${{ secrets.GCP_SA_KEY }}
- name: Deploy stable-docs.datasette.io to Cloud Run
@ -105,7 +91,7 @@ jobs:
needs: [deploy]
if: "!github.event.release.prerelease"
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Build and push to Docker Hub
env:
DOCKER_USER: ${{ secrets.DOCKER_USER }}

View file

@ -9,22 +9,19 @@ jobs:
spellcheck:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: 3.9
- uses: actions/cache@v2
name: Configure pip caching
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}
restore-keys: |
${{ runner.os }}-pip-
python-version: '3.11'
cache: 'pip'
cache-dependency-path: '**/setup.py'
- name: Install dependencies
run: |
pip install -e '.[docs]'
- name: Check spelling
run: |
codespell README.md --ignore-words docs/codespell-ignore-words.txt
codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt
codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt
codespell tests --ignore-words docs/codespell-ignore-words.txt

View file

@ -15,18 +15,13 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Check out datasette
uses: actions/checkout@v2
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v2
uses: actions/setup-python@v5
with:
python-version: 3.9
- uses: actions/cache@v2
name: Configure pip caching
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}
restore-keys: |
${{ runner.os }}-pip-
python-version: '3.12'
cache: 'pip'
cache-dependency-path: '**/setup.py'
- name: Install Python dependencies
run: |
python -m pip install --upgrade pip

View file

@ -10,20 +10,16 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.7", "3.8", "3.9", "3.10", "3.11"]
python-version: ["3.8", "3.9", "3.10", "3.11", "3.12"]
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- uses: actions/cache@v3
name: Configure pip caching
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}
restore-keys: |
${{ runner.os }}-pip-
allow-prereleases: true
cache: pip
cache-dependency-path: setup.py
- name: Build extension for --load-extension test
run: |-
(cd tests && gcc ext.c -fPIC -shared -o ext.so)
@ -35,10 +31,22 @@ jobs:
run: |
pytest -n auto -m "not serial"
pytest -m "serial"
# And the test that exercises a localhost HTTPS server
tests/test_datasette_https_server.sh
- name: Install docs dependencies on Python 3.9+
if: matrix.python-version != '3.8'
run: |
pip install -e '.[docs]'
- name: Check if cog needs to be run
if: matrix.python-version != '3.8'
run: |
cog --check docs/*.rst
- name: Check if blacken-docs needs to be run
if: matrix.python-version != '3.8'
run: |
# This fails on syntax errors, or if a diff was applied
blacken-docs -l 60 docs/*.rst
- name: Test DATASETTE_LOAD_PLUGINS
run: |
pip install datasette-init datasette-json-html
tests/test-datasette-load-plugins.sh

View file

@ -3,7 +3,7 @@ version: 2
build:
os: ubuntu-20.04
tools:
python: "3.9"
python: "3.11"
sphinx:
configuration: docs/conf.py

Justfile 100644 (new file, +42 lines)
View file

@ -0,0 +1,42 @@
export DATASETTE_SECRET := "not_a_secret"
# Run tests and linters
@default: test lint
# Setup project
@init:
pipenv run pip install -e '.[test,docs]'
# Run pytest with supplied options
@test *options:
pipenv run pytest {{options}}
@codespell:
pipenv run codespell README.md --ignore-words docs/codespell-ignore-words.txt
pipenv run codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt
pipenv run codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt
pipenv run codespell tests --ignore-words docs/codespell-ignore-words.txt
# Run linters: black, flake8, mypy, cog
@lint: codespell
pipenv run black . --check
pipenv run flake8
pipenv run cog --check README.md docs/*.rst
# Rebuild docs with cog
@cog:
pipenv run cog -r README.md docs/*.rst
# Serve live docs on localhost:8000
@docs: cog
pipenv run blacken-docs -l 60 docs/*.rst
cd docs && pipenv run make livehtml
# Apply Black
@black:
pipenv run black .
@serve:
pipenv run sqlite-utils create-database data.db
pipenv run sqlite-utils create-table data.db docs id integer title text --pk id --ignore
pipenv run python -m datasette data.db --root --reload

View file

@ -1,13 +1,13 @@
<img src="https://datasette.io/static/datasette-logo.svg" alt="Datasette">
[![PyPI](https://img.shields.io/pypi/v/datasette.svg)](https://pypi.org/project/datasette/)
[![Changelog](https://img.shields.io/github/v/release/simonw/datasette?label=changelog)](https://docs.datasette.io/en/stable/changelog.html)
[![Changelog](https://img.shields.io/github/v/release/simonw/datasette?label=changelog)](https://docs.datasette.io/en/latest/changelog.html)
[![Python 3.x](https://img.shields.io/pypi/pyversions/datasette.svg?logo=python&logoColor=white)](https://pypi.org/project/datasette/)
[![Tests](https://github.com/simonw/datasette/workflows/Test/badge.svg)](https://github.com/simonw/datasette/actions?query=workflow%3ATest)
[![Documentation Status](https://readthedocs.org/projects/datasette/badge/?version=latest)](https://docs.datasette.io/en/latest/?badge=latest)
[![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/datasette/blob/main/LICENSE)
[![docker: datasette](https://img.shields.io/badge/docker-datasette-blue)](https://hub.docker.com/r/datasetteproject/datasette)
[![discord](https://img.shields.io/discord/823971286308356157?label=discord)](https://discord.gg/ktd74dm5mw)
[![discord](https://img.shields.io/discord/823971286308356157?label=discord)](https://datasette.io/discord)
*An open source multi-tool for exploring and publishing data*
@ -22,7 +22,7 @@ Datasette is aimed at data journalists, museum curators, archivists, local gover
* Comprehensive documentation: https://docs.datasette.io/
* Examples: https://datasette.io/examples
* Live demo of current `main` branch: https://latest.datasette.io/
* Questions, feedback or want to talk about the project? Join our [Discord](https://discord.gg/ktd74dm5mw)
* Questions, feedback or want to talk about the project? Join our [Discord](https://datasette.io/discord)
Want to stay up-to-date with the project? Subscribe to the [Datasette newsletter](https://datasette.substack.com/) for tips, tricks and news on what's new in the Datasette ecosystem.
@ -36,7 +36,7 @@ You can also install it using `pip` or `pipx`:
pip install datasette
Datasette requires Python 3.7 or higher. We also have [detailed installation instructions](https://docs.datasette.io/en/stable/installation.html) covering other options such as Docker.
Datasette requires Python 3.8 or higher. We also have [detailed installation instructions](https://docs.datasette.io/en/stable/installation.html) covering other options such as Docker.
## Basic usage

View file

@ -1,5 +1,8 @@
from datasette.permissions import Permission # noqa
from datasette.version import __version_info__, __version__ # noqa
from datasette.events import Event # noqa
from datasette.utils.asgi import Forbidden, NotFound, Request, Response # noqa
from datasette.utils import actor_matches_allow # noqa
from datasette.views import Context # noqa
from .hookspecs import hookimpl # noqa
from .hookspecs import hookspec # noqa

Diff is too large. Load Diff

View file

@ -4,16 +4,17 @@ import click
from click import formatting
from click.types import CompositeParamType
from click_default_group import DefaultGroup
import functools
import json
import os
import pathlib
from runpy import run_module
import shutil
from subprocess import call
import sys
from runpy import run_module
import textwrap
import webbrowser
from .app import (
OBSOLETE_SETTINGS,
Datasette,
DEFAULT_SETTINGS,
SETTINGS,
@ -29,6 +30,7 @@ from .utils import (
ConnectionProblem,
SpatialiteConnectionProblem,
initial_path_for_datasette,
pairs_to_nested_config,
temporary_docker_directory,
value_as_boolean,
SpatialiteNotFound,
@ -48,85 +50,37 @@ except ImportError:
pass
class Config(click.ParamType):
# This will be removed in Datasette 1.0 in favour of class Setting
name = "config"
def convert(self, config, param, ctx):
if ":" not in config:
self.fail(f'"{config}" should be name:value', param, ctx)
return
name, value = config.split(":", 1)
if name not in DEFAULT_SETTINGS:
msg = (
OBSOLETE_SETTINGS.get(name)
or f"{name} is not a valid option (--help-settings to see all)"
)
self.fail(
msg,
param,
ctx,
)
return
# Type checking
default = DEFAULT_SETTINGS[name]
if isinstance(default, bool):
try:
return name, value_as_boolean(value)
except ValueAsBooleanError:
self.fail(f'"{name}" should be on/off/true/false/1/0', param, ctx)
return
elif isinstance(default, int):
if not value.isdigit():
self.fail(f'"{name}" should be an integer', param, ctx)
return
return name, int(value)
elif isinstance(default, str):
return name, value
else:
# Should never happen:
self.fail("Invalid option")
class Setting(CompositeParamType):
name = "setting"
arity = 2
def convert(self, config, param, ctx):
name, value = config
if name not in DEFAULT_SETTINGS:
msg = (
OBSOLETE_SETTINGS.get(name)
or f"{name} is not a valid option (--help-settings to see all)"
)
self.fail(
msg,
param,
ctx,
)
return
# Type checking
default = DEFAULT_SETTINGS[name]
if isinstance(default, bool):
try:
return name, value_as_boolean(value)
except ValueAsBooleanError:
self.fail(f'"{name}" should be on/off/true/false/1/0', param, ctx)
return
elif isinstance(default, int):
if not value.isdigit():
self.fail(f'"{name}" should be an integer', param, ctx)
return
return name, int(value)
elif isinstance(default, str):
return name, value
else:
# Should never happen:
self.fail("Invalid option")
if name in DEFAULT_SETTINGS:
# For backwards compatibility with how this worked prior to
# Datasette 1.0, we turn bare setting names into setting.name
# Type checking for those older settings
default = DEFAULT_SETTINGS[name]
name = "settings.{}".format(name)
if isinstance(default, bool):
try:
return name, "true" if value_as_boolean(value) else "false"
except ValueAsBooleanError:
self.fail(f'"{name}" should be on/off/true/false/1/0', param, ctx)
elif isinstance(default, int):
if not value.isdigit():
self.fail(f'"{name}" should be an integer', param, ctx)
return name, value
elif isinstance(default, str):
return name, value
else:
# Should never happen:
self.fail("Invalid option")
return name, value
def sqlite_extensions(fn):
return click.option(
fn = click.option(
"sqlite_extensions",
"--load-extension",
type=LoadExtension(),
@ -135,6 +89,26 @@ def sqlite_extensions(fn):
help="Path to a SQLite extension to load, and optional entrypoint",
)(fn)
# Wrap it in a custom error handler
@functools.wraps(fn)
def wrapped(*args, **kwargs):
try:
return fn(*args, **kwargs)
except AttributeError as e:
if "enable_load_extension" in str(e):
raise click.ClickException(
textwrap.dedent(
"""
Your Python installation does not have the ability to load SQLite extensions.
More information: https://datasette.io/help/extensions
"""
).strip()
)
raise
return wrapped
@click.group(cls=DefaultGroup, default="serve", default_if_no_args=True)
@click.version_option(version=__version__)
@ -173,9 +147,6 @@ async def inspect_(files, sqlite_extensions):
app = Datasette([], immutables=files, sqlite_extensions=sqlite_extensions)
data = {}
for name, database in app.databases.items():
if name == "_internal":
# Don't include the in-memory _internal database
continue
counts = await database.table_counts(limit=3600 * 1000)
data[name] = {
"hash": database.hash,
@ -201,15 +172,23 @@ pm.hook.publish_subcommand(publish=publish)
@cli.command()
@click.option("--all", help="Include built-in default plugins", is_flag=True)
@click.option(
"--requirements", help="Output requirements.txt of installed plugins", is_flag=True
)
@click.option(
"--plugins-dir",
type=click.Path(exists=True, file_okay=False, dir_okay=True),
help="Path to directory containing custom plugins",
)
def plugins(all, plugins_dir):
def plugins(all, requirements, plugins_dir):
"""List currently installed plugins"""
app = Datasette([], plugins_dir=plugins_dir)
click.echo(json.dumps(app._plugins(all=all), indent=4))
if requirements:
for plugin in app._plugins():
if plugin["version"]:
click.echo("{}=={}".format(plugin["name"], plugin["version"]))
else:
click.echo(json.dumps(app._plugins(all=all), indent=4))
@cli.command()
@ -319,15 +298,32 @@ def package(
@cli.command()
@click.argument("packages", nargs=-1, required=True)
@click.argument("packages", nargs=-1)
@click.option(
"-U", "--upgrade", is_flag=True, help="Upgrade packages to latest version"
)
def install(packages, upgrade):
@click.option(
"-r",
"--requirement",
type=click.Path(exists=True),
help="Install from requirements file",
)
@click.option(
"-e",
"--editable",
help="Install a project in editable mode from this path",
)
def install(packages, upgrade, requirement, editable):
"""Install plugins and packages from PyPI into the same environment as Datasette"""
if not packages and not requirement and not editable:
raise click.UsageError("Please specify at least one package to install")
args = ["pip", "install"]
if upgrade:
args += ["--upgrade"]
if editable:
args += ["--editable", editable]
if requirement:
args += ["-r", requirement]
args += list(packages)
sys.argv = args
run_module("pip", run_name="__main__")
@ -408,16 +404,17 @@ def uninstall(packages, yes):
)
@click.option("--memory", is_flag=True, help="Make /_memory database available")
@click.option(
"-c",
"--config",
type=Config(),
help="Deprecated: set config option using configname:value. Use --setting instead.",
multiple=True,
type=click.File(mode="r"),
help="Path to JSON/YAML Datasette configuration file",
)
@click.option(
"-s",
"--setting",
"settings",
type=Setting(),
help="Setting, see docs.datasette.io/en/stable/settings.html",
help="nested.key, value setting to use in Datasette configuration",
multiple=True,
)
@click.option(
@ -434,6 +431,14 @@ def uninstall(packages, yes):
"--get",
help="Run an HTTP GET request against this path, print results and exit",
)
@click.option(
"--token",
help="API token to send with --get requests",
)
@click.option(
"--actor",
help="Actor to use for --get requests (JSON string)",
)
@click.option("--version-note", help="Additional note to show on /-/versions")
@click.option("--help-settings", is_flag=True, help="Show available settings")
@click.option("--pdb", is_flag=True, help="Launch debugger on any errors")
@ -467,6 +472,11 @@ def uninstall(packages, yes):
"--ssl-certfile",
help="SSL certificate file",
)
@click.option(
"--internal",
type=click.Path(),
help="Path to a persistent Datasette internal SQLite database",
)
def serve(
files,
immutable,
@ -487,6 +497,8 @@ def serve(
secret,
root,
get,
token,
actor,
version_note,
help_settings,
pdb,
@ -496,6 +508,7 @@ def serve(
nolock,
ssl_keyfile,
ssl_certfile,
internal,
return_instance=False,
):
"""Serve up specified SQLite database files with a web UI"""
@ -516,6 +529,8 @@ def serve(
reloader = hupper.start_reloader("datasette.cli.serve")
if immutable:
reloader.watch_files(immutable)
if config:
reloader.watch_files([config.name])
if metadata:
reloader.watch_files([metadata.name])
@ -528,32 +543,36 @@ def serve(
if metadata:
metadata_data = parse_metadata(metadata.read())
combined_settings = {}
config_data = None
if config:
click.echo(
"--config name:value will be deprecated in Datasette 1.0, use --setting name value instead",
err=True,
)
combined_settings.update(config)
combined_settings.update(settings)
config_data = parse_metadata(config.read())
config_data = config_data or {}
# Merge in settings from -s/--setting
if settings:
settings_updates = pairs_to_nested_config(settings)
config_data.update(settings_updates)
kwargs = dict(
immutables=immutable,
cache_headers=not reload,
cors=cors,
inspect_data=inspect_data,
config=config_data,
metadata=metadata_data,
sqlite_extensions=sqlite_extensions,
template_dir=template_dir,
plugins_dir=plugins_dir,
static_mounts=static,
settings=combined_settings,
settings=None, # These are passed in config= now
memory=memory,
secret=secret,
version_note=version_note,
pdb=pdb,
crossdb=crossdb,
nolock=nolock,
internal=internal,
)
# if files is a single directory, use that as config_dir=
@ -593,9 +612,18 @@ def serve(
# Run async soundness checks - but only if we're not under pytest
asyncio.get_event_loop().run_until_complete(check_databases(ds))
if token and not get:
raise click.ClickException("--token can only be used with --get")
if get:
client = TestClient(ds)
response = client.get(get)
headers = {}
if token:
headers["Authorization"] = "Bearer {}".format(token)
cookies = {}
if actor:
cookies["ds_actor"] = client.actor_cookie(json.loads(actor))
response = client.get(get, headers=headers, cookies=cookies)
click.echo(response.text)
exit_code = 0 if response.status == 200 else 1
sys.exit(exit_code)
@ -607,7 +635,7 @@ def serve(
url = "http://{}:{}{}?token={}".format(
host, port, ds.urls.path("-/auth-token"), ds._root_token
)
print(url)
click.echo(url)
if open_browser:
if url is None:
# Figure out most convenient URL - to table, database or homepage
@ -628,6 +656,132 @@ def serve(
uvicorn.run(ds.app(), **uvicorn_kwargs)
@cli.command()
@click.argument("id")
@click.option(
"--secret",
help="Secret used for signing the API tokens",
envvar="DATASETTE_SECRET",
required=True,
)
@click.option(
"-e",
"--expires-after",
help="Token should expire after this many seconds",
type=int,
)
@click.option(
"alls",
"-a",
"--all",
type=str,
metavar="ACTION",
multiple=True,
help="Restrict token to this action",
)
@click.option(
"databases",
"-d",
"--database",
type=(str, str),
metavar="DB ACTION",
multiple=True,
help="Restrict token to this action on this database",
)
@click.option(
"resources",
"-r",
"--resource",
type=(str, str, str),
metavar="DB RESOURCE ACTION",
multiple=True,
help="Restrict token to this action on this database resource (a table, SQL view or named query)",
)
@click.option(
"--debug",
help="Show decoded token",
is_flag=True,
)
@click.option(
"--plugins-dir",
type=click.Path(exists=True, file_okay=False, dir_okay=True),
help="Path to directory containing custom plugins",
)
def create_token(
id, secret, expires_after, alls, databases, resources, debug, plugins_dir
):
"""
Create a signed API token for the specified actor ID
Example:
datasette create-token root --secret mysecret
To allow only "view-database-download" for all databases:
\b
datasette create-token root --secret mysecret \\
--all view-database-download
To allow "create-table" against a specific database:
\b
datasette create-token root --secret mysecret \\
--database mydb create-table
To allow "insert-row" against a specific table:
\b
datasette create-token root --secret mysecret \\
--resource mydb mytable insert-row
Restricted actions can be specified multiple times using
multiple --all, --database, and --resource options.
Add --debug to see a decoded version of the token.
"""
ds = Datasette(secret=secret, plugins_dir=plugins_dir)
# Run ds.invoke_startup() in an event loop
loop = asyncio.get_event_loop()
loop.run_until_complete(ds.invoke_startup())
# Warn about any unknown actions
actions = []
actions.extend(alls)
actions.extend([p[1] for p in databases])
actions.extend([p[2] for p in resources])
for action in actions:
if not ds.permissions.get(action):
click.secho(
f" Unknown permission: {action} ",
fg="red",
err=True,
)
restrict_database = {}
for database, action in databases:
restrict_database.setdefault(database, []).append(action)
restrict_resource = {}
for database, resource, action in resources:
restrict_resource.setdefault(database, {}).setdefault(resource, []).append(
action
)
token = ds.create_token(
id,
expires_after=expires_after,
restrict_all=alls,
restrict_database=restrict_database,
restrict_resource=restrict_resource,
)
click.echo(token)
if debug:
encoded = token[len("dstok_") :]
click.echo("\nDecoded:\n")
click.echo(json.dumps(ds.unsign(encoded, namespace="token"), indent=2))
pm.hook.register_commands(cli=cli)
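The reworked -s/--setting handling above funnels dotted key/value pairs through pairs_to_nested_config(). Assuming its behaviour matches how serve() uses it, a sketch:

    from datasette.utils import pairs_to_nested_config

    pairs = [
        ("settings.sql_time_limit_ms", "3500"),
        ("plugins.datasette-my-plugin.key", "value"),
    ]
    # Dotted names become nested dictionaries, mirroring datasette.yaml,
    # so roughly: {"settings": {"sql_time_limit_ms": "3500"},
    #             "plugins": {"datasette-my-plugin": {"key": "value"}}}
    print(pairs_to_nested_config(pairs))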

View file

@ -14,6 +14,7 @@ from .utils import (
detect_spatialite,
get_all_foreign_keys,
get_outbound_foreign_keys,
md5_not_usedforsecurity,
sqlite_timelimit,
sqlite3,
table_columns,
@ -28,7 +29,13 @@ AttachedDatabase = namedtuple("AttachedDatabase", ("seq", "name", "file"))
class Database:
def __init__(
self, ds, path=None, is_mutable=True, is_memory=False, memory_name=None
self,
ds,
path=None,
is_mutable=True,
is_memory=False,
memory_name=None,
mode=None,
):
self.name = None
self.route = None
@ -49,6 +56,7 @@ class Database:
self._write_connection = None
# This is used to track all file connections so they can be closed
self._all_file_connections = []
self.mode = mode
@property
def cached_table_counts(self):
@ -62,6 +70,12 @@ class Database:
}
return self._cached_table_counts
@property
def color(self):
if self.hash:
return self.hash[:6]
return md5_not_usedforsecurity(self.name)[:6]
def suggest_name(self):
if self.path:
return Path(self.path).stem
@ -83,6 +97,7 @@ class Database:
return conn
if self.is_memory:
return sqlite3.connect(":memory:", uri=True)
# mode=ro or immutable=1?
if self.is_mutable:
qs = "?mode=ro"
@ -93,6 +108,8 @@ class Database:
assert not (write and not self.is_mutable)
if write:
qs = ""
if self.mode is not None:
qs = f"?mode={self.mode}"
conn = sqlite3.connect(
f"file:{self.path}{qs}", uri=True, check_same_thread=False
)
@ -106,8 +123,7 @@ class Database:
async def execute_write(self, sql, params=None, block=True):
def _inner(conn):
with conn:
return conn.execute(sql, params or [])
return conn.execute(sql, params or [])
with trace("sql", database=self.name, sql=sql.strip(), params=params):
results = await self.execute_write_fn(_inner, block=block)
@ -115,8 +131,7 @@ class Database:
async def execute_write_script(self, sql, block=True):
def _inner(conn):
with conn:
return conn.executescript(sql)
return conn.executescript(sql)
with trace("sql", database=self.name, sql=sql.strip(), executescript=True):
results = await self.execute_write_fn(_inner, block=block)
@ -132,8 +147,7 @@ class Database:
count += 1
yield param
with conn:
return conn.executemany(sql, count_params(params_seq)), count
return conn.executemany(sql, count_params(params_seq)), count
with trace(
"sql", database=self.name, sql=sql.strip(), executemany=True
@ -142,25 +156,60 @@ class Database:
kwargs["count"] = count
return results
async def execute_write_fn(self, fn, block=True):
async def execute_isolated_fn(self, fn):
# Open a new connection just for the duration of this function
# blocking the write queue to avoid any writes occurring during it
if self.ds.executor is None:
# non-threaded mode
isolated_connection = self.connect(write=True)
try:
result = fn(isolated_connection)
finally:
isolated_connection.close()
try:
self._all_file_connections.remove(isolated_connection)
except ValueError:
# Was probably a memory connection
pass
return result
else:
# Threaded mode - send to write thread
return await self._send_to_write_thread(fn, isolated_connection=True)
async def execute_write_fn(self, fn, block=True, transaction=True):
if self.ds.executor is None:
# non-threaded mode
if self._write_connection is None:
self._write_connection = self.connect(write=True)
self.ds._prepare_connection(self._write_connection, self.name)
return fn(self._write_connection)
if transaction:
with self._write_connection:
return fn(self._write_connection)
else:
return fn(self._write_connection)
else:
return await self._send_to_write_thread(
fn, block=block, transaction=transaction
)
# threaded mode
task_id = uuid.uuid5(uuid.NAMESPACE_DNS, "datasette.io")
async def _send_to_write_thread(
self, fn, block=True, isolated_connection=False, transaction=True
):
if self._write_queue is None:
self._write_queue = queue.Queue()
if self._write_thread is None:
self._write_thread = threading.Thread(
target=self._execute_writes, daemon=True
)
self._write_thread.name = "_execute_writes for database {}".format(
self.name
)
self._write_thread.start()
task_id = uuid.uuid5(uuid.NAMESPACE_DNS, "datasette.io")
reply_queue = janus.Queue()
self._write_queue.put(WriteTask(fn, task_id, reply_queue))
self._write_queue.put(
WriteTask(fn, task_id, reply_queue, isolated_connection, transaction)
)
if block:
result = await reply_queue.async_q.get()
if isinstance(result, Exception):
@ -185,12 +234,32 @@ class Database:
if conn_exception is not None:
result = conn_exception
else:
try:
result = task.fn(conn)
except Exception as e:
sys.stderr.write("{}\n".format(e))
sys.stderr.flush()
result = e
if task.isolated_connection:
isolated_connection = self.connect(write=True)
try:
result = task.fn(isolated_connection)
except Exception as e:
sys.stderr.write("{}\n".format(e))
sys.stderr.flush()
result = e
finally:
isolated_connection.close()
try:
self._all_file_connections.remove(isolated_connection)
except ValueError:
# Was probably a memory connection
pass
else:
try:
if task.transaction:
with conn:
result = task.fn(conn)
else:
result = task.fn(conn)
except Exception as e:
sys.stderr.write("{}\n".format(e))
sys.stderr.flush()
result = e
task.reply_queue.sync_q.put(result)
async def execute_fn(self, fn):
@ -338,6 +407,12 @@ class Database:
)
return bool(results.rows)
async def view_exists(self, table):
results = await self.execute(
"select 1 from sqlite_master where type='view' and name=?", params=(table,)
)
return bool(results.rows)
async def table_names(self):
results = await self.execute(
"select name from sqlite_master where type='table'"
@ -357,7 +432,7 @@ class Database:
return await self.execute_fn(lambda conn: detect_fts(conn, table))
async def label_column_for_table(self, table):
explicit_label_column = self.ds.table_metadata(self.name, table).get(
explicit_label_column = (await self.ds.table_config(self.name, table)).get(
"label_column"
)
if explicit_label_column:
@ -394,6 +469,7 @@ class Database:
and (
sql like '%VIRTUAL TABLE%USING FTS%'
) or name in ('sqlite_stat1', 'sqlite_stat2', 'sqlite_stat3', 'sqlite_stat4')
or name like '\\_%' escape '\\'
"""
)
).rows
@ -426,13 +502,11 @@ class Database:
)
).rows
]
# Add any from metadata.json
db_metadata = self.ds.metadata(database=self.name)
if "tables" in db_metadata:
# Add any tables marked as hidden in config
db_config = self.ds.config.get("databases", {}).get(self.name, {})
if "tables" in db_config:
hidden_tables += [
t
for t in db_metadata["tables"]
if db_metadata["tables"][t].get("hidden")
t for t in db_config["tables"] if db_config["tables"][t].get("hidden")
]
# Also mark as hidden any tables which start with the name of a hidden table
# e.g. "searchable_fts" implies "searchable_fts_content" should be hidden
@ -492,12 +566,14 @@ class Database:
class WriteTask:
__slots__ = ("fn", "task_id", "reply_queue")
__slots__ = ("fn", "task_id", "reply_queue", "isolated_connection", "transaction")
def __init__(self, fn, task_id, reply_queue):
def __init__(self, fn, task_id, reply_queue, isolated_connection, transaction):
self.fn = fn
self.task_id = task_id
self.reply_queue = reply_queue
self.isolated_connection = isolated_connection
self.transaction = transaction
class QueryInterrupted(Exception):
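A small sketch of the reworked write path in this file: execute_write_fn() now wraps the supplied function in a transaction by default, and execute_isolated_fn() runs it on a throwaway connection. Assuming a bare Datasette instance and a memory database:

    import asyncio
    from datasette.app import Datasette

    async def main():
        ds = Datasette()
        db = ds.add_memory_database("demo")

        def setup(conn):
            conn.execute("create table t (id integer primary key, v text)")
            conn.execute("insert into t values (1, 'one')")

        # With transaction=True (the default) both statements run inside
        # `with conn:` and commit or roll back together
        await db.execute_write_fn(setup, transaction=True)
        print(await db.table_names())

    asyncio.run(main())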

View file

@ -24,9 +24,12 @@ def now(key, request):
if key == "epoch":
return int(time.time())
elif key == "date_utc":
return datetime.datetime.utcnow().date().isoformat()
return datetime.datetime.now(datetime.timezone.utc).date().isoformat()
elif key == "datetime_utc":
return datetime.datetime.utcnow().strftime(r"%Y-%m-%dT%H:%M:%S") + "Z"
return (
datetime.datetime.now(datetime.timezone.utc).strftime(r"%Y-%m-%dT%H:%M:%S")
+ "Z"
)
else:
raise KeyError

View file

@ -1,47 +1,420 @@
from datasette import hookimpl
from datasette import hookimpl, Permission
from datasette.utils import actor_matches_allow
import itsdangerous
import time
from typing import Union, Tuple
@hookimpl(tryfirst=True)
def permission_allowed(datasette, actor, action, resource):
@hookimpl
def register_permissions():
return (
Permission(
name="view-instance",
abbr="vi",
description="View Datasette instance",
takes_database=False,
takes_resource=False,
default=True,
),
Permission(
name="view-database",
abbr="vd",
description="View database",
takes_database=True,
takes_resource=False,
default=True,
implies_can_view=True,
),
Permission(
name="view-database-download",
abbr="vdd",
description="Download database file",
takes_database=True,
takes_resource=False,
default=True,
),
Permission(
name="view-table",
abbr="vt",
description="View table",
takes_database=True,
takes_resource=True,
default=True,
implies_can_view=True,
),
Permission(
name="view-query",
abbr="vq",
description="View named query results",
takes_database=True,
takes_resource=True,
default=True,
implies_can_view=True,
),
Permission(
name="execute-sql",
abbr="es",
description="Execute read-only SQL queries",
takes_database=True,
takes_resource=False,
default=True,
implies_can_view=True,
),
Permission(
name="permissions-debug",
abbr="pd",
description="Access permission debug tool",
takes_database=False,
takes_resource=False,
default=False,
),
Permission(
name="debug-menu",
abbr="dm",
description="View debug menu items",
takes_database=False,
takes_resource=False,
default=False,
),
Permission(
name="insert-row",
abbr="ir",
description="Insert rows",
takes_database=True,
takes_resource=True,
default=False,
),
Permission(
name="delete-row",
abbr="dr",
description="Delete rows",
takes_database=True,
takes_resource=True,
default=False,
),
Permission(
name="update-row",
abbr="ur",
description="Update rows",
takes_database=True,
takes_resource=True,
default=False,
),
Permission(
name="create-table",
abbr="ct",
description="Create tables",
takes_database=True,
takes_resource=False,
default=False,
),
Permission(
name="alter-table",
abbr="at",
description="Alter tables",
takes_database=True,
takes_resource=True,
default=False,
),
Permission(
name="drop-table",
abbr="dt",
description="Drop tables",
takes_database=True,
takes_resource=True,
default=False,
),
)
@hookimpl(tryfirst=True, specname="permission_allowed")
def permission_allowed_default(datasette, actor, action, resource):
async def inner():
if action in ("permissions-debug", "debug-menu"):
# id=root gets some special permissions:
if action in (
"permissions-debug",
"debug-menu",
"insert-row",
"create-table",
"alter-table",
"drop-table",
"delete-row",
"update-row",
):
if actor and actor.get("id") == "root":
return True
elif action == "view-instance":
allow = datasette.metadata("allow")
if allow is not None:
return actor_matches_allow(actor, allow)
elif action == "view-database":
if resource == "_internal" and (actor is None or actor.get("id") != "root"):
return False
database_allow = datasette.metadata("allow", database=resource)
if database_allow is None:
return None
return actor_matches_allow(actor, database_allow)
elif action == "view-table":
database, table = resource
tables = datasette.metadata("tables", database=database) or {}
table_allow = (tables.get(table) or {}).get("allow")
if table_allow is None:
return None
return actor_matches_allow(actor, table_allow)
elif action == "view-query":
# Check if this query has a "allow" block in metadata
database, query_name = resource
query = await datasette.get_canned_query(database, query_name, actor)
assert query is not None
allow = query.get("allow")
if allow is None:
return None
return actor_matches_allow(actor, allow)
elif action == "execute-sql":
# Use allow_sql block from database block, or from top-level
database_allow_sql = datasette.metadata("allow_sql", database=resource)
if database_allow_sql is None:
database_allow_sql = datasette.metadata("allow_sql")
if database_allow_sql is None:
return None
return actor_matches_allow(actor, database_allow_sql)
# Resolve view permissions in allow blocks in configuration
if action in (
"view-instance",
"view-database",
"view-table",
"view-query",
"execute-sql",
):
result = await _resolve_config_view_permissions(
datasette, actor, action, resource
)
if result is not None:
return result
# Resolve custom permissions: blocks in configuration
result = await _resolve_config_permissions_blocks(
datasette, actor, action, resource
)
if result is not None:
return result
# --setting default_allow_sql
if action == "execute-sql" and not datasette.setting("default_allow_sql"):
return False
return inner
async def _resolve_config_permissions_blocks(datasette, actor, action, resource):
# Check custom permissions: blocks
config = datasette.config or {}
root_block = (config.get("permissions", None) or {}).get(action)
if root_block:
root_result = actor_matches_allow(actor, root_block)
if root_result is not None:
return root_result
# Now try database-specific blocks
if not resource:
return None
if isinstance(resource, str):
database = resource
else:
database = resource[0]
database_block = (
(config.get("databases", {}).get(database, {}).get("permissions", None)) or {}
).get(action)
if database_block:
database_result = actor_matches_allow(actor, database_block)
if database_result is not None:
return database_result
# Finally try table/query specific blocks
if not isinstance(resource, tuple):
return None
database, table_or_query = resource
table_block = (
(
config.get("databases", {})
.get(database, {})
.get("tables", {})
.get(table_or_query, {})
.get("permissions", None)
)
or {}
).get(action)
if table_block:
table_result = actor_matches_allow(actor, table_block)
if table_result is not None:
return table_result
# Finally the canned queries
query_block = (
(
config.get("databases", {})
.get(database, {})
.get("queries", {})
.get(table_or_query, {})
.get("permissions", None)
)
or {}
).get(action)
if query_block:
query_result = actor_matches_allow(actor, query_block)
if query_result is not None:
return query_result
return None
async def _resolve_config_view_permissions(datasette, actor, action, resource):
config = datasette.config or {}
if action == "view-instance":
allow = config.get("allow")
if allow is not None:
return actor_matches_allow(actor, allow)
elif action == "view-database":
database_allow = ((config.get("databases") or {}).get(resource) or {}).get(
"allow"
)
if database_allow is None:
return None
return actor_matches_allow(actor, database_allow)
elif action == "view-table":
database, table = resource
tables = ((config.get("databases") or {}).get(database) or {}).get(
"tables"
) or {}
table_allow = (tables.get(table) or {}).get("allow")
if table_allow is None:
return None
return actor_matches_allow(actor, table_allow)
elif action == "view-query":
# Check if this query has a "allow" block in config
database, query_name = resource
query = await datasette.get_canned_query(database, query_name, actor)
assert query is not None
allow = query.get("allow")
if allow is None:
return None
return actor_matches_allow(actor, allow)
elif action == "execute-sql":
# Use allow_sql block from database block, or from top-level
database_allow_sql = ((config.get("databases") or {}).get(resource) or {}).get(
"allow_sql"
)
if database_allow_sql is None:
database_allow_sql = config.get("allow_sql")
if database_allow_sql is None:
return None
return actor_matches_allow(actor, database_allow_sql)
def restrictions_allow_action(
datasette: "Datasette",
restrictions: dict,
action: str,
resource: Union[str, Tuple[str, str]],
):
"Do these restrictions allow the requested action against the requested resource?"
if action == "view-instance":
# Special case for view-instance: it's allowed if the restrictions include any
# permissions that have the implies_can_view=True flag set
all_rules = restrictions.get("a") or []
for database_rules in (restrictions.get("d") or {}).values():
all_rules += database_rules
for database_resource_rules in (restrictions.get("r") or {}).values():
for resource_rules in database_resource_rules.values():
all_rules += resource_rules
permissions = [datasette.get_permission(action) for action in all_rules]
if any(p for p in permissions if p.implies_can_view):
return True
if action == "view-database":
# Special case for view-database: it's allowed if the restrictions include any
# permissions that have the implies_can_view=True flag set AND takes_database
all_rules = restrictions.get("a") or []
database_rules = list((restrictions.get("d") or {}).get(resource) or [])
all_rules += database_rules
resource_rules = ((restrictions.get("r") or {}).get(resource) or {}).values()
for resource_rules in (restrictions.get("r") or {}).values():
for table_rules in resource_rules.values():
all_rules += table_rules
permissions = [datasette.get_permission(action) for action in all_rules]
if any(p for p in permissions if p.implies_can_view and p.takes_database):
return True
# Does this action have an abbreviation?
to_check = {action}
permission = datasette.permissions.get(action)
if permission and permission.abbr:
to_check.add(permission.abbr)
# If restrictions is defined then we use those to further restrict the actor
# Crucially, we only use this to say NO (return False) - we never
# use it to return YES (True) because that might override other
# restrictions placed on this actor
all_allowed = restrictions.get("a")
if all_allowed is not None:
assert isinstance(all_allowed, list)
if to_check.intersection(all_allowed):
return True
# How about for the current database?
if resource:
if isinstance(resource, str):
database_name = resource
else:
database_name = resource[0]
database_allowed = restrictions.get("d", {}).get(database_name)
if database_allowed is not None:
assert isinstance(database_allowed, list)
if to_check.intersection(database_allowed):
return True
# Or the current table? That's any time the resource is (database, table)
if resource is not None and not isinstance(resource, str) and len(resource) == 2:
database, table = resource
table_allowed = restrictions.get("r", {}).get(database, {}).get(table)
# TODO: What should this do for canned queries?
if table_allowed is not None:
assert isinstance(table_allowed, list)
if to_check.intersection(table_allowed):
return True
# This action is not specifically allowed, so reject it
return False
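# Sketch (not part of this diff): the restrictions dict shape checked above -
# "a" lists instance-wide actions, "d" per-database actions and "r"
# per-table/query actions; entries may be action names or abbreviations.
_example_restrictions = {
    "a": ["view-instance"],
    "d": {"fixtures": ["view-database", "execute-sql"]},
    "r": {"fixtures": {"users": ["view-table"]}},
}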
@hookimpl(specname="permission_allowed")
def permission_allowed_actor_restrictions(datasette, actor, action, resource):
if actor is None:
return None
if "_r" not in actor:
# No restrictions, so we have no opinion
return None
_r = actor.get("_r")
if restrictions_allow_action(datasette, _r, action, resource):
# Return None because we do not have an opinion here
return None
else:
# Block this permission check
return False
@hookimpl
def actor_from_request(datasette, request):
prefix = "dstok_"
if not datasette.setting("allow_signed_tokens"):
return None
max_signed_tokens_ttl = datasette.setting("max_signed_tokens_ttl")
authorization = request.headers.get("authorization")
if not authorization:
return None
if not authorization.startswith("Bearer "):
return None
token = authorization[len("Bearer ") :]
if not token.startswith(prefix):
return None
token = token[len(prefix) :]
try:
decoded = datasette.unsign(token, namespace="token")
except itsdangerous.BadSignature:
return None
if "t" not in decoded:
# Missing timestamp
return None
created = decoded["t"]
if not isinstance(created, int):
# Invalid timestamp
return None
duration = decoded.get("d")
if duration is not None and not isinstance(duration, int):
# Invalid duration
return None
if (duration is None and max_signed_tokens_ttl) or (
duration is not None
and max_signed_tokens_ttl
and duration > max_signed_tokens_ttl
):
duration = max_signed_tokens_ttl
if duration:
if time.time() - created > duration:
# Expired
return None
actor = {"id": decoded["a"], "token": "dstok"}
if "_r" in decoded:
actor["_r"] = decoded["_r"]
if duration:
actor["token_expires"] = created + duration
return actor
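# Sketch (not part of this diff): the payload datasette.unsign() yields for a
# valid "dstok_" token above - "a" is the actor id, "t" the creation timestamp,
# "d" an optional duration in seconds, "_r" optional restrictions copied onto
# the actor. Values are illustrative.
_example_decoded_token = {
    "a": "root",
    "t": 1700000000,
    "d": 3600,
    "_r": {"a": ["view-instance"]},
}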
@hookimpl
def skip_csrf(scope):
# Skip CSRF check for requests with content-type: application/json
if scope["type"] == "http":
headers = scope.get("headers") or {}
if dict(headers).get(b"content-type") == b"application/json":
return True
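# Sketch (not part of this diff): a JSON POST that relies on the hook above -
# the application/json content-type means no csrftoken is needed. The URL,
# table and token value are hypothetical and assume a running Datasette.
import httpx

def _example_json_post():
    return httpx.post(
        "http://localhost:8001/data/docs/-/insert",
        json={"rows": [{"id": 1, "title": "Example"}]},
        headers={"Authorization": "Bearer dstok_xxx"},
    )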

datasette/events.py 100644

@ -0,0 +1,236 @@
from abc import ABC, abstractproperty
from dataclasses import asdict, dataclass, field
from datasette.hookspecs import hookimpl
from datetime import datetime, timezone
from typing import Optional
@dataclass
class Event(ABC):
@abstractproperty
def name(self):
pass
created: datetime = field(
init=False, default_factory=lambda: datetime.now(timezone.utc)
)
actor: Optional[dict]
def properties(self):
properties = asdict(self)
properties.pop("actor", None)
properties.pop("created", None)
return properties
@dataclass
class LoginEvent(Event):
"""
Event name: ``login``
A user (represented by ``event.actor``) has logged in.
"""
name = "login"
@dataclass
class LogoutEvent(Event):
"""
Event name: ``logout``
A user (represented by ``event.actor``) has logged out.
"""
name = "logout"
@dataclass
class CreateTokenEvent(Event):
"""
Event name: ``create-token``
A user created an API token.
:ivar expires_after: Number of seconds after which this token will expire.
:type expires_after: int or None
:ivar restrict_all: Restricted permissions for this token.
:type restrict_all: list
:ivar restrict_database: Restricted database permissions for this token.
:type restrict_database: dict
:ivar restrict_resource: Restricted resource permissions for this token.
:type restrict_resource: dict
"""
name = "create-token"
expires_after: Optional[int]
restrict_all: list
restrict_database: dict
restrict_resource: dict
@dataclass
class CreateTableEvent(Event):
"""
Event name: ``create-table``
A new table has been created in the database.
:ivar database: The name of the database where the table was created.
:type database: str
:ivar table: The name of the table that was created
:type table: str
:ivar schema: The SQL schema definition for the new table.
:type schema: str
"""
name = "create-table"
database: str
table: str
schema: str
@dataclass
class DropTableEvent(Event):
"""
Event name: ``drop-table``
A table has been dropped from the database.
:ivar database: The name of the database where the table was dropped.
:type database: str
:ivar table: The name of the table that was dropped
:type table: str
"""
name = "drop-table"
database: str
table: str
@dataclass
class AlterTableEvent(Event):
"""
Event name: ``alter-table``
A table has been altered.
:ivar database: The name of the database where the table was altered
:type database: str
:ivar table: The name of the table that was altered
:type table: str
:ivar before_schema: The table's SQL schema before the alteration
:type before_schema: str
:ivar after_schema: The table's SQL schema after the alteration
:type after_schema: str
"""
name = "alter-table"
database: str
table: str
before_schema: str
after_schema: str
@dataclass
class InsertRowsEvent(Event):
"""
Event name: ``insert-rows``
Rows were inserted into a table.
:ivar database: The name of the database where the rows were inserted.
:type database: str
:ivar table: The name of the table where the rows were inserted.
:type table: str
:ivar num_rows: The number of rows that were requested to be inserted.
:type num_rows: int
:ivar ignore: Was ignore set?
:type ignore: bool
:ivar replace: Was replace set?
:type replace: bool
"""
name = "insert-rows"
database: str
table: str
num_rows: int
ignore: bool
replace: bool
@dataclass
class UpsertRowsEvent(Event):
"""
Event name: ``upsert-rows``
Rows were upserted into a table.
:ivar database: The name of the database where the rows were upserted.
:type database: str
:ivar table: The name of the table where the rows were upserted.
:type table: str
:ivar num_rows: The number of rows that were requested to be upserted.
:type num_rows: int
"""
name = "upsert-rows"
database: str
table: str
num_rows: int
@dataclass
class UpdateRowEvent(Event):
"""
Event name: ``update-row``
A row was updated in a table.
:ivar database: The name of the database where the row was updated.
:type database: str
:ivar table: The name of the table where the row was updated.
:type table: str
:ivar pks: The primary key values of the updated row.
"""
name = "update-row"
database: str
table: str
pks: list
@dataclass
class DeleteRowEvent(Event):
"""
Event name: ``delete-row``
A row was deleted from a table.
:ivar database: The name of the database where the row was deleted.
:type database: str
:ivar table: The name of the table where the row was deleted.
:type table: str
:ivar pks: The primary key values of the deleted row.
"""
name = "delete-row"
database: str
table: str
pks: list
@hookimpl
def register_events():
return [
LoginEvent,
LogoutEvent,
CreateTableEvent,
CreateTokenEvent,
AlterTableEvent,
DropTableEvent,
InsertRowsEvent,
UpsertRowsEvent,
UpdateRowEvent,
DeleteRowEvent,
]
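# Sketch (not part of this diff): firing one of these events from application
# code; datasette.track_event() dispatches it to every track_event hook.
import asyncio
from datasette.app import Datasette

async def _demo_track_event():
    ds = Datasette(memory=True)
    await ds.invoke_startup()
    await ds.track_event(LogoutEvent(actor={"id": "root"}))

# asyncio.run(_demo_track_event())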


@ -11,8 +11,8 @@ from datasette.utils import (
)
def load_facet_configs(request, table_metadata):
# Given a request and the metadata configuration for a table, return
def load_facet_configs(request, table_config):
# Given a request and the configuration for a table, return
# a dictionary of selected facets, their lists of configs and for each
# config whether it came from the request or the metadata.
#
@ -20,21 +20,21 @@ def load_facet_configs(request, table_metadata):
# {"source": "metadata", "config": config1},
# {"source": "request", "config": config2}]}
facet_configs = {}
table_metadata = table_metadata or {}
metadata_facets = table_metadata.get("facets", [])
for metadata_config in metadata_facets:
if isinstance(metadata_config, str):
table_config = table_config or {}
table_facet_configs = table_config.get("facets", [])
for facet_config in table_facet_configs:
if isinstance(facet_config, str):
type = "column"
metadata_config = {"simple": metadata_config}
facet_config = {"simple": facet_config}
else:
assert (
len(metadata_config.values()) == 1
len(facet_config.values()) == 1
), "Metadata config dicts should be {type: config}"
type, metadata_config = list(metadata_config.items())[0]
if isinstance(metadata_config, str):
metadata_config = {"simple": metadata_config}
type, facet_config = list(facet_config.items())[0]
if isinstance(facet_config, str):
facet_config = {"simple": facet_config}
facet_configs.setdefault(type, []).append(
{"source": "metadata", "config": metadata_config}
{"source": "metadata", "config": facet_config}
)
qs_pairs = urllib.parse.parse_qs(request.query_string, keep_blank_values=True)
for key, values in qs_pairs.items():
@ -45,13 +45,12 @@ def load_facet_configs(request, table_metadata):
elif key.startswith("_facet_"):
type = key[len("_facet_") :]
for value in values:
# The value is the config - either JSON or not
if value.startswith("{"):
config = json.loads(value)
else:
config = {"simple": value}
# The value is the facet_config - either JSON or not
facet_config = (
json.loads(value) if value.startswith("{") else {"simple": value}
)
facet_configs.setdefault(type, []).append(
{"source": "request", "config": config}
{"source": "request", "config": facet_config}
)
return facet_configs
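# Sketch (not part of this diff): the structure load_facet_configs() returns
# for a request like /db/table?_facet=state&_facet_array=tags - column and
# facet names are illustrative.
_example_facet_configs = {
    "column": [{"source": "request", "config": {"simple": "state"}}],
    "array": [{"source": "request", "config": {"simple": "tags"}}],
}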
@ -75,7 +74,7 @@ class Facet:
sql=None,
table=None,
params=None,
metadata=None,
table_config=None,
row_count=None,
):
assert table or sql, "Must provide either table= or sql="
@ -86,12 +85,12 @@ class Facet:
self.table = table
self.sql = sql or f"select * from [{table}]"
self.params = params or []
self.metadata = metadata
self.table_config = table_config
# row_count can be None, in which case we calculate it ourselves:
self.row_count = row_count
def get_configs(self):
configs = load_facet_configs(self.request, self.metadata)
configs = load_facet_configs(self.request, self.table_config)
return configs.get(self.type) or []
def get_querystring_pairs(self):
@ -253,7 +252,7 @@ class ColumnFacet(Facet):
# Attempt to expand foreign keys into labels
values = [row["value"] for row in facet_rows]
expanded = await self.ds.expand_foreign_keys(
self.database, self.table, column, values
self.request.actor, self.database, self.table, column, values
)
else:
expanded = {}


@ -50,7 +50,7 @@ def search_filters(request, database, table, datasette):
extra_context = {}
# Figure out which fts_table to use
table_metadata = datasette.table_metadata(database, table)
table_metadata = await datasette.table_config(database, table)
db = datasette.get_database(database)
fts_table = request.args.get("_fts_table")
fts_table = fts_table or table_metadata.get("fts_table")
@ -80,9 +80,9 @@ def search_filters(request, database, table, datasette):
"{fts_pk} in (select rowid from {fts_table} where {fts_table} match {match_clause})".format(
fts_table=escape_sqlite(fts_table),
fts_pk=escape_sqlite(fts_pk),
match_clause=":search"
if search_mode_raw
else "escape_fts(:search)",
match_clause=(
":search" if search_mode_raw else "escape_fts(:search)"
),
)
)
human_descriptions.append(f'search matches "{search}"')
@ -99,9 +99,11 @@ def search_filters(request, database, table, datasette):
"rowid in (select rowid from {fts_table} where {search_col} match {match_clause})".format(
fts_table=escape_sqlite(fts_table),
search_col=escape_sqlite(search_col),
match_clause=":search_{}".format(i)
if search_mode_raw
else "escape_fts(:search_{})".format(i),
match_clause=(
":search_{}".format(i)
if search_mode_raw
else "escape_fts(:search_{})".format(i)
),
)
)
human_descriptions.append(
@ -279,6 +281,13 @@ class Filters:
'{c} contains "{v}"',
format="%{}%",
),
TemplatedFilter(
"notcontains",
"does not contain",
'"{c}" not like :{p}',
'{c} does not contain "{v}"',
format="%{}%",
),
TemplatedFilter(
"endswith",
"ends with",


@ -1,4 +1,3 @@
from os import stat
from datasette import hookimpl, Response


@ -1,14 +1,12 @@
from datasette import hookimpl, Response
from .utils import await_me_maybe, add_cors_headers
from .utils import add_cors_headers
from .utils.asgi import (
Base400,
Forbidden,
)
from .views.base import DatasetteError
from markupsafe import Markup
import pdb
import traceback
from .plugins import pm
try:
import rich
@ -57,7 +55,8 @@ def handle_exception(datasette, request, exception):
if request.path.split("?")[0].endswith(".json"):
return Response.json(info, status=status, headers=headers)
else:
template = datasette.jinja_env.select_template(templates)
environment = datasette.get_jinja_environment(request)
template = environment.select_template(templates)
return Response.html(
await template.render_async(
dict(


@ -60,7 +60,7 @@ def publish_subcommand(publish):
@hookspec
def render_cell(row, value, column, table, database, datasette):
def render_cell(row, value, column, table, database, datasette, request):
"""Customize rendering of HTML table cell values"""
@ -74,6 +74,11 @@ def register_facet_classes():
"""Register Facet subclasses"""
@hookspec
def register_permissions(datasette):
"""Register permissions: returns a list of datasette.permission.Permission named tuples"""
@hookspec
def register_routes(datasette):
"""Register URL routes: return a list of (regex, view_function) pairs"""
@ -89,6 +94,16 @@ def actor_from_request(datasette, request):
"""Return an actor dictionary based on the incoming request"""
@hookspec(firstresult=True)
def actors_from_ids(datasette, actor_ids):
"""Returns a dictionary mapping those IDs to actor dictionaries"""
@hookspec
def jinja2_environment_from_request(datasette, request, env):
"""Return a Jinja2 environment based on the incoming request"""
@hookspec
def filters_from_request(request, database, table, datasette):
"""
@ -125,16 +140,36 @@ def menu_links(datasette, actor, request):
"""Links for the navigation menu"""
@hookspec
def row_actions(datasette, actor, request, database, table, row):
"""Links for the row actions menu"""
@hookspec
def table_actions(datasette, actor, database, table, request):
"""Links for the table actions menu"""
@hookspec
def view_actions(datasette, actor, database, view, request):
"""Links for the view actions menu"""
@hookspec
def query_actions(datasette, actor, database, query_name, request, sql, params):
"""Links for the query and canned query actions menu"""
@hookspec
def database_actions(datasette, actor, database, request):
"""Links for the database actions menu"""
@hookspec
def homepage_actions(datasette, actor, request):
"""Links for the homepage actions menu"""
@hookspec
def skip_csrf(datasette, scope):
"""Mechanism for skipping CSRF checks for certain requests"""
@ -143,3 +178,43 @@ def skip_csrf(datasette, scope):
@hookspec
def handle_exception(datasette, request, exception):
"""Handle an uncaught exception. Can return a Response or None."""
@hookspec
def track_event(datasette, event):
"""Respond to an event tracked by Datasette"""
@hookspec
def register_events(datasette):
"""Return a list of Event subclasses to use with track_event()"""
@hookspec
def top_homepage(datasette, request):
"""HTML to include at the top of the homepage"""
@hookspec
def top_database(datasette, request, database):
"""HTML to include at the top of the database page"""
@hookspec
def top_table(datasette, request, database, table):
"""HTML to include at the top of the table page"""
@hookspec
def top_row(datasette, request, database, table, row):
"""HTML to include at the top of the row page"""
@hookspec
def top_query(datasette, request, database, sql):
"""HTML to include at the top of the query results page"""
@hookspec
def top_canned_query(datasette, request, database, query_name):
"""HTML to include at the top of the canned query page"""


@ -0,0 +1,16 @@
from dataclasses import dataclass
from typing import Optional
@dataclass
class Permission:
name: str
abbr: Optional[str]
description: Optional[str]
takes_database: bool
takes_resource: bool
default: bool
# This is deliberately undocumented: it's considered an internal
# implementation detail for view-table/view-database and should
# not be used by plugins as it may change in the future.
implies_can_view: bool = False
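# Sketch (not part of this diff): registering a plugin-defined permission via
# the register_permissions hook; the permission name and description are
# hypothetical and the module path is assumed to be datasette.permissions.
from datasette import hookimpl
from datasette.permissions import Permission

@hookimpl(specname="register_permissions")
def _example_register_permissions(datasette):
    return [
        Permission(
            name="backup-database",
            abbr=None,
            description="Create a backup of a database",
            takes_database=True,
            takes_resource=False,
            default=False,
        )
    ]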


@ -1,9 +1,20 @@
import importlib
import os
import pluggy
import pkg_resources
from pprint import pprint
import sys
from . import hookspecs
if sys.version_info >= (3, 9):
import importlib.resources as importlib_resources
else:
import importlib_resources
if sys.version_info >= (3, 10):
import importlib.metadata as importlib_metadata
else:
import importlib_metadata
DEFAULT_PLUGINS = (
"datasette.publish.heroku",
"datasette.publish.cloudrun",
@ -17,15 +28,59 @@ DEFAULT_PLUGINS = (
"datasette.default_menu_links",
"datasette.handle_exception",
"datasette.forbidden",
"datasette.events",
)
pm = pluggy.PluginManager("datasette")
pm.add_hookspecs(hookspecs)
if not hasattr(sys, "_called_from_test"):
DATASETTE_TRACE_PLUGINS = os.environ.get("DATASETTE_TRACE_PLUGINS", None)
def before(hook_name, hook_impls, kwargs):
print(file=sys.stderr)
print(f"{hook_name}:", file=sys.stderr)
pprint(kwargs, width=40, indent=4, stream=sys.stderr)
print("Hook implementations:", file=sys.stderr)
pprint(hook_impls, width=40, indent=4, stream=sys.stderr)
def after(outcome, hook_name, hook_impls, kwargs):
results = outcome.get_result()
if not isinstance(results, list):
results = [results]
print(f"Results:", file=sys.stderr)
pprint(results, width=40, indent=4, stream=sys.stderr)
if DATASETTE_TRACE_PLUGINS:
pm.add_hookcall_monitoring(before, after)
DATASETTE_LOAD_PLUGINS = os.environ.get("DATASETTE_LOAD_PLUGINS", None)
if not hasattr(sys, "_called_from_test") and DATASETTE_LOAD_PLUGINS is None:
# Only load plugins if not running tests
pm.load_setuptools_entrypoints("datasette")
# Load any plugins specified in DATASETTE_LOAD_PLUGINS
if DATASETTE_LOAD_PLUGINS is not None:
for package_name in [
name for name in DATASETTE_LOAD_PLUGINS.split(",") if name.strip()
]:
try:
distribution = importlib_metadata.distribution(package_name)
entry_points = distribution.entry_points
for entry_point in entry_points:
if entry_point.group == "datasette":
mod = entry_point.load()
pm.register(mod, name=entry_point.name)
# Ensure name can be found in plugin_to_distinfo later:
pm._plugin_distinfo.append((mod, distribution))
except importlib_metadata.PackageNotFoundError:
sys.stderr.write("Plugin {} could not be found\n".format(package_name))
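# Sketch (not part of this diff): how the two environment variables read above
# are typically set when launching Datasette (plugin names illustrative):
#
#   DATASETTE_TRACE_PLUGINS=1 datasette data.db              # log every hook call to stderr
#   DATASETTE_LOAD_PLUGINS=datasette-vega datasette data.db  # load only this plugin
#   DATASETTE_LOAD_PLUGINS='' datasette data.db              # load no extra plugins at all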
# Load default plugins
for plugin in DEFAULT_PLUGINS:
mod = importlib.import_module(plugin)
@ -40,16 +95,16 @@ def get_plugins():
templates_path = None
if plugin.__name__ not in DEFAULT_PLUGINS:
try:
if pkg_resources.resource_isdir(plugin.__name__, "static"):
static_path = pkg_resources.resource_filename(
plugin.__name__, "static"
if (importlib_resources.files(plugin.__name__) / "static").is_dir():
static_path = str(
importlib_resources.files(plugin.__name__) / "static"
)
if pkg_resources.resource_isdir(plugin.__name__, "templates"):
templates_path = pkg_resources.resource_filename(
plugin.__name__, "templates"
if (importlib_resources.files(plugin.__name__) / "templates").is_dir():
templates_path = str(
importlib_resources.files(plugin.__name__) / "templates"
)
except (KeyError, ImportError):
# Caused by --plugins_dir= plugins - KeyError/ImportError thrown in Py3.5
except (TypeError, ModuleNotFoundError):
# Caused by --plugins_dir= plugins
pass
plugin_info = {
"name": plugin.__name__,
@ -60,6 +115,6 @@ def get_plugins():
distinfo = plugin_to_distinfo.get(plugin)
if distinfo:
plugin_info["version"] = distinfo.version
plugin_info["name"] = distinfo.project_name
plugin_info["name"] = distinfo.name or distinfo.project_name
plugins.append(plugin_info)
return plugins


@ -173,7 +173,7 @@ def publish_subcommand(publish):
print(fp.read())
print("\n====================\n")
image_id = f"gcr.io/{project}/{name}"
image_id = f"gcr.io/{project}/datasette-{service}"
check_call(
"gcloud builds submit --tag {}{}".format(
image_id, " --timeout {}".format(timeout) if timeout else ""


@ -3,7 +3,9 @@ from datasette import hookimpl
import click
import json
import os
import pathlib
import shlex
import shutil
from subprocess import call, check_output
import tempfile
@ -28,6 +30,11 @@ def publish_subcommand(publish):
"--tar",
help="--tar option to pass to Heroku, e.g. --tar=/usr/local/bin/gtar",
)
@click.option(
"--generate-dir",
type=click.Path(dir_okay=True, file_okay=False),
help="Output generated application files and stop without deploying",
)
def heroku(
files,
metadata,
@ -49,6 +56,7 @@ def publish_subcommand(publish):
about_url,
name,
tar,
generate_dir,
):
"Publish databases to Datasette running on Heroku"
fail_if_publish_binary_not_installed(
@ -105,6 +113,16 @@ def publish_subcommand(publish):
secret,
extra_metadata,
):
if generate_dir:
# Recursively copy files from current working directory to it
if pathlib.Path(generate_dir).exists():
raise click.ClickException("Directory already exists")
shutil.copytree(".", generate_dir)
click.echo(
f"Generated files written to {generate_dir}, stopping without deploying",
err=True,
)
return
app_name = None
if name:
# Check to see if this app already exists
@ -176,7 +194,7 @@ def temporary_heroku_directory(
fp.write(json.dumps(metadata_content, indent=2))
with open("runtime.txt", "w") as fp:
fp.write("python-3.8.10")
fp.write("python-3.11.0")
if branch:
install = [


@ -4,6 +4,7 @@ from datasette.utils import (
remove_infinites,
CustomJSONEncoder,
path_from_row_pks,
sqlite3,
)
from datasette.utils.asgi import Response
@ -26,7 +27,7 @@ def convert_specific_columns_to_json(rows, columns, json_cols):
return new_rows
def json_renderer(args, data, view_name):
def json_renderer(request, args, data, error, truncated=None):
"""Render a response as JSON"""
status_code = 200
@ -44,28 +45,39 @@ def json_renderer(args, data, view_name):
data["rows"] = [remove_infinites(row) for row in data["rows"]]
# Deal with the _shape option
shape = args.get("_shape", "arrays")
shape = args.get("_shape", "objects")
# if there's an error, ignore the shape entirely
if data.get("error"):
shape = "arrays"
data["ok"] = True
if error:
shape = "objects"
status_code = 400
data["error"] = error
data["ok"] = False
next_url = data.get("next_url")
if truncated is not None:
data["truncated"] = truncated
if shape == "arrayfirst":
data = [row[0] for row in data["rows"]]
if not data["rows"]:
data = []
elif isinstance(data["rows"][0], sqlite3.Row):
data = [row[0] for row in data["rows"]]
else:
assert isinstance(data["rows"][0], dict)
data = [next(iter(row.values())) for row in data["rows"]]
elif shape in ("objects", "object", "array"):
columns = data.get("columns")
rows = data.get("rows")
if rows and columns:
if rows and columns and not isinstance(rows[0], dict):
data["rows"] = [dict(zip(columns, row)) for row in rows]
if shape == "object":
error = None
shape_error = None
if "primary_keys" not in data:
error = "_shape=object is only available on tables"
shape_error = "_shape=object is only available on tables"
else:
pks = data["primary_keys"]
if not pks:
error = (
shape_error = (
"_shape=object not available for tables with no primary keys"
)
else:
@ -74,13 +86,18 @@ def json_renderer(args, data, view_name):
pk_string = path_from_row_pks(row, pks, not pks)
object_rows[pk_string] = row
data = object_rows
if error:
data = {"ok": False, "error": error}
if shape_error:
data = {"ok": False, "error": shape_error}
elif shape == "array":
data = data["rows"]
elif shape == "arrays":
pass
if not data["rows"]:
pass
elif isinstance(data["rows"][0], sqlite3.Row):
data["rows"] = [list(row) for row in data["rows"]]
else:
data["rows"] = [list(row.values()) for row in data["rows"]]
else:
status_code = 400
data = {
@ -89,6 +106,12 @@ def json_renderer(args, data, view_name):
"status": 400,
"title": None,
}
# Don't include "columns" in output
# https://github.com/simonw/datasette/issues/2136
if isinstance(data, dict) and "columns" not in request.args.getlist("_extra"):
data.pop("columns", None)
# Handle _nl option for _shape=array
nl = args.get("_nl", "")
if nl and shape == "array":
@ -98,8 +121,6 @@ def json_renderer(args, data, view_name):
body = json.dumps(data, cls=CustomJSONEncoder)
content_type = "application/json; charset=utf-8"
headers = {}
if next_url:
headers["link"] = f'<{next_url}>; rel="next"'
return Response(
body, status=status_code, headers=headers, content_type=content_type
)
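# Sketch (not part of this diff): what the _shape handling above produces for
# rows [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}] with primary key "id":
#
#   _shape=objects     {"ok": true, "rows": [{"id": 1, "name": "a"}, ...]}
#   _shape=arrays      {"ok": true, "rows": [[1, "a"], [2, "b"]]}
#   _shape=array       [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
#   _shape=arrayfirst  [1, 2]
#   _shape=object      {"1": {"id": 1, "name": "a"}, "2": {"id": 2, "name": "b"}}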


@ -163,28 +163,22 @@ h6,
}
.page-header {
display: flex;
align-items: center;
padding-left: 10px;
border-left: 10px solid #666;
margin-bottom: 0.75rem;
margin-top: 1rem;
}
.page-header h1 {
display: inline;
margin: 0;
font-size: 2rem;
padding-right: 0.2em;
}
.page-header details {
display: inline-flex;
}
.page-header details > summary {
.page-action-menu details > summary {
list-style: none;
display: inline-flex;
cursor: pointer;
}
.page-header details > summary::-webkit-details-marker {
.page-action-menu details > summary::-webkit-details-marker {
display: none;
}
@ -275,6 +269,7 @@ header,
footer {
padding: 0.6rem 1rem 0.5rem 1rem;
background-color: #276890;
background: linear-gradient(180deg, rgba(96,144,173,1) 0%, rgba(39,104,144,1) 50%);
color: rgba(255,255,244,0.9);
overflow: hidden;
box-sizing: border-box;
@ -352,25 +347,59 @@ details.nav-menu > summary::-webkit-details-marker {
}
details .nav-menu-inner {
position: absolute;
top: 2rem;
top: 2.6rem;
right: 10px;
width: 180px;
background-color: #276890;
padding: 1rem;
z-index: 1000;
padding: 0;
}
.nav-menu-inner li,
form.nav-menu-logout {
padding: 0.3rem 0.5rem;
border-top: 1px solid #ffffff69;
}
.nav-menu-inner a {
display: block;
}
/* Table/database actions menu */
.page-header {
.page-action-menu {
position: relative;
margin-bottom: 0.5em;
}
.actions-menu-links {
display: inline;
}
.actions-menu-links .dropdown-menu {
position: absolute;
top: calc(100% + 10px);
left: -10px;
left: 0;
z-index: 10000;
}
.page-action-menu .icon-text {
display: inline-flex;
align-items: center;
border-radius: .25rem;
padding: 5px 12px 3px 7px;
color: #fff;
font-weight: 400;
font-size: 0.8em;
background: linear-gradient(180deg, #007bff 0%, #4E79C7 100%);
border-color: #007bff;
}
.page-action-menu .icon-text span {
/* Nudge text up a bit */
position: relative;
top: -2px;
}
.page-action-menu .icon-text:hover {
cursor: pointer;
}
.page-action-menu .icon {
width: 18px;
height: 18px;
margin-right: 4px;
}
/* Components ============================================================== */
@ -482,20 +511,18 @@ form.sql textarea {
font-family: monospace;
font-size: 1.3em;
}
form.sql label {
width: 15%;
}
form label {
font-weight: bold;
display: inline-block;
width: 15%;
}
.advanced-export form label {
width: auto;
}
.advanced-export input[type=submit] {
font-size: 0.6em;
margin-left: 1em;
}
label.sort_by_desc {
width: auto;
padding-right: 1em;
}
pre#sql-query {
@ -538,7 +565,7 @@ form input[type=submit], form button[type=button] {
form input[type=submit] {
color: #fff;
background-color: #007bff;
background: linear-gradient(180deg, #007bff 0%, #4E79C7 100%);
border-color: #007bff;
-webkit-appearance: button;
}
@ -573,6 +600,9 @@ form button[type=button] {
display: inline-block;
margin-right: 0.3em;
}
.select-wrapper:focus-within {
border: 1px solid black;
}
.select-wrapper.filter-op {
width: 80px;
}
@ -818,6 +848,13 @@ svg.dropdown-menu-icon {
.dropdown-menu a:hover {
background-color: #eee;
}
.dropdown-menu .dropdown-description {
margin: 0;
color: #666;
font-size: 0.8em;
max-width: 80vw;
white-space: normal;
}
.dropdown-menu .hook {
display: block;
position: absolute;

File diff suppressed because one or more lines are too long


@ -0,0 +1,74 @@
import { EditorView, basicSetup } from "codemirror";
import { keymap } from "@codemirror/view";
import { sql, SQLDialect } from "@codemirror/lang-sql";
// A variation of SQLite from lang-sql https://github.com/codemirror/lang-sql/blob/ebf115fffdbe07f91465ccbd82868c587f8182bc/src/sql.ts#L231
const SQLite = SQLDialect.define({
// Based on https://www.sqlite.org/lang_keywords.html based on likely keywords to be used in select queries
// https://github.com/simonw/datasette/pull/1893#issuecomment-1316401895:
keywords:
"and as asc between by case cast count current_date current_time current_timestamp desc distinct each else escape except exists explain filter first for from full generated group having if in index inner intersect into isnull join last left like limit not null or order outer over pragma primary query raise range regexp right rollback row select set table then to union unique using values view virtual when where",
// https://www.sqlite.org/datatype3.html
types: "null integer real text blob",
builtin: "",
operatorChars: "*+-%<>!=&|/~",
identifierQuotes: '`"',
specialVar: "@:?$",
});
// Utility function from https://codemirror.net/docs/migration/
export function editorFromTextArea(textarea, conf = {}) {
// This could also be configured with a set of tables and columns for better autocomplete:
// https://github.com/codemirror/lang-sql#user-content-sqlconfig.tables
let view = new EditorView({
doc: textarea.value,
extensions: [
keymap.of([
{
key: "Shift-Enter",
run: function () {
textarea.value = view.state.doc.toString();
textarea.form.submit();
return true;
},
},
{
key: "Meta-Enter",
run: function () {
textarea.value = view.state.doc.toString();
textarea.form.submit();
return true;
},
},
]),
// This has to be after the keymap or else the basicSetup keys will prevent
// Meta-Enter from running
basicSetup,
EditorView.lineWrapping,
sql({
dialect: SQLite,
schema: conf.schema,
tables: conf.tables,
defaultTableName: conf.defaultTableName,
defaultSchemaName: conf.defaultSchemaName,
}),
],
});
// Idea taken from https://discuss.codemirror.net/t/resizing-codemirror-6/3265.
// Using CSS resize: both and scheduling a measurement when the element changes.
let editorDOM = view.contentDOM.closest(".cm-editor");
let observer = new ResizeObserver(function () {
view.requestMeasure();
});
observer.observe(editorDOM, { attributes: true });
textarea.parentNode.insertBefore(view.dom, textarea);
textarea.style.display = "none";
if (textarea.form) {
textarea.form.addEventListener("submit", () => {
textarea.value = view.state.doc.toString();
});
}
return view;
}


@ -1,8 +0,0 @@
/*!
* cm-resize v1.0.1
* https://github.com/Sphinxxxx/cm-resize
*
* Copyright 2017-2018 Andreas Borgen (https://github.com/Sphinxxxx)
* Released under the MIT license.
*/
!function(e,t){"object"==typeof exports&&"undefined"!=typeof module?module.exports=t():"function"==typeof define&&define.amd?define(t):e.cmResize=t()}(this,function(){"use strict";return document.documentElement.firstElementChild.appendChild(document.createElement("style")).textContent=".cm-resize-handle{display:block;position:absolute;bottom:0;right:0;z-index:99;width:18px;height:18px;background:url(\"data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' width='30' height='30' viewBox='0,0 16,16'%3E%3Cpath stroke='gray' stroke-width='2' d='M-1,12 l18,-18 M-1,18 l18,-18 M-1,24 l18,-18 M-1,30 l18,-18'/%3E%3C/svg%3E\") center/cover;box-shadow:inset -1px -1px 0 0 silver;cursor:nwse-resize}",function(r,e){var t,c=(e=e||{}).minWidth||200,l=e.minHeight||100,s=!1!==e.resizableWidth,d=!1!==e.resizableHeight,n=e.cssClass||"cm-resize-handle",o=r.display.wrapper,i=e.handle||((t=o.appendChild(document.createElement("div"))).className=n,t),a=o.querySelector(".CodeMirror-vscrollbar"),u=o.querySelector(".CodeMirror-hscrollbar");function h(){e.handle||(a.style.bottom="18px",u.style.right="18px")}r.on("update",h),h();var f=void 0,m=void 0;return function(e){var t=Element.prototype;t.matches||(t.matches=t.msMatchesSelector||t.webkitMatchesSelector),t.closest||(t.closest=function(e){var t=this;do{if(t.matches(e))return t;t="svg"===t.tagName?t.parentNode:t.parentElement}while(t);return null});var l=(e=e||{}).container||document.documentElement,n=e.selector,o=e.callback||console.log,i=e.callbackDragStart,a=e.callbackDragEnd,r=e.callbackClick,c=e.propagateEvents,s=!1!==e.roundCoords,d=!1!==e.dragOutside,u=e.handleOffset||!1!==e.handleOffset,h=null;switch(u){case"center":h=!0;break;case"topleft":case"top-left":h=!1}var f=void 0,m=void 0,p=void 0;function v(e,t,n,o){var i=e.clientX,a=e.clientY;function r(e,t,n){return Math.max(t,Math.min(e,n))}if(t){var c=t.getBoundingClientRect();i-=c.left,a-=c.top,n&&(i-=n[0],a-=n[1]),o&&(i=r(i,0,c.width),a=r(a,0,c.height)),t!==l&&(null!==h?h:"circle"===t.nodeName||"ellipse"===t.nodeName)&&(i-=c.width/2,a-=c.height/2)}return s?[Math.round(i),Math.round(a)]:[i,a]}function g(e){e.preventDefault(),c||e.stopPropagation()}function w(e){(f=n?n instanceof Element?n.contains(e.target)?n:null:e.target.closest(n):{})&&(g(e),m=n&&u?v(e,f):[0,0],p=v(e,l,m),s&&(p=p.map(Math.round)),i&&i(f,p))}function b(e){if(f){g(e);var t=v(e,l,m,!d);o(f,t,p)}}function E(e){if(f){if(a||r){var t=v(e,l,m,!d);r&&p[0]===t[0]&&p[1]===t[1]&&r(f,p),a&&a(f,t,p)}f=null}}function x(e){E(C(e))}function M(e){return void 0!==e.buttons?1===e.buttons:1===e.which}function k(e,t){1===e.touches.length?t(C(e)):E(e)}function C(e){var t=e.targetTouches[0];return t||(t=e.changedTouches[0]),t.preventDefault=e.preventDefault.bind(e),t.stopPropagation=e.stopPropagation.bind(e),t}l.addEventListener("mousedown",function(e){M(e)&&w(e)}),l.addEventListener("touchstart",function(e){k(e,w)}),window.addEventListener("mousemove",function(e){f&&(M(e)?b(e):E(e))}),window.addEventListener("touchmove",function(e){k(e,b)}),window.addEventListener("mouseup",function(e){f&&!M(e)&&E(e)}),l.addEventListener("touchend",x),l.addEventListener("touchcancel",x)}({container:i.offsetParent,selector:i,callbackDragStart:function(e,t){f=t,m=[o.clientWidth,o.clientHeight]},callback:function(e,t){var n=t[0]-f[0],o=t[1]-f[1],i=s?Math.max(c,m[0]+n):null,a=d?Math.max(l,m[1]+o):null;r.setSize(i,a)}}),i}});

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@ -0,0 +1,210 @@
// Custom events for use with the native CustomEvent API
const DATASETTE_EVENTS = {
INIT: "datasette_init", // returns datasette manager instance in evt.detail
};
// Datasette "core" -> Methods/APIs that are foundational
// Plugins will have greater stability if they use the functional hooks- but if they do decide to hook into
// literal DOM selectors, they'll have an easier time using these addresses.
const DOM_SELECTORS = {
/** Should have one match */
jsonExportLink: ".export-links a[href*=json]",
/** Event listeners that go outside of the main table, e.g. existing scroll listener */
tableWrapper: ".table-wrapper",
table: "table.rows-and-columns",
aboveTablePanel: ".above-table-panel",
// These could have multiple matches
/** Used for selecting table headers. Use makeColumnActions if you want to add menu items. */
tableHeaders: `table.rows-and-columns th`,
/** Used to add "where" clauses to query using direct manipulation */
filterRows: ".filter-row",
/** Used to show top available enum values for a column ("facets") */
facetResults: ".facet-results [data-column]",
};
/**
* Monolith class for interacting with Datasette JS API
* Imported with DEFER, runs after main document parsed
* For now, manually synced with datasette/version.py
*/
const datasetteManager = {
VERSION: window.datasetteVersion,
// TODO: Should order of registration matter more?
// Should plugins be allowed to clobber others or is it last-in takes priority?
// Does pluginMetadata need to be serializable, or can we let it be stateful / have functions?
plugins: new Map(),
registerPlugin: (name, pluginMetadata) => {
if (datasetteManager.plugins.has(name)) {
console.warn(`Warning -> plugin ${name} was redefined`);
}
datasetteManager.plugins.set(name, pluginMetadata);
// If the plugin participates in the panel... update the panel.
if (pluginMetadata.makeAboveTablePanelConfigs) {
datasetteManager.renderAboveTablePanel();
}
},
/**
* New DOM elements are created on each click, so the data is not stale.
*
* Items
* - must provide label (text)
* - might provide href (string) or an onclick ((evt) => void)
*
* columnMeta is metadata stored on the column header (TH) as a DOMStringMap
* - column: string
* - columnNotNull: boolean
* - columnType: sqlite datatype enum (text, number, etc)
* - isPk: boolean
*/
makeColumnActions: (columnMeta) => {
let columnActions = [];
// Accept function that returns list of columnActions with keys
// Required: label (text)
// Optional: onClick or href
datasetteManager.plugins.forEach((plugin) => {
if (plugin.makeColumnActions) {
// Plugins can provide multiple columnActions if they want
// If multiple plugins provide an entry with the same label, the last one replaces the others
columnActions.push(...plugin.makeColumnActions(columnMeta));
}
});
// TODO: Validate columnAction configs and give informative error message if missing keys.
return columnActions;
},
/**
* In MVP, each plugin can only have 1 instance.
* In future, panels could be repeated. We omit that for now since so many plugins depend on
* shared URL state, so having multiple instances of plugin at same time is problematic.
* Currently, we never destroy any panels, we just hide them.
*
* TODO: nicer panel css, show panel selection state.
* TODO: does this hook need to take any arguments?
*/
renderAboveTablePanel: () => {
const aboveTablePanel = document.querySelector(
DOM_SELECTORS.aboveTablePanel
);
if (!aboveTablePanel) {
console.warn(
"This page does not have a table, the renderAboveTablePanel cannot be used."
);
return;
}
let aboveTablePanelWrapper = aboveTablePanel.querySelector(".panels");
// First render: create wrappers. Otherwise, reuse previous.
if (!aboveTablePanelWrapper) {
aboveTablePanelWrapper = document.createElement("div");
aboveTablePanelWrapper.classList.add("tab-contents");
const panelNav = document.createElement("div");
panelNav.classList.add("tab-controls");
// Temporary: css for minimal amount of breathing room.
panelNav.style.display = "flex";
panelNav.style.gap = "8px";
panelNav.style.marginTop = "4px";
panelNav.style.marginBottom = "20px";
aboveTablePanel.appendChild(panelNav);
aboveTablePanel.appendChild(aboveTablePanelWrapper);
}
datasetteManager.plugins.forEach((plugin, pluginName) => {
const { makeAboveTablePanelConfigs } = plugin;
if (makeAboveTablePanelConfigs) {
const controls = aboveTablePanel.querySelector(".tab-controls");
const contents = aboveTablePanel.querySelector(".tab-contents");
// Each plugin can make multiple panels
const configs = makeAboveTablePanelConfigs();
configs.forEach((config, i) => {
const nodeContentId = `${pluginName}_${config.id}_panel-content`;
// quit if we've already registered this plugin
// TODO: look into whether plugins should be allowed to ask
// parent to re-render, or if they should manage that internally.
if (document.getElementById(nodeContentId)) {
return;
}
// Add tab control button
const pluginControl = document.createElement("button");
pluginControl.textContent = config.label;
pluginControl.onclick = () => {
contents.childNodes.forEach((node) => {
if (node.id === nodeContentId) {
node.style.display = "block";
} else {
node.style.display = "none";
}
});
};
controls.appendChild(pluginControl);
// Add plugin content area
const pluginNode = document.createElement("div");
pluginNode.id = nodeContentId;
config.render(pluginNode);
pluginNode.style.display = "none"; // Default to hidden unless you're first
contents.appendChild(pluginNode);
});
// Let first node be selected by default
if (contents.childNodes.length) {
contents.childNodes[0].style.display = "block";
}
}
});
},
/** Selectors for document (DOM) elements. Store identifier instead of immediate references in case they haven't loaded when Manager starts. */
selectors: DOM_SELECTORS,
// Future API ideas
// Fetch page's data in array, and cache so plugins could reuse it
// Provide knowledge of what datasette JS or server-side via traditional console autocomplete
// State helpers: URL params https://github.com/simonw/datasette/issues/1144 and localstorage
// UI Hooks: command + k, tab manager hook
// Should we notify plugins that have dependencies
// when all dependencies were fulfilled? (leaflet, codemirror, etc)
// https://github.com/simonw/datasette-leaflet -> this way
// multiple plugins can all request the same copy of leaflet.
};
const initializeDatasette = () => {
// Hide the global behind __ prefix. Ideally they should be listening for the
// DATASETTE_EVENTS.INIT event to avoid the habit of reading from the window.
window.__DATASETTE__ = datasetteManager;
console.debug("Datasette Manager Created!");
const initDatasetteEvent = new CustomEvent(DATASETTE_EVENTS.INIT, {
detail: datasetteManager,
});
document.dispatchEvent(initDatasetteEvent);
};
/**
* Main function
* Fires AFTER the document has been parsed
*/
document.addEventListener("DOMContentLoaded", function () {
initializeDatasette();
});


@ -0,0 +1,56 @@
/*
https://github.com/luyilin/json-format-highlight
From https://unpkg.com/json-format-highlight@1.0.1/dist/json-format-highlight.js
MIT Licensed
*/
(function (global, factory) {
typeof exports === "object" && typeof module !== "undefined"
? (module.exports = factory())
: typeof define === "function" && define.amd
? define(factory)
: (global.jsonFormatHighlight = factory());
})(this, function () {
"use strict";
var defaultColors = {
keyColor: "dimgray",
numberColor: "lightskyblue",
stringColor: "lightcoral",
trueColor: "lightseagreen",
falseColor: "#f66578",
nullColor: "cornflowerblue",
};
function index(json, colorOptions) {
if (colorOptions === void 0) colorOptions = {};
if (!json) {
return;
}
if (typeof json !== "string") {
json = JSON.stringify(json, null, 2);
}
var colors = Object.assign({}, defaultColors, colorOptions);
json = json.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
return json.replace(
/("(\\u[a-zA-Z0-9]{4}|\\[^u]|[^\\"])*"(\s*:)?|\b(true|false|null)\b|-?\d+(?:\.\d*)?(?:[eE][+]?\d+)?)/g,
function (match) {
var color = colors.numberColor;
if (/^"/.test(match)) {
color = /:$/.test(match) ? colors.keyColor : colors.stringColor;
} else {
color = /true/.test(match)
? colors.trueColor
: /false/.test(match)
? colors.falseColor
: /null/.test(match)
? colors.nullColor
: color;
}
return '<span style="color: ' + color + '">' + match + "</span>";
}
);
}
return index;
});


@ -17,7 +17,8 @@ var DROPDOWN_ICON_SVG = `<svg xmlns="http://www.w3.org/2000/svg" width="14" heig
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg>`;
(function () {
/** Main initialization function for Datasette Table interactions */
const initDatasetteTable = function (manager) {
// Feature detection
if (!window.URLSearchParams) {
return;
@ -68,13 +69,11 @@ var DROPDOWN_ICON_SVG = `<svg xmlns="http://www.w3.org/2000/svg" width="14" heig
menu.style.display = "none";
menu.classList.remove("anim-scale-in");
}
// When page loads, add scroll listener on .table-wrapper
document.addEventListener("DOMContentLoaded", () => {
var tableWrapper = document.querySelector(".table-wrapper");
if (tableWrapper) {
tableWrapper.addEventListener("scroll", closeMenu);
}
});
const tableWrapper = document.querySelector(manager.selectors.tableWrapper);
if (tableWrapper) {
tableWrapper.addEventListener("scroll", closeMenu);
}
document.body.addEventListener("click", (ev) => {
/* was this click outside the menu? */
var target = ev.target;
@ -85,9 +84,11 @@ var DROPDOWN_ICON_SVG = `<svg xmlns="http://www.w3.org/2000/svg" width="14" heig
closeMenu();
}
});
function iconClicked(ev) {
function onTableHeaderClick(ev) {
ev.preventDefault();
ev.stopPropagation();
menu.innerHTML = DROPDOWN_HTML;
var th = ev.target;
while (th.nodeName != "TH") {
th = th.parentNode;
@ -185,7 +186,59 @@ var DROPDOWN_ICON_SVG = `<svg xmlns="http://www.w3.org/2000/svg" width="14" heig
menu.style.left = menuLeft + "px";
menu.style.display = "block";
menu.classList.add("anim-scale-in");
// Custom menu items on each render
// Plugin hook: allow adding JS-based additional menu items
const columnActionsPayload = {
columnName: th.dataset.column,
columnNotNull: th.dataset.columnNotNull === '1',
columnType: th.dataset.columnType,
isPk: th.dataset.isPk === '1'
};
const columnItemConfigs = manager.makeColumnActions(columnActionsPayload);
const menuList = menu.querySelector('ul');
columnItemConfigs.forEach(itemConfig => {
// Remove items from previous render. We assume entries have unique labels.
const existingItems = menuList.querySelectorAll(`li`);
Array.from(existingItems).filter(item => item.innerText === itemConfig.label).forEach(node => {
node.remove();
});
const newLink = document.createElement('a');
newLink.textContent = itemConfig.label;
newLink.href = itemConfig.href ?? '#';
if (itemConfig.onClick) {
newLink.onclick = itemConfig.onClick;
}
// Attach new elements to DOM
const menuItem = document.createElement('li');
menuItem.appendChild(newLink);
menuList.appendChild(menuItem);
});
// Measure width of menu and adjust position if too far right
const menuWidth = menu.offsetWidth;
const windowWidth = window.innerWidth;
if (menuLeft + menuWidth > windowWidth) {
menu.style.left = windowWidth - menuWidth - 20 + "px";
}
// Align menu .hook arrow with the column cog icon
const hook = menu.querySelector('.hook');
const icon = th.querySelector('.dropdown-menu-icon');
const iconRect = icon.getBoundingClientRect();
const hookLeft = (iconRect.left - menuLeft + 1) + 'px';
hook.style.left = hookLeft;
// Move the whole menu right if the hook is too far right
const menuRect = menu.getBoundingClientRect();
if (iconRect.right > menuRect.right) {
menu.style.left = (iconRect.right - menuWidth) + 'px';
// And move hook tip as well
hook.style.left = (menuWidth - 13) + 'px';
}
}
var svg = document.createElement("div");
svg.innerHTML = DROPDOWN_ICON_SVG;
svg = svg.querySelector("*");
@ -197,21 +250,21 @@ var DROPDOWN_ICON_SVG = `<svg xmlns="http://www.w3.org/2000/svg" width="14" heig
menu.style.display = "none";
document.body.appendChild(menu);
var ths = Array.from(document.querySelectorAll(".rows-and-columns th"));
var ths = Array.from(document.querySelectorAll(manager.selectors.tableHeaders));
ths.forEach((th) => {
if (!th.querySelector("a")) {
return;
}
var icon = svg.cloneNode(true);
icon.addEventListener("click", iconClicked);
icon.addEventListener("click", onTableHeaderClick);
th.appendChild(icon);
});
})();
};
/* Add x buttons to the filter rows */
(function () {
function addButtonsToFilterRows(manager) {
var x = "✖";
var rows = Array.from(document.querySelectorAll(".filter-row")).filter((el) =>
var rows = Array.from(document.querySelectorAll(manager.selectors.filterRow)).filter((el) =>
el.querySelector(".filter-op")
);
rows.forEach((row) => {
@ -234,4 +287,53 @@ var DROPDOWN_ICON_SVG = `<svg xmlns="http://www.w3.org/2000/svg" width="14" heig
a.style.display = "none";
}
});
})();
};
/* Set up datalist autocomplete for filter values */
function initAutocompleteForFilterValues(manager) {
function createDataLists() {
var facetResults = document.querySelectorAll(
manager.selectors.facetResults
);
Array.from(facetResults).forEach(function (facetResult) {
// Use link text from all links in the facet result
var links = Array.from(
facetResult.querySelectorAll("li:not(.facet-truncated) a")
);
// Create a datalist element
var datalist = document.createElement("datalist");
datalist.id = "datalist-" + facetResult.dataset.column;
// Create an option element for each link text
links.forEach(function (link) {
var option = document.createElement("option");
option.label = link.innerText;
option.value = link.dataset.facetValue;
datalist.appendChild(option);
});
// Add the datalist to the facet result
facetResult.appendChild(datalist);
});
}
createDataLists();
// When any select with name=_filter_column changes, update the datalist
document.body.addEventListener("change", function (event) {
if (event.target.name === "_filter_column") {
event.target
.closest(manager.selectors.filterRow)
.querySelector(".filter-value")
.setAttribute("list", "datalist-" + event.target.value);
}
});
};
// Ensures Table UI is initialized only after the Manager is ready.
document.addEventListener("datasette_init", function (evt) {
const { detail: manager } = evt;
// Main table
initDatasetteTable(manager);
// Other UI functions with interactive JS needs
addButtonsToFilterRows(manager);
initAutocompleteForFilterValues(manager);
});


@ -0,0 +1,28 @@
{% if action_links %}
<div class="page-action-menu">
<details class="actions-menu-links details-menu">
<summary>
<div class="icon-text">
<svg class="icon" aria-labelledby="actions-menu-links-title" role="img" style="color: #fff" xmlns="http://www.w3.org/2000/svg" width="28" height="28" viewBox="0 0 28 28" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">{{ action_title }}</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg>
<span>{{ action_title }}</span>
</div>
</summary>
<div class="dropdown-menu">
<div class="hook"></div>
<ul>
{% for link in action_links %}
<li><a href="{{ link.href }}">{{ link.label }}
{% if link.description %}
<p class="dropdown-description">{{ link.description }}</p>
{% endif %}</a>
</li>
{% endfor %}
</ul>
</div>
</details>
</div>
{% endif %}


@ -9,7 +9,7 @@ document.body.addEventListener('click', (ev) => {
if (target && target.tagName == 'DETAILS') {
detailsClickedWithin = target;
}
Array.from(document.getElementsByTagName('details')).filter(
Array.from(document.querySelectorAll('details.details-menu')).filter(
(details) => details.open && details != detailsClickedWithin
).forEach(details => details.open = false);
});


@ -1,14 +1,16 @@
<script src="{{ base_url }}-/static/sql-formatter-2.3.3.min.js" defer></script>
<script src="{{ base_url }}-/static/codemirror-5.57.0.min.js"></script>
<link rel="stylesheet" href="{{ base_url }}-/static/codemirror-5.57.0.min.css" />
<script src="{{ base_url }}-/static/codemirror-5.57.0-sql.min.js"></script>
<script src="{{ base_url }}-/static/cm-resize-1.0.1.min.js"></script>
<script src="{{ base_url }}-/static/cm-editor-6.0.1.bundle.js"></script>
<style>
.CodeMirror { height: auto; min-height: 70px; width: 80%; border: 1px solid #ddd; }
.cm-resize-handle {
background: url("data:image/svg+xml,%3Csvg%20aria-labelledby%3D%22cm-drag-to-resize%22%20role%3D%22img%22%20fill%3D%22%23ccc%22%20stroke%3D%22%23ccc%22%20xmlns%3D%22http%3A%2F%2Fwww.w3.org%2F2000%2Fsvg%22%20viewBox%3D%220%200%2016%2016%22%20width%3D%2216%22%20height%3D%2216%22%3E%0A%20%20%3Ctitle%20id%3D%22cm-drag-to-resize%22%3EDrag%20to%20resize%3C%2Ftitle%3E%0A%20%20%3Cpath%20fill-rule%3D%22evenodd%22%20d%3D%22M1%202.75A.75.75%200%20011.75%202h12.5a.75.75%200%20110%201.5H1.75A.75.75%200%20011%202.75zm0%205A.75.75%200%20011.75%207h12.5a.75.75%200%20110%201.5H1.75A.75.75%200%20011%207.75zM1.75%2012a.75.75%200%20100%201.5h12.5a.75.75%200%20100-1.5H1.75z%22%3E%3C%2Fpath%3E%0A%3C%2Fsvg%3E");
background-repeat: no-repeat;
box-shadow: none;
cursor: ns-resize;
}
.cm-editor {
resize: both;
overflow: hidden;
width: 80%;
border: 1px solid #ddd;
}
/* Fix autocomplete icon positioning. The icon element gets border-box sizing set due to
the global reset, but this causes overlapping icon and text. Markup:
`<div class="cm-completionIcon cm-completionIcon-keyword" aria-hidden="true"></div>` */
.cm-completionIcon {
box-sizing: content-box;
}
</style>


@ -1,38 +1,42 @@
<script>
window.onload = () => {
{% if table_columns %}
const schema = {{ table_columns|tojson(2) }};
{% else %}
const schema = {};
{% endif %}
window.addEventListener("DOMContentLoaded", () => {
const sqlFormat = document.querySelector("button#sql-format");
const readOnly = document.querySelector("pre#sql-query");
const sqlInput = document.querySelector("textarea#sql-editor");
if (sqlFormat && !readOnly) {
sqlFormat.hidden = false;
sqlFormat.hidden = false;
}
if (sqlInput) {
var editor = CodeMirror.fromTextArea(sqlInput, {
lineNumbers: true,
mode: "text/x-sql",
lineWrapping: true,
var editor = (window.editor = cm.editorFromTextArea(sqlInput, {
schema,
}));
if (sqlFormat) {
sqlFormat.addEventListener("click", (ev) => {
const formatted = sqlFormatter.format(editor.state.doc.toString());
editor.dispatch({
changes: {
from: 0,
to: editor.state.doc.length,
insert: formatted,
},
});
});
editor.setOption("extraKeys", {
"Shift-Enter": function() {
document.getElementsByClassName("sql")[0].submit();
},
Tab: false
});
if (sqlFormat) {
sqlFormat.addEventListener("click", ev => {
editor.setValue(sqlFormatter.format(editor.getValue()));
})
}
cmResize(editor, {resizableWidth: false});
}
}
if (sqlFormat && readOnly) {
const formatted = sqlFormatter.format(readOnly.innerHTML);
if (formatted != readOnly.innerHTML) {
sqlFormat.hidden = false;
sqlFormat.addEventListener("click", ev => {
readOnly.innerHTML = formatted;
})
}
const formatted = sqlFormatter.format(readOnly.innerHTML);
if (formatted != readOnly.innerHTML) {
sqlFormat.hidden = false;
sqlFormat.addEventListener("click", (ev) => {
readOnly.innerHTML = formatted;
});
}
}
}
});
</script>


@ -1,6 +1,6 @@
{% if metadata.description_html or metadata.description %}
{% if metadata.get("description_html") or metadata.get("description") %}
<div class="metadata-description">
{% if metadata.description_html %}
{% if metadata.get("description_html") %}
{{ metadata.description_html|safe }}
{% else %}
{{ metadata.description }}


@ -12,7 +12,7 @@
<ul class="tight-bullets">
{% for facet_value in facet_info.results %}
{% if not facet_value.selected %}
<li><a href="{{ facet_value.toggle_url }}">{{ (facet_value.label | string()) or "-" }}</a> {{ "{:,}".format(facet_value.count) }}</li>
<li><a href="{{ facet_value.toggle_url }}" data-facet-value="{{ facet_value.value }}">{{ (facet_value.label | string()) or "-" }}</a> {{ "{:,}".format(facet_value.count) }}</li>
{% else %}
<li>{{ facet_value.label or "-" }} &middot; {{ "{:,}".format(facet_value.count) }} <a href="{{ facet_value.toggle_url }}" class="cross">&#x2716;</a></li>
{% endif %}

View file

@ -1,3 +1,3 @@
<p class="suggested-facets">
Suggested facets: {% for facet in suggested_facets %}<a href="{{ facet.toggle_url }}#facet-{{ facet.name|to_css_class }}">{{ facet.name }}</a>{% if facet.type %} ({{ facet.type }}){% endif %}{% if not loop.last %}, {% endif %}{% endfor %}
Suggested facets: {% for facet in suggested_facets %}<a href="{{ facet.toggle_url }}#facet-{{ facet.name|to_css_class }}">{{ facet.name }}</a>{% if facet.get("type") %} ({{ facet.type }}){% endif %}{% if not loop.last %}, {% endif %}{% endfor %}
</p>

View file

@ -1,3 +1,5 @@
<!-- above-table-panel is a hook node for plugins to attach to. Displays even if no data is available -->
<div class="above-table-panel"> </div>
{% if display_rows %}
<div class="table-wrapper">
<table class="rows-and-columns">

View file

@ -35,7 +35,7 @@ p.message-warning {
<p>Use this tool to try out different actor and allow combinations. See <a href="https://docs.datasette.io/en/stable/authentication.html#defining-permissions-with-allow-blocks">Defining permissions with "allow" blocks</a> for documentation.</p>
<form action="{{ urls.path('-/allow-debug') }}" method="get">
<form action="{{ urls.path('-/allow-debug') }}" method="get" style="margin-bottom: 1em">
<div class="two-col">
<p><label>Allow block</label></p>
<textarea name="allow">{{ allow_input }}</textarea>

View file

@ -0,0 +1,208 @@
{% extends "base.html" %}
{% block title %}API Explorer{% endblock %}
{% block extra_head %}
<script src="{{ base_url }}-/static/json-format-highlight-1.0.1.js"></script>
{% endblock %}
{% block content %}
<h1>API Explorer{% if private %} 🔒{% endif %}</h1>
<p>Use this tool to try out the
{% if datasette_version %}
<a href="https://docs.datasette.io/en/{{ datasette_version }}/json_api.html">Datasette API</a>.
{% else %}
Datasette API.
{% endif %}
</p>
<details open style="border: 2px solid #ccc; border-bottom: none; padding: 0.5em">
<summary style="cursor: pointer;">GET</summary>
<form method="get" id="api-explorer-get" style="margin-top: 0.7em">
<div>
<label for="path">API path:</label>
<input type="text" id="path" name="path" style="width: 60%">
<input type="submit" value="GET">
</div>
</form>
</details>
<details style="border: 2px solid #ccc; padding: 0.5em">
<summary style="cursor: pointer">POST</summary>
<form method="post" id="api-explorer-post" style="margin-top: 0.7em">
<div>
<label for="path">API path:</label>
<input type="text" id="path" name="path" style="width: 60%">
</div>
<div style="margin: 0.5em 0">
<label for="apiJson" style="vertical-align: top">JSON:</label>
<textarea id="apiJson" name="json" style="width: 60%; height: 200px; font-family: monospace; font-size: 0.8em;"></textarea>
</div>
<p><button id="json-format" type="button">Format JSON</button> <input type="submit" value="POST"></p>
</form>
</details>
<div id="output" style="display: none">
<h2>API response: HTTP <span id="response-status"></span></h2>
<ul class="errors message-error"></ul>
<pre></pre>
</div>
<script>
document.querySelector('#json-format').addEventListener('click', (ev) => {
ev.preventDefault();
let json = document.querySelector('textarea[name="json"]').value.trim();
if (!json) {
return;
}
try {
const parsed = JSON.parse(json);
document.querySelector('textarea[name="json"]').value = JSON.stringify(parsed, null, 2);
} catch (e) {
alert("Error parsing JSON: " + e);
}
});
var postForm = document.getElementById('api-explorer-post');
var getForm = document.getElementById('api-explorer-get');
var output = document.getElementById('output');
var errorList = output.querySelector('.errors');
// On first load or fragment change populate forms from # in URL, if present
if (window.location.hash) {
onFragmentChange();
}
function onFragmentChange() {
var hash = window.location.hash.slice(1);
// Treat hash as a foo=bar string and parse it:
var params = new URLSearchParams(hash);
var method = params.get('method');
if (method == 'GET') {
getForm.closest('details').open = true;
postForm.closest('details').open = false;
getForm.querySelector('input[name="path"]').value = params.get('path');
} else if (method == 'POST') {
postForm.closest('details').open = true;
getForm.closest('details').open = false;
postForm.querySelector('input[name="path"]').value = params.get('path');
postForm.querySelector('textarea[name="json"]').value = params.get('json');
}
}
window.addEventListener('hashchange', () => {
onFragmentChange();
// Animate scroll to top of page
window.scrollTo({top: 0, behavior: 'smooth'});
});
// Cause GET and POST regions to toggle each other
var getDetails = getForm.closest('details');
var postDetails = postForm.closest('details');
getDetails.addEventListener('toggle', (ev) => {
if (getDetails.open) {
postDetails.open = false;
}
});
postDetails.addEventListener('toggle', (ev) => {
if (postDetails.open) {
getDetails.open = false;
}
});
getForm.addEventListener("submit", (ev) => {
ev.preventDefault();
var formData = new FormData(getForm);
// Update URL fragment hash
var serialized = new URLSearchParams(formData).toString() + '&method=GET';
window.history.pushState({}, "", location.pathname + '#' + serialized);
// Send the request
var path = formData.get('path');
fetch(path, {
method: 'GET',
headers: {
'Accept': 'application/json',
}
}).then((response) => {
output.style.display = 'block';
document.getElementById('response-status').textContent = response.status;
return response.json();
}).then((data) => {
output.querySelector('pre').innerHTML = jsonFormatHighlight(data);
errorList.style.display = 'none';
}).catch((error) => {
alert(error);
});
});
postForm.addEventListener("submit", (ev) => {
ev.preventDefault();
var formData = new FormData(postForm);
// Update URL fragment hash
var serialized = new URLSearchParams(formData).toString() + '&method=POST';
window.history.pushState({}, "", location.pathname + '#' + serialized);
// Send the request
var json = formData.get('json');
var path = formData.get('path');
// Validate JSON
if (!json.length) {
json = '{}';
}
try {
var data = JSON.parse(json);
} catch (err) {
alert("Invalid JSON: " + err);
return;
}
// POST JSON to path with content-type application/json
fetch(path, {
method: 'POST',
body: json,
headers: {
'Content-Type': 'application/json',
}
}).then(r => {
document.getElementById('response-status').textContent = r.status;
return r.json();
}).then(data => {
if (data.errors) {
errorList.style.display = 'block';
errorList.innerHTML = '';
data.errors.forEach(error => {
var li = document.createElement('li');
li.textContent = error;
errorList.appendChild(li);
});
} else {
errorList.style.display = 'none';
}
output.querySelector('pre').innerHTML = jsonFormatHighlight(data);
output.style.display = 'block';
}).catch(err => {
alert("Error: " + err);
});
});
</script>
{% if example_links %}
<h2>API endpoints</h2>
<ul class="bullets">
{% for database in example_links %}
<li>Database: <strong>{{ database.name }}</strong></li>
<ul class="bullets">
{% for link in database.links %}
<li><a href="{{ api_path(link) }}">{{ link.path }}</a> - {{ link.label }} </li>
{% endfor %}
{% for table in database.tables %}
<li><strong>{{ table.name }}</strong>
<ul class="bullets">
{% for link in table.links %}
<li><a href="{{ api_path(link) }}">{{ link.path }}</a> - {{ link.label }} </li>
{% endfor %}
</ul>
</li>
{% endfor %}
</ul>
{% endfor %}
</ul>
{% endif %}
{% endblock %}
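Aside (not part of the diff): the explorer persists form state in the URL fragment as URL-encoded pairs plus a method key, so requests can be bookmarked and shared. A minimal Python sketch of building such a fragment, assuming the explorer lives at /-/api:
from urllib.parse import urlencode

# Hypothetical helper mirroring the JavaScript above; the "/-/api" path is an assumption.
fragment = urlencode({"path": "/fixtures/facetable.json", "method": "GET"})
url = "/-/api#" + fragment
# Opening this URL pre-fills the GET form and collapses the POST panel.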

View file

@ -5,10 +5,12 @@
<link rel="stylesheet" href="{{ urls.static('app.css') }}?{{ app_css_hash }}">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
{% for url in extra_css_urls %}
<link rel="stylesheet" href="{{ url.url }}"{% if url.sri %} integrity="{{ url.sri }}" crossorigin="anonymous"{% endif %}>
<link rel="stylesheet" href="{{ url.url }}"{% if url.get("sri") %} integrity="{{ url.sri }}" crossorigin="anonymous"{% endif %}>
{% endfor %}
<script>window.datasetteVersion = '{{ datasette_version }}';</script>
<script src="{{ urls.static('datasette-manager.js') }}" defer></script>
{% for url in extra_js_urls %}
<script {% if url.module %}type="module" {% endif %}src="{{ url.url }}"{% if url.sri %} integrity="{{ url.sri }}" crossorigin="anonymous"{% endif %}></script>
<script {% if url.module %}type="module" {% endif %}src="{{ url.url }}"{% if url.get("sri") %} integrity="{{ url.sri }}" crossorigin="anonymous"{% endif %}></script>
{% endfor %}
{%- if alternate_url_json -%}
<link rel="alternate" type="application/json+datasette" href="{{ alternate_url_json }}">
@ -19,7 +21,7 @@
<div class="not-footer">
<header><nav>{% block nav %}{% block crumbs %}{{ crumbs.nav(request=request) }}{% endblock %}
{% set links = menu_links() %}{% if links or show_logout %}
<details class="nav-menu">
<details class="nav-menu details-menu">
<summary><svg aria-labelledby="nav-menu-svg-title" role="img"
fill="currentColor" stroke="currentColor" xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 16 16" width="16" height="16">
@ -35,7 +37,7 @@
</ul>
{% endif %}
{% if show_logout %}
<form action="{{ urls.logout() }}" method="post">
<form class="nav-menu-logout" action="{{ urls.logout() }}" method="post">
<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">
<button class="button-as-link">Log out</button>
</form>{% endif %}

View file

@ -0,0 +1,124 @@
{% extends "base.html" %}
{% block title %}Create an API token{% endblock %}
{% block extra_head %}
<style type="text/css">
#restrict-permissions label {
display: inline;
width: 90%;
}
</style>
{% endblock %}
{% block content %}
<h1>Create an API token</h1>
<p>This token will allow API access with the same abilities as your current user, <strong>{{ request.actor.id }}</strong></p>
{% if token %}
<div>
<h2>Your API token</h2>
<form>
<input type="text" class="copyable" style="width: 40%" value="{{ token }}">
<span class="copy-link-wrapper"></span>
</form>
<!--- show token in a <details> -->
<details style="margin-top: 1em">
<summary>Token details</summary>
<pre>{{ token_bits|tojson(4) }}</pre>
</details>
</div>
<h2>Create another token</h2>
{% endif %}
{% if errors %}
{% for error in errors %}
<p class="message-error">{{ error }}</p>
{% endfor %}
{% endif %}
<form action="{{ urls.path('-/create-token') }}" method="post">
<div>
<div class="select-wrapper" style="width: unset">
<select name="expire_type">
<option value="">Token never expires</option>
<option value="minutes">Expires after X minutes</option>
<option value="hours">Expires after X hours</option>
<option value="days">Expires after X days</option>
</select>
</div>
<input type="text" name="expire_duration" style="width: 10%">
<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">
<input type="submit" value="Create token">
<details style="margin-top: 1em" id="restrict-permissions">
<summary style="cursor: pointer;">Restrict actions that can be performed using this token</summary>
<h2>All databases and tables</h2>
<ul>
{% for permission in all_permissions %}
<li><label><input type="checkbox" name="all:{{ permission }}"> {{ permission }}</label></li>
{% endfor %}
</ul>
{% for database in database_with_tables %}
<h2>All tables in "{{ database.name }}"</h2>
<ul>
{% for permission in database_permissions %}
<li><label><input type="checkbox" name="database:{{ database.encoded }}:{{ permission }}"> {{ permission }}</label></li>
{% endfor %}
</ul>
{% endfor %}
<h2>Specific tables</h2>
{% for database in database_with_tables %}
{% for table in database.tables %}
<h3>{{ database.name }}: {{ table.name }}</h3>
<ul>
{% for permission in resource_permissions %}
<li><label><input type="checkbox" name="resource:{{ database.encoded }}:{{ table.encoded }}:{{ permission }}"> {{ permission }}</label></li>
{% endfor %}
</ul>
{% endfor %}
{% endfor %}
</details>
</form>
</div>
<script>
var expireDuration = document.querySelector('input[name="expire_duration"]');
expireDuration.style.display = 'none';
var expireType = document.querySelector('select[name="expire_type"]');
function showHideExpireDuration() {
if (expireType.value) {
expireDuration.style.display = 'inline';
expireDuration.setAttribute("placeholder", expireType.value.replace("Expires after X ", ""));
} else {
expireDuration.style.display = 'none';
}
}
showHideExpireDuration();
expireType.addEventListener('change', showHideExpireDuration);
var copyInput = document.querySelector(".copyable");
if (copyInput) {
var wrapper = document.querySelector(".copy-link-wrapper");
var button = document.createElement("button");
button.className = "copyable-copy-button";
button.setAttribute("type", "button");
button.innerHTML = "Copy to clipboard";
button.onclick = (ev) => {
ev.preventDefault();
copyInput.select();
document.execCommand("copy");
button.innerHTML = "Copied!";
setTimeout(() => {
button.innerHTML = "Copy to clipboard";
}, 1500);
};
wrapper.appendChild(button);
wrapper.insertAdjacentElement("afterbegin", button);
}
</script>
{% endblock %}

View file

@ -10,29 +10,13 @@
{% block body_class %}db db-{{ database|to_css_class }}{% endblock %}
{% block content %}
<div class="page-header" style="border-color: #{{ database_color(database) }}">
<div class="page-header" style="border-color: #{{ database_color }}">
<h1>{{ metadata.title or database }}{% if private %} 🔒{% endif %}</h1>
{% set links = database_actions() %}{% if links %}
<details class="actions-menu-links">
<summary><svg aria-labelledby="actions-menu-links-title" role="img"
style="color: #666" xmlns="http://www.w3.org/2000/svg"
width="28" height="28" viewBox="0 0 24 24" fill="none"
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Table actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg></summary>
<div class="dropdown-menu">
{% if links %}
<ul>
{% for link in links %}
<li><a href="{{ link.href }}">{{ link.label }}</a></li>
{% endfor %}
</ul>
{% endif %}
</div>
</details>{% endif %}
</div>
</div>
{% set action_links, action_title = database_actions(), "Database actions" %}
{% include "_action_menu.html" %}
{{ top_database() }}
{% block description_source_license %}{% include "_description_source_license.html" %}{% endblock %}
@ -95,7 +79,7 @@
{% endif %}
{% if allow_download %}
<p class="download-sqlite">Download SQLite DB: <a href="{{ urls.database(database) }}.db">{{ database }}.db</a> <em>{{ format_bytes(size) }}</em></p>
<p class="download-sqlite">Download SQLite DB: <a href="{{ urls.database(database) }}.db" rel="nofollow">{{ database }}.db</a> <em>{{ format_bytes(size) }}</em></p>
{% endif %}
{% include "_codemirror_foot.html" %}

View file

@ -7,6 +7,11 @@
{% block content %}
<h1>{{ metadata.title or "Datasette" }}{% if private %} 🔒{% endif %}</h1>
{% set action_links, action_title = homepage_actions, "Homepage actions" %}
{% include "_action_menu.html" %}
{{ top_homepage() }}
{% block description_source_license %}{% include "_description_source_license.html" %}{% endblock %}
{% for database in databases %}

View file

@ -13,7 +13,7 @@
<p class="crumbs">
<a href="/">home</a>
</p>
<details class="nav-menu">
<details class="nav-menu details-menu">
<summary><svg aria-labelledby="nav-menu-svg-title" role="img"
fill="currentColor" stroke="currentColor" xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 16 16" width="16" height="16">
@ -26,7 +26,7 @@
<li><a href="/-/plugins">Installed plugins</a></li>
<li><a href="/-/versions">Version info</a></li>
</ul>
<form action="/-/logout" method="post">
<form class="nav-menu-logout" action="/-/logout" method="post">
<button class="button-as-link">Log out</button>
</form>
</div>
@ -96,18 +96,24 @@
<section class="content">
<div class="page-header" style="border-color: #ff0000">
<h1>fixtures</h1>
<details class="actions-menu-links">
<summary><svg aria-labelledby="actions-menu-links-title" role="img"
style="color: #666" xmlns="http://www.w3.org/2000/svg"
width="28" height="28" viewBox="0 0 24 24" fill="none"
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Table actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg></summary>
</div>
<div class="page-action-menu">
<details class="actions-menu-links details-menu">
<summary>
<div class="icon-text">
<svg class="icon" aria-labelledby="actions-menu-links-title" role="img" style="color: #fff" xmlns="http://www.w3.org/2000/svg" width="28" height="28" viewBox="0 0 28 28" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Database actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg>
<span>Database actions</span>
</div>
</summary>
<div class="dropdown-menu">
<div class="hook"></div>
<ul>
<li><a href="#">Database action</a></li>
<li><a href="#">Action one</a></li>
<li><a href="#">Action two</a></li>
</ul>
</div>
</details>
@ -158,18 +164,24 @@
<section class="content">
<div class="page-header" style="border-color: #ff0000">
<h1>roadside_attraction_characteristics</h1>
<details class="actions-menu-links">
<summary><svg aria-labelledby="actions-menu-links-title" role="img"
style="color: #666" xmlns="http://www.w3.org/2000/svg"
width="28" height="28" viewBox="0 0 24 24" fill="none"
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Table actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg></summary>
</div>
<div class="page-action-menu">
<details class="actions-menu-links details-menu">
<summary>
<div class="icon-text">
<svg class="icon" aria-labelledby="actions-menu-links-title" role="img" style="color: #fff" xmlns="http://www.w3.org/2000/svg" width="28" height="28" viewBox="0 0 28 28" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Database actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg>
<span>Table actions</span>
</div>
</summary>
<div class="dropdown-menu">
<div class="hook"></div>
<ul>
<li><a href="#">Table action</a></li>
<li><a href="#">Action one</a></li>
<li><a href="#">Action two</a></li>
</ul>
</div>
</details>

View file

@ -19,11 +19,97 @@
.check-action, .check-when, .check-result {
font-size: 1.3em;
}
textarea {
height: 10em;
width: 95%;
box-sizing: border-box;
padding: 0.5em;
border: 2px dotted black;
}
.two-col {
display: inline-block;
width: 48%;
}
.two-col label {
width: 48%;
}
@media only screen and (max-width: 576px) {
.two-col {
width: 100%;
}
}
</style>
{% endblock %}
{% block content %}
<h1>Permission check testing tool</h1>
<p>This tool lets you simulate an actor and a permission check for that actor.</p>
<form action="{{ urls.path('-/permissions') }}" id="debug-post" method="post" style="margin-bottom: 1em">
<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">
<div class="two-col">
<p><label>Actor</label></p>
<textarea name="actor">{% if actor_input %}{{ actor_input }}{% else %}{"id": "root"}{% endif %}</textarea>
</div>
<div class="two-col" style="vertical-align: top">
<p><label for="permission" style="display:block">Permission</label>
<select name="permission" id="permission">
{% for permission in permissions %}
<option value="{{ permission.name }}">{{ permission.name }} (default {{ permission.default }})</option>
{% endfor %}
</select>
<p><label for="resource_1">Database name</label><input type="text" id="resource_1" name="resource_1"></p>
<p><label for="resource_2">Table or query name</label><input type="text" id="resource_2" name="resource_2"></p>
</div>
<div style="margin-top: 1em;">
<input type="submit" value="Simulate permission check">
</div>
<pre style="margin-top: 1em" id="debugResult"></pre>
</form>
<script>
var rawPerms = {{ permissions|tojson }};
var permissions = Object.fromEntries(rawPerms.map(p => [p.name, p]));
var permissionSelect = document.getElementById('permission');
var resource1 = document.getElementById('resource_1');
var resource2 = document.getElementById('resource_2');
function updateResourceVisibility() {
var permission = permissionSelect.value;
var {takes_database, takes_resource} = permissions[permission];
if (takes_database) {
resource1.closest('p').style.display = 'block';
} else {
resource1.closest('p').style.display = 'none';
}
if (takes_resource) {
resource2.closest('p').style.display = 'block';
} else {
resource2.closest('p').style.display = 'none';
}
}
permissionSelect.addEventListener('change', updateResourceVisibility);
updateResourceVisibility();
// When #debug-post form is submitted, use fetch() to POST data
var debugPost = document.getElementById('debug-post');
var debugResult = document.getElementById('debugResult');
debugPost.addEventListener('submit', function(ev) {
ev.preventDefault();
var formData = new FormData(debugPost);
console.log(formData);
fetch(debugPost.action, {
method: 'POST',
body: new URLSearchParams(formData),
}).then(function(response) {
return response.json();
}).then(function(data) {
debugResult.innerText = JSON.stringify(data, null, 4);
});
});
</script>
<h1>Recent permissions checks</h1>
{% for check in permission_checks %}

View file

@ -24,15 +24,19 @@
{% block content %}
{% if canned_write and db_is_immutable %}
{% if canned_query_write and db_is_immutable %}
<p class="message-error">This query cannot be executed because the database is immutable.</p>
{% endif %}
<h1 style="padding-left: 10px; border-left: 10px solid #{{ database_color(database) }}">{{ metadata.title or database }}{% if canned_query and not metadata.title %}: {{ canned_query }}{% endif %}{% if private %} 🔒{% endif %}</h1>
<h1 style="padding-left: 10px; border-left: 10px solid #{{ database_color }}">{{ metadata.title or database }}{% if canned_query and not metadata.title %}: {{ canned_query }}{% endif %}{% if private %} 🔒{% endif %}</h1>
{% set action_links, action_title = query_actions(), "Query actions" %}
{% include "_action_menu.html" %}
{% if canned_query %}{{ top_canned_query() }}{% else %}{{ top_query() }}{% endif %}
{% block description_source_license %}{% include "_description_source_license.html" %}{% endblock %}
<form class="sql" action="{{ urls.database(database) }}{% if canned_query %}/{{ canned_query }}{% endif %}" method="{% if canned_write %}post{% else %}get{% endif %}">
<form class="sql" action="{{ urls.database(database) }}{% if canned_query %}/{{ canned_query }}{% endif %}" method="{% if canned_query_write %}post{% else %}get{% endif %}">
<h3>Custom SQL query{% if display_rows %} returning {% if truncated %}more than {% endif %}{{ "{:,}".format(display_rows|length) }} row{% if display_rows|length == 1 %}{% else %}s{% endif %}{% endif %}{% if not query_error %}
<span class="show-hide-sql">(<a href="{{ show_hide_link }}">{{ show_hide_text }}</a>)</span>
{% endif %}</h3>
@ -61,8 +65,8 @@
{% endif %}
<p>
{% if not hide_sql %}<button id="sql-format" type="button" hidden>Format SQL</button>{% endif %}
{% if canned_write %}<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">{% endif %}
<input type="submit" value="Run SQL"{% if canned_write and db_is_immutable %} disabled{% endif %}>
{% if canned_query_write %}<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">{% endif %}
<input type="submit" value="Run SQL"{% if canned_query_write and db_is_immutable %} disabled{% endif %}>
{{ show_hide_hidden }}
{% if canned_query and edit_sql_url %}<a href="{{ edit_sql_url }}" class="canned-query-edit-sql">Edit SQL</a>{% endif %}
</p>
@ -87,7 +91,7 @@
</tbody>
</table></div>
{% else %}
{% if not canned_write and not error %}
{% if not canned_query_write and not error %}
<p class="zero-results">0 results</p>
{% endif %}
{% endif %}

View file

@ -20,7 +20,12 @@
{% endblock %}
{% block content %}
<h1 style="padding-left: 10px; border-left: 10px solid #{{ database_color(database) }}">{{ table }}: {{ ', '.join(primary_key_values) }}{% if private %} 🔒{% endif %}</h1>
<h1 style="padding-left: 10px; border-left: 10px solid #{{ database_color }}">{{ table }}: {{ ', '.join(primary_key_values) }}{% if private %} 🔒{% endif %}</h1>
{% set action_links, action_title = row_actions, "Row actions" %}
{% include "_action_menu.html" %}
{{ top_row() }}
{% block description_source_license %}{% include "_description_source_license.html" %}{% endblock %}

View file

@ -1,6 +1,6 @@
{% extends "base.html" %}
{% block title %}{{ database }}: {{ table }}: {% if filtered_table_rows_count or filtered_table_rows_count == 0 %}{{ "{:,}".format(filtered_table_rows_count) }} row{% if filtered_table_rows_count == 1 %}{% else %}s{% endif %}{% endif %}{% if human_description_en %} {{ human_description_en }}{% endif %}{% endblock %}
{% block title %}{{ database }}: {{ table }}: {% if count or count == 0 %}{{ "{:,}".format(count) }} row{% if count == 1 %}{% else %}s{% endif %}{% endif %}{% if human_description_en %} {{ human_description_en }}{% endif %}{% endblock %}
{% block extra_head %}
{{- super() -}}
@ -21,33 +21,17 @@
{% endblock %}
{% block content %}
<div class="page-header" style="border-color: #{{ database_color(database) }}">
<h1>{{ metadata.title or table }}{% if is_view %} (view){% endif %}{% if private %} 🔒{% endif %}</h1>
{% set links = table_actions() %}{% if links %}
<details class="actions-menu-links">
<summary><svg aria-labelledby="actions-menu-links-title" role="img"
style="color: #666" xmlns="http://www.w3.org/2000/svg"
width="28" height="28" viewBox="0 0 24 24" fill="none"
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Table actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg></summary>
<div class="dropdown-menu">
{% if links %}
<ul>
{% for link in links %}
<li><a href="{{ link.href }}">{{ link.label }}</a></li>
{% endfor %}
</ul>
{% endif %}
</div>
</details>{% endif %}
<div class="page-header" style="border-color: #{{ database_color }}">
<h1>{{ metadata.get("title") or table }}{% if is_view %} (view){% endif %}{% if private %} 🔒{% endif %}</h1>
</div>
{% set action_links, action_title = actions(), "View actions" if is_view else "Table actions" %}
{% include "_action_menu.html" %}
{{ top_table() }}
{% block description_source_license %}{% include "_description_source_license.html" %}{% endblock %}
{% if metadata.columns %}
{% if metadata.get("columns") %}
<dl class="column-descriptions">
{% for column_name, column_description in metadata.columns.items() %}
<dt>{{ column_name }}</dt><dd>{{ column_description }}</dd>
@ -55,8 +39,8 @@
</dl>
{% endif %}
{% if filtered_table_rows_count or human_description_en %}
<h3>{% if filtered_table_rows_count or filtered_table_rows_count == 0 %}{{ "{:,}".format(filtered_table_rows_count) }} row{% if filtered_table_rows_count == 1 %}{% else %}s{% endif %}{% endif %}
{% if count or human_description_en %}
<h3>{% if count or count == 0 %}{{ "{:,}".format(count) }} row{% if count == 1 %}{% else %}s{% endif %}{% endif %}
{% if human_description_en %}{{ human_description_en }}{% endif %}
</h3>
{% endif %}
@ -94,7 +78,7 @@
</div><div class="select-wrapper filter-op">
<select name="_filter_op">
{% for key, display, no_argument in filters.lookups() %}
<option value="{{ key }}{% if no_argument %}__1{% endif %}"{% if key == lookup %} selected{% endif %}>{{ display }}</option>
<option value="{{ key }}{% if no_argument %}__1{% endif %}">{{ display }}</option>
{% endfor %}
</select>
</div><input type="text" name="_filter_value" class="filter-value">

View file

@ -1,4 +1,4 @@
from .utils import tilde_encode, path_with_format, HASH_LENGTH, PrefixedUrlString
from .utils import tilde_encode, path_with_format, PrefixedUrlString
import urllib

View file

@ -1,7 +1,9 @@
import asyncio
from contextlib import contextmanager
import aiofiles
import click
from collections import OrderedDict, namedtuple, Counter
import copy
import base64
import hashlib
import inspect
@ -17,11 +19,14 @@ import time
import types
import secrets
import shutil
from typing import Iterable, List, Tuple
import urllib
import yaml
from .shutil_backport import copytree
from .sqlite import sqlite3, supports_table_xinfo
if typing.TYPE_CHECKING:
from datasette.database import Database
# From https://www.sqlite.org/lang_keywords.html
reserved_words = set(
@ -242,6 +247,7 @@ allowed_pragmas = (
"schema_version",
"table_info",
"table_xinfo",
"table_list",
)
disallawed_sql_res = [
(
@ -402,9 +408,9 @@ def make_dockerfile(
apt_get_extras = apt_get_extras_
if spatialite:
apt_get_extras.extend(["python3-dev", "gcc", "libsqlite3-mod-spatialite"])
environment_variables[
"SQLITE_EXTENSIONS"
] = "/usr/lib/x86_64-linux-gnu/mod_spatialite.so"
environment_variables["SQLITE_EXTENSIONS"] = (
"/usr/lib/x86_64-linux-gnu/mod_spatialite.so"
)
return """
FROM python:3.11.0-slim-bullseye
COPY . /app
@ -416,9 +422,11 @@ RUN datasette inspect {files} --inspect-file inspect-data.json
ENV PORT {port}
EXPOSE {port}
CMD {cmd}""".format(
apt_get_extras=APT_GET_DOCKERFILE_EXTRAS.format(" ".join(apt_get_extras))
if apt_get_extras
else "",
apt_get_extras=(
APT_GET_DOCKERFILE_EXTRAS.format(" ".join(apt_get_extras))
if apt_get_extras
else ""
),
environment_variables="\n".join(
[
"ENV {} '{}'".format(key, value)
@ -709,7 +717,7 @@ def to_css_class(s):
"""
if css_class_re.match(s):
return s
md5_suffix = hashlib.md5(s.encode("utf8")).hexdigest()[:6]
md5_suffix = md5_not_usedforsecurity(s)[:6]
# Strip leading _, -
s = s.lstrip("_").lstrip("-")
# Replace any whitespace with hyphens
@ -828,9 +836,18 @@ _infinities = {float("inf"), float("-inf")}
def remove_infinites(row):
if any((c in _infinities) if isinstance(c, float) else 0 for c in row):
to_check = row
if isinstance(row, dict):
to_check = row.values()
if not any((c in _infinities) if isinstance(c, float) else 0 for c in to_check):
return row
if isinstance(row, dict):
return {
k: (None if (isinstance(v, float) and v in _infinities) else v)
for k, v in row.items()
}
else:
return [None if (isinstance(c, float) and c in _infinities) else c for c in row]
return row
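For illustration, a quick sketch of the new behavior (the dict branch is what this change adds), assuming the helper is imported from datasette.utils:
from datasette.utils import remove_infinites

remove_infinites([1.5, float("inf"), "x"])      # -> [1.5, None, "x"]
remove_infinites({"a": float("-inf"), "b": 2})  # -> {"a": None, "b": 2}
remove_infinites([1, 2, 3])                     # rows without infinities pass through unchanged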
class StaticMount(click.ParamType):
@ -1117,7 +1134,13 @@ class StartupError(Exception):
_re_named_parameter = re.compile(":([a-zA-Z0-9_]+)")
async def derive_named_parameters(db, sql):
@documented
async def derive_named_parameters(db: "Database", sql: str) -> List[str]:
"""
Given a SQL statement, return a list of named parameters that are used in the statement
e.g. for ``select * from foo where id=:id`` this would return ``["id"]``
"""
explain = "explain {}".format(sql.strip().rstrip(";"))
possible_params = _re_named_parameter.findall(sql)
try:
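A usage sketch, not part of the diff (db stands for any datasette.database.Database instance):
from datasette.utils import derive_named_parameters

async def show_params(db):
    # Detection is EXPLAIN-based, so only parameters SQLite actually uses are returned
    return await derive_named_parameters(db, "select * from foo where id=:id and name=:name")
    # -> ["id", "name"]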
@ -1129,8 +1152,10 @@ async def derive_named_parameters(db, sql):
def add_cors_headers(headers):
headers["Access-Control-Allow-Origin"] = "*"
headers["Access-Control-Allow-Headers"] = "Authorization"
headers["Access-Control-Allow-Headers"] = "Authorization, Content-Type"
headers["Access-Control-Expose-Headers"] = "Link"
headers["Access-Control-Allow-Methods"] = "GET, POST, HEAD, OPTIONS"
headers["Access-Control-Max-Age"] = "3600"
_TILDE_ENCODING_SAFE = frozenset(
@ -1193,3 +1218,225 @@ def truncate_url(url, length):
rest, ext = bits
return rest[: length - 1 - len(ext)] + "…." + ext
return url[: length - 1] + ""
async def row_sql_params_pks(db, table, pk_values):
pks = await db.primary_keys(table)
use_rowid = not pks
select = "*"
if use_rowid:
select = "rowid, *"
pks = ["rowid"]
wheres = [f'"{pk}"=:p{i}' for i, pk in enumerate(pks)]
sql = f"select {select} from {escape_sqlite(table)} where {' AND '.join(wheres)}"
params = {}
for i, pk_value in enumerate(pk_values):
params[f"p{i}"] = pk_value
return sql, params, pks
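Illustrative sketch of the return value for a rowid table ("cities" is a stand-in name; run inside an async context with a Database object db):
sql, params, pks = await row_sql_params_pks(db, "cities", ["5"])
# sql    -> 'select rowid, * from cities where "rowid"=:p0'
# params -> {"p0": "5"}
# pks    -> ["rowid"]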
def _handle_pair(key: str, value: str) -> dict:
"""
Turn a key-value pair into a nested dictionary.
foo, bar => {'foo': 'bar'}
foo.bar, baz => {'foo': {'bar': 'baz'}}
foo.bar, [1, 2, 3] => {'foo': {'bar': [1, 2, 3]}}
foo.bar, "baz" => {'foo': {'bar': 'baz'}}
foo.bar, '{"baz": "qux"}' => {'foo': {'bar': "{'baz': 'qux'}"}}
"""
try:
value = json.loads(value)
except json.JSONDecodeError:
# If it doesn't parse as JSON, treat it as a string
pass
keys = key.split(".")
result = current_dict = {}
for k in keys[:-1]:
current_dict[k] = {}
current_dict = current_dict[k]
current_dict[keys[-1]] = value
return result
def _combine(base: dict, update: dict) -> dict:
"""
Recursively merge two dictionaries.
"""
for key, value in update.items():
if isinstance(value, dict) and key in base and isinstance(base[key], dict):
base[key] = _combine(base[key], value)
else:
base[key] = value
return base
def pairs_to_nested_config(pairs: typing.List[typing.Tuple[str, typing.Any]]) -> dict:
"""
Parse a list of key-value pairs into a nested dictionary.
"""
result = {}
for key, value in pairs:
parsed_pair = _handle_pair(key, value)
result = _combine(result, parsed_pair)
return result
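A quick sketch of how the three helpers compose (values are parsed as JSON where possible, otherwise kept as strings):
from datasette.utils import pairs_to_nested_config

config = pairs_to_nested_config([
    ("settings.default_page_size", "50"),  # "50" parses as JSON -> 50
    ("plugins.datasette-cluster.opts", '{"y": 1}'),
])
# -> {"settings": {"default_page_size": 50},
#     "plugins": {"datasette-cluster": {"opts": {"y": 1}}}}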
def make_slot_function(name, datasette, request, **kwargs):
from datasette.plugins import pm
method = getattr(pm.hook, name, None)
assert method is not None, "No hook found for {}".format(name)
async def inner():
html_bits = []
for hook in method(datasette=datasette, request=request, **kwargs):
html = await await_me_maybe(hook)
if html is not None:
html_bits.append(html)
return markupsafe.Markup("".join(html_bits))
return inner
def prune_empty_dicts(d: dict):
"""
Recursively prune all empty dictionaries from a given dictionary.
"""
for key, value in list(d.items()):
if isinstance(value, dict):
prune_empty_dicts(value)
if value == {}:
d.pop(key, None)
def move_plugins_and_allow(source: dict, destination: dict) -> Tuple[dict, dict]:
"""
Move 'plugins' and 'allow' keys from source to destination dictionary. Creates
hierarchy in destination if needed. After moving, recursively remove any keys
in the source that are left empty.
"""
source = copy.deepcopy(source)
destination = copy.deepcopy(destination)
def recursive_move(src, dest, path=None):
if path is None:
path = []
for key, value in list(src.items()):
new_path = path + [key]
if key in ("plugins", "allow"):
# Navigate and create the hierarchy in destination if needed
d = dest
for step in path:
d = d.setdefault(step, {})
# Move the plugins
d[key] = value
# Remove the plugins from source
src.pop(key, None)
elif isinstance(value, dict):
recursive_move(value, dest, new_path)
# After moving, check if the current dictionary is empty and remove it if so
if not value:
src.pop(key, None)
recursive_move(source, destination)
prune_empty_dicts(source)
return source, destination
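Sketch: "plugins" and "allow" blocks move from metadata to config at the same position in the hierarchy, and emptied parents are pruned:
from datasette.utils import move_plugins_and_allow

metadata = {"databases": {"fixtures": {"allow": {"id": "root"}, "title": "Fixtures"}}}
metadata, config = move_plugins_and_allow(metadata, {})
# metadata -> {"databases": {"fixtures": {"title": "Fixtures"}}}
# config   -> {"databases": {"fixtures": {"allow": {"id": "root"}}}}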
_table_config_keys = (
"hidden",
"sort",
"sort_desc",
"size",
"sortable_columns",
"label_column",
"facets",
"fts_table",
"fts_pk",
"searchmode",
"units",
)
def move_table_config(metadata: dict, config: dict):
"""
Move all known table configuration keys from metadata to config.
"""
if "databases" not in metadata:
return metadata, config
metadata = copy.deepcopy(metadata)
config = copy.deepcopy(config)
for database_name, database in metadata["databases"].items():
if "tables" not in database:
continue
for table_name, table in database["tables"].items():
for key in _table_config_keys:
if key in table:
config.setdefault("databases", {}).setdefault(
database_name, {}
).setdefault("tables", {}).setdefault(table_name, {})[
key
] = table.pop(
key
)
prune_empty_dicts(metadata)
return metadata, config
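Sketch of the same migration for table-level configuration keys:
from datasette.utils import move_table_config

metadata = {
    "databases": {
        "fixtures": {"tables": {"cities": {"sort": "name", "description": "All cities"}}}
    }
}
metadata, config = move_table_config(metadata, {})
# "sort" is a known config key and moves to config; "description" stays in metadata.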
def redact_keys(original: dict, key_patterns: Iterable) -> dict:
"""
Recursively redact sensitive keys in a dictionary based on given patterns
:param original: The original dictionary
:param key_patterns: A list of substring patterns to redact
:return: A copy of the original dictionary with sensitive values redacted
"""
def redact(data):
if isinstance(data, dict):
return {
k: (
redact(v)
if not any(pattern in k for pattern in key_patterns)
else "***"
)
for k, v in data.items()
}
elif isinstance(data, list):
return [redact(item) for item in data]
else:
return data
return redact(original)
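Sketch: any key whose name contains one of the given substrings is redacted, at any depth:
from datasette.utils import redact_keys

redact_keys(
    {"db": {"password": "hunter2", "host": "localhost"}, "api_token": "x"},
    ("password", "token"),
)
# -> {"db": {"password": "***", "host": "localhost"}, "api_token": "***"}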
def md5_not_usedforsecurity(s):
try:
return hashlib.md5(s.encode("utf8"), usedforsecurity=False).hexdigest()
except TypeError:
# For Python 3.8 which does not support usedforsecurity=False
return hashlib.md5(s.encode("utf8")).hexdigest()
_etag_cache = {}
async def calculate_etag(filepath, chunk_size=4096):
if filepath in _etag_cache:
return _etag_cache[filepath]
hasher = hashlib.md5()
async with aiofiles.open(filepath, "rb") as f:
while True:
chunk = await f.read(chunk_size)
if not chunk:
break
hasher.update(chunk)
etag = f'"{hasher.hexdigest()}"'
_etag_cache[filepath] = etag
return etag
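Note the module-level _etag_cache: an ETag is computed once per file path per process, which suits static assets that cannot change while the server runs. A minimal sketch, assuming an existing file at a stand-in path:
import asyncio
from datasette.utils import calculate_etag

async def demo():
    etag = await calculate_etag("static/app.css")  # stand-in path
    # A request carrying If-None-Match with this exact value gets a 304
    # with an empty body instead of the file contents (see asgi_static below).
    print(etag)

asyncio.run(demo())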

View file

@ -1,5 +1,6 @@
import hashlib
import json
from datasette.utils import MultiParams
from datasette.utils import MultiParams, calculate_etag
from mimetypes import guess_type
from urllib.parse import parse_qs, urlunparse, parse_qsl
from pathlib import Path
@ -21,6 +22,27 @@ class NotFound(Base400):
status = 404
class DatabaseNotFound(NotFound):
def __init__(self, message, database_name):
super().__init__(message)
self.database_name = database_name
class TableNotFound(NotFound):
def __init__(self, message, database_name, table):
super().__init__(message)
self.database_name = database_name
self.table = table
class RowNotFound(NotFound):
def __init__(self, message, database_name, table, pk_values):
super().__init__(message)
self.database_name = database_name
self.table_name = table
self.pk_values = pk_values
class Forbidden(Base400):
status = 403
@ -264,6 +286,7 @@ async def asgi_send_file(
headers = headers or {}
if filename:
headers["content-disposition"] = f'attachment; filename="{filename}"'
first = True
headers["content-length"] = str((await aiofiles.os.stat(str(filepath))).st_size)
async with aiofiles.open(str(filepath), mode="rb") as fp:
@ -286,9 +309,14 @@ async def asgi_send_file(
def asgi_static(root_path, chunk_size=4096, headers=None, content_type=None):
root_path = Path(root_path)
static_headers = {}
if headers:
static_headers = headers.copy()
async def inner_static(request, send):
path = request.scope["url_route"]["kwargs"]["path"]
headers = static_headers.copy()
try:
full_path = (root_path / path).resolve().absolute()
except FileNotFoundError:
@ -304,7 +332,15 @@ def asgi_static(root_path, chunk_size=4096, headers=None, content_type=None):
await asgi_send_html(send, "404: Path not inside root path", 404)
return
try:
await asgi_send_file(send, full_path, chunk_size=chunk_size)
# Calculate ETag for filepath
etag = await calculate_etag(full_path, chunk_size=chunk_size)
headers["ETag"] = etag
if_none_match = request.headers.get("if-none-match")
if if_none_match and if_none_match == etag:
return await asgi_send(send, "", 304)
await asgi_send_file(
send, full_path, chunk_size=chunk_size, headers=headers
)
except FileNotFoundError:
await asgi_send_html(send, "404: File not found", 404)
return
@ -428,3 +464,18 @@ class AsgiFileDownload:
content_type=self.content_type,
headers=self.headers,
)
class AsgiRunOnFirstRequest:
def __init__(self, asgi, on_startup):
assert isinstance(on_startup, list)
self.asgi = asgi
self.on_startup = on_startup
self._started = False
async def __call__(self, scope, receive, send):
if not self._started:
self._started = True
for hook in self.on_startup:
await hook()
return await self.asgi(scope, receive, send)
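Usage sketch (app is any ASGI application; init_schemas is a hypothetical coroutine):
from datasette.utils.asgi import AsgiRunOnFirstRequest

async def init_schemas():
    ...  # runs exactly once, before the first request is handled

wrapped = AsgiRunOnFirstRequest(app, on_startup=[init_schemas])
# Serve wrapped in place of app; subsequent requests pass straight through.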

View file

@ -0,0 +1,25 @@
import asyncio
import types
from typing import NamedTuple, Any
class CallableStatus(NamedTuple):
is_callable: bool
is_async_callable: bool
def check_callable(obj: Any) -> CallableStatus:
if not callable(obj):
return CallableStatus(False, False)
if isinstance(obj, type):
# It's a class
return CallableStatus(True, False)
if isinstance(obj, types.FunctionType):
return CallableStatus(True, asyncio.iscoroutinefunction(obj))
if hasattr(obj, "__call__"):
return CallableStatus(True, asyncio.iscoroutinefunction(obj.__call__))
assert False, "obj {} is somehow callable with no __call__ method".format(repr(obj))
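Sketch of the cases this distinguishes:
from datasette.utils.check_callable import check_callable

async def af(): ...

class Hook:
    async def __call__(self): ...

check_callable(af)      # CallableStatus(is_callable=True, is_async_callable=True)
check_callable(Hook)    # a class: CallableStatus(True, False)
check_callable(Hook())  # instance with an async __call__: CallableStatus(True, True)
check_callable(42)      # CallableStatus(False, False)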

View file

@ -5,13 +5,13 @@ from datasette.utils import table_column_details
async def init_internal_db(db):
create_tables_sql = textwrap.dedent(
"""
CREATE TABLE IF NOT EXISTS databases (
CREATE TABLE IF NOT EXISTS catalog_databases (
database_name TEXT PRIMARY KEY,
path TEXT,
is_memory INTEGER,
schema_version INTEGER
);
CREATE TABLE IF NOT EXISTS tables (
CREATE TABLE IF NOT EXISTS catalog_tables (
database_name TEXT,
table_name TEXT,
rootpage INTEGER,
@ -19,7 +19,7 @@ async def init_internal_db(db):
PRIMARY KEY (database_name, table_name),
FOREIGN KEY (database_name) REFERENCES databases(database_name)
);
CREATE TABLE IF NOT EXISTS columns (
CREATE TABLE IF NOT EXISTS catalog_columns (
database_name TEXT,
table_name TEXT,
cid INTEGER,
@ -33,7 +33,7 @@ async def init_internal_db(db):
FOREIGN KEY (database_name) REFERENCES databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES tables(database_name, table_name)
);
CREATE TABLE IF NOT EXISTS indexes (
CREATE TABLE IF NOT EXISTS catalog_indexes (
database_name TEXT,
table_name TEXT,
seq INTEGER,
@ -45,7 +45,7 @@ async def init_internal_db(db):
FOREIGN KEY (database_name) REFERENCES databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES tables(database_name, table_name)
);
CREATE TABLE IF NOT EXISTS foreign_keys (
CREATE TABLE IF NOT EXISTS catalog_foreign_keys (
database_name TEXT,
table_name TEXT,
id INTEGER,
@ -69,12 +69,19 @@ async def populate_schema_tables(internal_db, db):
database_name = db.name
def delete_everything(conn):
conn.execute("DELETE FROM tables WHERE database_name = ?", [database_name])
conn.execute("DELETE FROM columns WHERE database_name = ?", [database_name])
conn.execute(
"DELETE FROM foreign_keys WHERE database_name = ?", [database_name]
"DELETE FROM catalog_tables WHERE database_name = ?", [database_name]
)
conn.execute(
"DELETE FROM catalog_columns WHERE database_name = ?", [database_name]
)
conn.execute(
"DELETE FROM catalog_foreign_keys WHERE database_name = ?",
[database_name],
)
conn.execute(
"DELETE FROM catalog_indexes WHERE database_name = ?", [database_name]
)
conn.execute("DELETE FROM indexes WHERE database_name = ?", [database_name])
await internal_db.execute_write_fn(delete_everything)
@ -133,14 +140,14 @@ async def populate_schema_tables(internal_db, db):
await internal_db.execute_write_many(
"""
INSERT INTO tables (database_name, table_name, rootpage, sql)
INSERT INTO catalog_tables (database_name, table_name, rootpage, sql)
values (?, ?, ?, ?)
""",
tables_to_insert,
)
await internal_db.execute_write_many(
"""
INSERT INTO columns (
INSERT INTO catalog_columns (
database_name, table_name, cid, name, type, "notnull", default_value, is_pk, hidden
) VALUES (
:database_name, :table_name, :cid, :name, :type, :notnull, :default_value, :is_pk, :hidden
@ -150,7 +157,7 @@ async def populate_schema_tables(internal_db, db):
)
await internal_db.execute_write_many(
"""
INSERT INTO foreign_keys (
INSERT INTO catalog_foreign_keys (
database_name, table_name, "id", seq, "table", "from", "to", on_update, on_delete, match
) VALUES (
:database_name, :table_name, :id, :seq, :table, :from, :to, :on_update, :on_delete, :match
@ -160,7 +167,7 @@ async def populate_schema_tables(internal_db, db):
)
await internal_db.execute_write_many(
"""
INSERT INTO indexes (
INSERT INTO catalog_indexes (
database_name, table_name, seq, name, "unique", origin, partial
) VALUES (
:database_name, :table_name, :seq, :name, :unique, :origin, :partial
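Aside (not part of the diff): with the catalog_ prefix, the internal schema can be queried like any other database. A hedged sketch; get_internal_database() is assumed to be the accessor, per the 1.0 alpha internals docs:
async def list_tables(datasette):
    internal_db = datasette.get_internal_database()  # accessor name is an assumption
    result = await internal_db.execute(
        "select database_name, table_name from catalog_tables order by 1, 2"
    )
    return result.rows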

View file

@ -4,6 +4,7 @@ Backported from Python 3.8.
This code is licensed under the Python License:
https://github.com/python/cpython/blob/v3.8.3/LICENSE
"""
import os
from shutil import copy, copy2, copystat, Error

View file

@ -16,6 +16,11 @@ class TestResponse:
def status(self):
return self.httpx_response.status_code
# Supports both for test-writing convenience
@property
def status_code(self):
return self.status
@property
def headers(self):
return self.httpx_response.headers
@ -24,17 +29,14 @@ class TestResponse:
def body(self):
return self.httpx_response.content
@property
def content(self):
return self.body
@property
def cookies(self):
return dict(self.httpx_response.cookies)
def cookie_was_deleted(self, cookie):
return any(
h
for h in self.httpx_response.headers.get_list("set-cookie")
if h.startswith(f'{cookie}="";')
)
@property
def json(self):
return json.loads(self.text)
@ -62,6 +64,7 @@ class TestClient:
method="GET",
cookies=None,
if_none_match=None,
headers=None,
):
return await self._request(
path=path,
@ -70,6 +73,7 @@ class TestClient:
method=method,
cookies=cookies,
if_none_match=if_none_match,
headers=headers,
)
@async_to_sync

View file

@ -1,2 +1,2 @@
__version__ = "0.63.1"
__version__ = "1.0a13"
__version_info__ = tuple(__version__.split("."))

View file

@ -0,0 +1,3 @@
class Context:
"Base class for all documented contexts"
pass

View file

@ -10,7 +10,6 @@ from markupsafe import escape
import pint
from datasette import __version__
from datasette.database import QueryInterrupted
from datasette.utils.asgi import Request
from datasette.utils import (
@ -20,7 +19,6 @@ from datasette.utils import (
InvalidSql,
LimitedWriter,
call_with_supported_arguments,
tilde_decode,
path_from_row_pks,
path_with_added_args,
path_with_removed_args,
@ -54,6 +52,43 @@ class DatasetteError(Exception):
self.message_is_html = message_is_html
class View:
async def head(self, request, datasette):
if not hasattr(self, "get"):
return await self.method_not_allowed(request)
response = await self.get(request, datasette)
response.body = ""
return response
async def method_not_allowed(self, request):
if (
request.path.endswith(".json")
or request.headers.get("content-type") == "application/json"
):
response = Response.json(
{"ok": False, "error": "Method not allowed"}, status=405
)
else:
response = Response.text("Method not allowed", status=405)
return response
async def options(self, request, datasette):
response = Response.text("ok")
response.headers["allow"] = ", ".join(
method.upper()
for method in ("head", "get", "post", "put", "patch", "delete")
if hasattr(self, method)
)
return response
async def __call__(self, request, datasette):
try:
handler = getattr(self, request.method.lower())
except AttributeError:
return await self.method_not_allowed(request)
return await handler(request, datasette)
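Sketch of a subclass (import path as defined in this file; Response is the documented top-level export). Only get() is written; head(), options() and 405 handling come from the base class:
from datasette import Response
from datasette.views.base import View

class HelloView(View):
    async def get(self, request, datasette):
        return Response.json({"hello": request.args.get("name") or "world"})

# GET  -> JSON body
# HEAD -> same response with an empty body
# PUT  -> 405, returned as JSON when the client asks for it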
class BaseView:
ds = None
has_json_alternate = True
@ -66,37 +101,52 @@ class BaseView:
response.body = b""
return response
def database_color(self, database):
return "ff0000"
async def method_not_allowed(self, request):
if (
request.path.endswith(".json")
or request.headers.get("content-type") == "application/json"
):
response = Response.json(
{"ok": False, "error": "Method not allowed"}, status=405
)
else:
response = Response.text("Method not allowed", status=405)
return response
async def options(self, request, *args, **kwargs):
return Response.text("Method not allowed", status=405)
return Response.text("ok")
async def get(self, request, *args, **kwargs):
return await self.method_not_allowed(request)
async def post(self, request, *args, **kwargs):
return Response.text("Method not allowed", status=405)
return await self.method_not_allowed(request)
async def put(self, request, *args, **kwargs):
return Response.text("Method not allowed", status=405)
return await self.method_not_allowed(request)
async def patch(self, request, *args, **kwargs):
return Response.text("Method not allowed", status=405)
return await self.method_not_allowed(request)
async def delete(self, request, *args, **kwargs):
return Response.text("Method not allowed", status=405)
return await self.method_not_allowed(request)
async def dispatch_request(self, request):
if self.ds:
await self.ds.refresh_schemas()
handler = getattr(self, request.method.lower(), None)
return await handler(request)
response = await handler(request)
if self.ds.cors:
add_cors_headers(response.headers)
return response
async def render(self, templates, request, context=None):
context = context or {}
template = self.ds.jinja_env.select_template(templates)
environment = self.ds.get_jinja_environment(request)
template = environment.select_template(templates)
template_context = {
**context,
**{
"database_color": self.database_color,
"select_templates": [
f"{'*' if template_name == template.name else ''}{template_name}"
for template_name in templates
@ -143,12 +193,6 @@ class BaseView:
class DataView(BaseView):
name = ""
async def options(self, request, *args, **kwargs):
r = Response.text("ok")
if self.ds.cors:
add_cors_headers(r.headers)
return r
def redirect(self, request, path, forward_querystring=True, remove_args=None):
if request.query_string and "?" not in path and forward_querystring:
path = f"{path}?{request.query_string}"
@ -163,185 +207,13 @@ class DataView(BaseView):
async def data(self, request):
raise NotImplementedError
def get_templates(self, database, table=None):
assert NotImplemented
async def as_csv(self, request, database):
kwargs = {}
stream = request.args.get("_stream")
# Do not calculate facets or counts:
extra_parameters = [
"{}=1".format(key)
for key in ("_nofacet", "_nocount")
if not request.args.get(key)
]
if extra_parameters:
# Replace request object with a new one with modified scope
if not request.query_string:
new_query_string = "&".join(extra_parameters)
else:
new_query_string = (
request.query_string + "&" + "&".join(extra_parameters)
)
new_scope = dict(
request.scope, query_string=new_query_string.encode("latin-1")
)
receive = request.receive
request = Request(new_scope, receive)
if stream:
# Some quick soundness checks
if not self.ds.setting("allow_csv_stream"):
raise BadRequest("CSV streaming is disabled")
if request.args.get("_next"):
raise BadRequest("_next not allowed for CSV streaming")
kwargs["_size"] = "max"
# Fetch the first page
try:
response_or_template_contexts = await self.data(request)
if isinstance(response_or_template_contexts, Response):
return response_or_template_contexts
elif len(response_or_template_contexts) == 4:
data, _, _, _ = response_or_template_contexts
else:
data, _, _ = response_or_template_contexts
except (sqlite3.OperationalError, InvalidSql) as e:
raise DatasetteError(str(e), title="Invalid SQL", status=400)
except sqlite3.OperationalError as e:
raise DatasetteError(str(e))
except DatasetteError:
raise
# Convert rows and columns to CSV
headings = data["columns"]
# if there are expanded_columns we need to add additional headings
expanded_columns = set(data.get("expanded_columns") or [])
if expanded_columns:
headings = []
for column in data["columns"]:
headings.append(column)
if column in expanded_columns:
headings.append(f"{column}_label")
content_type = "text/plain; charset=utf-8"
preamble = ""
postamble = ""
trace = request.args.get("_trace")
if trace:
content_type = "text/html; charset=utf-8"
preamble = (
"<html><head><title>CSV debug</title></head>"
'<body><textarea style="width: 90%; height: 70vh">'
)
postamble = "</textarea></body></html>"
async def stream_fn(r):
nonlocal data, trace
limited_writer = LimitedWriter(r, self.ds.setting("max_csv_mb"))
if trace:
await limited_writer.write(preamble)
writer = csv.writer(EscapeHtmlWriter(limited_writer))
else:
writer = csv.writer(limited_writer)
first = True
next = None
while first or (next and stream):
try:
kwargs = {}
if next:
kwargs["_next"] = next
if not first:
data, _, _ = await self.data(request, **kwargs)
if first:
if request.args.get("_header") != "off":
await writer.writerow(headings)
first = False
next = data.get("next")
for row in data["rows"]:
if any(isinstance(r, bytes) for r in row):
new_row = []
for column, cell in zip(headings, row):
if isinstance(cell, bytes):
# If this is a table page, use .urls.row_blob()
if data.get("table"):
pks = data.get("primary_keys") or []
cell = self.ds.absolute_url(
request,
self.ds.urls.row_blob(
database,
data["table"],
path_from_row_pks(row, pks, not pks),
column,
),
)
else:
# Otherwise generate URL for this query
url = self.ds.absolute_url(
request,
path_with_format(
request=request,
format="blob",
extra_qs={
"_blob_column": column,
"_blob_hash": hashlib.sha256(
cell
).hexdigest(),
},
replace_format="csv",
),
)
cell = url.replace("&_nocount=1", "").replace(
"&_nofacet=1", ""
)
new_row.append(cell)
row = new_row
if not expanded_columns:
# Simple path
await writer.writerow(row)
else:
# Look for {"value": ..., "label": ...} dicts and expand
new_row = []
for heading, cell in zip(data["columns"], row):
if heading in expanded_columns:
if cell is None:
new_row.extend(("", ""))
else:
assert isinstance(cell, dict)
new_row.append(cell["value"])
new_row.append(cell["label"])
else:
new_row.append(cell)
await writer.writerow(new_row)
except Exception as e:
sys.stderr.write("Caught this error: {}\n".format(e))
sys.stderr.flush()
await r.write(str(e))
return
await limited_writer.write(postamble)
headers = {}
if self.ds.cors:
add_cors_headers(headers)
if request.args.get("_dl", None):
if not trace:
content_type = "text/csv; charset=utf-8"
disposition = 'attachment; filename="{}.csv"'.format(
request.url_vars.get("table", database)
)
headers["content-disposition"] = disposition
return AsgiStream(stream_fn, headers=headers, content_type=content_type)
return await stream_csv(self.ds, self.data, request, database)
async def get(self, request):
database_route = tilde_decode(request.url_vars["database"])
try:
db = self.ds.get_database(route=database_route)
except KeyError:
raise NotFound("Database not found: {}".format(database_route))
db = await self.ds.resolve_database(request)
database = db.name
database_route = db.route
_format = request.url_vars["format"]
data_kwargs = {}
@@ -433,6 +305,8 @@ class DataView(BaseView):
table=data.get("table"),
request=request,
view_name=self.name,
truncated=False, # TODO: support this
error=data.get("error"),
# These will be deprecated in Datasette 1.0:
args=request.args,
data=data,
@@ -507,12 +381,11 @@ class DataView(BaseView):
if key not in ("_labels", "_facet", "_size")
]
+ [("_size", "max")],
"datasette_version": __version__,
"settings": self.ds.settings_dict(),
},
}
if "metadata" not in context:
context["metadata"] = self.ds.metadata
context["metadata"] = self.ds.metadata()
r = await self.render(templates, request=request, context=context)
if status_code is not None:
r.status = status_code
@@ -536,3 +409,174 @@ class DataView(BaseView):
if self.ds.cors:
add_cors_headers(response.headers)
return response
def _error(messages, status=400):
return Response.json({"ok": False, "errors": messages}, status=status)
async def stream_csv(datasette, fetch_data, request, database):
kwargs = {}
stream = request.args.get("_stream")
# Do not calculate facets or counts:
extra_parameters = [
"{}=1".format(key)
for key in ("_nofacet", "_nocount")
if not request.args.get(key)
]
if extra_parameters:
# Replace request object with a new one with modified scope
if not request.query_string:
new_query_string = "&".join(extra_parameters)
else:
new_query_string = request.query_string + "&" + "&".join(extra_parameters)
new_scope = dict(request.scope, query_string=new_query_string.encode("latin-1"))
receive = request.receive
request = Request(new_scope, receive)
if stream:
# Some quick soundness checks
if not datasette.setting("allow_csv_stream"):
raise BadRequest("CSV streaming is disabled")
if request.args.get("_next"):
raise BadRequest("_next not allowed for CSV streaming")
kwargs["_size"] = "max"
# Fetch the first page
try:
response_or_template_contexts = await fetch_data(request)
if isinstance(response_or_template_contexts, Response):
return response_or_template_contexts
elif len(response_or_template_contexts) == 4:
data, _, _, _ = response_or_template_contexts
else:
data, _, _ = response_or_template_contexts
except (sqlite3.OperationalError, InvalidSql) as e:
raise DatasetteError(str(e), title="Invalid SQL", status=400)
except DatasetteError:
raise
# Convert rows and columns to CSV
headings = data["columns"]
# if there are expanded_columns we need to add additional headings
expanded_columns = set(data.get("expanded_columns") or [])
if expanded_columns:
headings = []
for column in data["columns"]:
headings.append(column)
if column in expanded_columns:
headings.append(f"{column}_label")
content_type = "text/plain; charset=utf-8"
preamble = ""
postamble = ""
trace = request.args.get("_trace")
if trace:
content_type = "text/html; charset=utf-8"
preamble = (
"<html><head><title>CSV debug</title></head>"
'<body><textarea style="width: 90%; height: 70vh">'
)
postamble = "</textarea></body></html>"
async def stream_fn(r):
nonlocal data, trace
limited_writer = LimitedWriter(r, datasette.setting("max_csv_mb"))
if trace:
await limited_writer.write(preamble)
writer = csv.writer(EscapeHtmlWriter(limited_writer))
else:
writer = csv.writer(limited_writer)
first = True
next = None
while first or (next and stream):
try:
kwargs = {}
if next:
kwargs["_next"] = next
if not first:
data, _, _ = await fetch_data(request, **kwargs)
if first:
if request.args.get("_header") != "off":
await writer.writerow(headings)
first = False
next = data.get("next")
for row in data["rows"]:
if any(isinstance(r, bytes) for r in row):
new_row = []
for column, cell in zip(headings, row):
if isinstance(cell, bytes):
# If this is a table page, use .urls.row_blob()
if data.get("table"):
pks = data.get("primary_keys") or []
cell = datasette.absolute_url(
request,
datasette.urls.row_blob(
database,
data["table"],
path_from_row_pks(row, pks, not pks),
column,
),
)
else:
# Otherwise generate URL for this query
url = datasette.absolute_url(
request,
path_with_format(
request=request,
format="blob",
extra_qs={
"_blob_column": column,
"_blob_hash": hashlib.sha256(
cell
).hexdigest(),
},
replace_format="csv",
),
)
cell = url.replace("&_nocount=1", "").replace(
"&_nofacet=1", ""
)
new_row.append(cell)
row = new_row
if not expanded_columns:
# Simple path
await writer.writerow(row)
else:
# Look for {"value": ..., "label": ...} dicts and expand
new_row = []
for heading, cell in zip(data["columns"], row):
if heading in expanded_columns:
if cell is None:
new_row.extend(("", ""))
else:
if not isinstance(cell, dict):
new_row.extend((cell, ""))
else:
new_row.append(cell["value"])
new_row.append(cell["label"])
else:
new_row.append(cell)
await writer.writerow(new_row)
except Exception as ex:
sys.stderr.write("Caught this error: {}\n".format(ex))
sys.stderr.flush()
await r.write(str(ex))
return
await limited_writer.write(postamble)
headers = {}
if datasette.cors:
add_cors_headers(headers)
if request.args.get("_dl", None):
if not trace:
content_type = "text/csv; charset=utf-8"
disposition = 'attachment; filename="{}.csv"'.format(
request.url_vars.get("table", database)
)
headers["content-disposition"] = disposition
return AsgiStream(stream_fn, headers=headers, content_type=content_type)
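A minimal sketch of exercising this streaming path end-to-end via the documented datasette.client interface (a local fixtures.db with a facetable table is assumed):

import asyncio

from datasette.app import Datasette


async def demo():
    ds = Datasette(["fixtures.db"])
    await ds.invoke_startup()
    # _stream=on routes the request through stream_csv(); _size is
    # forced to "max" so pagination continues until every row is written
    response = await ds.client.get("/fixtures/facetable.csv?_stream=on")
    print(response.text[:200])


asyncio.run(demo())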

View file

@@ -1,7 +1,12 @@
import hashlib
import json
from datasette.utils import add_cors_headers, CustomJSONEncoder
from datasette.plugins import pm
from datasette.utils import (
add_cors_headers,
await_me_maybe,
make_slot_function,
CustomJSONEncoder,
)
from datasette.utils.asgi import Response
from datasette.version import __version__
@@ -105,9 +110,7 @@ class IndexView(BaseView):
{
"name": name,
"hash": db.hash,
"color": db.hash[:6]
if db.hash
else hashlib.md5(name.encode("utf8")).hexdigest()[:6],
"color": db.color,
"path": self.ds.urls.database(name),
"tables_and_views_truncated": tables_and_views_truncated,
"tables_and_views_more": (len(visible_tables) + len(views))
@@ -134,6 +137,15 @@ class IndexView(BaseView):
headers=headers,
)
else:
homepage_actions = []
for hook in pm.hook.homepage_actions(
datasette=self.ds,
actor=request.actor,
request=request,
):
extra_links = await await_me_maybe(hook)
if extra_links:
homepage_actions.extend(extra_links)
return await self.render(
["index.html"],
request=request,
@@ -142,7 +154,11 @@ class IndexView(BaseView):
"metadata": self.ds.metadata(),
"datasette_version": __version__,
"private": not await self.ds.permission_allowed(
None, "view-instance", default=True
None, "view-instance"
),
"top_homepage": make_slot_function(
"top_homepage", self.ds, request
),
"homepage_actions": homepage_actions,
},
)
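A minimal sketch of a plugin feeding the homepage_actions loop above; the hook arguments mirror the call site, while the URL, label and description are invented for illustration:

from datasette import hookimpl


@hookimpl
def homepage_actions(datasette, actor, request):
    # Returning None contributes nothing; a list of dicts adds menu items
    if actor:
        return [
            {
                "href": datasette.urls.path("/-/welcome"),
                "label": "Welcome checklist",
                "description": "Illustrative item shown only to signed-in actors",
            }
        ]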

View file

@@ -1,26 +1,28 @@
from datasette.utils.asgi import NotFound, Forbidden
from datasette.utils.asgi import NotFound, Forbidden, Response
from datasette.database import QueryInterrupted
from .base import DataView
from datasette.events import UpdateRowEvent, DeleteRowEvent
from .base import DataView, BaseView, _error
from datasette.utils import (
tilde_decode,
urlsafe_components,
await_me_maybe,
make_slot_function,
to_css_class,
escape_sqlite,
)
from .table import _sql_params_pks, display_columns_and_rows
from datasette.plugins import pm
import json
import sqlite_utils
from .table import display_columns_and_rows
class RowView(DataView):
name = "row"
async def data(self, request, default_labels=False):
database_route = tilde_decode(request.url_vars["database"])
table = tilde_decode(request.url_vars["table"])
try:
db = self.ds.get_database(route=database_route)
except KeyError:
raise NotFound("Database not found: {}".format(database_route))
resolved = await self.ds.resolve_row(request)
db = resolved.db
database = db.name
table = resolved.table
pk_values = resolved.pk_values
# Ensure user has permission to view this row
visible, private = await self.ds.check_visibility(
@@ -34,14 +36,9 @@ class RowView(DataView):
if not visible:
raise Forbidden("You do not have permission to view this table")
pk_values = urlsafe_components(request.url_vars["pks"])
try:
db = self.ds.get_database(route=database_route)
except KeyError:
raise NotFound("Database not found: {}".format(database_route))
database = db.name
sql, params, pks = await _sql_params_pks(db, table, pk_values)
results = await db.execute(sql, params, truncate=True)
results = await resolved.db.execute(
resolved.sql, resolved.params, truncate=True
)
columns = [r[0] for r in results.description]
rows = list(results.rows)
if not rows:
@@ -56,14 +53,30 @@ class RowView(DataView):
rows,
link_column=False,
truncate_cells=0,
request=request,
)
for column in display_columns:
column["sortable"] = False
row_actions = []
for hook in pm.hook.row_actions(
datasette=self.ds,
actor=request.actor,
request=request,
database=database,
table=table,
row=rows[0],
):
extra_links = await await_me_maybe(hook)
if extra_links:
row_actions.extend(extra_links)
return {
"private": private,
"foreign_key_tables": await self.foreign_key_tables(
database, table, pk_values
),
"database_color": db.color,
"display_columns": display_columns,
"display_rows": display_rows,
"custom_table_templates": [
@@ -71,10 +84,19 @@ class RowView(DataView):
f"_table-row-{to_css_class(database)}-{to_css_class(table)}.html",
"_table.html",
],
"row_actions": row_actions,
"metadata": (self.ds.metadata("databases") or {})
.get(database, {})
.get("tables", {})
.get(table, {}),
"top_row": make_slot_function(
"top_row",
self.ds,
request,
database=resolved.db.name,
table=resolved.table,
row=rows[0],
),
}
data = {
@@ -82,9 +104,9 @@ class RowView(DataView):
"table": table,
"rows": rows,
"columns": columns,
"primary_keys": pks,
"primary_keys": resolved.pks,
"primary_key_values": pk_values,
"units": self.ds.table_metadata(database, table).get("units", {}),
"units": (await self.ds.table_config(database, table)).get("units", {}),
}
if "foreign_key_tables" in (request.args.get("_extras") or "").split(","):
@@ -146,3 +168,129 @@ class RowView(DataView):
)
foreign_key_tables.append({**fk, **{"count": count, "link": link}})
return foreign_key_tables
class RowError(Exception):
def __init__(self, error):
self.error = error
async def _resolve_row_and_check_permission(datasette, request, permission):
from datasette.app import DatabaseNotFound, TableNotFound, RowNotFound
try:
resolved = await datasette.resolve_row(request)
except DatabaseNotFound as e:
return False, _error(["Database not found: {}".format(e.database_name)], 404)
except TableNotFound as e:
return False, _error(["Table not found: {}".format(e.table)], 404)
except RowNotFound as e:
return False, _error(["Record not found: {}".format(e.pk_values)], 404)
# Ensure the actor has permission to perform this action on the row
if not await datasette.permission_allowed(
request.actor, permission, resource=(resolved.db.name, resolved.table)
):
return False, _error(["Permission denied"], 403)
return True, resolved
class RowDeleteView(BaseView):
name = "row-delete"
def __init__(self, datasette):
self.ds = datasette
async def post(self, request):
ok, resolved = await _resolve_row_and_check_permission(
self.ds, request, "delete-row"
)
if not ok:
return resolved
# Delete the row
def delete_row(conn):
sqlite_utils.Database(conn)[resolved.table].delete(resolved.pk_values)
try:
await resolved.db.execute_write_fn(delete_row)
except Exception as e:
return _error([str(e)], 500)
await self.ds.track_event(
DeleteRowEvent(
actor=request.actor,
database=resolved.db.name,
table=resolved.table,
pks=resolved.pk_values,
)
)
return Response.json({"ok": True}, status=200)
class RowUpdateView(BaseView):
name = "row-update"
def __init__(self, datasette):
self.ds = datasette
async def post(self, request):
ok, resolved = await _resolve_row_and_check_permission(
self.ds, request, "update-row"
)
if not ok:
return resolved
body = await request.post_body()
try:
data = json.loads(body)
except json.JSONDecodeError as e:
return _error(["Invalid JSON: {}".format(e)])
if not isinstance(data, dict):
return _error(["JSON must be a dictionary"])
if not "update" in data or not isinstance(data["update"], dict):
return _error(["JSON must contain an update dictionary"])
invalid_keys = set(data.keys()) - {"update", "return", "alter"}
if invalid_keys:
return _error(["Invalid keys: {}".format(", ".join(invalid_keys))])
update = data["update"]
alter = data.get("alter")
if alter and not await self.ds.permission_allowed(
request.actor, "alter-table", resource=(resolved.db.name, resolved.table)
):
return _error(["Permission denied for alter-table"], 403)
def update_row(conn):
sqlite_utils.Database(conn)[resolved.table].update(
resolved.pk_values, update, alter=alter
)
try:
await resolved.db.execute_write_fn(update_row)
except Exception as e:
return _error([str(e)], 400)
result = {"ok": True}
if data.get("return"):
results = await resolved.db.execute(
resolved.sql, resolved.params, truncate=True
)
rows = list(results.rows)
result["row"] = dict(rows[0])
await self.ds.track_event(
UpdateRowEvent(
actor=request.actor,
database=resolved.db.name,
table=resolved.table,
pks=resolved.pk_values,
)
)
return Response.json(result, status=200)
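A minimal sketch of calling the update endpoint above, signed in as root via the documented datasette.client.actor_cookie() helper; the data.db database, docs table and row 1 are assumed:

import asyncio

from datasette.app import Datasette


async def demo():
    ds = Datasette(["data.db"])
    await ds.invoke_startup()
    response = await ds.client.post(
        "/data/docs/1/-/update",
        json={"update": {"title": "New title"}, "return": True},
        cookies={"ds_actor": ds.client.actor_cookie({"id": "root"})},
    )
    print(response.json())  # {"ok": True, "row": {...}} on success


asyncio.run(demo())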

View file

@@ -1,22 +1,38 @@
import json
from datasette.events import LogoutEvent, LoginEvent, CreateTokenEvent
from datasette.utils.asgi import Response, Forbidden
from datasette.utils import actor_matches_allow, add_cors_headers
from .base import BaseView
from datasette.utils import (
actor_matches_allow,
add_cors_headers,
tilde_encode,
tilde_decode,
)
from .base import BaseView, View
import secrets
import urllib
class JsonDataView(BaseView):
name = "json_data"
def __init__(self, datasette, filename, data_callback, needs_request=False):
def __init__(
self,
datasette,
filename,
data_callback,
needs_request=False,
permission="view-instance",
):
self.ds = datasette
self.filename = filename
self.data_callback = data_callback
self.needs_request = needs_request
self.permission = permission
async def get(self, request):
as_format = request.url_vars["format"]
await self.ds.ensure_permissions(request.actor, ["view-instance"])
if self.permission:
await self.ds.ensure_permissions(request.actor, [self.permission])
if self.needs_request:
data = self.data_callback(request)
else:
@@ -26,7 +42,7 @@ class JsonDataView(BaseView):
if self.ds.cors:
add_cors_headers(headers)
return Response(
json.dumps(data),
json.dumps(data, default=repr),
content_type="application/json; charset=utf-8",
headers=headers,
)
@@ -37,18 +53,21 @@ class JsonDataView(BaseView):
request=request,
context={
"filename": self.filename,
"data_json": json.dumps(data, indent=4),
"data_json": json.dumps(data, indent=4, default=repr),
},
)
class PatternPortfolioView(BaseView):
name = "patterns"
has_json_alternate = False
async def get(self, request):
await self.ds.ensure_permissions(request.actor, ["view-instance"])
return await self.render(["patterns.html"], request=request)
class PatternPortfolioView(View):
async def get(self, request, datasette):
await datasette.ensure_permissions(request.actor, ["view-instance"])
return Response.html(
await datasette.render_template(
"patterns.html",
request=request,
view_name="patterns",
)
)
class AuthTokenView(BaseView):
@@ -62,9 +81,9 @@ class AuthTokenView(BaseView):
if secrets.compare_digest(token, self.ds._root_token):
self.ds._root_token = None
response = Response.redirect(self.ds.urls.instance())
response.set_cookie(
"ds_actor", self.ds.sign({"a": {"id": "root"}}, "actor")
)
root_actor = {"id": "root"}
response.set_cookie("ds_actor", self.ds.sign({"a": root_actor}, "actor"))
await self.ds.track_event(LoginEvent(actor=root_actor))
return response
else:
raise Forbidden("Invalid token")
@@ -87,6 +106,7 @@ class LogoutView(BaseView):
response = Response.redirect(self.ds.urls.instance())
response.set_cookie("ds_actor", "", expires=0, max_age=0)
self.ds.add_message(request, "You are now logged out", self.ds.WARNING)
await self.ds.track_event(LogoutEvent(actor=request.actor))
return response
@@ -102,7 +122,50 @@ class PermissionsDebugView(BaseView):
["permissions_debug.html"],
request,
# list() avoids error if check is performed during template render:
{"permission_checks": list(reversed(self.ds._permission_checks))},
{
"permission_checks": list(reversed(self.ds._permission_checks)),
"permissions": [
{
"name": p.name,
"abbr": p.abbr,
"description": p.description,
"takes_database": p.takes_database,
"takes_resource": p.takes_resource,
"default": p.default,
}
for p in self.ds.permissions.values()
],
},
)
async def post(self, request):
await self.ds.ensure_permissions(request.actor, ["view-instance"])
if not await self.ds.permission_allowed(request.actor, "permissions-debug"):
raise Forbidden("Permission denied")
vars = await request.post_vars()
actor = json.loads(vars["actor"])
permission = vars["permission"]
resource_1 = vars["resource_1"]
resource_2 = vars["resource_2"]
resource = []
if resource_1:
resource.append(resource_1)
if resource_2:
resource.append(resource_2)
resource = tuple(resource)
if len(resource) == 1:
resource = resource[0]
result = await self.ds.permission_allowed(
actor, permission, resource, default="USE_DEFAULT"
)
return Response.json(
{
"actor": actor,
"permission": permission,
"resource": resource,
"result": result,
"default": self.ds.permissions[permission].default,
}
)
@@ -163,3 +226,290 @@ class MessagesDebugView(BaseView):
else:
datasette.add_message(request, message, getattr(datasette, message_type))
return Response.redirect(self.ds.urls.instance())
class CreateTokenView(BaseView):
name = "create_token"
has_json_alternate = False
def check_permission(self, request):
if not self.ds.setting("allow_signed_tokens"):
raise Forbidden("Signed tokens are not enabled for this Datasette instance")
if not request.actor:
raise Forbidden("You must be logged in to create a token")
if not request.actor.get("id"):
raise Forbidden(
"You must be logged in as an actor with an ID to create a token"
)
if request.actor.get("token"):
raise Forbidden(
"Token authentication cannot be used to create additional tokens"
)
async def shared(self, request):
self.check_permission(request)
# Build list of databases and tables the user has permission to view
database_with_tables = []
for database in self.ds.databases.values():
if database.name == "_memory":
continue
if not await self.ds.permission_allowed(
request.actor, "view-database", database.name
):
continue
hidden_tables = await database.hidden_table_names()
tables = []
for table in await database.table_names():
if table in hidden_tables:
continue
if not await self.ds.permission_allowed(
request.actor,
"view-table",
resource=(database.name, table),
):
continue
tables.append({"name": table, "encoded": tilde_encode(table)})
database_with_tables.append(
{
"name": database.name,
"encoded": tilde_encode(database.name),
"tables": tables,
}
)
return {
"actor": request.actor,
"all_permissions": self.ds.permissions.keys(),
"database_permissions": [
key
for key, value in self.ds.permissions.items()
if value.takes_database
],
"resource_permissions": [
key
for key, value in self.ds.permissions.items()
if value.takes_resource
],
"database_with_tables": database_with_tables,
}
async def get(self, request):
self.check_permission(request)
return await self.render(
["create_token.html"], request, await self.shared(request)
)
async def post(self, request):
self.check_permission(request)
post = await request.post_vars()
errors = []
expires_after = None
if post.get("expire_type"):
duration_string = post.get("expire_duration")
if (
not duration_string
or not duration_string.isdigit()
or not int(duration_string) > 0
):
errors.append("Invalid expire duration")
else:
unit = post["expire_type"]
if unit == "minutes":
expires_after = int(duration_string) * 60
elif unit == "hours":
expires_after = int(duration_string) * 60 * 60
elif unit == "days":
expires_after = int(duration_string) * 60 * 60 * 24
else:
errors.append("Invalid expire duration unit")
# Are there any restrictions?
restrict_all = []
restrict_database = {}
restrict_resource = {}
for key in post:
if key.startswith("all:") and key.count(":") == 1:
restrict_all.append(key.split(":")[1])
elif key.startswith("database:") and key.count(":") == 2:
bits = key.split(":")
database = tilde_decode(bits[1])
action = bits[2]
restrict_database.setdefault(database, []).append(action)
elif key.startswith("resource:") and key.count(":") == 3:
bits = key.split(":")
database = tilde_decode(bits[1])
resource = tilde_decode(bits[2])
action = bits[3]
restrict_resource.setdefault(database, {}).setdefault(
resource, []
).append(action)
token = self.ds.create_token(
request.actor["id"],
expires_after=expires_after,
restrict_all=restrict_all,
restrict_database=restrict_database,
restrict_resource=restrict_resource,
)
token_bits = self.ds.unsign(token[len("dstok_") :], namespace="token")
await self.ds.track_event(
CreateTokenEvent(
actor=request.actor,
expires_after=expires_after,
restrict_all=restrict_all,
restrict_database=restrict_database,
restrict_resource=restrict_resource,
)
)
context = await self.shared(request)
context.update({"errors": errors, "token": token, "token_bits": token_bits})
return await self.render(["create_token.html"], request, context)
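A minimal sketch of the datasette.create_token() call this view builds up to, restricting a token to a single action on one table (the database and table names are invented):

token = datasette.create_token(
    "root",
    expires_after=3600,  # seconds
    restrict_resource={"mydb": {"mytable": ["view-table"]}},
)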
class ApiExplorerView(BaseView):
name = "api_explorer"
has_json_alternate = False
async def example_links(self, request):
databases = []
for name, db in self.ds.databases.items():
if name == "_internal":
continue
database_visible, _ = await self.ds.check_visibility(
request.actor, permissions=[("view-database", name), "view-instance"]
)
if not database_visible:
continue
tables = []
table_names = await db.table_names()
for table in table_names:
visible, _ = await self.ds.check_visibility(
request.actor,
permissions=[
("view-table", (name, table)),
("view-database", name),
"view-instance",
],
)
if not visible:
continue
table_links = []
tables.append({"name": table, "links": table_links})
table_links.append(
{
"label": "Get rows for {}".format(table),
"method": "GET",
"path": self.ds.urls.table(name, table, format="json"),
}
)
# If not mutable don't show any write APIs
if not db.is_mutable:
continue
if await self.ds.permission_allowed(
request.actor, "insert-row", (name, table)
):
pks = await db.primary_keys(table)
table_links.extend(
[
{
"path": self.ds.urls.table(name, table) + "/-/insert",
"method": "POST",
"label": "Insert rows into {}".format(table),
"json": {
"rows": [
{
column: None
for column in await db.table_columns(table)
if column not in pks
}
]
},
},
{
"path": self.ds.urls.table(name, table) + "/-/upsert",
"method": "POST",
"label": "Upsert rows into {}".format(table),
"json": {
"rows": [
{
column: None
for column in await db.table_columns(table)
if column not in pks
}
]
},
},
]
)
if await self.ds.permission_allowed(
request.actor, "drop-table", (name, table)
):
table_links.append(
{
"path": self.ds.urls.table(name, table) + "/-/drop",
"label": "Drop table {}".format(table),
"json": {"confirm": False},
"method": "POST",
}
)
database_links = []
if (
await self.ds.permission_allowed(request.actor, "create-table", name)
and db.is_mutable
):
database_links.append(
{
"path": self.ds.urls.database(name) + "/-/create",
"label": "Create table in {}".format(name),
"json": {
"table": "new_table",
"columns": [
{"name": "id", "type": "integer"},
{"name": "name", "type": "text"},
],
"pk": "id",
},
"method": "POST",
}
)
if database_links or tables:
databases.append(
{
"name": name,
"links": database_links,
"tables": tables,
}
)
# Sort so that mutable databases are first
databases.sort(key=lambda d: not self.ds.databases[d["name"]].is_mutable)
return databases
async def get(self, request):
visible, private = await self.ds.check_visibility(
request.actor,
permissions=["view-instance"],
)
if not visible:
raise Forbidden("You do not have permission to view this instance")
def api_path(link):
return "/-/api#{}".format(
urllib.parse.urlencode(
{
key: json.dumps(value, indent=2) if key == "json" else value
for key, value in link.items()
if key in ("path", "method", "json")
}
)
)
return await self.render(
["api_explorer.html"],
request,
{
"example_links": await self.example_links(request),
"api_path": api_path,
"private": private,
},
)

Diff is too large to display.

View file

@@ -0,0 +1,21 @@
from datasette import hookimpl
# Test command:
# datasette fixtures.db --plugins-dir=demos/plugins/ \
#   --static static:demos/plugins/static
# Create a set with view names that qualify for this JS, since plugins won't do anything on other pages
# Same pattern as in Nteract data explorer
# https://github.com/hydrosquall/datasette-nteract-data-explorer/blob/main/datasette_nteract_data_explorer/__init__.py#L77
PERMITTED_VIEWS = {"table", "query", "database"}
@hookimpl
def extra_js_urls(view_name):
print(view_name)
if view_name in PERMITTED_VIEWS:
return [
{
"url": "/static/table-example-plugins.js",
}
]

View file

@@ -0,0 +1,100 @@
/**
* Example usage of Datasette JS Manager API
*/
document.addEventListener("datasette_init", function (evt) {
const { detail: manager } = evt;
// === Demo plugins: remove before merge ===
addPlugins(manager);
});
/**
* Examples to test the Datasette JS API
*/
const addPlugins = (manager) => {
manager.registerPlugin("column-name-plugin", {
version: 0.1,
makeColumnActions: (columnMeta) => {
const { column } = columnMeta;
return [
{
label: "Copy name to clipboard",
onClick: (evt) => copyToClipboard(column),
},
{
label: "Log column metadata to console",
onClick: (evt) => console.log(column),
},
];
},
});
manager.registerPlugin("panel-plugin-graphs", {
version: 0.1,
makeAboveTablePanelConfigs: () => {
return [
{
id: 'first-panel',
label: "First",
render: node => {
const description = document.createElement('p');
description.innerText = 'Hello world';
node.appendChild(description);
}
},
{
id: 'second-panel',
label: "Second",
render: node => {
const iframe = document.createElement('iframe');
iframe.src = "https://observablehq.com/embed/@d3/sortable-bar-chart?cell=viewof+order&cell=chart";
iframe.width = 800;
iframe.height = 635;
iframe.setAttribute('frameborder', '0');
node.appendChild(iframe);
}
},
];
},
});
manager.registerPlugin("panel-plugin-maps", {
version: 0.1,
makeAboveTablePanelConfigs: () => {
return [
{
// IDs only have to be unique within a plugin; the manager namespaces them for you
id: 'first-map-panel',
label: "Map plugin",
// datasette-vega, leaflet can provide a "render" function
render: node => node.innerHTML = "Here sits a map",
},
{
id: 'second-panel',
label: "Image plugin",
render: node => {
const img = document.createElement('img');
img.src = 'https://datasette.io/static/datasette-logo.svg'
node.appendChild(img);
},
}
];
},
});
// Future: dispatch message to some other part of the page with CustomEvent API
// Could use to drive filter/sort query builder actions without page refresh.
}
async function copyToClipboard(str) {
try {
await navigator.clipboard.writeText(str);
} catch (err) {
/** Rejected - text failed to copy to the clipboard because the browser didn't grant permission */
console.error('Failed to copy: ', err);
}
}

Binary file not shown (image, 208 B).

View file

@@ -4,3 +4,34 @@
{{ super() }}
<script defer data-domain="docs.datasette.io" src="https://plausible.io/js/plausible.js"></script>
{% endblock %}
{% block scripts %}
{{ super() }}
<script>
document.addEventListener("DOMContentLoaded", function() {
// Show banner linking to /stable/ if this is a /latest/ page
if (!/\/latest\//.test(location.pathname)) {
return;
}
var stableUrl = location.pathname.replace("/latest/", "/stable/");
// Check it's not a 404
fetch(stableUrl, { method: "HEAD" }).then((response) => {
if (response.status === 200) {
var warning = document.createElement("div");
warning.className = "admonition warning";
warning.innerHTML = `
<p class="first admonition-title">Note</p>
<p class="last">
This documentation covers the <strong>development version</strong> of Datasette.
</p>
<p>
See <a href="${stableUrl}">this page</a> for the current stable release.
</p>
`;
var mainArticle = document.querySelector("article[role=main]");
mainArticle.insertBefore(warning, mainArticle.firstChild);
}
});
});
</script>
{% endblock %}

Diff is too large to display.

View file

@@ -4,6 +4,400 @@
Changelog
=========
.. _v1_0_a13:
1.0a13 (2024-03-12)
-------------------
Each of the key concepts in Datasette now has an :ref:`actions menu <plugin_actions>`, which plugins can use to add additional functionality targeting that entity.
- Plugin hook: :ref:`view_actions() <plugin_hook_view_actions>` for actions that can be applied to a SQL view. (:issue:`2297`)
- Plugin hook: :ref:`homepage_actions() <plugin_hook_homepage_actions>` for actions that apply to the instance homepage. (:issue:`2298`)
- Plugin hook: :ref:`row_actions() <plugin_hook_row_actions>` for actions that apply to the row page. (:issue:`2299`)
- Action menu items for all of the ``*_actions()`` plugin hooks can now return an optional ``"description"`` key, which will be displayed in the menu below the action label (see the sketch after this list). (:issue:`2294`)
- :ref:`Plugin hooks <plugin_hooks>` documentation page is now organized with additional headings. (:issue:`2300`)
- Improved the display of action buttons on pages that also display metadata. (:issue:`2286`)
- The header and footer of the page now uses a subtle gradient effect, and options in the navigation menu are better visually defined. (:issue:`2302`)
- Table names that start with an underscore now default to hidden. (:issue:`2104`)
- ``pragma_table_list`` has been added to the allow-list of SQLite pragma functions supported by Datasette. ``select * from pragma_table_list()`` is no longer blocked. (`#2104 <https://github.com/simonw/datasette/issues/2104#issuecomment-1982352475>`__)
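An illustrative sketch of one of these hooks returning an action with the new optional ``"description"`` key (the URL and wording are invented; the hook arguments follow the hook documentation):

.. code-block:: python

    from datasette import hookimpl

    @hookimpl
    def row_actions(datasette, database, table, actor, row):
        if actor:
            return [
                {
                    "href": datasette.urls.path("/-/edit-row"),
                    "label": "Edit this row",
                    "description": "Open this row in an imaginary editing interface",
                }
            ]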
.. _v1_0_a12:
1.0a12 (2024-02-29)
-------------------
- New :ref:`query_actions() <plugin_hook_query_actions>` plugin hook, similar to :ref:`table_actions() <plugin_hook_table_actions>` and :ref:`database_actions() <plugin_hook_database_actions>`. Can be used to add a menu of actions to the canned query or arbitrary SQL query page. (:issue:`2283`)
- New design for the button that opens the query, table and database actions menu. (:issue:`2281`)
- "does not contain" table filter for finding rows that do not contain a string. (:issue:`2287`)
- Fixed a bug in the :ref:`javascript_plugins_makeColumnActions` JavaScript plugin mechanism where the column action menu was not fully reset in between each interaction. (:issue:`2289`)
.. _v1_0_a11:
1.0a11 (2024-02-19)
-------------------
- The ``"replace": true`` argument to the ``/db/table/-/insert`` API now requires the actor to have the ``update-row`` permission. (:issue:`2279`)
- Fixed some UI bugs in the interactive permissions debugging tool. (:issue:`2278`)
- The column action menu now aligns better with the cog icon, and positions itself taking into account the width of the browser window. (:issue:`2263`)
.. _v1_0_a10:
1.0a10 (2024-02-17)
-------------------
The only changes in this alpha correspond to the way Datasette handles database transactions. (:issue:`2277`)
- The :ref:`database.execute_write_fn() <database_execute_write_fn>` method has a new ``transaction=True`` parameter. This defaults to ``True``, which means all functions executed using this method are now automatically wrapped in a transaction - previously the functions needed to handle transactions on their own, and many did not.
- Pass ``transaction=False`` to ``execute_write_fn()`` if you want to manually handle transactions in your function, as sketched below.
- Several internal Datasette features, including parts of the :ref:`JSON write API <json_api_write>`, had been failing to wrap their operations in a transaction. This has been fixed by the new ``transaction=True`` default.
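A minimal sketch of the new default, assuming ``db`` came from ``datasette.get_database()`` and a ``logs`` table exists:

.. code-block:: python

    def write_rows(conn):
        # Both inserts commit together - or roll back together on error -
        # because transaction=True is now the default
        conn.execute("insert into logs (line) values ('one')")
        conn.execute("insert into logs (line) values ('two')")

    await db.execute_write_fn(write_rows)
    # await db.execute_write_fn(write_rows, transaction=False)  # manual handling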
.. _v1_0_a9:
1.0a9 (2024-02-16)
------------------
This alpha release adds basic alter table support to the Datasette Write API and fixes a permissions bug relating to the ``/upsert`` API endpoint.
Alter table support for create, insert, upsert and update
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The :ref:`JSON write API <json_api_write>` can now be used to apply simple alter table schema changes, provided the acting actor has the new :ref:`permissions_alter_table` permission. (:issue:`2101`)
The only alter operation supported so far is adding new columns to an existing table.
* The :ref:`/db/-/create <TableCreateView>` API now adds new columns during large operations to create a table based on incoming example ``"rows"``, in the case where one of the later rows includes columns that were not present in the earlier batches. This requires the ``create-table`` but not the ``alter-table`` permission.
* When ``/db/-/create`` is called with rows in a situation where the table may have been already created, an ``"alter": true`` key can be included to indicate that any missing columns from the new rows should be added to the table. This requires the ``alter-table`` permission.
* :ref:`/db/table/-/insert <TableInsertView>` and :ref:`/db/table/-/upsert <TableUpsertView>` and :ref:`/db/table/row-pks/-/update <RowUpdateView>` all now also accept ``"alter": true``, depending on the ``alter-table`` permission (see the sketch below).
Operations that alter a table now fire the new :ref:`alter-table event <events>`.
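A minimal sketch of an insert that adds a missing column, assuming the actor behind the token holds both ``insert-row`` and ``alter-table`` (the names and token are invented placeholders):

.. code-block:: python

    response = await datasette.client.post(
        "/mydb/mytable/-/insert",
        json={
            "rows": [{"id": 1, "name": "Cleo", "nickname": "Cleopaws"}],
            "alter": True,  # add the nickname column if it is missing
        },
        headers={"Authorization": "Bearer dstok_..."},  # placeholder token
    )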
Permissions fix for the upsert API
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The :ref:`/database/table/-/upsert API <TableUpsertView>` had a minor permissions bug, only affecting Datasette instances that had configured the ``insert-row`` and ``update-row`` permissions to apply to a specific table rather than the database or instance as a whole. Full details in issue :issue:`2262`.
To avoid similar mistakes in the future the :ref:`datasette.permission_allowed() <datasette_permission_allowed>` method now specifies ``default=`` as a keyword-only argument.
Permission checks now consider opinions from every plugin
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The :ref:`datasette.permission_allowed() <datasette_permission_allowed>` method previously consulted every plugin that implemented the :ref:`permission_allowed() <plugin_hook_permission_allowed>` plugin hook and obeyed the opinion of the last plugin to return a value. (:issue:`2275`)
Datasette now consults every plugin and checks to see if any of them returned ``False`` (the veto rule), and if none of them did, it then checks to see if any of them returned ``True``.
This is explained at length in the new documentation covering :ref:`authentication_permissions_explained`.
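A minimal sketch of the veto rule from a plugin's point of view:

.. code-block:: python

    from datasette import hookimpl

    @hookimpl
    def permission_allowed(actor, action):
        # Returning False vetoes the permission regardless of what other
        # plugins say; returning None expresses no opinion
        if action == "execute-sql" and (actor is None or actor.get("id") != "root"):
            return False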
Other changes
~~~~~~~~~~~~~
- The new :ref:`DATASETTE_TRACE_PLUGINS=1 environment variable <writing_plugins_tracing>` turns on detailed trace output for every executed plugin hook, useful for debugging and understanding how the plugin system works at a low level. (:issue:`2274`)
- Datasette on Python 3.9 or above marks its non-cryptographic uses of the MD5 hash function as ``usedforsecurity=False``, for compatibility with FIPS systems. (:issue:`2270`)
- SQL relating to :ref:`internals_internal` now executes inside a transaction, avoiding a potential database locked error. (:issue:`2273`)
- The ``/-/threads`` debug page now identifies the database in the name associated with each dedicated write thread. (:issue:`2265`)
- The ``/db/-/create`` API now fires a ``insert-rows`` event if rows were inserted after the table was created. (:issue:`2260`)
.. _v1_0_a8:
1.0a8 (2024-02-07)
------------------
This alpha release continues the migration of Datasette's configuration from ``metadata.yaml`` to the new ``datasette.yaml`` configuration file, introduces a new system for JavaScript plugins and adds several new plugin hooks.
See `Datasette 1.0a8: JavaScript plugins, new plugin hooks and plugin configuration in datasette.yaml <https://simonwillison.net/2024/Feb/7/datasette-1a8/>`__ for an annotated version of these release notes.
Configuration
~~~~~~~~~~~~~
- Plugin configuration now lives in the :ref:`datasette.yaml configuration file <configuration>`, passed to Datasette using the ``-c/--config`` option. Thanks, Alex Garcia. (:issue:`2093`)
.. code-block:: bash
datasette -c datasette.yaml
Where ``datasette.yaml`` contains configuration that looks like this:
.. code-block:: yaml
plugins:
datasette-cluster-map:
latitude_column: xlat
longitude_column: xlon
Previously plugins were configured in ``metadata.yaml``, which was confusing as plugin settings were unrelated to database and table metadata.
- The ``-s/--setting`` option can now be used to set plugin configuration as well. See :ref:`configuration_cli` for details. (:issue:`2252`)
The above YAML configuration example using ``-s/--setting`` looks like this:
.. code-block:: bash
datasette mydatabase.db \
-s plugins.datasette-cluster-map.latitude_column xlat \
-s plugins.datasette-cluster-map.longitude_column xlon
- The new ``/-/config`` page shows the current instance configuration, after redacting keys that could contain sensitive data such as API keys or passwords. (:issue:`2254`)
- Existing Datasette installations may already have configuration set in ``metadata.yaml`` that should be migrated to ``datasette.yaml``. To avoid breaking these installations, Datasette will silently treat table configuration, plugin configuration and allow blocks in metadata as if they had been specified in configuration instead. (:issue:`2247`) (:issue:`2248`) (:issue:`2249`)
Note that the ``datasette publish`` command has not yet been updated to accept a ``datasette.yaml`` configuration file. This will be addressed in :issue:`2195` but for the moment you can include those settings in ``metadata.yaml`` instead.
JavaScript plugins
~~~~~~~~~~~~~~~~~~
Datasette now includes a :ref:`JavaScript plugins mechanism <javascript_plugins>`, allowing JavaScript to customize Datasette in a way that can collaborate with other plugins.
This provides two initial hooks, with more to come in the future:
- :ref:`makeAboveTablePanelConfigs() <javascript_plugins_makeAboveTablePanelConfigs>` can add additional panels to the top of the table page.
- :ref:`makeColumnActions() <javascript_plugins_makeColumnActions>` can add additional actions to the column menu.
Thanks `Cameron Yick <https://github.com/hydrosquall>`__ for contributing this feature. (`#2052 <https://github.com/simonw/datasette/pull/2052>`__)
Plugin hooks
~~~~~~~~~~~~
- New :ref:`plugin_hook_jinja2_environment_from_request` plugin hook, which can be used to customize the current Jinja environment based on the incoming request. This can be used to modify the template lookup path based on the incoming request hostname, among other things. (:issue:`2225`)
- New :ref:`family of template slot plugin hooks <plugin_hook_slots>`: ``top_homepage``, ``top_database``, ``top_table``, ``top_row``, ``top_query``, ``top_canned_query``. Plugins can use these to provide additional HTML to be injected at the top of the corresponding pages. (:issue:`1191`)
- New :ref:`track_event() mechanism <plugin_event_tracking>` for plugins to emit and receive events when certain things occur within Datasette (see the sketch after this list). (:issue:`2240`)
- Plugins can register additional event classes using :ref:`plugin_hook_register_events`.
- They can then trigger those events with the :ref:`datasette.track_event(event) <datasette_track_event>` internal method.
- Plugins can subscribe to notifications of events using the :ref:`plugin_hook_track_event` plugin hook.
- Datasette core now emits ``login``, ``logout``, ``create-token``, ``create-table``, ``drop-table``, ``insert-rows``, ``upsert-rows``, ``update-row``, ``delete-row`` events, :ref:`documented here <events>`.
- New internal function for plugin authors: :ref:`database_execute_isolated_fn`, for creating a new SQLite connection, executing code and then closing that connection, all while preventing other code from writing to that particular database. This connection will not have the :ref:`prepare_connection() <plugin_hook_prepare_connection>` plugin hook executed against it, allowing plugins to perform actions that might otherwise be blocked by existing connection configuration. (:issue:`2218`)
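A minimal sketch combining the three pieces of the events mechanism described above:

.. code-block:: python

    from dataclasses import dataclass
    from datasette import hookimpl
    from datasette.events import Event

    @dataclass
    class BanUserEvent(Event):
        name = "ban-user"
        user: dict

    @hookimpl
    def register_events():
        return [BanUserEvent]

    @hookimpl
    def track_event(event):
        print(event.name, event.properties())

    # Elsewhere: await datasette.track_event(BanUserEvent(actor=actor, user=user))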
Documentation
~~~~~~~~~~~~~
- Documentation describing :ref:`how to write tests that use signed actor cookies <testing_datasette_client>` using ``datasette.client.actor_cookie()``. (:issue:`1830`)
- Documentation on how to :ref:`register a plugin for the duration of a test <testing_plugins_register_in_test>`. (:issue:`2234`)
- The :ref:`configuration documentation <configuration>` now shows examples of both YAML and JSON for each setting.
Minor fixes
~~~~~~~~~~~
- Datasette no longer attempts to run SQL queries in parallel when rendering a table page, as this was leading to some rare crashing bugs. (:issue:`2189`)
- Fixed warning: ``DeprecationWarning: pkg_resources is deprecated as an API`` (:issue:`2057`)
- Fixed bug where ``?_extra=columns`` parameter returned an incorrectly shaped response. (:issue:`2230`)
.. _v0_64_6:
0.64.6 (2023-12-22)
-------------------
- Fixed a bug where CSV export with expanded labels could fail if a foreign key reference did not correctly resolve. (:issue:`2214`)
.. _v0_64_5:
0.64.5 (2023-10-08)
-------------------
- Dropped dependency on ``click-default-group-wheel``, which could cause a dependency conflict. (:issue:`2197`)
.. _v1_0_a7:
1.0a7 (2023-09-21)
------------------
- Fix for a crashing bug caused by viewing the table page for a named in-memory database. (:issue:`2189`)
.. _v0_64_4:
0.64.4 (2023-09-21)
-------------------
- Fix for a crashing bug caused by viewing the table page for a named in-memory database. (:issue:`2189`)
.. _v1_0_a6:
1.0a6 (2023-09-07)
------------------
- New plugin hook: :ref:`plugin_hook_actors_from_ids` and an internal method to accompany it, :ref:`datasette_actors_from_ids`. This mechanism is intended to be used by plugins that may need to display the actor who was responsible for something managed by that plugin: they can now resolve the recorded IDs of actors into the full actor objects, as sketched after this list. (:issue:`2181`)
- ``DATASETTE_LOAD_PLUGINS`` environment variable for :ref:`controlling which plugins <plugins_datasette_load_plugins>` are loaded by Datasette. (:issue:`2164`)
- Datasette now checks if the user has permission to view a table linked to by a foreign key before turning that foreign key into a clickable link. (:issue:`2178`)
- The ``execute-sql`` permission now implies that the actor can also view the database and instance. (:issue:`2169`)
- Documentation describing a pattern for building plugins that themselves :ref:`define further hooks <writing_plugins_extra_hooks>` for other plugins. (:issue:`1765`)
- Datasette is now tested against the Python 3.12 preview. (`#2175 <https://github.com/simonw/datasette/pull/2175>`__)
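A minimal sketch of the new actor resolution method (the IDs are invented; without a registering plugin the default implementation returns simple ``{"id": ...}`` actors):

.. code-block:: python

    actors = await datasette.actors_from_ids(["root", "sam"])
    # e.g. {"root": {"id": "root"}, "sam": {"id": "sam"}}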
.. _v1_0_a5:
1.0a5 (2023-08-29)
------------------
- When restrictions are applied to :ref:`API tokens <CreateTokenView>`, those restrictions now behave slightly differently: applying the ``view-table`` restriction will imply the ability to ``view-database`` for the database containing that table, and both ``view-table`` and ``view-database`` will imply ``view-instance``. Previously you needed to create a token with restrictions that explicitly listed ``view-instance`` and ``view-database`` and ``view-table`` in order to view a table without getting a permission denied error. (:issue:`2102`)
- New ``datasette.yaml`` (or ``.json``) configuration file, which can be specified using ``datasette -c path-to-file``. The goal here is to consolidate settings, plugin configuration, permissions, canned queries, and other Datasette configuration into a single file, separate from ``metadata.yaml``. The legacy ``settings.json`` config file used for :ref:`config_dir` has been removed, and ``datasette.yaml`` has a ``"settings"`` section where the same settings key/value pairs can be included. In future alpha releases, more configuration such as plugins/permissions/canned queries will be moved to the ``datasette.yaml`` file. See :issue:`2093` for more details. Thanks, Alex Garcia.
- The ``-s/--setting`` option can now take dotted paths to nested settings. These will then be used to set or over-ride the same options as are present in the new configuration file. (:issue:`2156`)
- New ``--actor '{"id": "json-goes-here"}'`` option for use with ``datasette --get`` to treat the simulated request as being made by a specific actor, see :ref:`cli_datasette_get`. (:issue:`2153`)
- The Datasette ``_internal`` database has had some changes. It no longer shows up in the ``datasette.databases`` list by default, and is now instead available to plugins via the ``datasette.get_internal_database()`` method (see the sketch below). Plugins are invited to use this as a private database to store configuration, settings and secrets that should not be made visible through the default Datasette interface. Users can pass the new ``--internal internal.db`` option to persist that internal database to disk. Thanks, Alex Garcia. (:issue:`2157`).
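A minimal sketch of a plugin using the internal database as private storage (the table name is invented):

.. code-block:: python

    internal_db = datasette.get_internal_database()
    await internal_db.execute_write(
        "create table if not exists my_plugin_state (key text primary key, value text)"
    )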
.. _v1_0_a4:
1.0a4 (2023-08-21)
------------------
This alpha fixes a security issue with the ``/-/api`` API explorer. On authenticated Datasette instances (instances protected using plugins such as `datasette-auth-passwords <https://datasette.io/plugins/datasette-auth-passwords>`__) the API explorer interface could reveal the names of databases and tables within the protected instance. The data stored in those tables was not revealed.
For more information and workarounds, read `the security advisory <https://github.com/simonw/datasette/security/advisories/GHSA-7ch3-7pp7-7cpq>`__. The issue has been present in every previous alpha version of Datasette 1.0: versions 1.0a0, 1.0a1, 1.0a2 and 1.0a3.
Also in this alpha:
- The new ``datasette plugins --requirements`` option outputs a list of currently installed plugins in Python ``requirements.txt`` format, useful for duplicating that installation elsewhere. (:issue:`2133`)
- :ref:`canned_queries_writable` can now define a ``on_success_message_sql`` field in their configuration, containing a SQL query that should be executed upon successful completion of the write operation in order to generate a message to be shown to the user. (:issue:`2138`)
- The automatically generated border color for a database is now shown in more places around the application. (:issue:`2119`)
- Every instance of example shell script code in the documentation should now include a working copy button, free from additional syntax. (:issue:`2140`)
.. _v1_0_a3:
1.0a3 (2023-08-09)
------------------
This alpha release previews the updated design for Datasette's default JSON API. (:issue:`782`)
The new :ref:`default JSON representation <json_api_default>` for both table pages (``/dbname/table.json``) and arbitrary SQL queries (``/dbname.json?sql=...``) is now shaped like this:
.. code-block:: json
{
"ok": true,
"rows": [
{
"id": 3,
"name": "Detroit"
},
{
"id": 2,
"name": "Los Angeles"
},
{
"id": 4,
"name": "Memnonia"
},
{
"id": 1,
"name": "San Francisco"
}
],
"truncated": false
}
Tables will include an additional ``"next"`` key for pagination, which can be passed to ``?_next=`` to fetch the next page of results.
The various ``?_shape=`` options continue to work as before - see :ref:`json_api_shapes` for details.
A new ``?_extra=`` mechanism is available for tables, but has not yet been stabilized or documented. Details on that are available in :issue:`262`.
Smaller changes
~~~~~~~~~~~~~~~
- Datasette documentation now shows YAML examples for :ref:`metadata` by default, with a tab interface for switching to JSON. (:issue:`1153`)
- :ref:`plugin_register_output_renderer` plugins now have access to ``error`` and ``truncated`` arguments, allowing them to display error messages and take into account truncated results. (:issue:`2130`)
- ``render_cell()`` plugin hook now also supports an optional ``request`` argument. (:issue:`2007`)
- New ``Justfile`` to support development workflows for Datasette using `Just <https://github.com/casey/just>`__.
- ``datasette.render_template()`` can now accept a ``datasette.views.Context`` subclass as an alternative to a dictionary. (:issue:`2127`)
- ``datasette install -e path`` option for editable installations, useful while developing plugins. (:issue:`2106`)
- When started with the ``--cors`` option Datasette now serves an ``Access-Control-Max-Age: 3600`` header, ensuring CORS OPTIONS requests are repeated no more than once an hour. (:issue:`2079`)
- Fixed a bug where the ``_internal`` database could display ``None`` instead of ``null`` for in-memory databases. (:issue:`1970`)
.. _v0_64_2:
0.64.2 (2023-03-08)
-------------------
- Fixed a bug with ``datasette publish cloudrun`` where deploys all used the same Docker image tag. This was mostly inconsequential as the service is deployed as soon as the image has been pushed to the registry, but could result in the incorrect image being deployed if two different deploys for two separate services ran at exactly the same time. (:issue:`2036`)
.. _v0_64_1:
0.64.1 (2023-01-11)
-------------------
- Documentation now links to a current source of information for installing Python 3. (:issue:`1987`)
- Incorrectly calling the Datasette constructor using ``Datasette("path/to/data.db")`` instead of ``Datasette(["path/to/data.db"])`` now returns a useful error message. (:issue:`1985`)
.. _v0_64:
0.64 (2023-01-09)
-----------------
- Datasette now **strongly recommends against allowing arbitrary SQL queries if you are using SpatiaLite**. SpatiaLite includes SQL functions that could cause the Datasette server to crash. See :ref:`spatialite` for more details.
- New :ref:`setting_default_allow_sql` setting, providing an easier way to disable all arbitrary SQL execution by end users: ``datasette --setting default_allow_sql off``. See also :ref:`authentication_permissions_execute_sql`. (:issue:`1409`)
- `Building a location to time zone API with SpatiaLite <https://datasette.io/tutorials/spatialite>`__ is a new Datasette tutorial showing how to safely use SpatiaLite to create a location to time zone API.
- New documentation about :ref:`how to debug problems loading SQLite extensions <installation_extensions>`. The error message shown when an extension cannot be loaded has also been improved. (:issue:`1979`)
- Fixed an accessibility issue: the ``<select>`` elements in the table filter form now show an outline when they are currently focused. (:issue:`1771`)
.. _v0_63_3:
0.63.3 (2022-12-17)
-------------------
- Fixed a bug where ``datasette --root``, when running in Docker, would only output the URL to sign in root when the server shut down, not when it started up. (:issue:`1958`)
- You no longer need to ensure ``await datasette.invoke_startup()`` has been called in order for Datasette to start correctly serving requests - this is now handled automatically the first time the server receives a request. This fixes a bug experienced when Datasette is served directly by an ASGI application server such as Uvicorn or Gunicorn. It also fixes a bug with the `datasette-gunicorn <https://datasette.io/plugins/datasette-gunicorn>`__ plugin. (:issue:`1955`)
.. _v1_0_a2:
1.0a2 (2022-12-14)
------------------
The third Datasette 1.0 alpha release adds upsert support to the JSON API, plus the ability to specify finely grained permissions when creating an API token.
See `Datasette 1.0a2: Upserts and finely grained permissions <https://simonwillison.net/2022/Dec/15/datasette-1a2/>`__ for an extended, annotated version of these release notes.
- New ``/db/table/-/upsert`` API, :ref:`documented here <TableUpsertView>`. An upsert is an update-or-insert: existing rows will have specified keys updated, but if no row matches the incoming primary key a brand new row will be inserted instead (see the sketch after this list). (:issue:`1878`)
- New :ref:`plugin_register_permissions` plugin hook. Plugins can now register named permissions, which will then be listed in various interfaces that show available permissions. (:issue:`1940`)
- The ``/db/-/create`` API for :ref:`creating a table <TableCreateView>` now accepts ``"ignore": true`` and ``"replace": true`` options when called with the ``"rows"`` property that creates a new table based on an example set of rows. This means the API can be called multiple times with different rows, setting rules for what should happen if a primary key collides with an existing row. (:issue:`1927`)
- Arbitrary permissions can now be configured at the instance, database and resource (table, SQL view or canned query) level in Datasette's :ref:`metadata` JSON and YAML files. The new ``"permissions"`` key can be used to specify which actors should have which permissions. See :ref:`authentication_permissions_other` for details. (:issue:`1636`)
- The ``/-/create-token`` page can now be used to create API tokens which are restricted to just a subset of actions, including against specific databases or resources. See :ref:`CreateTokenView` for details. (:issue:`1947`)
- Likewise, the ``datasette create-token`` CLI command can now create tokens with :ref:`a subset of permissions <authentication_cli_create_token_restrict>`. (:issue:`1855`)
- New :ref:`datasette.create_token() API method <datasette_create_token>` for programmatically creating signed API tokens. (:issue:`1951`)
- ``/db/-/create`` API now requires actor to have ``insert-row`` permission in order to use the ``"row"`` or ``"rows"`` properties. (:issue:`1937`)
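A minimal sketch of the new upsert API (the database, table and token are invented placeholders):

.. code-block:: python

    response = await datasette.client.post(
        "/mydb/mytable/-/upsert",
        json={"rows": [{"id": 1, "title": "Updated title"}]},
        headers={"Authorization": "Bearer dstok_..."},  # placeholder token
    )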
.. _v1_0_a1:
1.0a1 (2022-12-01)
------------------
- Write APIs now serve correct CORS headers if Datasette is started in ``--cors`` mode. See the full list of :ref:`CORS headers <json_api>` in the documentation. (:issue:`1922`)
- Fixed a bug where the ``_memory`` database could be written to even though writes were not persisted. (:issue:`1917`)
- The https://latest.datasette.io/ demo instance now includes an ``ephemeral`` database which can be used to test Datasette's write APIs, using the new `datasette-ephemeral-tables <https://datasette.io/plugins/datasette-ephemeral-tables>`_ plugin to drop any created tables after five minutes. This database is only available if you sign in as the root user using the link on the homepage. (:issue:`1915`)
- Fixed a bug where hitting the write endpoints with a ``GET`` request returned a 500 error. It now returns a 405 (method not allowed) error instead. (:issue:`1916`)
- The list of endpoints in the API explorer now lists mutable databases first. (:issue:`1918`)
- The ``"ignore": true`` and ``"replace": true`` options for the insert API are :ref:`now documented <TableInsertView>`. (:issue:`1924`)
.. _v1_0_a0:
1.0a0 (2022-11-29)
------------------
This first alpha release of Datasette 1.0 introduces a brand new collection of APIs for writing to the database (:issue:`1850`), as well as a new API token mechanism baked into Datasette core. Previously, API tokens have only been supported by installing additional plugins.
This is very much a preview: expect many more backwards incompatible API changes prior to the full 1.0 release.
Feedback enthusiastically welcomed, either through `issue comments <https://github.com/simonw/datasette/issues/1850>`__ or via the `Datasette Discord <https://datasette.io/discord>`__ community.
Signed API tokens
~~~~~~~~~~~~~~~~~
- New ``/-/create-token`` page allowing authenticated users to create signed API tokens that can act on their behalf, see :ref:`CreateTokenView`. (:issue:`1852`)
- New ``datasette create-token`` command for creating tokens from the command line: :ref:`authentication_cli_create_token`.
- New :ref:`setting_allow_signed_tokens` setting which can be used to turn off signed token support. (:issue:`1856`)
- New :ref:`setting_max_signed_tokens_ttl` setting for restricting the maximum allowed duration of a signed token. (:issue:`1858`)
Write API
~~~~~~~~~
- New API explorer at ``/-/api`` for trying out the API. (:issue:`1871`)
- ``/db/-/create`` API for :ref:`TableCreateView`. (:issue:`1882`)
- ``/db/table/-/insert`` API for :ref:`TableInsertView`. (:issue:`1851`)
- ``/db/table/-/drop`` API for :ref:`TableDropView`. (:issue:`1874`)
- ``/db/table/pk/-/update`` API for :ref:`RowUpdateView`. (:issue:`1863`)
- ``/db/table/pk/-/delete`` API for :ref:`RowDeleteView`. (:issue:`1864`)
.. _v0_63_2:
0.63.2 (2022-11-18)
-------------------
- Fixed a bug in ``datasette publish heroku`` where deployments failed due to an older version of Python being requested. (:issue:`1905`)
- New ``datasette publish heroku --generate-dir <dir>`` option for generating a Heroku deployment directory without deploying it.
.. _v0_63_1:
0.63.1 (2022-11-10)
@ -60,7 +454,7 @@ Documentation
Datasette can now run entirely in your browser using WebAssembly. Try out `Datasette Lite <https://lite.datasette.io/>`__, take a look `at the code <https://github.com/simonw/datasette-lite>`__ or read more about it in `Datasette Lite: a server-side Python web application running in a browser <https://simonwillison.net/2022/May/4/datasette-lite/>`__.
Datasette now has a `Discord community <https://discord.gg/ktd74dm5mw>`__ for questions and discussions about Datasette and its ecosystem of projects.
Datasette now has a `Discord community <https://datasette.io/discord>`__ for questions and discussions about Datasette and its ecosystem of projects.
Features
~~~~~~~~
@ -390,7 +784,7 @@ JavaScript modules
To use modules, JavaScript needs to be included in ``<script>`` tags with a ``type="module"`` attribute.
Datasette now has the ability to output ``<script type="module">`` in places where you may wish to take advantage of modules. The ``extra_js_urls`` option described in :ref:`customization_css_and_javascript` can now be used with modules, and module support is also available for the :ref:`extra_body_script() <plugin_hook_extra_body_script>` plugin hook. (:issue:`1186`, :issue:`1187`)
Datasette now has the ability to output ``<script type="module">`` in places where you may wish to take advantage of modules. The ``extra_js_urls`` option described in :ref:`configuration_reference_css_js` can now be used with modules, and module support is also available for the :ref:`extra_body_script() <plugin_hook_extra_body_script>` plugin hook. (:issue:`1186`, :issue:`1187`)
`datasette-leaflet-freedraw <https://datasette.io/plugins/datasette-leaflet-freedraw>`__ is the first example of a Datasette plugin that takes advantage of the new support for JavaScript modules. See `Drawing shapes on a map to query a SpatiaLite database <https://simonwillison.net/2021/Jan/24/drawing-shapes-spatialite/>`__ for more on this plugin.
@ -771,7 +1165,10 @@ Prior to this release the Datasette ecosystem has treated authentication as excl
You'll need to install plugins if you want full user accounts, but default Datasette can now authenticate a single root user with the new ``--root`` command-line option, which outputs a one-time use URL to :ref:`authenticate as a root actor <authentication_root>` (:issue:`784`)::
$ datasette fixtures.db --root
datasette fixtures.db --root
::
http://127.0.0.1:8001/-/auth-token?token=5b632f8cd44b868df625f5a6e2185d88eea5b22237fd3cc8773f107cc4fd6477
INFO: Started server process [14973]
INFO: Waiting for application startup.
@ -942,7 +1339,7 @@ You can now create :ref:`custom pages <custom_pages>` within your Datasette inst
:ref:`config_dir` (:issue:`731`) allows you to define a custom Datasette instance as a directory. So instead of running the following::
$ datasette one.db two.db \
datasette one.db two.db \
--metadata=metadata.json \
--template-dir=templates/ \
--plugins-dir=plugins \
@ -950,7 +1347,7 @@ You can now create :ref:`custom pages <custom_pages>` within your Datasette inst
You can instead arrange your files in a single directory called ``my-project`` and run this::
$ datasette my-project/
datasette my-project/
Also in this release:
@ -967,7 +1364,7 @@ Also in this release:
0.40 (2020-04-21)
-----------------
* Datasette :ref:`metadata` can now be provided as a YAML file as an optional alternative to JSON. See :ref:`metadata_yaml`. (:issue:`713`)
* Datasette :ref:`metadata` can now be provided as a YAML file as an optional alternative to JSON. (:issue:`713`)
* Removed support for ``datasette publish now``, which used the now-retired Zeit Now v1 hosting platform. A new plugin, `datasette-publish-now <https://github.com/simonw/datasette-publish-now>`__, can be installed to publish data to Zeit (`now Vercel <https://vercel.com/blog/zeit-is-now-vercel>`__) Now v2. (:issue:`710`)
* Fixed a bug where the ``extra_template_vars(request, view_name)`` plugin hook was not receiving the correct ``view_name``. (:issue:`716`)
* Variables added to the template context by the ``extra_template_vars()`` plugin hook are now shown in the ``?_context=1`` debugging mode (see :ref:`setting_template_debug`). (:issue:`693`)
@ -1622,7 +2019,10 @@ In addition to the work on facets:
Added new help section::
$ datasette --help-config
datasette --help-config
::
Config options:
default_page_size Default page size for the table view
(default=100)

View file

@ -47,13 +47,14 @@ Running ``datasette --help`` shows a list of all of the available commands.
--help Show this message and exit.
Commands:
serve* Serve up specified SQLite database files with a web UI
inspect Generate JSON summary of provided database files
install Install plugins and packages from PyPI into the same...
package Package SQLite files into a Datasette Docker container
plugins List currently installed plugins
publish Publish specified SQLite database files to the internet along...
uninstall Uninstall plugins and Python packages from the Datasette...
serve* Serve up specified SQLite database files with a web UI
create-token Create a signed API token for the specified actor ID
inspect Generate JSON summary of provided database files
install Install plugins and packages from PyPI into the same...
package Package SQLite files into a Datasette Docker container
plugins List currently installed plugins
publish Publish specified SQLite database files to the internet...
uninstall Uninstall plugins and Python packages from the Datasette...
.. [[[end]]]
@ -111,16 +112,17 @@ Once started you can access it at ``http://localhost:8001``
--static MOUNT:DIRECTORY Serve static files from this directory at
/MOUNT/...
--memory Make /_memory database available
--config CONFIG Deprecated: set config option using
configname:value. Use --setting instead.
--setting SETTING... Setting, see
docs.datasette.io/en/stable/settings.html
-c, --config FILENAME Path to JSON/YAML Datasette configuration file
-s, --setting SETTING... nested.key, value setting to use in Datasette
configuration
--secret TEXT Secret used for signing secure values, such as
signed cookies
--root Output URL that sets a cookie authenticating
the root user
--get TEXT Run an HTTP GET request against this path,
print results and exit
--token TEXT API token to send with --get requests
--actor TEXT Actor to use for --get requests (JSON string)
--version-note TEXT Additional note to show on /-/versions
--help-settings Show available settings
--pdb Launch debugger on any errors
@ -132,6 +134,8 @@ Once started you can access it at ``http://localhost:8001``
mode
--ssl-keyfile TEXT SSL key file
--ssl-certfile TEXT SSL certificate file
--internal PATH Path to a persistent Datasette internal SQLite
database
--help Show this message and exit.
@ -147,9 +151,14 @@ The ``--get`` option to ``datasette serve`` (or just ``datasette``) specifies th
This means that all of Datasette's functionality can be accessed directly from the command-line.
For example::
For example:
.. code-block:: bash
datasette --get '/-/versions.json' | jq .
.. code-block:: json
$ datasette --get '/-/versions.json' | jq .
{
"python": {
"version": "3.8.5",
@ -188,7 +197,15 @@ For example::
}
}
The exit code will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error.
You can use the ``--token TOKEN`` option to send an :ref:`API token <CreateTokenView>` with the simulated request.
Or you can make a request as a specific actor by passing a JSON representation of that actor to ``--actor``:
.. code-block:: bash
datasette --memory --actor '{"id": "root"}' --get '/-/actor.json'
The exit code of ``datasette --get`` will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error.
This lets you use ``datasette --get /`` to run tests against a Datasette application in a continuous integration environment such as GitHub Actions.
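For instance, a minimal smoke test along these lines could run in a CI job. This is a sketch, assuming ``datasette`` is on the path and a ``fixtures.db`` database file exists:

.. code-block:: python

import subprocess

# Exit code 0 means the page returned HTTP 200; anything else fails the build
result = subprocess.run(
    ["datasette", "fixtures.db", "--get", "/fixtures.json"],
    capture_output=True,
    text=True,
)
assert result.returncode == 0, result.stderr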
@ -212,6 +229,8 @@ These can be passed to ``datasette serve`` using ``datasette serve --setting nam
(default=100)
max_returned_rows Maximum rows that can be returned from a table or
custom query (default=1000)
max_insert_rows Maximum rows that can be inserted at a time using
the bulk insert API (default=100)
num_sql_threads Number of threads in the thread pool for
executing SQLite queries (default=3)
sql_time_limit_ms Time limit for a SQL query in milliseconds
@ -226,6 +245,12 @@ These can be passed to ``datasette serve`` using ``datasette serve --setting nam
?_facet= parameter (default=True)
allow_download Allow users to download the original SQLite
database files (default=True)
allow_signed_tokens Allow users to create and use signed API tokens
(default=True)
default_allow_sql Allow anyone to run arbitrary SQL queries
(default=True)
max_signed_tokens_ttl Maximum allowed expiry time for signed API tokens
(default=0)
suggest_facets Calculate and display suggested facets
(default=True)
default_cache_ttl Default HTTP cache TTL (used in Cache-Control:
@ -270,6 +295,7 @@ Output JSON showing all currently installed plugins, their versions, whether the
Options:
--all Include built-in default plugins
--requirements Output requirements.txt of installed plugins
--plugins-dir DIRECTORY Path to directory containing custom plugins
--help Show this message and exit.
@ -333,13 +359,15 @@ Would install the `datasette-cluster-map <https://datasette.io/plugins/datasette
::
Usage: datasette install [OPTIONS] PACKAGES...
Usage: datasette install [OPTIONS] [PACKAGES]...
Install plugins and packages from PyPI into the same environment as Datasette
Options:
-U, --upgrade Upgrade packages to latest version
--help Show this message and exit.
-U, --upgrade Upgrade packages to latest version
-r, --requirement PATH Install from requirements file
-e, --editable TEXT Install a project in editable mode from this path
--help Show this message and exit.
.. [[[end]]]
@ -501,6 +529,8 @@ See :ref:`publish_heroku`.
-n, --name TEXT Application name to use when deploying
--tar TEXT --tar option to pass to Heroku, e.g.
--tar=/usr/local/bin/gtar
--generate-dir DIRECTORY Output generated application files and stop
without deploying
--help Show this message and exit.
@ -589,3 +619,61 @@ This performance optimization is used automatically by some of the ``datasette p
.. [[[end]]]
.. _cli_help_create_token___help:
datasette create-token
======================
Create a signed API token, see :ref:`authentication_cli_create_token`.
.. [[[cog
help(["create-token", "--help"])
.. ]]]
::
Usage: datasette create-token [OPTIONS] ID
Create a signed API token for the specified actor ID
Example:
datasette create-token root --secret mysecret
To allow only "view-database-download" for all databases:
datasette create-token root --secret mysecret \
--all view-database-download
To allow "create-table" against a specific database:
datasette create-token root --secret mysecret \
--database mydb create-table
To allow "insert-row" against a specific table:
datasette create-token root --secret mysecret \
--resource mydb mytable insert-row
Restricted actions can be specified multiple times using multiple --all,
--database, and --resource options.
Add --debug to see a decoded version of the token.
Options:
--secret TEXT Secret used for signing the API tokens
[required]
-e, --expires-after INTEGER Token should expire after this many seconds
-a, --all ACTION Restrict token to this action
-d, --database DB ACTION Restrict token to this action on this database
-r, --resource DB RESOURCE ACTION
Restrict token to this action on this database
resource (a table, SQL view or named query)
--debug Show decoded token
--plugins-dir DIRECTORY Path to directory containing custom plugins
--help Show this message and exit.
.. [[[end]]]

View file

@ -1 +1,5 @@
alls
fo
ro
te
ths

View file

@ -17,7 +17,8 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
@ -31,10 +32,18 @@
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = ["sphinx.ext.extlinks", "sphinx.ext.autodoc", "sphinx_copybutton"]
extensions = [
"sphinx.ext.extlinks",
"sphinx.ext.autodoc",
"sphinx_copybutton",
]
if not os.environ.get("DISABLE_SPHINX_INLINE_TABS"):
extensions += ["sphinx_inline_tabs"]
autodoc_member_order = "bysource"
extlinks = {
"issue": ("https://github.com/simonw/datasette/issues/%s", "#"),
"issue": ("https://github.com/simonw/datasette/issues/%s", "#%s"),
}
# Add any paths that contain templates here, relative to this directory.
@ -105,6 +114,7 @@ html_theme_options = {
html_static_path = ["_static"]
html_logo = "datasette-logo.svg"
html_favicon = "_static/datasette-favicon.png"
html_css_files = [
"css/custom.css",

View file

@ -0,0 +1,638 @@
.. _configuration:
Configuration
=============
Datasette offers several ways to configure your Datasette instances: server settings, plugin configuration, authentication, and more.
Most configuration can be handled using a ``datasette.yaml`` configuration file, passed to datasette using the ``-c/--config`` flag:
.. code-block:: bash
datasette mydatabase.db --config datasette.yaml
This file can also use JSON, as ``datasette.json``. YAML is recommended over JSON due to its support for comments and multi-line strings.
.. _configuration_cli:
Configuration via the command-line
----------------------------------
The recommended way to configure Datasette is using a ``datasette.yaml`` file passed to ``-c/--config``. You can also pass individual settings to Datasette using the ``-s/--setting`` option, which can be used multiple times:
.. code-block:: bash
datasette mydatabase.db \
--setting settings.default_page_size 50 \
--setting settings.sql_time_limit_ms 3500
This option takes dotted-notation for the first argument and a value for the second argument. This means you can use it to set any configuration value that would be valid in a ``datasette.yaml`` file.
It also works for plugin configuration, for example for `datasette-cluster-map <https://datasette.io/plugins/datasette-cluster-map>`_:
.. code-block:: bash
datasette mydatabase.db \
--setting plugins.datasette-cluster-map.latitude_column xlat \
--setting plugins.datasette-cluster-map.longitude_column xlon
If the value you provide is a valid JSON object or list it will be treated as nested data, allowing you to configure plugins that accept lists such as `datasette-proxy-url <https://datasette.io/plugins/datasette-proxy-url>`_:
.. code-block:: bash
datasette mydatabase.db \
-s plugins.datasette-proxy-url.paths '[{"path": "/proxy", "backend": "http://example.com/"}]'
This is equivalent to a ``datasette.yaml`` file containing the following:
.. [[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
"""
plugins:
datasette-proxy-url:
paths:
- path: /proxy
backend: http://example.com/
""").strip()
)
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
plugins:
datasette-proxy-url:
paths:
- path: /proxy
backend: http://example.com/
.. tab:: datasette.json
.. code-block:: json
{
"plugins": {
"datasette-proxy-url": {
"paths": [
{
"path": "/proxy",
"backend": "http://example.com/"
}
]
}
}
}
.. [[[end]]]
.. _configuration_reference:
``datasette.yaml`` reference
----------------------------
The following example shows some of the valid configuration options that can exist inside ``datasette.yaml``.
.. [[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
"""
# Datasette settings block
settings:
default_page_size: 50
sql_time_limit_ms: 3500
max_returned_rows: 2000
# top-level plugin configuration
plugins:
datasette-my-plugin:
key: valueA
# Database and table-level configuration
databases:
your_db_name:
# plugin configuration for the your_db_name database
plugins:
datasette-my-plugin:
key: valueA
tables:
your_table_name:
allow:
# Only the root user can access this table
id: root
# plugin configuration for the your_table_name table
# inside your_db_name database
plugins:
datasette-my-plugin:
key: valueB
""")
)
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
# Datasette settings block
settings:
default_page_size: 50
sql_time_limit_ms: 3500
max_returned_rows: 2000
# top-level plugin configuration
plugins:
datasette-my-plugin:
key: valueA
# Database and table-level configuration
databases:
your_db_name:
# plugin configuration for the your_db_name database
plugins:
datasette-my-plugin:
key: valueA
tables:
your_table_name:
allow:
# Only the root user can access this table
id: root
# plugin configuration for the your_table_name table
# inside your_db_name database
plugins:
datasette-my-plugin:
key: valueB
.. tab:: datasette.json
.. code-block:: json
{
"settings": {
"default_page_size": 50,
"sql_time_limit_ms": 3500,
"max_returned_rows": 2000
},
"plugins": {
"datasette-my-plugin": {
"key": "valueA"
}
},
"databases": {
"your_db_name": {
"plugins": {
"datasette-my-plugin": {
"key": "valueA"
}
},
"tables": {
"your_table_name": {
"allow": {
"id": "root"
},
"plugins": {
"datasette-my-plugin": {
"key": "valueB"
}
}
}
}
}
}
}
.. [[[end]]]
.. _configuration_reference_settings:
Settings
~~~~~~~~
:ref:`settings` can be configured in ``datasette.yaml`` with the ``settings`` key:
.. [[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
"""
# inside datasette.yaml
settings:
default_allow_sql: off
default_page_size: 50
""").strip()
)
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
# inside datasette.yaml
settings:
default_allow_sql: off
default_page_size: 50
.. tab:: datasette.json
.. code-block:: json
{
"settings": {
"default_allow_sql": "off",
"default_page_size": 50
}
}
.. [[[end]]]
The full list of settings is available in the :ref:`settings documentation <settings>`. Settings can also be passed to Datasette using one or more ``--setting name value`` command line options.
.. _configuration_reference_plugins:
Plugin configuration
~~~~~~~~~~~~~~~~~~~~
:ref:`Datasette plugins <plugins>` often require configuration. This plugin configuration should be placed in ``plugins`` keys inside ``datasette.yaml``.
Most plugins are configured at the top-level of the file, using the ``plugins`` key:
.. [[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
"""
# inside datasette.yaml
plugins:
datasette-my-plugin:
key: my_value
""").strip()
)
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
# inside datasette.yaml
plugins:
datasette-my-plugin:
key: my_value
.. tab:: datasette.json
.. code-block:: json
{
"plugins": {
"datasette-my-plugin": {
"key": "my_value"
}
}
}
.. [[[end]]]
Some plugins can be configured at the database or table level. These should use a ``plugins`` key nested under the appropriate place within the ``databases`` object:
.. [[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
"""
# inside datasette.yaml
databases:
my_database:
# plugin configuration for the my_database database
plugins:
datasette-my-plugin:
key: my_value
my_other_database:
tables:
my_table:
# plugin configuration for the my_table table inside the my_other_database database
plugins:
datasette-my-plugin:
key: my_value
""").strip()
)
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
# inside datasette.yaml
databases:
my_database:
# plugin configuration for the my_database database
plugins:
datasette-my-plugin:
key: my_value
my_other_database:
tables:
my_table:
# plugin configuration for the my_table table inside the my_other_database database
plugins:
datasette-my-plugin:
key: my_value
.. tab:: datasette.json
.. code-block:: json
{
"databases": {
"my_database": {
"plugins": {
"datasette-my-plugin": {
"key": "my_value"
}
}
},
"my_other_database": {
"tables": {
"my_table": {
"plugins": {
"datasette-my-plugin": {
"key": "my_value"
}
}
}
}
}
}
}
.. [[[end]]]
.. _configuration_reference_permissions:
Permissions configuration
~~~~~~~~~~~~~~~~~~~~~~~~~
Datasette's :ref:`authentication and permissions <authentication>` system can also be configured using ``datasette.yaml``.
Here is a simple example:
.. [[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
"""
# Instance is only available to users 'sharon' and 'percy':
allow:
id:
- sharon
- percy
# Only 'percy' is allowed access to the accounting database:
databases:
accounting:
allow:
id: percy
""").strip()
)
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
# Instance is only available to users 'sharon' and 'percy':
allow:
id:
- sharon
- percy
# Only 'percy' is allowed access to the accounting database:
databases:
accounting:
allow:
id: percy
.. tab:: datasette.json
.. code-block:: json
{
"allow": {
"id": [
"sharon",
"percy"
]
},
"databases": {
"accounting": {
"allow": {
"id": "percy"
}
}
}
}
.. [[[end]]]
:ref:`authentication_permissions_config` has the full details.
.. _configuration_reference_canned_queries:
Canned queries configuration
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
:ref:`Canned queries <canned_queries>` are named SQL queries that appear in the Datasette interface. They can be configured in ``datasette.yaml`` using the ``queries`` key at the database level:
.. [[[cog
from metadata_doc import config_example
config_example(cog, {
"databases": {
"sf-trees": {
"queries": {
"just_species": {
"sql": "select qSpecies from Street_Tree_List"
}
}
}
}
})
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
databases:
sf-trees:
queries:
just_species:
sql: select qSpecies from Street_Tree_List
.. tab:: datasette.json
.. code-block:: json
{
"databases": {
"sf-trees": {
"queries": {
"just_species": {
"sql": "select qSpecies from Street_Tree_List"
}
}
}
}
}
.. [[[end]]]
See the :ref:`canned queries documentation <canned_queries>` for more, including how to configure :ref:`writable canned queries <canned_queries_writable>`.
.. _configuration_reference_css_js:
Custom CSS and JavaScript
~~~~~~~~~~~~~~~~~~~~~~~~~
Datasette can load additional CSS and JavaScript files, configured in ``datasette.yaml`` like this:
.. [[[cog
from metadata_doc import config_example
config_example(cog, """
extra_css_urls:
- https://simonwillison.net/static/css/all.bf8cd891642c.css
extra_js_urls:
- https://code.jquery.com/jquery-3.2.1.slim.min.js
""")
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
extra_css_urls:
- https://simonwillison.net/static/css/all.bf8cd891642c.css
extra_js_urls:
- https://code.jquery.com/jquery-3.2.1.slim.min.js
.. tab:: datasette.json
.. code-block:: json
{
"extra_css_urls": [
"https://simonwillison.net/static/css/all.bf8cd891642c.css"
],
"extra_js_urls": [
"https://code.jquery.com/jquery-3.2.1.slim.min.js"
]
}
.. [[[end]]]
The extra CSS and JavaScript files will be linked in the ``<head>`` of every page:
.. code-block:: html
<link rel="stylesheet" href="https://simonwillison.net/static/css/all.bf8cd891642c.css">
<script src="https://code.jquery.com/jquery-3.2.1.slim.min.js"></script>
You can also specify a SRI (subresource integrity hash) for these assets:
.. [[[cog
config_example(cog, """
extra_css_urls:
- url: https://simonwillison.net/static/css/all.bf8cd891642c.css
sri: sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI
extra_js_urls:
- url: https://code.jquery.com/jquery-3.2.1.slim.min.js
sri: sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=
""")
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
extra_css_urls:
- url: https://simonwillison.net/static/css/all.bf8cd891642c.css
sri: sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI
extra_js_urls:
- url: https://code.jquery.com/jquery-3.2.1.slim.min.js
sri: sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=
.. tab:: datasette.json
.. code-block:: json
{
"extra_css_urls": [
{
"url": "https://simonwillison.net/static/css/all.bf8cd891642c.css",
"sri": "sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"
}
],
"extra_js_urls": [
{
"url": "https://code.jquery.com/jquery-3.2.1.slim.min.js",
"sri": "sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="
}
]
}
.. [[[end]]]
This will produce:
.. code-block:: html
<link rel="stylesheet" href="https://simonwillison.net/static/css/all.bf8cd891642c.css"
integrity="sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"
crossorigin="anonymous">
<script src="https://code.jquery.com/jquery-3.2.1.slim.min.js"
integrity="sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="
crossorigin="anonymous"></script>
Modern browsers will only execute the stylesheet or JavaScript if the SRI hash
matches the content served. You can generate hashes using `www.srihash.org <https://www.srihash.org/>`_
Items in ``"extra_js_urls"`` can specify ``"module": true`` if they reference JavaScript that uses `JavaScript modules <https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules>`__. This configuration:
.. [[[cog
config_example(cog, """
extra_js_urls:
- url: https://example.datasette.io/module.js
module: true
""")
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
extra_js_urls:
- url: https://example.datasette.io/module.js
module: true
.. tab:: datasette.json
.. code-block:: json
{
"extra_js_urls": [
{
"url": "https://example.datasette.io/module.js",
"module": true
}
]
}
.. [[[end]]]
Will produce this HTML:
.. code-block:: html
<script type="module" src="https://example.datasette.io/module.js"></script>

View file

@ -19,7 +19,7 @@ General guidelines
Setting up a development environment
------------------------------------
If you have Python 3.7 or higher installed on your computer (on OS X the quickest way to do this `is using homebrew <https://docs.python-guide.org/starting/install3/osx/>`__) you can install an editable copy of Datasette using the following steps.
If you have Python 3.8 or higher installed on your computer (on OS X the quickest way to do this `is using homebrew <https://docs.python-guide.org/starting/install3/osx/>`__) you can install an editable copy of Datasette using the following steps.
If you want to use GitHub to publish your changes, first `create a fork of datasette <https://github.com/simonw/datasette/fork>`__ under your own GitHub account.
@ -133,13 +133,19 @@ Running Black
Black will be installed when you run ``pip install -e '.[test]'``. To test that your code complies with Black, run the following in your root ``datasette`` repository checkout::
$ black . --check
black . --check
::
All done! ✨ 🍰 ✨
95 files would be left unchanged.
If any of your code does not conform to Black you can run this to automatically fix those problems::
$ black .
black .
::
reformatted ../datasette/setup.py
All done! ✨ 🍰 ✨
1 file reformatted, 94 files left unchanged.
@ -160,11 +166,14 @@ Prettier
To install Prettier, `install Node.js <https://nodejs.org/en/download/package-manager/>`__ and then run the following in the root of your ``datasette`` repository checkout::
$ npm install
npm install
This will install Prettier in a ``node_modules`` directory. You can then check that your code matches the coding style like so::
$ npm run prettier -- --check
npm run prettier -- --check
::
> prettier
> prettier 'datasette/static/*[!.min].js' "--check"
@ -174,7 +183,7 @@ This will install Prettier in a ``node_modules`` directory. You can then check t
You can fix any problems by running::
$ npm run fix
npm run fix
.. _contributing_documentation:
@ -245,6 +254,7 @@ Datasette releases are performed using tags. When a new release is published on
* Re-point the "latest" tag on Docker Hub to the new image
* Build a wheel bundle of the underlying Python source code
* Push that new wheel up to PyPI: https://pypi.org/project/datasette/
* If the release is an alpha, navigate to https://readthedocs.org/projects/datasette/versions/ and search for the tag name in the "Activate a version" filter, then mark that version as "active" to ensure it will appear on the public ReadTheDocs documentation site.
To deploy new releases you will need to have push access to the main Datasette GitHub repository.
@ -322,20 +332,17 @@ Upgrading CodeMirror
Datasette bundles `CodeMirror <https://codemirror.net/>`__ for the SQL editing interface, e.g. on `this page <https://latest.datasette.io/fixtures>`__. Here are the steps for upgrading to a new version of CodeMirror:
* Download and extract latest CodeMirror zip file from https://codemirror.net/codemirror.zip
* Rename ``lib/codemirror.js`` to ``codemirror-5.57.0.js`` (using latest version number)
* Rename ``lib/codemirror.css`` to ``codemirror-5.57.0.css``
* Rename ``mode/sql/sql.js`` to ``codemirror-5.57.0-sql.js``
* Edit both JavaScript files to make the top license comment a ``/* */`` block instead of multiple ``//`` lines
* Minify the JavaScript files like this::
* Install the packages with::
npx uglify-js codemirror-5.57.0.js -o codemirror-5.57.0.min.js --comments '/LICENSE/'
npx uglify-js codemirror-5.57.0-sql.js -o codemirror-5.57.0-sql.min.js --comments '/LICENSE/'
npm i codemirror @codemirror/lang-sql
* Check that the LICENSE comment did indeed survive minification
* Minify the CSS file like this::
* Build the bundle using the version number from package.json with::
npx clean-css-cli codemirror-5.57.0.css -o codemirror-5.57.0.min.css
node_modules/.bin/rollup datasette/static/cm-editor-6.0.1.js \
-f iife \
-n cm \
-o datasette/static/cm-editor-6.0.1.bundle.js \
-p @rollup/plugin-node-resolve \
-p @rollup/plugin-terser
* Edit the ``_codemirror.html`` template to reference the new files
* ``git rm`` the old files, ``git add`` the new files
* Update the version reference in the ``codemirror.html`` template.

View file

@ -5,87 +5,6 @@ Custom pages and templates
Datasette provides a number of ways of customizing the way data is displayed.
.. _customization_css_and_javascript:
Custom CSS and JavaScript
-------------------------
When you launch Datasette, you can specify a custom metadata file like this::
datasette mydb.db --metadata metadata.json
Your ``metadata.json`` file can include links that look like this:
.. code-block:: json
{
"extra_css_urls": [
"https://simonwillison.net/static/css/all.bf8cd891642c.css"
],
"extra_js_urls": [
"https://code.jquery.com/jquery-3.2.1.slim.min.js"
]
}
The extra CSS and JavaScript files will be linked in the ``<head>`` of every page:
.. code-block:: html
<link rel="stylesheet" href="https://simonwillison.net/static/css/all.bf8cd891642c.css">
<script src="https://code.jquery.com/jquery-3.2.1.slim.min.js"></script>
You can also specify a SRI (subresource integrity hash) for these assets:
.. code-block:: json
{
"extra_css_urls": [
{
"url": "https://simonwillison.net/static/css/all.bf8cd891642c.css",
"sri": "sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"
}
],
"extra_js_urls": [
{
"url": "https://code.jquery.com/jquery-3.2.1.slim.min.js",
"sri": "sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="
}
]
}
This will produce:
.. code-block:: html
<link rel="stylesheet" href="https://simonwillison.net/static/css/all.bf8cd891642c.css"
integrity="sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"
crossorigin="anonymous">
<script src="https://code.jquery.com/jquery-3.2.1.slim.min.js"
integrity="sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="
crossorigin="anonymous"></script>
Modern browsers will only execute the stylesheet or JavaScript if the SRI hash
matches the content served. You can generate hashes using `www.srihash.org <https://www.srihash.org/>`_
Items in ``"extra_js_urls"`` can specify ``"module": true`` if they reference JavaScript that uses `JavaScript modules <https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules>`__. This configuration:
.. code-block:: json
{
"extra_js_urls": [
{
"url": "https://example.datasette.io/module.js",
"module": true
}
]
}
Will produce this HTML:
.. code-block:: html
<script type="module" src="https://example.datasette.io/module.js"></script>
CSS classes on the <body>
~~~~~~~~~~~~~~~~~~~~~~~~~
@ -179,25 +98,49 @@ Consider the following directory structure::
You can start Datasette using ``--static assets:static-files/`` to serve those
files from the ``/assets/`` mount point::
$ datasette -m metadata.json --static assets:static-files/ --memory
datasette --config datasette.yaml --static assets:static-files/ --memory
The following URLs will now serve the content from those CSS and JS files::
http://localhost:8001/assets/styles.css
http://localhost:8001/assets/app.js
You can reference those files from ``metadata.json`` like so:
You can reference those files from ``datasette.yaml`` like this, see :ref:`custom CSS and JavaScript <configuration_reference_css_js>` for more details:
.. code-block:: json
.. [[[cog
from metadata_doc import config_example
config_example(cog, """
extra_css_urls:
- /assets/styles.css
extra_js_urls:
- /assets/app.js
""")
.. ]]]
{
"extra_css_urls": [
.. tab:: datasette.yaml
.. code-block:: yaml
extra_css_urls:
- /assets/styles.css
extra_js_urls:
- /assets/app.js
.. tab:: datasette.json
.. code-block:: json
{
"extra_css_urls": [
"/assets/styles.css"
],
"extra_js_urls": [
],
"extra_js_urls": [
"/assets/app.js"
]
}
]
}
.. [[[end]]]
Publishing static assets
~~~~~~~~~~~~~~~~~~~~~~~~
@ -205,7 +148,7 @@ Publishing static assets
The :ref:`cli_publish` command can be used to publish your static assets,
using the same syntax as above::
$ datasette publish cloudrun mydb.db --static assets:static-files/
datasette publish cloudrun mydb.db --static assets:static-files/
This will upload the contents of the ``static-files/`` directory as part of the
deployment, and configure Datasette to correctly serve the assets from ``/assets/``.
@ -338,7 +281,7 @@ You can add templated pages to your Datasette instance by creating HTML files in
For example, to add a custom page that is served at ``http://localhost/about`` you would create a file in ``templates/pages/about.html``, then start Datasette like this::
$ datasette mydb.db --template-dir=templates/
datasette mydb.db --template-dir=templates/
You can nest directories within pages to create a nested structure. To create a ``http://localhost:8001/about/map`` page you would create ``templates/pages/about/map.html``.
@ -393,7 +336,7 @@ To serve a custom HTTP header, add a ``custom_header(name, value)`` function cal
You can verify this is working using ``curl`` like this::
$ curl -I 'http://127.0.0.1:8001/teapot'
curl -I 'http://127.0.0.1:8001/teapot'
HTTP/1.1 418
date: Sun, 26 Apr 2020 18:38:30 GMT
server: uvicorn

View file

@ -56,7 +56,7 @@ Create a file at ``/etc/systemd/system/datasette.service`` with the following co
Add a random value for the ``DATASETTE_SECRET`` - this will be used to sign Datasette cookies such as the CSRF token cookie. You can generate a suitable value like so::
$ python3 -c 'import secrets; print(secrets.token_hex(32))'
python3 -c 'import secrets; print(secrets.token_hex(32))'
This configuration will run Datasette against all database files contained in the ``/home/ubuntu/datasette-root`` directory. If that directory contains a ``metadata.yml`` (or ``.json``) file or a ``templates/`` or ``plugins/`` sub-directory those will automatically be loaded by Datasette - see :ref:`config_dir` for details.

docs/events.rst 100644
View file

@ -0,0 +1,14 @@
.. _events:
Events
======
Datasette includes a mechanism for tracking events that occur while the software is running. This is primarily intended to be used by plugins, which can both trigger events and listen for events.
The core Datasette application triggers events when certain things happen. This page describes those events.
Plugins can listen for events using the :ref:`plugin_hook_track_event` plugin hook, which will be called with instances of the following classes - or additional classes :ref:`registered by other plugins <plugin_hook_register_events>`.
.. automodule:: datasette.events
:members:
:exclude-members: Event
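For example, a plugin could log every event it receives. Here is a minimal sketch of a listener - see :ref:`plugin_hook_track_event` for the full hook documentation; ``event.name`` and ``event.actor`` are the common ``Event`` properties:

.. code-block:: python

from datasette import hookimpl


@hookimpl
def track_event(event):
    # event.name is the registered name of the event,
    # event.actor is the actor responsible for it (may be None)
    print(event.name, event.actor)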

View file

@ -98,16 +98,16 @@ You can increase this on an individual page by adding ``?_facet_size=100`` to th
.. _facets_metadata:
Facets in metadata.json
-----------------------
Facets in metadata
------------------
You can turn facets on by default for specific tables by adding them to a ``"facets"`` key in a Datasette :ref:`metadata` file.
Here's an example that turns on faceting by default for the ``qLegalStatus`` column in the ``Street_Tree_List`` table in the ``sf-trees`` database:
.. code-block:: json
{
.. [[[cog
from metadata_doc import metadata_example
metadata_example(cog, {
"databases": {
"sf-trees": {
"tables": {
@ -117,26 +117,82 @@ Here's an example that turns on faceting by default for the ``qLegalStatus`` col
}
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
sf-trees:
tables:
Street_Tree_List:
facets:
- qLegalStatus
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"sf-trees": {
"tables": {
"Street_Tree_List": {
"facets": [
"qLegalStatus"
]
}
}
}
}
}
.. [[[end]]]
Facets defined in this way will always be shown in the interface and returned in the API, regardless of the ``_facet`` arguments passed to the view.
You can specify :ref:`array <facet_by_json_array>` or :ref:`date <facet_by_date>` facets in metadata using JSON objects with a single key of ``array`` or ``date`` and a value specifying the column, like this:
.. code-block:: json
.. [[[cog
metadata_example(cog, {
"facets": [
{"array": "tags"},
{"date": "created"}
]
})
.. ]]]
{
"facets": [
{"array": "tags"},
{"date": "created"}
]
}
.. tab:: metadata.yaml
.. code-block:: yaml
facets:
- array: tags
- date: created
.. tab:: metadata.json
.. code-block:: json
{
"facets": [
{
"array": "tags"
},
{
"date": "created"
}
]
}
.. [[[end]]]
You can change the default facet size (the number of results shown for each facet) for a table using ``facet_size``:
.. code-block:: json
{
.. [[[cog
metadata_example(cog, {
"databases": {
"sf-trees": {
"tables": {
@ -147,7 +203,41 @@ You can change the default facet size (the number of results shown for each face
}
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
sf-trees:
tables:
Street_Tree_List:
facets:
- qLegalStatus
facet_size: 10
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"sf-trees": {
"tables": {
"Street_Tree_List": {
"facets": [
"qLegalStatus"
],
"facet_size": 10
}
}
}
}
}
.. [[[end]]]
Suggested facets
----------------
@ -170,14 +260,17 @@ Speeding up facets with indexes
The performance of facets can be greatly improved by adding indexes on the columns you wish to facet by.
You can add indexes using the ``sqlite3`` command-line utility. Here's how to add an index on the ``state`` column in a table called ``Food_Trucks``::
$ sqlite3 mydatabase.db
sqlite3 mydatabase.db
::
SQLite version 3.19.3 2017-06-27 16:48:08
Enter ".help" for usage hints.
sqlite> CREATE INDEX Food_Trucks_state ON Food_Trucks("state");
Or using the `sqlite-utils <https://sqlite-utils.datasette.io/en/stable/cli.html#creating-indexes>`__ command-line utility::
$ sqlite-utils create-index mydatabase.db Food_Trucks state
sqlite-utils create-index mydatabase.db Food_Trucks state
.. _facet_by_json_array:

View file

@ -64,9 +64,9 @@ The ``"searchmode": "raw"`` property can be used to default the table to accepti
Here is an example which enables full-text search (with SQLite advanced search operators) for a ``display_ads`` view which is defined against the ``ads`` table and hence needs to run FTS against the ``ads_fts`` table, using the ``id`` as the primary key:
.. code-block:: json
{
.. [[[cog
from metadata_doc import metadata_example
metadata_example(cog, {
"databases": {
"russian-ads": {
"tables": {
@ -78,7 +78,40 @@ Here is an example which enables full-text search (with SQLite advanced search o
}
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
russian-ads:
tables:
display_ads:
fts_table: ads_fts
fts_pk: id
searchmode: raw
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"russian-ads": {
"tables": {
"display_ads": {
"fts_table": "ads_fts",
"fts_pk": "id",
"searchmode": "raw"
}
}
}
}
}
.. [[[end]]]
.. _full_text_search_custom_sql:
@ -144,14 +177,14 @@ Configuring FTS using sqlite-utils
Here's how to use ``sqlite-utils`` to enable full-text search for an ``items`` table across the ``name`` and ``description`` columns::
$ sqlite-utils enable-fts mydatabase.db items name description
sqlite-utils enable-fts mydatabase.db items name description
Configuring FTS using csvs-to-sqlite
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
If your data starts out in CSV files, you can use Datasette's companion tool `csvs-to-sqlite <https://github.com/simonw/csvs-to-sqlite>`__ to convert that file into a SQLite database and enable full-text search on specific columns. For a file called ``items.csv`` where you want full-text search to operate against the ``name`` and ``description`` columns you would run the following::
$ csvs-to-sqlite items.csv items.db -f name -f description
csvs-to-sqlite items.csv items.db -f name -f description
Configuring FTS by hand
~~~~~~~~~~~~~~~~~~~~~~~

View file

@ -17,7 +17,7 @@ datasette| |discord|
.. |docker: datasette| image:: https://img.shields.io/badge/docker-datasette-blue
:target: https://hub.docker.com/r/datasetteproject/datasette
.. |discord| image:: https://img.shields.io/discord/823971286308356157?label=discord
:target: https://discord.gg/ktd74dm5mw
:target: https://datasette.io/discord
*An open source multi-tool for exploring and publishing data*
@ -29,7 +29,7 @@ Datasette is aimed at data journalists, museum curators, archivists, local gover
Interested in learning Datasette? Start with `the official tutorials <https://datasette.io/tutorials>`__.
Support questions, feedback? Join our `GitHub Discussions forum <https://github.com/simonw/datasette/discussions>`__.
Support questions, feedback? Join the `Datasette Discord <https://datasette.io/discord>`__.
Contents
--------
@ -39,6 +39,7 @@ Contents
getting_started
installation
configuration
ecosystem
cli-reference
pages
@ -59,8 +60,10 @@ Contents
custom_templates
plugins
writing_plugins
javascript_plugins
plugin_hooks
testing_plugins
internals
events
contributing
changelog

View file

@ -57,7 +57,7 @@ If the latest packaged release of Datasette has not yet been made available thro
Using pip
---------
Datasette requires Python 3.7 or higher. Visit `InstallPython3.com <https://installpython3.com/>`__ for step-by-step installation guides for your operating system.
Datasette requires Python 3.8 or higher. The `Python.org Python For Beginners <https://www.python.org/about/gettingstarted/>`__ page has instructions for getting started.
You can install Datasette and its dependencies using ``pip``::
@ -102,11 +102,21 @@ Installing plugins using pipx
You can install additional datasette plugins with ``pipx inject`` like so::
$ pipx inject datasette datasette-json-html
pipx inject datasette datasette-json-html
::
injected package datasette-json-html into venv datasette
done! ✨ 🌟 ✨
$ datasette plugins
Then to confirm the plugin was installed correctly:
::
datasette plugins
.. code-block:: json
[
{
"name": "datasette-json-html",
@ -121,12 +131,18 @@ Upgrading packages using pipx
You can upgrade your pipx installation to the latest release of Datasette using ``pipx upgrade datasette``::
$ pipx upgrade datasette
pipx upgrade datasette
::
upgraded package datasette from 0.39 to 0.40 (location: /Users/simon/.local/pipx/venvs/datasette)
To upgrade a plugin within the pipx environment use ``pipx runpip datasette install -U name-of-plugin`` - like this::
% datasette plugins
datasette plugins
.. code-block:: json
[
{
"name": "datasette-vega",
@ -136,7 +152,12 @@ To upgrade a plugin within the pipx environment use ``pipx runpip datasette inst
}
]
$ pipx runpip datasette install -U datasette-vega
Now upgrade the plugin::
pipx runpip datasette install -U datasette-vega
::
Collecting datasette-vega
Downloading datasette_vega-0.6.2-py3-none-any.whl (1.8 MB)
|████████████████████████████████| 1.8 MB 2.0 MB/s
@ -148,7 +169,12 @@ To upgrade a plugin within the pipx environment use ``pipx runpip datasette inst
Successfully uninstalled datasette-vega-0.6
Successfully installed datasette-vega-0.6.2
$ datasette plugins
To confirm the upgrade::
datasette plugins
.. code-block:: json
[
{
"name": "datasette-vega",
@ -230,3 +256,60 @@ Some plugins such as `datasette-ripgrep <https://datasette.io/plugins/datasette-
pip install datasette-ripgrep'
docker commit $(docker ps -lq) datasette-with-ripgrep
.. _installation_extensions:
A note about extensions
=======================
SQLite supports extensions, such as :ref:`spatialite` for geospatial operations.
These can be loaded using the ``--load-extension`` argument, like so::
datasette --load-extension=/usr/local/lib/mod_spatialite.dylib
Some Python installations do not include support for SQLite extensions. If this is the case, you will see the following error when you attempt to load an extension::
Your Python installation does not have the ability to load SQLite extensions.
In some cases you may see the following error message instead::
AttributeError: 'sqlite3.Connection' object has no attribute 'enable_load_extension'
On macOS the easiest fix for this is to install Datasette using Homebrew::
brew install datasette
Use ``which datasette`` to confirm that ``datasette`` will run that version. The output should look something like this::
/usr/local/opt/datasette/bin/datasette
If you get a different location here such as ``/Library/Frameworks/Python.framework/Versions/3.10/bin/datasette`` you can run the following command to cause ``datasette`` to execute the Homebrew version instead::
alias datasette=$(echo $(brew --prefix datasette)/bin/datasette)
You can undo this operation using::
unalias datasette
If you need to run SQLite with extension support for other Python code, you can do so by installing Python itself using Homebrew::
brew install python
Then executing Python using::
/usr/local/opt/python@3/libexec/bin/python
A more convenient way to work with this version of Python may be to use it to create a virtual environment::
/usr/local/opt/python@3/libexec/bin/python -m venv datasette-venv
Then activate it like this::
source datasette-venv/bin/activate
Now running ``python`` and ``pip`` will work against a version of Python 3 that includes support for SQLite extensions::
pip install datasette
which datasette
datasette --version

View file

@ -210,8 +210,7 @@ To set cookies on the response, use the ``response.set_cookie(...)`` method. The
secure=False,
httponly=False,
samesite="lax",
):
...
): ...
You can use this with :ref:`datasette.sign() <datasette_sign>` to set signed cookies. Here's how you would set the :ref:`ds_actor cookie <authentication_ds_actor>` for use with Datasette :ref:`authentication <authentication>`:
@ -271,7 +270,16 @@ Property exposing a ``collections.OrderedDict`` of databases currently connected
The dictionary keys are the name of the database that is used in the URL - e.g. ``/fixtures`` would have a key of ``"fixtures"``. The values are :ref:`internals_database` instances.
All databases are listed, irrespective of user permissions. This means that the ``_internal`` database will always be listed here.
All databases are listed, irrespective of user permissions.
.. _datasette_permissions:
.permissions
------------
Property exposing a dictionary of permissions that have been registered using the :ref:`plugin_register_permissions` plugin hook.
The dictionary keys are the permission names - e.g. ``view-instance`` - and the values are ``Permission()`` objects describing the permission. Here is a :ref:`description of that object <plugin_register_permissions>`.
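For example, a brief sketch - the ``Permission`` attributes used here follow the description in :ref:`plugin_register_permissions`:

.. code-block:: python

permission = datasette.permissions["view-instance"]
# Inspect the registered permission
print(permission.name, permission.abbr, permission.default)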
.. _datasette_plugin_config:
@ -287,7 +295,7 @@ All databases are listed, irrespective of user permissions. This means that the
``table`` - None or string
The table the user is interacting with.
This method lets you read plugin configuration values that were set in ``metadata.json``. See :ref:`writing_plugins_configuration` for full details of how this method should be used.
This method lets you read plugin configuration values that were set in ``datasette.yaml``. See :ref:`writing_plugins_configuration` for full details of how this method should be used.
The return value will be the value from the configuration file - usually a dictionary.
@ -313,10 +321,31 @@ await .render_template(template, context=None, request=None)
Renders a `Jinja template <https://jinja.palletsprojects.com/en/2.11.x/>`__ using Datasette's preconfigured instance of Jinja and returns the resulting string. The template will have access to Datasette's default template functions and any functions that have been made available by other plugins.
.. _datasette_actors_from_ids:
await .actors_from_ids(actor_ids)
---------------------------------
``actor_ids`` - list of strings or integers
A list of actor IDs to look up.
Returns a dictionary, where the keys are the IDs passed to it and the values are the corresponding actor dictionaries.
This method is mainly designed to be used with plugins. See the :ref:`plugin_hook_actors_from_ids` documentation for details.
If no plugins that implement that hook are installed, the default return value looks like this:
.. code-block:: json
{
"1": {"id": "1"},
"2": {"id": "2"}
}
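Example usage - a sketch matching that default behavior:

.. code-block:: python

actors = await datasette.actors_from_ids(["1", "2"])
# With no actors_from_ids plugins installed this returns:
# {"1": {"id": "1"}, "2": {"id": "2"}}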
.. _datasette_permission_allowed:
await .permission_allowed(actor, action, resource=None, default=False)
----------------------------------------------------------------------
await .permission_allowed(actor, action, resource=None, default=...)
--------------------------------------------------------------------
``actor`` - dictionary
The authenticated actor. This is usually ``request.actor``.
@ -327,12 +356,14 @@ await .permission_allowed(actor, action, resource=None, default=False)
``resource`` - string or tuple, optional
The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource.
``default`` - optional, True or False
Should this permission check be default allow or default deny.
``default`` - optional: True, False or None
What value should be returned by default if nothing provides an opinion on this permission check.
Set to ``True`` for default allow or ``False`` for default deny.
If not specified the ``default`` from the ``Permission()`` tuple that was registered using :ref:`plugin_register_permissions` will be used.
Check if the given actor has :ref:`permission <authentication_permissions>` to perform the given action on the given resource.
Some permission checks are carried out against :ref:`rules defined in metadata.json <authentication_permissions_metadata>`, while other custom permissions may be decided by plugins that implement the :ref:`plugin_hook_permission_allowed` plugin hook.
Some permission checks are carried out against :ref:`rules defined in datasette.yaml <authentication_permissions_config>`, while other custom permissions may be decided by plugins that implement the :ref:`plugin_hook_permission_allowed` plugin hook.
If neither ``datasette.yaml`` nor any of the plugins provide an answer to the permission query, the ``default`` argument will be returned.
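For example, a minimal sketch that checks whether the current actor may view a table - the database and table names here are placeholders:

.. code-block:: python

allowed = await datasette.permission_allowed(
    request.actor,
    "view-table",
    resource=("fixtures", "facetable"),
)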
@ -355,7 +386,7 @@ This is useful when you need to check multiple permissions at once. For example,
.. code-block:: python
await self.ds.ensure_permissions(
await datasette.ensure_permissions(
request.actor,
[
("view-table", (database, table)),
@ -389,7 +420,7 @@ This example checks if the user can access a specific table, and sets ``private`
.. code-block:: python
visible, private = await self.ds.check_visibility(
visible, private = await datasette.check_visibility(
request.actor,
action="view-table",
resource=(database, table),
@ -399,7 +430,7 @@ The following example runs three checks in a row, similar to :ref:`datasette_ens
.. code-block:: python
visible, private = await self.ds.check_visibility(
visible, private = await datasette.check_visibility(
request.actor,
permissions=[
("view-table", (database, table)),
@ -408,6 +439,66 @@ The following example runs three checks in a row, similar to :ref:`datasette_ens
],
)
.. _datasette_create_token:
.create_token(actor_id, expires_after=None, restrict_all=None, restrict_database=None, restrict_resource=None)
--------------------------------------------------------------------------------------------------------------
``actor_id`` - string
The ID of the actor to create a token for.
``expires_after`` - int, optional
The number of seconds after which the token should expire.
``restrict_all`` - iterable, optional
A list of actions that this token should be restricted to across all databases and resources.
``restrict_database`` - dict, optional
For restricting actions within specific databases, e.g. ``{"mydb": ["view-table", "view-query"]}``.
``restrict_resource`` - dict, optional
For restricting actions to specific resources (tables, SQL views and :ref:`canned_queries`) within a database. For example: ``{"mydb": {"mytable": ["insert-row", "update-row"]}}``.
This method returns a signed :ref:`API token <CreateTokenView>` of the format ``dstok_...`` which can be used to authenticate requests to the Datasette API.
All tokens must have an ``actor_id`` string indicating the ID of the actor which the token will act on behalf of.
Tokens default to lasting forever, but can be set to expire after a given number of seconds using the ``expires_after`` argument. The following code creates a token for ``user1`` that will expire after an hour:
.. code-block:: python
token = datasette.create_token(
actor_id="user1",
expires_after=3600,
)
The three ``restrict_*`` arguments can be used to create a token that has additional restrictions beyond what the associated actor is allowed to do.
The following example creates a token that can access ``view-instance`` and ``view-table`` across everything, can additionally use ``view-query`` for anything in the ``docs`` database and is allowed to execute ``insert-row`` and ``update-row`` in the ``attachments`` table in that database:
.. code-block:: python
token = datasette.create_token(
actor_id="user1",
restrict_all=("view-instance", "view-table"),
restrict_database={"docs": ("view-query",)},
restrict_resource={
"docs": {
"attachments": ("insert-row", "update-row")
}
},
)
.. _datasette_get_permission:
.get_permission(name_or_abbr)
-----------------------------
``name_or_abbr`` - string
The name or abbreviation of the permission to look up, e.g. ``view-table`` or ``vt``.
Returns a :ref:`Permission object <plugin_register_permissions>` representing the permission, or raises a ``KeyError`` if one is not found.
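For example, a brief sketch:

.. code-block:: python

permission = datasette.get_permission("view-table")
# The abbreviation resolves to the same permission:
permission = datasette.get_permission("vt")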
.. _datasette_get_database:
.get_database(name)
@ -418,6 +509,13 @@ The following example runs three checks in a row, similar to :ref:`datasette_ens
Returns the specified database object. Raises a ``KeyError`` if the database does not exist. Call this method without an argument to return the first connected database.
.. _get_internal_database:
.get_internal_database()
------------------------
Returns a database object for reading and writing to the private :ref:`internal database <internals_internal>`.
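For example, a plugin might persist its own state there. A sketch - the table name and schema are illustrative only:

.. code-block:: python

internal_db = datasette.get_internal_database()
await internal_db.execute_write(
    "create table if not exists my_plugin_state"
    " (key text primary key, value text)"
)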
.. _datasette_add_database:
.add_database(db, name=None, route=None)
@ -495,6 +593,26 @@ Using either of these pattern will result in the in-memory database being served
This removes a database that has been previously added. ``name=`` is the unique name of that database.
.. _datasette_track_event:
await .track_event(event)
-------------------------
``event`` - ``Event``
An instance of a subclass of ``datasette.events.Event``.
Plugins can call this to track events, using classes they have previously registered. See :ref:`plugin_event_tracking` for details.
The event will then be passed to all plugins that have registered to receive events using the :ref:`plugin_hook_track_event` hook.
Example usage, assuming the plugin has previously registered the ``BanUserEvent`` class:
.. code-block:: python
await datasette.track_event(
BanUserEvent(user={"id": 1, "username": "cleverbot"})
)
.. _datasette_sign:
.sign(value, namespace="default")
@ -579,6 +697,84 @@ For example:
downloads_are_allowed = datasette.setting("allow_download")
.. _datasette_resolve_database:
.resolve_database(request)
--------------------------
``request`` - :ref:`internals_request`
A request object
If you are implementing your own custom views, you may need to resolve the database that the user is requesting based on a URL path. If the regular expression for your route declares a ``database`` named group, you can use this method to resolve the database object.
This returns a :ref:`Database <internals_database>` instance.
If the database cannot be found, it raises a ``datasette.utils.asgi.DatabaseNotFound`` exception - which is a subclass of ``datasette.utils.asgi.NotFound`` with a ``.database_name`` attribute set to the name of the database that was requested.
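A sketch of how a custom view might use this - the route pattern and view function here are illustrative, not part of Datasette's API:
.. code-block:: python

    from datasette import hookimpl, Response
    from datasette.utils.asgi import DatabaseNotFound


    async def database_name_view(request, datasette):
        try:
            db = datasette.resolve_database(request)
        except DatabaseNotFound:
            return Response.text("No such database", status=404)
        return Response.text("Database name: {}".format(db.name))


    @hookimpl
    def register_routes():
        return [(r"^/-/database-name/(?P<database>[^/]+)$", database_name_view)]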
.. _datasette_resolve_table:
.resolve_table(request)
-----------------------
``request`` - :ref:`internals_request`
A request object
This assumes that the regular expression for your route declares both a ``database`` and a ``table`` named group.
It returns a ``ResolvedTable`` named tuple instance with the following fields:
``db`` - :ref:`Database <internals_database>`
The database object
``table`` - string
The name of the table (or view)
``is_view`` - boolean
``True`` if this is a view, ``False`` if it is a table
If the database or table cannot be found it raises a ``datasette.utils.asgi.DatabaseNotFound`` exception.
If the table does not exist it raises a ``datasette.utils.asgi.TableNotFound`` exception - a subclass of ``datasette.utils.asgi.NotFound`` with ``.database_name`` and ``.table`` attributes.
.. _datasette_resolve_row:
.resolve_row(request)
---------------------
``request`` - :ref:`internals_request`
A request object
This method assumes your route declares named groups for ``database``, ``table`` and ``pks``.
It returns a ``ResolvedRow`` named tuple instance with the following fields:
``db`` - :ref:`Database <internals_database>`
The database object
``table`` - string
The name of the table
``sql`` - string
SQL snippet that can be used in a ``WHERE`` clause to select the row
``params`` - dict
Parameters that should be passed to the SQL query
``pks`` - list
List of primary key column names
``pk_values`` - list
List of primary key values decoded from the URL
``row`` - ``sqlite3.Row``
The row itself
If the database or table cannot be found it raises a ``datasette.utils.asgi.DatabaseNotFound`` exception.
If the table does not exist it raises a ``datasette.utils.asgi.TableNotFound`` exception.
If the row cannot be found it raises a ``datasette.utils.asgi.RowNotFound`` exception. This has ``.database_name``, ``.table`` and ``.pk_values`` attributes, extracted from the request path.
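A sketch of a custom view built on this method, with an illustrative route pattern - catching ``NotFound`` covers all three exception subclasses:
.. code-block:: python

    from datasette import hookimpl, Response
    from datasette.utils.asgi import NotFound


    async def row_as_json(request, datasette):
        try:
            resolved = await datasette.resolve_row(request)
        except NotFound:
            # Covers DatabaseNotFound, TableNotFound and RowNotFound
            return Response.text("Not found", status=404)
        row = resolved.row
        return Response.json({key: row[key] for key in row.keys()})


    @hookimpl
    def register_routes():
        return [
            (
                r"^/-/row-as-json/(?P<database>[^/]+)/(?P<table>[^/]+)/(?P<pks>[^/]+)$",
                row_as_json,
            )
        ]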
.. _internals_datasette_client:
datasette.client
@ -770,7 +966,7 @@ The ``Results`` object also has the following properties and methods:
``.columns`` - list of strings
A list of column names returned by the query.
``.rows`` - list of ``sqlite3.Row``
This property provides direct access to the list of rows returned by the database. You can access specific rows by index using ``results.rows[0]``.
``.first()`` - row or None
@ -814,7 +1010,9 @@ You can pass additional SQL parameters as a tuple or dictionary.
The method will block until the operation is completed, and the return value will be the return from calling ``conn.execute(...)`` using the underlying ``sqlite3`` Python library.
If you pass ``block=False`` this behavior changes to "fire and forget" - queries will be added to the write queue and executed in a separate thread while your code can continue to do other things. The method will return a UUID representing the queued task.
Each call to ``execute_write()`` will be executed inside a transaction.
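For example - a sketch using a hypothetical ``audit_log`` table:
.. code-block:: python

    # block=False queues the write and returns a task UUID immediately
    task_id = await db.execute_write(
        "insert into audit_log (message) values (?)",
        ["backup started"],
        block=False,
    )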
.. _database_execute_write_script:
@ -823,6 +1021,8 @@ await db.execute_write_script(sql, block=True)
Like ``execute_write()`` but can be used to send multiple SQL statements in a single string separated by semicolons, using the ``sqlite3`` `conn.executescript() <https://docs.python.org/3/library/sqlite3.html#sqlite3.Cursor.executescript>`__ method.
Each call to ``execute_write_script()`` will be executed inside a transaction.
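A sketch of creating two related tables in a single call, assuming neither exists yet:
.. code-block:: python

    await db.execute_write_script(
        """
        create table if not exists tags (tag text primary key);
        create table if not exists item_tags (
            item_id integer,
            tag text references tags(tag)
        );
        """
    )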
.. _database_execute_write_many:
await db.execute_write_many(sql, params_seq, block=True)
@ -837,10 +1037,12 @@ Like ``execute_write()`` but uses the ``sqlite3`` `conn.executemany() <https://d
[(1, "Melanie"), (2, "Selma"), (2, "Viktor")],
)
Each call to ``execute_write_many()`` will be executed inside a transaction.
.. _database_execute_write_fn:
await db.execute_write_fn(fn, block=True, transaction=True)
-----------------------------------------------------------
This method works like ``.execute_write()``, but instead of a SQL statement you give it a callable Python function. Your function will be queued up and then called when the write connection is available, passing that connection as the argument to the function.
@ -872,8 +1074,27 @@ The value returned from ``await database.execute_write_fn(...)`` will be the ret
If your function raises an exception that exception will be propagated up to the ``await`` line.
By default your function will be executed inside a transaction. You can pass ``transaction=False`` to disable this behavior, though if you do that you should be careful to manually apply transactions - ideally using the ``with conn:`` pattern, or you may see ``OperationalError: database table is locked`` errors.
If you specify ``block=False`` the method becomes fire-and-forget, queueing your function to be executed and then allowing your code after the call to ``.execute_write_fn()`` to continue running while the underlying thread waits for an opportunity to run your function. A UUID representing the queued task will be returned. Any exceptions in your code will be silently swallowed.
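A sketch of combining ``transaction=False`` with manual transaction management, using a hypothetical ``log_entries`` table:
.. code-block:: python

    def bulk_delete(conn):
        # We opted out of the automatic transaction, so manage one ourselves
        with conn:
            conn.execute("delete from log_entries where id < ?", [1000])
        return conn.execute("select count(*) from log_entries").fetchone()[0]


    remaining = await db.execute_write_fn(bulk_delete, transaction=False)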
.. _database_execute_isolated_fn:
await db.execute_isolated_fn(fn)
--------------------------------
This method is similar to :ref:`execute_write_fn() <database_execute_write_fn>` but executes the provided function in an entirely isolated SQLite connection, which is opened, used and then closed again in a single call to this method.
The :ref:`prepare_connection() <plugin_hook_prepare_connection>` plugin hook is not executed against this connection.
This allows plugins to execute database operations that might conflict with how database connections are usually configured. For example, running a ``VACUUM`` operation while bypassing any restrictions placed by the `datasette-sqlite-authorizer <https://github.com/datasette/datasette-sqlite-authorizer>`__ plugin.
Plugins can also use this method to load potentially dangerous SQLite extensions, use them to perform an operation and then have them safely unloaded at the end of the call, without risk of exposing them to other connections.
Functions run using ``execute_isolated_fn()`` share the same queue as ``execute_write_fn()``, which guarantees that no writes can be executed at the same time as the isolated function is executing.
The return value of the function will be returned by this method. Any exceptions raised by the function will be raised out of the ``await`` line as well.
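The ``VACUUM`` example mentioned above might look like this:
.. code-block:: python

    def vacuum(conn):
        # Runs against a dedicated connection, opened and closed for this call
        conn.execute("VACUUM")


    await db.execute_isolated_fn(vacuum)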
.. _database_close:
db.close()
@ -909,6 +1130,9 @@ The ``Database`` class also provides properties and methods for introspecting th
``await db.table_exists(table)`` - boolean
Check if a table called ``table`` exists.
``await db.view_exists(view)`` - boolean
Check if a view called ``view`` exists.
``await db.table_names()`` - list of strings
List of names of tables in the database.
@ -985,19 +1209,23 @@ You can selectively disable CSRF protection using the :ref:`plugin_hook_skip_csr
.. _internals_internal:
Datasette's internal database
=============================
.. warning::
This API should be considered unstable - the structure of these tables may change prior to the release of Datasette 1.0.
Datasette maintains an "internal" SQLite database used for configuration, caching, and storage. Plugins can store configuration, settings, and other data inside this database. By default, Datasette will use a temporary in-memory SQLite database as the internal database, which is created at startup and destroyed at shutdown. Users of Datasette can optionally pass in a ``--internal`` flag to specify the path to a SQLite database to use as the internal database, which will persist internal data across Datasette instances.
Datasette maintains tables called ``catalog_databases``, ``catalog_tables``, ``catalog_columns``, ``catalog_indexes``, ``catalog_foreign_keys`` with details of the attached databases and their schemas. These tables should not be considered a stable API - they may change between Datasette releases.
The internal database is not exposed in the Datasette application by default, which means private data can safely be stored without worry of accidentally leaking information through the default Datasette interface and API. However, other plugins do have full read and write access to the internal database.
Plugins can access this database by calling ``internal_db = datasette.get_internal_database()`` and then executing queries using the :ref:`Database API <internals_database>`.
Plugin authors are asked to practice good etiquette when using the internal database, as all plugins use the same database to store data. For example (a short sketch follows this list):
1. Use a unique prefix when creating tables, indices, and triggers in the internal database. If your plugin is called ``datasette-xyz``, then prefix names with ``datasette_xyz_*``.
2. Avoid long-running write statements that may stall or block other plugins that are trying to write at the same time.
3. Use temporary tables or shared in-memory attached databases when possible.
4. Avoid implementing features that could expose private data stored in the internal database by other plugins.
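For example, a hypothetical plugin called ``datasette-xyz`` following the first rule might initialize its storage like this:
.. code-block:: python

    internal_db = datasette.get_internal_database()
    # Table name is prefixed with the plugin name, per the etiquette above
    await internal_db.execute_write(
        """
        create table if not exists datasette_xyz_settings (
            key text primary key,
            value text
        )
        """
    )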
.. _internals_utils:
@ -1006,7 +1234,7 @@ The datasette.utils module
The ``datasette.utils`` module contains various utility functions used by Datasette. As a general rule you should consider anything in this module to be unstable - functions and classes here could change without warning or be removed entirely between Datasette releases, without being mentioned in the release notes.
The exception to this rule is anything that is documented here. If you find a need for an undocumented utility function in your own work, consider `opening an issue <https://github.com/simonw/datasette/issues/new>`__ requesting that the function you are using be upgraded to documented and supported status.
.. _internals_utils_parse_metadata:
@ -1028,6 +1256,15 @@ Utility function for calling ``await`` on a return value if it is awaitable, oth
.. autofunction:: datasette.utils.await_me_maybe
.. _internals_utils_derive_named_parameters:
derive_named_parameters(db, sql)
--------------------------------
Derive the list of named parameters referenced in a SQL query, using an ``explain`` query executed against the provided database.
.. autofunction:: datasette.utils.derive_named_parameters
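A sketch of calling it, assuming ``db`` is a :ref:`Database <internals_database>` instance:
.. code-block:: python

    from datasette.utils import derive_named_parameters

    params = await derive_named_parameters(
        db, "select * from repos where owner = :owner and stars > :stars"
    )
    # params is now ["owner", "stars"]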
.. _internals_tilde_encoding:
Tilde encoding
@ -1130,6 +1367,7 @@ This example uses the :ref:`register_routes() <plugin_register_routes>` plugin h
(r"/parallel-queries$", parallel_queries),
]
Note that running parallel SQL queries in this way has `been known to cause problems in the past <https://github.com/simonw/datasette/issues/2189>`__, so treat this example with caution.
Adding ``?_trace=1`` will show that the trace covers both of those child tasks.

View file

@ -87,7 +87,7 @@ Shows a list of currently installed plugins and their versions. `Plugins example
Add ``?all=1`` to include details of the default plugins baked into Datasette.
.. _JsonDataView_settings:
/-/settings
-----------
@ -105,6 +105,25 @@ Shows the :ref:`settings` for this instance of Datasette. `Settings example <htt
"sql_time_limit_ms": 1000
}
.. _JsonDataView_config:
/-/config
---------
Shows the :ref:`configuration <configuration>` for this instance of Datasette. This is generally the contents of the :ref:`datasette.yaml or datasette.json <configuration_reference>` file, which can include plugin configuration as well. `Config example <https://latest.datasette.io/-/config>`_:
.. code-block:: json
{
"settings": {
"template_debug": true,
"trace_debug": true,
"force_https_urls": true
}
}
Any keys that include one of the following substrings in their names will be returned as redacted ``***`` output, to help avoid accidentally leaking private configuration information: ``secret``, ``key``, ``password``, ``token``, ``hash``, ``dsn``.
.. _JsonDataView_databases:
/-/databases

View file

@ -0,0 +1,159 @@
.. _javascript_plugins:
JavaScript plugins
==================
Datasette can run custom JavaScript in several different ways:
- Datasette plugins written in Python can use the :ref:`extra_js_urls() <plugin_hook_extra_js_urls>` or :ref:`extra_body_script() <plugin_hook_extra_body_script>` plugin hooks to inject JavaScript into a page
- Datasette instances with :ref:`custom templates <customization_custom_templates>` can include additional JavaScript in those templates
- The ``extra_js_urls`` key in ``datasette.yaml`` :ref:`can be used to include extra JavaScript <configuration_reference_css_js>`
There are no limitations on what this JavaScript can do. It is executed directly by the browser, so it can manipulate the DOM, fetch additional data and do anything else that JavaScript is capable of.
.. warning::
Custom JavaScript has security implications, especially for authenticated Datasette instances where the JavaScript might run in the context of the authenticated user. It's important to carefully review any JavaScript you run in your Datasette instance.
.. _javascript_datasette_init:
The datasette_init event
------------------------
Datasette emits a custom event called ``datasette_init`` when the page is loaded. This event is dispatched on the ``document`` object, and includes a ``detail`` object with a reference to the :ref:`datasetteManager <javascript_datasette_manager>` object.
Your JavaScript code can listen out for this event using ``document.addEventListener()`` like this:
.. code-block:: javascript
document.addEventListener("datasette_init", function (evt) {
const manager = evt.detail;
console.log("Datasette version:", manager.VERSION);
});
.. _javascript_datasette_manager:
datasetteManager
----------------
The ``datasetteManager`` object provides the following:
``VERSION`` - string
The version of Datasette
``plugins`` - ``Map()``
A Map of currently loaded plugin names to plugin implementations
``registerPlugin(name, implementation)``
Call this to register a plugin, passing its name and implementation
``selectors`` - object
An object providing named aliases to useful CSS selectors, :ref:`listed below <javascript_datasette_manager_selectors>`
.. _javascript_plugin_objects:
JavaScript plugin objects
-------------------------
JavaScript plugins are blocks of code that can be registered with Datasette using the ``registerPlugin()`` method on the :ref:`datasetteManager <javascript_datasette_manager>` object.
The ``implementation`` object passed to this method should include a ``version`` key defining the plugin version, and one or more of the following named functions providing the implementation of the plugin:
.. _javascript_plugins_makeAboveTablePanelConfigs:
makeAboveTablePanelConfigs()
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This method should return a JavaScript array of objects defining additional panels to be added to the top of the table page. Each object should have the following:
``id`` - string
A unique string ID for the panel, for example ``map-panel``
``label`` - string
A human-readable label for the panel
``render(node)`` - function
A function that will be called with a DOM node to render the panel into
This example shows how a plugin might define a single panel:
.. code-block:: javascript
document.addEventListener('datasette_init', function(ev) {
ev.detail.registerPlugin('panel-plugin', {
version: 0.1,
makeAboveTablePanelConfigs: () => {
return [
{
id: 'first-panel',
label: 'First panel',
render: node => {
node.innerHTML = '<h2>My custom panel</h2><p>This is a custom panel that I added using a JavaScript plugin</p>';
}
}
]
}
});
});
When a page with a table loads, all registered plugins that implement ``makeAboveTablePanelConfigs()`` will be called and panels they return will be added to the top of the table page.
.. _javascript_plugins_makeColumnActions:
makeColumnActions(columnDetails)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This method, if present, will be called when Datasette is rendering the cog action menu icons that appear at the top of the table view. By default these include options like "Sort ascending/descending" and "Facet by this", but plugins can return additional actions to be included in this menu.
The method will be called with a ``columnDetails`` object with the following keys:
``columnName`` - string
The name of the column
``columnNotNull`` - boolean
True if the column is defined as NOT NULL
``columnType`` - string
The SQLite data type of the column
``isPk`` - boolean
True if the column is part of the primary key
It should return a JavaScript array of objects each with a ``label`` and ``onClick`` property:
``label`` - string
The human-readable label for the action
``onClick(evt)`` - function
A function that will be called when the action is clicked
The ``evt`` object passed to the ``onClick`` is the standard browser event object that triggered the click.
This example plugin adds two menu items - one to copy the column name to the clipboard and another that displays the column metadata in an ``alert()`` window:
.. code-block:: javascript
document.addEventListener('datasette_init', function(ev) {
ev.detail.registerPlugin('column-name-plugin', {
version: 0.1,
makeColumnActions: (columnDetails) => {
return [
{
label: 'Copy column to clipboard',
onClick: async (evt) => {
await navigator.clipboard.writeText(columnDetails.columnName)
}
},
{
label: 'Alert column metadata',
onClick: () => alert(JSON.stringify(columnDetails, null, 2))
}
];
}
});
});
.. _javascript_datasette_manager_selectors:
Selectors
---------
These are available on the ``selectors`` property of the :ref:`javascript_datasette_manager` object.
.. literalinclude:: ../datasette/static/datasette-manager.js
:language: javascript
:start-at: const DOM_SELECTORS = {
:end-at: };

View file

@ -9,105 +9,99 @@ through the Datasette user interface can also be accessed as JSON via the API.
To access the API for a page, either click on the ``.json`` link on that page or
edit the URL and add a ``.json`` extension to it.
.. _json_api_default:
Default representation
----------------------
The default JSON representation of data from a SQLite table or custom query
looks like this:
.. code-block:: json
{
"ok": true,
"rows": [
{
"id": 3,
"name": "Detroit"
},
{
"id": 2,
"name": "Los Angeles"
},
{
"id": 4,
"name": "Memnonia"
},
{
"id": 1,
"name": "San Francisco"
}
],
"truncated": false
}
``"ok"`` is always ``true`` if an error did not occur.
The ``"rows"`` key is a list of objects, each one representing a row.
The ``"truncated"`` key lets you know if the query was truncated. This can happen if a SQL query returns more than 1,000 results (or the :ref:`setting_max_returned_rows` setting).
For table pages, an additional key ``"next"`` may be present. This indicates that the next page in the pagination set can be retrieved using ``?_next=VALUE``.
.. _json_api_shapes:
Different shapes
----------------
The ``_shape`` parameter can be used to access alternative formats for the
``rows`` key which may be more convenient for your application. There are
several options:
* ``?_shape=objects`` - ``"rows"`` is a list of JSON key/value objects - the default
* ``?_shape=arrays`` - ``"rows"`` is a list of lists, where the order of values in each list matches the order of the columns
* ``?_shape=array`` - a JSON array of objects - effectively just the ``"rows"`` key from the default representation
* ``?_shape=array&_nl=on`` - a newline-separated list of JSON objects
* ``?_shape=arrayfirst`` - a flat JSON array containing just the first value from each row
* ``?_shape=object`` - a JSON object keyed using the primary keys of the rows
``_shape=arrays`` looks like this:
.. code-block:: json
{
"database": "sf-trees",
...
"rows": [
{
"id": 1,
"value": "Myoporum laetum :: Myoporum"
},
{
"id": 2,
"value": "Metrosideros excelsa :: New Zealand Xmas Tree"
},
{
"id": 3,
"value": "Pinus radiata :: Monterey Pine"
}
]
"ok": true,
"next": null,
"rows": [
[3, "Detroit"],
[2, "Los Angeles"],
[4, "Memnonia"],
[1, "San Francisco"]
]
}
``_shape=array`` looks like this:
.. code-block:: json
[
{
"id": 3,
"name": "Detroit"
},
{
"id": 2,
"name": "Los Angeles"
},
{
"id": 4,
"name": "Memnonia"
},
{
"id": 1,
"name": "San Francisco"
}
]
``_shape=array&_nl=on`` looks like this::
@ -116,25 +110,29 @@ options:
{"id": 2, "value": "Metrosideros excelsa :: New Zealand Xmas Tree"}
{"id": 3, "value": "Pinus radiata :: Monterey Pine"}
``_shape=arrayfirst`` looks like this:
.. code-block:: json
[1, 2, 3]
``_shape=object`` looks like this:
.. code-block:: json
{
"1": {
"id": 1,
"value": "Myoporum laetum :: Myoporum"
},
"2": {
"id": 2,
"value": "Metrosideros excelsa :: New Zealand Xmas Tree"
},
"3": {
"id": 3,
"value": "Pinus radiata :: Monterey Pine"
}
"1": {
"id": 1,
"value": "Myoporum laetum :: Myoporum"
},
"2": {
"id": 2,
"value": "Metrosideros excelsa :: New Zealand Xmas Tree"
},
"3": {
"id": 3,
"value": "Pinus radiata :: Monterey Pine"
}
}
The ``object`` shape is only available for queries against tables - custom SQL
@ -239,6 +237,9 @@ You can filter the data returned by the table based on column values using a que
``?column__contains=value``
Rows where the string column contains the specified value (``column like "%value%"`` in SQL).
``?column__notcontains=value``
Rows where the string column does not contain the specified value (``column not like "%value%"`` in SQL).
``?column__endswith=value``
Rows where the string column ends with the specified value (``column like "%value"`` in SQL).
@ -357,8 +358,8 @@ Special table arguments
Some examples:
* `facetable?_where=_neighborhood like "%c%"&_where=_city_id=3 <https://latest.datasette.io/fixtures/facetable?_where=_neighborhood%20like%20%22%c%%22&_where=_city_id=3>`__
* `facetable?_where=_city_id in (select id from facet_cities where name != "Detroit") <https://latest.datasette.io/fixtures/facetable?_where=_city_id%20in%20(select%20id%20from%20facet_cities%20where%20name%20!=%20%22Detroit%22)>`__
``?_through={json}``
This can be used to filter rows via a join against another table.
@ -415,7 +416,9 @@ column - you can turn that off using ``?_labels=off``.
You can request foreign keys be expanded in JSON using the ``_labels=on`` or
``_label=COLUMN`` special query string parameters. Here's what an expanded row
looks like:
.. code-block:: json
[
{
@ -455,3 +458,522 @@ You can find this near the top of the source code of those pages, looking like t
The JSON URL is also made available in a ``Link`` HTTP header for the page::
Link: https://latest.datasette.io/fixtures/sortable.json; rel="alternate"; type="application/json+datasette"
.. _json_api_cors:
Enabling CORS
-------------
If you start Datasette with the ``--cors`` option, each JSON endpoint will be
served with the following additional HTTP headers:
.. [[[cog
from datasette.utils import add_cors_headers
import textwrap
headers = {}
add_cors_headers(headers)
output = "\n".join("{}: {}".format(k, v) for k, v in headers.items())
cog.out("\n::\n\n")
cog.out(textwrap.indent(output, '    '))
cog.out("\n\n")
.. ]]]
::
Access-Control-Allow-Origin: *
Access-Control-Allow-Headers: Authorization, Content-Type
Access-Control-Expose-Headers: Link
Access-Control-Allow-Methods: GET, POST, HEAD, OPTIONS
Access-Control-Max-Age: 3600
.. [[[end]]]
This allows JavaScript running on any domain to make cross-origin
requests to interact with the Datasette API.
If you start Datasette without the ``--cors`` option only JavaScript running on
the same domain as Datasette will be able to access the API.
Here's how to serve ``data.db`` with CORS enabled::
datasette data.db --cors
.. _json_api_write:
The JSON write API
------------------
Datasette provides a write API for JSON data. This is a POST-only API that requires an authenticated API token, see :ref:`CreateTokenView`. The token will need to have the specified :ref:`authentication_permissions`.
.. _TableInsertView:
Inserting rows
~~~~~~~~~~~~~~
This requires the :ref:`permissions_insert_row` permission.
A single row can be inserted using the ``"row"`` key:
::
POST /<database>/<table>/-/insert
Content-Type: application/json
Authorization: Bearer dstok_<rest-of-token>
.. code-block:: json
{
"row": {
"column1": "value1",
"column2": "value2"
}
}
If successful, this will return a ``201`` status code and the newly inserted row, for example:
.. code-block:: json
{
"rows": [
{
"id": 1,
"column1": "value1",
"column2": "value2"
}
]
}
To insert multiple rows at a time, use the same API method but send a list of dictionaries as the ``"rows"`` key:
::
POST /<database>/<table>/-/insert
Content-Type: application/json
Authorization: Bearer dstok_<rest-of-token>
.. code-block:: json
{
"rows": [
{
"column1": "value1",
"column2": "value2"
},
{
"column1": "value3",
"column2": "value4"
}
]
}
If successful, this will return a ``201`` status code and a ``{"ok": true}`` response body.
The maximum number of rows that can be submitted at once defaults to 100, but this can be changed using the :ref:`setting_max_insert_rows` setting.
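A minimal sketch of calling this endpoint from Python using the ``httpx`` library - the database, table and token here are all placeholders:
.. code-block:: python

    import httpx

    response = httpx.post(
        "http://127.0.0.1:8001/data/creatures/-/insert",
        json={"rows": [{"name": "Spider"}, {"name": "Moth"}]},
        headers={"Authorization": "Bearer dstok_..."},
    )
    print(response.status_code, response.json())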
To return the newly inserted rows, add the ``"return": true`` key to the request body:
.. code-block:: json
{
"rows": [
{
"column1": "value1",
"column2": "value2"
},
{
"column1": "value3",
"column2": "value4"
}
],
"return": true
}
This will return the same ``"rows"`` key as the single row example above. There is a small performance penalty for using this option.
If any of your rows have a primary key that is already in use, you will get an error and none of the rows will be inserted:
.. code-block:: json
{
"ok": false,
"errors": [
"UNIQUE constraint failed: new_table.id"
]
}
Pass ``"ignore": true`` to ignore these errors and insert the other rows:
.. code-block:: json
{
"rows": [
{
"id": 1,
"column1": "value1",
"column2": "value2"
},
{
"id": 2,
"column1": "value3",
"column2": "value4"
}
],
"ignore": true
}
Or you can pass ``"replace": true`` to replace any rows with conflicting primary keys with the new values. This requires the :ref:`permissions_update_row` permission.
Pass ``"alter: true`` to automatically add any missing columns to the table. This requires the :ref:`permissions_alter_table` permission.
.. _TableUpsertView:
Upserting rows
~~~~~~~~~~~~~~
An upsert is an insert or update operation. If a row with a matching primary key already exists it will be updated - otherwise a new row will be inserted.
The upsert API is mostly the same shape as the :ref:`insert API <TableInsertView>`. It requires both the :ref:`permissions_insert_row` and :ref:`permissions_update_row` permissions.
::
POST /<database>/<table>/-/upsert
Content-Type: application/json
Authorization: Bearer dstok_<rest-of-token>
.. code-block:: json
{
"rows": [
{
"id": 1,
"title": "Updated title for 1",
"description": "Updated description for 1"
},
{
"id": 2,
"description": "Updated description for 2",
},
{
"id": 3,
"title": "Item 3",
"description": "Description for 3"
}
]
}
Imagine a table with a primary key of ``id`` that already has rows with ``id`` values of ``1`` and ``2``.
The above example will:
- Update the row with ``id`` of ``1`` to set both ``title`` and ``description`` to the new values
- Update the row with ``id`` of ``2`` to set ``title`` to the new value - ``description`` will be left unchanged
- Insert a new row with ``id`` of ``3`` and both ``title`` and ``description`` set to the new values
Similar to ``/-/insert``, a ``row`` key with an object can be used instead of a ``rows`` array to upsert a single row.
If successful, this will return a ``200`` status code and a ``{"ok": true}`` response body.
Add ``"return": true`` to the request body to return full copies of the affected rows after they have been inserted or updated:
.. code-block:: json
{
"rows": [
{
"id": 1,
"title": "Updated title for 1",
"description": "Updated description for 1"
},
{
"id": 2,
"description": "Updated description for 2",
},
{
"id": 3,
"title": "Item 3",
"description": "Description for 3"
}
],
"return": true
}
This will return the following:
.. code-block:: json
{
"ok": true,
"rows": [
{
"id": 1,
"title": "Updated title for 1",
"description": "Updated description for 1"
},
{
"id": 2,
"title": "Item 2",
"description": "Updated description for 2"
},
{
"id": 3,
"title": "Item 3",
"description": "Description for 3"
}
]
}
When using upsert you must provide the primary key column (or columns if the table has a compound primary key) for every row, or you will get a ``400`` error:
.. code-block:: json
{
"ok": false,
"errors": [
"Row 0 is missing primary key column(s): \"id\""
]
}
If your table does not have an explicit primary key you should pass the SQLite ``rowid`` key instead.
Pass ``"alter: true`` to automatically add any missing columns to the table. This requires the :ref:`permissions_alter_table` permission.
.. _RowUpdateView:
Updating a row
~~~~~~~~~~~~~~
To update a row, make a ``POST`` to ``/<database>/<table>/<row-pks>/-/update``. This requires the :ref:`permissions_update_row` permission.
::
POST /<database>/<table>/<row-pks>/-/update
Content-Type: application/json
Authorization: Bearer dstok_<rest-of-token>
.. code-block:: json
{
"update": {
"text_column": "New text string",
"integer_column": 3,
"float_column": 3.14
}
}
``<row-pks>`` here is the :ref:`tilde-encoded <internals_tilde_encoding>` primary key value of the row to update - or a comma-separated list of primary key values if the table has a composite primary key.
You only need to pass the columns you want to update. Any other columns will be left unchanged.
If successful, this will return a ``200`` status code and a ``{"ok": true}`` response body.
Add ``"return": true`` to the request body to return the updated row:
.. code-block:: json
{
"update": {
"title": "New title"
},
"return": true
}
The returned JSON will look like this:
.. code-block:: json
{
"ok": true,
"row": {
"id": 1,
"title": "New title",
"other_column": "Will be present here too"
}
}
Any errors will return ``{"errors": ["... descriptive message ..."], "ok": false}``, and a ``400`` status code for a bad input or a ``403`` status code for an authentication or permission error.
Pass ``"alter: true`` to automatically add any missing columns to the table. This requires the :ref:`permissions_alter_table` permission.
.. _RowDeleteView:
Deleting a row
~~~~~~~~~~~~~~
To delete a row, make a ``POST`` to ``/<database>/<table>/<row-pks>/-/delete``. This requires the :ref:`permissions_delete_row` permission.
::
POST /<database>/<table>/<row-pks>/-/delete
Content-Type: application/json
Authorization: Bearer dstok_<rest-of-token>
``<row-pks>`` here is the :ref:`tilde-encoded <internals_tilde_encoding>` primary key value of the row to delete - or a comma-separated list of primary key values if the table has a composite primary key.
If successful, this will return a ``200`` status code and a ``{"ok": true}`` response body.
Any errors will return ``{"errors": ["... descriptive message ..."], "ok": false}``, and a ``400`` status code for a bad input or a ``403`` status code for an authentication or permission error.
.. _TableCreateView:
Creating a table
~~~~~~~~~~~~~~~~
To create a table, make a ``POST`` to ``/<database>/-/create``. This requires the :ref:`permissions_create_table` permission.
::
POST /<database>/-/create
Content-Type: application/json
Authorization: Bearer dstok_<rest-of-token>
.. code-block:: json
{
"table": "name_of_new_table",
"columns": [
{
"name": "id",
"type": "integer"
},
{
"name": "title",
"type": "text"
}
],
"pk": "id"
}
The JSON here describes the table that will be created:
* ``table`` is the name of the table to create. This field is required.
* ``columns`` is a list of columns to create. Each column is a dictionary with ``name`` and ``type`` keys.
- ``name`` is the name of the column. This is required.
- ``type`` is the type of the column. This is optional - if not provided, ``text`` will be assumed. The valid types are ``text``, ``integer``, ``float`` and ``blob``.
* ``pk`` is the primary key for the table. This is optional - if not provided, Datasette will create a SQLite table with a hidden ``rowid`` column.
If the primary key is an integer column, it will be configured to automatically increment for each new record.
If you set this to ``id`` without including an ``id`` column in the list of ``columns``, Datasette will create an auto-incrementing integer ID column for you.
* ``pks`` can be used instead of ``pk`` to create a compound primary key. It should be a JSON list of column names to use in that primary key.
* ``ignore`` can be set to ``true`` to ignore existing rows by primary key if the table already exists.
* ``replace`` can be set to ``true`` to replace existing rows by primary key if the table already exists. This requires the :ref:`permissions_update_row` permission.
* ``alter`` can be set to ``true`` if you want to automatically add any missing columns to the table. This requires the :ref:`permissions_alter_table` permission.
If the table is successfully created this will return a ``201`` status code and the following response:
.. code-block:: json
{
"ok": true,
"database": "data",
"table": "name_of_new_table",
"table_url": "http://127.0.0.1:8001/data/name_of_new_table",
"table_api_url": "http://127.0.0.1:8001/data/name_of_new_table.json",
"schema": "CREATE TABLE [name_of_new_table] (\n [id] INTEGER PRIMARY KEY,\n [title] TEXT\n)"
}
.. _TableCreateView_example:
Creating a table from example data
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Instead of specifying ``columns`` directly you can instead pass a single example ``row`` or a list of ``rows``.
Datasette will create a table with a schema that matches those rows and insert them for you:
::
POST /<database>/-/create
Content-Type: application/json
Authorization: Bearer dstok_<rest-of-token>
.. code-block:: json
{
"table": "creatures",
"rows": [
{
"id": 1,
"name": "Tarantula"
},
{
"id": 2,
"name": "Kākāpō"
}
],
"pk": "id"
}
Doing this requires both the :ref:`permissions_create_table` and :ref:`permissions_insert_row` permissions.
The ``201`` response here will be similar to the ``columns`` form, but will also include the number of rows that were inserted as ``row_count``:
.. code-block:: json
{
"ok": true,
"database": "data",
"table": "creatures",
"table_url": "http://127.0.0.1:8001/data/creatures",
"table_api_url": "http://127.0.0.1:8001/data/creatures.json",
"schema": "CREATE TABLE [creatures] (\n [id] INTEGER PRIMARY KEY,\n [name] TEXT\n)",
"row_count": 2
}
You can call the create endpoint multiple times for the same table provided you specify the table using the ``rows`` or ``row`` option. New rows will be inserted into the table each time. This means you can use this API if you are unsure if the relevant table has been created yet.
If you pass a row to the create endpoint with a primary key that already exists you will get an error that looks like this:
.. code-block:: json
{
"ok": false,
"errors": [
"UNIQUE constraint failed: creatures.id"
]
}
You can avoid this error by passing the same ``"ignore": true`` or ``"replace": true`` options to the create endpoint as you can to the :ref:`insert endpoint <TableInsertView>`.
To use the ``"replace": true`` option you will also need the :ref:`permissions_update_row` permission.
Pass ``"alter": true`` to automatically add any missing columns to the existing table that are present in the rows you are submitting. This requires the :ref:`permissions_alter_table` permission.
.. _TableDropView:
Dropping tables
~~~~~~~~~~~~~~~
To drop a table, make a ``POST`` to ``/<database>/<table>/-/drop``. This requires the :ref:`permissions_drop_table` permission.
::
POST /<database>/<table>/-/drop
Content-Type: application/json
Authorization: Bearer dstok_<rest-of-token>
Without a POST body this will return a status ``200`` with a note about how many rows will be deleted:
.. code-block:: json
{
"ok": true,
"database": "<database>",
"table": "<table>",
"row_count": 5,
"message": "Pass \"confirm\": true to confirm"
}
If you pass the following POST body:
.. code-block:: json
{
"confirm": true
}
Then the table will be dropped and a status ``200`` response of ``{"ok": true}`` will be returned.
Any errors will return ``{"errors": ["... descriptive message ..."], "ok": false}``, and a ``400`` status code for a bad input or a ``403`` status code for an authentication or permission error.

View file

@ -4,27 +4,56 @@ Metadata
========
Data loves metadata. Any time you run Datasette you can optionally include a
YAML or JSON file with metadata about your databases and tables. Datasette will then
display that information in the web UI.
Run Datasette like this::
datasette database1.db database2.db --metadata metadata.yaml
Your ``metadata.yaml`` file can look something like this:
.. [[[cog
from metadata_doc import metadata_example
metadata_example(cog, {
"title": "Custom title for your index page",
"description": "Some description text can go here",
"license": "ODbL",
"license_url": "https://opendatacommons.org/licenses/odbl/",
"source": "Original Data Source",
"source_url": "http://example.com/"
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
title: Custom title for your index page
description: Some description text can go here
license: ODbL
license_url: https://opendatacommons.org/licenses/odbl/
source: Original Data Source
source_url: http://example.com/
.. tab:: metadata.json
.. code-block:: json
{
"title": "Custom title for your index page",
"description": "Some description text can go here",
"license": "ODbL",
"license_url": "https://opendatacommons.org/licenses/odbl/",
"source": "Original Data Source",
"source_url": "http://example.com/"
}
.. [[[end]]]
Choosing YAML over JSON adds support for multi-line strings and comments.
The above metadata will be displayed on the index page of your Datasette-powered
site. The source and license information will also be included in the footer of
@ -37,15 +66,14 @@ instead.
Per-database and per-table metadata
-----------------------------------
Metadata at the top level of the file will be shown on the index page and in the
footer on every page of the site. The license and source is expected to apply to
all of your data.
You can also provide metadata at the per-database or per-table level, like this:
.. [[[cog
metadata_example(cog, {
"databases": {
"database1": {
"source": "Alternative source",
@ -59,7 +87,45 @@ You can also provide metadata at the per-database or per-table level, like this:
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
database1:
source: Alternative source
source_url: http://example.com/
tables:
example_table:
description_html: Custom <em>table</em> description
license: CC BY 3.0 US
license_url: https://creativecommons.org/licenses/by/3.0/us/
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"database1": {
"source": "Alternative source",
"source_url": "http://example.com/",
"tables": {
"example_table": {
"description_html": "Custom <em>table</em> description",
"license": "CC BY 3.0 US",
"license_url": "https://creativecommons.org/licenses/by/3.0/us/"
}
}
}
}
}
.. [[[end]]]
Each of the top-level metadata fields can be used at the database and table level.
@ -85,9 +151,8 @@ Column descriptions
You can include descriptions for your columns by adding a ``"columns": {"name-of-column": "description-of-column"}`` block to your table metadata:
.. [[[cog
metadata_example(cog, {
"databases": {
"database1": {
"tables": {
@ -100,7 +165,41 @@ You can include descriptions for your columns by adding a ``"columns": {"name-of
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
database1:
tables:
example_table:
columns:
column1: Description of column 1
column2: Description of column 2
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"database1": {
"tables": {
"example_table": {
"columns": {
"column1": "Description of column 1",
"column2": "Description of column 2"
}
}
}
}
}
}
.. [[[end]]]
These will be displayed at the top of the table page, and will also show in the cog menu for each column.
@ -114,9 +213,8 @@ values from that column. SI prefixes will be used where appropriate.
Column units are configured in the metadata like so:
.. [[[cog
metadata_example(cog, {
"databases": {
"database1": {
"tables": {
@ -129,19 +227,73 @@ Column units are configured in the metadata like so:
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
database1:
tables:
example_table:
units:
column1: metres
column2: Hz
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"database1": {
"tables": {
"example_table": {
"units": {
"column1": "metres",
"column2": "Hz"
}
}
}
}
}
}
.. [[[end]]]
Units are interpreted using Pint_, and you can see the full list of available units in
Pint's `unit registry`_. You can also add `custom units`_ to the metadata, which will be
registered with Pint:
.. [[[cog
metadata_example(cog, {
"custom_units": [
"decibel = [] = dB"
]
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
custom_units:
- decibel = [] = dB
.. tab:: metadata.json
.. code-block:: json
{
"custom_units": [
"decibel = [] = dB"
]
}
.. [[[end]]]
.. _Pint: https://pint.readthedocs.io/
.. _unit registry: https://github.com/hgrecco/pint/blob/master/pint/default_en.txt
@ -154,9 +306,8 @@ Setting a default sort order
By default Datasette tables are sorted by primary key. You can over-ride this default for a specific table using the ``"sort"`` or ``"sort_desc"`` metadata properties:
.. [[[cog
metadata_example(cog, {
"databases": {
"mydatabase": {
"tables": {
@ -166,13 +317,41 @@ By default Datasette tables are sorted by primary key. You can over-ride this de
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
mydatabase:
tables:
example_table:
sort: created
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"mydatabase": {
"tables": {
"example_table": {
"sort": "created"
}
}
}
}
}
.. [[[end]]]
Or use ``"sort_desc"`` to sort in descending order:
.. [[[cog
metadata_example(cog, {
"databases": {
"mydatabase": {
"tables": {
@ -182,18 +361,46 @@ Or use ``"sort_desc"`` to sort in descending order:
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
mydatabase:
tables:
example_table:
sort_desc: created
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"mydatabase": {
"tables": {
"example_table": {
"sort_desc": "created"
}
}
}
}
}
.. [[[end]]]
.. _metadata_page_size:
Setting a custom page size
--------------------------
Datasette defaults to displaying 100 rows per page, for both tables and views. You can change this default page size on a per-table or per-view basis using the ``"size"`` key in ``metadata.json``:
.. [[[cog
metadata_example(cog, {
"databases": {
"mydatabase": {
"tables": {
@ -203,7 +410,36 @@ Datasette defaults to displaing 100 rows per page, for both tables and views. Yo
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
mydatabase:
tables:
example_table:
size: 10
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"mydatabase": {
"tables": {
"example_table": {
"size": 10
}
}
}
}
}
.. [[[end]]]
This size can still be over-ridden by passing e.g. ``?_size=50`` in the query string.
@ -216,9 +452,8 @@ Datasette allows any column to be used for sorting by default. If you need to
control which columns are available for sorting you can do so using the optional
``sortable_columns`` key:
.. [[[cog
metadata_example(cog, {
"databases": {
"database1": {
"tables": {
@ -231,7 +466,41 @@ control which columns are available for sorting you can do so using the optional
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
database1:
tables:
example_table:
sortable_columns:
- height
- weight
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"database1": {
"tables": {
"example_table": {
"sortable_columns": [
"height",
"weight"
]
}
}
}
}
}
.. [[[end]]]
This will restrict sorting of ``example_table`` to just the ``height`` and
``weight`` columns.
@ -240,9 +509,8 @@ You can also disable sorting entirely by setting ``"sortable_columns": []``
You can use ``sortable_columns`` to enable specific sort orders for a view called ``name_of_view`` in the database ``my_database`` like so:
.. [[[cog
metadata_example(cog, {
"databases": {
"my_database": {
"tables": {
@ -255,7 +523,41 @@ You can use ``sortable_columns`` to enable specific sort orders for a view calle
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
my_database:
tables:
name_of_view:
sortable_columns:
- clicks
- impressions
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"my_database": {
"tables": {
"name_of_view": {
"sortable_columns": [
"clicks",
"impressions"
]
}
}
}
}
}
.. [[[end]]]
.. _label_columns:
@ -270,9 +572,8 @@ column should be used as the link label.
If your table has more than two columns you can specify which column should be
used for the link label with the ``label_column`` property:
.. [[[cog
metadata_example(cog, {
"databases": {
"database1": {
"tables": {
@ -282,7 +583,36 @@ used for the link label with the ``label_column`` property:
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
database1:
tables:
example_table:
label_column: title
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"database1": {
"tables": {
"example_table": {
"label_column": "title"
}
}
}
}
}
.. [[[end]]]
.. _metadata_hiding_tables:
@ -292,54 +622,106 @@ Hiding tables
You can hide tables from the database listing view (in the same way that FTS and
SpatiaLite tables are automatically hidden) using ``"hidden": true``:
.. [[[cog
metadata_example(cog, {
"databases": {
"database1": {
"tables": {
"example_table": {
"hidden": true
"hidden": True
}
}
}
}
})
.. ]]]
.. tab:: metadata.yaml
.. code-block:: yaml
databases:
database1:
tables:
example_table:
hidden: true
.. tab:: metadata.json
.. code-block:: json
{
"databases": {
"database1": {
"tables": {
"example_table": {
"hidden": true
}
}
}
}
}
.. [[[end]]]
.. _metadata_reference:
Metadata reference
------------------
A full reference of every supported option in a ``metadata.json`` or ``metadata.yaml`` file.
Top-level metadata
~~~~~~~~~~~~~~~~~~
"Top-level" metadata refers to fields that can be specified at the root level of a metadata file. These attributes are meant to describe the entire Datasette instance.
The following are the full list of allowed top-level metadata fields:
- ``title``
- ``description``
- ``description_html``
- ``license``
- ``license_url``
- ``source``
- ``source_url``
Database-level metadata
~~~~~~~~~~~~~~~~~~~~~~~
"Database-level" metadata refers to fields that can be specified for each database in a Datasette instance. These attributes should be listed under a database inside the `"databases"` field.
The following are the full list of allowed database-level metadata fields:
- ``source``
- ``source_url``
- ``license``
- ``license_url``
- ``about``
- ``about_url``
Table-level metadata
~~~~~~~~~~~~~~~~~~~~
"Table-level" metadata refers to fields that can be specified for each table in a Datasette instance. These attributes should be listed under a specific table using the `"tables"` field.
The following are the full list of allowed table-level metadata fields:
- ``source``
- ``source_url``
- ``license``
- ``license_url``
- ``about``
- ``about_url``
- ``hidden``
- ``sort/sort_desc``
- ``size``
- ``sortable_columns``
- ``label_column``
- ``facets``
- ``fts_table``
- ``fts_pk``
- ``searchmode``
- ``columns``

View file

@ -0,0 +1,42 @@
import json
import textwrap
from yaml import safe_dump
from ruamel.yaml import YAML


def metadata_example(cog, data=None, yaml=None):
    assert data or yaml, "Must provide data= or yaml="
    assert not (data and yaml), "Cannot use data= and yaml="
    output_yaml = None
    if yaml:
        # dedent it first
        yaml = textwrap.dedent(yaml).strip()
        data = YAML().load(yaml)
        output_yaml = yaml
    else:
        output_yaml = safe_dump(data, sort_keys=False)
    cog.out("\n.. tab:: metadata.yaml\n\n")
    cog.out("    .. code-block:: yaml\n\n")
    cog.out(textwrap.indent(output_yaml, "        "))
    cog.out("\n\n.. tab:: metadata.json\n\n")
    cog.out("    .. code-block:: json\n\n")
    cog.out(textwrap.indent(json.dumps(data, indent=2), "        "))
    cog.out("\n")


def config_example(
    cog, input, yaml_title="datasette.yaml", json_title="datasette.json"
):
    if type(input) is str:
        data = YAML().load(input)
        output_yaml = input
    else:
        data = input
        output_yaml = safe_dump(input, sort_keys=False)
    cog.out("\n.. tab:: {}\n\n".format(yaml_title))
    cog.out("    .. code-block:: yaml\n\n")
    cog.out(textwrap.indent(output_yaml, "        "))
    cog.out("\n\n.. tab:: {}\n\n".format(json_title))
    cog.out("    .. code-block:: json\n\n")
    cog.out(textwrap.indent(json.dumps(data, indent=2), "        "))
    cog.out("\n")

View file

@ -40,6 +40,21 @@ The JSON version of this page provides programmatic access to the underlying dat
* `fivethirtyeight.datasettes.com/fivethirtyeight.json <https://fivethirtyeight.datasettes.com/fivethirtyeight.json>`_
* `global-power-plants.datasettes.com/global-power-plants.json <https://global-power-plants.datasettes.com/global-power-plants.json>`_
.. _DatabaseView_hidden:
Hidden tables
-------------
Some tables listed on the database page are treated as hidden. Hidden tables are not completely invisible - they can be accessed through the "hidden tables" link at the bottom of the page. They are hidden because they represent low-level implementation details which are generally not useful to end-users of Datasette.
The following tables are hidden by default:
- Any table with a name that starts with an underscore - this is a Datasette convention to help plugins easily hide their own internal tables.
- Tables that have been configured as ``"hidden": true`` using :ref:`metadata_hiding_tables`.
- ``*_fts`` tables that implement SQLite full-text search indexes.
- Tables relating to the inner workings of the SpatiaLite SQLite extension.
- ``sqlite_stat`` tables used to store statistics used by the query optimizer.
.. _TableView:
Table
@ -70,10 +85,10 @@ Table cells with extremely long text contents are truncated on the table view ac
Rows which are the targets of foreign key references from other tables will show a link to a filtered search for all records that reference that row. Here's an example from the Registers of Members Interests database:
`../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001 <https://register-of-members-interests.datasettes.com/regmem/people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001>`_
Note that this URL includes the encoded primary key of the record.
Here's that same page as JSON:
- `../people/uk.org.publicwhip%2Fperson%2F10001.json <https://register-of-members-interests.datasettes.com/regmem/people/uk.org.publicwhip%2Fperson%2F10001.json>`_
+ `../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json <https://register-of-members-interests.datasettes.com/regmem/people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json>`_
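
That tilde encoding is available to plugin authors as well; a quick sketch, assuming the ``tilde_encode`` and ``tilde_decode`` helpers in ``datasette.utils``:

.. code-block:: python

    from datasette.utils import tilde_decode, tilde_encode

    # Periods and slashes in a primary key are escaped as ~2E and ~2F
    # so the key can be used safely as a URL path component
    encoded = tilde_encode("uk.org.publicwhip/person/10001")
    print(encoded)  # uk~2Eorg~2Epublicwhip~2Fperson~2F10001

    assert tilde_decode(encoded) == "uk.org.publicwhip/person/10001"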

This diff is too large to display.

View file

@ -25,7 +25,7 @@ Things you can do with plugins include:
* Customize how database values are rendered in the Datasette interface, for example
`datasette-render-binary <https://github.com/simonw/datasette-render-binary>`__ and
`datasette-pretty-json <https://github.com/simonw/datasette-pretty-json>`__.
- * Customize how Datasette's authentication and permissions systems work, for example `datasette-auth-tokens <https://github.com/simonw/datasette-auth-tokens>`__ and
+ * Customize how Datasette's authentication and permissions systems work, for example `datasette-auth-passwords <https://github.com/simonw/datasette-auth-passwords>`__ and
`datasette-permissions-sql <https://github.com/simonw/datasette-permissions-sql>`__.
.. _plugins_installing:
@ -51,7 +51,16 @@ This command can also be used to upgrade Datasette itself to the latest released
datasette install -U datasette
- These commands are thin wrappers around ``pip install`` and ``pip uninstall``, which ensure they run ``pip`` in the same virtual environment as Datasette itself.
You can install multiple plugins at once by listing them as lines in a ``requirements.txt`` file like this::
datasette-vega
datasette-cluster-map
Then pass that file to ``datasette install -r``::
datasette install -r requirements.txt
+ The ``install`` and ``uninstall`` commands are thin wrappers around ``pip install`` and ``pip uninstall``, which ensure that they run ``pip`` in the same virtual environment as Datasette itself.
One-off plugins using --plugins-dir
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@ -72,6 +81,60 @@ You can use the name of a package on PyPI or any of the other valid arguments to
datasette publish cloudrun mydb.db \
--install=https://url-to-my-package.zip
.. _plugins_datasette_load_plugins:
Controlling which plugins are loaded
------------------------------------
Datasette defaults to loading every plugin that is installed in the same virtual environment as Datasette itself.
You can set the ``DATASETTE_LOAD_PLUGINS`` environment variable to a comma-separated list of plugin names to load a controlled subset of plugins instead.
For example, to load just the ``datasette-vega`` and ``datasette-cluster-map`` plugins, set ``DATASETTE_LOAD_PLUGINS`` to ``datasette-vega,datasette-cluster-map``:
.. code-block:: bash
export DATASETTE_LOAD_PLUGINS='datasette-vega,datasette-cluster-map'
datasette mydb.db
Or:
.. code-block:: bash
DATASETTE_LOAD_PLUGINS='datasette-vega,datasette-cluster-map' \
datasette mydb.db
To disable the loading of all additional plugins, set ``DATASETTE_LOAD_PLUGINS`` to an empty string:
.. code-block:: bash
export DATASETTE_LOAD_PLUGINS=''
datasette mydb.db
A quick way to test this setting is to use it with the ``datasette plugins`` command:
.. code-block:: bash
DATASETTE_LOAD_PLUGINS='datasette-vega' datasette plugins
This should output the following:
.. code-block:: json
[
{
"name": "datasette-vega",
"static": true,
"templates": false,
"version": "0.6.2",
"hooks": [
"extra_css_urls",
"extra_js_urls"
]
}
]
.. _plugins_installed:
Seeing what plugins are installed
@ -81,7 +144,12 @@ You can see a list of installed plugins by navigating to the ``/-/plugins`` page
You can also use the ``datasette plugins`` command::
- $ datasette plugins
+ datasette plugins
Which outputs:
.. code-block:: json
[
{
"name": "datasette_json_html",
@ -98,7 +166,8 @@ You can also use the ``datasette plugins`` command::
cog.out("\n")
result = CliRunner().invoke(cli.cli, ["plugins", "--all"])
# cog.out() with text containing newlines was unindenting for some reason
cog.outl("If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette::\n")
cog.outl("If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette:\n")
cog.outl(".. code-block:: json\n")
plugins = [p for p in json.loads(result.output) if p["name"].startswith("datasette.")]
indented = textwrap.indent(json.dumps(plugins, indent=4), " ")
for line in indented.split("\n"):
@ -106,7 +175,9 @@ You can also use the ``datasette plugins`` command::
cog.out("\n\n")
.. ]]]
- If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette::
+ If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette:
+ .. code-block:: json
[
{
@ -151,7 +222,19 @@ If you run ``datasette plugins --all`` it will include default plugins that ship
"templates": false,
"version": null,
"hooks": [
"permission_allowed"
"actor_from_request",
"permission_allowed",
"register_permissions",
"skip_csrf"
]
},
{
"name": "datasette.events",
"static": false,
"templates": false,
"version": null,
"hooks": [
"register_events"
]
},
{
@ -224,18 +307,34 @@ If you run ``datasette plugins --all`` it will include default plugins that ship
You can add the ``--plugins-dir=`` option to include any plugins found in that directory.
Add ``--requirements`` to output a list of installed plugins that can then be installed in another Datasette instance using ``datasette install -r requirements.txt``::
datasette plugins --requirements
The output will look something like this::
datasette-codespaces==0.1.1
datasette-graphql==2.2
datasette-json-html==1.0.1
datasette-pretty-json==0.2.2
datasette-x-forwarded-host==0.1
To write that to a ``requirements.txt`` file, run this::
datasette plugins --requirements > requirements.txt
.. _plugins_configuration:
Plugin configuration
--------------------
- Plugins can have their own configuration, embedded in a :ref:`metadata` file. Configuration options for plugins live within a ``"plugins"`` key in that file, which can be included at the root, database or table level.
+ Plugins can have their own configuration, embedded in a :ref:`configuration file <configuration>`. Configuration options for plugins live within a ``"plugins"`` key in that file, which can be included at the root, database or table level.
Here is an example of some plugin configuration for a specific table:
- .. code-block:: json
- {
+ .. [[[cog
+     from metadata_doc import config_example
+     config_example(cog, {
"databases": {
"sf-trees": {
"tables": {
@ -250,7 +349,44 @@ Here is an example of some plugin configuration for a specific table:
}
}
}
- }
+ })
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
databases:
sf-trees:
tables:
Street_Tree_List:
plugins:
datasette-cluster-map:
latitude_column: lat
longitude_column: lng
.. tab:: datasette.json
.. code-block:: json
{
"databases": {
"sf-trees": {
"tables": {
"Street_Tree_List": {
"plugins": {
"datasette-cluster-map": {
"latitude_column": "lat",
"longitude_column": "lng"
}
}
}
}
}
}
}
.. [[[end]]]
This tells the ``datasette-cluster-map`` plugin which latitude and longitude columns should be used for a table called ``Street_Tree_List`` inside a database file called ``sf-trees.db``.
@ -259,13 +395,12 @@ This tells the ``datasette-cluster-map`` column which latitude and longitude col
Secret configuration values
~~~~~~~~~~~~~~~~~~~~~~~~~~~
- Any values embedded in ``metadata.json`` will be visible to anyone who views the ``/-/metadata`` page of your Datasette instance. Some plugins may need configuration that should stay secret - API keys for example. There are two ways in which you can store secret configuration values.
+ Some plugins may need configuration that should stay secret - API keys for example. There are two ways in which you can store secret configuration values.
**As environment variables**. If your secret lives in an environment variable that is available to the Datasette process, you can indicate that the configuration value should be read from that environment variable like so:
- .. code-block:: json
- {
+ .. [[[cog
+     config_example(cog, {
"plugins": {
"datasette-auth-github": {
"client_secret": {
@ -273,13 +408,38 @@ Any values embedded in ``metadata.json`` will be visible to anyone who views the
}
}
}
- }
+ })
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
plugins:
datasette-auth-github:
client_secret:
$env: GITHUB_CLIENT_SECRET
.. tab:: datasette.json
.. code-block:: json
{
"plugins": {
"datasette-auth-github": {
"client_secret": {
"$env": "GITHUB_CLIENT_SECRET"
}
}
}
}
.. [[[end]]]
**As values in separate files**. Your secrets can also live in files on disk. To specify that a secret should be read from a file, provide the full file path like this:
- .. code-block:: json
- {
+ .. [[[cog
+     config_example(cog, {
"plugins": {
"datasette-auth-github": {
"client_secret": {
@ -287,21 +447,46 @@ Any values embedded in ``metadata.json`` will be visible to anyone who views the
}
}
}
- }
+ })
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
plugins:
datasette-auth-github:
client_secret:
$file: /secrets/client-secret
.. tab:: datasette.json
.. code-block:: json
{
"plugins": {
"datasette-auth-github": {
"client_secret": {
"$file": "/secrets/client-secret"
}
}
}
}
.. [[[end]]]
If you are publishing your data using the :ref:`datasette publish <cli_publish>` family of commands, you can use the ``--plugin-secret`` option to set these secrets at publish time. For example, using Heroku you might run the following command::
- $ datasette publish heroku my_database.db \
+ datasette publish heroku my_database.db \
--name my-heroku-app-demo \
--install=datasette-auth-github \
--plugin-secret datasette-auth-github client_id your_client_id \
--plugin-secret datasette-auth-github client_secret your_client_secret
- This will set the necessary environment variables and add the following to the deployed ``metadata.json``:
+ This will set the necessary environment variables and add the following to the deployed ``metadata.yaml``:
- .. code-block:: json
- {
+ .. [[[cog
+     config_example(cog, {
"plugins": {
"datasette-auth-github": {
"client_id": {
@ -312,4 +497,35 @@ This will set the necessary environment variables and add the following to the d
}
}
}
- }
+ })
.. ]]]
.. tab:: datasette.yaml
.. code-block:: yaml
plugins:
datasette-auth-github:
client_id:
$env: DATASETTE_AUTH_GITHUB_CLIENT_ID
client_secret:
$env: DATASETTE_AUTH_GITHUB_CLIENT_SECRET
.. tab:: datasette.json
.. code-block:: json
{
"plugins": {
"datasette-auth-github": {
"client_id": {
"$env": "DATASETTE_AUTH_GITHUB_CLIENT_ID"
},
"client_secret": {
"$env": "DATASETTE_AUTH_GITHUB_CLIENT_SECRET"
}
}
}
}
.. [[[end]]]

View file

@ -73,6 +73,10 @@ This will output some details about the new deployment, including a URL like thi
You can specify a custom app name by passing ``-n my-app-name`` to the publish command. This will also allow you to overwrite an existing app.
Rather than deploying directly you can use the ``--generate-dir`` option to output the files that would be deployed to a directory::
datasette publish heroku mydatabase.db --generate-dir=/tmp/deploy-this-to-heroku
See :ref:`cli_help_publish_heroku___help` for the full list of options for this command.
.. _publish_vercel:
@ -127,7 +131,7 @@ You can also specify plugins you would like to install. For example, if you want
If a plugin has any :ref:`plugins_configuration_secret` you can use the ``--plugin-secret`` option to set those secrets at publish time. For example, using Heroku with `datasette-auth-github <https://github.com/simonw/datasette-auth-github>`__ you might run the following command::
- $ datasette publish heroku my_database.db \
+ datasette publish heroku my_database.db \
--name my-heroku-app-demo \
--install=datasette-auth-github \
--plugin-secret datasette-auth-github client_id your_client_id \
@ -144,7 +148,7 @@ If you have docker installed (e.g. using `Docker for Mac <https://www.docker.com
Here's example output for the package command::
- $ datasette package parlgov.db --extra-options="--setting sql_time_limit_ms 2500"
+ datasette package parlgov.db --extra-options="--setting sql_time_limit_ms 2500"
Sending build context to Docker daemon 4.459MB
Step 1/7 : FROM python:3.11.0-slim-bullseye
---> 79e1dc9af1c1

View file

@ -11,9 +11,11 @@ Datasette supports a number of settings. These can be set using the ``--setting
You can set multiple settings at once like this::
datasette mydatabase.db \
-   --setting default_page_size 50 \
-   --setting sql_time_limit_ms 3500 \
-   --setting max_returned_rows 2000
+     --setting default_page_size 50 \
+     --setting sql_time_limit_ms 3500 \
+     --setting max_returned_rows 2000
Settings can also be specified :ref:`in the datasette.yaml configuration file <configuration_reference_settings>`.
.. _config_dir:
@ -22,17 +24,18 @@ Configuration directory mode
Normally you configure Datasette using command-line options. For a Datasette instance with custom templates, custom plugins, a static directory and several databases this can get quite verbose::
- $ datasette one.db two.db \
-     --metadata=metadata.json \
-     --template-dir=templates/ \
-     --plugins-dir=plugins \
-     --static css:css
+ datasette one.db two.db \
+     --metadata=metadata.json \
+     --template-dir=templates/ \
+     --plugins-dir=plugins \
+     --static css:css
As an alternative to this, you can run Datasette in *configuration directory* mode. Create a directory with the following structure::
# In a directory called my-app:
my-app/one.db
my-app/two.db
my-app/datasette.yaml
my-app/metadata.json
my-app/templates/index.html
my-app/plugins/my_plugin.py
@ -40,16 +43,16 @@ As an alternative to this, you can run Datasette in *configuration directory* mo
Now start Datasette by providing the path to that directory::
- $ datasette my-app/
+ datasette my-app/
Datasette will detect the files in that directory and automatically configure itself using them. It will serve all ``*.db`` files that it finds, will load ``metadata.json`` if it exists, and will load the ``templates``, ``plugins`` and ``static`` folders if they are present.
The files that can be included in this directory are as follows. All are optional.
* ``*.db`` (or ``*.sqlite3`` or ``*.sqlite``) - SQLite database files that will be served by Datasette
* ``datasette.yaml`` - :ref:`configuration` for the Datasette instance
* ``metadata.json`` - :ref:`metadata` for those databases - ``metadata.yaml`` or ``metadata.yml`` can be used as well
* ``inspect-data.json`` - the result of running ``datasette inspect *.db --inspect-file=inspect-data.json`` from the configuration directory - any database files listed here will be treated as immutable, so they should not be changed while Datasette is running
* ``settings.json`` - settings that would normally be passed using ``--setting`` - here they should be stored as a JSON object of key/value pairs, as in the sketch after this list
* ``templates/`` - a directory containing :ref:`customization_custom_templates`
* ``plugins/`` - a directory containing plugins, see :ref:`writing_plugins_one_off`
* ``static/`` - a directory containing static files - these will be served from ``/static/filename.txt``, see :ref:`customization_static_files`
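
As an illustration of that ``settings.json`` entry, a hypothetical file reusing the settings shown at the top of this page would contain:

.. code-block:: json

    {
        "default_page_size": 50,
        "sql_time_limit_ms": 3500,
        "max_returned_rows": 2000
    }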
@ -59,6 +62,21 @@ Settings
The following options can be set using ``--setting name value``, or by storing them in the ``settings.json`` file for use with :ref:`config_dir`.
.. _setting_default_allow_sql:
default_allow_sql
~~~~~~~~~~~~~~~~~
Should users be able to execute arbitrary SQL queries by default?
Setting this to ``off`` causes permission checks for :ref:`permissions_execute_sql` to fail by default.
::
datasette mydatabase.db --setting default_allow_sql off
Another way to achieve this is to add ``"allow_sql": false`` to your ``datasette.yaml`` file, as described in :ref:`authentication_permissions_execute_sql`. This setting offers a more convenient way to do this.
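
A minimal sketch of that ``datasette.yaml`` alternative - assuming the ``allow_sql`` block sits at the root of the file so it applies instance-wide - looks like this:

.. code-block:: yaml

    allow_sql: false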
.. _setting_default_page_size:
default_page_size
@ -96,6 +114,17 @@ You can increase or decrease this limit like so::
datasette mydatabase.db --setting max_returned_rows 2000
.. _setting_max_insert_rows:
max_insert_rows
~~~~~~~~~~~~~~~
Maximum rows that can be inserted at a time using the bulk insert API, see :ref:`TableInsertView`. Defaults to 100.
You can increase or decrease this limit like so::
datasette mydatabase.db --setting max_insert_rows 1000
.. _setting_num_sql_threads:
num_sql_threads
@ -169,6 +198,34 @@ Should users be able to download the original SQLite database using a link on th
datasette mydatabase.db --setting allow_download off
.. _setting_allow_signed_tokens:
allow_signed_tokens
~~~~~~~~~~~~~~~~~~~
Should users be able to create signed API tokens to access Datasette?
This is turned on by default. Use the following to turn it off::
datasette mydatabase.db --setting allow_signed_tokens off
Turning this setting off will disable the ``/-/create-token`` page, :ref:`described here <CreateTokenView>`. It will also cause any incoming ``Authorization: Bearer dstok_...`` API tokens to be ignored.
.. _setting_max_signed_tokens_ttl:
max_signed_tokens_ttl
~~~~~~~~~~~~~~~~~~~~~
Maximum allowed expiry time for signed API tokens created by users.
Defaults to ``0`` which means no limit - tokens can be created that will never expire.
Set this to a value in seconds to limit the maximum expiry time. For example, to set that limit to 24 hours you would use::
datasette mydatabase.db --setting max_signed_tokens_ttl 86400
This setting is enforced when incoming tokens are processed.
.. _setting_default_cache_ttl:
default_cache_ttl
@ -299,22 +356,22 @@ Configuring the secret
Datasette uses a secret string to sign secure values such as cookies.
- If you do not provide a secret, Datasette will create one when it starts up. This secret will reset every time the Datasette server restarts though, so things like authentication cookies will not stay valid between restarts.
+ If you do not provide a secret, Datasette will create one when it starts up. This secret will reset every time the Datasette server restarts though, so things like authentication cookies and :ref:`API tokens <CreateTokenView>` will not stay valid between restarts.
You can pass a secret to Datasette in two ways: with the ``--secret`` command-line option or by setting a ``DATASETTE_SECRET`` environment variable.
::
- $ datasette mydb.db --secret=SECRET_VALUE_HERE
+ datasette mydb.db --secret=SECRET_VALUE_HERE
Or::
- $ export DATASETTE_SECRET=SECRET_VALUE_HERE
- $ datasette mydb.db
+ export DATASETTE_SECRET=SECRET_VALUE_HERE
+ datasette mydb.db
One way to generate a secure random secret is to use Python like this::
- $ python3 -c 'import secrets; print(secrets.token_hex(32))'
+ python3 -c 'import secrets; print(secrets.token_hex(32))'
cdb19e94283a20f9d42cca50c5a4871c0aa07392db308755d60a1a5b9bb0fa52
Plugin authors make use of this signing mechanism in their plugins using :ref:`datasette_sign` and :ref:`datasette_unsign`.
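
Plugin authors wanting to try this can start from a sketch like the following - the ``"my-plugin"`` namespace is an arbitrary illustration, not a required value:

.. code-block:: python

    from datasette.app import Datasette

    datasette = Datasette(memory=True)

    # Sign a value with the instance secret, scoped to a namespace
    signed = datasette.sign({"user": "example"}, namespace="my-plugin")

    # unsign() verifies the signature and returns the original value,
    # raising itsdangerous.BadSignature if it was tampered with
    value = datasette.unsign(signed, namespace="my-plugin")
    assert value == {"user": "example"}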

Some files were not shown because too many files have changed in this diff.