Compare commits

...

58 commits

Author SHA1 Message Date
Chris Amico c57163aa4d
Merge 96b8b7f1ec into 19b6a37336 2024-04-08 17:20:20 +09:00
Simon Willison 19b6a37336 z-index: 10000 on dropdown menu, closes #2311 2024-03-21 10:15:57 -07:00
Simon Willison 1edb24f124 Docs for 100 max rows in an insert, closes #2310 2024-03-19 09:15:39 -07:00
Simon Willison da68662767
datasette-enrichments is example of row_actions
Refs:
- https://github.com/simonw/datasette/issues/2299
- https://github.com/datasette/datasette-enrichments/issues/41
2024-03-17 14:40:47 -07:00
Agustin Bacigalup 67e66f36c1
Add ETag header for static responses (#2306)
* add etag to static responses

* fix RuntimeError related to static headers

* Remove unnecessary import

---------

Co-authored-by: Simon Willison <swillison@gmail.com>
2024-03-17 12:18:40 -07:00
Simon Willison 261fc8d875 Fix datetime.utcnow deprecation warning 2024-03-15 15:32:12 -07:00
Simon Willison eb8545c172 Refactor duplicate code in DatasetteClient, closes #2307 2024-03-15 15:29:03 -07:00
Simon Willison 54f5604caf Fixed cookies= httpx warning, refs #2307 2024-03-15 15:19:23 -07:00
Simon Willison 5af6837725 Fix httpx warning about app=self.app, refs #2307 2024-03-15 15:15:31 -07:00
Simon Willison 8b6f155b45 Added two things I left out of the 1.0a13 release notes
Refs #2104, #2294

Closes #2303
2024-03-12 19:19:51 -07:00
Simon Willison c92f326ed1 Release 1.0a13
#2104, #2286, #2293, #2297, #2298, #2299, #2300, #2301, #2302
2024-03-12 19:10:53 -07:00
Simon Willison feddd61789 Fix tests I broke in #2302 2024-03-12 17:01:51 -07:00
Simon Willison 9cc6f1908f Gradient on header and footer, closes #2302 2024-03-12 16:54:03 -07:00
Simon Willison e088abdb46 Refactored action menus to a shared include, closes #2301 2024-03-12 16:35:34 -07:00
Simon Willison 828ef9899f Ran blacken-docs, refs #2299 2024-03-12 16:25:25 -07:00
Simon Willison 8d456aae45 Fix spelling of displayed, refs #2299 2024-03-12 16:17:53 -07:00
Simon Willison b8711988b9 row_actions() plugin hook, closes #2299 2024-03-12 16:16:05 -07:00
Simon Willison 7339cc51de Rearrange plugin hooks page with more sections, closes #2300 2024-03-12 15:44:10 -07:00
Simon Willison 06281a0b8e Test for labels on Table/View action buttons, refs #2297 2024-03-12 14:32:48 -07:00
Simon Willison 909c85cd2b view_actions plugin hook, closes #2297 2024-03-12 14:25:28 -07:00
Simon Willison daf5ca02ca homepage_actions() plugin hook, closes #2298 2024-03-12 13:46:06 -07:00
Simon Willison 7b32d5f7d8 datasette-create-view as example of query_actions hook 2024-03-07 00:11:14 -05:00
Simon Willison 7818e8b9d1 Hide tables starting with an _, refs #2104 2024-03-07 00:03:42 -05:00
Simon Willison a395256c8c Allow-list select * from pragma_table_list()
Refs https://github.com/simonw/datasette/issues/2104#issuecomment-1982352475
2024-03-07 00:03:20 -05:00
Simon Willison 090dff542b
Action menu descriptions
* Refactor tests to extract get_actions_links() helper
* Table, database and query action menu items now support optional descriptions

Closes #2294
2024-03-06 22:54:06 -05:00
Simon Willison c6e8a4a76c
margin-bottom on .page-action-menu, refs #2286 2024-03-05 19:34:57 -08:00
Simon Willison 4d24bf6b34 Don't explain an explain even in the demo, refs #2293 2024-03-05 18:14:55 -08:00
Simon Willison 5de6797d4a Better demo plugin for query_actions, refs #2293 2024-03-05 18:06:38 -08:00
Simon Willison 86335dc722 Release 1.0a12
Refs #2281, #2283, #2287, #2289
2024-02-29 14:35:28 -08:00
Simon Willison 57c1ce0e8b Reset column menu on every click, closes #2289 2024-02-29 14:25:50 -08:00
Simon Willison 6ec0081f5d
`query_actions` plugin hook
* New query_actions plugin hook, closes #2283
2024-02-27 21:55:16 -08:00
Simon Willison f99c2f5f8c ?column_notcontains= table filter, closes #2287 2024-02-27 16:07:41 -08:00
Simon Willison c863443ea1 Documentation for derive_named_parameters()
Closes #2284

Refs https://github.com/simonw/datasette-write/issues/7#issuecomment-1967593883
2024-02-27 13:24:47 -08:00
Simon Willison dfd4ad558b
New design for table and database action menus
Closes #2281
2024-02-25 12:54:16 -08:00
Simon Willison 434123425f Release 1.0a11
Refs #2263, #2278, #2279

Closes #2280
2024-02-19 14:48:37 -08:00
Jeroen Van Goey 103b4decbd
fix (typo): Corrected spelling of 'environments' (#2268)
* fix (typo): Corrected spelling of 'environments'

* ci: add test folder to codespell workflow
2024-02-19 14:41:32 -08:00
dependabot[bot] 158d5d96e9
Bump the python-packages group with 1 update (#2269)
Bumps the python-packages group with 1 update: [black](https://github.com/psf/black).


Updates `black` from 24.1.1 to 24.2.0
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.1.1...24.2.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-19 14:23:12 -08:00
Simon Willison 28bf3a933f Applied Black, refs #2278 2024-02-19 14:22:59 -08:00
Simon Willison 26300738e3 Fixes for permissions debug page, closes #2278 2024-02-19 14:17:37 -08:00
Simon Willison 27409a7892 Fix for hook position in wide column names, refs #2263 2024-02-19 14:01:55 -08:00
Simon Willison 392ca2e24c Improvements to table column cog menu display, closes #2263
- Repositions if menu would cause a horizontal scrollbar
- Arrow tip on menu now attempts to align with cog icon on column
2024-02-19 13:40:48 -08:00
Simon Willison b36a2d8f4b Require update-row to use insert replace, closes #2279 2024-02-19 12:55:51 -08:00
Simon Willison 3856a8cb24 Consistent Permission denied:, refs #2279 2024-02-19 12:51:14 -08:00
Simon Willison 81629dbeff Upgrade GitHub Actions, including PyPI publishing 2024-02-17 21:03:41 -08:00
Simon Willison a4fa1ef3bd Release 1.0a10
Refs #2277
2024-02-17 20:56:15 -08:00
Simon Willison 10f9ba1a00 Take advantage of execute_write_fn(transaction=True)
A bunch of places no longer need to do manual transaction handling
thanks to this change. Refs #2277
2024-02-17 20:51:19 -08:00
Simon Willison 5e0e440f2c database.execute_write_fn(transaction=True) parameter, closes #2277 2024-02-17 20:28:15 -08:00
Simon Willison e1c80efff8 Note about activating alpha documentation versions on ReadTheDocs 2024-02-16 14:43:36 -08:00
Simon Willison 9906f937d9 Release 1.0a9
Refs #2101, #2260, #2262, #2265, #2270, #2273, #2274, #2275

Closes #2276
2024-02-16 14:36:12 -08:00
Simon Willison 3a999a85fb Fire insert-rows on /db/-/create if rows were inserted, refs #2260 2024-02-16 13:59:56 -08:00
Simon Willison 244f3ff83a Test demonstrating fix for permissions bug in #2262 2024-02-16 13:39:57 -08:00
Simon Willison 8bfa3a51c2 Consider every plugins opinion in datasette.permission_allowed()
Closes #2275, refs #2262
2024-02-16 13:29:39 -08:00
Simon Willison 232a30459b DATASETTE_TRACE_PLUGINS setting, closes #2274 2024-02-16 13:00:24 -08:00
Simon Willison 47e29e948b Better comments in permission_allowed_default() 2024-02-16 10:05:18 -08:00
Simon Willison 97de4d6362 Use transaction in delete_everything(), closes #2273 2024-02-15 21:35:49 -08:00
Simon Willison b89cac3b6a
Use MD5 usedforsecurity=False on Python 3.9 and higher to pass FIPS
Closes #2270
2024-02-13 18:23:54 -08:00
Chris Amico 96b8b7f1ec Maybe fix a doc test 2023-01-19 07:48:47 -05:00
Chris Amico ed44bc06d4 Document custom json encoder 2023-01-18 11:51:12 -05:00
54 changed files with 1352 additions and 403 deletions

View file

@ -12,20 +12,15 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.8", "3.9", "3.10", "3.11"]
python-version: ["3.8", "3.9", "3.10", "3.11", "3.12"]
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- uses: actions/cache@v3
name: Configure pip caching
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}
restore-keys: |
${{ runner.os }}-pip-
cache: pip
cache-dependency-path: setup.py
- name: Install dependencies
run: |
pip install -e '.[test]'
@ -36,47 +31,38 @@ jobs:
deploy:
runs-on: ubuntu-latest
needs: [test]
environment: release
permissions:
id-token: write
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: '3.11'
- uses: actions/cache@v3
name: Configure pip caching
with:
path: ~/.cache/pip
key: ${{ runner.os }}-publish-pip-${{ hashFiles('**/setup.py') }}
restore-keys: |
${{ runner.os }}-publish-pip-
python-version: '3.12'
cache: pip
cache-dependency-path: setup.py
- name: Install dependencies
run: |
pip install setuptools wheel twine
- name: Publish
env:
TWINE_USERNAME: __token__
TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
pip install setuptools wheel build
- name: Build
run: |
python setup.py sdist bdist_wheel
twine upload dist/*
python -m build
- name: Publish
uses: pypa/gh-action-pypi-publish@release/v1
deploy_static_docs:
runs-on: ubuntu-latest
needs: [deploy]
if: "!github.event.release.prerelease"
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v2
uses: actions/setup-python@v5
with:
python-version: '3.9'
- uses: actions/cache@v2
name: Configure pip caching
with:
path: ~/.cache/pip
key: ${{ runner.os }}-publish-pip-${{ hashFiles('**/setup.py') }}
restore-keys: |
${{ runner.os }}-publish-pip-
cache: pip
cache-dependency-path: setup.py
- name: Install dependencies
run: |
python -m pip install -e .[docs]
@ -105,7 +91,7 @@ jobs:
needs: [deploy]
if: "!github.event.release.prerelease"
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Build and push to Docker Hub
env:
DOCKER_USER: ${{ secrets.DOCKER_USER }}

View file

@ -24,3 +24,4 @@ jobs:
codespell README.md --ignore-words docs/codespell-ignore-words.txt
codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt
codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt
codespell tests --ignore-words docs/codespell-ignore-words.txt

View file

@ -12,19 +12,14 @@ jobs:
matrix:
python-version: ["3.8", "3.9", "3.10", "3.11", "3.12"]
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
allow-prereleases: true
- uses: actions/cache@v3
name: Configure pip caching
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}
restore-keys: |
${{ runner.os }}-pip-
cache: pip
cache-dependency-path: setup.py
- name: Build extension for --load-extension test
run: |-
(cd tests && gcc ext.c -fPIC -shared -o ext.so)

View file

@ -15,6 +15,7 @@ export DATASETTE_SECRET := "not_a_secret"
pipenv run codespell README.md --ignore-words docs/codespell-ignore-words.txt
pipenv run codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt
pipenv run codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt
pipenv run codespell tests --ignore-words docs/codespell-ignore-words.txt
# Run linters: black, flake8, mypy, cog
@lint: codespell

View file

@ -903,6 +903,8 @@ class Datasette:
# Use default from registered permission, if available
if default is DEFAULT_NOT_SET and action in self.permissions:
default = self.permissions[action].default
opinions = []
# Every plugin is consulted for their opinion
for check in pm.hook.permission_allowed(
datasette=self,
actor=actor,
@ -911,14 +913,24 @@ class Datasette:
):
check = await await_me_maybe(check)
if check is not None:
result = check
opinions.append(check)
result = None
# If any plugin said False it's false - the veto rule
if any(not r for r in opinions):
result = False
elif any(r for r in opinions):
# Otherwise, if any plugin said True it's true
result = True
used_default = False
if result is None:
# No plugin expressed an opinion, so use the default
result = default
used_default = True
self._permission_checks.append(
{
"when": datetime.datetime.utcnow().isoformat(),
"when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
"actor": actor,
"action": action,
"resource": resource,
@ -1921,37 +1933,40 @@ class DatasetteClient:
path = f"http://localhost{path}"
return path
async def _request(self, method, path, **kwargs):
async with httpx.AsyncClient(
transport=httpx.ASGITransport(app=self.app),
cookies=kwargs.pop("cookies", None),
) as client:
return await getattr(client, method)(self._fix(path), **kwargs)
async def get(self, path, **kwargs):
async with httpx.AsyncClient(app=self.app) as client:
return await client.get(self._fix(path), **kwargs)
return await self._request("get", path, **kwargs)
async def options(self, path, **kwargs):
async with httpx.AsyncClient(app=self.app) as client:
return await client.options(self._fix(path), **kwargs)
return await self._request("options", path, **kwargs)
async def head(self, path, **kwargs):
async with httpx.AsyncClient(app=self.app) as client:
return await client.head(self._fix(path), **kwargs)
return await self._request("head", path, **kwargs)
async def post(self, path, **kwargs):
async with httpx.AsyncClient(app=self.app) as client:
return await client.post(self._fix(path), **kwargs)
return await self._request("post", path, **kwargs)
async def put(self, path, **kwargs):
async with httpx.AsyncClient(app=self.app) as client:
return await client.put(self._fix(path), **kwargs)
return await self._request("put", path, **kwargs)
async def patch(self, path, **kwargs):
async with httpx.AsyncClient(app=self.app) as client:
return await client.patch(self._fix(path), **kwargs)
return await self._request("patch", path, **kwargs)
async def delete(self, path, **kwargs):
async with httpx.AsyncClient(app=self.app) as client:
return await client.delete(self._fix(path), **kwargs)
return await self._request("delete", path, **kwargs)
async def request(self, method, path, **kwargs):
avoid_path_rewrites = kwargs.pop("avoid_path_rewrites", None)
async with httpx.AsyncClient(app=self.app) as client:
async with httpx.AsyncClient(
transport=httpx.ASGITransport(app=self.app),
cookies=kwargs.pop("cookies", None),
) as client:
return await client.request(
method, self._fix(path, avoid_path_rewrites), **kwargs
)
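
The refactor above routes every HTTP verb through a single _request() helper that builds an explicit httpx.ASGITransport and forwards cookies= to the client constructor. A minimal sketch of exercising the refactored client (not part of the diff; the cookie value is an arbitrary placeholder):

    import asyncio
    from datasette.app import Datasette

    async def main():
        ds = Datasette(memory=True)
        # cookies= is popped by _request() and handed to the AsyncClient constructor
        response = await ds.client.get("/-/versions.json", cookies={"foo": "bar"})
        print(response.status_code)

    asyncio.run(main())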

View file

@ -1,7 +1,6 @@
import asyncio
from collections import namedtuple
from pathlib import Path
import hashlib
import janus
import queue
import sys
@ -15,6 +14,7 @@ from .utils import (
detect_spatialite,
get_all_foreign_keys,
get_outbound_foreign_keys,
md5_not_usedforsecurity,
sqlite_timelimit,
sqlite3,
table_columns,
@ -74,7 +74,7 @@ class Database:
def color(self):
if self.hash:
return self.hash[:6]
return hashlib.md5(self.name.encode("utf8")).hexdigest()[:6]
return md5_not_usedforsecurity(self.name)[:6]
def suggest_name(self):
if self.path:
@ -123,8 +123,7 @@ class Database:
async def execute_write(self, sql, params=None, block=True):
def _inner(conn):
with conn:
return conn.execute(sql, params or [])
return conn.execute(sql, params or [])
with trace("sql", database=self.name, sql=sql.strip(), params=params):
results = await self.execute_write_fn(_inner, block=block)
@ -132,8 +131,7 @@ class Database:
async def execute_write_script(self, sql, block=True):
def _inner(conn):
with conn:
return conn.executescript(sql)
return conn.executescript(sql)
with trace("sql", database=self.name, sql=sql.strip(), executescript=True):
results = await self.execute_write_fn(_inner, block=block)
@ -149,8 +147,7 @@ class Database:
count += 1
yield param
with conn:
return conn.executemany(sql, count_params(params_seq)), count
return conn.executemany(sql, count_params(params_seq)), count
with trace(
"sql", database=self.name, sql=sql.strip(), executemany=True
@ -179,17 +176,25 @@ class Database:
# Threaded mode - send to write thread
return await self._send_to_write_thread(fn, isolated_connection=True)
async def execute_write_fn(self, fn, block=True):
async def execute_write_fn(self, fn, block=True, transaction=True):
if self.ds.executor is None:
# non-threaded mode
if self._write_connection is None:
self._write_connection = self.connect(write=True)
self.ds._prepare_connection(self._write_connection, self.name)
return fn(self._write_connection)
if transaction:
with self._write_connection:
return fn(self._write_connection)
else:
return fn(self._write_connection)
else:
return await self._send_to_write_thread(fn, block)
return await self._send_to_write_thread(
fn, block=block, transaction=transaction
)
async def _send_to_write_thread(self, fn, block=True, isolated_connection=False):
async def _send_to_write_thread(
self, fn, block=True, isolated_connection=False, transaction=True
):
if self._write_queue is None:
self._write_queue = queue.Queue()
if self._write_thread is None:
@ -202,7 +207,9 @@ class Database:
self._write_thread.start()
task_id = uuid.uuid5(uuid.NAMESPACE_DNS, "datasette.io")
reply_queue = janus.Queue()
self._write_queue.put(WriteTask(fn, task_id, reply_queue, isolated_connection))
self._write_queue.put(
WriteTask(fn, task_id, reply_queue, isolated_connection, transaction)
)
if block:
result = await reply_queue.async_q.get()
if isinstance(result, Exception):
@ -244,7 +251,11 @@ class Database:
pass
else:
try:
result = task.fn(conn)
if task.transaction:
with conn:
result = task.fn(conn)
else:
result = task.fn(conn)
except Exception as e:
sys.stderr.write("{}\n".format(e))
sys.stderr.flush()
@ -458,6 +469,7 @@ class Database:
and (
sql like '%VIRTUAL TABLE%USING FTS%'
) or name in ('sqlite_stat1', 'sqlite_stat2', 'sqlite_stat3', 'sqlite_stat4')
or name like '\\_%' escape '\\'
"""
)
).rows
@ -554,13 +566,14 @@ class Database:
class WriteTask:
__slots__ = ("fn", "task_id", "reply_queue", "isolated_connection")
__slots__ = ("fn", "task_id", "reply_queue", "isolated_connection", "transaction")
def __init__(self, fn, task_id, reply_queue, isolated_connection):
def __init__(self, fn, task_id, reply_queue, isolated_connection, transaction):
self.fn = fn
self.task_id = task_id
self.reply_queue = reply_queue
self.isolated_connection = isolated_connection
self.transaction = transaction
class QueryInterrupted(Exception):

View file

@ -24,9 +24,12 @@ def now(key, request):
if key == "epoch":
return int(time.time())
elif key == "date_utc":
return datetime.datetime.utcnow().date().isoformat()
return datetime.datetime.now(datetime.timezone.utc).date().isoformat()
elif key == "datetime_utc":
return datetime.datetime.utcnow().strftime(r"%Y-%m-%dT%H:%M:%S") + "Z"
return (
datetime.datetime.now(datetime.timezone.utc).strftime(r"%Y-%m-%dT%H:%M:%S")
+ "Z"
)
else:
raise KeyError

View file

@ -144,7 +144,7 @@ def permission_allowed_default(datasette, actor, action, resource):
if actor and actor.get("id") == "root":
return True
# Resolve metadata view permissions
# Resolve view permissions in allow blocks in configuration
if action in (
"view-instance",
"view-database",
@ -158,7 +158,7 @@ def permission_allowed_default(datasette, actor, action, resource):
if result is not None:
return result
# Check custom permissions: blocks
# Resolve custom permissions: blocks in configuration
result = await _resolve_config_permissions_blocks(
datasette, actor, action, resource
)

View file

@ -281,6 +281,13 @@ class Filters:
'{c} contains "{v}"',
format="%{}%",
),
TemplatedFilter(
"notcontains",
"does not contain",
'"{c}" not like :{p}',
'{c} does not contain "{v}"',
format="%{}%",
),
TemplatedFilter(
"endswith",
"ends with",

View file

@ -140,16 +140,36 @@ def menu_links(datasette, actor, request):
"""Links for the navigation menu"""
@hookspec
def row_actions(datasette, actor, request, database, table, row):
"""Links for the row actions menu"""
@hookspec
def table_actions(datasette, actor, database, table, request):
"""Links for the table actions menu"""
@hookspec
def view_actions(datasette, actor, database, view, request):
"""Links for the view actions menu"""
@hookspec
def query_actions(datasette, actor, database, query_name, request, sql, params):
"""Links for the query and canned query actions menu"""
@hookspec
def database_actions(datasette, actor, database, request):
"""Links for the database actions menu"""
@hookspec
def homepage_actions(datasette, actor, request):
"""Links for the homepage actions menu"""
@hookspec
def skip_csrf(datasette, scope):
"""Mechanism for skipping CSRF checks for certain requests"""

View file

@ -1,6 +1,7 @@
import importlib
import os
import pluggy
from pprint import pprint
import sys
from . import hookspecs
@ -33,6 +34,29 @@ DEFAULT_PLUGINS = (
pm = pluggy.PluginManager("datasette")
pm.add_hookspecs(hookspecs)
DATASETTE_TRACE_PLUGINS = os.environ.get("DATASETTE_TRACE_PLUGINS", None)
def before(hook_name, hook_impls, kwargs):
print(file=sys.stderr)
print(f"{hook_name}:", file=sys.stderr)
pprint(kwargs, width=40, indent=4, stream=sys.stderr)
print("Hook implementations:", file=sys.stderr)
pprint(hook_impls, width=40, indent=4, stream=sys.stderr)
def after(outcome, hook_name, hook_impls, kwargs):
results = outcome.get_result()
if not isinstance(results, list):
results = [results]
print(f"Results:", file=sys.stderr)
pprint(results, width=40, indent=4, stream=sys.stderr)
if DATASETTE_TRACE_PLUGINS:
pm.add_hookcall_monitoring(before, after)
DATASETTE_LOAD_PLUGINS = os.environ.get("DATASETTE_LOAD_PLUGINS", None)
if not hasattr(sys, "_called_from_test") and DATASETTE_LOAD_PLUGINS is None:

View file

@ -163,28 +163,22 @@ h6,
}
.page-header {
display: flex;
align-items: center;
padding-left: 10px;
border-left: 10px solid #666;
margin-bottom: 0.75rem;
margin-top: 1rem;
}
.page-header h1 {
display: inline;
margin: 0;
font-size: 2rem;
padding-right: 0.2em;
}
.page-header details {
display: inline-flex;
}
.page-header details > summary {
.page-action-menu details > summary {
list-style: none;
display: inline-flex;
cursor: pointer;
}
.page-header details > summary::-webkit-details-marker {
.page-action-menu details > summary::-webkit-details-marker {
display: none;
}
@ -275,6 +269,7 @@ header,
footer {
padding: 0.6rem 1rem 0.5rem 1rem;
background-color: #276890;
background: linear-gradient(180deg, rgba(96,144,173,1) 0%, rgba(39,104,144,1) 50%);
color: rgba(255,255,244,0.9);
overflow: hidden;
box-sizing: border-box;
@ -352,25 +347,59 @@ details.nav-menu > summary::-webkit-details-marker {
}
details .nav-menu-inner {
position: absolute;
top: 2rem;
top: 2.6rem;
right: 10px;
width: 180px;
background-color: #276890;
padding: 1rem;
z-index: 1000;
padding: 0;
}
.nav-menu-inner li,
form.nav-menu-logout {
padding: 0.3rem 0.5rem;
border-top: 1px solid #ffffff69;
}
.nav-menu-inner a {
display: block;
}
/* Table/database actions menu */
.page-header {
.page-action-menu {
position: relative;
margin-bottom: 0.5em;
}
.actions-menu-links {
display: inline;
}
.actions-menu-links .dropdown-menu {
position: absolute;
top: calc(100% + 10px);
left: -10px;
left: 0;
z-index: 10000;
}
.page-action-menu .icon-text {
display: inline-flex;
align-items: center;
border-radius: .25rem;
padding: 5px 12px 3px 7px;
color: #fff;
font-weight: 400;
font-size: 0.8em;
background: linear-gradient(180deg, #007bff 0%, #4E79C7 100%);
border-color: #007bff;
}
.page-action-menu .icon-text span {
/* Nudge text up a bit */
position: relative;
top: -2px;
}
.page-action-menu .icon-text:hover {
cursor: pointer;
}
.page-action-menu .icon {
width: 18px;
height: 18px;
margin-right: 4px;
}
/* Components ============================================================== */
@ -536,7 +565,7 @@ form input[type=submit], form button[type=button] {
form input[type=submit] {
color: #fff;
background-color: #007bff;
background: linear-gradient(180deg, #007bff 0%, #4E79C7 100%);
border-color: #007bff;
-webkit-appearance: button;
}
@ -819,6 +848,13 @@ svg.dropdown-menu-icon {
.dropdown-menu a:hover {
background-color: #eee;
}
.dropdown-menu .dropdown-description {
margin: 0;
color: #666;
font-size: 0.8em;
max-width: 80vw;
white-space: normal;
}
.dropdown-menu .hook {
display: block;
position: absolute;

View file

@ -88,6 +88,7 @@ const initDatasetteTable = function (manager) {
function onTableHeaderClick(ev) {
ev.preventDefault();
ev.stopPropagation();
menu.innerHTML = DROPDOWN_HTML;
var th = ev.target;
while (th.nodeName != "TH") {
th = th.parentNode;
@ -217,6 +218,25 @@ const initDatasetteTable = function (manager) {
menuList.appendChild(menuItem);
});
// Measure width of menu and adjust position if too far right
const menuWidth = menu.offsetWidth;
const windowWidth = window.innerWidth;
if (menuLeft + menuWidth > windowWidth) {
menu.style.left = windowWidth - menuWidth - 20 + "px";
}
// Align menu .hook arrow with the column cog icon
const hook = menu.querySelector('.hook');
const icon = th.querySelector('.dropdown-menu-icon');
const iconRect = icon.getBoundingClientRect();
const hookLeft = (iconRect.left - menuLeft + 1) + 'px';
hook.style.left = hookLeft;
// Move the whole menu right if the hook is too far right
const menuRect = menu.getBoundingClientRect();
if (iconRect.right > menuRect.right) {
menu.style.left = (iconRect.right - menuWidth) + 'px';
// And move hook tip as well
hook.style.left = (menuWidth - 13) + 'px';
}
}
var svg = document.createElement("div");

View file

@ -0,0 +1,28 @@
{% if action_links %}
<div class="page-action-menu">
<details class="actions-menu-links details-menu">
<summary>
<div class="icon-text">
<svg class="icon" aria-labelledby="actions-menu-links-title" role="img" style="color: #fff" xmlns="http://www.w3.org/2000/svg" width="28" height="28" viewBox="0 0 28 28" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">{{ action_title }}</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg>
<span>{{ action_title }}</span>
</div>
</summary>
<div class="dropdown-menu">
<div class="hook"></div>
<ul>
{% for link in action_links %}
<li><a href="{{ link.href }}">{{ link.label }}
{% if link.description %}
<p class="dropdown-description">{{ link.description }}</p>
{% endif %}</a>
</li>
{% endfor %}
</ul>
</div>
</details>
</div>
{% endif %}

View file

@ -37,7 +37,7 @@
</ul>
{% endif %}
{% if show_logout %}
<form action="{{ urls.logout() }}" method="post">
<form class="nav-menu-logout" action="{{ urls.logout() }}" method="post">
<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">
<button class="button-as-link">Log out</button>
</form>{% endif %}

View file

@ -12,27 +12,9 @@
{% block content %}
<div class="page-header" style="border-color: #{{ database_color }}">
<h1>{{ metadata.title or database }}{% if private %} 🔒{% endif %}</h1>
{% set links = database_actions() %}{% if links %}
<details class="actions-menu-links details-menu">
<summary><svg aria-labelledby="actions-menu-links-title" role="img"
style="color: #666" xmlns="http://www.w3.org/2000/svg"
width="28" height="28" viewBox="0 0 24 24" fill="none"
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Table actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg></summary>
<div class="dropdown-menu">
{% if links %}
<ul>
{% for link in links %}
<li><a href="{{ link.href }}">{{ link.label }}</a></li>
{% endfor %}
</ul>
{% endif %}
</div>
</details>{% endif %}
</div>
</div>
{% set action_links, action_title = database_actions(), "Database actions" %}
{% include "_action_menu.html" %}
{{ top_database() }}

View file

@ -7,6 +7,9 @@
{% block content %}
<h1>{{ metadata.title or "Datasette" }}{% if private %} 🔒{% endif %}</h1>
{% set action_links, action_title = homepage_actions, "Homepage actions" %}
{% include "_action_menu.html" %}
{{ top_homepage() }}
{% block description_source_license %}{% include "_description_source_license.html" %}{% endblock %}

View file

@ -26,7 +26,7 @@
<li><a href="/-/plugins">Installed plugins</a></li>
<li><a href="/-/versions">Version info</a></li>
</ul>
<form action="/-/logout" method="post">
<form class="nav-menu-logout" action="/-/logout" method="post">
<button class="button-as-link">Log out</button>
</form>
</div>
@ -96,18 +96,24 @@
<section class="content">
<div class="page-header" style="border-color: #ff0000">
<h1>fixtures</h1>
</div>
<div class="page-action-menu">
<details class="actions-menu-links details-menu">
<summary><svg aria-labelledby="actions-menu-links-title" role="img"
style="color: #666" xmlns="http://www.w3.org/2000/svg"
width="28" height="28" viewBox="0 0 24 24" fill="none"
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Table actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg></summary>
<summary>
<div class="icon-text">
<svg class="icon" aria-labelledby="actions-menu-links-title" role="img" style="color: #fff" xmlns="http://www.w3.org/2000/svg" width="28" height="28" viewBox="0 0 28 28" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Database actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg>
<span>Database actions</span>
</div>
</summary>
<div class="dropdown-menu">
<div class="hook"></div>
<ul>
<li><a href="#">Database action</a></li>
<li><a href="#">Action one</a></li>
<li><a href="#">Action two</a></li>
</ul>
</div>
</details>
@ -158,18 +164,24 @@
<section class="content">
<div class="page-header" style="border-color: #ff0000">
<h1>roadside_attraction_characteristics</h1>
</div>
<div class="page-action-menu">
<details class="actions-menu-links details-menu">
<summary><svg aria-labelledby="actions-menu-links-title" role="img"
style="color: #666" xmlns="http://www.w3.org/2000/svg"
width="28" height="28" viewBox="0 0 24 24" fill="none"
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Table actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg></summary>
<summary>
<div class="icon-text">
<svg class="icon" aria-labelledby="actions-menu-links-title" role="img" style="color: #fff" xmlns="http://www.w3.org/2000/svg" width="28" height="28" viewBox="0 0 28 28" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Database actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg>
<span>Table actions</span>
</div>
</summary>
<div class="dropdown-menu">
<div class="hook"></div>
<ul>
<li><a href="#">Table action</a></li>
<li><a href="#">Action one</a></li>
<li><a href="#">Action two</a></li>
</ul>
</div>
</details>

View file

@ -57,7 +57,7 @@ textarea {
<p><label for="permission" style="display:block">Permission</label>
<select name="permission" id="permission">
{% for permission in permissions %}
<option value="{{ permission.0 }}">{{ permission.name }} (default {{ permission.default }})</option>
<option value="{{ permission.name }}">{{ permission.name }} (default {{ permission.default }})</option>
{% endfor %}
</select>
<p><label for="resource_1">Database name</label><input type="text" id="resource_1" name="resource_1"></p>
@ -71,19 +71,19 @@ textarea {
<script>
var rawPerms = {{ permissions|tojson }};
var permissions = Object.fromEntries(rawPerms.map(([label, abbr, needs_resource_1, needs_resource_2, def]) => [label, {needs_resource_1, needs_resource_2, def}]))
var permissions = Object.fromEntries(rawPerms.map(p => [p.name, p]));
var permissionSelect = document.getElementById('permission');
var resource1 = document.getElementById('resource_1');
var resource2 = document.getElementById('resource_2');
function updateResourceVisibility() {
var permission = permissionSelect.value;
var {needs_resource_1, needs_resource_2} = permissions[permission];
if (needs_resource_1) {
var {takes_database, takes_resource} = permissions[permission];
if (takes_database) {
resource1.closest('p').style.display = 'block';
} else {
resource1.closest('p').style.display = 'none';
}
if (needs_resource_2) {
if (takes_resource) {
resource2.closest('p').style.display = 'block';
} else {
resource2.closest('p').style.display = 'none';

View file

@ -29,6 +29,8 @@
{% endif %}
<h1 style="padding-left: 10px; border-left: 10px solid #{{ database_color }}">{{ metadata.title or database }}{% if canned_query and not metadata.title %}: {{ canned_query }}{% endif %}{% if private %} 🔒{% endif %}</h1>
{% set action_links, action_title = query_actions(), "Query actions" %}
{% include "_action_menu.html" %}
{% if canned_query %}{{ top_canned_query() }}{% else %}{{ top_query() }}{% endif %}

View file

@ -22,6 +22,9 @@
{% block content %}
<h1 style="padding-left: 10px; border-left: 10px solid #{{ database_color }}">{{ table }}: {{ ', '.join(primary_key_values) }}{% if private %} 🔒{% endif %}</h1>
{% set action_links, action_title = row_actions, "Row actions" %}
{% include "_action_menu.html" %}
{{ top_row() }}
{% block description_source_license %}{% include "_description_source_license.html" %}{% endblock %}

View file

@ -23,27 +23,9 @@
{% block content %}
<div class="page-header" style="border-color: #{{ database_color }}">
<h1>{{ metadata.get("title") or table }}{% if is_view %} (view){% endif %}{% if private %} 🔒{% endif %}</h1>
{% set links = table_actions() %}{% if links %}
<details class="actions-menu-links details-menu">
<summary><svg aria-labelledby="actions-menu-links-title" role="img"
style="color: #666" xmlns="http://www.w3.org/2000/svg"
width="28" height="28" viewBox="0 0 24 24" fill="none"
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">Table actions</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg></summary>
<div class="dropdown-menu">
{% if links %}
<ul>
{% for link in links %}
<li><a href="{{ link.href }}">{{ link.label }}</a></li>
{% endfor %}
</ul>
{% endif %}
</div>
</details>{% endif %}
</div>
{% set action_links, action_title = actions(), "View actions" if is_view else "Table actions" %}
{% include "_action_menu.html" %}
{{ top_table() }}

View file

@ -1,5 +1,6 @@
import asyncio
from contextlib import contextmanager
import aiofiles
import click
from collections import OrderedDict, namedtuple, Counter
import copy
@ -18,12 +19,14 @@ import time
import types
import secrets
import shutil
from typing import Iterable, Tuple
from typing import Iterable, List, Tuple
import urllib
import yaml
from .shutil_backport import copytree
from .sqlite import sqlite3, supports_table_xinfo
if typing.TYPE_CHECKING:
from datasette.database import Database
# From https://www.sqlite.org/lang_keywords.html
reserved_words = set(
@ -162,7 +165,31 @@ def compound_keys_after_sql(pks, start_index=0):
return "({})".format("\n or\n".join(or_clauses))
@documented
class CustomJSONEncoder(json.JSONEncoder):
"""
The CustomJSONEncoder class handles serialization for objects commonly used by Datasette,
including SQLite cursors and binary blobs. Datasette uses it internally to serve .json endpoints,
and plugins that return JSON can use it to match Datasette's own handling.
Built-in types (text, numbers, lists, etc) are encoded the same as Python's built-in ``json`` module.
- ``sqlite3.Row`` becomes a tuple
- ``sqlite3.Cursor`` becomes a list
If a binary blob can be decoded as UTF-8, the encoder returns it as text.
If it can't (for example, images), it is encoded as an object, with the actual
data base64-encoded, like so: ::
{
"$base64": True,
"encoded": ...,
}
Example: https://latest.datasette.io/fixtures/binary_data.json
"""
def default(self, obj):
if isinstance(obj, sqlite3.Row):
return tuple(obj)
@ -244,6 +271,7 @@ allowed_pragmas = (
"schema_version",
"table_info",
"table_xinfo",
"table_list",
)
disallawed_sql_res = [
(
@ -713,7 +741,7 @@ def to_css_class(s):
"""
if css_class_re.match(s):
return s
md5_suffix = hashlib.md5(s.encode("utf8")).hexdigest()[:6]
md5_suffix = md5_not_usedforsecurity(s)[:6]
# Strip leading _, -
s = s.lstrip("_").lstrip("-")
# Replace any whitespace with hyphens
@ -1130,7 +1158,13 @@ class StartupError(Exception):
_re_named_parameter = re.compile(":([a-zA-Z0-9_]+)")
async def derive_named_parameters(db, sql):
@documented
async def derive_named_parameters(db: "Database", sql: str) -> List[str]:
"""
Given a SQL statement, return a list of named parameters that are used in the statement
e.g. for ``select * from foo where id=:id`` this would return ``["id"]``
"""
explain = "explain {}".format(sql.strip().rstrip(";"))
possible_params = _re_named_parameter.findall(sql)
try:
@ -1401,3 +1435,32 @@ def redact_keys(original: dict, key_patterns: Iterable) -> dict:
return data
return redact(original)
def md5_not_usedforsecurity(s):
try:
return hashlib.md5(s.encode("utf8"), usedforsecurity=False).hexdigest()
except TypeError:
# For Python 3.8 which does not support usedforsecurity=False
return hashlib.md5(s.encode("utf8")).hexdigest()
_etag_cache = {}
async def calculate_etag(filepath, chunk_size=4096):
if filepath in _etag_cache:
return _etag_cache[filepath]
hasher = hashlib.md5()
async with aiofiles.open(filepath, "rb") as f:
while True:
chunk = await f.read(chunk_size)
if not chunk:
break
hasher.update(chunk)
etag = f'"{hasher.hexdigest()}"'
_etag_cache[filepath] = etag
return etag
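
The new docstring above describes how CustomJSONEncoder treats SQLite rows and binary blobs. A small sketch of that behaviour (not part of the diff; the query is invented, and the base64 output shape is taken from the docstring rather than verified here):

    import json
    import sqlite3
    from datasette.utils import CustomJSONEncoder

    conn = sqlite3.connect(":memory:")
    conn.row_factory = sqlite3.Row
    row = conn.execute("select 1 as id, x'fffe' as blob").fetchone()
    # sqlite3.Row is serialized as a tuple; the undecodable blob should come back
    # as a {"$base64": true, "encoded": "..."} object per the docstring
    print(json.dumps(row, cls=CustomJSONEncoder))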

View file

@ -1,5 +1,6 @@
import hashlib
import json
from datasette.utils import MultiParams
from datasette.utils import MultiParams, calculate_etag
from mimetypes import guess_type
from urllib.parse import parse_qs, urlunparse, parse_qsl
from pathlib import Path
@ -285,6 +286,7 @@ async def asgi_send_file(
headers = headers or {}
if filename:
headers["content-disposition"] = f'attachment; filename="{filename}"'
first = True
headers["content-length"] = str((await aiofiles.os.stat(str(filepath))).st_size)
async with aiofiles.open(str(filepath), mode="rb") as fp:
@ -307,9 +309,14 @@ async def asgi_send_file(
def asgi_static(root_path, chunk_size=4096, headers=None, content_type=None):
root_path = Path(root_path)
static_headers = {}
if headers:
static_headers = headers.copy()
async def inner_static(request, send):
path = request.scope["url_route"]["kwargs"]["path"]
headers = static_headers.copy()
try:
full_path = (root_path / path).resolve().absolute()
except FileNotFoundError:
@ -325,7 +332,15 @@ def asgi_static(root_path, chunk_size=4096, headers=None, content_type=None):
await asgi_send_html(send, "404: Path not inside root path", 404)
return
try:
await asgi_send_file(send, full_path, chunk_size=chunk_size)
# Calculate ETag for filepath
etag = await calculate_etag(full_path, chunk_size=chunk_size)
headers["ETag"] = etag
if_none_match = request.headers.get("if-none-match")
if if_none_match and if_none_match == etag:
return await asgi_send(send, "", 304)
await asgi_send_file(
send, full_path, chunk_size=chunk_size, headers=headers
)
except FileNotFoundError:
await asgi_send_html(send, "404: File not found", 404)
return
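
The change above computes an MD5-based ETag for each static file, caches it, and answers a matching If-None-Match request with an empty 304. A hedged sketch of checking that through the internal client (the asset path is assumed to exist; this is not a test from the diff):

    import asyncio
    from datasette.app import Datasette

    async def main():
        ds = Datasette(memory=True)
        first = await ds.client.get("/-/static/app.css")
        etag = first.headers["etag"]
        # Second request sends If-None-Match, so the new code path should answer 304
        second = await ds.client.get(
            "/-/static/app.css", headers={"if-none-match": etag}
        )
        print(first.status_code, second.status_code)

    asyncio.run(main())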

View file

@ -76,7 +76,8 @@ async def populate_schema_tables(internal_db, db):
"DELETE FROM catalog_columns WHERE database_name = ?", [database_name]
)
conn.execute(
"DELETE FROM catalog_foreign_keys WHERE database_name = ?", [database_name]
"DELETE FROM catalog_foreign_keys WHERE database_name = ?",
[database_name],
)
conn.execute(
"DELETE FROM catalog_indexes WHERE database_name = ?", [database_name]

View file

@ -1,2 +1,2 @@
__version__ = "1.0a8"
__version__ = "1.0a13"
__version_info__ = tuple(__version__.split("."))

View file

@ -9,8 +9,9 @@ import os
import re
import sqlite_utils
import textwrap
from typing import List
from datasette.events import AlterTableEvent, CreateTableEvent
from datasette.events import AlterTableEvent, CreateTableEvent, InsertRowsEvent
from datasette.database import QueryInterrupted
from datasette.utils import (
add_cors_headers,
@ -256,6 +257,11 @@ class QueryContext:
top_canned_query: callable = field(
metadata={"help": "Callable to render the top_canned_query slot"}
)
query_actions: callable = field(
metadata={
"help": "Callable returning a list of links for the query action menu"
}
)
async def get_tables(datasette, request, db):
@ -694,6 +700,22 @@ class QueryView(View):
)
)
async def query_actions():
query_actions = []
for hook in pm.hook.query_actions(
datasette=datasette,
actor=request.actor,
database=database,
query_name=canned_query["name"] if canned_query else None,
request=request,
sql=sql,
params=params,
):
extra_links = await await_me_maybe(hook)
if extra_links:
query_actions.extend(extra_links)
return query_actions
r = Response.html(
await datasette.render_template(
template,
@ -749,6 +771,7 @@ class QueryView(View):
database=database,
query_name=canned_query["name"] if canned_query else None,
),
query_actions=query_actions,
),
request=request,
view_name="database",
@ -860,7 +883,7 @@ class TableCreateView(BaseView):
if not await self.ds.permission_allowed(
request.actor, "update-row", resource=database_name
):
return _error(["Permission denied - need update-row"], 403)
return _error(["Permission denied: need update-row"], 403)
table_name = data.get("table")
if not table_name:
@ -884,7 +907,7 @@ class TableCreateView(BaseView):
if not await self.ds.permission_allowed(
request.actor, "insert-row", resource=database_name
):
return _error(["Permission denied - need insert-row"], 403)
return _error(["Permission denied: need insert-row"], 403)
alter = False
if rows or row:
@ -897,7 +920,7 @@ class TableCreateView(BaseView):
if not await self.ds.permission_allowed(
request.actor, "alter-table", resource=database_name
):
return _error(["Permission denied - need alter-table"], 403)
return _error(["Permission denied: need alter-table"], 403)
alter = True
if columns:
@ -1022,6 +1045,17 @@ class TableCreateView(BaseView):
request.actor, database=db.name, table=table_name, schema=schema
)
)
if rows:
await self.ds.track_event(
InsertRowsEvent(
request.actor,
database=db.name,
table=table_name,
num_rows=len(rows),
ignore=ignore,
replace=replace,
)
)
return Response.json(details, status=201)

View file

@ -1,6 +1,12 @@
import json
from datasette.utils import add_cors_headers, make_slot_function, CustomJSONEncoder
from datasette.plugins import pm
from datasette.utils import (
add_cors_headers,
await_me_maybe,
make_slot_function,
CustomJSONEncoder,
)
from datasette.utils.asgi import Response
from datasette.version import __version__
@ -131,6 +137,15 @@ class IndexView(BaseView):
headers=headers,
)
else:
homepage_actions = []
for hook in pm.hook.homepage_actions(
datasette=self.ds,
actor=request.actor,
request=request,
):
extra_links = await await_me_maybe(hook)
if extra_links:
homepage_actions.extend(extra_links)
return await self.render(
["index.html"],
request=request,
@ -144,5 +159,6 @@ class IndexView(BaseView):
"top_homepage": make_slot_function(
"top_homepage", self.ds, request
),
"homepage_actions": homepage_actions,
},
)

View file

@ -3,10 +3,12 @@ from datasette.database import QueryInterrupted
from datasette.events import UpdateRowEvent, DeleteRowEvent
from .base import DataView, BaseView, _error
from datasette.utils import (
await_me_maybe,
make_slot_function,
to_css_class,
escape_sqlite,
)
from datasette.plugins import pm
import json
import sqlite_utils
from .table import display_columns_and_rows
@ -55,6 +57,20 @@ class RowView(DataView):
)
for column in display_columns:
column["sortable"] = False
row_actions = []
for hook in pm.hook.row_actions(
datasette=self.ds,
actor=request.actor,
request=request,
database=database,
table=table,
row=rows[0],
):
extra_links = await await_me_maybe(hook)
if extra_links:
row_actions.extend(extra_links)
return {
"private": private,
"foreign_key_tables": await self.foreign_key_tables(
@ -68,6 +84,7 @@ class RowView(DataView):
f"_table-row-{to_css_class(database)}-{to_css_class(table)}.html",
"_table.html",
],
"row_actions": row_actions,
"metadata": (self.ds.metadata("databases") or {})
.get(database, {})
.get("tables", {})

View file

@ -125,14 +125,14 @@ class PermissionsDebugView(BaseView):
{
"permission_checks": list(reversed(self.ds._permission_checks)),
"permissions": [
(
p.name,
p.abbr,
p.description,
p.takes_database,
p.takes_resource,
p.default,
)
{
"name": p.name,
"abbr": p.abbr,
"description": p.description,
"takes_database": p.takes_database,
"takes_resource": p.takes_resource,
"default": p.default,
}
for p in self.ds.permissions.values()
],
},
@ -164,6 +164,7 @@ class PermissionsDebugView(BaseView):
"permission": permission,
"resource": resource,
"result": result,
"default": self.ds.permissions[permission].default,
}
)

View file

@ -485,6 +485,11 @@ class TableInsertView(BaseView):
if upsert and (ignore or replace):
return _error(["Upsert does not support ignore or replace"], 400)
if replace and not await self.ds.permission_allowed(
request.actor, "update-row", resource=(database_name, table_name)
):
return _error(['Permission denied: need update-row to use "replace"'], 403)
initial_schema = None
if alter:
# Must have alter-table permission
@ -1396,22 +1401,28 @@ async def table_view_data(
"Primary keys for this table"
return pks
async def extra_table_actions():
async def table_actions():
async def extra_actions():
async def actions():
links = []
for hook in pm.hook.table_actions(
datasette=datasette,
table=table_name,
database=database_name,
actor=request.actor,
request=request,
):
kwargs = {
"datasette": datasette,
"database": database_name,
"actor": request.actor,
"request": request,
}
if is_view:
kwargs["view"] = table_name
method = pm.hook.view_actions
else:
kwargs["table"] = table_name
method = pm.hook.table_actions
for hook in method(**kwargs):
extra_links = await await_me_maybe(hook)
if extra_links:
links.extend(extra_links)
return links
return table_actions
return actions
async def extra_is_view():
return is_view
@ -1601,7 +1612,7 @@ async def table_view_data(
"database",
"table",
"database_color",
"table_actions",
"actions",
"filters",
"renderers",
"custom_table_templates",
@ -1642,7 +1653,7 @@ async def table_view_data(
extra_database,
extra_table,
extra_database_color,
extra_table_actions,
extra_actions,
extra_filters,
extra_renderers,
extra_custom_table_templates,

View file

@ -20,4 +20,4 @@ help:
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
livehtml:
sphinx-autobuild -b html "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(0)
sphinx-autobuild -b html --watch ../datasette "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(0)

View file

@ -71,6 +71,23 @@ Datasette's built-in view permissions (``view-database``, ``view-table`` etc) de
Permissions with potentially harmful effects should default to *deny*. Plugin authors should account for this when designing new plugins - for example, the `datasette-upload-csvs <https://github.com/simonw/datasette-upload-csvs>`__ plugin defaults to deny so that installations don't accidentally allow unauthenticated users to create new tables by uploading a CSV file.
.. _authentication_permissions_explained:
How permissions are resolved
----------------------------
The :ref:`datasette.permission_allowed(actor, action, resource=None, default=...)<datasette_permission_allowed>` method is called to check if an actor is allowed to perform a specific action.
This method asks every plugin that implements the :ref:`plugin_hook_permission_allowed` hook if the actor is allowed to perform the action.
Each plugin can return ``True`` to indicate that the actor is allowed to perform the action, ``False`` if they are not allowed and ``None`` if the plugin has no opinion on the matter.
``False`` acts as a veto - if any plugin returns ``False`` then the permission check is denied. Otherwise, if any plugin returns ``True`` then the permission check is allowed.
The ``resource`` argument can be used to specify a specific resource that the action is being performed against. Some permissions, such as ``view-instance``, do not involve a resource. Others such as ``view-database`` have a resource that is a string naming the database. Permissions that take both a database name and the name of a table, view or canned query within that database use a resource that is a tuple of two strings, ``(database_name, resource_name)``.
Plugins that implement the ``permission_allowed()`` hook can decide if they are going to consider the provided resource or not.
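
As a concrete illustration (a minimal sketch, not a built-in plugin), a plugin could veto ``delete-row`` for everyone except an ``"admin"`` actor while expressing no opinion about any other action::

    from datasette import hookimpl


    @hookimpl
    def permission_allowed(actor, action, resource):
        if action != "delete-row":
            return None  # No opinion - defer to other plugins or the default
        if actor and actor.get("id") == "admin":
            return True
        return False  # A veto: any False wins over any True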
.. _authentication_permissions_allow:
Defining permissions with "allow" blocks

View file

@ -4,6 +4,98 @@
Changelog
=========
.. _v1_0_a13:
1.0a13 (2024-03-12)
-------------------
Each of the key concepts in Datasette now has an :ref:`actions menu <plugin_actions>`, which plugins can use to add additional functionality targeting that entity.
- Plugin hook: :ref:`view_actions() <plugin_hook_view_actions>` for actions that can be applied to a SQL view. (:issue:`2297`)
- Plugin hook: :ref:`homepage_actions() <plugin_hook_homepage_actions>` for actions that apply to the instance homepage. (:issue:`2298`)
- Plugin hook: :ref:`row_actions() <plugin_hook_row_actions>` for actions that apply to the row page. (:issue:`2299`)
- Action menu items for all of the ``*_actions()`` plugin hooks can now return an optional ``"description"`` key, which will be displayed in the menu below the action label. (:issue:`2294`)
- :ref:`Plugin hooks <plugin_hooks>` documentation page is now organized with additional headings. (:issue:`2300`)
- Improved the display of action buttons on pages that also display metadata. (:issue:`2286`)
- The header and footer of the page now uses a subtle gradient effect, and options in the navigation menu are better visually defined. (:issue:`2302`)
- Table names that start with an underscore now default to hidden. (:issue:`2104`)
- ``pragma_table_list`` has been added to the allow-list of SQLite pragma functions supported by Datasette. ``select * from pragma_table_list()`` is no longer blocked. (`#2104 <https://github.com/simonw/datasette/issues/2104#issuecomment-1982352475>`__)
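
As an illustration of the new ``row_actions()`` hook and the optional ``"description"`` key described above, a plugin implementation might look like this (a minimal sketch; the URL and labels are invented)::

    from datasette import hookimpl


    @hookimpl
    def row_actions(datasette, actor, database, table, row):
        if actor is None:
            return []
        return [
            {
                "href": datasette.urls.path("/-/hypothetical-row-tool"),
                "label": "Send row to tool",
                "description": "Run the hypothetical tool against just this row",
            }
        ]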
.. _v1_0_a12:
1.0a12 (2024-02-29)
-------------------
- New :ref:`query_actions() <plugin_hook_query_actions>` plugin hook, similar to :ref:`table_actions() <plugin_hook_table_actions>` and :ref:`database_actions() <plugin_hook_database_actions>`. Can be used to add a menu of actions to the canned query or arbitrary SQL query page. (:issue:`2283`)
- New design for the button that opens the query, table and database actions menu. (:issue:`2281`)
- "does not contain" table filter for finding rows that do not contain a string. (:issue:`2287`)
- Fixed a bug in the :ref:`javascript_plugins_makeColumnActions` JavaScript plugin mechanism where the column action menu was not fully reset in between each interaction. (:issue:`2289`)
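
The new "does not contain" filter uses the same ``column__lookup=value`` query string convention as the existing table filters. A sketch using the internal client (the database, table and column names are invented)::

    # Rows from the hypothetical "pets" table whose "name" does not contain "cat"
    response = await datasette.client.get(
        "/mydata/pets.json?name__notcontains=cat"
    )
    rows = response.json()["rows"]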
.. _v1_0_a11:
1.0a11 (2024-02-19)
-------------------
- The ``"replace": true`` argument to the ``/db/table/-/insert`` API now requires the actor to have the ``update-row`` permission. (:issue:`2279`)
- Fixed some UI bugs in the interactive permissions debugging tool. (:issue:`2278`)
- The column action menu now aligns better with the cog icon, and positions itself taking into account the width of the browser window. (:issue:`2263`)
.. _v1_0_a10:
1.0a10 (2024-02-17)
-------------------
The only changes in this alpha correspond to the way Datasette handles database transactions. (:issue:`2277`)
- The :ref:`database.execute_write_fn() <database_execute_write_fn>` method has a new ``transaction=True`` parameter. This defaults to ``True`` which means all functions executed using this method are now automatically wrapped in a transaction - previously functions needed to roll their own transaction handling, and many did not.
- Pass ``transaction=False`` to ``execute_write_fn()`` if you want to manually handle transactions in your function.
- Several internal Datasette features, including parts of the :ref:`JSON write API <json_api_write>`, had been failing to wrap their operations in a transaction. This has been fixed by the new ``transaction=True`` default.
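A minimal sketch of opting out of the automatic transaction, assuming ``db`` is a Datasette ``Database`` object, a hypothetical ``accounts`` table, and that the call happens inside async code:

.. code-block:: python

    def transfer_balance(conn):
        # Manage the transaction manually via the connection context manager
        with conn:
            conn.execute("update accounts set balance = balance - 100 where id = 1")
            conn.execute("update accounts set balance = balance + 100 where id = 2")


    await db.execute_write_fn(transfer_balance, transaction=False)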
.. _v1_0_a9:
1.0a9 (2024-02-16)
------------------
This alpha release adds basic alter table support to the Datasette Write API and fixes a permissions bug relating to the ``/upsert`` API endpoint.
Alter table support for create, insert, upsert and update
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The :ref:`JSON write API <json_api_write>` can now be used to apply simple alter table schema changes, provided the acting actor has the new :ref:`permissions_alter_table` permission. (:issue:`2101`)
The only alter operation supported so far is adding new columns to an existing table.
* When the :ref:`/db/-/create <TableCreateView>` API creates a table based on incoming example ``"rows"``, it now adds new columns during large operations if later rows include columns that were not present in earlier batches. This requires the ``create-table`` but not the ``alter-table`` permission.
* When ``/db/-/create`` is called with rows in a situation where the table may have been already created, an ``"alter": true`` key can be included to indicate that any missing columns from the new rows should be added to the table. This requires the ``alter-table`` permission.
* :ref:`/db/table/-/insert <TableInsertView>` and :ref:`/db/table/-/upsert <TableUpsertView>` and :ref:`/db/table/row-pks/-/update <RowUpdateView>` all now also accept ``"alter": true``, depending on the ``alter-table`` permission.
Operations that alter a table now fire the new :ref:`alter-table event <events>`.
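For example, a hedged sketch of an insert that adds a previously missing column using ``"alter": true`` - the URL, token and column names are illustrative rather than taken from a real instance:

.. code-block:: python

    import httpx

    response = httpx.post(
        "http://127.0.0.1:8001/data/docs/-/insert",
        json={
            "rows": [{"title": "A row", "new_column": "added via alter"}],
            "alter": True,
        },
        headers={"Authorization": "Bearer YOUR_API_TOKEN"},
    )
    print(response.status_code, response.json())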
Permissions fix for the upsert API
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The :ref:`/database/table/-/upsert API <TableUpsertView>` had a minor permissions bug, only affecting Datasette instances that had configured the ``insert-row`` and ``update-row`` permissions to apply to a specific table rather than the database or instance as a whole. Full details in issue :issue:`2262`.
To avoid similar mistakes in the future the :ref:`datasette.permission_allowed() <datasette_permission_allowed>` method now specifies ``default=`` as a keyword-only argument.
Permission checks now consider opinions from every plugin
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The :ref:`datasette.permission_allowed() <datasette_permission_allowed>` method previously consulted every plugin that implemented the :ref:`permission_allowed() <plugin_hook_permission_allowed>` plugin hook and obeyed the opinion of the last plugin to return a value. (:issue:`2275`)
Datasette now consults every plugin and checks to see if any of them returned ``False`` (the veto rule), and if none of them did, it then checks to see if any of them returned ``True``.
This is explained at length in the new documentation covering :ref:`authentication_permissions_explained`.
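The combining rule is roughly equivalent to this sketch - illustrative only, not Datasette's actual implementation:

.. code-block:: python

    def resolve_permission(plugin_opinions, default=False):
        # plugin_opinions is a list of True / False / None values from plugins
        if any(opinion is False for opinion in plugin_opinions):
            return False  # a single veto denies the permission
        if any(opinion is True for opinion in plugin_opinions):
            return True
        return default  # no opinions: fall back to the permission's default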
Other changes
~~~~~~~~~~~~~
- The new :ref:`DATASETTE_TRACE_PLUGINS=1 environment variable <writing_plugins_tracing>` turns on detailed trace output for every executed plugin hook, useful for debugging and understanding how the plugin system works at a low level. (:issue:`2274`)
- Datasette on Python 3.9 or above marks its non-cryptographic uses of the MD5 hash function as ``usedforsecurity=False``, for compatibility with FIPS systems. (:issue:`2270`)
- SQL relating to :ref:`internals_internal` now executes inside a transaction, avoiding a potential database locked error. (:issue:`2273`)
- The ``/-/threads`` debug page now identifies the database in the name associated with each dedicated write thread. (:issue:`2265`)
- The ``/db/-/create`` API now fires an ``insert-rows`` event if rows were inserted after the table was created. (:issue:`2260`)
.. _v1_0_a8:
1.0a8 (2024-02-07)

View file

@ -1,2 +1,5 @@
ro
alls
fo
ro
te
ths

View file

@ -254,6 +254,7 @@ Datasette releases are performed using tags. When a new release is published on
* Re-point the "latest" tag on Docker Hub to the new image
* Build a wheel bundle of the underlying Python source code
* Push that new wheel up to PyPI: https://pypi.org/project/datasette/
* If the release is an alpha, navigate to https://readthedocs.org/projects/datasette/versions/ and search for the tag name in the "Activate a version" filter, then mark that version as "active" to ensure it will appear on the public ReadTheDocs documentation site.
To deploy new releases you will need to have push access to the main Datasette GitHub repository.

View file

@ -1010,7 +1010,9 @@ You can pass additional SQL parameters as a tuple or dictionary.
The method will block until the operation is completed, and the return value will be the return from calling ``conn.execute(...)`` using the underlying ``sqlite3`` Python library.
If you pass ``block=False`` this behaviour changes to "fire and forget" - queries will be added to the write queue and executed in a separate thread while your code can continue to do other things. The method will return a UUID representing the queued task.
If you pass ``block=False`` this behavior changes to "fire and forget" - queries will be added to the write queue and executed in a separate thread while your code can continue to do other things. The method will return a UUID representing the queued task.
Each call to ``execute_write()`` will be executed inside a transaction.
.. _database_execute_write_script:
@ -1019,6 +1021,8 @@ await db.execute_write_script(sql, block=True)
Like ``execute_write()`` but can be used to send multiple SQL statements in a single string separated by semicolons, using the ``sqlite3`` `conn.executescript() <https://docs.python.org/3/library/sqlite3.html#sqlite3.Cursor.executescript>`__ method.
Each call to ``execute_write_script()`` will be executed inside a transaction.
.. _database_execute_write_many:
await db.execute_write_many(sql, params_seq, block=True)
@ -1033,10 +1037,12 @@ Like ``execute_write()`` but uses the ``sqlite3`` `conn.executemany() <https://d
[(1, "Melanie"), (2, "Selma"), (2, "Viktor")],
)
Each call to ``execute_write_many()`` will be executed inside a transaction.
.. _database_execute_write_fn:
await db.execute_write_fn(fn, block=True)
-----------------------------------------
await db.execute_write_fn(fn, block=True, transaction=True)
-----------------------------------------------------------
This method works like ``.execute_write()``, but instead of a SQL statement you give it a callable Python function. Your function will be queued up and then called when the write connection is available, passing that connection as the argument to the function.
@ -1052,7 +1058,6 @@ For example:
def delete_and_return_count(conn):
conn.execute("delete from some_table where id > 5")
conn.commit()
return conn.execute(
"select count(*) from some_table"
).fetchone()[0]
@ -1069,7 +1074,7 @@ The value returned from ``await database.execute_write_fn(...)`` will be the ret
If your function raises an exception that exception will be propagated up to the ``await`` line.
If you see ``OperationalError: database table is locked`` errors you should check that you remembered to explicitly call ``conn.commit()`` in your write function.
By default your function will be executed inside a transaction. You can pass ``transaction=False`` to disable this behavior, though if you do that you should be careful to manually apply transactions - ideally using the ``with conn:`` pattern, or you may see ``OperationalError: database table is locked`` errors.
If you specify ``block=False`` the method becomes fire-and-forget, queueing your function to be executed and then allowing your code after the call to ``.execute_write_fn()`` to continue running while the underlying thread waits for an opportunity to run your function. A UUID representing the queued task will be returned. Any exceptions in your code will be silently swallowed.
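A minimal fire-and-forget sketch, assuming a hypothetical ``logs`` table and that ``db`` is available inside async code:

.. code-block:: python

    def purge_old_logs(conn):
        conn.execute("delete from logs where created < date('now', '-30 days')")


    task_id = await db.execute_write_fn(purge_old_logs, block=False)
    # task_id is a UUID for the queued task; exceptions raised by the
    # function are silently swallowed in this mode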
@ -1251,6 +1256,15 @@ Utility function for calling ``await`` on a return value if it is awaitable, oth
.. autofunction:: datasette.utils.await_me_maybe
.. _internals_utils_derive_named_parameters:
derive_named_parameters(db, sql)
--------------------------------
Derive the list of named parameters referenced in a SQL query, using an ``explain`` query executed against the provided database.
.. autofunction:: datasette.utils.derive_named_parameters
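A short usage sketch, assuming ``db`` is a Datasette database object and the call is made from async code - the table and column names are illustrative:

.. code-block:: python

    from datasette.utils import derive_named_parameters

    params = await derive_named_parameters(
        db, "select * from facetable where state = :state and on_earth = :on_earth"
    )
    # params is now ["state", "on_earth"]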
.. _internals_tilde_encoding:
Tilde encoding
@ -1281,6 +1295,13 @@ Note that the space character is a special case: it will be replaced with a ``+`
.. _internals_tracer:
JSON encoding
-------------
.. _internals_utils_CustomJSONEncoder:
.. autoclass:: datasette.utils.CustomJSONEncoder
datasette.tracer
================

View file

@ -237,6 +237,9 @@ You can filter the data returned by the table based on column values using a que
``?column__contains=value``
Rows where the string column contains the specified value (``column like "%value%"`` in SQL).
``?column__notcontains=value``
Rows where the string column does not contain the specified value (``column not like "%value%"`` in SQL).
``?column__endswith=value``
Rows where the string column ends with the specified value (``column like "%value"`` in SQL).
@ -565,6 +568,8 @@ To insert multiple rows at a time, use the same API method but send a list of di
If successful, this will return a ``201`` status code and a ``{"ok": true}`` response body.
The maximum number of rows that can be submitted at once defaults to 100, but this can be changed using the :ref:`setting_max_insert_rows` setting.
To return the newly inserted rows, add the ``"return": true`` key to the request body:
.. code-block:: json
@ -616,7 +621,7 @@ Pass ``"ignore": true`` to ignore these errors and insert the other rows:
"ignore": true
}
Or you can pass ``"replace": true`` to replace any rows with conflicting primary keys with the new values.
Or you can pass ``"replace": true`` to replace any rows with conflicting primary keys with the new values. This requires the :ref:`permissions_update_row` permission.
Pass ``"alter: true`` to automatically add any missing columns to the table. This requires the :ref:`permissions_alter_table` permission.
@ -854,7 +859,7 @@ The JSON here describes the table that will be created:
* ``pks`` can be used instead of ``pk`` to create a compound primary key. It should be a JSON list of column names to use in that primary key.
* ``ignore`` can be set to ``true`` to ignore existing rows by primary key if the table already exists.
* ``replace`` can be set to ``true`` to replace existing rows by primary key if the table already exists.
* ``replace`` can be set to ``true`` to replace existing rows by primary key if the table already exists. This requires the :ref:`permissions_update_row` permission.
* ``alter`` can be set to ``true`` if you want to automatically add any missing columns to the table. This requires the :ref:`permissions_alter_table` permission.
If the table is successfully created this will return a ``201`` status code and the following response:

View file

@ -40,6 +40,21 @@ The JSON version of this page provides programmatic access to the underlying dat
* `fivethirtyeight.datasettes.com/fivethirtyeight.json <https://fivethirtyeight.datasettes.com/fivethirtyeight.json>`_
* `global-power-plants.datasettes.com/global-power-plants.json <https://global-power-plants.datasettes.com/global-power-plants.json>`_
.. _DatabaseView_hidden:
Hidden tables
-------------
Some tables listed on the database page are treated as hidden. Hidden tables are not completely invisible - they can be accessed through the "hidden tables" link at the bottom of the page. They are hidden because they represent low-level implementation details which are generally not useful to end-users of Datasette.
The following tables are hidden by default:
- Any table with a name that starts with an underscore - this is a Datasette convention to help plugins easily hide their own internal tables.
- Tables that have been configured as ``"hidden": true`` using :ref:`metadata_hiding_tables`.
- ``*_fts`` tables that implement SQLite full-text search indexes.
- Tables relating to the inner workings of the SpatiaLite SQLite extension.
- ``sqlite_stat`` tables used to store statistics used by the query optimizer.
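Hidden tables can still be retrieved programmatically. This sketch assumes a local instance serving a ``fixtures`` database:

.. code-block:: python

    import httpx

    data = httpx.get("http://127.0.0.1:8001/fixtures.json?_show_hidden=1").json()
    for table in data["tables"]:
        print(table["name"], "(hidden)" if table["hidden"] else "")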
.. _TableView:
Table

View file

@ -92,10 +92,17 @@ This function can return an awaitable function if it needs to run any async code
Examples: `datasette-edit-templates <https://datasette.io/plugins/datasette-edit-templates>`_
.. _plugin_page_extras:
Page extras
-----------
These plugin hooks can be used to affect the way HTML pages for different Datasette interfaces are rendered.
.. _plugin_hook_extra_template_vars:
extra_template_vars(template, database, table, columns, view_name, request, datasette)
--------------------------------------------------------------------------------------
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Extra template variables that should be made available in the rendered template context.
@ -184,7 +191,7 @@ Examples: `datasette-search-all <https://datasette.io/plugins/datasette-search-a
.. _plugin_hook_extra_css_urls:
extra_css_urls(template, database, table, columns, view_name, request, datasette)
---------------------------------------------------------------------------------
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This takes the same arguments as :ref:`extra_template_vars(...) <plugin_hook_extra_template_vars>`
@ -238,7 +245,7 @@ Examples: `datasette-cluster-map <https://datasette.io/plugins/datasette-cluster
.. _plugin_hook_extra_js_urls:
extra_js_urls(template, database, table, columns, view_name, request, datasette)
--------------------------------------------------------------------------------
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This takes the same arguments as :ref:`extra_template_vars(...) <plugin_hook_extra_template_vars>`
@ -288,7 +295,7 @@ Examples: `datasette-cluster-map <https://datasette.io/plugins/datasette-cluster
.. _plugin_hook_extra_body_script:
extra_body_script(template, database, table, columns, view_name, request, datasette)
------------------------------------------------------------------------------------
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Extra JavaScript to be added to a ``<script>`` block at the end of the ``<body>`` element on the page.
@ -1430,147 +1437,6 @@ This example logs an error to `Sentry <https://sentry.io/>`__ and then renders a
Example: `datasette-sentry <https://datasette.io/plugins/datasette-sentry>`_
.. _plugin_hook_menu_links:
menu_links(datasette, actor, request)
-------------------------------------
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
``actor`` - dictionary or None
The currently authenticated :ref:`actor <authentication_actor>`.
``request`` - :ref:`internals_request` or None
The current HTTP request. This can be ``None`` if the request object is not available.
This hook allows additional items to be included in the menu displayed by Datasette's top right menu icon.
The hook should return a list of ``{"href": "...", "label": "..."}`` menu items. These will be added to the menu.
It can alternatively return an ``async def`` awaitable function which returns a list of menu items.
This example adds a new menu item but only if the signed in user is ``"root"``:
.. code-block:: python
from datasette import hookimpl
@hookimpl
def menu_links(datasette, actor):
if actor and actor.get("id") == "root":
return [
{
"href": datasette.urls.path(
"/-/edit-schema"
),
"label": "Edit schema",
},
]
Using :ref:`internals_datasette_urls` here ensures that links in the menu will take the :ref:`setting_base_url` setting into account.
Examples: `datasette-search-all <https://datasette.io/plugins/datasette-search-all>`_, `datasette-graphql <https://datasette.io/plugins/datasette-graphql>`_
.. _plugin_hook_table_actions:
table_actions(datasette, actor, database, table, request)
---------------------------------------------------------
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
``actor`` - dictionary or None
The currently authenticated :ref:`actor <authentication_actor>`.
``database`` - string
The name of the database.
``table`` - string
The name of the table.
``request`` - :ref:`internals_request` or None
The current HTTP request. This can be ``None`` if the request object is not available.
This hook allows table actions to be displayed in a menu accessed via an action icon at the top of the table page. It should return a list of ``{"href": "...", "label": "..."}`` menu items.
It can alternatively return an ``async def`` awaitable function which returns a list of menu items.
This example adds a new table action if the signed in user is ``"root"``:
.. code-block:: python
from datasette import hookimpl
@hookimpl
def table_actions(datasette, actor, database, table):
if actor and actor.get("id") == "root":
return [
{
"href": datasette.urls.path(
"/-/edit-schema/{}/{}".format(
database, table
)
),
"label": "Edit schema for this table",
}
]
Example: `datasette-graphql <https://datasette.io/plugins/datasette-graphql>`_
.. _plugin_hook_database_actions:
database_actions(datasette, actor, database, request)
-----------------------------------------------------
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
``actor`` - dictionary or None
The currently authenticated :ref:`actor <authentication_actor>`.
``database`` - string
The name of the database.
``request`` - :ref:`internals_request`
The current HTTP request.
This hook is similar to :ref:`plugin_hook_table_actions` but populates an actions menu on the database page.
This example adds a new database action for creating a table, if the user has the ``edit-schema`` permission:
.. code-block:: python
from datasette import hookimpl
@hookimpl
def database_actions(datasette, actor, database):
async def inner():
if not await datasette.permission_allowed(
actor,
"edit-schema",
resource=database,
default=False,
):
return []
return [
{
"href": datasette.urls.path(
"/-/edit-schema/{}/-/create".format(
database
)
),
"label": "Create a table",
}
]
return inner
Example: `datasette-graphql <https://datasette.io/plugins/datasette-graphql>`_, `datasette-edit-schema <https://datasette.io/plugins/datasette-edit-schema>`_
.. _plugin_hook_skip_csrf:
skip_csrf(datasette, scope)
@ -1641,6 +1507,316 @@ This hook is responsible for returning a dictionary corresponding to Datasette :
Example: `datasette-remote-metadata plugin <https://datasette.io/plugins/datasette-remote-metadata>`__
.. _plugin_hook_menu_links:
menu_links(datasette, actor, request)
-------------------------------------
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
``actor`` - dictionary or None
The currently authenticated :ref:`actor <authentication_actor>`.
``request`` - :ref:`internals_request` or None
The current HTTP request. This can be ``None`` if the request object is not available.
This hook allows additional items to be included in the menu displayed by Datasette's top right menu icon.
The hook should return a list of ``{"href": "...", "label": "..."}`` menu items. These will be added to the menu.
It can alternatively return an ``async def`` awaitable function which returns a list of menu items.
This example adds a new menu item but only if the signed in user is ``"root"``:
.. code-block:: python
from datasette import hookimpl
@hookimpl
def menu_links(datasette, actor):
if actor and actor.get("id") == "root":
return [
{
"href": datasette.urls.path(
"/-/edit-schema"
),
"label": "Edit schema",
},
]
Using :ref:`internals_datasette_urls` here ensures that links in the menu will take the :ref:`setting_base_url` setting into account.
Examples: `datasette-search-all <https://datasette.io/plugins/datasette-search-all>`_, `datasette-graphql <https://datasette.io/plugins/datasette-graphql>`_
.. _plugin_actions:
Action hooks
------------
Action hooks can be used to add items to the action menus that appear at the top of different pages within Datasette. Unlike :ref:`menu_links() <plugin_hook_menu_links>`, which adds items that are displayed on every page, actions should only be relevant to the page the user is currently viewing.
Each of these hooks should return a list of ``{"href": "...", "label": "..."}`` menu items, with optional ``"description": "..."`` keys describing each action in more detail.
They can alternatively return an ``async def`` awaitable function which, when called, returns a list of those menu items.
.. _plugin_hook_table_actions:
table_actions(datasette, actor, database, table, request)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
``actor`` - dictionary or None
The currently authenticated :ref:`actor <authentication_actor>`.
``database`` - string
The name of the database.
``table`` - string
The name of the table.
``request`` - :ref:`internals_request` or None
The current HTTP request. This can be ``None`` if the request object is not available.
This example adds a new table action if the signed in user is ``"root"``:
.. code-block:: python
from datasette import hookimpl
@hookimpl
def table_actions(datasette, actor, database, table):
if actor and actor.get("id") == "root":
return [
{
"href": datasette.urls.path(
"/-/edit-schema/{}/{}".format(
database, table
)
),
"label": "Edit schema for this table",
"description": "Add, remove, rename or alter columns for this table.",
}
]
Example: `datasette-graphql <https://datasette.io/plugins/datasette-graphql>`_
.. _plugin_hook_view_actions:
view_actions(datasette, actor, database, view, request)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
``actor`` - dictionary or None
The currently authenticated :ref:`actor <authentication_actor>`.
``database`` - string
The name of the database.
``view`` - string
The name of the SQL view.
``request`` - :ref:`internals_request` or None
The current HTTP request. This can be ``None`` if the request object is not available.
Like :ref:`plugin_hook_table_actions` but for SQL views.
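A minimal sketch following the same pattern as ``table_actions()`` - the action it adds is illustrative:

.. code-block:: python

    from datasette import hookimpl


    @hookimpl
    def view_actions(datasette, actor, database, view):
        if actor and actor.get("id") == "root":
            return [
                {
                    "href": datasette.urls.database(database),
                    "label": f"Inspect view: {view}",
                    "description": "Illustrative action, shown only to the root actor.",
                }
            ]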
.. _plugin_hook_query_actions:
query_actions(datasette, actor, database, query_name, request, sql, params)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
``actor`` - dictionary or None
The currently authenticated :ref:`actor <authentication_actor>`.
``database`` - string
The name of the database.
``query_name`` - string or None
The name of the canned query, or ``None`` if this is an arbitrary SQL query.
``request`` - :ref:`internals_request`
The current HTTP request.
``sql`` - string
The SQL query being executed.
``params`` - dictionary
The parameters passed to the SQL query, if any.
Populates a "Query actions" menu on the canned query and arbitrary SQL query pages.
This example adds a new query action linking to a page for explaining a query:
.. code-block:: python
from datasette import hookimpl
import urllib.parse
@hookimpl
def query_actions(datasette, database, query_name, sql):
# Don't explain an explain
if sql.lower().startswith("explain"):
return
return [
{
"href": datasette.urls.database(database)
+ "?"
+ urllib.parse.urlencode(
{
"sql": "explain " + sql,
}
),
"label": "Explain this query",
"description": "Get a summary of how SQLite executes the query",
},
]
Example: `datasette-create-view <https://datasette.io/plugins/datasette-create-view>`_
.. _plugin_hook_row_actions:
row_actions(datasette, actor, request, database, table, row)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
``actor`` - dictionary or None
The currently authenticated :ref:`actor <authentication_actor>`.
``request`` - :ref:`internals_request` or None
The current HTTP request.
``database`` - string
The name of the database.
``table`` - string
The name of the table.
``row`` - ``sqlite3.Row``
The SQLite row object being displayed on the page.
Return links for the "Row actions" menu shown at the top of the row page.
This example displays the row in JSON plus some additional debug information if the user is signed in:
.. code-block:: python
from datasette import hookimpl
import json
@hookimpl
def row_actions(datasette, database, table, actor, row):
if actor:
return [
{
"href": datasette.urls.instance(),
"label": f"Row details for {actor['id']}",
"description": json.dumps(
dict(row), default=repr
),
},
]
Example: `datasette-enrichments <https://datasette.io/plugins/datasette-enrichments>`_
.. _plugin_hook_database_actions:
database_actions(datasette, actor, database, request)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
``actor`` - dictionary or None
The currently authenticated :ref:`actor <authentication_actor>`.
``database`` - string
The name of the database.
``request`` - :ref:`internals_request`
The current HTTP request.
Populates an actions menu on the database page.
This example adds a new database action for creating a table, if the user has the ``edit-schema`` permission:
.. code-block:: python
from datasette import hookimpl
@hookimpl
def database_actions(datasette, actor, database):
async def inner():
if not await datasette.permission_allowed(
actor,
"edit-schema",
resource=database,
default=False,
):
return []
return [
{
"href": datasette.urls.path(
"/-/edit-schema/{}/-/create".format(
database
)
),
"label": "Create a table",
}
]
return inner
Example: `datasette-graphql <https://datasette.io/plugins/datasette-graphql>`_, `datasette-edit-schema <https://datasette.io/plugins/datasette-edit-schema>`_
.. _plugin_hook_homepage_actions:
homepage_actions(datasette, actor, request)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
``actor`` - dictionary or None
The currently authenticated :ref:`actor <authentication_actor>`.
``request`` - :ref:`internals_request`
The current HTTP request.
Populates an actions menu on the top-level index homepage of the Datasette instance.
This example adds a link to an imagined tool for editing the homepage, shown only to signed-in users:
.. code-block:: python
from datasette import hookimpl
@hookimpl
def homepage_actions(datasette, actor):
if actor:
return [
{
"href": datasette.urls.path(
"/-/customize-homepage"
),
"label": "Customize homepage",
}
]
.. _plugin_hook_slots:
Template slots

View file

@ -7,6 +7,30 @@ You can write one-off plugins that apply to just one Datasette instance, or you
Want to start by looking at an example? The `Datasette plugins directory <https://datasette.io/plugins>`__ lists more than 90 open source plugins with code you can explore. The :ref:`plugin hooks <plugin_hooks>` page includes links to example plugins for each of the documented hooks.
.. _writing_plugins_tracing:
Tracing plugin hooks
--------------------
The ``DATASETTE_TRACE_PLUGINS`` environment variable turns on detailed tracing showing exactly which hooks are being run. This can be useful for understanding how Datasette is using your plugin.
.. code-block:: bash
DATASETTE_TRACE_PLUGINS=1 datasette mydb.db
Example output::
actor_from_request:
{ 'datasette': <datasette.app.Datasette object at 0x100bc7220>,
'request': <asgi.Request method="GET" url="http://127.0.0.1:4433/">}
Hook implementations:
[ <HookImpl plugin_name='codespaces', plugin=<module 'datasette_codespaces' from '.../site-packages/datasette_codespaces/__init__.py'>>,
<HookImpl plugin_name='datasette.actor_auth_cookie', plugin=<module 'datasette.actor_auth_cookie' from '.../datasette/datasette/actor_auth_cookie.py'>>,
<HookImpl plugin_name='datasette.default_permissions', plugin=<module 'datasette.default_permissions' from '.../datasette/default_permissions.py'>>]
Results:
[{'id': 'root'}]
.. _writing_plugins_one_off:
Writing one-off plugins

View file

@ -84,7 +84,7 @@ setup(
"pytest-xdist>=2.2.1",
"pytest-asyncio>=0.17",
"beautifulsoup4>=4.8.1",
"black==24.1.1",
"black==24.2.0",
"blacken-docs==1.16.0",
"pytest-timeout>=1.4.2",
"trustme>=0.7",

View file

@ -42,18 +42,22 @@ EXPECTED_PLUGINS = [
"extra_js_urls",
"extra_template_vars",
"forbidden",
"homepage_actions",
"menu_links",
"permission_allowed",
"prepare_connection",
"prepare_jinja2_environment",
"query_actions",
"register_facet_classes",
"register_magic_parameters",
"register_permissions",
"register_routes",
"render_cell",
"row_actions",
"skip_csrf",
"startup",
"table_actions",
"view_actions",
],
},
{

View file

@ -7,6 +7,7 @@ from datasette.utils.asgi import asgi_send_json, Response
import base64
import pint
import json
import urllib
ureg = pint.UnitRegistry()
@ -390,6 +391,50 @@ def table_actions(datasette, database, table, actor):
]
@hookimpl
def view_actions(datasette, database, view, actor):
if actor:
return [
{
"href": datasette.urls.instance(),
"label": f"Database: {database}",
},
{"href": datasette.urls.instance(), "label": f"View: {view}"},
]
@hookimpl
def query_actions(datasette, database, query_name, sql):
# Don't explain an explain
if sql.lower().startswith("explain"):
return
return [
{
"href": datasette.urls.database(database)
+ "?"
+ urllib.parse.urlencode(
{
"sql": "explain " + sql,
}
),
"label": "Explain this query",
"description": "Runs a SQLite explain",
},
]
@hookimpl
def row_actions(datasette, database, table, actor, row):
if actor:
return [
{
"href": datasette.urls.instance(),
"label": f"Row details for {actor['id']}",
"description": json.dumps(dict(row), default=repr),
},
]
@hookimpl
def database_actions(datasette, database, actor, request):
if actor:
@ -404,6 +449,18 @@ def database_actions(datasette, database, actor, request):
]
@hookimpl
def homepage_actions(datasette, actor, request):
if actor:
label = f"Custom homepage for: {actor['id']}"
return [
{
"href": datasette.urls.path("/-/custom-homepage"),
"label": label,
}
]
@hookimpl
def skip_csrf(scope):
return scope["path"] == "/skip-csrf"

View file

@ -1,5 +1,5 @@
#!/bin/bash
# This should only run in environemnts where both
# This should only run in environments where both
# datasette-init and datasette-json-html are installed
PLUGINS=$(datasette plugins)

View file

@ -1018,6 +1018,21 @@ async def test_hidden_sqlite_stat1_table():
)
@pytest.mark.asyncio
async def test_hide_tables_starting_with_underscore():
ds = Datasette()
db = ds.add_memory_database("test_hide_tables_starting_with_underscore")
await db.execute_write("create table normal (id integer primary key, name text)")
await db.execute_write("create table _hidden (id integer primary key, name text)")
data = (
await ds.client.get(
"/test_hide_tables_starting_with_underscore.json?_show_hidden=1"
)
).json()
tables = [(t["name"], t["hidden"]) for t in data["tables"]]
assert tables == [("normal", False), ("_hidden", True)]
@pytest.mark.asyncio
@pytest.mark.parametrize("db_name", ("foo", r"fo%o", "f~/c.d"))
async def test_tilde_encoded_database_names(db_name):

View file

@ -221,6 +221,14 @@ async def test_insert_rows(ds_write, return_rows):
400,
['Cannot use "ignore" and "replace" at the same time'],
),
(
# Replace is not allowed if you don't have update-row
"/data/docs/-/insert",
{"rows": [{"title": "Test"}], "replace": True},
"insert-but-not-update",
403,
['Permission denied: need update-row to use "replace"'],
),
(
"/data/docs/-/insert",
{"rows": [{"title": "Test"}], "invalid_param": True},
@ -365,6 +373,41 @@ async def test_insert_or_upsert_row_errors(
assert before_count == after_count
@pytest.mark.asyncio
@pytest.mark.parametrize("allowed", (True, False))
async def test_upsert_permissions_per_table(ds_write, allowed):
# https://github.com/simonw/datasette/issues/2262
token = "dstok_{}".format(
ds_write.sign(
{
"a": "root",
"token": "dstok",
"t": int(time.time()),
"_r": {
"r": {
"data": {
"docs" if allowed else "other": ["ir", "ur"],
}
}
},
},
namespace="token",
)
)
response = await ds_write.client.post(
"/data/docs/-/upsert",
json={"rows": [{"id": 1, "title": "One"}]},
headers={
"Authorization": "Bearer {}".format(token),
},
)
if allowed:
assert response.status_code == 200
assert response.json()["ok"] is True
else:
assert response.status_code == 403
@pytest.mark.asyncio
@pytest.mark.parametrize(
"ignore,replace,expected_rows",
@ -822,13 +865,14 @@ async def test_drop_table(ds_write, scenario):
@pytest.mark.asyncio
@pytest.mark.parametrize(
"input,expected_status,expected_response",
"input,expected_status,expected_response,expected_events",
(
# Permission error with a bad token
(
{"table": "bad", "row": {"id": 1}},
403,
{"ok": False, "errors": ["Permission denied"]},
[],
),
# Successful creation with columns:
(
@ -875,6 +919,7 @@ async def test_drop_table(ds_write, scenario):
")"
),
},
["create-table"],
),
# Successful creation with rows:
(
@ -910,6 +955,7 @@ async def test_drop_table(ds_write, scenario):
),
"row_count": 2,
},
["create-table", "insert-rows"],
),
# Successful creation with row:
(
@ -938,6 +984,7 @@ async def test_drop_table(ds_write, scenario):
),
"row_count": 1,
},
["create-table", "insert-rows"],
),
# Create with row and no primary key
(
@ -957,6 +1004,7 @@ async def test_drop_table(ds_write, scenario):
"schema": ("CREATE TABLE [four] (\n" " [name] TEXT\n" ")"),
"row_count": 1,
},
["create-table", "insert-rows"],
),
# Create table with compound primary key
(
@ -978,6 +1026,7 @@ async def test_drop_table(ds_write, scenario):
),
"row_count": 1,
},
["create-table", "insert-rows"],
),
# Error: Table is required
(
@ -989,6 +1038,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["Table is required"],
},
[],
),
# Error: Invalid table name
(
@ -1001,6 +1051,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["Invalid table name"],
},
[],
),
# Error: JSON must be an object
(
@ -1010,6 +1061,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["JSON must be an object"],
},
[],
),
# Error: Cannot specify columns with rows or row
(
@ -1023,6 +1075,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["Cannot specify columns with rows or row"],
},
[],
),
# Error: columns, rows or row is required
(
@ -1034,6 +1087,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["columns, rows or row is required"],
},
[],
),
# Error: columns must be a list
(
@ -1046,6 +1100,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["columns must be a list"],
},
[],
),
# Error: columns must be a list of objects
(
@ -1058,6 +1113,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["columns must be a list of objects"],
},
[],
),
# Error: Column name is required
(
@ -1070,6 +1126,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["Column name is required"],
},
[],
),
# Error: Unsupported column type
(
@ -1082,6 +1139,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["Unsupported column type: bad"],
},
[],
),
# Error: Duplicate column name
(
@ -1097,6 +1155,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["Duplicate column name: id"],
},
[],
),
# Error: rows must be a list
(
@ -1109,6 +1168,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["rows must be a list"],
},
[],
),
# Error: rows must be a list of objects
(
@ -1121,6 +1181,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["rows must be a list of objects"],
},
[],
),
# Error: pk must be a string
(
@ -1134,6 +1195,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["pk must be a string"],
},
[],
),
# Error: Cannot specify both pk and pks
(
@ -1148,6 +1210,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["Cannot specify both pk and pks"],
},
[],
),
# Error: pks must be a list
(
@ -1161,12 +1224,14 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["pks must be a list"],
},
[],
),
# Error: pks must be a list of strings
(
{"table": "bad", "row": {"id": 1, "name": "Row 1"}, "pks": [1, 2]},
400,
{"ok": False, "errors": ["pks must be a list of strings"]},
[],
),
# Error: ignore and replace are mutually exclusive
(
@ -1182,6 +1247,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["ignore and replace are mutually exclusive"],
},
[],
),
# ignore and replace require row or rows
(
@ -1195,6 +1261,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["ignore and replace require row or rows"],
},
[],
),
# ignore and replace require pk or pks
(
@ -1208,6 +1275,7 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["ignore and replace require pk or pks"],
},
[],
),
(
{
@ -1220,10 +1288,14 @@ async def test_drop_table(ds_write, scenario):
"ok": False,
"errors": ["ignore and replace require pk or pks"],
},
[],
),
),
)
async def test_create_table(ds_write, input, expected_status, expected_response):
async def test_create_table(
ds_write, input, expected_status, expected_response, expected_events
):
ds_write._tracked_events = []
# Special case for expected status of 403
if expected_status == 403:
token = "bad_token"
@ -1237,12 +1309,9 @@ async def test_create_table(ds_write, input, expected_status, expected_response)
assert response.status_code == expected_status
data = response.json()
assert data == expected_response
# create-table event
if expected_status == 201:
event = last_event(ds_write)
assert event.name == "create-table"
assert event.actor == {"id": "root", "token": "dstok"}
assert event.schema.startswith("CREATE TABLE ")
# Should have tracked the expected events
events = ds_write._tracked_events
assert [e.name for e in events] == expected_events
@pytest.mark.asyncio
@ -1255,7 +1324,7 @@ async def test_create_table(ds_write, input, expected_status, expected_response)
["create-table"],
{"table": "t", "rows": [{"name": "c"}]},
403,
["Permission denied - need insert-row"],
["Permission denied: need insert-row"],
),
# This should work:
(
@ -1269,7 +1338,7 @@ async def test_create_table(ds_write, input, expected_status, expected_response)
["create-table", "insert-row"],
{"table": "t", "rows": [{"id": 1}], "pk": "id", "replace": True},
403,
["Permission denied - need update-row"],
["Permission denied: need update-row"],
),
),
)
@ -1341,6 +1410,8 @@ async def test_create_table_ignore_replace(ds_write, input, expected_rows_after)
)
assert first_response.status_code == 201
ds_write._tracked_events = []
# Try a second time
second_response = await ds_write.client.post(
"/data/-/create",
@ -1352,6 +1423,10 @@ async def test_create_table_ignore_replace(ds_write, input, expected_rows_after)
rows = await ds_write.client.get("/data/test_insert_replace.json?_shape=array")
assert rows.json() == expected_rows_after
# Check it fired the right events
event_names = [e.name for e in ds_write._tracked_events]
assert event_names == ["insert-rows"]
@pytest.mark.asyncio
async def test_create_table_error_if_pk_changed(ds_write):
@ -1436,6 +1511,7 @@ async def test_method_not_allowed(ds_write, path):
@pytest.mark.asyncio
async def test_create_uses_alter_by_default_for_new_table(ds_write):
ds_write._tracked_events = []
token = write_token(ds_write)
response = await ds_write.client.post(
"/data/-/create",
@ -1455,8 +1531,8 @@ async def test_create_uses_alter_by_default_for_new_table(ds_write):
headers=_headers(token),
)
assert response.status_code == 201
event = last_event(ds_write)
assert event.name == "create-table"
event_names = [e.name for e in ds_write._tracked_events]
assert event_names == ["create-table", "insert-rows"]
@pytest.mark.asyncio
@ -1482,6 +1558,8 @@ async def test_create_using_alter_against_existing_table(
headers=_headers(token),
)
assert response.status_code == 201
ds_write._tracked_events = []
# Now try to insert more rows using /-/create with alter=True
response2 = await ds_write.client.post(
"/data/-/create",
@ -1497,12 +1575,20 @@ async def test_create_using_alter_against_existing_table(
assert response2.status_code == 403
assert response2.json() == {
"ok": False,
"errors": ["Permission denied - need alter-table"],
"errors": ["Permission denied: need alter-table"],
}
else:
assert response2.status_code == 201
event_names = [e.name for e in ds_write._tracked_events]
assert event_names == ["alter-table", "insert-rows"]
# It should have altered the table
event = last_event(ds_write)
assert event.name == "alter-table"
assert "extra" not in event.before_schema
assert "extra" in event.after_schema
alter_event = ds_write._tracked_events[0]
assert alter_event.name == "alter-table"
assert "extra" not in alter_event.before_schema
assert "extra" in alter_event.after_schema
insert_rows_event = ds_write._tracked_events[1]
assert insert_rows_event.name == "insert-rows"
assert insert_rows_event.num_rows == 1

View file

@ -110,7 +110,7 @@ async def test_logout_button_in_navigation(ds_client, path):
anon_response = await ds_client.get(path)
for fragment in (
"<strong>test</strong>",
'<form action="/-/logout" method="post">',
'<form class="nav-menu-logout" action="/-/logout" method="post">',
):
assert fragment in response.text
assert fragment not in anon_response.text
@ -121,7 +121,10 @@ async def test_logout_button_in_navigation(ds_client, path):
async def test_no_logout_button_in_navigation_if_no_ds_actor_cookie(ds_client, path):
response = await ds_client.get(path + "?_bot=1")
assert "<strong>bot</strong>" in response.text
assert '<form action="/-/logout" method="post">' not in response.text
assert (
'<form class="nav-menu-logout" action="/-/logout" method="post">'
not in response.text
)
@pytest.mark.parametrize(

View file

@ -7,6 +7,11 @@ import pytest
"args,expected_where,expected_params",
[
((("name_english__contains", "foo"),), ['"name_english" like :p0'], ["%foo%"]),
(
(("name_english__notcontains", "foo"),),
['"name_english" not like :p0'],
["%foo%"],
),
(
(("foo", "bar"), ("bar__contains", "baz")),
['"bar" like :p0', '"foo" = :p1'],

View file

@ -78,6 +78,10 @@ async def test_static(ds_client):
response = await ds_client.get("/-/static/app.css")
assert response.status_code == 200
assert "text/css" == response.headers["content-type"]
assert "etag" in response.headers
etag = response.headers.get("etag")
response = await ds_client.get("/-/static/app.css", headers={"if-none-match": etag})
assert response.status_code == 304
def test_static_mounts():

View file

@ -66,6 +66,33 @@ async def test_execute_fn(db):
assert 2 == await db.execute_fn(get_1_plus_1)
@pytest.mark.asyncio
async def test_execute_fn_transaction_false():
datasette = Datasette(memory=True)
db = datasette.add_memory_database("test_execute_fn_transaction_false")
def run(conn):
try:
with conn:
conn.execute("create table foo (id integer primary key)")
conn.execute("insert into foo (id) values (44)")
# Table should exist
assert (
conn.execute(
'select count(*) from sqlite_master where name = "foo"'
).fetchone()[0]
== 1
)
assert conn.execute("select id from foo").fetchall()[0][0] == 44
raise ValueError("Cancel commit")
except ValueError:
pass
# Row should NOT exist
assert conn.execute("select count(*) from foo").fetchone()[0] == 0
await db.execute_write_fn(run, transaction=False)
@pytest.mark.parametrize(
"tables,exists",
(
@ -474,9 +501,8 @@ async def test_execute_write_has_correctly_prepared_connection(db):
@pytest.mark.asyncio
async def test_execute_write_fn_block_false(db):
def write_fn(conn):
with conn:
conn.execute("delete from roadside_attractions where pk = 1;")
row = conn.execute("select count(*) from roadside_attractions").fetchone()
conn.execute("delete from roadside_attractions where pk = 1;")
row = conn.execute("select count(*) from roadside_attractions").fetchone()
return row[0]
task_id = await db.execute_write_fn(write_fn, block=False)
@ -486,9 +512,8 @@ async def test_execute_write_fn_block_false(db):
@pytest.mark.asyncio
async def test_execute_write_fn_block_true(db):
def write_fn(conn):
with conn:
conn.execute("delete from roadside_attractions where pk = 1;")
row = conn.execute("select count(*) from roadside_attractions").fetchone()
conn.execute("delete from roadside_attractions where pk = 1;")
row = conn.execute("select count(*) from roadside_attractions").fetchone()
return row[0]
new_count = await db.execute_write_fn(write_fn)

View file

@ -378,6 +378,13 @@ async def test_permissions_debug(ds_client):
cookie = ds_client.actor_cookie({"id": "root"})
response = await ds_client.get("/-/permissions", cookies={"ds_actor": cookie})
assert response.status_code == 200
# Should have a select box listing permissions
for fragment in (
'<select name="permission" id="permission">',
'<option value="view-instance">view-instance (default True)</option>',
'<option value="insert-row">insert-row (default False)</option>',
):
assert fragment in response.text
# Should show one failure and one success
soup = Soup(response.text, "html.parser")
check_divs = soup.findAll("div", {"class": "check"})
@ -673,6 +680,7 @@ async def test_actor_restricted_permissions(
"permission": permission,
"resource": expected_resource,
"result": expected_result,
"default": perms_ds.permissions[permission].default,
}
assert response.json() == expected

View file

@ -923,43 +923,128 @@ async def test_hook_menu_links(ds_client):
@pytest.mark.asyncio
@pytest.mark.parametrize("table_or_view", ["facetable", "simple_view"])
async def test_hook_table_actions(ds_client, table_or_view):
def get_table_actions_links(html):
soup = Soup(html, "html.parser")
details = soup.find("details", {"class": "actions-menu-links"})
if details is None:
return []
return [{"label": a.text, "href": a["href"]} for a in details.select("a")]
response = await ds_client.get(f"/fixtures/{table_or_view}")
assert get_table_actions_links(response.text) == []
response_2 = await ds_client.get(f"/fixtures/{table_or_view}?_bot=1&_hello=BOB")
async def test_hook_table_actions(ds_client):
response = await ds_client.get("/fixtures/facetable")
assert get_actions_links(response.text) == []
response_2 = await ds_client.get("/fixtures/facetable?_bot=1&_hello=BOB")
assert ">Table actions<" in response_2.text
assert sorted(
get_table_actions_links(response_2.text), key=lambda link: link["label"]
get_actions_links(response_2.text), key=lambda link: link["label"]
) == [
{"label": "Database: fixtures", "href": "/"},
{"label": "From async BOB", "href": "/"},
{"label": f"Table: {table_or_view}", "href": "/"},
{"label": "Database: fixtures", "href": "/", "description": None},
{"label": "From async BOB", "href": "/", "description": None},
{"label": "Table: facetable", "href": "/", "description": None},
]
@pytest.mark.asyncio
async def test_hook_view_actions(ds_client):
response = await ds_client.get("/fixtures/simple_view")
assert get_actions_links(response.text) == []
response_2 = await ds_client.get(
"/fixtures/simple_view",
cookies={"ds_actor": ds_client.actor_cookie({"id": "bob"})},
)
assert ">View actions<" in response_2.text
assert sorted(
get_actions_links(response_2.text), key=lambda link: link["label"]
) == [
{"label": "Database: fixtures", "href": "/", "description": None},
{"label": "View: simple_view", "href": "/", "description": None},
]
def get_actions_links(html):
soup = Soup(html, "html.parser")
details = soup.find("details", {"class": "actions-menu-links"})
if details is None:
return []
links = []
for a_el in details.select("a"):
description = None
if a_el.find("p") is not None:
description = a_el.find("p").text.strip()
a_el.find("p").extract()
label = a_el.text.strip()
href = a_el["href"]
links.append({"label": label, "href": href, "description": description})
return links
@pytest.mark.asyncio
@pytest.mark.parametrize(
"path,expected_url",
(
("/fixtures?sql=select+1", "/fixtures?sql=explain+select+1"),
(
"/fixtures/pragma_cache_size",
"/fixtures?sql=explain+PRAGMA+cache_size%3B",
),
# Don't attempt to explain an explain
("/fixtures?sql=explain+select+1", None),
),
)
async def test_hook_query_actions(ds_client, path, expected_url):
response = await ds_client.get(path)
assert response.status_code == 200
links = get_actions_links(response.text)
if expected_url is None:
assert links == []
else:
assert links == [
{
"label": "Explain this query",
"href": expected_url,
"description": "Runs a SQLite explain",
}
]
@pytest.mark.asyncio
async def test_hook_row_actions(ds_client):
response = await ds_client.get("/fixtures/facet_cities/1")
assert get_actions_links(response.text) == []
response_2 = await ds_client.get(
"/fixtures/facet_cities/1",
cookies={"ds_actor": ds_client.actor_cookie({"id": "sam"})},
)
assert get_actions_links(response_2.text) == [
{
"label": "Row details for sam",
"href": "/",
"description": '{"id": 1, "name": "San Francisco"}',
}
]
@pytest.mark.asyncio
async def test_hook_database_actions(ds_client):
def get_table_actions_links(html):
soup = Soup(html, "html.parser")
details = soup.find("details", {"class": "actions-menu-links"})
if details is None:
return []
return [{"label": a.text, "href": a["href"]} for a in details.select("a")]
response = await ds_client.get("/fixtures")
assert get_table_actions_links(response.text) == []
assert get_actions_links(response.text) == []
response_2 = await ds_client.get("/fixtures?_bot=1&_hello=BOB")
assert get_table_actions_links(response_2.text) == [
{"label": "Database: fixtures - BOB", "href": "/"},
assert get_actions_links(response_2.text) == [
{"label": "Database: fixtures - BOB", "href": "/", "description": None},
]
@pytest.mark.asyncio
async def test_hook_homepage_actions(ds_client):
response = await ds_client.get("/")
# No button for anonymous users
assert "<span>Homepage actions</span>" not in response.text
# Signed in user gets an action
response2 = await ds_client.get(
"/", cookies={"ds_actor": ds_client.actor_cookie({"id": "troy"})}
)
assert "<span>Homepage actions</span>" in response2.text
assert get_actions_links(response2.text) == [
{
"label": "Custom homepage for: troy",
"href": "/-/custom-homepage",
"description": None,
},
]

View file

@ -706,3 +706,15 @@ def test_truncate_url(url, length, expected):
def test_pairs_to_nested_config(pairs, expected):
actual = utils.pairs_to_nested_config(pairs)
assert actual == expected
@pytest.mark.asyncio
async def test_calculate_etag(tmp_path):
path = tmp_path / "test.txt"
path.write_text("hello")
etag = '"5d41402abc4b2a76b9719d911017c592"'
assert etag == await utils.calculate_etag(path)
assert utils._etag_cache[path] == etag
utils._etag_cache[path] = "hash"
assert "hash" == await utils.calculate_etag(path)
utils._etag_cache.clear()