Compare commits

...

22 commits
1.0a17 ... main

Author SHA1 Message Date
Simon Willison 909448fb7a Run CLI coroutines on explicit event loops
With the help of Codex CLI: https://gist.github.com/simonw/d2de93bfdf85a014a29093720c511093
2025-10-01 12:59:14 -07:00
Simon Willison 5d09ab3ff1 Remove legacy event_loop fixture usage 2025-10-01 12:51:23 -07:00
Simon Willison 571ce651c1 Use venv Python to launch datasette fixtures 2025-10-01 12:49:09 -07:00
Simon Willison d87bd12dbc Remove obsolete mix_stderr=False 2025-09-30 14:33:24 -07:00
Simon Willison 9dc2a3ffe5 Removed broken refs to Glitch, closes #2503 2025-09-28 21:15:58 -07:00
Simon Willison 7a602140df catalog_views table, closes #2495
Refs https://github.com/datasette/datasette-queries/issues/1#issuecomment-3074491003
2025-07-15 10:22:56 -07:00
Simon Willison e2497fdb59 Replace Glitch with Codespaces, closes #2488 2025-05-28 19:17:22 -07:00
Simon Willison 1c77a7e33f Fix global-power-points references
Refs https://github.com/simonw/datasette.io/issues/167
2025-05-28 19:07:46 -07:00
Simon Willison 6f7f4c7d89 Release 1.0a19
Refs #2479
2025-04-21 22:38:53 -07:00
Simon Willison f4274e7a2e CSS fix for table headings on mobile, closes #2479 2025-04-21 22:33:34 -07:00
Simon Willison 271aa09056 Release 1.0a18
Refs #2466, #2468, #2470, #2476, #2477
2025-04-16 22:16:25 -07:00
Jack Stratton d5c6e502fb
fix: tilde encode database name in expanded foreign key links (#2476)
* Tilde encode database for expanded foreign key links
* Test for foreign key fix in #2476

---------

Co-authored-by: Simon Willison <swillison@gmail.com>
2025-04-16 22:15:11 -07:00
Simon Willison f2485dce9c
Hide FTS tables that have content=
* Hide FTS tables that have content=, closes #2477
2025-04-16 21:44:09 -07:00
Simon Willison f6446b3095 Further wording tweaks 2025-04-16 08:25:03 -07:00
Simon Willison d03273e205
Wording tweak 2025-04-16 08:19:22 -07:00
Simon Willison d021ce97aa
Note that only first actor_from_request value is respected
https://github.com/datasette/datasette-profiles/issues/4#issuecomment-2758588167
2025-03-27 09:09:57 -07:00
Simon Willison 7945f4fbf2 Improved docs for db.get_all_foreign_keys() 2025-03-12 15:42:11 -07:00
dependabot[bot] da209ed2ba
Drop 3.8 testing, add 3.13 testing, upgrade Black
Also bump some GitHub Actions versions.
2025-03-09 20:45:18 -07:00
Simon Willison 333f786cb0 Correct syntax for link headers, closes #2470 2025-03-09 20:05:43 -05:00
Simon Willison 6e512caa59 Upgrade to actions/cache@v4
v2 no longer works.
2025-02-28 22:57:22 -08:00
Simon Willison 209bdee0e8 Don't run prepare_connection() on internal database, closes #2468 2025-02-18 10:23:23 -08:00
Simon Willison e59fd01757 Fix for incorrect REFERENCES in internal DB
Refs #2466
2025-02-12 19:40:43 -08:00
38 changed files with 303 additions and 135 deletions

View file

@ -20,7 +20,7 @@ jobs:
# gcloud commmand breaks on higher Python versions, so stick with 3.9:
with:
python-version: "3.9"
- uses: actions/cache@v3
- uses: actions/cache@v4
name: Configure pip caching
with:
path: ~/.cache/pip

View file

@ -10,8 +10,8 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Check out repo
uses: actions/checkout@v2
- uses: actions/cache@v2
uses: actions/checkout@v4
- uses: actions/cache@v4
name: Configure npm caching
with:
path: ~/.npm

View file

@ -12,15 +12,15 @@ jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python 3.10
uses: actions/setup-python@v3
uses: actions/setup-python@v5
with:
python-version: "3.10"
cache: 'pip'
cache-dependency-path: '**/setup.py'
- name: Cache Playwright browsers
uses: actions/cache@v2
uses: actions/cache@v4
with:
path: ~/.cache/ms-playwright/
key: ${{ runner.os }}-browsers

View file

@ -12,7 +12,7 @@ jobs:
strategy:
matrix:
platform: [ubuntu-latest]
python-version: [ "3.8", "3.9", "3.10", "3.11", "3.12"]
python-version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
sqlite-version: [
#"3", # latest version
"3.46",

View file

@ -10,7 +10,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.8", "3.9", "3.10", "3.11", "3.12", "3.13"]
python-version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
steps:
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}

View file

@ -15,7 +15,7 @@ Datasette is a tool for exploring and publishing data. It helps people take data
Datasette is aimed at data journalists, museum curators, archivists, local governments, scientists, researchers and anyone else who has data that they wish to share with the world.
[Explore a demo](https://global-power-plants.datasettes.com/global-power-plants/global-power-plants), watch [a video about the project](https://simonwillison.net/2021/Feb/7/video/) or try it out by [uploading and publishing your own CSV data](https://docs.datasette.io/en/stable/getting_started.html#try-datasette-without-installing-anything-using-glitch).
[Explore a demo](https://datasette.io/global-power-plants/global-power-plants), watch [a video about the project](https://simonwillison.net/2021/Feb/7/video/) or try it out [on GitHub Codespaces](https://github.com/datasette/datasette-studio).
* [datasette.io](https://datasette.io/) is the official project website
* Latest [Datasette News](https://datasette.io/news)

View file

@ -116,6 +116,8 @@ app_root = Path(__file__).parent.parent
# https://github.com/simonw/datasette/issues/283#issuecomment-781591015
SQLITE_LIMIT_ATTACHED = 10
INTERNAL_DB_NAME = "__INTERNAL__"
Setting = collections.namedtuple("Setting", ("name", "default", "help"))
SETTINGS = (
Setting("default_page_size", 100, "Default page size for the table view"),
@ -328,7 +330,7 @@ class Datasette:
self._internal_database = Database(self, memory_name=secrets.token_hex())
else:
self._internal_database = Database(self, path=internal, mode="rwc")
self._internal_database.name = "__INTERNAL__"
self._internal_database.name = INTERNAL_DB_NAME
self.cache_headers = cache_headers
self.cors = cors
@ -878,7 +880,7 @@ class Datasette:
def _prepare_connection(self, conn, database):
conn.row_factory = sqlite3.Row
conn.text_factory = lambda x: str(x, "utf-8", "replace")
if self.sqlite_extensions:
if self.sqlite_extensions and database != INTERNAL_DB_NAME:
conn.enable_load_extension(True)
for extension in self.sqlite_extensions:
# "extension" is either a string path to the extension
@ -891,7 +893,8 @@ class Datasette:
if self.setting("cache_size_kb"):
conn.execute(f"PRAGMA cache_size=-{self.setting('cache_size_kb')}")
# pylint: disable=no-member
pm.hook.prepare_connection(conn=conn, database=database, datasette=self)
if database != INTERNAL_DB_NAME:
pm.hook.prepare_connection(conn=conn, database=database, datasette=self)
# If self.crossdb and this is _memory, connect the first SQLITE_LIMIT_ATTACHED databases
if self.crossdb and database == "_memory":
count = 0

View file

@ -42,6 +42,18 @@ from .utils.sqlite import sqlite3
from .utils.testing import TestClient
from .version import __version__
def run_sync(coro_func):
"""Run an async callable to completion on a fresh event loop."""
loop = asyncio.new_event_loop()
try:
asyncio.set_event_loop(loop)
return loop.run_until_complete(coro_func())
finally:
asyncio.set_event_loop(None)
loop.close()
# Use Rich for tracebacks if it is installed
try:
from rich.traceback import install
@ -135,8 +147,7 @@ def inspect(files, inspect_file, sqlite_extensions):
operations against immutable database files.
"""
app = Datasette([], immutables=files, sqlite_extensions=sqlite_extensions)
loop = asyncio.get_event_loop()
inspect_data = loop.run_until_complete(inspect_(files, sqlite_extensions))
inspect_data = run_sync(lambda: inspect_(files, sqlite_extensions))
if inspect_file == "-":
sys.stdout.write(json.dumps(inspect_data, indent=2))
else:
@ -612,10 +623,10 @@ def serve(
return ds
# Run the "startup" plugin hooks
asyncio.get_event_loop().run_until_complete(ds.invoke_startup())
run_sync(ds.invoke_startup)
# Run async soundness checks - but only if we're not under pytest
asyncio.get_event_loop().run_until_complete(check_databases(ds))
run_sync(lambda: check_databases(ds))
if token and not get:
raise click.ClickException("--token can only be used with --get")
@ -644,9 +655,7 @@ def serve(
if open_browser:
if url is None:
# Figure out most convenient URL - to table, database or homepage
path = asyncio.get_event_loop().run_until_complete(
initial_path_for_datasette(ds)
)
path = run_sync(lambda: initial_path_for_datasette(ds))
url = f"http://{host}:{port}{path}"
webbrowser.open(url)
uvicorn_kwargs = dict(
@ -748,8 +757,7 @@ def create_token(
ds = Datasette(secret=secret, plugins_dir=plugins_dir)
# Run ds.invoke_startup() in an event loop
loop = asyncio.get_event_loop()
loop.run_until_complete(ds.invoke_startup())
run_sync(ds.invoke_startup)
# Warn about any unknown actions
actions = []
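
The new run_sync() helper replaces the implicit asyncio.get_event_loop() pattern, which newer Python versions deprecate when no loop is already running. A minimal standalone sketch of the two calling styles used in this diff (the startup() coroutine is a placeholder):

    import asyncio

    def run_sync(coro_func):
        # Same pattern as the helper added above: build a fresh loop, run the
        # coroutine to completion, then tear the loop down again.
        loop = asyncio.new_event_loop()
        try:
            asyncio.set_event_loop(loop)
            return loop.run_until_complete(coro_func())
        finally:
            asyncio.set_event_loop(None)
            loop.close()

    async def startup():
        await asyncio.sleep(0)
        return "ready"

    print(run_sync(startup))            # pass an async callable directly, like ds.invoke_startup
    print(run_sync(lambda: startup()))  # or wrap arguments in a lambda, as serve() does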

View file

@ -578,10 +578,22 @@ class Database:
SELECT name FROM fts3_shadow_tables
)
SELECT name FROM final ORDER BY 1
"""
)
]
# Also hide any FTS tables that have a content= argument
hidden_tables += [
x[0]
for x in await self.execute(
"""
SELECT name
FROM sqlite_master
WHERE sql LIKE '%VIRTUAL TABLE%'
AND sql LIKE '%USING FTS%'
AND sql LIKE '%content=%'
"""
)
]
has_spatialite = await self.execute_fn(detect_spatialite)
if has_spatialite:

View file

@ -468,12 +468,6 @@ table.rows-and-columns th {
table.rows-and-columns a:link {
text-decoration: none;
}
.rows-and-columns td:before {
display: block;
color: black;
margin-left: -10%;
font-size: 0.8em;
}
.rows-and-columns td ol,
.rows-and-columns td ul {
list-style: initial;
@ -765,7 +759,7 @@ p.zero-results {
left: -9999px;
}
.rows-and-columns tr {
table.rows-and-columns tr {
border: 1px solid #ccc;
margin-bottom: 1em;
border-radius: 10px;
@ -773,7 +767,7 @@ p.zero-results {
padding: 0.2rem;
}
.rows-and-columns td {
table.rows-and-columns td {
/* Behave like a "row" */
border: none;
border-bottom: 1px solid #eee;
@ -781,7 +775,7 @@ p.zero-results {
padding-left: 10%;
}
.rows-and-columns td:before {
table.rows-and-columns td:before {
display: block;
color: black;
margin-left: -10%;

View file

@ -17,7 +17,15 @@ async def init_internal_db(db):
rootpage INTEGER,
sql TEXT,
PRIMARY KEY (database_name, table_name),
FOREIGN KEY (database_name) REFERENCES databases(database_name)
FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name)
);
CREATE TABLE IF NOT EXISTS catalog_views (
database_name TEXT,
view_name TEXT,
rootpage INTEGER,
sql TEXT,
PRIMARY KEY (database_name, view_name),
FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name)
);
CREATE TABLE IF NOT EXISTS catalog_columns (
database_name TEXT,
@ -30,8 +38,8 @@ async def init_internal_db(db):
is_pk INTEGER, -- renamed from pk
hidden INTEGER,
PRIMARY KEY (database_name, table_name, name),
FOREIGN KEY (database_name) REFERENCES databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES tables(database_name, table_name)
FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES catalog_tables(database_name, table_name)
);
CREATE TABLE IF NOT EXISTS catalog_indexes (
database_name TEXT,
@ -42,8 +50,8 @@ async def init_internal_db(db):
origin TEXT,
partial INTEGER,
PRIMARY KEY (database_name, table_name, name),
FOREIGN KEY (database_name) REFERENCES databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES tables(database_name, table_name)
FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES catalog_tables(database_name, table_name)
);
CREATE TABLE IF NOT EXISTS catalog_foreign_keys (
database_name TEXT,
@ -57,8 +65,8 @@ async def init_internal_db(db):
on_delete TEXT,
match TEXT,
PRIMARY KEY (database_name, table_name, id, seq),
FOREIGN KEY (database_name) REFERENCES databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES tables(database_name, table_name)
FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES catalog_tables(database_name, table_name)
);
"""
).strip()
@ -111,6 +119,9 @@ async def populate_schema_tables(internal_db, db):
conn.execute(
"DELETE FROM catalog_tables WHERE database_name = ?", [database_name]
)
conn.execute(
"DELETE FROM catalog_views WHERE database_name = ?", [database_name]
)
conn.execute(
"DELETE FROM catalog_columns WHERE database_name = ?", [database_name]
)
@ -125,13 +136,21 @@ async def populate_schema_tables(internal_db, db):
await internal_db.execute_write_fn(delete_everything)
tables = (await db.execute("select * from sqlite_master WHERE type = 'table'")).rows
views = (await db.execute("select * from sqlite_master WHERE type = 'view'")).rows
def collect_info(conn):
tables_to_insert = []
views_to_insert = []
columns_to_insert = []
foreign_keys_to_insert = []
indexes_to_insert = []
for view in views:
view_name = view["name"]
views_to_insert.append(
(database_name, view_name, view["rootpage"], view["sql"])
)
for table in tables:
table_name = table["name"]
tables_to_insert.append(
@ -165,6 +184,7 @@ async def populate_schema_tables(internal_db, db):
)
return (
tables_to_insert,
views_to_insert,
columns_to_insert,
foreign_keys_to_insert,
indexes_to_insert,
@ -172,6 +192,7 @@ async def populate_schema_tables(internal_db, db):
(
tables_to_insert,
views_to_insert,
columns_to_insert,
foreign_keys_to_insert,
indexes_to_insert,
@ -184,6 +205,13 @@ async def populate_schema_tables(internal_db, db):
""",
tables_to_insert,
)
await internal_db.execute_write_many(
"""
INSERT INTO catalog_views (database_name, view_name, rootpage, sql)
values (?, ?, ?, ?)
""",
views_to_insert,
)
await internal_db.execute_write_many(
"""
INSERT INTO catalog_columns (

View file

@ -1,2 +1,2 @@
__version__ = "1.0a17"
__version__ = "1.0a19"
__version_info__ = tuple(__version__.split("."))

View file

@ -1,3 +1,2 @@
class Context:
"Base class for all documented contexts"
pass

View file

@ -158,7 +158,7 @@ class BaseView:
template_context["alternate_url_json"] = alternate_url_json
headers.update(
{
"Link": '{}; rel="alternate"; type="application/json+datasette"'.format(
"Link": '<{}>; rel="alternate"; type="application/json+datasette"'.format(
alternate_url_json
)
}

View file

@ -181,7 +181,7 @@ class DatabaseView(View):
view_name="database",
),
headers={
"Link": '{}; rel="alternate"; type="application/json+datasette"'.format(
"Link": '<{}>; rel="alternate"; type="application/json+datasette"'.format(
alternate_url_json
)
},
@ -630,7 +630,7 @@ class QueryView(View):
data = {}
headers.update(
{
"Link": '{}; rel="alternate"; type="application/json+datasette"'.format(
"Link": '<{}>; rel="alternate"; type="application/json+datasette"'.format(
alternate_url_json
)
}

View file

@ -273,7 +273,7 @@ async def display_columns_and_rows(
link_template = LINK_WITH_LABEL if (label != value) else LINK_WITH_VALUE
display_value = markupsafe.Markup(
link_template.format(
database=database_name,
database=tilde_encode(database_name),
base_url=base_url,
table=tilde_encode(other_table),
link_id=tilde_encode(str(value)),
@ -894,7 +894,7 @@ async def table_view_traced(datasette, request):
)
headers.update(
{
"Link": '{}; rel="alternate"; type="application/json+datasette"'.format(
"Link": '<{}>; rel="alternate"; type="application/json+datasette"'.format(
alternate_url_json
)
}
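
The fix in this file tilde-encodes the database name before it is interpolated into expanded foreign key links (#2476). A small sketch of the encoding, assuming datasette.utils.tilde_encode and tilde_decode as in current Datasette:

    from datasette.utils import tilde_encode, tilde_decode

    # A database file named "fixtures.dot" yields a database name containing a dot,
    # which must be encoded before being placed in a URL path segment.
    print(tilde_encode("fixtures.dot"))    # fixtures~2Edot
    print(tilde_decode("fixtures~2Edot"))  # fixtures.dot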

View file

@ -4,6 +4,25 @@
Changelog
=========
.. _v1_0_a19:
1.0a19 (2025-04-21)
-------------------
- Tiny cosmetic bug fix for mobile display of table rows. (:issue:`2479`)
.. _v1_0_a18:
1.0a18 (2025-04-16)
-------------------
- Fix for incorrect foreign key references in the internal database schema. (:issue:`2466`)
- The ``prepare_connection()`` hook no longer runs for the internal database. (:issue:`2468`)
- Fixed bug where ``link:`` HTTP headers used invalid syntax. (:issue:`2470`)
- No longer tested against Python 3.8. Now tests against Python 3.13.
- FTS tables are now hidden by default if they correspond to a content table. (:issue:`2477`)
- Fixed bug with foreign key links to rows in databases with filenames containing a special character. Thanks, `Jack Stratton <https://github.com/phroa>`__. (`#2476 <https://github.com/simonw/datasette/pull/2476>`__)
.. _v1_0_a17:
1.0a17 (2025-02-06)

View file

@ -8,7 +8,7 @@ Play with a live demo
The best way to experience Datasette for the first time is with a demo:
* `global-power-plants.datasettes.com <https://global-power-plants.datasettes.com/global-power-plants/global-power-plants>`__ provides a searchable database of power plants around the world, using data from the `World Resources Institude <https://www.wri.org/publication/global-power-plant-database>`__ rendered using the `datasette-cluster-map <https://github.com/simonw/datasette-cluster-map>`__ plugin.
* `datasette.io/global-power-plants <https://datasette.io/global-power-plants/global-power-plants>`__ provides a searchable database of power plants around the world, using data from the `World Resources Institude <https://www.wri.org/publication/global-power-plant-database>`__ rendered using the `datasette-cluster-map <https://github.com/simonw/datasette-cluster-map>`__ plugin.
* `fivethirtyeight.datasettes.com <https://fivethirtyeight.datasettes.com/fivethirtyeight>`__ shows Datasette running against over 400 datasets imported from the `FiveThirtyEight GitHub repository <https://github.com/fivethirtyeight/data>`__.
.. _getting_started_tutorial:
@ -33,29 +33,18 @@ You can pass a URL to a CSV, SQLite or raw SQL file directly to Datasette Lite t
This `example link <https://lite.datasette.io/?url=https%3A%2F%2Fraw.githubusercontent.com%2FNUKnightLab%2Fsql-mysteries%2Fmaster%2Fsql-murder-mystery.db#/sql-murder-mystery>`__ opens Datasette Lite and loads the SQL Murder Mystery example database from `Northwestern University Knight Lab <https://github.com/NUKnightLab/sql-mysteries>`__.
.. _getting_started_glitch:
.. _getting_started_codespaces:
Try Datasette without installing anything using Glitch
------------------------------------------------------
Try Datasette without installing anything with Codespaces
---------------------------------------------------------
`Glitch <https://glitch.com/>`__ is a free online tool for building web apps directly from your web browser. You can use Glitch to try out Datasette without needing to install any software on your own computer.
`GitHub Codespaces <https://github.com/features/codespaces/>`__ offers a free browser-based development environment that lets you run a development server without installing any local software.
Here's a demo project on Glitch which you can use as the basis for your own experiments:
Here's a demo project on GitHub which you can use as the basis for your own experiments:
`glitch.com/~datasette-csvs <https://glitch.com/~datasette-csvs>`__
`github.com/datasette/datasette-studio <https://github.com/datasette/datasette-studio>`__
Glitch allows you to "remix" any project to create your own copy and start editing it in your browser. You can remix the ``datasette-csvs`` project by clicking this button:
.. image:: https://cdn.glitch.com/2703baf2-b643-4da7-ab91-7ee2a2d00b5b%2Fremix-button.svg
:target: https://glitch.com/edit/#!/remix/datasette-csvs
Find a CSV file and drag it onto the Glitch file explorer panel - ``datasette-csvs`` will automatically convert it to a SQLite database (using `sqlite-utils <https://github.com/simonw/sqlite-utils>`__) and allow you to start exploring it using Datasette.
If your CSV file has a ``latitude`` and ``longitude`` column you can visualize it on a map by uncommenting the ``datasette-cluster-map`` line in the ``requirements.txt`` file using the Glitch file editor.
Need some data? Try this `Public Art Data <https://data.seattle.gov/Community/Public-Art-Data/j7sn-tdzk>`__ for the city of Seattle - hit "Export" and select "CSV" to download it as a CSV file.
For more on how this works, see `Running Datasette on Glitch <https://simonwillison.net/2019/Apr/23/datasette-glitch/>`__.
The README file in that repository has instructions on how to get started.
.. _getting_started_your_computer:

View file

@ -25,7 +25,7 @@ Datasette is a tool for exploring and publishing data. It helps people take data
Datasette is aimed at data journalists, museum curators, archivists, local governments and anyone else who has data that they wish to share with the world. It is part of a :ref:`wider ecosystem of tools and plugins <ecosystem>` dedicated to making working with structured data as productive as possible.
`Explore a demo <https://fivethirtyeight.datasettes.com/fivethirtyeight>`__, watch `a presentation about the project <https://static.simonwillison.net/static/2018/pybay-datasette/>`__ or :ref:`getting_started_glitch`.
`Explore a demo <https://fivethirtyeight.datasettes.com/fivethirtyeight>`__, watch `a presentation about the project <https://static.simonwillison.net/static/2018/pybay-datasette/>`__.
Interested in learning Datasette? Start with `the official tutorials <https://datasette.io/tutorials>`__.

View file

@ -4,9 +4,6 @@
Installation
==============
.. note::
If you just want to try Datasette out you don't need to install anything: see :ref:`getting_started_glitch`
There are two main options for installing Datasette. You can install it directly on to your machine, or you can install it using Docker.
If you want to start making contributions to the Datasette project by installing a copy that lets you directly modify the code, take a look at our guide to :ref:`devenvironment`.

View file

@ -1294,27 +1294,64 @@ The ``Database`` class also provides properties and methods for introspecting th
Returns the SQL definition of the named view.
``await db.get_all_foreign_keys()`` - dictionary
Dictionary representing both incoming and outgoing foreign keys for this table. It has two keys, ``"incoming"`` and ``"outgoing"``, each of which is a list of dictionaries with keys ``"column"``, ``"other_table"`` and ``"other_column"``. For example:
Dictionary representing both incoming and outgoing foreign keys for every table in this database. Each key is a table name that points to a dictionary with two keys, ``"incoming"`` and ``"outgoing"``, each of which is a list of dictionaries with keys ``"column"``, ``"other_table"`` and ``"other_column"``. For example:
.. code-block:: json
{
"documents": {
"incoming": [
{
"other_table": "pages",
"column": "id",
"other_column": "document_id"
}
],
"outgoing": []
},
"pages": {
"incoming": [
{
"other_table": "organization_pages",
"column": "id",
"other_column": "page_id"
}
],
"outgoing": [
{
"other_table": "documents",
"column": "document_id",
"other_column": "id"
}
]
},
"organization": {
"incoming": [
{
"other_table": "organization_pages",
"column": "id",
"other_column": "organization_id"
}
],
"outgoing": []
},
"organization_pages": {
"incoming": [],
"outgoing": [
{
"other_table": "attraction_characteristic",
"column": "characteristic_id",
"other_column": "pk",
},
{
"other_table": "roadside_attractions",
"column": "attraction_id",
"other_column": "pk",
}
{
"other_table": "pages",
"column": "page_id",
"other_column": "id"
},
{
"other_table": "organization",
"column": "organization_id",
"other_column": "id"
}
]
}
}
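
A short usage sketch of the method documented above; the fixtures.db file and database name are assumptions, and the result follows the per-table incoming/outgoing structure described:

    import asyncio
    from datasette.app import Datasette

    async def show_foreign_keys():
        ds = Datasette(["fixtures.db"])  # assumed database file
        db = ds.get_database("fixtures")
        all_fks = await db.get_all_foreign_keys()
        for table, fks in all_fks.items():
            for fk in fks["outgoing"]:
                print(f'{table}.{fk["column"]} -> {fk["other_table"]}.{fk["other_column"]}')

    asyncio.run(show_foreign_keys())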
.. _internals_csrf:
CSRF protection
@ -1341,7 +1378,7 @@ Datasette's internal database
Datasette maintains an "internal" SQLite database used for configuration, caching, and storage. Plugins can store configuration, settings, and other data inside this database. By default, Datasette will use a temporary in-memory SQLite database as the internal database, which is created at startup and destroyed at shutdown. Users of Datasette can optionally pass in a ``--internal`` flag to specify the path to a SQLite database to use as the internal database, which will persist internal data across Datasette instances.
Datasette maintains tables called ``catalog_databases``, ``catalog_tables``, ``catalog_columns``, ``catalog_indexes``, ``catalog_foreign_keys`` with details of the attached databases and their schemas. These tables should not be considered a stable API - they may change between Datasette releases.
Datasette maintains tables called ``catalog_databases``, ``catalog_tables``, ``catalog_views``, ``catalog_columns``, ``catalog_indexes``, ``catalog_foreign_keys`` with details of the attached databases and their schemas. These tables should not be considered a stable API - they may change between Datasette releases.
Metadata is stored in tables ``metadata_instance``, ``metadata_databases``, ``metadata_resources`` and ``metadata_columns``. Plugins can interact with these tables via the :ref:`get_*_metadata() and set_*_metadata() methods <datasette_get_set_metadata>`.
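
The new catalog_views table can be queried like the other catalog_* tables. A sketch, assuming a fixtures.db file on disk and that refresh_schemas() is called first to populate the catalog:

    import asyncio
    from datasette.app import Datasette

    async def list_views():
        ds = Datasette(["fixtures.db"])  # assumed database file
        await ds.invoke_startup()
        await ds.refresh_schemas()  # fills catalog_databases, catalog_tables, catalog_views, ...
        internal_db = ds.get_internal_database()
        result = await internal_db.execute(
            "select database_name, view_name from catalog_views order by 1, 2"
        )
        for row in result.rows:
            print(row["database_name"], row["view_name"])

    asyncio.run(list_views())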
@ -1382,7 +1419,15 @@ The internal database schema is as follows:
rootpage INTEGER,
sql TEXT,
PRIMARY KEY (database_name, table_name),
FOREIGN KEY (database_name) REFERENCES databases(database_name)
FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name)
);
CREATE TABLE catalog_views (
database_name TEXT,
view_name TEXT,
rootpage INTEGER,
sql TEXT,
PRIMARY KEY (database_name, view_name),
FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name)
);
CREATE TABLE catalog_columns (
database_name TEXT,
@ -1395,8 +1440,8 @@ The internal database schema is as follows:
is_pk INTEGER, -- renamed from pk
hidden INTEGER,
PRIMARY KEY (database_name, table_name, name),
FOREIGN KEY (database_name) REFERENCES databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES tables(database_name, table_name)
FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES catalog_tables(database_name, table_name)
);
CREATE TABLE catalog_indexes (
database_name TEXT,
@ -1407,8 +1452,8 @@ The internal database schema is as follows:
origin TEXT,
partial INTEGER,
PRIMARY KEY (database_name, table_name, name),
FOREIGN KEY (database_name) REFERENCES databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES tables(database_name, table_name)
FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES catalog_tables(database_name, table_name)
);
CREATE TABLE catalog_foreign_keys (
database_name TEXT,
@ -1422,8 +1467,8 @@ The internal database schema is as follows:
on_delete TEXT,
match TEXT,
PRIMARY KEY (database_name, table_name, id, seq),
FOREIGN KEY (database_name) REFERENCES databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES tables(database_name, table_name)
FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name),
FOREIGN KEY (database_name, table_name) REFERENCES catalog_tables(database_name, table_name)
);
CREATE TABLE metadata_instance (
key text,

View file

@ -457,7 +457,7 @@ You can find this near the top of the source code of those pages, looking like t
The JSON URL is also made available in a ``Link`` HTTP header for the page::
Link: https://latest.datasette.io/fixtures/sortable.json; rel="alternate"; type="application/json+datasette"
Link: <https://latest.datasette.io/fixtures/sortable.json>; rel="alternate"; type="application/json+datasette"
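
A hypothetical client-side sketch of consuming the corrected header (the URL is the documentation's example; httpx is an assumption, any HTTP client would do):

    import httpx

    response = httpx.get("https://latest.datasette.io/fixtures/sortable")
    link = response.headers.get("link", "")
    # e.g. <https://latest.datasette.io/fixtures/sortable.json>; rel="alternate"; type="application/json+datasette"
    if link.startswith("<") and 'rel="alternate"' in link:
        alternate_json_url = link[1 : link.index(">")]
        print(alternate_json_url)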
.. _json_api_cors:

View file

@ -14,13 +14,11 @@ Top-level index
The root page of any Datasette installation is an index page that lists all of the currently attached databases. Some examples:
* `fivethirtyeight.datasettes.com <https://fivethirtyeight.datasettes.com/>`_
* `global-power-plants.datasettes.com <https://global-power-plants.datasettes.com/>`_
* `register-of-members-interests.datasettes.com <https://register-of-members-interests.datasettes.com/>`_
Add ``/.json`` to the end of the URL for the JSON version of the underlying data:
* `fivethirtyeight.datasettes.com/.json <https://fivethirtyeight.datasettes.com/.json>`_
* `global-power-plants.datasettes.com/.json <https://global-power-plants.datasettes.com/.json>`_
* `register-of-members-interests.datasettes.com/.json <https://register-of-members-interests.datasettes.com/.json>`_
The index page can also be accessed at ``/-/``, useful for if the default index page has been replaced using an :ref:`index.html custom template <customization_custom_templates>`. The ``/-/`` page will always render the default Datasette ``index.html`` template.
@ -35,12 +33,12 @@ Each database has a page listing the tables, views and canned queries available
Examples:
* `fivethirtyeight.datasettes.com/fivethirtyeight <https://fivethirtyeight.datasettes.com/fivethirtyeight>`_
* `global-power-plants.datasettes.com/global-power-plants <https://global-power-plants.datasettes.com/global-power-plants>`_
* `datasette.io/global-power-plants <https://datasette.io/global-power-plants>`_
The JSON version of this page provides programmatic access to the underlying data:
* `fivethirtyeight.datasettes.com/fivethirtyeight.json <https://fivethirtyeight.datasettes.com/fivethirtyeight.json>`_
* `global-power-plants.datasettes.com/global-power-plants.json <https://global-power-plants.datasettes.com/global-power-plants.json>`_
* `datasette.io/global-power-plants.json <https://datasette.io/global-power-plants.json>`_
.. _DatabaseView_hidden:
@ -89,7 +87,7 @@ Some examples:
* `../items <https://register-of-members-interests.datasettes.com/regmem/items>`_ lists all of the line-items registered by UK MPs as potential conflicts of interest. It demonstrates Datasette's support for :ref:`full_text_search`.
* `../antiquities-act%2Factions_under_antiquities_act <https://fivethirtyeight.datasettes.com/fivethirtyeight/antiquities-act%2Factions_under_antiquities_act>`_ is an interface for exploring the "actions under the antiquities act" data table published by FiveThirtyEight.
* `../global-power-plants?country_long=United+Kingdom&primary_fuel=Gas <https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=owner&_facet=country_long&country_long__exact=United+Kingdom&primary_fuel=Gas>`_ is a filtered table page showing every Gas power plant in the United Kingdom. It includes some default facets (configured using `its metadata.json <https://global-power-plants.datasettes.com/-/metadata>`_) and uses the `datasette-cluster-map <https://github.com/simonw/datasette-cluster-map>`_ plugin to show a map of the results.
* `../global-power-plants?country_long=United+Kingdom&primary_fuel=Gas <https://datasette.io/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=owner&_facet=country_long&country_long__exact=United+Kingdom&primary_fuel=Gas>`_ is a filtered table page showing every Gas power plant in the United Kingdom. It includes some default facets (configured using `its metadata.json <https://datasette.io/-/metadata>`_) and uses the `datasette-cluster-map <https://github.com/simonw/datasette-cluster-map>`_ plugin to show a map of the results.
.. _RowView:

View file

@ -57,6 +57,8 @@ arguments and can be called like this::
select random_integer(1, 10);
``prepare_connection()`` hooks are not called for Datasette's :ref:`internal database <internals_internal>`.
Examples: `datasette-jellyfish <https://datasette.io/plugins/datasette-jellyfish>`__, `datasette-jq <https://datasette.io/plugins/datasette-jq>`__, `datasette-haversine <https://datasette.io/plugins/datasette-haversine>`__, `datasette-rure <https://datasette.io/plugins/datasette-rure>`__
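
A minimal plugin sketch of this hook, mirroring the random_integer() example above (the function body is illustrative); per the new behaviour it will not be invoked for the internal database:

    import random
    from datasette import hookimpl

    @hookimpl
    def prepare_connection(conn, database, datasette):
        # conn is the sqlite3 connection for the attached database named `database`
        conn.create_function(
            "random_integer", 2, lambda lo, hi: random.randint(lo, hi)
        )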
.. _plugin_hook_prepare_jinja2_environment:
@ -1024,7 +1026,7 @@ actor_from_request(datasette, request)
This is part of Datasette's :ref:`authentication and permissions system <authentication>`. The function should attempt to authenticate an actor (either a user or an API actor of some sort) based on information in the request.
If it cannot authenticate an actor, it should return ``None``. Otherwise it should return a dictionary representing that actor.
If it cannot authenticate an actor, it should return ``None``, otherwise it should return a dictionary representing that actor. Once a plugin has returned an actor from this hook other plugins will be ignored.
Here's an example that authenticates the actor based on an incoming API key:
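
The documentation's own example is not reproduced in this diff; the sketch below is a hypothetical stand-in for the contract described above (return None to decline, a dictionary to authenticate, and only the first returned actor is respected):

    from datasette import hookimpl

    API_KEY = "dev-only-example-key"  # hypothetical, not the documented example

    @hookimpl
    def actor_from_request(datasette, request):
        authorization = request.headers.get("authorization") or ""
        if authorization == f"Bearer {API_KEY}":
            return {"id": "api-user"}
        return None  # let other actor_from_request implementations try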

View file

@ -374,7 +374,7 @@ One way to generate a secure random secret is to use Python like this::
python3 -c 'import secrets; print(secrets.token_hex(32))'
cdb19e94283a20f9d42cca50c5a4871c0aa07392db308755d60a1a5b9bb0fa52
Plugin authors make use of this signing mechanism in their plugins using :ref:`datasette_sign` and :ref:`datasette_unsign`.
Plugin authors can make use of this signing mechanism in their plugins using the :ref:`datasette.sign() <datasette_sign>` and :ref:`datasette.unsign() <datasette_unsign>` methods.
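
A quick sketch of the two methods now referenced by name; the secret and namespace values are placeholders:

    from datasette.app import Datasette

    ds = Datasette(secret="dev-only-secret")  # placeholder secret
    token = ds.sign({"id": "user_1"}, namespace="my-plugin")
    print(ds.unsign(token, namespace="my-plugin"))  # -> {'id': 'user_1'}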
.. _setting_publish_secrets:

View file

@ -83,8 +83,8 @@ setup(
"pytest-xdist>=2.2.1",
"pytest-asyncio>=0.17",
"beautifulsoup4>=4.8.1",
"black==24.8.0",
"blacken-docs==1.18.0",
"black==25.1.0",
"blacken-docs==1.19.1",
"pytest-timeout>=1.4.2",
"trustme>=0.7",
"cogapp>=3.3.0",

View file

@ -5,6 +5,7 @@ import pytest
import pytest_asyncio
import re
import subprocess
import sys
import tempfile
import time
from dataclasses import dataclass
@ -196,7 +197,7 @@ def install_event_tracking_plugin():
@pytest.fixture(scope="session")
def ds_localhost_http_server():
ds_proc = subprocess.Popen(
["datasette", "--memory", "-p", "8041"],
[sys.executable, "-m", "datasette", "--memory", "-p", "8041"],
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
# Avoid FileNotFoundError: [Errno 2] No such file or directory:
@ -218,7 +219,7 @@ def ds_unix_domain_socket_server(tmp_path_factory):
# using tempfile.gettempdir()
uds = str(pathlib.Path(tempfile.gettempdir()) / "datasette.sock")
ds_proc = subprocess.Popen(
["datasette", "--memory", "--uds", uds],
[sys.executable, "-m", "datasette", "--memory", "--uds", uds],
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
cwd=tempfile.gettempdir(),

View file

@ -389,29 +389,6 @@ async def test_database_page(ds_client):
},
"private": False,
},
{
"name": "searchable_fts",
"columns": [
"text1",
"text2",
"name with . and spaces",
]
+ (
[
"searchable_fts",
"docid",
"__langid",
]
if supports_table_xinfo()
else []
),
"primary_keys": [],
"count": 2,
"hidden": False,
"fts_table": "searchable_fts",
"foreign_keys": {"incoming": [], "outgoing": []},
"private": False,
},
{
"name": "searchable_tags",
"columns": ["searchable_id", "tag"],
@ -538,6 +515,31 @@ async def test_database_page(ds_client):
"foreign_keys": {"incoming": [], "outgoing": []},
"private": False,
},
{
"columns": Either(
[
"text1",
"text2",
"name with . and spaces",
"searchable_fts",
"docid",
"__langid",
],
# Get tests to pass on SQLite 3.25 as well
[
"text1",
"text2",
"name with . and spaces",
],
),
"count": 2,
"foreign_keys": {"incoming": [], "outgoing": []},
"fts_table": "searchable_fts",
"hidden": True,
"name": "searchable_fts",
"primary_keys": [],
"private": False,
},
{
"name": "searchable_fts_docsize",
"columns": ["docid", "size"],
@ -1198,3 +1200,12 @@ async def test_upgrade_metadata(metadata, expected_config, expected_metadata):
assert response.json() == expected_config
response2 = await ds.client.get("/-/metadata.json")
assert response2.json() == expected_metadata
class Either:
def __init__(self, a, b):
self.a = a
self.b = b
def __eq__(self, other):
return other == self.a or other == self.b

View file

@ -307,9 +307,9 @@ async def test_auth_with_dstok_token(ds_client, scenario, should_work):
@pytest.mark.parametrize("expires", (None, 1000, -1000))
def test_cli_create_token(event_loop, app_client, expires):
def test_cli_create_token(app_client, expires):
secret = app_client.ds._secret
runner = CliRunner(mix_stderr=False)
runner = CliRunner()
args = ["create-token", "--secret", secret, "test"]
if expires:
args += ["--expires-after", str(expires)]

View file

@ -5,7 +5,7 @@ from pathlib import Path
code_root = Path(__file__).parent.parent
def test_black(event_loop):
def test_black():
runner = CliRunner()
result = runner.invoke(black.main, [str(code_root), "--check"])
assert result.exit_code == 0, result.output

View file

@ -433,7 +433,7 @@ def test_canned_write_custom_template(canned_write_client):
)
assert (
response.headers["link"]
== 'http://localhost/data/update_name.json; rel="alternate"; type="application/json+datasette"'
== '<http://localhost/data/update_name.json>; rel="alternate"; type="application/json+datasette"'
)

View file

@ -36,7 +36,7 @@ def test_inspect_cli(app_client):
assert expected_count == database["tables"][table_name]["count"]
def test_inspect_cli_writes_to_file(event_loop, app_client):
def test_inspect_cli_writes_to_file(app_client):
runner = CliRunner()
result = runner.invoke(
cli, ["inspect", "fixtures.db", "--inspect-file", "foo.json"]
@ -218,7 +218,7 @@ def test_version():
@pytest.mark.parametrize("invalid_port", ["-1", "0.5", "dog", "65536"])
def test_serve_invalid_ports(invalid_port):
runner = CliRunner(mix_stderr=False)
runner = CliRunner()
result = runner.invoke(cli, ["--port", invalid_port])
assert result.exit_code == 2
assert "Invalid value for '-p'" in result.stderr
@ -304,7 +304,7 @@ def test_plugin_s_overwrite():
def test_setting_type_validation():
runner = CliRunner(mix_stderr=False)
runner = CliRunner()
result = runner.invoke(cli, ["--setting", "default_page_size", "dog"])
assert result.exit_code == 2
assert '"settings.default_page_size" should be an integer' in result.stderr
@ -333,7 +333,7 @@ def test_setting_default_allow_sql(default_allow_sql):
def test_sql_errors_logged_to_stderr():
runner = CliRunner(mix_stderr=False)
runner = CliRunner()
result = runner.invoke(cli, ["--get", "/_memory/-/query.json?sql=select+blah"])
assert result.exit_code == 1
assert "sql = 'select blah', params = {}: no such column: blah\n" in result.stderr

View file

@ -45,7 +45,7 @@ def test_crossdb_warning_if_too_many_databases(tmp_path_factory):
conn = sqlite3.connect(path)
conn.execute("vacuum")
dbs.append(path)
runner = CliRunner(mix_stderr=False)
runner = CliRunner()
result = runner.invoke(
cli,
[

View file

@ -41,14 +41,13 @@ def test_homepage(app_client_two_attached_databases):
assert "extra database" == h2.text.strip()
counts_p, links_p = h2.find_all_next("p")[:2]
assert (
"4 rows in 2 tables, 3 rows in 3 hidden tables, 1 view" == counts_p.text.strip()
"2 rows in 1 table, 5 rows in 4 hidden tables, 1 view" == counts_p.text.strip()
)
# We should only show visible, not hidden tables here:
table_links = [
{"href": a["href"], "text": a.text.strip()} for a in links_p.find_all("a")
]
assert [
{"href": r"/extra+database/searchable_fts", "text": "searchable_fts"},
{"href": r"/extra+database/searchable", "text": "searchable"},
{"href": r"/extra+database/searchable_view", "text": "searchable_view"},
] == table_links
@ -1040,7 +1039,7 @@ async def test_alternate_url_json(ds_client, path, expected):
response = await ds_client.get(path)
assert response.status_code == 200
link = response.headers["link"]
assert link == '{}; rel="alternate"; type="application/json+datasette"'.format(
assert link == '<{}>; rel="alternate"; type="application/json+datasette"'.format(
expected
)
assert (

View file

@ -1,4 +1,5 @@
import pytest
import sqlite_utils
# ensure refresh_schemas() gets called before interacting with internal_db
@ -24,6 +25,15 @@ async def test_internal_tables(ds_client):
assert set(table.keys()) == {"rootpage", "table_name", "database_name", "sql"}
@pytest.mark.asyncio
async def test_internal_views(ds_client):
internal_db = await ensure_internal(ds_client)
views = await internal_db.execute("select * from catalog_views")
assert len(views) >= 4
view = views.rows[0]
assert set(view.keys()) == {"rootpage", "view_name", "database_name", "sql"}
@pytest.mark.asyncio
async def test_internal_indexes(ds_client):
internal_db = await ensure_internal(ds_client)
@ -59,3 +69,25 @@ async def test_internal_foreign_keys(ds_client):
"table_name",
"from",
}
@pytest.mark.asyncio
async def test_internal_foreign_key_references(ds_client):
internal_db = await ensure_internal(ds_client)
def inner(conn):
db = sqlite_utils.Database(conn)
table_names = db.table_names()
for table in db.tables:
for fk in table.foreign_keys:
other_table = fk.other_table
other_column = fk.other_column
message = 'Column "{}.{}" references other column "{}.{}" which does not exist'.format(
table.name, fk.column, other_table, other_column
)
assert other_table in table_names, message + " (bad table)"
assert other_column in db[other_table].columns_dict, (
message + " (bad column)"
)
await internal_db.execute_fn(inner)

View file

@ -722,6 +722,25 @@ async def test_hidden_tables(app_client):
"r_rowid",
]
# A fts virtual table with a content table should be hidden too
await db.execute("create virtual table f2_fts using fts5(a, content='f')")
assert await db.hidden_table_names() == [
"_hideme",
"f2_fts_config",
"f2_fts_data",
"f2_fts_docsize",
"f2_fts_idx",
"f_config",
"f_content",
"f_data",
"f_docsize",
"f_idx",
"r_node",
"r_parent",
"r_rowid",
"f2_fts",
]
@pytest.mark.asyncio
async def test_replace_database(tmpdir):

View file

@ -59,6 +59,11 @@ async def test_hook_plugin_prepare_connection_arguments(ds_client):
"database=fixtures, datasette.plugin_config(\"name-of-plugin\")={'depth': 'root'}"
] == response.json()
# Function should not be available on the internal database
db = ds_client.ds.get_internal_database()
with pytest.raises(sqlite3.OperationalError):
await db.execute("select prepare_connection_args()")
@pytest.mark.asyncio
@pytest.mark.parametrize(

View file

@ -3,6 +3,7 @@ from bs4 import BeautifulSoup as Soup
from .fixtures import ( # noqa
app_client,
make_app_client,
app_client_with_dot,
)
import pathlib
import pytest
@ -1291,3 +1292,9 @@ async def test_foreign_key_labels_obey_permissions(config):
"rows": [{"id": 1, "name": "world", "a_id": 1}],
"truncated": False,
}
def test_foreign_keys_special_character_in_database_name(app_client_with_dot):
# https://github.com/simonw/datasette/pull/2476
response = app_client_with_dot.get("/fixtures~2Edot/complex_foreign_keys")
assert '<a href="/fixtures~2Edot/simple_primary_key/2">world</a>' in response.text