An open source multi-tool for exploring and publishing data

Datasette is a tool for exploring and publishing data. It helps people take data of any shape or size and publish it as an interactive, explorable website with an accompanying API.

Datasette is aimed at data journalists, museum curators, archivists, local governments, scientists, researchers and anyone else who has data that they wish to share with the world.

Explore a demo, watch a video about the project or try it out by uploading and publishing your own CSV data.

Want to stay up-to-date with the project? Subscribe to the Datasette newsletter for tips, tricks and news on what's new in the Datasette ecosystem.


Installation

If you are on a Mac, Homebrew is the easiest way to install Datasette:

brew install datasette

You can also install it using pip or pipx:

pip install datasette

Datasette requires Python 3.7 or higher. We also have detailed installation instructions covering other options such as Docker.
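Since Datasette requires Python 3.7 or higher, a quick pre-flight check can save a failed install. A minimal sketch (the function name here is ours for illustration, not part of Datasette):

```python
import sys

def python_is_supported(version_info=sys.version_info):
    """Return True if the interpreter meets Datasette's minimum (Python 3.7)."""
    return tuple(version_info[:2]) >= (3, 7)

if not python_is_supported():
    sys.exit("Datasette requires Python 3.7 or higher")
```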

Basic usage

datasette serve path/to/database.db

This will start a web server on port 8001 - visit http://localhost:8001/ to access the web interface.

serve is the default subcommand, so you can omit it if you like.
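Every page in the web interface has a JSON equivalent: append .json to a table's URL to get its rows back as JSON. A sketch of fetching that from Python, assuming a running server and a hypothetical database and table name:

```python
import json
from urllib.request import urlopen

def table_json_url(base_url, database, table):
    """Build the URL for a Datasette table's JSON representation."""
    return f"{base_url}/{database}/{table}.json"

def fetch_rows(base_url, database, table):
    """Fetch a table's rows from a running Datasette instance."""
    with urlopen(table_json_url(base_url, database, table)) as response:
        return json.loads(response.read())

# Requires a server started with e.g. `datasette serve path/to/database.db`;
# "database" and "mytable" are placeholders for your own names:
# rows = fetch_rows("http://localhost:8001", "database", "mytable")
```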

Use Chrome on OS X? You can run datasette against your browser history like so:

 datasette ~/Library/Application\ Support/Google/Chrome/Default/History --nolock

Now visiting http://localhost:8001/History/downloads will show you a web interface to browse your downloads data:

Downloads table rendered by datasette


If you want to include licensing and source information in the generated Datasette website, you can do so using a JSON file that looks something like this:

{
    "title": "Five Thirty Eight",
    "license": "CC Attribution 4.0 License",
    "license_url": "",
    "source": "fivethirtyeight/data on GitHub",
    "source_url": ""
}

Save this in metadata.json and run Datasette like so:

datasette serve fivethirtyeight.db -m metadata.json

The license and source information will be displayed on the index page and in the footer. They will also be included in the JSON produced by the API.
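Since metadata.json is plain JSON, it can also be generated from a script, for example as part of a publishing pipeline. A minimal sketch using the fields shown above:

```python
import json

# The same fields as the example metadata.json above; the empty URL
# values are placeholders to fill in with your own links.
metadata = {
    "title": "Five Thirty Eight",
    "license": "CC Attribution 4.0 License",
    "license_url": "",
    "source": "fivethirtyeight/data on GitHub",
    "source_url": "",
}

# Write the file that Datasette will read via `-m metadata.json`
with open("metadata.json", "w") as f:
    json.dump(metadata, f, indent=4)
```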

datasette publish

If you have Heroku or Google Cloud Run configured, Datasette can deploy one or more SQLite databases to the internet with a single command:

datasette publish heroku database.db

or

datasette publish cloudrun database.db

This will create a Docker image containing both the Datasette application and the specified SQLite database files, deploy that image to Heroku or Cloud Run, and give you a URL to access the resulting website and API.

See Publishing data in the documentation for more details.

Datasette Lite

Datasette Lite is Datasette packaged using WebAssembly so that it runs entirely in your browser, no Python web application server required. Read more about that in the Datasette Lite documentation.