Compare commits


117 commits
0.9.0...main

Author SHA1 Message Date
Andrew Godwin 7c34ac78ed Write a release checklist and do a couple things on it 2024-02-06 14:49:35 -07:00
Henri Dickson 72eb6a6271 add application/activity+json to accept header to improve compatibility (#694) 2024-02-05 21:40:04 -05:00
Jamie Bliss b2223ddf42 Back out push notification changes 2024-02-05 21:18:59 -05:00
Jamie Bliss 045a499ddf Fix docs 2024-02-05 20:59:22 -05:00
Jamie Bliss 0fa48578f2 Write release notes for 0.11.0 2024-02-05 20:53:09 -05:00
Henri Dickson f86f3a49e4 Fix when report ap message does not have content field (#689) 2024-01-08 19:48:21 -07:00
Henri Dickson 2f4daa02bd Add missing validator to initial migration (#687) 2024-01-04 08:59:26 -07:00
Henri Dickson 798222dcdb Post update/delete also fanout to those who liked/boosted it but not following the author (#684) 2023-12-31 11:06:30 -07:00
Henri Dickson 74b3ac551a Fix accept/reject follow request (#683) 2023-12-27 11:48:09 -07:00
Henri Dickson 4a09379e09 Fix federating with GoToSocial (#682) 2023-12-26 10:26:03 -07:00
Henri Dickson 448092d6d9 Improve identity deletion (#678) 2023-12-16 23:49:59 +00:00
Henri Dickson 5d508a17ec Basic protection against invalid domain names (#680) 2023-12-13 09:04:41 +00:00
Jamie Bliss d07482f5a8 Allow statusless posts (#677) 2023-12-07 16:32:18 -07:00
Henri Dickson 123c20efb1 When remote follows local, hold off sending Accept if remote identity is not fully fetched (#676) 2023-12-06 11:08:41 -07:00
Karthik Balakrishnan 83607779cd Fix README: 0.10.1 is latest release (#675) 2023-12-01 09:11:18 -07:00
Andrew Godwin 837320f461 Invert pruning exit codes 2023-12-01 00:03:09 -07:00
Rob 5f28d702f8 Make max_media_attachments configurable by admin (#669) 2023-11-28 09:52:04 -07:00
Henri Dickson ac7fef4b28 Do not fetch webfinger when querying identity on local domain (#668) 2023-11-26 21:00:58 -07:00
Henri Dickson 6855e74c6f Do not retry unmute if mute never expires 2023-11-26 12:46:31 -07:00
Henri Dickson a58d7ccd8f Do not make local identities outdated (#667) 2023-11-26 11:19:18 -07:00
Rob 1a728ea023 Add s3-insecure to pydantic checker (#665) 2023-11-26 11:13:55 -07:00
Humberto Rocha b031880e41 Extract json parser to core and use in fetch_actor (#663) 2023-11-20 11:46:51 -07:00
Humberto Rocha 81d019ad0d Improve search api json parsing (#662) 2023-11-19 11:32:35 -07:00
Henri Dickson 5267e4108c Allow unicode characters in hashtag (#659) 2023-11-19 09:58:20 -07:00
Henri Dickson b122e2beda Fix fetching post from another takahe by searching its url (#661) 2023-11-18 21:03:51 -07:00
Rob ae1bfc49a7 Add s3-insecure for S3 backend (#658) 2023-11-17 21:49:06 -07:00
Osma Ahvenlampi 1ceef59bec Module-specific loggers and minor reformatting (#657) 2023-11-16 10:27:20 -07:00
Humberto Rocha 2f546dfa74 Do not canonicalise non json content in the search endpoint (#654) 2023-11-15 15:00:56 -07:00
Andrew Godwin cc9e397f60 Ensure post pruning has a random selection element 2023-11-14 00:04:18 -07:00
Andrew Godwin dc397903b2 Fix release date formatting 2023-11-13 12:18:30 -07:00
Andrew Godwin debf4670e8 Releasing 0.10.1 2023-11-13 12:16:40 -07:00
Andrew Godwin e49bfc4775 Add Stator tuning notes 2023-11-13 10:52:22 -07:00
Andrew Godwin 308dd033e1 Significantly drop the default settings for stator 2023-11-13 10:39:21 -07:00
Andrew Godwin 460d1d7e1c Don't prune replies to local, add docs 2023-11-12 18:32:38 -07:00
Andrew Godwin eb0b0d775c Don't delete mentioned people 2023-11-12 18:06:29 -07:00
Andrew Godwin 74f69a3813 Add identity pruning, improve post pruning 2023-11-12 18:01:01 -07:00
Andrew Godwin 9fc497f826 Mention that the final number includes dependencies 2023-11-12 17:12:05 -07:00
Andrew Godwin ab3648e05d Add console logging back to Stator 2023-11-12 16:49:01 -07:00
Andrew Godwin 476f817464 Only consider local replies 2023-11-12 16:31:20 -07:00
Andrew Godwin 99e7fb8639 Fix prune issues when multiple replies 2023-11-12 16:30:49 -07:00
Andrew Godwin 87344b47b5 Add manual post pruning command 2023-11-12 16:23:43 -07:00
Andrew Godwin aa39ef0571 Move remote pruning note over to 0.11 2023-11-12 14:43:45 -07:00
Andrew Godwin 110a5e64dc "a" to "our" is important meaning 2023-11-12 14:42:59 -07:00
Andrew Godwin bae76c3063 Add 0.10 to release index with date 2023-11-12 14:39:24 -07:00
Andrew Godwin 9bb40ca7f6 Releasing 0.10 2023-11-12 14:12:06 -07:00
Andrew Godwin af7f1173fc Disable remote post pruning via Stator for now 2023-11-12 12:37:04 -07:00
Andrew Godwin 30e9b1f62d Ignore more Lemmy things 2023-11-12 12:35:11 -07:00
Andrew Godwin 95089c0c61 Ignore some messages at inbox view time 2023-11-12 12:09:09 -07:00
Andrew Godwin d815aa53e1 Ignore lemmy-flavour likes and dislikes 2023-11-12 11:21:23 -07:00
Andrew Godwin e6e64f1000 Don't use other server URIs in our IDs (Fixes #323) 2023-11-12 10:21:07 -07:00
Andrew Godwin c3bf7563b4 Fix memcached testing error on GH Actions 2023-11-09 12:47:08 -07:00
Andrew Godwin e577d020ee Bump to Python 3.11, as 3.10 is in security-only now 2023-11-09 12:19:56 -07:00
Andrew Godwin 57cefa967c Prepping 0.10 release notes 2023-11-09 12:10:31 -07:00
Andrew Godwin 6fdfdca442 Update all the pre-commit hooks 2023-11-09 12:07:21 -07:00
Andrew Godwin e17f17385a Add setting to keep migration off by default for now 2023-11-09 11:58:40 -07:00
Jamie Bliss 5cc74900b1 Update client app compatibility, add links (#649) 2023-11-08 13:33:08 -07:00
  * Tuba advertises compatibility
  * Phanpy seems to work for me
Osma Ahvenlampi 24577761ed focalpoints are floats between -1..1, not int (#648) 2023-11-04 11:24:09 -06:00
Osma Ahvenlampi 039adae797 Refactoring inbox processing to smaller tasks (#647) 2023-10-26 10:01:03 -06:00
Osma Ahvenlampi 9368996a5b use logging instead of sentry.capture_* (#646) 2023-10-23 10:33:55 -06:00
Andrew Godwin 84ded2f3a5 Turn off remote prune for now 2023-10-19 08:42:01 -06:00
Andrew Godwin 07d187309e Pruning docs and ability to turn off 2023-10-01 10:49:10 -06:00
Andrew Godwin 8cc1691857 Delete remote posts after a set horizon time 2023-10-01 10:43:22 -06:00
Osma Ahvenlampi b60e807b91 Separate out timeouts from other remote server issues (#645) 2023-10-01 09:27:23 -06:00
Osma Ahvenlampi 1e8a392e57 Deal with unknown json-ld schemas (#644) 2023-09-20 14:58:38 -04:00
  Rather than raising an error, it returns an empty schema.
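  A minimal sketch of that behaviour, assuming pyld's pluggable document loader; the wrapper name and the empty-context shape are illustrative, not the literal patch:

    # Sketch: when a JSON-LD context URL is unknown or unreachable, return
    # an empty schema instead of raising and failing inbox processing.
    from pyld import jsonld

    _base_loader = jsonld.requests_document_loader()

    def tolerant_document_loader(url, options=None):
        try:
            return _base_loader(url, options or {})
        except Exception:
            # Unknown schema: behave as if it defined nothing
            return {
                "contextUrl": None,
                "documentUrl": url,
                "document": {"@context": {}},
            }

    jsonld.set_document_loader(tolerant_document_loader)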
Humberto Rocha 8c832383e0 Update ld schema to support instances that implement multikey and wytchspace (#643) 2023-09-16 19:09:13 -06:00
Andrew Godwin 6c83d7b67b Fix #642: Race condition searching for unseen users 2023-09-15 10:21:33 -06:00
Andrew Godwin dd532e4425 Fix tests for profile redirect and add to notes 2023-09-07 22:06:50 -06:00
Andrew Godwin 1e76430f74 Don't show identity pages for remote identities 2023-09-07 21:54:42 -06:00
Andrew Godwin ddf24d376e Fix initial identity choices 2023-09-04 11:21:04 -06:00
Osma Ahvenlampi 2a0bbf0d5d One more try to get the fetch_account/sync_pins/post relationship and parallelism fixed (#634) 2023-08-26 15:16:14 -06:00
Henri Dickson 555046ac4d Ignore unknown tag type in incoming post, rather than raise exception (#639) 2023-08-25 16:35:57 -06:00
Henri Dickson b003af64cc Do not print "Scheduling 0 handled" unless settings.DEBUG is on (#636) 2023-08-23 22:12:21 +10:00
Osma Ahvenlampi 671807beb8 Misc lemmy compat (#635) 2023-08-21 11:55:48 +09:30
Osma Ahvenlampi 2a50928f27 Signatures need to use UTF-8 in order to represent all URLs (#633) 2023-08-21 11:54:47 +09:30
Henri Dickson 70b9e3b900 Support follow requests (#625) 2023-08-18 15:49:45 +09:30
TAKAHASHI Shuuji faa181807c Fix Accept object id for follow activity for Misskey and Firefish (#632) 2023-08-18 15:42:53 +09:30
TAKAHASHI Shuuji 679f0def99 Add stub API endpoint for user suggestion (api/v2/suggestions) (#631) 2023-08-17 17:41:06 +09:30
Henri Dickson 1262c619bb Make nodeinfo do metadata based on domain requested (#628) 2023-08-11 09:34:25 -06:00
Andrew Godwin 0c72327ab7 Fix state graph 2023-08-08 09:04:21 -06:00
Andrew Godwin 84703bbc45 Lay groundwork for moved identity state 2023-08-08 08:55:16 -06:00
TAKAHASHI Shuuji 93dfc85cf7 Fix small syntax errors (#627) 2023-08-07 09:18:18 -06:00
TAKAHASHI Shuuji 67d755e6d3 Support to export blocks/mutes as CSV files (#626) 2023-08-07 09:16:52 -06:00
Henri Dickson 4a9109271d Fix like/boost remote post (#629) 2023-08-07 09:15:13 -06:00
Humberto Rocha a69499c742 Add 'domain' to the blocklist supported headers (#623) 2023-08-03 10:41:47 -06:00
Humberto Rocha c4a2b62016 Allow updated to updated transition on Domain model (#621) 2023-07-30 11:22:35 -07:00
Henri Dickson 1b7bb8c501 Add Idempotency-Key to allowed CORS header (#618) 2023-07-24 18:54:58 -06:00
  It's used by other web clients, so it should improve compatibility.
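  For reference, a minimal sketch of allowing an extra request header, assuming django-cors-headers (Takahē's actual CORS handling may differ; the setting shown is that library's documented knob):

    # settings.py sketch: let browser clients send Idempotency-Key on
    # cross-origin API requests, on top of the library's default headers.
    from corsheaders.defaults import default_headers

    CORS_ALLOW_HEADERS = (
        *default_headers,
        "idempotency-key",
    )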
Humberto Rocha f3bab95827 Add support to import blocklists (#617) 2023-07-24 17:59:50 -06:00
Andrew Godwin 4a8bdec90c Implement inbound account migration 2023-07-22 11:46:35 -06:00
Andrew Godwin cc6355f60b Refs #613: Also block subdomains 2023-07-22 10:54:36 -06:00
Andrew Godwin 83b57a0998 Never put blocked domains into outdated either 2023-07-22 10:44:01 -06:00
Andrew Godwin aac75dd4c3 Fixed #613: Don't pull nodeinfo from blocked servers! 2023-07-22 10:41:58 -06:00
Andrew Godwin 759d5ac052 Fixed #616: Do followers-only properly 2023-07-22 10:38:22 -06:00
Andrew Godwin 1dd076ff7d Fixed #615: Nicely reject malformatted http signatures 2023-07-20 09:55:36 -06:00
Humberto Rocha d6cdcb1d83 Wait setup to complete before starting web and stator containers (#611) 2023-07-17 09:31:21 -06:00
Andrew Godwin 188e5a2446 Remove all remaining async code for now 2023-07-17 00:37:47 -06:00
Andrew Godwin 0915b17c4b Prune some unnecessary async usage 2023-07-17 00:18:00 -06:00
Andrew Godwin 31c743319e Require hatchway 0.5.2 2023-07-15 12:43:45 -06:00
Andrew Godwin 11e3ca12d4 Start on push notification work 2023-07-15 12:37:47 -06:00
Deborah Pickett 824f5b289c Permit SMTP to mail relay without authentication (#600) 2023-07-14 13:57:58 -06:00
Osma Ahvenlampi 2d140f2e97 remove duplicate attachment url check (#608) 2023-07-14 13:52:04 -06:00
Osma Ahvenlampi b2a9b334be Resubmit: Be quieter about remote hosts with invalid SSL certs (#595) 2023-07-12 09:51:08 -06:00
Osma Ahvenlampi 5549d21528 Fix inbox processing errors from pinned posts and non-Mastodon servers (#596) 2023-07-12 09:49:30 -06:00
  If a post (interaction) comes in from the AP inbox but no local author profile exists,
  fetch_actor will pull in both the identity AND its pinned posts, one of which the incoming
  post might have been. This would cause a database integrity violation. We check
  for the post existing again after syncing the actor.

  Post processing also barfed on posts whose content didn't follow Mastodon specs.
  For example, Kbin sets tag names in the 'tag' attribute instead of the 'name' attribute.
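  A minimal sketch of that re-check, using names that appear in the post.py diff further down (fetch_actor, TryAgainLater); the helper function itself is illustrative, not the literal patch:

    # Sketch: fetch_actor() may itself ingest the incoming post (as one of
    # the author's pinned posts), so look the post up again before creating
    # it, and treat a late IntegrityError as "retry later", not a failure.
    # Post and TryAgainLater come from the surrounding codebase.
    from django.db import IntegrityError, transaction

    def get_or_create_incoming_post(data: dict, author):
        author.fetch_actor()  # may pull in pinned posts, including this one
        try:
            return Post.objects.get(object_uri=data["id"])
        except Post.DoesNotExist:
            pass
        try:
            # Wrapped in atomic() so a failure doesn't break the outer transaction
            with transaction.atomic():
                return Post.objects.create(
                    object_uri=data["id"], author=author, content="", local=False
                )
        except IntegrityError:
            # A parallel worker created it between the check and the insert
            raise TryAgainLater()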
Humberto Rocha 5f49f9b2bb Add support to dismiss notifications (#605) 2023-07-11 16:37:03 -06:00
Osma Ahvenlampi 1cc9c16b8c Use 400 and 401 error codes as OAuth2 documents, accept 400 as webfinger error code (#597) 2023-07-10 10:19:20 -06:00
Humberto Rocha 91cf2f3a30 Add missing SignUp link to header (#606) 2023-07-10 10:13:57 -06:00
Andrew Godwin 68eea142b1 Fix domain index issue 2023-07-10 10:11:48 -06:00
Andrew Godwin 3f8213f54a Syncify another handler 2023-07-09 00:43:16 -06:00
Andrew Godwin 2523de4249 Prevent race condition between threads and locking 2023-07-09 00:42:56 -06:00
Andrew Godwin 933f6660d5 Catch all the subtypes too 2023-07-07 16:39:02 -06:00
Andrew Godwin 2fda9ad2b4 Also capture unknown message types 2023-07-07 16:33:55 -06:00
Andrew Godwin 4458594f04 Also capture JSON-LD errors 2023-07-07 16:32:57 -06:00
Andrew Godwin c93a27e418 Capture and don't thrash on badly formatted AP messages 2023-07-07 16:29:12 -06:00
Andrew Godwin 709f2527ac Refresh identities half as frequently 2023-07-07 15:52:12 -06:00
Andrew Godwin 7f483af8d3 Rework Stator to use a next field and no async 2023-07-07 15:14:06 -06:00
Andrew Godwin e34e4c0c77 Fixed #599: Interaction state not present on notifications 2023-07-05 07:58:54 -06:00
Humberto Rocha 542e3836af Add endpoint to get notification by id (#594) 2023-07-04 08:06:31 -06:00
Andrew Godwin 82a9c18205 Fixed #593: Add some docs for TAKAHE_CSRF_HOSTS 2023-07-02 20:41:38 +01:00
176 changed files with 4137 additions and 1433 deletions

View file

@@ -8,7 +8,7 @@ jobs:
timeout-minutes: 5
strategy:
matrix:
python-version: ["3.10"]
python-version: ["3.11"]
steps:
- uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}

View file

@@ -4,6 +4,8 @@ on:
push:
paths-ignore:
- 'docs/**'
branches:
- main
pull_request:
paths-ignore:
- 'docs/**'
@@ -15,7 +17,7 @@ jobs:
timeout-minutes: 8
strategy:
matrix:
python-version: ["3.10", "3.11"]
python-version: ["3.11", "3.12"]
db:
- "postgres://postgres:postgres@localhost/postgres"
include:
@@ -44,6 +46,7 @@ jobs:
cache: pip
- name: Install dependencies
run: |
sudo apt-get install -y libmemcached-dev libwebp-dev libjpeg-dev
python -m pip install -r requirements-dev.txt
- name: Run pytest
env:

View file

@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.4.0
rev: v4.5.0
hooks:
- id: check-case-conflict
- id: check-merge-conflict
@@ -15,19 +15,19 @@ repos:
- id: trailing-whitespace
- repo: https://github.com/asottile/pyupgrade
rev: "v3.3.0"
rev: "v3.15.0"
hooks:
- id: pyupgrade
args: [--py310-plus]
args: [--py311-plus]
- repo: https://github.com/adamchainz/django-upgrade
rev: "1.13.0"
rev: "1.15.0"
hooks:
- id: django-upgrade
args: [--target-version, "4.2"]
- repo: https://github.com/psf/black
rev: 22.10.0
- repo: https://github.com/psf/black-pre-commit-mirror
rev: 23.11.0
hooks:
- id: black
@@ -38,12 +38,12 @@ repos:
args: ["--profile=black"]
- repo: https://github.com/pycqa/flake8
rev: 6.0.0
rev: 6.1.0
hooks:
- id: flake8
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.991
rev: v1.6.1
hooks:
- id: mypy
exclude: "^tests/"
@@ -51,8 +51,6 @@ repos:
[types-pyopenssl, types-mock, types-cachetools, types-python-dateutil]
- repo: https://github.com/rtts/djhtml
rev: v1.5.2
rev: 3.0.6
hooks:
- id: djhtml
- id: djcss
- id: djjs

View file

@@ -4,7 +4,7 @@ version: 2
build:
os: ubuntu-22.04
tools:
python: "3.10"
python: "3.11"
# Build documentation in the docs/ directory with Sphinx
sphinx:

View file

@@ -3,7 +3,7 @@
A *beta* Fediverse server for microblogging. Not fully polished yet -
we're still working towards a 1.0!
**Current version: [0.9](https://docs.jointakahe.org/en/latest/releases/0.9/)**
**Current version: [0.11.0](https://docs.jointakahe.org/en/latest/releases/0.11/)**
Key features:

View file

@@ -210,8 +210,8 @@ class TimelineEventAdmin(admin.ModelAdmin):
@admin.register(FanOut)
class FanOutAdmin(admin.ModelAdmin):
list_display = ["id", "state", "created", "state_attempted", "type", "identity"]
list_filter = (IdentityLocalFilter, "type", "state", "state_attempted")
list_display = ["id", "state", "created", "state_next_attempt", "type", "identity"]
list_filter = (IdentityLocalFilter, "type", "state")
raw_id_fields = ["subject_post", "subject_post_interaction"]
autocomplete_fields = ["identity"]
readonly_fields = ["created", "updated", "state_changed"]
@@ -229,7 +229,7 @@ class FanOutAdmin(admin.ModelAdmin):
@admin.register(PostInteraction)
class PostInteractionAdmin(admin.ModelAdmin):
list_display = ["id", "state", "state_attempted", "type", "identity", "post"]
list_display = ["id", "state", "state_next_attempt", "type", "identity", "post"]
list_filter = (IdentityLocalFilter, "type", "state")
raw_id_fields = ["post"]
autocomplete_fields = ["identity"]

View file

@@ -0,0 +1,83 @@
import datetime
import sys
from django.conf import settings
from django.core.management.base import BaseCommand
from django.db.models import Q
from django.utils import timezone
from activities.models import Post
class Command(BaseCommand):
help = "Prunes posts that are old, not local and have no local interaction"
def add_arguments(self, parser):
parser.add_argument(
"--number",
"-n",
type=int,
default=500,
help="The maximum number of posts to prune at once",
)
def handle(self, number: int, *args, **options):
if not settings.SETUP.REMOTE_PRUNE_HORIZON:
print("Pruning has been disabled as REMOTE_PRUNE_HORIZON=0")
sys.exit(2)
# Find a set of posts that match the initial criteria
print(f"Running query to find up to {number} old posts...")
posts = (
Post.objects.filter(
local=False,
created__lt=timezone.now()
- datetime.timedelta(days=settings.SETUP.REMOTE_PRUNE_HORIZON),
)
.exclude(
Q(interactions__identity__local=True)
| Q(visibility=Post.Visibilities.mentioned)
)
.order_by("?")[:number]
)
post_ids_and_uris = dict(posts.values_list("object_uri", "id"))
print(f" found {len(post_ids_and_uris)}")
# Fetch all of their replies and exclude any that have local replies
print("Excluding ones with local replies...")
replies = Post.objects.filter(
local=True,
in_reply_to__in=post_ids_and_uris.keys(),
).values_list("in_reply_to", flat=True)
for reply in replies:
if reply and reply in post_ids_and_uris:
del post_ids_and_uris[reply]
print(f" narrowed down to {len(post_ids_and_uris)}")
# Fetch all the posts that they are replies to, and don't delete ones
# that are replies to local posts
print("Excluding ones that are replies to local posts...")
in_reply_tos = (
Post.objects.filter(id__in=post_ids_and_uris.values())
.values_list("in_reply_to", flat=True)
.distinct()
)
local_object_uris = Post.objects.filter(
local=True, object_uri__in=in_reply_tos
).values_list("object_uri", flat=True)
final_post_ids = list(
Post.objects.filter(id__in=post_ids_and_uris.values())
.exclude(in_reply_to__in=local_object_uris)
.values_list("id", flat=True)
)
print(f" narrowed down to {len(final_post_ids)}")
# Delete them
if not final_post_ids:
sys.exit(0)
print("Deleting...")
_, deleted = Post.objects.filter(id__in=final_post_ids).delete()
print("Deleted:")
for model, model_deleted in deleted.items():
print(f" {model}: {model_deleted}")
sys.exit(1)
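
A usage sketch for the command above (assuming it is registered as "pruneposts", which the hunk does not show): it exits 1 after deleting a batch, 0 when nothing matched, and 2 when pruning is disabled, so a caller can loop until the backlog drains.

    # Sketch: re-run the prune command until it stops reporting deletions.
    from django.core.management import call_command

    def prune_all_old_posts(batch: int = 500) -> None:
        while True:
            try:
                call_command("pruneposts", number=batch)
            except SystemExit as exc:
                # 1 = deleted a batch (continue); 0 = nothing left to do;
                # 2 = disabled via REMOTE_PRUNE_HORIZON=0.
                if exc.code != 1:
                    break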

View file

@@ -16,7 +16,6 @@ import stator.models
class Migration(migrations.Migration):
initial = True
dependencies = [
@@ -264,6 +263,7 @@ class Migration(migrations.Migration):
("identity_edited", "Identity Edited"),
("identity_deleted", "Identity Deleted"),
("identity_created", "Identity Created"),
("identity_moved", "Identity Moved"),
],
max_length=100,
),
@@ -324,6 +324,7 @@ class Migration(migrations.Migration):
("mentioned", "Mentioned"),
("liked", "Liked"),
("followed", "Followed"),
("follow_requested", "Follow Requested"),
("boosted", "Boosted"),
("announcement", "Announcement"),
("identity_created", "Identity Created"),

View file

@@ -8,7 +8,6 @@ import stator.models
class Migration(migrations.Migration):
dependencies = [
("activities", "0001_initial"),
]

View file

@@ -10,7 +10,6 @@ import core.uploads
class Migration(migrations.Migration):
dependencies = [
("activities", "0002_hashtag"),
]

View file

@@ -11,7 +11,6 @@ import stator.models
class Migration(migrations.Migration):
dependencies = [
("users", "0003_identity_followers_etc"),
("activities", "0003_postattachment_null_thumb"),

View file

@@ -14,7 +14,6 @@ def timelineevent_populate_published(apps, schema_editor):
class Migration(migrations.Migration):
dependencies = [
("activities", "0004_emoji_post_emojis"),
]

View file

@@ -5,7 +5,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("users", "0005_report"),
("activities", "0005_post_type_timeline_urls"),

View file

@@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0006_fanout_subject_identity_alter_fanout_type"),
]

View file

@@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0007_post_stats"),
]

View file

@@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("users", "0011_announcement"),
("activities", "0008_state_and_post_indexes"),

View file

@@ -4,7 +4,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("users", "0013_stator_indexes"),
("activities", "0009_alter_timelineevent_index_together"),

View file

@@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0010_stator_indexes"),
]

View file

@@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0011_postinteraction_value_alter_postinteraction_type"),
]

View file

@@ -5,7 +5,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("users", "0015_bookmark"),
("activities", "0012_in_reply_to_index"),

View file

@@ -6,7 +6,6 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("activities", "0013_postattachment_author"),
]

View file

@@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0014_post_content_vector_gin"),
]

View file

@@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0015_alter_postinteraction_type"),
]

View file

@@ -0,0 +1,234 @@
# Generated by Django 4.2.1 on 2023-07-05 22:18
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0016_index_together_migration"),
]
operations = [
migrations.RemoveIndex(
model_name="emoji",
name="activities__state_r_aa72ec_idx",
),
migrations.RemoveIndex(
model_name="emoji",
name="ix_emoji_state_attempted",
),
migrations.RemoveIndex(
model_name="emoji",
name="ix_emoji_state_locked",
),
migrations.RemoveIndex(
model_name="fanout",
name="ix_fanout_state_attempted",
),
migrations.RemoveIndex(
model_name="fanout",
name="ix_fanout_state_locked",
),
migrations.RemoveIndex(
model_name="fanout",
name="activities__state_r_aae3b4_idx",
),
migrations.RemoveIndex(
model_name="hashtag",
name="ix_hashtag_state_attempted",
),
migrations.RemoveIndex(
model_name="hashtag",
name="ix_hashtag_state_locked",
),
migrations.RemoveIndex(
model_name="hashtag",
name="activities__state_r_5703be_idx",
),
migrations.RemoveIndex(
model_name="post",
name="ix_post_state_attempted",
),
migrations.RemoveIndex(
model_name="post",
name="ix_post_state_locked",
),
migrations.RemoveIndex(
model_name="post",
name="activities__state_r_b8f1ff_idx",
),
migrations.RemoveIndex(
model_name="postattachment",
name="ix_postattachm_state_attempted",
),
migrations.RemoveIndex(
model_name="postattachment",
name="ix_postattachm_state_locked",
),
migrations.RemoveIndex(
model_name="postattachment",
name="activities__state_r_4e981c_idx",
),
migrations.RemoveIndex(
model_name="postinteraction",
name="activities__state_r_981d8c_idx",
),
migrations.RemoveIndex(
model_name="postinteraction",
name="ix_postinterac_state_attempted",
),
migrations.RemoveIndex(
model_name="postinteraction",
name="ix_postinterac_state_locked",
),
migrations.RemoveField(
model_name="emoji",
name="state_attempted",
),
migrations.RemoveField(
model_name="emoji",
name="state_ready",
),
migrations.RemoveField(
model_name="fanout",
name="state_attempted",
),
migrations.RemoveField(
model_name="fanout",
name="state_ready",
),
migrations.RemoveField(
model_name="hashtag",
name="state_attempted",
),
migrations.RemoveField(
model_name="hashtag",
name="state_ready",
),
migrations.RemoveField(
model_name="post",
name="state_attempted",
),
migrations.RemoveField(
model_name="post",
name="state_ready",
),
migrations.RemoveField(
model_name="postattachment",
name="state_attempted",
),
migrations.RemoveField(
model_name="postattachment",
name="state_ready",
),
migrations.RemoveField(
model_name="postinteraction",
name="state_attempted",
),
migrations.RemoveField(
model_name="postinteraction",
name="state_ready",
),
migrations.AddField(
model_name="emoji",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name="fanout",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name="hashtag",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name="post",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name="postattachment",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name="postinteraction",
name="state_next_attempt",
field=models.DateTimeField(blank=True, null=True),
),
migrations.AlterField(
model_name="emoji",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AlterField(
model_name="fanout",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AlterField(
model_name="hashtag",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AlterField(
model_name="post",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AlterField(
model_name="postattachment",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AlterField(
model_name="postinteraction",
name="state_locked_until",
field=models.DateTimeField(blank=True, db_index=True, null=True),
),
migrations.AddIndex(
model_name="emoji",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_emoji_state_next",
),
),
migrations.AddIndex(
model_name="fanout",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_fanout_state_next",
),
),
migrations.AddIndex(
model_name="hashtag",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_hashtag_state_next",
),
),
migrations.AddIndex(
model_name="post",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_post_state_next",
),
),
migrations.AddIndex(
model_name="postattachment",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_postattachm_state_next",
),
),
migrations.AddIndex(
model_name="postinteraction",
index=models.Index(
fields=["state", "state_next_attempt", "state_locked_until"],
name="ix_postinterac_state_next",
),
),
]

View file

@@ -0,0 +1,17 @@
# Generated by Django 4.2.2 on 2023-07-09 17:25
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0017_stator_next_change"),
]
operations = [
migrations.AddField(
model_name="timelineevent",
name="dismissed",
field=models.BooleanField(default=False),
),
]

View file

@@ -0,0 +1,22 @@
# Generated by Django 4.2.3 on 2023-10-30 07:44
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("activities", "0018_timelineevent_dismissed"),
]
operations = [
migrations.AlterField(
model_name="postattachment",
name="focal_x",
field=models.FloatField(blank=True, null=True),
),
migrations.AlterField(
model_name="postattachment",
name="focal_y",
field=models.FloatField(blank=True, null=True),
),
]

View file

@@ -4,7 +4,6 @@ from typing import ClassVar
import httpx
import urlman
from asgiref.sync import sync_to_async
from cachetools import TTLCache, cached
from django.conf import settings
from django.core.exceptions import ValidationError
@@ -35,13 +34,13 @@ class EmojiStates(StateGraph):
outdated.transitions_to(updated)
@classmethod
async def handle_outdated(cls, instance: "Emoji"):
def handle_outdated(cls, instance: "Emoji"):
"""
Fetches remote emoji and uploads to file for local caching
"""
if instance.remote_url and not instance.file:
try:
file, mimetype = await get_remote_file(
file, mimetype = get_remote_file(
instance.remote_url,
timeout=settings.SETUP.REMOTE_TIMEOUT,
max_size=settings.SETUP.EMOJI_MAX_IMAGE_FILESIZE_KB * 1024,
@@ -55,7 +54,7 @@ class EmojiStates(StateGraph):
instance.file = file
instance.mimetype = mimetype
await sync_to_async(instance.save)()
instance.save()
return cls.updated
@@ -127,7 +126,7 @@ class Emoji(StatorModel):
class Meta:
unique_together = ("domain", "shortcode")
indexes = StatorModel.Meta.indexes
indexes: list = [] # We need this so Stator can add its own
class urls(urlman.Urls):
admin = "/admin/emoji/"
@@ -282,7 +281,7 @@ class Emoji(StatorModel):
# Name could be a direct property, or in a language'd value
if "name" in data:
name = data["name"]
elif "nameMap" in data:
elif "nameMap" in data and "und" in data["nameMap"]:
name = data["nameMap"]["und"]
else:
raise ValueError("No name on emoji JSON")
@@ -314,11 +313,11 @@ class Emoji(StatorModel):
emoji.remote_url = icon["url"]
emoji.mimetype = mimetype
emoji.category = category
emoji.transition_set_state("outdated")
if emoji.file:
emoji.file.delete(save=True)
else:
emoji.save()
emoji.transition_perform("outdated")
return emoji
emoji = cls.objects.create(

View file

@@ -1,5 +1,4 @@
import httpx
from asgiref.sync import sync_to_async
from django.db import models
from activities.models.timeline_event import TimelineEvent
@@ -19,26 +18,24 @@ class FanOutStates(StateGraph):
new.times_out_to(failed, seconds=86400 * 3)
@classmethod
async def handle_new(cls, instance: "FanOut"):
def handle_new(cls, instance: "FanOut"):
"""
Sends the fan-out to the right inbox.
"""
fan_out = await instance.afetch_full()
# Don't try to fan out to identities that are not fetched yet
if not (fan_out.identity.local or fan_out.identity.inbox_uri):
if not (instance.identity.local or instance.identity.inbox_uri):
return
match (fan_out.type, fan_out.identity.local):
match (instance.type, instance.identity.local):
# Handle creating/updating local posts
case ((FanOut.Types.post | FanOut.Types.post_edited), True):
post = await fan_out.subject_post.afetch_full()
post = instance.subject_post
# If the author of the post is blocked or muted, skip out
if (
await Block.objects.active()
.filter(source=fan_out.identity, target=post.author)
.aexists()
Block.objects.active()
.filter(source=instance.identity, target=post.author)
.exists()
):
return cls.skipped
# Make a timeline event directly
@@ -48,42 +45,42 @@ class FanOutStates(StateGraph):
add = True
mentioned = {identity.id for identity in post.mentions.all()}
if post.in_reply_to:
followed = await sync_to_async(set)(
fan_out.identity.outbound_follows.filter(
followed = set(
instance.identity.outbound_follows.filter(
state__in=FollowStates.group_active()
).values_list("target_id", flat=True)
)
interested_in = followed.union(
{post.author_id, fan_out.identity_id}
{post.author_id, instance.identity_id}
)
add = (post.author_id in followed) and (
bool(mentioned.intersection(interested_in))
)
if add:
await sync_to_async(TimelineEvent.add_post)(
identity=fan_out.identity,
TimelineEvent.add_post(
identity=instance.identity,
post=post,
)
# We might have been mentioned
if (
fan_out.identity.id in mentioned
and fan_out.identity_id != post.author_id
instance.identity.id in mentioned
and instance.identity_id != post.author_id
):
await sync_to_async(TimelineEvent.add_mentioned)(
identity=fan_out.identity,
TimelineEvent.add_mentioned(
identity=instance.identity,
post=post,
)
# Handle sending remote posts create
case (FanOut.Types.post, False):
post = await fan_out.subject_post.afetch_full()
post = instance.subject_post
# Sign it and send it
try:
await post.author.signed_request(
post.author.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(post.to_create_ap()),
)
@@ -92,14 +89,14 @@ class FanOutStates(StateGraph):
# Handle sending remote posts update
case (FanOut.Types.post_edited, False):
post = await fan_out.subject_post.afetch_full()
post = instance.subject_post
# Sign it and send it
try:
await post.author.signed_request(
post.author.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(post.to_update_ap()),
)
@@ -108,24 +105,24 @@ class FanOutStates(StateGraph):
# Handle deleting local posts
case (FanOut.Types.post_deleted, True):
post = await fan_out.subject_post.afetch_full()
if fan_out.identity.local:
post = instance.subject_post
if instance.identity.local:
# Remove all timeline events mentioning it
await TimelineEvent.objects.filter(
identity=fan_out.identity,
TimelineEvent.objects.filter(
identity=instance.identity,
subject_post=post,
).adelete()
).delete()
# Handle sending remote post deletes
case (FanOut.Types.post_deleted, False):
post = await fan_out.subject_post.afetch_full()
post = instance.subject_post
# Send it to the remote inbox
try:
await post.author.signed_request(
post.author.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(post.to_delete_ap()),
)
@@ -134,51 +131,51 @@ class FanOutStates(StateGraph):
# Handle local boosts/likes
case (FanOut.Types.interaction, True):
interaction = await fan_out.subject_post_interaction.afetch_full()
interaction = instance.subject_post_interaction
# If the author of the interaction is blocked or their notifications
# are muted, skip out
if (
await Block.objects.active()
Block.objects.active()
.filter(
models.Q(mute=False) | models.Q(include_notifications=True),
source=fan_out.identity,
source=instance.identity,
target=interaction.identity,
)
.aexists()
.exists()
):
return cls.skipped
# If blocked/muted the underlying post author, skip out
if (
await Block.objects.active()
Block.objects.active()
.filter(
source=fan_out.identity,
source=instance.identity,
target_id=interaction.post.author_id,
)
.aexists()
.exists()
):
return cls.skipped
# Make a timeline event directly
await sync_to_async(TimelineEvent.add_post_interaction)(
identity=fan_out.identity,
TimelineEvent.add_post_interaction(
identity=instance.identity,
interaction=interaction,
)
# Handle sending remote boosts/likes/votes/pins
case (FanOut.Types.interaction, False):
interaction = await fan_out.subject_post_interaction.afetch_full()
interaction = instance.subject_post_interaction
# Send it to the remote inbox
try:
if interaction.type == interaction.Types.vote:
body = interaction.to_ap()
body = interaction.to_create_ap()
elif interaction.type == interaction.Types.pin:
body = interaction.to_add_ap()
else:
body = interaction.to_create_ap()
await interaction.identity.signed_request(
body = interaction.to_ap()
interaction.identity.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(body),
)
@@ -187,28 +184,28 @@ class FanOutStates(StateGraph):
# Handle undoing local boosts/likes
case (FanOut.Types.undo_interaction, True): # noqa:F841
interaction = await fan_out.subject_post_interaction.afetch_full()
interaction = instance.subject_post_interaction
# Delete any local timeline events
await sync_to_async(TimelineEvent.delete_post_interaction)(
identity=fan_out.identity,
TimelineEvent.delete_post_interaction(
identity=instance.identity,
interaction=interaction,
)
# Handle sending remote undoing boosts/likes/pins
case (FanOut.Types.undo_interaction, False): # noqa:F841
interaction = await fan_out.subject_post_interaction.afetch_full()
interaction = instance.subject_post_interaction
# Send an undo to the remote inbox
try:
if interaction.type == interaction.Types.pin:
body = interaction.to_remove_ap()
else:
body = interaction.to_undo_ap()
await interaction.identity.signed_request(
interaction.identity.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(body),
)
@@ -217,36 +214,38 @@ class FanOutStates(StateGraph):
# Handle sending identity edited to remote
case (FanOut.Types.identity_edited, False):
identity = await fan_out.subject_identity.afetch_full()
identity = instance.subject_identity
try:
await identity.signed_request(
identity.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
),
body=canonicalise(
await sync_to_async(fan_out.subject_identity.to_update_ap)()
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(instance.subject_identity.to_update_ap()),
)
except httpx.RequestError:
return
# Handle sending identity deleted to remote
case (FanOut.Types.identity_deleted, False):
identity = await fan_out.subject_identity.afetch_full()
identity = instance.subject_identity
try:
await identity.signed_request(
identity.signed_request(
method="post",
uri=(
fan_out.identity.shared_inbox_uri
or fan_out.identity.inbox_uri
instance.identity.shared_inbox_uri
or instance.identity.inbox_uri
),
body=canonicalise(fan_out.subject_identity.to_delete_ap()),
body=canonicalise(instance.subject_identity.to_delete_ap()),
)
except httpx.RequestError:
return
# Handle sending identity moved to remote
case (FanOut.Types.identity_moved, False):
raise NotImplementedError()
# Sending identity edited/deleted to local is a no-op
case (FanOut.Types.identity_edited, True):
pass
@@ -255,14 +254,14 @@ class FanOutStates(StateGraph):
# Created identities make a timeline event
case (FanOut.Types.identity_created, True):
await sync_to_async(TimelineEvent.add_identity_created)(
identity=fan_out.identity,
new_identity=fan_out.subject_identity,
TimelineEvent.add_identity_created(
identity=instance.identity,
new_identity=instance.subject_identity,
)
case _:
raise ValueError(
f"Cannot fan out with type {fan_out.type} local={fan_out.identity.local}"
f"Cannot fan out with type {instance.type} local={instance.identity.local}"
)
return cls.sent
@@ -282,6 +281,7 @@ class FanOut(StatorModel):
identity_edited = "identity_edited"
identity_deleted = "identity_deleted"
identity_created = "identity_created"
identity_moved = "identity_moved"
state = StateField(FanOutStates)
@@ -323,23 +323,3 @@ class FanOut(StatorModel):
created = models.DateTimeField(auto_now_add=True)
updated = models.DateTimeField(auto_now=True)
### Async helpers ###
async def afetch_full(self):
"""
Returns a version of the object with all relations pre-loaded
"""
return (
await FanOut.objects.select_related(
"identity",
"subject_post",
"subject_post_interaction",
"subject_identity",
"subject_identity__domain",
)
.prefetch_related(
"subject_post__emojis",
)
.aget(pk=self.pk)
)

View file

@@ -2,7 +2,6 @@ import re
from datetime import date, timedelta
import urlman
from asgiref.sync import sync_to_async
from django.db import models
from django.utils import timezone
@@ -18,31 +17,27 @@ class HashtagStates(StateGraph):
updated.transitions_to(outdated)
@classmethod
async def handle_outdated(cls, instance: "Hashtag"):
def handle_outdated(cls, instance: "Hashtag"):
"""
Computes the stats and other things for a Hashtag
"""
from time import time
from .post import Post
start = time()
posts_query = Post.objects.local_public().tagged_with(instance)
total = await posts_query.acount()
total = posts_query.count()
today = timezone.now().date()
total_today = await posts_query.filter(
total_today = posts_query.filter(
created__gte=today,
created__lte=today + timedelta(days=1),
).acount()
total_month = await posts_query.filter(
).count()
total_month = posts_query.filter(
created__year=today.year,
created__month=today.month,
).acount()
total_year = await posts_query.filter(
).count()
total_year = posts_query.filter(
created__year=today.year,
).acount()
).count()
if total:
if not instance.stats:
instance.stats = {}
@@ -55,9 +50,8 @@ class HashtagStates(StateGraph):
}
)
instance.stats_updated = timezone.now()
await sync_to_async(instance.save)()
instance.save()
print(f"Updated hashtag {instance.hashtag} in {time() - start:.5f} seconds")
return cls.updated
@@ -86,7 +80,6 @@ class HashtagManager(models.Manager):
class Hashtag(StatorModel):
MAXIMUM_LENGTH = 100
# Normalized hashtag without the '#'

View file

@@ -1,5 +1,6 @@
import datetime
import json
import logging
import mimetypes
import ssl
from collections.abc import Iterable
@@ -8,13 +9,15 @@ from urllib.parse import urlparse
import httpx
import urlman
from asgiref.sync import async_to_sync, sync_to_async
from django.conf import settings
from django.contrib.postgres.indexes import GinIndex
from django.contrib.postgres.search import SearchVector
from django.db import models, transaction
from django.db.utils import IntegrityError
from django.template import loader
from django.template.defaultfilters import linebreaks_filter
from django.utils import timezone
from pyld.jsonld import JsonLdError
from activities.models.emoji import Emoji
from activities.models.fan_out import FanOut
@@ -25,7 +28,7 @@ from activities.models.post_types import (
PostTypeDataEncoder,
QuestionData,
)
from core.exceptions import capture_message
from core.exceptions import ActivityPubFormatError
from core.html import ContentRenderer, FediverseHtmlParser
from core.ld import (
canonicalise,
@@ -43,17 +46,20 @@ from users.models.identity import Identity, IdentityStates
from users.models.inbox_message import InboxMessage
from users.models.system_actor import SystemActor
logger = logging.getLogger(__name__)
class PostStates(StateGraph):
new = State(try_interval=300)
fanned_out = State(externally_progressed=True)
deleted = State(try_interval=300)
deleted_fanned_out = State(delete_after=24 * 60 * 60)
deleted_fanned_out = State(delete_after=86400)
edited = State(try_interval=300)
edited_fanned_out = State(externally_progressed=True)
new.transitions_to(fanned_out)
fanned_out.transitions_to(deleted_fanned_out)
fanned_out.transitions_to(deleted)
fanned_out.transitions_to(edited)
@@ -63,45 +69,66 @@ class PostStates(StateGraph):
edited_fanned_out.transitions_to(deleted)
@classmethod
async def targets_fan_out(cls, post: "Post", type_: str) -> None:
def targets_fan_out(cls, post: "Post", type_: str) -> None:
# Fan out to each target
for follow in await post.aget_targets():
await FanOut.objects.acreate(
for follow in post.get_targets():
FanOut.objects.create(
identity=follow,
type=type_,
subject_post=post,
)
@classmethod
async def handle_new(cls, instance: "Post"):
def handle_new(cls, instance: "Post"):
"""
Creates all needed fan-out objects for a new Post.
"""
post = await instance.afetch_full()
# Only fan out if the post was published in the last day or it's local
# (we don't want to fan out anything older that that which is remote)
if post.local or (timezone.now() - post.published) < datetime.timedelta(days=1):
await cls.targets_fan_out(post, FanOut.Types.post)
await post.ensure_hashtags()
if instance.local or (timezone.now() - instance.published) < datetime.timedelta(
days=1
):
cls.targets_fan_out(instance, FanOut.Types.post)
instance.ensure_hashtags()
return cls.fanned_out
@classmethod
async def handle_deleted(cls, instance: "Post"):
def handle_fanned_out(cls, instance: "Post"):
"""
Creates all needed fan-out objects needed to delete a Post.
For remote posts, sees if we can delete them every so often.
"""
post = await instance.afetch_full()
await cls.targets_fan_out(post, FanOut.Types.post_deleted)
# Skip all of this if the horizon is zero
if settings.SETUP.REMOTE_PRUNE_HORIZON <= 0:
return
# To be a candidate for deletion, a post must be remote and old enough
if instance.local:
return
if instance.created > timezone.now() - datetime.timedelta(
days=settings.SETUP.REMOTE_PRUNE_HORIZON
):
return
# It must have no local interactions
if instance.interactions.filter(identity__local=True).exists():
return
# OK, delete it!
instance.delete()
return cls.deleted_fanned_out
@classmethod
async def handle_edited(cls, instance: "Post"):
def handle_deleted(cls, instance: "Post"):
"""
Creates all needed fan-out objects needed to delete a Post.
"""
cls.targets_fan_out(instance, FanOut.Types.post_deleted)
return cls.deleted_fanned_out
@classmethod
def handle_edited(cls, instance: "Post"):
"""
Creates all needed fan-out objects for an edited Post.
"""
post = await instance.afetch_full()
await cls.targets_fan_out(post, FanOut.Types.post_edited)
await post.ensure_hashtags()
cls.targets_fan_out(instance, FanOut.Types.post_edited)
instance.ensure_hashtags()
return cls.edited_fanned_out
@@ -324,7 +351,7 @@ class Post(StatorModel):
fields=["visibility", "local", "created"],
name="ix_post_local_public_created",
),
] + StatorModel.Meta.indexes
]
class urls(urlman.Urls):
view = "{self.author.urls.view}posts/{self.id}/"
@@ -375,8 +402,6 @@ class Post(StatorModel):
.first()
)
ain_reply_to_post = sync_to_async(in_reply_to_post)
### Content cleanup and extraction ###
def clean_type_data(self, value):
PostTypeData.parse_obj(value)
@@ -448,18 +473,6 @@ class Post(StatorModel):
"replies": self.stats.get("replies", 0) if self.stats else 0,
}
### Async helpers ###
async def afetch_full(self) -> "Post":
"""
Returns a version of the object with all relations pre-loaded
"""
return (
await Post.objects.select_related("author", "author__domain")
.prefetch_related("mentions", "mentions__domain", "attachments", "emojis")
.aget(pk=self.pk)
)
### Local creation/editing ###
@classmethod
@@ -552,6 +565,8 @@ class Post(StatorModel):
attachment.name = attrs.description
attachment.save()
self.transition_perform(PostStates.edited)
@classmethod
def mentions_from_content(cls, content, author) -> set[Identity]:
mention_hits = FediverseHtmlParser(content, find_mentions=True).mentions
@@ -568,11 +583,11 @@ class Post(StatorModel):
domain=domain,
fetch=True,
)
if identity is not None:
if identity is not None and not identity.deleted:
mentions.add(identity)
return mentions
async def ensure_hashtags(self) -> None:
def ensure_hashtags(self) -> None:
"""
Ensure any of the already parsed hashtags from this Post
have a corresponding Hashtag record.
@@ -580,10 +595,10 @@ class Post(StatorModel):
# Ensure hashtags
if self.hashtags:
for hashtag in self.hashtags:
tag, _ = await Hashtag.objects.aget_or_create(
tag, _ = Hashtag.objects.get_or_create(
hashtag=hashtag[: Hashtag.MAXIMUM_LENGTH],
)
await tag.atransition_perform(HashtagStates.outdated)
tag.transition_perform(HashtagStates.outdated)
def calculate_stats(self, save=True):
"""
@@ -635,6 +650,7 @@ class Post(StatorModel):
"""
Returns the AP JSON for this object
"""
self.author.ensure_uris()
value = {
"to": [],
"cc": [],
@@ -667,11 +683,14 @@ class Post(StatorModel):
if self.edited:
value["updated"] = format_ld_date(self.edited)
# Targeting
# TODO: Add followers object
if self.visibility == self.Visibilities.public:
value["to"].append("as:Public")
elif self.visibility == self.Visibilities.unlisted:
value["cc"].append("as:Public")
elif (
self.visibility == self.Visibilities.followers and self.author.followers_uri
):
value["to"].append(self.author.followers_uri)
# Mentions
for mention in self.mentions.all():
value["tag"].append(mention.to_ap_tag())
@@ -739,33 +758,36 @@ class Post(StatorModel):
"object": object,
}
async def aget_targets(self) -> Iterable[Identity]:
def get_targets(self) -> Iterable[Identity]:
"""
Returns a list of Identities that need to see posts and their changes
"""
targets = set()
async for mention in self.mentions.all():
for mention in self.mentions.all():
targets.add(mention)
if self.visibility in [Post.Visibilities.public, Post.Visibilities.unlisted]:
for interaction in self.interactions.all():
targets.add(interaction.identity)
# Then, if it's not mentions only, also deliver to followers and all hashtag followers
if self.visibility != Post.Visibilities.mentioned:
async for follower in self.author.inbound_follows.filter(
for follower in self.author.inbound_follows.filter(
state__in=FollowStates.group_active()
).select_related("source"):
targets.add(follower.source)
if self.hashtags:
async for follow in HashtagFollow.objects.by_hashtags(
for follow in HashtagFollow.objects.by_hashtags(
self.hashtags
).prefetch_related("identity"):
targets.add(follow.identity)
# If it's a reply, always include the original author if we know them
reply_post = await self.ain_reply_to_post()
reply_post = self.in_reply_to_post()
if reply_post:
targets.add(reply_post.author)
# And if it's a reply to one of our own, we have to re-fan-out to
# the original author's followers
if reply_post.author.local:
async for follower in reply_post.author.inbound_follows.filter(
for follower in reply_post.author.inbound_follows.filter(
state__in=FollowStates.group_active()
).select_related("source"):
targets.add(follower.source)
@@ -782,7 +804,7 @@ class Post(StatorModel):
.filter(mute=False)
.select_related("target")
)
async for block in blocks:
for block in blocks:
try:
targets.remove(block.target)
except KeyError:
@@ -842,32 +864,52 @@ class Post(StatorModel):
# If the author is not fetched yet, try again later
if author.domain is None:
if fetch_author:
async_to_sync(author.fetch_actor)()
if author.domain is None:
if not author.fetch_actor() or author.domain is None:
raise TryAgainLater()
else:
raise TryAgainLater()
# If the post is from a blocked domain, stop and drop
if author.domain.blocked:
if author.domain.recursively_blocked():
raise cls.DoesNotExist("Post is from a blocked domain")
post = cls.objects.create(
object_uri=data["id"],
author=author,
content="",
local=False,
type=data["type"],
)
created = True
# parallelism may cause another simultaneous worker thread
# to try to create the same post - so watch for that and
# try to avoid failing the entire transaction
try:
# wrapped in a transaction to avoid breaking the outer
# transaction
with transaction.atomic():
post = cls.objects.create(
object_uri=data["id"],
author=author,
content="",
local=False,
type=data["type"],
)
created = True
except IntegrityError:
# despite previous checks, a parallel thread managed
# to create the same object already
raise TryAgainLater()
else:
raise cls.DoesNotExist(f"No post with ID {data['id']}", data)
if update or created:
post.type = data["type"]
post.url = data.get("url", data["id"])
if post.type in (cls.Types.article, cls.Types.question):
post.type_data = PostTypeData(__root__=data).__root__
post.content = get_value_or_map(data, "content", "contentMap")
post.summary = data.get("summary")
try:
# apparently sometimes posts (Pages?) in the fediverse
# don't have content, but this shouldn't be a total failure
post.content = get_value_or_map(data, "content", "contentMap")
except ActivityPubFormatError as err:
logger.warning("%s on %s", err, post.url)
post.content = None
# Document types have names, not summaries
post.summary = data.get("summary") or data.get("name")
if not post.content and post.summary:
post.content = post.summary
post.summary = None
post.sensitive = data.get("sensitive", False)
post.url = data.get("url", data["id"])
post.published = parse_ld_date(data.get("published"))
post.edited = parse_ld_date(data.get("updated"))
post.in_reply_to = data.get("inReplyTo")
@@ -879,21 +921,22 @@ class Post(StatorModel):
mention_identity = Identity.by_actor_uri(tag["href"], create=True)
post.mentions.add(mention_identity)
elif tag_type in ["_:hashtag", "hashtag"]:
# kbin produces tags with 'tag' instead of 'name'
if "tag" in tag and "name" not in tag:
name = get_value_or_map(tag, "tag", "tagMap")
else:
name = get_value_or_map(tag, "name", "nameMap")
post.hashtags.append(
get_value_or_map(tag, "name", "nameMap")
.lower()
.lstrip("#")[: Hashtag.MAXIMUM_LENGTH]
name.lower().lstrip("#")[: Hashtag.MAXIMUM_LENGTH]
)
elif tag_type in ["toot:emoji", "emoji"]:
emoji = Emoji.by_ap_tag(post.author.domain, tag, create=True)
post.emojis.add(emoji)
elif tag_type == "edition":
# Bookwyrm Edition is similar to hashtags. There should be a link to
# the book in the Note's content and a post attachment of the cover
# image. No special processing should be needed for ingest.
pass
else:
raise ValueError(f"Unknown tag type {tag['type']}")
# Various ActivityPub implementations and proposals introduced tag
# types, e.g. Edition in Bookwyrm and Link in fep-e232 Object Links
# it should be safe to ignore (and log) them before a full support
pass
# Visibility and to
# (a post is public if it's to:public, otherwise it's unlisted if
# it's cc:public, otherwise it's more limited)
@@ -904,10 +947,15 @@ class Post(StatorModel):
post.visibility = Post.Visibilities.public
elif "public" in cc or "as:public" in cc:
post.visibility = Post.Visibilities.unlisted
elif post.author.followers_uri in to:
post.visibility = Post.Visibilities.followers
# Attachments
# These have no IDs, so we have to wipe them each time
post.attachments.all().delete()
for attachment in get_list(data, "attachment"):
if "url" not in attachment and "href" in attachment:
# Links have hrefs, while other Objects have urls
attachment["url"] = attachment["href"]
if "focalPoint" in attachment:
try:
focal_x, focal_y = attachment["focalPoint"]
@@ -917,6 +965,10 @@ class Post(StatorModel):
focal_x, focal_y = None, None
mimetype = attachment.get("mediaType")
if not mimetype or not isinstance(mimetype, str):
if "url" not in attachment:
raise ActivityPubFormatError(
f"No URL present on attachment in {post.url}"
)
mimetype, _ = mimetypes.guess_type(attachment["url"])
if not mimetype:
mimetype = "application/octet-stream"
@@ -932,7 +984,11 @@ class Post(StatorModel):
)
# Calculate stats in case we have existing replies
post.calculate_stats(save=False)
post.save()
with transaction.atomic():
# if we don't commit the transaction here, there's a chance
# the parent fetch below goes into an infinite loop
post.save()
# Potentially schedule a fetch of the reply parent, and recalculate
# its stats if it's here already.
if post.in_reply_to:
@@ -942,8 +998,10 @@ class Post(StatorModel):
try:
cls.ensure_object_uri(post.in_reply_to, reason=post.object_uri)
except ValueError:
capture_message(
f"Cannot fetch ancestor of Post={post.pk}, ancestor_uri={post.in_reply_to}"
logger.warning(
"Cannot fetch ancestor of Post=%s, ancestor_uri=%s",
post.pk,
post.in_reply_to,
)
else:
parent.calculate_stats()
@@ -960,10 +1018,10 @@ class Post(StatorModel):
except cls.DoesNotExist:
if fetch:
try:
response = async_to_sync(SystemActor().signed_request)(
response = SystemActor().signed_request(
method="get", uri=object_uri
)
except (httpx.HTTPError, ssl.SSLCertVerificationError):
except (httpx.HTTPError, ssl.SSLCertVerificationError, ValueError):
raise cls.DoesNotExist(f"Could not fetch {object_uri}")
if response.status_code in [404, 410]:
raise cls.DoesNotExist(f"No post at {object_uri}")
@@ -981,11 +1039,13 @@ class Post(StatorModel):
update=True,
fetch_author=True,
)
except (json.JSONDecodeError, ValueError):
raise cls.DoesNotExist(f"Invalid ld+json response for {object_uri}")
except (json.JSONDecodeError, ValueError, JsonLdError) as err:
raise cls.DoesNotExist(
f"Invalid ld+json response for {object_uri}"
) from err
# We may need to fetch the author too
if post.author.state == IdentityStates.outdated:
async_to_sync(post.author.fetch_actor)()
post.author.fetch_actor()
return post
else:
raise cls.DoesNotExist(f"Cannot find Post with URI {object_uri}")
@@ -1019,7 +1079,7 @@ class Post(StatorModel):
if data["actor"] != data["object"]["attributedTo"]:
raise ValueError("Create actor does not match its Post object", data)
# Create it, stator will fan it out locally
cls.by_ap(data["object"], create=True, update=True)
cls.by_ap(data["object"], create=True, update=True, fetch_author=True)
@classmethod
def handle_update_ap(cls, data):

View file

@@ -57,8 +57,8 @@ class PostAttachment(StatorModel):
width = models.IntegerField(null=True, blank=True)
height = models.IntegerField(null=True, blank=True)
focal_x = models.IntegerField(null=True, blank=True)
focal_y = models.IntegerField(null=True, blank=True)
focal_x = models.FloatField(null=True, blank=True)
focal_y = models.FloatField(null=True, blank=True)
blurhash = models.TextField(null=True, blank=True)
created = models.DateTimeField(auto_now_add=True)
@@ -113,7 +113,7 @@ class PostAttachment(StatorModel):
### ActivityPub ###
def to_ap(self):
return {
ap = {
"url": self.file.url,
"name": self.name,
"type": "Document",
@@ -122,6 +122,10 @@ class PostAttachment(StatorModel):
"mediaType": self.mimetype,
"blurhash": self.blurhash,
}
if self.is_image() and self.focal_x and self.focal_y:
ap["type"] = "Image"
ap["focalPoint"] = [self.focal_x, self.focal_y]
return ap
### Mastodon Client API ###

View file

@@ -27,103 +27,89 @@ class PostInteractionStates(StateGraph):
return [cls.new, cls.fanned_out]
@classmethod
async def handle_new(cls, instance: "PostInteraction"):
def handle_new(cls, instance: "PostInteraction"):
"""
Creates all needed fan-out objects for a new PostInteraction.
"""
interaction = await instance.afetch_full()
# Boost: send a copy to all people who follow this user (limiting
# to just local follows if it's a remote boost)
# Pin: send Add activity to all people who follow this user
if (
interaction.type == interaction.Types.boost
or interaction.type == interaction.Types.pin
):
for target in await interaction.aget_targets():
await FanOut.objects.acreate(
if instance.type == instance.Types.boost or instance.type == instance.Types.pin:
for target in instance.get_targets():
FanOut.objects.create(
type=FanOut.Types.interaction,
identity=target,
subject_post=interaction.post,
subject_post_interaction=interaction,
subject_post=instance.post,
subject_post_interaction=instance,
)
# Like: send a copy to the original post author only,
# if the liker is local or the author is
elif interaction.type == interaction.Types.like:
if interaction.identity.local or interaction.post.local:
await FanOut.objects.acreate(
elif instance.type == instance.Types.like:
if instance.identity.local or instance.post.local:
FanOut.objects.create(
type=FanOut.Types.interaction,
identity_id=interaction.post.author_id,
subject_post=interaction.post,
subject_post_interaction=interaction,
identity_id=instance.post.author_id,
subject_post=instance.post,
subject_post_interaction=instance,
)
# Vote: send a copy of the vote to the original
# post author only if it's a local interaction
# to a non local post
elif interaction.type == interaction.Types.vote:
if interaction.identity.local and not interaction.post.local:
await FanOut.objects.acreate(
elif instance.type == instance.Types.vote:
if instance.identity.local and not instance.post.local:
FanOut.objects.create(
type=FanOut.Types.interaction,
identity_id=interaction.post.author_id,
subject_post=interaction.post,
subject_post_interaction=interaction,
identity_id=instance.post.author_id,
subject_post=instance.post,
subject_post_interaction=instance,
)
else:
raise ValueError("Cannot fan out unknown type")
# And one for themselves if they're local and it's a boost
if (
interaction.type == PostInteraction.Types.boost
and interaction.identity.local
):
await FanOut.objects.acreate(
identity_id=interaction.identity_id,
if instance.type == PostInteraction.Types.boost and instance.identity.local:
FanOut.objects.create(
identity_id=instance.identity_id,
type=FanOut.Types.interaction,
subject_post=interaction.post,
subject_post_interaction=interaction,
subject_post=instance.post,
subject_post_interaction=instance,
)
return cls.fanned_out
@classmethod
async def handle_undone(cls, instance: "PostInteraction"):
def handle_undone(cls, instance: "PostInteraction"):
"""
Creates all needed fan-out objects to undo a PostInteraction.
"""
interaction = await instance.afetch_full()
# Undo Boost: send a copy to all people who follow this user
# Undo Pin: send a Remove activity to all people who follow this user
if (
interaction.type == interaction.Types.boost
or interaction.type == interaction.Types.pin
):
async for follow in interaction.identity.inbound_follows.select_related(
if instance.type == instance.Types.boost or instance.type == instance.Types.pin:
for follow in instance.identity.inbound_follows.select_related(
"source", "target"
):
if follow.source.local or follow.target.local:
await FanOut.objects.acreate(
FanOut.objects.create(
type=FanOut.Types.undo_interaction,
identity_id=follow.source_id,
subject_post=interaction.post,
subject_post_interaction=interaction,
subject_post=instance.post,
subject_post_interaction=instance,
)
# Undo Like: send a copy to the original post author only
elif interaction.type == interaction.Types.like:
await FanOut.objects.acreate(
elif instance.type == instance.Types.like:
FanOut.objects.create(
type=FanOut.Types.undo_interaction,
identity_id=interaction.post.author_id,
subject_post=interaction.post,
subject_post_interaction=interaction,
identity_id=instance.post.author_id,
subject_post=instance.post,
subject_post_interaction=instance,
)
else:
raise ValueError("Cannot fan out unknown type")
# And one for themselves if they're local and it's a boost
if (
interaction.type == PostInteraction.Types.boost
and interaction.identity.local
):
await FanOut.objects.acreate(
identity_id=interaction.identity_id,
if instance.type == PostInteraction.Types.boost and instance.identity.local:
FanOut.objects.create(
identity_id=instance.identity_id,
type=FanOut.Types.undo_interaction,
subject_post=interaction.post,
subject_post_interaction=interaction,
subject_post=instance.post,
subject_post_interaction=instance,
)
return cls.undone_fanned_out
@ -179,9 +165,7 @@ class PostInteraction(StatorModel):
updated = models.DateTimeField(auto_now=True)
class Meta:
indexes = [
models.Index(fields=["type", "identity", "post"])
] + StatorModel.Meta.indexes
indexes = [models.Index(fields=["type", "identity", "post"])]
### Display helpers ###
@ -214,17 +198,7 @@ class PostInteraction(StatorModel):
[e.subject_post for e in events if e.subject_post], identity
)
### Async helpers ###
async def afetch_full(self):
"""
Returns a version of the object with all relations pre-loaded
"""
return await PostInteraction.objects.select_related(
"identity", "post", "post__author"
).aget(pk=self.pk)
async def aget_targets(self) -> Iterable[Identity]:
def get_targets(self) -> Iterable[Identity]:
"""
Returns an iterable with Identities of followers that have unique
shared_inbox among each other to be used as target.
@ -239,13 +213,15 @@ class PostInteraction(StatorModel):
# Include all followers that are following the boosts
if self.type == self.Types.boost:
query = query.filter(boosts=True)
async for follow in query.select_related("source"):
for follow in query.select_related("source"):
targets.add(follow.source)
# Fetch the full blocks and remove them as targets
async for block in self.identity.outbound_blocks.active().filter(
mute=False
).select_related("target"):
for block in (
self.identity.outbound_blocks.active()
.filter(mute=False)
.select_related("target")
):
try:
targets.remove(block.target)
except KeyError:
@ -473,8 +449,9 @@ class PostInteraction(StatorModel):
# TODO: Limited retry state?
return
interaction.post.calculate_stats()
interaction.post.calculate_type_data()
if interaction and interaction.post:
interaction.post.calculate_stats()
interaction.post.calculate_type_data()
@classmethod
def handle_undo_ap(cls, data):


@ -16,6 +16,7 @@ class TimelineEvent(models.Model):
mentioned = "mentioned"
liked = "liked" # Someone liking one of our posts
followed = "followed"
follow_requested = "follow_requested"
boosted = "boosted" # Someone boosting one of our posts
announcement = "announcement" # Server announcement
identity_created = "identity_created" # New identity created
@ -55,6 +56,7 @@ class TimelineEvent(models.Model):
published = models.DateTimeField(default=timezone.now)
seen = models.BooleanField(default=False)
dismissed = models.BooleanField(default=False)
created = models.DateTimeField(auto_now_add=True)
@ -73,14 +75,30 @@ class TimelineEvent(models.Model):
@classmethod
def add_follow(cls, identity, source_identity):
"""
Adds a follow to the timeline if it's not there already
Adds a follow to the timeline if it's not there already, and removes any pending follow request
"""
cls.objects.filter(
type=cls.Types.follow_requested,
identity=identity,
subject_identity=source_identity,
).delete()
return cls.objects.get_or_create(
identity=identity,
type=cls.Types.followed,
subject_identity=source_identity,
)[0]
@classmethod
def add_follow_request(cls, identity, source_identity):
"""
Adds a follow request to the timeline if it's not there already
"""
return cls.objects.get_or_create(
identity=identity,
type=cls.Types.follow_requested,
subject_identity=source_identity,
)[0]
@classmethod
def add_post(cls, identity, post):
"""
@ -168,6 +186,14 @@ class TimelineEvent(models.Model):
subject_identity_id=interaction.identity_id,
).delete()
@classmethod
def delete_follow(cls, target, source):
TimelineEvent.objects.filter(
type__in=[cls.Types.followed, cls.Types.follow_requested],
identity=target,
subject_identity=source,
).delete()
### Background tasks ###
@classmethod
@ -217,6 +243,8 @@ class TimelineEvent(models.Model):
)
elif self.type == self.Types.followed:
result["type"] = "follow"
elif self.type == self.Types.follow_requested:
result["type"] = "follow_request"
elif self.type == self.Types.identity_created:
result["type"] = "admin.sign_up"
else:


@ -1,3 +1,5 @@
import logging
from activities.models import (
Post,
PostInteraction,
@ -5,9 +7,10 @@ from activities.models import (
PostStates,
TimelineEvent,
)
from core.exceptions import capture_message
from users.models import Identity
logger = logging.getLogger(__name__)
class PostService:
"""
@ -98,7 +101,7 @@ class PostService:
try:
Post.ensure_object_uri(object_uri, reason=reason)
except ValueError:
capture_message(
logger.error(
f"Cannot fetch ancestor Post={self.post.pk}, ancestor_uri={object_uri}"
)
break


@ -1,7 +1,7 @@
import httpx
from asgiref.sync import async_to_sync
from activities.models import Hashtag, Post
from core.json import json_from_response
from core.ld import canonicalise
from users.models import Domain, Identity, IdentityStates
from users.models.system_actor import SystemActor
@ -49,7 +49,7 @@ class SearchService:
username, domain_instance or domain, fetch=True
)
if identity and identity.state == IdentityStates.outdated:
async_to_sync(identity.fetch_actor)()
identity.fetch_actor()
except ValueError:
pass
@ -74,7 +74,7 @@ class SearchService:
# Fetch the provided URL as the system actor to retrieve the AP JSON
try:
response = async_to_sync(SystemActor().signed_request)(
response = SystemActor().signed_request(
method="get",
uri=self.query,
)
@ -82,7 +82,12 @@ class SearchService:
return None
if response.status_code >= 400:
return None
document = canonicalise(response.json(), include_security=True)
json_data = json_from_response(response)
if not json_data:
return None
document = canonicalise(json_data, include_security=True)
type = document.get("type", "unknown").lower()
# Is it an identity?
@ -90,7 +95,7 @@ class SearchService:
# Try and retrieve the profile by actor URI
identity = Identity.by_actor_uri(document["id"], create=True)
if identity and identity.state == IdentityStates.outdated:
async_to_sync(identity.fetch_actor)()
identity.fetch_actor()
return identity
# Is it a post?


@ -77,7 +77,7 @@ class TimelineService:
def notifications(self, types: list[str]) -> models.QuerySet[TimelineEvent]:
return (
self.event_queryset()
.filter(identity=self.identity, type__in=types)
.filter(identity=self.identity, type__in=types, dismissed=False)
.order_by("-created")
)


@ -1,7 +1,6 @@
import json
import httpx
from asgiref.sync import async_to_sync
from django import forms
from django.utils.decorators import method_decorator
from django.views.generic import FormView, TemplateView
@ -13,7 +12,6 @@ from users.models import SystemActor
@method_decorator(admin_required, name="dispatch")
class JsonViewer(FormView):
template_name = "activities/debug_json.html"
class form_class(forms.Form):
@ -31,7 +29,7 @@ class JsonViewer(FormView):
context = self.get_context_data(form=form)
try:
response = async_to_sync(SystemActor().signed_request)(
response = SystemActor().signed_request(
method="get",
uri=uri,
)
@ -64,18 +62,15 @@ class JsonViewer(FormView):
class NotFound(TemplateView):
template_name = "404.html"
class ServerError(TemplateView):
template_name = "500.html"
@method_decorator(admin_required, name="dispatch")
class OauthAuthorize(TemplateView):
template_name = "api/oauth_authorize.html"
def get_context_data(self):


@ -6,7 +6,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [


@ -6,7 +6,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("users", "0008_follow_boosts"),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),


@ -0,0 +1,17 @@
# Generated by Django 4.2.1 on 2023-07-15 17:40
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("api", "0002_remove_token_code_token_revoked_alter_token_token_and_more"),
]
operations = [
migrations.AddField(
model_name="token",
name="push_subscription",
field=models.JSONField(blank=True, null=True),
),
]


@ -1,5 +1,21 @@
import urlman
from django.db import models
from pydantic import BaseModel
class PushSubscriptionSchema(BaseModel):
"""
Basic validating schema for push data
"""
class Keys(BaseModel):
p256dh: str
auth: str
endpoint: str
keys: Keys
alerts: dict[str, bool]
policy: str
class Token(models.Model):
@ -38,6 +54,8 @@ class Token(models.Model):
updated = models.DateTimeField(auto_now=True)
revoked = models.DateTimeField(blank=True, null=True)
push_subscription = models.JSONField(blank=True, null=True)
class urls(urlman.Urls):
edit = "/@{self.identity.handle}/settings/tokens/{self.id}/"
@ -49,3 +67,8 @@ class Token(models.Model):
# TODO: Support granular scopes the other way?
scope_prefix = scope.split(":")[0]
return (scope in self.scopes) or (scope_prefix in self.scopes)
def set_push_subscription(self, data: dict):
# Validate schema and assign
self.push_subscription = PushSubscriptionSchema(**data).dict()
self.save()
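As a sketch of how the schema guards assignment (assuming a ``Token`` instance in hand, and pydantic v1 semantics per the ``.dict()`` call above), a malformed payload is rejected before it reaches the database::

    from pydantic import ValidationError

    try:
        token.set_push_subscription(
            {
                "endpoint": "https://push.example/sub/123",  # made-up values
                "keys": {"p256dh": "...", "auth": "..."},
                "alerts": {"mention": True},
                "policy": "all",
            }
        )
    except ValidationError:
        pass  # invalid push data never reaches the JSONField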


@ -1,8 +1,10 @@
from typing import Literal, Optional, Union
from django.conf import settings
from hatchway import Field, Schema
from activities import models as activities_models
from api import models as api_models
from core.html import FediverseHtmlParser
from users import models as users_models
from users.services import IdentityService
@ -15,6 +17,23 @@ class Application(Schema):
client_id: str
client_secret: str
redirect_uri: str = Field(alias="redirect_uris")
vapid_key: str | None
@classmethod
def from_application(cls, application: api_models.Application) -> "Application":
instance = cls.from_orm(application)
instance.vapid_key = settings.SETUP.VAPID_PUBLIC_KEY
return instance
@classmethod
def from_application_no_keys(
cls, application: api_models.Application
) -> "Application":
instance = cls.from_orm(application)
instance.vapid_key = settings.SETUP.VAPID_PUBLIC_KEY
instance.client_id = ""
instance.client_secret = ""
return instance
class CustomEmoji(Schema):
@ -273,8 +292,9 @@ class Notification(Schema):
def from_timeline_event(
cls,
event: activities_models.TimelineEvent,
interactions=None,
) -> "Notification":
return cls(**event.to_mastodon_notification_json())
return cls(**event.to_mastodon_notification_json(interactions=interactions))
class Tag(Schema):
@ -433,3 +453,53 @@ class Preferences(Schema):
"reading:expand:spoilers": identity.config_identity.expand_content_warnings,
}
)
class PushSubscriptionKeys(Schema):
p256dh: str
auth: str
class PushSubscriptionCreation(Schema):
endpoint: str
keys: PushSubscriptionKeys
class PushDataAlerts(Schema):
mention: bool = False
status: bool = False
reblog: bool = False
follow: bool = False
follow_request: bool = False
favourite: bool = False
poll: bool = False
update: bool = False
admin_sign_up: bool = Field(False, alias="admin.sign_up")
admin_report: bool = Field(False, alias="admin.report")
class PushData(Schema):
alerts: PushDataAlerts
policy: Literal["all", "followed", "follower", "none"] = "all"
class PushSubscription(Schema):
id: str
endpoint: str
alerts: PushDataAlerts
policy: str
server_key: str
@classmethod
def from_token(
cls,
token: api_models.Token,
) -> Optional["PushSubscription"]:
value = token.push_subscription
if value:
value["id"] = "1"
value["server_key"] = settings.SETUP.VAPID_PUBLIC_KEY
del value["keys"]
return value
else:
return None


@ -15,8 +15,10 @@ from api.views import (
notifications,
polls,
preferences,
push,
search,
statuses,
suggestions,
tags,
timelines,
trends,
@ -46,6 +48,7 @@ urlpatterns = [
path("v1/announcements/<pk>/dismiss", announcements.announcement_dismiss),
# Apps
path("v1/apps", apps.add_app),
path("v1/apps/verify_credentials", apps.verify_credentials),
# Bookmarks
path("v1/bookmarks", bookmarks.bookmarks),
# Emoji
@ -55,8 +58,11 @@ urlpatterns = [
path("v1/filters", filters.list_filters),
# Follow requests
path("v1/follow_requests", follow_requests.follow_requests),
path("v1/follow_requests/<id>/authorize", follow_requests.accept_follow_request),
path("v1/follow_requests/<id>/reject", follow_requests.reject_follow_request),
# Instance
path("v1/instance", instance.instance_info_v1),
path("v1/instance/activity", instance.activity),
path("v1/instance/peers", instance.peers),
path("v2/instance", instance.instance_info_v2),
# Lists
@ -76,11 +82,24 @@ urlpatterns = [
path("v1/statuses/<id>/source", statuses.status_source),
# Notifications
path("v1/notifications", notifications.notifications),
path("v1/notifications/clear", notifications.dismiss_notifications),
path("v1/notifications/<id>", notifications.get_notification),
path("v1/notifications/<id>/dismiss", notifications.dismiss_notification),
# Polls
path("v1/polls/<id>", polls.get_poll),
path("v1/polls/<id>/votes", polls.vote_poll),
# Preferences
path("v1/preferences", preferences.preferences),
# Push
path(
"v1/push/subscription",
methods(
get=push.get_subscription,
post=push.create_subscription,
put=push.update_subscription,
delete=push.delete_subscription,
),
),
# Search
path("v1/search", search.search),
path("v2/search", search.search),
@ -112,4 +131,6 @@ urlpatterns = [
path("v1/trends/tags", trends.trends_tags),
path("v1/trends/statuses", trends.trends_statuses),
path("v1/trends/links", trends.trends_links),
# Suggestions
path("v2/suggestions", suggestions.suggested_users),
]


@ -29,6 +29,7 @@ def update_credentials(
display_name: QueryOrBody[str | None] = None,
note: QueryOrBody[str | None] = None,
discoverable: QueryOrBody[bool | None] = None,
locked: QueryOrBody[bool | None] = None,
source: QueryOrBody[dict[str, Any] | None] = None,
fields_attributes: QueryOrBody[dict[str, dict[str, str]] | None] = None,
avatar: File | None = None,
@ -42,6 +43,8 @@ def update_credentials(
service.set_summary(note)
if discoverable is not None:
identity.discoverable = discoverable
if locked is not None:
identity.manually_approves_followers = locked
if source:
if "privacy" in source:
privacy_map = {

Wyświetl plik

@ -1,7 +1,8 @@
from hatchway import QueryOrBody, api_view
from .. import schemas
from ..models import Application
from api import schemas
from api.decorators import scope_required
from api.models import Application
@api_view.post
@ -18,4 +19,12 @@ def add_app(
redirect_uris=redirect_uris,
scopes=scopes,
)
return schemas.Application.from_orm(application)
return schemas.Application.from_application(application)
@scope_required("read")
@api_view.get
def verify_credentials(
request,
) -> schemas.Application:
return schemas.Application.from_application_no_keys(request.token.application)


@ -1,8 +1,12 @@
from django.http import HttpRequest
from django.shortcuts import get_object_or_404
from hatchway import api_view
from api import schemas
from api.decorators import scope_required
from api.pagination import MastodonPaginator, PaginatingApiResponse, PaginationResult
from users.models.identity import Identity
from users.services.identity import IdentityService
@scope_required("read:follows")
@ -14,5 +18,43 @@ def follow_requests(
min_id: str | None = None,
limit: int = 40,
) -> list[schemas.Account]:
# We don't implement this yet
return []
service = IdentityService(request.identity)
paginator = MastodonPaginator(max_limit=80)
pager: PaginationResult[Identity] = paginator.paginate(
service.follow_requests(),
min_id=min_id,
max_id=max_id,
since_id=since_id,
limit=limit,
)
return PaginatingApiResponse(
[schemas.Account.from_identity(i) for i in pager.results],
request=request,
include_params=["limit"],
)
@scope_required("write:follows")
@api_view.post
def accept_follow_request(
request: HttpRequest,
id: str | None = None,
) -> schemas.Relationship:
source_identity = get_object_or_404(
Identity.objects.exclude(restriction=Identity.Restriction.blocked), pk=id
)
IdentityService(request.identity).accept_follow_request(source_identity)
return IdentityService(source_identity).mastodon_json_relationship(request.identity)
@scope_required("write:follows")
@api_view.post
def reject_follow_request(
request: HttpRequest,
id: str | None = None,
) -> schemas.Relationship:
source_identity = get_object_or_404(
Identity.objects.exclude(restriction=Identity.Restriction.blocked), pk=id
)
IdentityService(request.identity).reject_follow_request(source_identity)
return IdentityService(source_identity).mastodon_json_relationship(request.identity)


@ -1,4 +1,8 @@
import datetime
from django.conf import settings
from django.core.cache import cache
from django.utils import timezone
from hatchway import api_view
from activities.models import Post
@ -10,6 +14,15 @@ from users.models import Domain, Identity
@api_view.get
def instance_info_v1(request):
# The stats are expensive to calculate, so don't do it very often
stats = cache.get("instance_info_stats")
if stats is None:
stats = {
"user_count": Identity.objects.filter(local=True).count(),
"status_count": Post.objects.filter(local=True).not_hidden().count(),
"domain_count": Domain.objects.count(),
}
cache.set("instance_info_stats", stats, timeout=300)
return {
"uri": request.headers.get("host", settings.SETUP.MAIN_DOMAIN),
"title": Config.system.site_name,
@ -18,11 +31,7 @@ def instance_info_v1(request):
"email": "",
"version": f"takahe/{__version__}",
"urls": {},
"stats": {
"user_count": Identity.objects.filter(local=True).count(),
"status_count": Post.objects.filter(local=True).not_hidden().count(),
"domain_count": Domain.objects.count(),
},
"stats": stats,
"thumbnail": Config.system.site_banner,
"languages": ["en"],
"registrations": (Config.system.signup_allowed),
@ -32,7 +41,7 @@ def instance_info_v1(request):
"accounts": {},
"statuses": {
"max_characters": Config.system.post_length,
"max_media_attachments": 4,
"max_media_attachments": Config.system.max_media_attachments,
"characters_reserved_per_url": 23,
},
"media_attachments": {
@ -93,7 +102,7 @@ def instance_info_v2(request) -> dict:
"accounts": {"max_featured_tags": 0},
"statuses": {
"max_characters": Config.system.post_length,
"max_media_attachments": 4,
"max_media_attachments": Config.system.max_media_attachments,
"characters_reserved_per_url": 23,
},
"media_attachments": {
@ -139,3 +148,37 @@ def peers(request) -> list[str]:
"domain", flat=True
)
)
@api_view.get
def activity(request) -> list:
"""
Weekly activity endpoint
"""
# The stats are expensive to calculate, so don't do it very often
stats = cache.get("instance_activity_stats")
if stats is None:
stats = []
# Work out our most recent week start
now = timezone.now()
week_start = now.replace(
hour=0, minute=0, second=0, microsecond=0
) - datetime.timedelta(now.weekday())
for i in range(12):
week_end = week_start + datetime.timedelta(days=7)
stats.append(
{
"week": int(week_start.timestamp()),
"statuses": Post.objects.filter(
local=True, created__gte=week_start, created__lt=week_end
).count(),
# TODO: Populate when we have identity activity tracking
"logins": 0,
"registrations": Identity.objects.filter(
local=True, created__gte=week_start, created__lt=week_end
).count(),
}
)
week_start -= datetime.timedelta(days=7)
cache.set("instance_activity_stats", stats, timeout=300)
return stats


@ -1,12 +1,22 @@
from django.http import HttpRequest
from django.shortcuts import get_object_or_404
from hatchway import ApiResponse, api_view
from activities.models import TimelineEvent
from activities.models import PostInteraction, TimelineEvent
from activities.services import TimelineService
from api import schemas
from api.decorators import scope_required
from api.pagination import MastodonPaginator, PaginatingApiResponse, PaginationResult
# Types/exclude_types use weird syntax so we have to handle them manually
NOTIFICATION_TYPES = {
"favourite": TimelineEvent.Types.liked,
"reblog": TimelineEvent.Types.boosted,
"mention": TimelineEvent.Types.mentioned,
"follow": TimelineEvent.Types.followed,
"admin.sign_up": TimelineEvent.Types.identity_created,
}
@scope_required("read:notifications")
@api_view.get
@ -18,22 +28,14 @@ def notifications(
limit: int = 20,
account_id: str | None = None,
) -> ApiResponse[list[schemas.Notification]]:
# Types/exclude_types use weird syntax so we have to handle them manually
base_types = {
"favourite": TimelineEvent.Types.liked,
"reblog": TimelineEvent.Types.boosted,
"mention": TimelineEvent.Types.mentioned,
"follow": TimelineEvent.Types.followed,
"admin.sign_up": TimelineEvent.Types.identity_created,
}
requested_types = set(request.GET.getlist("types[]"))
excluded_types = set(request.GET.getlist("exclude_types[]"))
if not requested_types:
requested_types = set(base_types.keys())
requested_types = set(NOTIFICATION_TYPES.keys())
requested_types.difference_update(excluded_types)
# Use that to pull relevant events
queryset = TimelineService(request.identity).notifications(
[base_types[r] for r in requested_types if r in base_types]
[NOTIFICATION_TYPES[r] for r in requested_types if r in NOTIFICATION_TYPES]
)
paginator = MastodonPaginator()
pager: PaginationResult[TimelineEvent] = paginator.paginate(
@ -43,8 +45,56 @@ def notifications(
since_id=since_id,
limit=limit,
)
interactions = PostInteraction.get_event_interactions(
pager.results,
request.identity,
)
return PaginatingApiResponse(
[schemas.Notification.from_timeline_event(event) for event in pager.results],
[
schemas.Notification.from_timeline_event(event, interactions=interactions)
for event in pager.results
],
request=request,
include_params=["limit", "account_id"],
)
@scope_required("read:notifications")
@api_view.get
def get_notification(
request: HttpRequest,
id: str,
) -> schemas.Notification:
notification = get_object_or_404(
TimelineService(request.identity).notifications(
list(NOTIFICATION_TYPES.values())
),
id=id,
)
return schemas.Notification.from_timeline_event(notification)
@scope_required("write:notifications")
@api_view.post
def dismiss_notifications(request: HttpRequest) -> dict:
TimelineService(request.identity).notifications(
list(NOTIFICATION_TYPES.values())
).update(dismissed=True)
return {}
@scope_required("write:notifications")
@api_view.post
def dismiss_notification(request: HttpRequest, id: str) -> dict:
notification = get_object_or_404(
TimelineService(request.identity).notifications(
list(NOTIFICATION_TYPES.values())
),
id=id,
)
notification.dismissed = True
notification.save()
return {}


@ -73,6 +73,7 @@ class AuthorizationView(LoginRequiredMixin, View):
request,
"api/oauth_error.html",
{"error": f"Invalid response type '{response_type}'"},
status=400,
)
application = Application.objects.filter(
@ -81,7 +82,10 @@ class AuthorizationView(LoginRequiredMixin, View):
if application is None:
return render(
request, "api/oauth_error.html", {"error": "Invalid client_id"}
request,
"api/oauth_error.html",
{"error": "Invalid client_id"},
status=400,
)
if application.redirect_uris and redirect_uri not in application.redirect_uris:
@ -89,6 +93,7 @@ class AuthorizationView(LoginRequiredMixin, View):
request,
"api/oauth_error.html",
{"error": "Invalid application redirect URI"},
status=401,
)
context = {

api/views/push.py 100644

@ -0,0 +1,70 @@
from django.conf import settings
from django.http import Http404
from hatchway import ApiError, QueryOrBody, api_view
from api import schemas
from api.decorators import scope_required
@scope_required("push")
@api_view.post
def create_subscription(
request,
subscription: QueryOrBody[schemas.PushSubscriptionCreation],
data: QueryOrBody[schemas.PushData],
) -> schemas.PushSubscription:
# First, check the server is set up to do push notifications
if not settings.SETUP.VAPID_PRIVATE_KEY:
raise Http404("Push not available")
# Then, register this with our token
request.token.set_push_subscription(
{
"endpoint": subscription.endpoint,
"keys": subscription.keys,
"alerts": data.alerts,
"policy": data.policy,
}
)
# Then return the subscription
return schemas.PushSubscription.from_token(request.token) # type:ignore
@scope_required("push")
@api_view.get
def get_subscription(request) -> schemas.PushSubscription:
# First, check the server is set up to do push notifications
if not settings.SETUP.VAPID_PRIVATE_KEY:
raise Http404("Push not available")
# Get the subscription if it exists
subscription = schemas.PushSubscription.from_token(request.token)
if not subscription:
raise ApiError(404, "Not Found")
return subscription
@scope_required("push")
@api_view.put
def update_subscription(
request, data: QueryOrBody[schemas.PushData]
) -> schemas.PushSubscription:
# First, check the server is set up to do push notifications
if not settings.SETUP.VAPID_PRIVATE_KEY:
raise Http404("Push not available")
# Get the subscription if it exists
subscription = schemas.PushSubscription.from_token(request.token)
if not subscription:
raise ApiError(404, "Not Found")
# Update the subscription
subscription.alerts = data.alerts
subscription.policy = data.policy
request.token.set_push_subscription(subscription)
# Then return the subscription
return schemas.PushSubscription.from_token(request.token) # type:ignore
@scope_required("push")
@api_view.delete
def delete_subscription(request) -> dict:
# Unset the subscription
request.token.push_subscription = None
return {}
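A hedged sketch of calling the new endpoint from a client, with the JSON body shaped after the ``PushSubscriptionCreation`` and ``PushData`` schemas above (server, token and key material all made up)::

    import httpx

    httpx.post(
        "https://example.com/api/v1/push/subscription",
        headers={"Authorization": "Bearer ACCESS_TOKEN"},
        json={
            "subscription": {
                "endpoint": "https://push.example/sub/123",
                "keys": {"p256dh": "...", "auth": "..."},
            },
            "data": {"alerts": {"mention": True}, "policy": "all"},
        },
    )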


@ -13,7 +13,7 @@ from api.decorators import scope_required
def search(
request,
q: str,
type: Literal["accounts", "hashtags", "statuses"] | None = None,
type: Literal["accounts", "hashtags", "statuses", ""] | None = None,
fetch_identities: bool = Field(False, alias="resolve"),
following: bool = False,
exclude_unreviewed: bool = False,
@ -33,6 +33,8 @@ def search(
# Run search
searcher = SearchService(q, request.identity)
search_result = searcher.search_all()
if type == "":
type = None
if type is None or type == "accounts":
result["accounts"] = [
schemas.Account.from_identity(i, include_counts=False)


@ -39,7 +39,7 @@ class PostPollSchema(Schema):
class PostStatusSchema(Schema):
status: str
status: str | None
in_reply_to_id: str | None = None
sensitive: bool = False
spoiler_text: str | None = None
@ -82,9 +82,9 @@ def post_for_id(request: HttpRequest, id: str) -> Post:
@api_view.post
def post_status(request, details: PostStatusSchema) -> schemas.Status:
# Check text length
if len(details.status) > Config.system.post_length:
if details.status and len(details.status) > Config.system.post_length:
raise ApiError(400, "Status is too long")
if len(details.status) == 0 and not details.media_ids:
if not details.status and not details.media_ids:
raise ApiError(400, "Status is empty")
# Grab attachments
attachments = [get_object_or_404(PostAttachment, pk=id) for id in details.media_ids]
@ -103,7 +103,7 @@ def post_status(request, details: PostStatusSchema) -> schemas.Status:
pass
post = Post.create_local(
author=request.identity,
content=details.status,
content=details.status or "",
summary=details.spoiler_text,
sensitive=details.sensitive,
visibility=visibility_map[details.visibility],


@ -0,0 +1,16 @@
from django.http import HttpRequest
from hatchway import api_view
from api import schemas
from api.decorators import scope_required
@scope_required("read")
@api_view.get
def suggested_users(
request: HttpRequest,
limit: int = 10,
offset: int | None = None,
) -> list[schemas.Account]:
# We don't implement this yet
return []


@ -1,9 +1,12 @@
from django.conf import settings
from core.models import Config
def config_context(request):
return {
"config": Config.system,
"allow_migration": settings.SETUP.ALLOW_USER_MIGRATION,
"top_section": request.path.strip("/").split("/")[0],
"opengraph_defaults": {
"og:site_name": Config.system.site_name,


@ -1,45 +1,16 @@
import traceback
from asgiref.sync import sync_to_async
from django.conf import settings
class ActivityPubError(BaseException):
"""
A problem with an ActivityPub message
"""
class ActivityPubFormatError(ActivityPubError):
"""
A problem with an ActivityPub message's format/keys
"""
class ActorMismatchError(ActivityPubError):
"""
The actor is not authorised to do the action we saw
"""
def capture_message(message: str, level: str | None = None, scope=None, **scope_args):
"""
Sends the informational message to Sentry if it's configured
"""
if settings.SETUP.SENTRY_DSN and settings.SETUP.SENTRY_CAPTURE_MESSAGES:
from sentry_sdk import capture_message
capture_message(message, level, scope, **scope_args)
elif settings.DEBUG:
if scope or scope_args:
message += f"; {scope=}, {scope_args=}"
print(message)
def capture_exception(exception: BaseException, scope=None, **scope_args):
"""
Sends the exception to Sentry if it's configured
"""
if settings.SETUP.SENTRY_DSN:
from sentry_sdk import capture_exception
capture_exception(exception, scope, **scope_args)
elif settings.DEBUG:
traceback.print_exc()
acapture_exception = sync_to_async(capture_exception, thread_sensitive=False)


@ -57,7 +57,7 @@ def blurhash_image(file) -> str:
return blurhash.encode(file, 4, 4)
async def get_remote_file(
def get_remote_file(
url: str,
*,
timeout: float = settings.SETUP.REMOTE_TIMEOUT,
@ -70,8 +70,8 @@ async def get_remote_file(
"User-Agent": settings.TAKAHE_USER_AGENT,
}
async with httpx.AsyncClient(headers=headers) as client:
async with client.stream(
with httpx.Client(headers=headers) as client:
with client.stream(
"GET", url, timeout=timeout, follow_redirects=True
) as stream:
allow_download = max_size is None
@ -82,7 +82,7 @@ async def get_remote_file(
except (KeyError, TypeError):
pass
if allow_download:
file = ContentFile(await stream.aread(), name=url)
file = ContentFile(stream.read(), name=url)
return file, stream.headers.get(
"content-type", "application/octet-stream"
)


@ -38,7 +38,7 @@ class FediverseHtmlParser(HTMLParser):
r"(^|[^\w\d\-_/])@([\w\d\-_]+(?:@[\w\d\-_\.]+[\w\d\-_]+)?)"
)
HASHTAG_REGEX = re.compile(r"\B#([a-zA-Z0-9(_)]+\b)(?!;)")
HASHTAG_REGEX = re.compile(r"\B#([\w()]+\b)(?!;)")
EMOJI_REGEX = re.compile(r"\B:([a-zA-Z0-9(_)-]+):\B")
@ -91,6 +91,8 @@ class FediverseHtmlParser(HTMLParser):
for mention in mentions or []:
if self.uri_domain:
url = mention.absolute_profile_uri()
elif not mention.local:
url = mention.profile_uri
else:
url = str(mention.urls.view)
if mention.username:

core/json.py 100644

@ -0,0 +1,32 @@
import json
from httpx import Response
JSON_CONTENT_TYPES = [
"application/json",
"application/ld+json",
"application/activity+json",
]
def json_from_response(response: Response) -> dict | None:
content_type, *parameters = (
response.headers.get("Content-Type", "invalid").lower().split(";")
)
if content_type not in JSON_CONTENT_TYPES:
return None
charset = None
for parameter in parameters:
key, value = parameter.split("=")
if key.strip() == "charset":
charset = value.strip()
if charset:
return json.loads(response.content.decode(charset))
else:
# if no charset is given, fall back to httpx's
# JSON parsing and its encoding inference
return response.json()
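A small usage sketch (URL hypothetical): the helper returns ``None`` for non-JSON content types rather than raising::

    import httpx

    from core.json import json_from_response

    response = httpx.get(
        "https://example.com/actor",
        headers={"Accept": "application/activity+json"},
    )
    data = json_from_response(response)
    if data is None:
        pass  # Content-Type wasn't a JSON variant; skip canonicalisation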


@ -1,12 +1,24 @@
import datetime
import logging
import os
import urllib.parse as urllib_parse
from dateutil import parser
from pyld import jsonld
from pyld.jsonld import JsonLdError
from core.exceptions import ActivityPubFormatError
logger = logging.getLogger(__name__)
schemas = {
"unknown": {
"contentType": "application/ld+json",
"documentUrl": "unknown",
"contextUrl": None,
"document": {
"@context": {},
},
},
"www.w3.org/ns/activitystreams": {
"contentType": "application/ld+json",
"documentUrl": "http://www.w3.org/ns/activitystreams",
@ -456,6 +468,46 @@ schemas = {
}
},
},
"w3id.org/security/multikey/v1": {
"contentType": "application/ld+json",
"documentUrl": "https://w3id.org/security/multikey/v1",
"contextUrl": None,
"document": {
"@context": {
"id": "@id",
"type": "@type",
"@protected": True,
"Multikey": {
"@id": "https://w3id.org/security#Multikey",
"@context": {
"@protected": True,
"id": "@id",
"type": "@type",
"controller": {
"@id": "https://w3id.org/security#controller",
"@type": "@id",
},
"revoked": {
"@id": "https://w3id.org/security#revoked",
"@type": "http://www.w3.org/2001/XMLSchema#dateTime",
},
"expires": {
"@id": "https://w3id.org/security#expiration",
"@type": "http://www.w3.org/2001/XMLSchema#dateTime",
},
"publicKeyMultibase": {
"@id": "https://w3id.org/security#publicKeyMultibase",
"@type": "https://w3id.org/security#multibase",
},
"secretKeyMultibase": {
"@id": "https://w3id.org/security#secretKeyMultibase",
"@type": "https://w3id.org/security#multibase",
},
},
},
},
},
},
"*/schemas/litepub-0.1.jsonld": {
"contentType": "application/ld+json",
"documentUrl": "http://w3id.org/security/v1",
@ -559,6 +611,16 @@ schemas = {
},
},
},
"purl.org/wytchspace/ns/ap/1.0": {
"contentType": "application/ld+json",
"documentUrl": "https://purl.org/wytchspace/ns/ap/1.0",
"contextUrl": None,
"document": {
"@context": {
"wytch": "https://ns.wytch.space/ap/1.0.jsonld",
},
},
},
}
DATETIME_FORMAT = "%Y-%m-%dT%H:%M:%S.Z"
@ -570,12 +632,8 @@ def builtin_document_loader(url: str, options={}):
# Get URL without scheme
pieces = urllib_parse.urlparse(url)
if pieces.hostname is None:
raise JsonLdError(
f"No schema built-in for {url!r}",
"jsonld.LoadDocumentError",
code="loading document failed",
cause="NoHostnameError",
)
logger.info(f"No host name for json-ld schema: {url!r}")
return schemas["unknown"]
key = pieces.hostname + pieces.path.rstrip("/")
try:
return schemas[key]
@ -584,12 +642,9 @@ def builtin_document_loader(url: str, options={}):
key = "*" + pieces.path.rstrip("/")
return schemas[key]
except KeyError:
raise JsonLdError(
f"No schema built-in for {key!r}",
"jsonld.LoadDocumentError",
code="loading document failed",
cause="KeyError",
)
# return an empty context instead of throwing an error
logger.info(f"Ignoring unknown json-ld schema: {url!r}")
return schemas["unknown"]
def canonicalise(json_data: dict, include_security: bool = False) -> dict:
@ -695,7 +750,7 @@ def get_value_or_map(data, key, map_key):
if "und" in map_key:
return data[map_key]["und"]
return list(data[map_key].values())[0]
raise KeyError(f"Cannot find {key} or {map_key}")
raise ActivityPubFormatError(f"Cannot find {key} or {map_key}")
def media_type_from_filename(filename):


@ -10,7 +10,6 @@ import core.uploads
class Migration(migrations.Migration):
initial = True
dependencies = [


@ -6,7 +6,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("users", "0016_hashtagfollow"),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),


@ -2,7 +2,6 @@ from functools import partial
from typing import ClassVar
import pydantic
from asgiref.sync import sync_to_async
from django.core.files import File
from django.db import models
from django.utils.functional import lazy
@ -97,16 +96,6 @@ class Config(models.Model):
{"identity__isnull": True, "user__isnull": True, "domain__isnull": True},
)
@classmethod
async def aload_system(cls):
"""
Async loads the system config options object
"""
return await sync_to_async(cls.load_values)(
cls.SystemOptions,
{"identity__isnull": True, "user__isnull": True, "domain__isnull": True},
)
@classmethod
def load_user(cls, user):
"""
@ -117,16 +106,6 @@ class Config(models.Model):
{"identity__isnull": True, "user": user, "domain__isnull": True},
)
@classmethod
async def aload_user(cls, user):
"""
Async loads the user config options object
"""
return await sync_to_async(cls.load_values)(
cls.UserOptions,
{"identity__isnull": True, "user": user, "domain__isnull": True},
)
@classmethod
def load_identity(cls, identity):
"""
@ -137,16 +116,6 @@ class Config(models.Model):
{"identity": identity, "user__isnull": True, "domain__isnull": True},
)
@classmethod
async def aload_identity(cls, identity):
"""
Async loads an identity config options object
"""
return await sync_to_async(cls.load_values)(
cls.IdentityOptions,
{"identity": identity, "user__isnull": True, "domain__isnull": True},
)
@classmethod
def load_domain(cls, domain):
"""
@ -157,16 +126,6 @@ class Config(models.Model):
{"domain": domain, "user__isnull": True, "identity__isnull": True},
)
@classmethod
async def aload_domain(cls, domain):
"""
Async loads an domain config options object
"""
return await sync_to_async(cls.load_values)(
cls.DomainOptions,
{"domain": domain, "user__isnull": True, "identity__isnull": True},
)
@classmethod
def set_value(cls, key, value, options_class, filters):
config_field = options_class.__fields__[key]
@ -255,6 +214,7 @@ class Config(models.Model):
content_warning_text: str = "Content Warning"
post_length: int = 500
max_media_attachments: int = 4
post_minimum_interval: int = 3 # seconds
identity_min_length: int = 2
identity_max_per_user: int = 5


@ -27,12 +27,14 @@ if SENTRY_ENABLED:
set_context = sentry_sdk.set_context
set_tag = sentry_sdk.set_tag
start_transaction = sentry_sdk.start_transaction
start_span = sentry_sdk.start_span
else:
configure_scope = noop_context
push_scope = noop_context
set_context = noop
set_tag = noop
start_transaction = noop_context
start_span = noop_context
def set_takahe_app(name: str):


@ -1,5 +1,7 @@
import base64
import json
import logging
from ssl import SSLCertVerificationError, SSLError
from typing import Literal, TypedDict, cast
from urllib.parse import urlparse
@ -17,6 +19,8 @@ from pyld import jsonld
from core.ld import format_ld_date
logger = logging.getLogger(__name__)
class VerificationError(BaseException):
"""
@ -102,12 +106,18 @@ class HttpSignature:
name, value = item.split("=", 1)
value = value.strip('"')
bits[name.lower()] = value
signature_details: HttpSignatureDetails = {
"headers": bits["headers"].split(),
"signature": base64.b64decode(bits["signature"]),
"algorithm": bits["algorithm"],
"keyid": bits["keyid"],
}
try:
signature_details: HttpSignatureDetails = {
"headers": bits["headers"].split(),
"signature": base64.b64decode(bits["signature"]),
"algorithm": bits["algorithm"],
"keyid": bits["keyid"],
}
except KeyError as e:
key_names = " ".join(bits.keys())
raise VerificationError(
f"Missing item from details (have: {key_names}, error: {e})"
)
return signature_details
@classmethod
@ -133,7 +143,7 @@ class HttpSignature:
try:
public_key_instance.verify(
signature,
cleartext.encode("ascii"),
cleartext.encode("utf8"),
padding.PKCS1v15(),
hashes.SHA256(),
)
@ -176,13 +186,13 @@ class HttpSignature:
)
@classmethod
async def signed_request(
def signed_request(
cls,
uri: str,
body: dict | None,
private_key: str,
key_id: str,
content_type: str = "application/json",
content_type: str = "application/activity+json",
method: Literal["get", "post"] = "post",
timeout: TimeoutTypes = settings.SETUP.REMOTE_TIMEOUT,
):
@ -209,7 +219,7 @@ class HttpSignature:
body_bytes = b""
# GET requests get implicit accept headers added
if method == "get":
headers["Accept"] = "application/ld+json"
headers["Accept"] = "application/activity+json,application/ld+json"
# Sign the headers
signed_string = "\n".join(
f"{name.lower()}: {value}" for name, value in headers.items()
@ -222,7 +232,7 @@ class HttpSignature:
),
)
signature = private_key_instance.sign(
signed_string.encode("ascii"),
signed_string.encode("utf8"),
padding.PKCS1v15(),
hashes.SHA256(),
)
@ -240,15 +250,19 @@ class HttpSignature:
# Send the request with all those headers except the pseudo one
del headers["(request-target)"]
async with httpx.AsyncClient(timeout=timeout) as client:
with httpx.Client(timeout=timeout) as client:
try:
response = await client.request(
response = client.request(
method,
uri,
headers=headers,
content=body_bytes,
follow_redirects=method == "get",
)
except SSLError as invalid_cert:
# Not our problem if the other end doesn't have proper SSL
logger.info("Invalid cert on %s %s", uri, invalid_cert)
raise SSLCertVerificationError(invalid_cert) from invalid_cert
except InvalidCodepoint as ex:
# Convert to a more generic error we handle
raise httpx.HTTPError(f"InvalidCodepoint: {str(ex)}") from None
@ -283,6 +297,8 @@ class LDSignature:
Verifies a document
"""
try:
# causing side effects to the original document is bad form
document = document.copy()
# Strip out the signature from the incoming document
signature = document.pop("signature")
# Create the options document
@ -310,7 +326,7 @@ class LDSignature:
hashes.SHA256(),
)
except InvalidSignature:
raise VerificationError("Signature mismatch")
raise VerificationError("LDSignature mismatch")
@classmethod
def create_signature(


@ -15,7 +15,7 @@ x-takahe-common:
TAKAHE_DATABASE_SERVER: "postgres://postgres:insecure_password@db/takahe"
TAKAHE_DEBUG: "true"
TAKAHE_SECRET_KEY: "insecure_secret"
TAKAHE_CSRF_TRUSTED_ORIGINS: '["http://127.0.0.1:8000", "https://127.0.0.1:8000"]'
TAKAHE_CSRF_HOSTS: '["http://127.0.0.1:8000", "https://127.0.0.1:8000"]'
TAKAHE_USE_PROXY_HEADERS: "true"
TAKAHE_EMAIL_BACKEND: "console://console"
TAKAHE_MAIN_DOMAIN: "example.com"
@ -56,10 +56,16 @@ services:
start_period: 15s
ports:
- "8000:8000"
depends_on:
setup:
condition: service_completed_successfully
stator:
<<: *takahe-common
command: ["/takahe/manage.py", "runstator"]
depends_on:
setup:
condition: service_completed_successfully
setup:
<<: *takahe-common


@ -79,7 +79,7 @@ local installation, though.
Direct Installation
~~~~~~~~~~~~~~~~~~~
Takahē requires Python 3.10 or above, so you'll need that first. Clone the repo::
Takahē requires Python 3.11 or above, so you'll need that first. Clone the repo::
git clone https://github.com/jointakahe/takahe/
@ -172,3 +172,37 @@ We use `HTMX <https://htmx.org/>`_ for dynamically loading content, and
`Hyperscript <https://hyperscript.org/>`_ for most interactions rather than raw
JavaScript. If you can accomplish what you need with these tools, please use them
rather than adding JS.
Cutting a release
-----------------
In order to make a release of Takahē, follow these steps:
* Create or update the release document (in ``/docs/releases``) for the
release; major versions get their own document, minor releases get a
subheading in the document for their major release.
* Go through the git commit history since the last release in order to write
a reasonable summary of features.
* Be sure to include the little paragraphs at the end about contributing and
the docker tag, and an Upgrade Notes section that at minimum mentions
migrations and if they're normal or weird (even if there aren't any, it's
nice to call that out).
* If it's a new doc, make sure you include it in ``docs/releases/index.rst``!
* Update the version number in ``/takahe/__init__.py``
* Update the version number in ``README.md``
* Make a commit containing these changes called ``Releasing 1.23.45``.
* Tag that commit with a tag in the format ``1.23.45``.
* Wait for the GitHub Actions to run and publish the docker images (around 20
minutes as the ARM build is a bit slow)
* Post on the official account announcing the release and linking to the
now-published release notes.


@ -119,9 +119,26 @@ be provided to the containers from the first boot.
``["andrew@aeracode.org"]`` (if you're doing this via shell, be careful
about escaping!)
* If you want to support push notifications, set ``TAKAHE_VAPID_PUBLIC_KEY``
and ``TAKAHE_VAPID_PRIVATE_KEY`` to a valid VAPID keypair (note that if you
ever change these, push notifications will stop working). You can generate
a keypair at `<https://web-push-codelab.glitch.me/>`_.
There are some other, optional variables you can tweak once the
system is up and working - see :doc:`tuning` for more.
If you are behind a caching proxy, such as Cloudflare, you may need to update
your CSRF host settings to match. Takahē validates that requests have an
Origin header that matches their Referer header by default, and these services
can break that relationship.
Takahē lets you set this up via the ``TAKAHE_CSRF_HOSTS`` environment
variable, which takes a Python-list-formatted list of additional
protocols/domains to allow, with wildcards. It feeds directly into Django's
`CSRF_TRUSTED_ORIGINS <https://docs.djangoproject.com/en/4.2/ref/settings/#csrf-trusted-origins>`_
setting, so see the Django documentation for the details of how it is
interpreted. Generally, you'd want to set it to your website's public address;
for our server that would be ``TAKAHE_CSRF_HOSTS='["https://takahe.social"]'``.
.. _media_configuration:
@ -150,6 +167,11 @@ If you omit the keys or the endpoint URL, then Takahē will try to use implicit
authentication for them. The keys, if included, should be urlencoded, as AWS
secret keys commonly contain eg + characters.
With the above examples, Takahē connects to an S3 bucket using **HTTPS**. If
you wish to connect to an S3 bucket using **HTTP** (for example, to connect to
an S3 API endpoint on a private network), replace ``s3`` in the examples above
with ``s3-insecure``.
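For instance, a sketch of such a URI, shaped after the ``s3://`` examples this
section refers to (endpoint, credentials and bucket name all made up)::

    TAKAHE_MEDIA_BACKEND="s3-insecure://access-key:secret-key@minio.internal:9000/takahe-media"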
Your S3 bucket *must* be set to allow publicly-readable files, as Takahē will
set all files it uploads to be ``public-read``. We randomise uploaded file
names to prevent enumeration attacks.
@ -206,6 +228,9 @@ with the password ``my:password``, it would be represented as::
smtp://someone%40example.com:my%3Apassword@smtp.example.com:25/
The username and password can be omitted, with a URL in the form
``smtp://host:port/``, if your mail server is a (properly firewalled!)
unauthenticated relay.
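For example, pointing at a hypothetical internal relay (hostname made up)::

    TAKAHE_EMAIL_BACKEND="smtp://smtp.internal.example.com:25/"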
SendGrid
########


@ -15,14 +15,15 @@ Client Apps
These apps are known to fully work as far as Takahē's
:doc:`own featureset <features>` allows:
* Tusky
* Elk
* Pinafore
* `Tusky <https://tusky.app/>`_
* `Elk <https://elk.zone/>`_
* `Pinafore <https://pinafore.social/>`_
* `Tuba <https://tuba.geopjr.dev/>`_
These apps have had initial testing and work at a basic level:
* Ivory
* `Ivory <https://tapbots.com/ivory/>`_
* `Phanpy <https://phanpy.social/>`_
Fediverse Servers


@ -0,0 +1,98 @@
0.10
====
*0.10.0 Released: 2023/11/12*
*0.10.1 Released: 2023/11/13*
This release is a polish release that mostly focuses on performance, stability
and federation compatibility.
This release's major changes:
* Stator, the background task system, has been significantly reworked to require
smaller indexes, spend less time scheduling, and has had most of its async
nature removed, as this both reduces deadlocks and improves performance in
most situations (the context switching was costing more than the gains from
talking to other servers asynchronously).
Minor changes also include:
* Followers-only mode now works correctly inbound and outbound (though outbound
may need the other server to refresh the profile first).
* Profile pages are no longer shown for remote identities; instead, users are
linked or redirected directly to the remote profile page.
* Inbound migration has been implemented, but is disabled by default as outbound
migration is not yet complete, and we don't want to release a system that
captures users with no outward path. If you *really* want to enable it, set
``TAKAHE_ALLOW_USER_MIGRATION=true`` in your environment.
* Federation compatibility has been improved with several other servers.
* Blocked domains now receive absolutely zero fetches from Takahē; previously,
they were still pinged occasionally to see if they were online.
* SMTP servers that don't require authentication are now supported.
* Python 3.11 is now the minimum version required; this will not affect you at
all if you run Takahē via our docker image, as is recommended.
An automatic remote post pruning system, to shrink the database of old data
that was no longer needed, was in the development version but has been switched
to a set of manual commands as of 0.10.1 - you can read more below or in
:doc:`/tuning`.
If you'd like to help with code, design, or other areas, see
:doc:`/contributing` to see how to get in touch.
You can download images from `Docker Hub <https://hub.docker.com/r/jointakahe/takahe>`_,
or use the image name ``jointakahe/takahe:0.10``.
0.10.1
------
*Released: 2023/11/13*
This is a bugfix and small feature addition release:
* The ``runstator`` command now logs its output to the terminal again
* Two new commands, ``pruneposts`` and ``pruneidentities`` are added, to enable
pruning (deletion of old content) of Posts and Identities respectively.
You can read more about them in :doc:`/tuning`.
* Stator's default concurrency levels have been significantly reduced as it's
now way more efficient at using individual database connections, but as a
result it places way more load on them. You can read more about tuning this
in :doc:`/tuning`.
Upgrade Notes
-------------
Migrations
~~~~~~~~~~
There are new database migrations; they are backwards-compatible, but contain
very significant index changes to all of the main tables that may cause the
PostgreSQL deadlock detector to trigger if you attempt to apply them while your
site is live.
We recommend:
* Temporarily stopping all instances of the webserver and Stator
* Applying the migration (should be less than a few minutes on most installs)
* Restarting the instances of webserver and Stator
Stator
~~~~~~
Stator's new internal architecture allocates a worker thread and a database
connection for each unit of its concurrency value; this means it is a *lot* more efficient
for a given "concurrency" number than the old system and also uses a lot more
database connections. We recommend you reduce your configuration values for
these by 5-10x; if you didn't set them manually, then don't worry, we've
reduced the default values by a similar amount.


@ -0,0 +1,54 @@
0.11
====
*Released: 2024-02-05*
This is largely a bugfix and catch up release.
Some highlights:
* Python 3.10 has been dropped. The new minimum Python version is 3.11
* Jamie (`@astraluma@tacobelllabs.net <https://tacobelllabs.net/@astraluma>`_)
has officially joined the project
* If your S3 does not use TLS, you must use ``s3-insecure`` in your
configuration
* Takahē now supports unicode hashtags
* Add a Maximum Media Attachments setting
* Inverted the pruning command exit codes
* Posts are no longer required to have text content
And some interoperability bug fixes:
* Fixed a bug with GoToSocial
* Attempted to fix follows from the Misskey family
* Correctly handle when a federated report doesn't have content
In addition, there are many bugfixes and minor changes, including:
* Several JSON handling improvements
* Post pruning now has a random element to it
* More specific loggers
* Don't make local identities stale
* Don't try to unmute when there's no expiration
* Don't try to WebFinger local users
* Synchronize follow accepting and profile fetching
* Perform some basic domain validity checks
* Correctly reject more operations when the identity is deleted
* Post edit fanouts for likers/boosters
If you'd like to help with code, design, or other areas, see
:doc:`/contributing` to see how to get in touch.
You can download images from `Docker Hub <https://hub.docker.com/r/jointakahe/takahe>`_,
or use the image name ``jointakahe/takahe:0.11``.
Upgrade Notes
-------------
Migrations
~~~~~~~~~~
There are new database migrations; they are backwards-compatible and should
not present any major database load.


@ -7,6 +7,8 @@ Versions
.. toctree::
:maxdepth: 1
0.11
0.10
0.9
0.8
0.7


@ -0,0 +1,15 @@
Upgrade Notes
-------------
VAPID keys and Push notifications
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Takahē now supports push notifications if you supply a valid VAPID keypair as
the ``TAKAHE_VAPID_PUBLIC_KEY`` and ``TAKAHE_VAPID_PRIVATE_KEY`` environment
variables. You can generate a keypair via `<https://web-push-codelab.glitch.me/>`_.
Note that users of apps may need to sign out and in again to their accounts for
the app to notice that it can now do push notifications. Some apps, like Elk,
may cache the fact your server didn't support it for a while.


@ -30,7 +30,7 @@ using more resources if you give them to it), you can:
has to send a copy of each of their posts to every follower, separately.
* Takahe is run with Gunicorn which spawns several
[workers](https://docs.gunicorn.org/en/stable/settings.html#workers) to
`workers <https://docs.gunicorn.org/en/stable/settings.html#workers>`_ to
handle requests. Depending on what environment you are running Takahe on,
you might want to customize this via the ``GUNICORN_CMD_ARGS`` environment
variable. For example - ``GUNICORN_CMD_ARGS="--workers 2"`` to set the
@ -56,10 +56,13 @@ Stator (Task Processing)
Takahē's background task processing system is called Stator, and it uses
asynchronous Python to pack loads of tasks at once into a single process.
By default, it will try to run up to 100 tasks at once, with a maximum of 40
from any single model (FanOut will usually be the one it's doing most of).
You can tweak these with the ``TAKAHE_STATOR_CONCURRENCY`` and
``TAKAHE_STATOR_CONCURRENCY_PER_MODEL`` environment variables.
By default, it will try to run up to 20 tasks at once, with a maximum of 4 from
any single model (FanOut will usually be the one it's doing most of). You can
tweak these with the ``TAKAHE_STATOR_CONCURRENCY`` and
``TAKAHE_STATOR_CONCURRENCY_PER_MODEL`` environment variables; for every extra
element of concurrency you add, however, it will use an additional database
connection in a new worker thread. Be wary of hitting your database's
connection limits.
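As a rough worked example (container count hypothetical): three Stator
containers, each at the default ``TAKAHE_STATOR_CONCURRENCY=20``, can hold
around 3 × 20 = 60 database connections at peak, before you count your
webserver workers.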
The only real limits Stator can hit are CPU and memory usage; if you see your
Stator (worker) containers not using anywhere near all of their CPU or memory,
@ -88,6 +91,50 @@ servers' timeouts make the connection fail) for more than about a week, some
servers may consider it permanently unreachable and stop sending posts.
Pruning
-------
Over time, the amount of Fediverse content your server consumes will grow -
you'll see every reply to every post from every user you follow, and fetch
every identity of every author of those replies.
Obviously, you don't need all of this past a certain date, as it's unlikely
you'll want to go back to view what the timeline would have looked like months
ago. If you want to remove this data, you can run the two "pruning" commands::
./manage.py pruneposts
./manage.py pruneidentities
Each operates in batches, and takes an optional ``--number=1000`` argument
to specify the batch size. The ``TAKAHE_REMOTE_PRUNE_HORIZON`` environment
variable specifies the number of days of history you want to keep intact before
the pruning happens - this defaults to 3 months.
Post pruning removes any post that isn't:
* Written by a local identity
* Newer than ``TAKAHE_REMOTE_PRUNE_HORIZON`` days old
* Favourited, bookmarked or boosted by a local identity
* Replied to by a local identity
* A reply to a local identity's post
Identity pruning removes any identity that isn't:
* A local identity
* Newer than ``TAKAHE_REMOTE_PRUNE_HORIZON`` days old
* Mentioned by a post by a local identity
* Followed or blocked by a local identity
* Following or blocking a local identity
* A liker or booster of a local post
We recommend you run the pruning commands on a scheduled basis (e.g. via
a cronjob). They will return a ``1`` exit code if they deleted something and
a ``0`` exit code if they found nothing to delete, so you can put them in
a loop that runs until deletion is complete::
until ./manage.py pruneposts; do sleep 1; done
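For example, a nightly crontab entry (installation path hypothetical) could
use the same loop form for both commands::

    0 3 * * * cd /srv/takahe && until ./manage.py pruneposts; do sleep 1; done; until ./manage.py pruneidentities; do sleep 1; done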
Caching
-------
@ -221,6 +268,17 @@ read-through cache that respects ``Cache-Control``, like Varnish, will
also help if placed in front of Takahē.
Remote Content Pruning
----------------------
By default, Takahē will prune (delete) remote posts and identities that nobody
local has interacted with once they are more than 90 days old. You can change
this using the
``TAKAHE_REMOTE_PRUNE_HORIZON`` environment variable, which accepts an integer
number of days as its value.
Setting this environment variable to ``0`` disables this feature entirely.
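For example (values illustrative)::

    TAKAHE_REMOTE_PRUNE_HORIZON=180   # keep roughly six months of remote content
    TAKAHE_REMOTE_PRUNE_HORIZON=0     # never prune anything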
Sentry.io integration
---------------------

View file

@ -5,7 +5,7 @@ dj_database_url~=1.0.0
django-cache-url~=3.4.2
django-cors-headers~=3.13.0
django-debug-toolbar~=3.8.1
django-hatchway~=0.5.1
django-hatchway~=0.5.2
django-htmx~=1.13.0
django-oauth-toolkit~=2.2.0
django-storages[google,boto3]~=1.13.1

View file

@ -12,6 +12,10 @@ addopts = --tb=short --ds=takahe.settings --import-mode=importlib
filterwarnings =
ignore:There is no current event loop
ignore:No directory at
ignore:DateTimeField Post.created
ignore:'index_together' is deprecated
ignore:Deprecated call to
ignore:pkg_resources is deprecated as an API
[mypy]
warn_unused_ignores = True

View file

@ -432,6 +432,17 @@ section h1.above {
margin-bottom: -20px;
}
section h2.above {
position: relative;
top: -35px;
left: -15px;
font-weight: bold;
font-size: 100%;
text-transform: uppercase;
color: var(--color-text-dull);
margin-bottom: -20px;
}
section p {
margin: 5px 0 10px 0;
}
@ -983,6 +994,7 @@ button,
background-color: var(--color-highlight);
color: var(--color-text-in-highlight);
display: inline-block;
text-decoration: none;
}
button.delete,
@ -1346,6 +1358,10 @@ table.metadata td .emoji {
cursor: pointer;
}
.message.error {
background-color: var(--color-bg-error);
}
/* Identity banner */
.identity-banner {

View file

@ -13,6 +13,7 @@ class StateGraph:
initial_state: ClassVar["State"]
terminal_states: ClassVar[set["State"]]
automatic_states: ClassVar[set["State"]]
deletion_states: ClassVar[set["State"]]
def __init_subclass__(cls) -> None:
# Collect state members
@ -33,6 +34,7 @@ class StateGraph:
# Check the graph layout
terminal_states = set()
automatic_states = set()
deletion_states = set()
initial_state = None
for state in cls.states.values():
# Check for multiple initial states
@ -42,6 +44,9 @@ class StateGraph:
f"The graph has more than one initial state: {initial_state} and {state}"
)
initial_state = state
# Collect states that require deletion handling (they can be terminal or not)
if state.delete_after:
deletion_states.add(state)
# Collect terminal states
if state.terminal:
state.externally_progressed = True
@ -74,6 +79,7 @@ class StateGraph:
cls.initial_state = initial_state
cls.terminal_states = terminal_states
cls.automatic_states = automatic_states
cls.deletion_states = deletion_states
# Generate choices
cls.choices = [(name, name) for name in cls.states.keys()]
@ -98,6 +104,9 @@ class State:
self.attempt_immediately = attempt_immediately
self.force_initial = force_initial
self.delete_after = delete_after
# Deletes are also only attempted on try_intervals
if self.delete_after and not self.try_interval:
self.try_interval = self.delete_after
self.parents: set["State"] = set()
self.children: set["State"] = set()
self.timeout_state: State | None = None
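# Hedged illustration (not part of this diff): a graph that opts into the new
# deletion handling. The `transitions_to` helper and the `handle_<state>`
# handler convention are assumed from the rest of Takahē's stator API, and all
# names below are made up.
class ExampleStates(StateGraph):
    new = State(try_interval=300)            # retried every five minutes
    done = State(delete_after=24 * 60 * 60)  # rows deleted ~24h after arrival;
                                             # try_interval defaults to delete_after
    new.transitions_to(done)

    @classmethod
    def handle_new(cls, instance):
        return cls.done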

View file

@ -1,6 +1,6 @@
import logging
from typing import cast
from asgiref.sync import async_to_sync
from django.apps import apps
from django.core.management.base import BaseCommand
@ -8,6 +8,8 @@ from core.models import Config
from stator.models import StatorModel
from stator.runner import StatorRunner
logger = logging.getLogger(__name__)
class Command(BaseCommand):
help = "Runs a Stator runner"
@ -17,7 +19,7 @@ class Command(BaseCommand):
"--concurrency",
"-c",
type=int,
default=30,
default=15,
help="How many tasks to run at once",
)
parser.add_argument(
@ -62,6 +64,12 @@ class Command(BaseCommand):
):
# Cache system config
Config.system = Config.load_system()
logging.basicConfig(
format="[%(asctime)s] %(levelname)8s - %(message)s",
datefmt="%Y-%m-%d %H:%M:%S",
level=logging.INFO,
force=True,
)
# Resolve the models list into names
models = cast(
list[type[StatorModel]],
@ -74,7 +82,9 @@ class Command(BaseCommand):
if not models:
models = StatorModel.subclasses
models = [model for model in models if model not in excluded]
print("Running for models: " + " ".join(m._meta.label_lower for m in models))
logger.info(
"Running for models: " + " ".join(m._meta.label_lower for m in models)
)
# Run a runner
runner = StatorRunner(
models,
@ -84,6 +94,6 @@ class Command(BaseCommand):
run_for=run_for,
)
try:
async_to_sync(runner.run)()
runner.run()
except KeyboardInterrupt:
print("Ctrl-C received")
logger.critical("Ctrl-C received")

View file

@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = []

View file

@ -4,7 +4,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("stator", "0001_initial"),
]

View file

@ -1,17 +1,18 @@
import datetime
import traceback
from typing import ClassVar, cast
import logging
from typing import ClassVar
from asgiref.sync import sync_to_async
from asgiref.sync import async_to_sync, iscoroutinefunction
from django.db import models, transaction
from django.db.models.signals import class_prepared
from django.utils import timezone
from django.utils.functional import classproperty
from core import exceptions
from stator.exceptions import TryAgainLater
from stator.graph import State, StateGraph
logger = logging.getLogger(__name__)
class StateField(models.CharField):
"""
@ -47,19 +48,15 @@ def add_stator_indexes(sender, **kwargs):
if issubclass(sender, StatorModel):
indexes = [
models.Index(
fields=["state", "state_attempted"],
name=f"ix_{sender.__name__.lower()[:11]}_state_attempted",
),
models.Index(
fields=["state_locked_until", "state"],
condition=models.Q(state_locked_until__isnull=False),
name=f"ix_{sender.__name__.lower()[:11]}_state_locked",
fields=["state", "state_next_attempt", "state_locked_until"],
name=f"ix_{sender.__name__.lower()[:11]}_state_next",
),
]
if not sender._meta.indexes:
# Meta.indexes needs to not be None to trigger Django behaviors
sender.Meta.indexes = []
sender._meta.indexes = []
for idx in indexes:
sender._meta.indexes.append(idx)
@ -81,30 +78,26 @@ class StatorModel(models.Model):
concrete model yourself.
"""
SCHEDULE_BATCH_SIZE = 1000
CLEAN_BATCH_SIZE = 1000
DELETE_BATCH_SIZE = 500
state: StateField
# If this row is up for transition attempts (which it always is on creation!)
state_ready = models.BooleanField(default=True)
# When the state last actually changed, or the date of instance creation
state_changed = models.DateTimeField(auto_now_add=True)
# When the last state change for the current state was attempted
# (and not successful, as this is cleared on transition)
state_attempted = models.DateTimeField(blank=True, null=True)
# When the next state change should be attempted (null means immediately)
state_next_attempt = models.DateTimeField(blank=True, null=True)
# If a lock is out on this row, when it is locked until
# (we don't identify the lock owner, as there's no heartbeats)
state_locked_until = models.DateTimeField(null=True, blank=True)
state_locked_until = models.DateTimeField(null=True, blank=True, db_index=True)
# Collection of subclasses of us
subclasses: ClassVar[list[type["StatorModel"]]] = []
class Meta:
abstract = True
indexes = [models.Index(fields=["state_ready", "state_locked_until", "state"])]
def __init_subclass__(cls) -> None:
if cls is not StatorModel:
@ -118,52 +111,6 @@ class StatorModel(models.Model):
def state_age(self) -> float:
return (timezone.now() - self.state_changed).total_seconds()
@classmethod
async def atransition_schedule_due(cls, now=None):
"""
Finds instances of this model that need to run and schedule them.
"""
if now is None:
now = timezone.now()
q = models.Q()
for state in cls.state_graph.states.values():
state = cast(State, state)
if not state.externally_progressed:
q = q | models.Q(
(
models.Q(
state_attempted__lte=(
now
- datetime.timedelta(
seconds=cast(float, state.try_interval)
)
)
)
| models.Q(state_attempted__isnull=True)
),
state=state.name,
)
select_query = cls.objects.filter(q)[: cls.SCHEDULE_BATCH_SIZE]
await cls.objects.filter(pk__in=select_query).aupdate(state_ready=True)
@classmethod
async def atransition_delete_due(cls, now=None):
"""
Finds instances of this model that need to be deleted and deletes them.
"""
if now is None:
now = timezone.now()
for state in cls.state_graph.states.values():
state = cast(State, state)
if state.delete_after:
select_query = cls.objects.filter(
state=state,
state_changed__lte=(
now - datetime.timedelta(seconds=state.delete_after)
),
)[: cls.SCHEDULE_BATCH_SIZE]
await cls.objects.filter(pk__in=select_query).adelete()
@classmethod
def transition_get_with_lock(
cls, number: int, lock_expiry: datetime.datetime
@ -172,11 +119,17 @@ class StatorModel(models.Model):
Returns up to `number` tasks for execution, having locked them.
"""
with transaction.atomic():
# Query for `number` rows that:
# - Have a next_attempt that's either null or in the past
# - Have one of the states we care about
# Then, sort them by next_attempt NULLS FIRST, so that we handle the
# rows in a roughly FIFO order.
selected = list(
cls.objects.filter(
state_locked_until__isnull=True,
state_ready=True,
models.Q(state_next_attempt__isnull=True)
| models.Q(state_next_attempt__lte=timezone.now()),
state__in=cls.state_graph.automatic_states,
state_locked_until__isnull=True,
)[:number].select_for_update()
)
cls.objects.filter(pk__in=[i.pk for i in selected]).update(
@ -185,58 +138,74 @@ class StatorModel(models.Model):
return selected
@classmethod
async def atransition_get_with_lock(
cls, number: int, lock_expiry: datetime.datetime
) -> list["StatorModel"]:
return await sync_to_async(cls.transition_get_with_lock)(number, lock_expiry)
def transition_delete_due(cls) -> int | None:
"""
Finds instances of this model that need to be deleted and deletes them
in small batches. Returns how many were deleted.
"""
if cls.state_graph.deletion_states:
constraints = models.Q()
for state in cls.state_graph.deletion_states:
constraints |= models.Q(
state=state,
state_changed__lte=(
timezone.now() - datetime.timedelta(seconds=state.delete_after)
),
)
select_query = cls.objects.filter(
models.Q(state_next_attempt__isnull=True)
| models.Q(state_next_attempt__lte=timezone.now()),
constraints,
)[: cls.DELETE_BATCH_SIZE]
return cls.objects.filter(pk__in=select_query).delete()[0]
return None
@classmethod
async def atransition_ready_count(cls) -> int:
def transition_ready_count(cls) -> int:
"""
Returns how many instances are "queued"
"""
return await cls.objects.filter(
return cls.objects.filter(
models.Q(state_next_attempt__isnull=True)
| models.Q(state_next_attempt__lte=timezone.now()),
state_locked_until__isnull=True,
state_ready=True,
state__in=cls.state_graph.automatic_states,
).acount()
).count()
@classmethod
async def atransition_clean_locks(cls):
def transition_clean_locks(cls):
"""
Deletes stale locks (in batches, to avoid a giant query)
"""
select_query = cls.objects.filter(state_locked_until__lte=timezone.now())[
: cls.SCHEDULE_BATCH_SIZE
: cls.CLEAN_BATCH_SIZE
]
await cls.objects.filter(pk__in=select_query).aupdate(state_locked_until=None)
cls.objects.filter(pk__in=select_query).update(state_locked_until=None)
def transition_schedule(self):
"""
Adds this instance to the queue to get its state transition attempted.
The scheduler will call this, but you can also call it directly if you
know it'll be ready and want to lower latency.
"""
self.state_ready = True
self.save()
async def atransition_attempt(self) -> State | None:
def transition_attempt(self) -> State | None:
"""
Attempts to transition the current state by running its handler(s).
"""
current_state: State = self.state_graph.states[self.state]
# If it's a manual progression state don't even try
# We shouldn't really be here in this case, but it could be a race condition
if current_state.externally_progressed:
print(
logger.warning(
f"Warning: trying to progress externally progressed state {self.state}!"
)
return None
# Try running its handler function
try:
next_state = await current_state.handler(self) # type: ignore
if iscoroutinefunction(current_state.handler):
next_state = async_to_sync(current_state.handler)(self)
else:
next_state = current_state.handler(self)
except TryAgainLater:
pass
except BaseException as e:
await exceptions.acapture_exception(e)
traceback.print_exc()
logger.exception(e)
else:
if next_state:
# Ensure it's a State object
@ -247,20 +216,24 @@ class StatorModel(models.Model):
raise ValueError(
f"Cannot transition from {current_state} to {next_state} - not a declared transition"
)
await self.atransition_perform(next_state)
self.transition_perform(next_state)
return next_state
# See if it timed out
# See if it timed out since its last state change
if (
current_state.timeout_value
and current_state.timeout_value
<= (timezone.now() - self.state_changed).total_seconds()
):
await self.atransition_perform(current_state.timeout_state)
self.transition_perform(current_state.timeout_state) # type: ignore
return current_state.timeout_state
await self.__class__.objects.filter(pk=self.pk).aupdate(
state_attempted=timezone.now(),
# Nothing happened, set next execution and unlock it
self.__class__.objects.filter(pk=self.pk).update(
state_next_attempt=(
timezone.now() + datetime.timedelta(seconds=current_state.try_interval) # type: ignore
),
state_locked_until=None,
state_ready=False,
)
return None
@ -273,27 +246,6 @@ class StatorModel(models.Model):
state,
)
atransition_perform = sync_to_async(transition_perform)
def transition_set_state(self, state: State | str):
"""
Sets the instance to the given state name for when it is saved.
"""
if isinstance(state, State):
state = state.name
if state not in self.state_graph.states:
raise ValueError(f"Invalid state {state}")
self.state = state # type: ignore
self.state_changed = timezone.now()
self.state_locked_until = None
if self.state_graph.states[state].attempt_immediately:
self.state_attempted = None
self.state_ready = True
else:
self.state_attempted = timezone.now()
self.state_ready = False
@classmethod
def transition_perform_queryset(
cls,
@ -303,26 +255,27 @@ class StatorModel(models.Model):
"""
Transitions every instance in the queryset to the given state name, forcibly.
"""
# Really ensure we have the right state object
if isinstance(state, State):
state = state.name
if state not in cls.state_graph.states:
raise ValueError(f"Invalid state {state}")
state_obj = cls.state_graph.states[state.name]
else:
state_obj = cls.state_graph.states[state]
# See if it's ready immediately (if not, delay until first try_interval)
if cls.state_graph.states[state].attempt_immediately:
if state_obj.attempt_immediately or state_obj.try_interval is None:
queryset.update(
state=state,
state=state_obj,
state_changed=timezone.now(),
state_attempted=None,
state_next_attempt=None,
state_locked_until=None,
state_ready=True,
)
else:
queryset.update(
state=state,
state=state_obj,
state_changed=timezone.now(),
state_attempted=timezone.now(),
state_next_attempt=(
timezone.now() + datetime.timedelta(seconds=state_obj.try_interval)
),
state_locked_until=None,
state_ready=False,
)
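# Hedged usage sketch (model and state names borrowed from the tests later in
# this diff): force every row in a queryset into a state in bulk.
#
#     Post.transition_perform_queryset(
#         Post.objects.filter(author=identity),
#         PostStates.deleted,
#     )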
@ -355,10 +308,6 @@ class Stats(models.Model):
instance.statistics[key] = {}
return instance
@classmethod
async def aget_for_model(cls, model: type[StatorModel]) -> "Stats":
return await sync_to_async(cls.get_for_model)(model)
def set_queued(self, number: int):
"""
Sets the current queued amount.

View file

@ -1,41 +1,46 @@
import asyncio
import datetime
import logging
import os
import signal
import time
import traceback
import uuid
from collections.abc import Callable
from concurrent.futures import Future, ThreadPoolExecutor
from asgiref.sync import async_to_sync, sync_to_async
from django.conf import settings
from django.db import close_old_connections
from django.utils import timezone
from core import exceptions, sentry
from core import sentry
from core.models import Config
from stator.models import StatorModel, Stats
logger = logging.getLogger(__name__)
class LoopingTask:
class LoopingTimer:
"""
Wrapper for having a coroutine go in the background and only have one
copy running at a time.
Triggers check() to be true once every `interval`.
"""
def __init__(self, callable: Callable):
self.callable = callable
self.task: asyncio.Task | None = None
next_run: float | None = None
def run(self) -> bool:
# If we have a task object, see if we can clear it up
if self.task is not None:
if self.task.done():
self.task = None
def __init__(self, interval: float, trigger_at_start=True):
self.interval = interval
self.trigger_at_start = trigger_at_start
def check(self) -> bool:
# See if it's our first time being called
if self.next_run is None:
# Set up the next call based on trigger_at_start
if self.trigger_at_start:
self.next_run = time.monotonic()
else:
return False
# OK, launch a new task
self.task = asyncio.create_task(self.callable())
return True
self.next_run = time.monotonic() + self.interval
# See if it's time to run the next call
if time.monotonic() >= self.next_run:
self.next_run = time.monotonic() + self.interval
return True
return False
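# Hedged usage sketch (not part of this diff): run() below drives LoopingTimer
# instances in exactly this shape; the work inside the loop is a stand-in.
def _example_timer_loop():
    timer = LoopingTimer(interval=60, trigger_at_start=False)
    while True:
        if timer.check():  # True at most once per 60-second window
            print("periodic work goes here")
        time.sleep(0.5)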
class StatorRunner:
@ -47,12 +52,13 @@ class StatorRunner:
def __init__(
self,
models: list[type[StatorModel]],
concurrency: int = getattr(settings, "STATOR_CONCURRENCY", 50),
concurrency: int = getattr(settings, "STATOR_CONCURRENCY", 30),
concurrency_per_model: int = getattr(
settings, "STATOR_CONCURRENCY_PER_MODEL", 15
),
liveness_file: str | None = None,
schedule_interval: int = 30,
schedule_interval: int = 60,
delete_interval: int = 30,
lock_expiry: int = 300,
run_for: int = 0,
):
@ -62,53 +68,52 @@ class StatorRunner:
self.concurrency_per_model = concurrency_per_model
self.liveness_file = liveness_file
self.schedule_interval = schedule_interval
self.delete_interval = delete_interval
self.lock_expiry = lock_expiry
self.run_for = run_for
self.minimum_loop_delay = 0.5
self.maximum_loop_delay = 5
self.tasks: dict[tuple[str, str], Future] = {}
# Set up SIGALRM handler
signal.signal(signal.SIGALRM, self.alarm_handler)
async def run(self):
def run(self):
sentry.set_takahe_app("stator")
self.handled = {}
self.started = time.monotonic()
self.last_clean = time.monotonic() - self.schedule_interval
self.tasks = []
self.executor = ThreadPoolExecutor(max_workers=self.concurrency)
self.loop_delay = self.minimum_loop_delay
self.schedule_task = LoopingTask(self.run_scheduling)
self.fetch_task = LoopingTask(self.fetch_and_process_tasks)
self.config_task = LoopingTask(self.load_config)
self.scheduling_timer = LoopingTimer(self.schedule_interval)
self.deletion_timer = LoopingTimer(self.delete_interval)
# For the first time period, launch tasks
print("Running main task loop")
logger.info("Running main task loop")
try:
with sentry.configure_scope() as scope:
while True:
# Do we need to do cleaning?
if (time.monotonic() - self.last_clean) >= self.schedule_interval:
# Set up the watchdog timer (each time we do this the
# previous one is cancelled)
# See if we need to run cleaning
if self.scheduling_timer.check():
# Set up the watchdog timer (each time we do this the previous one is cancelled)
signal.alarm(self.schedule_interval * 2)
# Refresh the config
self.config_task.run()
if self.schedule_task.run():
print("Running cleaning and scheduling")
else:
print("Previous scheduling still running...!")
# Write liveness file if configured
if self.liveness_file:
with open(self.liveness_file, "w") as fh:
fh.write(str(int(time.time())))
self.last_clean = time.monotonic()
# Refresh the config
self.load_config()
# Do scheduling (stale lock deletion and stats gathering)
self.run_scheduling()
# Clear the cleaning breadcrumbs/extra for the main part of the loop
sentry.scope_clear(scope)
self.remove_completed_tasks()
self.clean_tasks()
# Fetching is kind of blocking, so we need to do this
# as a separate coroutine
self.fetch_task.run()
# See if we need to add deletion tasks
if self.deletion_timer.check():
self.add_deletion_tasks()
# Fetch and run any new handlers we can fit
self.add_transition_tasks()
# Are we in limited run mode?
if (
@ -126,128 +131,168 @@ class StatorRunner:
self.loop_delay * 1.5,
self.maximum_loop_delay,
)
await asyncio.sleep(self.loop_delay)
time.sleep(self.loop_delay)
# Clear the Sentry breadcrumbs and extra for next loop
sentry.scope_clear(scope)
except KeyboardInterrupt:
pass
# Wait for tasks to finish
print("Waiting for tasks to complete")
while True:
self.remove_completed_tasks()
if not self.tasks:
break
# Prevent busylooping
await asyncio.sleep(0.5)
print("Complete")
return self.handled
logger.info("Waiting for tasks to complete")
self.executor.shutdown()
# We're done
logger.info("Complete")
def alarm_handler(self, signum, frame):
"""
Called when SIGALRM fires, which means we missed a schedule loop.
Just exit as we're likely deadlocked.
"""
print("Watchdog timeout exceeded")
logger.warning("Watchdog timeout exceeded")
os._exit(2)
async def load_config(self):
def load_config(self):
"""
Refreshes config from the DB
"""
Config.system = await Config.aload_system()
Config.system = Config.load_system()
async def run_scheduling(self):
def run_scheduling(self):
"""
Do any transition cleanup tasks
Deletes stale locks for models, and submits their stats.
"""
if self.handled:
print("Tasks processed since last flush:")
for label, number in self.handled.items():
print(f" {label}: {number}")
else:
print("No tasks handled since last flush.")
with sentry.start_transaction(op="task", name="stator.run_scheduling"):
for model in self.models:
print(f"Scheduling {model._meta.label_lower}")
await self.submit_stats(model)
print(" Cleaning locks")
await model.atransition_clean_locks()
print(" Scheduling due items")
await model.atransition_schedule_due()
print(" Deleting due items")
await model.atransition_delete_due()
with sentry.start_span(description=model._meta.label_lower):
num = self.handled.get(model._meta.label_lower, 0)
if num or settings.DEBUG:
logger.info(
f"{model._meta.label_lower}: Scheduling ({num} handled)"
)
self.submit_stats(model)
model.transition_clean_locks()
async def submit_stats(self, model):
def submit_stats(self, model: type[StatorModel]):
"""
Pop some statistics into the database
Pop some statistics into the database from our local info for the given model
"""
stats_instance = await Stats.aget_for_model(model)
stats_instance = Stats.get_for_model(model)
if stats_instance.model_label in self.handled:
stats_instance.add_handled(self.handled[stats_instance.model_label])
del self.handled[stats_instance.model_label]
stats_instance.set_queued(await model.atransition_ready_count())
stats_instance.set_queued(model.transition_ready_count())
stats_instance.trim_data()
await sync_to_async(stats_instance.save)()
stats_instance.save()
async def fetch_and_process_tasks(self):
def add_transition_tasks(self, call_inline=False):
"""
Adds a transition thread for as many instances as we can, given capacity
and batch size limits.
"""
# Calculate space left for tasks
space_remaining = self.concurrency - len(self.tasks)
# Fetch new tasks
for model in self.models:
if space_remaining > 0:
for instance in await model.atransition_get_with_lock(
for instance in model.transition_get_with_lock(
number=min(space_remaining, self.concurrency_per_model),
lock_expiry=(
timezone.now() + datetime.timedelta(seconds=self.lock_expiry)
),
):
self.tasks.append(
asyncio.create_task(self.run_transition(instance))
)
key = (model._meta.label_lower, instance.pk)
# Don't run two threads for the same thing
if key in self.tasks:
continue
if call_inline:
task_transition(instance, in_thread=False)
else:
self.tasks[key] = self.executor.submit(
task_transition, instance
)
self.handled[model._meta.label_lower] = (
self.handled.get(model._meta.label_lower, 0) + 1
)
space_remaining -= 1
async def run_transition(self, instance: StatorModel):
def add_deletion_tasks(self, call_inline=False):
"""
Wrapper for atransition_attempt with fallback error handling
Adds a deletion thread for each model
"""
task_name = f"stator.run_transition:{instance._meta.label_lower}#{{id}} from {instance.state}"
with sentry.start_transaction(op="task", name=task_name):
sentry.set_context(
"instance",
{
"model": instance._meta.label_lower,
"pk": instance.pk,
"state": instance.state,
"state_age": instance.state_age,
},
)
# Yes, this potentially goes over the capacity limit - it's fine.
for model in self.models:
if model.state_graph.deletion_states:
if call_inline:
task_deletion(model, in_thread=False)
else:
self.tasks[
model._meta.label_lower, "__delete__"
] = self.executor.submit(task_deletion, model)
try:
print(
f"Attempting transition on {instance._meta.label_lower}#{instance.pk} from state {instance.state}"
)
await instance.atransition_attempt()
except BaseException as e:
await exceptions.acapture_exception(e)
traceback.print_exc()
def remove_completed_tasks(self):
def clean_tasks(self):
"""
Removes all completed asyncio.Tasks from our local in-progress list
Removes any tasks that are done and handles exceptions if they
raised them.
"""
self.tasks = [t for t in self.tasks if not t.done()]
for key, task in list(self.tasks.items()):
if task.done():
del self.tasks[key]
try:
task.result()
except BaseException as e:
logger.exception(e)
async def run_single_cycle(self):
def run_single_cycle(self):
"""
Testing entrypoint to advance things just one cycle, and allow errors
to propagate out.
"""
await asyncio.wait_for(self.fetch_and_process_tasks(), timeout=1)
for task in self.tasks:
await task
self.add_deletion_tasks(call_inline=True)
self.add_transition_tasks(call_inline=True)
run_single_cycle_sync = async_to_sync(run_single_cycle)
def task_transition(instance: StatorModel, in_thread: bool = True):
"""
Runs one state transition/action.
"""
task_name = f"stator.task_transition:{instance._meta.label_lower}#{{id}} from {instance.state}"
started = time.monotonic()
with sentry.start_transaction(op="task", name=task_name):
sentry.set_context(
"instance",
{
"model": instance._meta.label_lower,
"pk": instance.pk,
"state": instance.state,
"state_age": instance.state_age,
},
)
result = instance.transition_attempt()
duration = time.monotonic() - started
if result:
logger.info(
f"{instance._meta.label_lower}: {instance.pk}: {instance.state} -> {result} ({duration:.2f}s)"
)
else:
logger.info(
f"{instance._meta.label_lower}: {instance.pk}: {instance.state} unchanged ({duration:.2f}s)"
)
if in_thread:
close_old_connections()
def task_deletion(model: type[StatorModel], in_thread: bool = True):
"""
Runs one model deletion set.
"""
# Loop, running deletions every second, until there are no more to do
while True:
deleted = model.transition_delete_due()
if not deleted:
break
logger.info(f"{model._meta.label_lower}: Deleted {deleted} stale items")
time.sleep(1)
if in_thread:
close_old_connections()

View file

@ -12,7 +12,7 @@ class RequestRunner(View):
For when you're on something serverless.
"""
async def get(self, request):
def get(self, request):
# Check the token, if supplied
if not settings.STATOR_TOKEN:
return HttpResponseForbidden("No token set")
@ -20,5 +20,5 @@ class RequestRunner(View):
return HttpResponseForbidden("Invalid token")
# Run on all models
runner = StatorRunner(StatorModel.subclasses, run_for=2)
handled = await runner.run()
handled = runner.run()
return HttpResponse(f"Handled {handled}")

View file

@ -1 +1 @@
__version__ = "0.9.0"
__version__ = "0.11.0"

View file

@ -9,8 +9,8 @@ import dj_database_url
import django_cache_url
import httpx
import sentry_sdk
from corsheaders.defaults import default_headers
from pydantic import AnyUrl, BaseSettings, EmailStr, Field, validator
from sentry_sdk.integrations.django import DjangoIntegration
from takahe import __version__
@ -28,7 +28,7 @@ class ImplicitHostname(AnyUrl):
class MediaBackendUrl(AnyUrl):
host_required = False
allowed_schemes = {"s3", "gs", "local"}
allowed_schemes = {"s3", "s3-insecure", "gs", "local"}
def as_bool(v: str | list[str] | None):
@ -142,9 +142,22 @@ class Settings(BaseSettings):
#: Default cache backend
CACHES_DEFAULT: CacheBackendUrl | None = None
# How long to wait, in days, until remote posts/profiles are pruned from
# our database if nobody local has interacted with them.
# Set to zero to disable.
REMOTE_PRUNE_HORIZON: int = 90
# Stator tuning
STATOR_CONCURRENCY: int = 50
STATOR_CONCURRENCY_PER_MODEL: int = 15
STATOR_CONCURRENCY: int = 20
STATOR_CONCURRENCY_PER_MODEL: int = 4
# If user migration is allowed (off by default until outbound is done)
ALLOW_USER_MIGRATION: bool = False
# Web Push keys
# Generate via https://web-push-codelab.glitch.me/
VAPID_PUBLIC_KEY: str | None = None
VAPID_PRIVATE_KEY: str | None = None
PGHOST: str | None = None
PGPORT: int | None = 5432
@ -333,6 +346,7 @@ CORS_ORIGIN_WHITELIST = SETUP.CORS_HOSTS
CORS_ALLOW_CREDENTIALS = True
CORS_PREFLIGHT_MAX_AGE = 604800
CORS_EXPOSE_HEADERS = ("link",)
CORS_ALLOW_HEADERS = (*default_headers, "Idempotency-Key")
JSONLD_MAX_SIZE = 1024 * 50 # 50 KB
@ -356,7 +370,9 @@ if SETUP.USE_PROXY_HEADERS:
if SETUP.SENTRY_DSN:
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.httpx import HttpxIntegration
from sentry_sdk.integrations.logging import LoggingIntegration
sentry_experiments = {}
@ -370,6 +386,7 @@ if SETUP.SENTRY_DSN:
integrations=[
DjangoIntegration(),
HttpxIntegration(),
LoggingIntegration(),
],
traces_sample_rate=SETUP.SENTRY_TRACES_SAMPLE_RATE,
sample_rate=SETUP.SENTRY_SAMPLE_RATE,
@ -395,8 +412,10 @@ if SETUP.EMAIL_SERVER:
elif parsed.scheme == "smtp":
EMAIL_HOST = parsed.hostname
EMAIL_PORT = parsed.port
EMAIL_HOST_USER = urllib.parse.unquote(parsed.username)
EMAIL_HOST_PASSWORD = urllib.parse.unquote(parsed.password)
if parsed.username is not None:
EMAIL_HOST_USER = urllib.parse.unquote(parsed.username)
if parsed.password is not None:
EMAIL_HOST_PASSWORD = urllib.parse.unquote(parsed.password)
EMAIL_USE_TLS = as_bool(query.get("tls"))
EMAIL_USE_SSL = as_bool(query.get("ssl"))
else:
@ -413,7 +432,7 @@ if SETUP.MEDIA_BACKEND:
if parsed.hostname is not None:
port = parsed.port or 443
GS_CUSTOM_ENDPOINT = f"https://{parsed.hostname}:{port}"
elif parsed.scheme == "s3":
elif (parsed.scheme == "s3") or (parsed.scheme == "s3-insecure"):
STORAGES["default"]["BACKEND"] = "core.uploads.TakaheS3Storage"
AWS_STORAGE_BUCKET_NAME = parsed.path.lstrip("/")
AWS_QUERYSTRING_AUTH = False
@ -422,8 +441,14 @@ if SETUP.MEDIA_BACKEND:
AWS_ACCESS_KEY_ID = parsed.username
AWS_SECRET_ACCESS_KEY = urllib.parse.unquote(parsed.password)
if parsed.hostname is not None:
port = parsed.port or 443
AWS_S3_ENDPOINT_URL = f"https://{parsed.hostname}:{port}"
if parsed.scheme == "s3-insecure":
s3_default_port = 80
s3_scheme = "http"
else:
s3_default_port = 443
s3_scheme = "https"
port = parsed.port or s3_default_port
AWS_S3_ENDPOINT_URL = f"{s3_scheme}://{parsed.hostname}:{port}"
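# Hedged example (host, credentials and bucket made up): a MinIO-style backend
# without TLS would be configured as
#   TAKAHE_MEDIA_BACKEND=s3-insecure://access-key:secret@minio.internal:9000/takahe-media
# which resolves to AWS_S3_ENDPOINT_URL == "http://minio.internal:9000"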
if SETUP.MEDIA_URL is not None:
media_url_parsed = urllib.parse.urlparse(SETUP.MEDIA_URL)
AWS_S3_CUSTOM_DOMAIN = media_url_parsed.hostname

View file

@ -65,6 +65,21 @@ urlpatterns = [
settings.CsvFollowers.as_view(),
name="settings_export_followers_csv",
),
path(
"@<handle>/settings/import_export/blocks.csv",
settings.CsvBlocks.as_view(),
name="settings_export_blocks_csv",
),
path(
"@<handle>/settings/import_export/mutes.csv",
settings.CsvMutes.as_view(),
name="settings_export_mutes_csv",
),
path(
"@<handle>/settings/migrate_in/",
settings.MigrateInPage.as_view(),
name="settings_migrate_in",
),
path(
"@<handle>/settings/tokens/",
settings.TokensRoot.as_view(),
@ -128,6 +143,11 @@ urlpatterns = [
admin.FederationRoot.as_view(),
name="admin_federation",
),
path(
"admin/federation/blocklist/",
admin.FederationBlocklist.as_view(),
name="admin_federation_blocklist",
),
path(
"admin/federation/<domain>/",
admin.FederationEdit.as_view(),

View file

@ -40,7 +40,7 @@
{% for attachment in post.attachments.all %}
{% if attachment.is_image %}
<a href="{{ attachment.full_url.relative }}" class="image" target="_blank"
_="on click halt the event then call imageviewer.show(me)">
_="on click halt the event then call imageviewer.show(me)">
<img src="{{ attachment.thumbnail_url.relative }}" title="{{ attachment.name }}" alt="{{ attachment.name|default:'(no description)' }}" loading="lazy" data-original-url="{{ attachment.full_url.relative }}">
{% if attachment.name %}
<div class="badge">ALT</div>

View file

@ -39,44 +39,44 @@
<div class="field payload">
<span class="name">Raw Response:
<a title="Copy Content"
class="copy"
_="on click
writeText(#raw_response.innerText) into the navigator's clipboard
then add .copied
wait 2s
then remove .copied">
class="copy"
_="on click
writeText(#raw_response.innerText) into the navigator's clipboard
then add .copied
wait 2s
then remove .copied">
<i class="fa-solid fa-copy"></i>
</a>
</span>
<span class="value">
<a _="on click
toggle .hidden on #raw_response
then
if my.innerText is 'Hide' set my.innerText to 'Show'
else set my.innerText to 'Hide'
">Show</a></span>
toggle .hidden on #raw_response
then
if my.innerText is 'Hide' set my.innerText to 'Show'
else set my.innerText to 'Hide'
">Show</a></span>
</div>
<pre id="raw_response" class="hidden">{{ raw_result }}</pre>
<div class="field payload">
<span class="name">Canonical:
<a title="Copy Content"
class="copy"
_="on click
writeText(#canonical_response.innerText) into the navigator's clipboard
then add .copied
wait 2s
then remove .copied">
class="copy"
_="on click
writeText(#canonical_response.innerText) into the navigator's clipboard
then add .copied
wait 2s
then remove .copied">
<i class="fa-solid fa-copy"></i>
</a>
</span>
<span class="value">
<a _="on click
toggle .hidden on #canonical_response
then
if my.innerText is 'Hide' set my.innerText to 'Show'
else set my.innerText to 'Hide'
">Show</a></span>
toggle .hidden on #canonical_response
then
if my.innerText is 'Hide' set my.innerText to 'Show'
else set my.innerText to 'Hide'
">Show</a></span>
</div>
<pre id="canonical_response" class="hidden">{{ result }}</pre>
</div>

View file

@ -8,6 +8,9 @@
<input type="search" name="query" value="{{ query }}" placeholder="Search by domain">
<button><i class="fa-solid fa-search"></i></button>
</form>
<div class="view-options">
<a href="{% url "admin_federation_blocklist" %}?page={{ page_obj.number }}" class="button">Import Blocklist</a>
</div>
<table class="items">
{% for domain in page_obj %}
<tr>

View file

@ -0,0 +1,18 @@
{% extends "admin/base_main.html" %}
{% block subtitle %}Federation Blocklist{% endblock %}
{% block settings_content %}
<form action="." method="POST" enctype="multipart/form-data">
{% csrf_token %}
<h1>Import Blocklist</h1>
<fieldset>
{% include "forms/_field.html" with field=form.blocklist %}
</fieldset>
<div class="buttons">
<a href="{% url "admin_federation" %}?page={{ page }}" class="button secondary left">Back</a>
<button>Save</button>
</div>
</form>
{% endblock %}

View file

@ -61,6 +61,9 @@
{% endif %}
{% elif not request.domain.config_domain.hide_login %}
<a href="{% url "login" %}" title="Login"><i class="fa-solid fa-right-to-bracket"></i></a>
{% if config.signup_allowed %}
<a href="{% url "signup" %}" title="Create Account"><i class="fa-solid fa-user-plus"></i></a>
{% endif %}
{% endif %}
</menu>
</header>

View file

@ -80,9 +80,9 @@
</span>
<div class="right">
<button class="fa-solid fa-trash delete" title="Delete Row"
_="on click remove (closest parent .option)
then {{ field.name }}.collect{{ field.name|title }}Fields()
then halt" />
_="on click remove (closest parent .option)
then {{ field.name }}.collect{{ field.name|title }}Fields()
then halt" />
</div>
</div>
</div>

View file

@ -23,8 +23,8 @@
_="on click halt the event then call imageviewer.show(me)"
>
<img src="{{ identity.local_icon_url.relative }}" class="icon"
data-original-url="{{ identity.local_icon_url.relative }}"
alt="Profile image for {{ identity.name }}"
data-original-url="{{ identity.local_icon_url.relative }}"
alt="Profile image for {{ identity.name }}"
>
</span>
@ -47,13 +47,13 @@
<small>
@{{ identity.handle }}
<a title="Copy handle"
class="copy"
tabindex="0"
_="on click or keyup[key is 'Enter']
writeText('@{{ identity.handle }}') into the navigator's clipboard
then add .copied
wait 2s
then remove .copied">
class="copy"
tabindex="0"
_="on click or keyup[key is 'Enter']
writeText('@{{ identity.handle }}') into the navigator's clipboard
then add .copied
wait 2s
then remove .copied">
<i class="fa-solid fa-copy"></i>
</a>
</small>

View file

@ -14,6 +14,12 @@
<i class="fa-solid fa-cloud-arrow-up"></i>
<span>Import/Export</span>
</a>
{% if allow_migration %}
<a href="{% url "settings_migrate_in" handle=identity.handle %}" {% if section == "migrate_in" %}class="selected"{% endif %} title="Interface">
<i class="fa-solid fa-door-open"></i>
<span>Migrate Inbound</span>
</a>
{% endif %}
<a href="{% url "settings_tokens" handle=identity.handle %}" {% if section == "tokens" %}class="selected"{% endif %} title="Authorized Apps">
<i class="fa-solid fa-window-restore"></i>
<span>Authorized Apps</span>

View file

@ -50,7 +50,7 @@
<small>{{ numbers.blocks }} {{ numbers.blocks|pluralize:"people,people" }}</small>
</td>
<td>
<a href="{% url "settings_export_blocks_csv" handle=identity.handle %}">Download CSV</a>
</td>
</tr>
<tr>
@ -59,7 +59,7 @@
<small>{{ numbers.mutes }} {{ numbers.mutes|pluralize:"people,people" }}</small>
</td>
<td>
<a href="{% url "settings_export_mutes_csv" handle=identity.handle %}">Download CSV</a>
</td>
</tr>
</table>

View file

@ -0,0 +1,36 @@
{% extends "settings/base.html" %}
{% block subtitle %}Migrate Here{% endblock %}
{% block settings_content %}
<form action="." method="POST">
{% csrf_token %}
<fieldset>
<legend>Add New Alias</legend>
<p>
To move another account to this one, first add it as an alias here,
and then go to the server where it is hosted and initiate the move.
</p>
{% include "forms/_field.html" with field=form.alias %}
</fieldset>
<div class="buttons">
<button>Add</button>
</div>
</form>
<section>
<h2 class="above">Current Aliases</h2>
<table>
{% for alias in aliases %}
<tr><td>{{ alias.handle }} <a href=".?remove_alias={{ alias.actor_uri|urlencode }}" class="button danger">Remove Alias</a></td></tr>
{% empty %}
<tr><td class="empty">You have no aliases.</td></tr>
{% endfor %}
</table>
</section>
{% endblock %}

View file

@ -4,7 +4,7 @@
{% block settings_content %}
<form action="." method="POST" enctype="multipart/form-data"
_="on submit metadata.collectMetadataFields()">
_="on submit metadata.collectMetadataFields()">
{% csrf_token %}
<fieldset>

View file

@ -68,7 +68,7 @@ def test_ensure_hashtag(identity: Identity, config_system, stator):
author=identity,
content="Hello, #testtag",
)
stator.run_single_cycle_sync()
stator.run_single_cycle()
assert post.hashtags == ["testtag"]
assert Hashtag.objects.filter(hashtag="testtag").exists()
# Excessively long hashtag
@ -76,7 +76,7 @@ def test_ensure_hashtag(identity: Identity, config_system, stator):
author=identity,
content="Hello, #thisisahashtagthatiswaytoolongandissignificantlyaboveourmaximumlimitofonehundredcharacterswhytheywouldbethislongidontknow",
)
stator.run_single_cycle_sync()
stator.run_single_cycle()
assert post.hashtags == [
"thisisahashtagthatiswaytoolongandissignificantlyaboveourmaximumlimitofonehundredcharacterswhytheywou"
]
@ -180,7 +180,7 @@ def test_linkify_mentions_local(config_system, identity, identity2, remote_ident
post.mentions.add(remote_identity)
assert (
post.safe_content_local()
== '<p>Hello <span class="h-card"><a href="/@test@remote.test/" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>test</span></a></span></p>'
== '<p>Hello <span class="h-card"><a href="https://remote.test/@test/" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>test</span></a></span></p>'
)
# Test a full username (local)
post = Post.objects.create(
@ -204,7 +204,7 @@ def test_linkify_mentions_local(config_system, identity, identity2, remote_ident
post.mentions.add(remote_identity)
assert (
post.safe_content_local()
== '<span class="h-card"><a href="/@test@remote.test/" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>test</span></a></span> hello!'
== '<span class="h-card"><a href="https://remote.test/@test/" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>test</span></a></span> hello!'
)
# Test that they don't get touched without a mention
post = Post.objects.create(
@ -226,19 +226,19 @@ def test_post_transitions(identity, stator):
)
# Test: | --> new --> fanned_out
assert post.state == str(PostStates.new)
stator.run_single_cycle_sync()
stator.run_single_cycle()
post = Post.objects.get(id=post.id)
assert post.state == str(PostStates.fanned_out)
# Test: fanned_out --> (forced) edited --> edited_fanned_out
Post.transition_perform(post, PostStates.edited)
stator.run_single_cycle_sync()
stator.run_single_cycle()
post = Post.objects.get(id=post.id)
assert post.state == str(PostStates.edited_fanned_out)
# Test: edited_fanned_out --> (forced) deleted --> deleted_fanned_out
Post.transition_perform(post, PostStates.deleted)
stator.run_single_cycle_sync()
stator.run_single_cycle()
post = Post.objects.get(id=post.id)
assert post.state == str(PostStates.deleted_fanned_out)
@ -392,7 +392,7 @@ def test_inbound_posts(
InboxMessage.objects.create(message=message)
# Run stator and ensure that made the post
stator.run_single_cycle_sync()
stator.run_single_cycle()
post = Post.objects.get(object_uri="https://remote.test/test-post")
assert post.content == "post version one"
assert post.published.day == 13
@ -416,7 +416,7 @@ def test_inbound_posts(
InboxMessage.objects.create(message=message)
# Run stator and ensure that edited the post
stator.run_single_cycle_sync()
stator.run_single_cycle()
post = Post.objects.get(object_uri="https://remote.test/test-post")
assert post.content == "post version two"
assert post.edited.day == 14
@ -455,7 +455,7 @@ def test_inbound_posts(
InboxMessage.objects.create(message=message)
# Run stator and ensure that deleted the post
stator.run_single_cycle_sync()
stator.run_single_cycle()
assert not Post.objects.filter(object_uri="https://remote.test/test-post").exists()
# Create an inbound new post message with only contentMap
@ -474,7 +474,7 @@ def test_inbound_posts(
InboxMessage.objects.create(message=message)
# Run stator and ensure that made the post
stator.run_single_cycle_sync()
stator.run_single_cycle()
post = Post.objects.get(object_uri="https://remote.test/test-map-only")
assert post.content == "post with only content map"
assert post.published.day == 13
@ -499,3 +499,45 @@ def test_post_hashtag_to_ap(identity: Identity, config_system):
]
assert "#world" in ap["object"]["content"]
assert 'rel="tag"' in ap["object"]["content"]
@pytest.mark.django_db
@pytest.mark.parametrize(
"visibility",
[
Post.Visibilities.public,
Post.Visibilities.unlisted,
Post.Visibilities.followers,
Post.Visibilities.mentioned,
],
)
def test_post_targets_to_ap(
identity: Identity, other_identity: Identity, visibility: Post.Visibilities
):
"""
Ensures that posts have the right targets in AP form.
"""
# Make a post
post = Post.objects.create(
content="<p>Hello @other</p>",
author=identity,
local=True,
visibility=visibility,
)
post.mentions.add(other_identity)
# Check its AP targets
ap_dict = post.to_ap()
if visibility == Post.Visibilities.public:
assert ap_dict["to"] == ["as:Public"]
assert ap_dict["cc"] == [other_identity.actor_uri]
elif visibility == Post.Visibilities.unlisted:
assert "to" not in ap_dict
assert ap_dict["cc"] == ["as:Public", other_identity.actor_uri]
elif visibility == Post.Visibilities.followers:
assert ap_dict["to"] == [identity.followers_uri]
assert ap_dict["cc"] == [other_identity.actor_uri]
elif visibility == Post.Visibilities.mentioned:
assert "to" not in ap_dict
assert ap_dict["cc"] == [other_identity.actor_uri]

Some files were not shown because too many files have changed in this diff.