Mirror of https://dev.funkwhale.audio/funkwhale/funkwhale
Compare commits
279 commits
Author | SHA1 | Date |
---|---|---|
Umut Solmaz | f0706ecc5f | |
Umut Solmaz | 95e463a7ef | |
Umut Solmaz | 71688bdfbc | |
josé m | 1bb7108df5 | |
Petitminion | 4bef27552f | |
Ciarán Ainsworth | ec368e0cd3 | |
Ciarán Ainsworth | a2579bdc60 | |
Ciarán Ainsworth | e1e0045a23 | |
Ciarán Ainsworth | 85c2be6a5b | |
Ciarán Ainsworth | 35de9bd48e | |
Petitminion | ba5b657b61 | |
Petitminion | 4fc73c1430 | |
Ciarán Ainsworth | 97e24bcaa6 | |
Ciarán Ainsworth | 1b15fea1ab | |
Ciarán Ainsworth | b624fea2fa | |
Ciarán Ainsworth | e028e8788b | |
Ciarán Ainsworth | 67f74d40a6 | |
Petitminion | 547bd6f371 | |
Petitminion | 05ec6f6d0f | |
Petitminion | a03cc1db24 | |
Petitminion | 2a364d5785 | |
Petitminion | 5bc0171694 | |
Petitminion | 37acfa475d | |
Petitminion | f45fd1e465 | |
Petitminion | 17c4a92f77 | |
Petitminion | 6414302899 | |
Ciarán Ainsworth | 94a5b9e696 | |
Bruno-Van-den-Bosch | d673e77dff | |
Kasper Seweryn | 02400ceea3 | |
Kasper Seweryn | 31f35a43f1 | |
Renovate Bot | 932de8c242 | |
Renovate Bot | a947a16b0f | |
Renovate Bot | a01079850d | |
Ciarán Ainsworth | 8d22eb925e | |
Georg Krause | 6fe153c8da | |
Georg Krause | cb7284ef95 | |
Georg Krause | 5ca8691feb | |
Georg Krause | b4920af0b8 | |
Georg Krause | 803b077f00 | |
Georg Krause | f1f6ef43ad | |
Georg Krause | 0fd0192b37 | |
Georg Krause | ac6d136105 | |
Georg Krause | 4e825527a5 | |
Georg Krause | 46ee53c967 | |
Georg Krause | 765c801142 | |
Tron | e0e8a54d45 | |
Tron | c67884a245 | |
Kasper Seweryn | d2ca28ca47 | |
Kasper Seweryn | 30540ec186 | |
Kasper Seweryn | 673fe8b828 | |
Kasper Seweryn | fe4af475af | |
Kasper Seweryn | ad1bb6a220 | |
Georg Krause | 298ace1b72 | |
Georg Krause | 37a1b008b3 | |
Kasper Seweryn | e42646d8a1 | |
Kasper Seweryn | 0095fc566e | |
Georg Krause | 419da80e37 | |
Kasper Seweryn | 0b99740d64 | |
Georg Krause | 51f56bc808 | |
Georg Krause | b00d782006 | |
Kasper Seweryn | f3a7394461 | |
Georg Krause | cb8725a838 | |
Georg Krause | cddf6b9d93 | |
Georg Krause | 521c4d927c | |
Kasper Seweryn | 78329ca821 | |
Georg Krause | 1ca5ea2b73 | |
Kasper Seweryn | 62f84a311b | |
Kasper Seweryn | 5bf6e23815 | |
Kasper Seweryn | 318aa196fa | |
Kasper Seweryn | b313d0e48c | |
Kasper Seweryn | cea9d9cf47 | |
Kasper Seweryn | 97aa045b0b | |
Kasper Seweryn | ccef0197c6 | |
Kasper Seweryn | 14d099b872 | |
Kasper Seweryn | 5647a1072d | |
Kasper Seweryn | de232cb749 | |
Georg Krause | b1eba58dcc | |
Georg Krause | 06cfe8da95 | |
wvffle | 6aa609970f | |
wvffle | 2b1228e620 | |
wvffle | 83120cced2 | |
wvffle | 367ba84f13 | |
wvffle | 7957661573 | |
wvffle | 9e2d47f698 | |
wvffle | 243f2a57e3 | |
wvffle | 670b522675 | |
Renovate Bot | ff6fc46c58 | |
Ciarán Ainsworth | 84bb893f3a | |
petitminion | 6c38bae189 | |
petitminion | 4364d82b0b | |
Renovate Bot | ac74380986 | |
Renovate Bot | ee0abed0b7 | |
Renovate Bot | fc456e6985 | |
petitminion | b0423d412f | |
Renovate Bot | 9853b89911 | |
Renovate Bot | e6e1b5cdc4 | |
Ciarán Ainsworth | 3b45fde10a | |
Georg Krause | 1eaad85c7d | |
Renovate Bot | f76a797638 | |
Georg Krause | d7d6976229 | |
Renovate Bot | 765bc62a2b | |
Renovate Bot | 446b49fd46 | |
Renovate Bot | 0210304338 | |
Renovate Bot | 6d7a52c5ec | |
Renovate Bot | 825baecf8f | |
Renovate Bot | 62f7fda42c | |
Georg Krause | d82eceecae | |
Renovate Bot | f58a33ec02 | |
Renovate Bot | abf0edfcdc | |
Philipp Wolfer | b658089e70 | |
Philipp Wolfer | 82fdc82f93 | |
Philipp Wolfer | 2371f2a4cb | |
Philipp Wolfer | 136f24a917 | |
Philipp Wolfer | a5ee48818e | |
Philipp Wolfer | d227490f5b | |
Philipp Wolfer | bf8f1e41b9 | |
Philipp Wolfer | e169e8edb1 | |
Philipp Wolfer | 0fab0470c2 | |
Philipp Wolfer | 81401075aa | |
Renovate Bot | c1d91ce4d6 | |
Renovate Bot | 1f8c03e248 | |
Renovate Bot | 42bf16034b | |
Renovate Bot | 787acab3ab | |
Renovate Bot | f43ef89c28 | |
Renovate Bot | c4bec419ab | |
Renovate Bot | 55a4221b69 | |
Renovate Bot | 60f66eea6d | |
Renovate Bot | 4148cdd186 | |
Renovate Bot | 004d535eb7 | |
Renovate Bot | 132e291708 | |
Renovate Bot | 40d2dcaeaf | |
Renovate Bot | fa36c97d72 | |
Renovate Bot | 9b8828ca42 | |
Georg Krause | e0791b570f | |
Georg Krause | 90c9230a60 | |
Renovate Bot | 1e0f3abb54 | |
Petitminion | bfff1f85f9 | |
petitminion | ae9fea0cf1 | |
Renovate Bot | da370f5915 | |
Renovate Bot | d6a078643b | |
Renovate Bot | 7fcaa1fed2 | |
Georg Krause | c3ae40cabe | |
Georg Krause | daf9e80ca5 | |
Georg Krause | b4f18edaff | |
Georg Krause | fa6d48f1b7 | |
Georg Krause | 8f3ab416ae | |
jo | cd9d6d696e | |
Baudouin Feildel | 2c90b32bb3 | |
Baudouin Feildel | e96748c029 | |
Georg Krause | d12ca2bad8 | |
Philipp Wolfer | 332ae20f05 | |
Georg Krause | 736625e235 | |
Georg Krause | 33cd0f05a7 | |
Georg Krause | 06d135875b | |
Bruno-Van-den-Bosch | de41545ab3 | |
Maksim Kliazovich | 5ce00a9230 | |
Thomas | d112d82768 | |
Thomas | 03e9be77f9 | |
Thomas | b6bcc88287 | |
Thomas | 4677b9117d | |
Thomas | bc573e47bc | |
Thomas | 9a5a749171 | |
mittwerk | de60ca7309 | |
josé m | 5693d0f86d | |
Thomas | 22084cbca7 | |
Georg Krause | 731ee7c21e | |
Georg Krause | afea533aed | |
Georg Krause | 8a6b19fb6f | |
Georg Krause | 0eec47e493 | |
Georg Krause | 4f9280bd2c | |
Renovate Bot | 2ac4e25fce | |
Georg Krause | 295b0dcc3a | |
Ciarán Ainsworth | ab0efa3edf | |
Ciarán Ainsworth | 587bbc1118 | |
Ciarán Ainsworth | b8978021c0 | |
Georg Krause | 349610bbeb | |
Ciarán Ainsworth | 65f13a379f | |
Ciarán Ainsworth | ba53d03ac5 | |
Ciarán Ainsworth | cb65ee69e1 | |
Ciarán Ainsworth | 65728c81c4 | |
Matteo Piovanelli | 5b022d94d1 | |
Georg Krause | 21ff5f65da | |
Georg Krause | d8c734d3cd | |
Georg Krause | b1f3a62fae | |
Georg Krause | 20cfaa8dc9 | |
Georg Krause | 038b696e75 | |
Georg Krause | 59687b2f32 | |
Thomas | da71fb640d | |
Thomas | 09facc553d | |
Georg Krause | da01070455 | |
Georg Krause | b00daa189d | |
drakonicguy | aa0ce033aa | |
Georg Krause | cc2272bb80 | |
Matteo Piovanelli | f0e79b4a0a | |
Aznörth Niryn | 9da91df798 | |
Aitor | 807a6fd02c | |
Ciarán Ainsworth | 517d99f9bf | |
Georg Krause | 6ab1dc0536 | |
Georg Krause | 803eb85b67 | |
Georg Krause | 6fcae233df | |
Georg Krause | bf43b95208 | |
Georg Krause | d721a3808b | |
Georg Krause | d22a911619 | |
Georg Krause | 7c52227d43 | |
Georg Krause | 58e2c896b2 | |
Georg Krause | 91b85cab46 | |
Georg Krause | bc15de7556 | |
Georg Krause | f99de1ef97 | |
Georg Krause | 5cc0219196 | |
josé m | 369b80bb1c | |
Thomas | 60db27dfba | |
Aznörth Niryn | efffeac280 | |
Thomas | d112ea4bc6 | |
Aznörth Niryn | b8ed2ccd5c | |
Quentin PAGÈS | ab15803be0 | |
Quentin PAGÈS | e282422592 | |
omarmaciasmolina | 96d25ff25d | |
rinenweb | 8645180620 | |
Jérémie Lorente | 142a517b93 | |
dignny | 233d17d287 | |
Aznörth Niryn | 630ba7262a | |
dignny | 0b78affdcd | |
Transcriber allium | 41dbf62356 | |
Matyáš Caras | 6b6ba94291 | |
josé m | 9eda066a39 | |
Aznörth Niryn | 4cf2d68a4f | |
Renovate Bot | a19b459533 | |
Renovate Bot | e3206e2122 | |
Renovate Bot | ba3300a682 | |
Renovate Bot | c6aec56e71 | |
Renovate Bot | 02fd31d321 | |
Renovate Bot | 07f665cb8b | |
Renovate Bot | 0b03bd6c89 | |
Renovate Bot | 2aa301387c | |
Renovate Bot | 46531884b3 | |
Renovate Bot | 6234dfd2a7 | |
Renovate Bot | 1c93460ffb | |
Renovate Bot | b6c906bf7c | |
Renovate Bot | 793fc31e13 | |
Georg Krause | 80b4906438 | |
Renovate Bot | e11a6cea02 | |
Renovate Bot | b46aa638bc | |
Ciarán Ainsworth | 17e08fd332 | |
Georg Krause | 86ce4cfd7c | |
Georg Krause | b21e241f37 | |
Renovate Bot | 08bfc93243 | |
Ciarán Ainsworth | 4cbce95bcb | |
Georg Krause | 3ee6ba6658 | |
Thomas | 259fb1b61d | |
Thomas | 516c281a57 | |
Thomas | d842243b3c | |
Thomas | a4ea1a06b9 | |
Thomas | d44c29bedb | |
Thomas | 6e46660d70 | |
Thomas | 32db5e92a3 | |
Thomas | ba365d6722 | |
Thomas | fd44d0bf12 | |
Thomas | 70c0a038fc | |
Thomas | 06e49598a3 | |
Thomas | 779a3ee717 | |
Thomas | 92f73b1755 | |
Thomas | f34eb14c9a | |
Thomas | 358ce509a5 | |
Thomas | 65ebb8d90e | |
Thomas | 499e1a8354 | |
Thomas | 8de3c1489d | |
Thomas | 11f7fa25ae | |
Thomas | 1ccf18412f | |
Thomas | 1061275487 | |
Thomas | af592d99c2 | |
Thomas | d1dd0bebcf | |
Renovate Bot | 9da463e69d | |
Renovate Bot | 1ee1c88ed1 | |
Renovate Bot | e38808e2ce | |
Renovate Bot | 2edbc6c98f | |
Georg Krause | bfa50a0c35 | |
Georg Krause | 74b2593cb2 | |
Georg Krause | cc2ff8ae88 | |
Georg Krause | 9dbbe9e768 |

@@ -7,6 +7,7 @@ nd
 readby
 serie
 upto
 afterall

 # Names

 nin

2 .env.dev

@@ -18,6 +18,6 @@ MEDIA_ROOT=/data/media
 # FORCE_HTTPS_URLS=True

 # Customize to your needs
-POSTGRES_VERSION=11
+POSTGRES_VERSION=15
 DEBUG=true
 TYPESENSE_API_KEY="apikey"


@@ -1,3 +1,5 @@
 /dist
+
+### OSX ###
 .DS_Store
 .AppleDouble

@@ -83,8 +85,12 @@ front/yarn-debug.log*
 front/yarn-error.log*
 front/tests/unit/coverage
 front/tests/e2e/reports
+front/test_results.xml
+front/coverage/
 front/selenium-debug.log
 docs/_build
+#Tauri
+front/tauri/gen

 /data/
 .env

@@ -104,3 +110,9 @@ tsconfig.tsbuildinfo

 # Vscode
 .vscode/
+
+# Nix
+.direnv/
+.envrc
+flake.nix
+flake.lock


@@ -144,7 +144,6 @@ find_broken_links:
 --cache
 --no-progress
 --exclude-all-private
 --exclude-mail
 --exclude 'demo\.funkwhale\.audio'
 --exclude 'nginx\.com'
 --exclude-path 'docs/_templates/'

@@ -231,7 +230,7 @@ test_api:
 image: $CI_REGISTRY/funkwhale/ci/python-funkwhale-api:$PYTHON_VERSION
 parallel:
 matrix:
-- PYTHON_VERSION: ["3.8", "3.9", "3.10", "3.11"]
+- PYTHON_VERSION: ["3.8", "3.9", "3.10", "3.11", "3.12"]
 services:
 - name: postgres:15-alpine
 command:

@@ -248,7 +247,7 @@ test_api:
 CACHE_URL: "redis://redis:6379/0"
 before_script:
 - cd api
-- poetry install --all-extras
+- make install
 script:
 - >
 poetry run pytest

@@ -288,6 +287,7 @@ test_front:
 coverage_report:
 coverage_format: cobertura
 path: front/coverage/cobertura-coverage.xml
 coverage: '/All files\s+(?:\|\s+((?:\d+\.)?\d+)\s+){4}.*/'

 build_metadata:
 stage: build

@@ -313,7 +313,7 @@ test_integration:
 interruptible: true

 image:
-name: cypress/included:12.14.0
+name: cypress/included:13.6.4
 entrypoint: [""]
 cache:
 - *front_cache

@@ -351,7 +351,7 @@ build_api_schema:
 API_TYPE: "v1"
 before_script:
 - cd api
-- poetry install --all-extras
+- make install
 - poetry run funkwhale-manage migrate
 script:
 - poetry run funkwhale-manage spectacular --file ../docs/schema.yml

@@ -430,6 +430,25 @@ build_api:
 paths:
 - api

+build_tauri:
+  stage: build
+  rules:
+    - if: $CI_COMMIT_BRANCH =~ /(stable|develop)/
+    - changes: [front/**/*]
+
+  image: $CI_REGISTRY/funkwhale/ci/node-tauri:18
+  variables:
+    <<: *keep_git_files_permissions
+  before_script:
+    - source /root/.cargo/env
+    - yarn install
+  script:
+    - yarn tauri build --verbose
+  artifacts:
+    name: desktop_${CI_COMMIT_REF_NAME}
+    paths:
+      - front/tauri/target/release/bundle/appimage/*.AppImage
+
 deploy_docs:
 interruptible: false
 extends: .ssh-agent

@@ -462,7 +481,7 @@ docker:
 variables:
 BUILD_ARGS: >
 --set *.platform=linux/amd64,linux/arm64,linux/arm/v7
---set *.no-cache
+--no-cache
 --push

 - if: $CI_COMMIT_BRANCH =~ /(stable|develop)/

@@ -473,7 +492,8 @@ docker:
 --set *.cache-to=type=registry,ref=$DOCKER_CACHE_IMAGE:$CI_COMMIT_BRANCH,mode=max,oci-mediatypes=false
 --push

-- if: $CI_PIPELINE_SOURCE == "merge_request_event"
+- if: $CI_PIPELINE_SOURCE == "merge_request_event" && $CI_PROJECT_NAMESPACE == "funkwhale"
+# We don't provide priviledged runners to everyone, so we can only build docker images in the funkwhale group
 variables:
 BUILD_ARGS: >
 --set *.platform=linux/amd64

@@ -508,3 +528,24 @@ docker:
 name: docker_metadata_${CI_COMMIT_REF_NAME}
 paths:
 - metadata.json

+package:
+  stage: publish
+  needs:
+    - job: build_metadata
+      artifacts: true
+    - job: build_api
+      artifacts: true
+    - job: build_front
+      artifacts: true
+    - job: build_tauri
+      artifacts: true
+  rules:
+    - if: $CI_COMMIT_BRANCH =~ /(stable|develop)/
+
+  image: $CI_REGISTRY/funkwhale/ci/python:3.11
+  variables:
+    <<: *keep_git_files_permissions
+  script:
+    - make package
+    - scripts/ci-upload-packages.sh


@@ -25,6 +25,16 @@
 "branchConcurrentLimit": 0,
 "prConcurrentLimit": 0
 },
+{
+"matchBaseBranches": ["develop"],
+"matchUpdateTypes": ["major"],
+"prPriority": 2
+},
+{
+"matchBaseBranches": ["develop"],
+"matchUpdateTypes": ["minor"],
+"prPriority": 1
+},
 {
 "matchUpdateTypes": ["major", "minor"],
 "matchBaseBranches": ["stable"],

@@ -35,12 +45,6 @@
 "matchBaseBranches": ["stable"],
 "enabled": false
 },
-{
-"matchUpdateTypes": ["patch", "pin", "digest"],
-"matchBaseBranches": ["develop"],
-"automerge": true,
-"automergeType": "branch"
-},
 {
 "matchManagers": ["npm"],
 "addLabels": ["Area::Frontend"]


@@ -14,7 +14,7 @@ tasks:
 docker-compose up -d

 poetry env use python
-poetry install
+make install

 gp ports await 5432


@@ -6,6 +6,8 @@ RUN sudo apt update -y \

 RUN pyenv install 3.11 && pyenv global 3.11

-RUN pip install poetry pre-commit \
+RUN brew install neovim
+
+RUN pip install poetry pre-commit jinja2 towncrier \
 && poetry config virtualenvs.create true \
 && poetry config virtualenvs.in-project true


@@ -28,15 +28,16 @@ services:
 environment:
 - "NGINX_MAX_BODY_SIZE=100M"
-- "FUNKWHALE_API_IP=host.docker.internal"
+- "FUNKWHALE_API_HOST=host.docker.internal"
 - "FUNKWHALE_API_PORT=5000"
 - "FUNKWHALE_FRONT_IP=host.docker.internal"
 - "FUNKWHALE_FRONT_PORT=8080"
 - "FUNKWHALE_HOSTNAME=${FUNKWHALE_HOSTNAME-host.docker.internal}"
 - "FUNKWHALE_PROTOCOL=https"
 volumes:
-- ../data/media:/protected/media:ro
+- ../data/media:/workspace/funkwhale/data/media:ro
 - ../data/music:/music:ro
-- ../data/staticfiles:/staticfiles:ro
+- ../data/staticfiles:/usr/share/nginx/html/staticfiles/:ro
 - ../deploy/funkwhale_proxy.conf:/etc/nginx/funkwhale_proxy.conf:ro
 - ../docker/nginx/conf.dev:/etc/nginx/templates/default.conf.template:ro
 - ../front:/frontend:ro

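The `${FUNKWHALE_HOSTNAME-host.docker.internal}` value above uses shell-style default expansion, which Compose's variable interpolation mirrors: the variable's value if it is set, otherwise the literal after the `-`. A quick POSIX-shell demonstration (the hostname below is a placeholder):

```shell
# Unset: the fallback after `-` is substituted.
unset FUNKWHALE_HOSTNAME
echo "${FUNKWHALE_HOSTNAME-host.docker.internal}"   # prints host.docker.internal

# Set: the variable's own value wins over the fallback.
FUNKWHALE_HOSTNAME="music.example.org"
echo "${FUNKWHALE_HOSTNAME-host.docker.internal}"   # prints music.example.org
```

Note that `${VAR-default}` only falls back when the variable is unset; an empty-but-set variable expands to the empty string (use `${VAR:-default}` to also cover that case).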
@@ -53,7 +53,7 @@ repos:
 - id: isort

 - repo: https://github.com/pycqa/flake8
-rev: 6.0.0
+rev: 6.1.0
 hooks:
 - id: flake8

85 CHANGELOG.md

@@ -9,12 +9,13 @@ This changelog is viewable on the web at https://docs.funkwhale.audio/changelog.

 <!-- towncrier -->

-## 1.4.0-rc1 (2023-11-28)
+## 1.4.0 (2023-12-12)

 Upgrade instructions are available at https://docs.funkwhale.audio/administrator/upgrade/index.html

 Features:

 - Add a management command to generate dummy notifications for testing
 - Add atom1.0 to node info services (#2085)
 - Add basic cypress testing
 - Add NodeInfo 2.1 (#2085)

@@ -25,14 +26,14 @@ Features:
 - Cache radio queryset into redis. New radio track endpoint for api v2 is /api/v2/radios/sessions/{radiosessionid}/tracks (#2135)
 - Create a testing environment in production for ListenBrainz recommendation engine (troi-recommendation-playground) (#1861)
 - Generate all nginx configurations from one template
-- New management command to update Uploads which have been imported using --in-place and are now stored in s3 (#2156)
-- Add option to only allow MusicBrainz tagged file on a pod (#2083)
+- New management command to update Uploads which have been imported using --in-place and are now
+  stored in s3 (#2156)
+- Only allow MusicBrainz tagged file on a pod (#2083)
 - Prohibit the creation of new users using django's `createsuperuser` command in favor of our own CLI
   entry point. Run `funkwhale-manage fw users create --superuser` instead. (#1288)

 Enhancements:

 - Add a management command to generate dummy notifications for testing
 - Add custom logging functionality (#2155)
 - Adding typesense container and api client (2104)
 - Cache pip package in api docker builds (#2193)

@@ -50,9 +51,12 @@ Bugfixes:

 - `postgres > db_dump.sql` cannot be used if the postgres container is stopped. Update command.
 - Avoid troi radio to give duplicates (#2231)
+- Do not cache all requests to avoid missing updates #2258
+- Fix broken nginx templates for docker setup (#2252)
 - Fix help messages for running scripts using funkwhale-manage
 - Fix missing og meta tags (#2208)
 - Fix multiarch docker builds #2211
+- Fix regression that prevent static files from being served in non-docker-deployments (#2256)
 - Fixed an issue where the copy button didn't copy the Embed code in the embed modal.
 - Fixed an issue with the nginx templates that caused issues when connecting to websockets.
 - Fixed development docker setup (2102)

@@ -96,6 +100,79 @@ Other:

 Removal:

 - Drop support for python3.7
+- This release doesn't support Debian 10 anymore. If you are still on Debian 10, we recommend
+  updating to a later version. Alternatively, install a supported Python version (>= Python 3.8). Python 3.11 is recommended.
+
+Contributors to our Issues:
+
+- AMoonRabbit
+- Alexandra Parker
+- ChengChung
+- Ciarán Ainsworth
+- Georg Krause
+- Ghost User
+- Johann Queuniet
+- JuniorJPDJ
+- Kasper Seweryn
+- Kay Borowski
+- Marcos Peña
+- Mathieu Jourdan
+- Nicolas Derive
+- Virgile Robles
+- jooola
+- petitminion
+- theit8514
+
+Contributors to our Merge Requests:
+
+- AMoonRabbit
+- Alexander Dunkel
+- Alexander Torre
+- Ciarán Ainsworth
+- Georg Krause
+- JuniorJPDJ
+- Kasper Seweryn
+- Kay Borowski
+- Marcos Peña
+- Mathieu Jourdan
+- Philipp Wolfer
+- Virgile Robles
+- interfect
+- jooola
+- petitminion
+
+Committers:
+
+- Aitor
+- Alexander Dunkel
+- alextprog
+- Aznörth Niryn
+- Ciarán Ainsworth
+- dignny
+- drakonicguy
+- Fun.k.whale Trad
+- Georg krause
+- Georg Krause
+- Jérémie Lorente
+- jo
+- jooola
+- josé m
+- Julian-Samuel Gebühr
+- JuniorJPDJ
+- Kasper Seweryn
+- Marcos Peña
+- Mathieu Jourdan
+- Matteo Piovanelli
+- Matyáš Caras
+- MhP
+- omarmaciasmolina
+- petitminion
+- Philipp Wolfer
+- ppom
+- Quentin PAGÈS
+- rinenweb
+- Thomas
+- Transcriber allium

 ## 1.3.4 (2023-11-16)

38 Makefile

@@ -17,3 +17,41 @@ docker-build: docker-metadata

 build-metadata:
 	./scripts/build_metadata.py --format env | tee build_metadata.env

+BUILD_DIR = dist
+package:
+	rm -Rf $(BUILD_DIR)
+	mkdir -p $(BUILD_DIR)
+	tar --create --gunzip --file='$(BUILD_DIR)/funkwhale-api.tar.gz' \
+		--owner='root' \
+		--group='root' \
+		--exclude-vcs \
+		api/config \
+		api/funkwhale_api \
+		api/install_os_dependencies.sh \
+		api/manage.py \
+		api/poetry.lock \
+		api/pyproject.toml \
+		api/Readme.md
+
+	cd '$(BUILD_DIR)' && \
+		tar --extract --gunzip --file='funkwhale-api.tar.gz' && \
+		zip -q 'funkwhale-api.zip' -r api && \
+		rm -Rf api
+
+	tar --create --gunzip --file='$(BUILD_DIR)/funkwhale-front.tar.gz' \
+		--owner='root' \
+		--group='root' \
+		--exclude-vcs \
+		--transform='s/^front\/dist/front/' \
+		front/dist
+
+	cd '$(BUILD_DIR)' && \
+		tar --extract --gunzip --file='funkwhale-front.tar.gz' && \
+		zip -q 'funkwhale-front.zip' -r front && \
+		rm -Rf front
+
+	cd '$(BUILD_DIR)' && \
+		cp ../front/tauri/target/release/bundle/appimage/funkwhale_*.AppImage FunkwhaleDesktop.AppImage
+
+	cd '$(BUILD_DIR)' && sha256sum * > SHA256SUMS
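The `--transform='s/^front\/dist/front/'` flag in the `package` target rewrites each member name as it is stored, so the front bundle unpacks as `front/` rather than `front/dist/`. A self-contained sketch of that rename-on-archive behaviour with throwaway paths (requires GNU tar):

```shell
set -eu
workdir="$(mktemp -d)"
cd "$workdir"

# Mimic the build layout: compiled assets live under front/dist/.
mkdir -p front/dist
echo '<html></html>' > front/dist/index.html

# --transform applies a sed-style substitution to each stored member name.
tar --create --gunzip --file='funkwhale-front.tar.gz' \
    --transform='s/^front\/dist/front/' \
    front/dist

# Members are now listed under front/, with the dist/ level gone.
tar --list --file='funkwhale-front.tar.gz'
```

Because the rename happens at archive time, extraction needs no post-processing step; the archive simply unpacks into the layout the deployment expects.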

@@ -1,8 +1,4 @@
-FROM alpine:3.17 as requirements
-
-# We need this additional step to avoid having poetrys deps interacting with our
-# dependencies. This is only required until alpine 3.16 is released, since this
-# allows us to install poetry as package.
+FROM alpine:3.19 as requirements

 RUN set -eux; \
     apk add --no-cache \

@@ -16,7 +12,7 @@ RUN set -eux; \
     poetry export --without-hashes --extras typesense > requirements.txt; \
     poetry export --without-hashes --with dev > dev-requirements.txt;

-FROM alpine:3.17 as builder
+FROM alpine:3.19 as builder

 ENV PYTHONDONTWRITEBYTECODE=1
 ENV PYTHONUNBUFFERED=1

@@ -41,11 +37,11 @@ RUN set -eux; \
     openssl-dev \
     postgresql-dev \
     zlib-dev \
-    py3-cryptography=38.0.3-r1 \
+    py3-cryptography=41.0.7-r0 \
     py3-lxml=4.9.3-r1 \
-    py3-pillow=9.3.0-r0 \
-    py3-psycopg2=2.9.5-r0 \
-    py3-watchfiles=0.18.1-r0 \
+    py3-pillow=10.3.0-r0 \
+    py3-psycopg2=2.9.9-r0 \
+    py3-watchfiles=0.19.0-r1 \
     python3-dev

 # Create virtual env

@@ -65,11 +61,11 @@ RUN --mount=type=cache,target=~/.cache/pip; \
     # to install the deps using pip.
     grep -Ev 'cryptography|lxml|pillow|psycopg2|watchfiles' /requirements.txt \
     | pip3 install -r /dev/stdin \
-    cryptography==38.0.3 \
+    cryptography==41.0.7 \
     lxml==4.9.3 \
-    pillow==9.3.0 \
-    psycopg2==2.9.5 \
-    watchfiles==0.18.1
+    pillow==10.2.0 \
+    psycopg2==2.9.9 \
+    watchfiles==0.19.0

 ARG install_dev_deps=0
 RUN --mount=type=cache,target=~/.cache/pip; \

@@ -77,14 +73,14 @@ RUN --mount=type=cache,target=~/.cache/pip; \
     if [ "$install_dev_deps" = "1" ] ; then \
     grep -Ev 'cryptography|lxml|pillow|psycopg2|watchfiles' /dev-requirements.txt \
     | pip3 install -r /dev/stdin \
-    cryptography==38.0.3 \
+    cryptography==41.0.7 \
     lxml==4.9.3 \
-    pillow==9.3.0 \
-    psycopg2==2.9.5 \
-    watchfiles==0.18.1; \
+    pillow==10.2.0 \
+    psycopg2==2.9.9 \
+    watchfiles==0.19.0; \
     fi

-FROM alpine:3.17 as production
+FROM alpine:3.19 as production

 ENV PYTHONDONTWRITEBYTECODE=1
 ENV PYTHONUNBUFFERED=1

@@ -101,11 +97,11 @@ RUN set -eux; \
     libpq \
     libxml2 \
     libxslt \
-    py3-cryptography=38.0.3-r1 \
+    py3-cryptography=41.0.7-r0 \
     py3-lxml=4.9.3-r1 \
-    py3-pillow=9.3.0-r0 \
-    py3-psycopg2=2.9.5-r0 \
-    py3-watchfiles=0.18.1-r0 \
+    py3-pillow=10.3.0-r0 \
+    py3-psycopg2=2.9.9-r0 \
+    py3-watchfiles=0.19.0-r1 \
     python3 \
     tzdata

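The builder stage's `grep -Ev 'cryptography|lxml|pillow|psycopg2|watchfiles'` strips the heavy compiled packages out of the exported requirements so they can then be reinstalled at the exact versions matching Alpine's prebuilt packages. The filter itself is plain grep over a one-requirement-per-line file, shown here with a toy list (package names mirror the Dockerfile, the other versions are illustrative):

```shell
set -eu
reqs="$(mktemp)"

# A toy requirements list shaped like `poetry export` output.
printf '%s\n' \
    'django==4.2.8' \
    'cryptography==41.0.7' \
    'lxml==4.9.3' \
    'pillow==10.2.0' \
    'celery==5.3.6' > "$reqs"

# -E: extended regex, -v: invert the match, i.e. keep every line that
# does NOT name one of the packages pinned explicitly afterwards.
grep -Ev 'cryptography|lxml|pillow|psycopg2|watchfiles' "$reqs"
```

In the Dockerfile the filtered list is piped straight into `pip3 install -r /dev/stdin` together with the explicit pins, so pip resolves everything in a single invocation.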
@@ -4,7 +4,7 @@ CPU_CORES := $(shell N=$$(nproc); echo $$(( $$N > 4 ? 4 : $$N )))
 .PHONY: install lint

 install:
-	poetry install
+	poetry install --all-extras

 lint:
 	poetry run pylint \

@@ -303,6 +303,23 @@ LISTENING_CREATED = "listening_created"
 """
 Called when a track is being listened
 """
+LISTENING_SYNC = "listening_sync"
+"""
+Called by the task manager to trigger listening sync
+"""
+FAVORITE_CREATED = "favorite_created"
+"""
+Called when a track is being favorited
+"""
+FAVORITE_DELETED = "favorite_deleted"
+"""
+Called when a favorited track is being unfavorited
+"""
+FAVORITE_SYNC = "favorite_sync"
+"""
+Called by the task manager to trigger favorite sync
+"""

 SCAN = "scan"
 """

@@ -1,7 +1,7 @@
 from channels.auth import AuthMiddlewareStack
 from channels.routing import ProtocolTypeRouter, URLRouter
-from django.conf.urls import url
 from django.core.asgi import get_asgi_application
+from django.urls import re_path

 from funkwhale_api.instance import consumers

@@ -10,7 +10,12 @@ application = ProtocolTypeRouter(
     # Empty for now (http->django views is added by default)
     "websocket": AuthMiddlewareStack(
         URLRouter(
-            [url("^api/v1/activity$", consumers.InstanceActivityConsumer.as_asgi())]
+            [
+                re_path(
+                    "^api/v1/activity$",
+                    consumers.InstanceActivityConsumer.as_asgi(),
+                )
+            ]
         )
     ),
     "http": get_asgi_application(),


@@ -2,7 +2,7 @@ import logging.config
 import sys
 import warnings
 from collections import OrderedDict
-from urllib.parse import urlsplit
+from urllib.parse import urlparse, urlsplit

 import environ
 from celery.schedules import crontab

@@ -224,6 +224,13 @@ ALLOWED_HOSTS = env.list("DJANGO_ALLOWED_HOSTS", default=[]) + [FUNKWHALE_HOSTNA
 List of allowed hostnames for which the Funkwhale server will answer.
 """

+CSRF_TRUSTED_ORIGINS = [urlparse(o, FUNKWHALE_PROTOCOL).geturl() for o in ALLOWED_HOSTS]
+"""
+List of origins that are trusted for unsafe requests
+We simply consider all allowed hosts to be trusted origins
+See https://docs.djangoproject.com/en/4.2/ref/settings/#csrf-trusted-origins
+"""
+
 # APP CONFIGURATION
 # ------------------------------------------------------------------------------
 DJANGO_APPS = (

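The new `CSRF_TRUSTED_ORIGINS` line derives trusted origins from `ALLOWED_HOSTS` by handing each hostname to `urlparse` with the pod's protocol as the default scheme. The mechanism can be exercised without Django; the hostnames below are placeholders:

```python
from urllib.parse import urlparse

# Stand-ins for ALLOWED_HOSTS and FUNKWHALE_PROTOCOL.
allowed_hosts = ["music.example.org", "pod.example.net"]
protocol = "https"

# The second argument to urlparse is the scheme to assume when the
# value itself does not carry one, so bare hostnames pick up the
# pod's protocol.
trusted_origins = [urlparse(host, protocol).geturl() for host in allowed_hosts]
print(trusted_origins)
```

One subtlety worth knowing: a bare hostname without `//` is parsed as a path rather than a network location, so the round-tripped value carries the scheme but not a `//` separator.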
@@ -269,6 +276,7 @@ LOCAL_APPS = (
     # Your stuff: custom apps go here
     "funkwhale_api.instance",
     "funkwhale_api.audio",
+    "funkwhale_api.contrib.listenbrainz",
     "funkwhale_api.music",
     "funkwhale_api.requests",
     "funkwhale_api.favorites",

@@ -830,7 +838,7 @@ If you're using password auth (the extra slash is important)
 .. note::

     If you want to use Redis over unix sockets, you also need to update
-    :attr:`CELERY_BROKER_URL`, because the scheme differ from the one used by
+    :attr:`CELERY_BROKER_URL`, because the scheme differs from the one used by
     :attr:`CACHE_URL`.

 """

@@ -881,7 +889,7 @@ to use a different server or use Redis sockets to connect.

 Example:

 - ``redis://127.0.0.1:6379/0``
-- ``unix://127.0.0.1:6379/0``
+- ``redis+socket:///run/redis/redis.sock?virtual_host=0``

 """

@@ -942,13 +950,25 @@ CELERY_BEAT_SCHEDULE = {
         ),
         "options": {"expires": 60 * 60},
     },
-    "typesense.build_canonical_index": {
-        "task": "typesense.build_canonical_index",
-        "schedule": crontab(day_of_week="*/2", minute="0", hour="3"),
+    "listenbrainz.trigger_listening_sync_with_listenbrainz": {
+        "task": "listenbrainz.trigger_listening_sync_with_listenbrainz",
+        "schedule": crontab(day_of_week="*", minute="0", hour="3"),
+        "options": {"expires": 60 * 60 * 24},
+    },
+    "listenbrainz.trigger_favorite_sync_with_listenbrainz": {
+        "task": "listenbrainz.trigger_favorite_sync_with_listenbrainz",
+        "schedule": crontab(day_of_week="*", minute="0", hour="3"),
+        "options": {"expires": 60 * 60 * 24},
+    },
 }

+if env.str("TYPESENSE_API_KEY", default=None):
+    CELERY_BEAT_SCHEDULE["typesense.build_canonical_index"] = {
+        "task": "typesense.build_canonical_index",
+        "schedule": crontab(day_of_week="*/2", minute="0", hour="3"),
+        "options": {"expires": 60 * 60 * 24},
+    }
+
 if env.bool("ADD_ALBUM_TAGS_FROM_TRACKS", default=True):
     CELERY_BEAT_SCHEDULE["music.albums_set_tags_from_tracks"] = {
         "task": "music.albums_set_tags_from_tracks",

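The Typesense entry moves out of the static `CELERY_BEAT_SCHEDULE` dict into a guard on `TYPESENSE_API_KEY`, so the indexing task is only registered on pods that have the feature configured. The registration pattern, stripped of Celery specifics (task names mirror the settings, schedules simplified to plain strings):

```python
import os

# Always-on periodic tasks, in the shape of CELERY_BEAT_SCHEDULE.
beat_schedule = {
    "listenbrainz.trigger_listening_sync_with_listenbrainz": {
        "task": "listenbrainz.trigger_listening_sync_with_listenbrainz",
        "schedule": "daily at 03:00",
        "options": {"expires": 60 * 60 * 24},
    },
}

# Feature-gated task: only scheduled when the pod has a Typesense key,
# so unconfigured pods never enqueue work that would immediately fail.
if os.environ.get("TYPESENSE_API_KEY"):
    beat_schedule["typesense.build_canonical_index"] = {
        "task": "typesense.build_canonical_index",
        "schedule": "every other day at 03:00",
        "options": {"expires": 60 * 60 * 24},
    }

print(sorted(beat_schedule))
```

Because the dict is built at settings-import time, the guard costs nothing at runtime: the beat scheduler simply never sees the gated task.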
@@ -1193,7 +1213,7 @@ if BROWSABLE_API_ENABLED:
         "rest_framework.renderers.BrowsableAPIRenderer",
     )

-REST_AUTH_SERIALIZERS = {
+REST_AUTH = {
     "PASSWORD_RESET_SERIALIZER": "funkwhale_api.users.serializers.PasswordResetSerializer",  # noqa
     "PASSWORD_RESET_CONFIRM_SERIALIZER": "funkwhale_api.users.serializers.PasswordResetConfirmSerializer",  # noqa
 }

@@ -96,8 +96,6 @@ CELERY_TASK_ALWAYS_EAGER = False

 # Your local stuff: Below this line define 3rd party library settings

-CSRF_TRUSTED_ORIGINS = [o for o in ALLOWED_HOSTS]
-
 REST_FRAMEWORK["DEFAULT_SCHEMA_CLASS"] = "funkwhale_api.schema.CustomAutoSchema"
 SPECTACULAR_SETTINGS = {
     "TITLE": "Funkwhale API",

@@ -41,14 +41,6 @@ SECRET_KEY = env("DJANGO_SECRET_KEY")
 # SESSION_COOKIE_HTTPONLY = True
 # SECURE_SSL_REDIRECT = env.bool("DJANGO_SECURE_SSL_REDIRECT", default=True)

-# SITE CONFIGURATION
-# ------------------------------------------------------------------------------
-# Hosts/domain names that are valid for this site
-# See https://docs.djangoproject.com/en/1.6/ref/settings/#allowed-hosts
-CSRF_TRUSTED_ORIGINS = ALLOWED_HOSTS
-
-# END SITE CONFIGURATION
-
 # Static Assets
 # ------------------------
 STATICFILES_STORAGE = "django.contrib.staticfiles.storage.StaticFilesStorage"

@@ -1,7 +1,6 @@
 from django.conf import settings
-from django.conf.urls import url
 from django.conf.urls.static import static
-from django.urls import include, path
+from django.urls import include, path, re_path
 from django.views import defaults as default_views

 from config import plugins

@@ -10,34 +9,34 @@ from funkwhale_api.common import admin
 plugins_patterns = plugins.trigger_filter(plugins.URLS, [], enabled=True)

 api_patterns = [
-    url("v1/", include("config.urls.api")),
-    url("v2/", include("config.urls.api_v2")),
-    url("subsonic/", include("config.urls.subsonic")),
+    re_path("v1/", include("config.urls.api")),
+    re_path("v2/", include("config.urls.api_v2")),
+    re_path("subsonic/", include("config.urls.subsonic")),
 ]


 urlpatterns = [
     # Django Admin, use {% url 'admin:index' %}
-    url(settings.ADMIN_URL, admin.site.urls),
-    url(r"^api/", include((api_patterns, "api"), namespace="api")),
-    url(
+    re_path(settings.ADMIN_URL, admin.site.urls),
+    re_path(r"^api/", include((api_patterns, "api"), namespace="api")),
+    re_path(
         r"^",
         include(
             ("funkwhale_api.federation.urls", "federation"), namespace="federation"
         ),
     ),
-    url(r"^api/v1/auth/", include("funkwhale_api.users.rest_auth_urls")),
-    url(r"^accounts/", include("allauth.urls")),
+    re_path(r"^api/v1/auth/", include("funkwhale_api.users.rest_auth_urls")),
+    re_path(r"^accounts/", include("allauth.urls")),
 ] + plugins_patterns

 if settings.DEBUG:
     # This allows the error pages to be debugged during development, just visit
     # these url in browser to see how these error pages look like.
     urlpatterns += [
-        url(r"^400/$", default_views.bad_request),
-        url(r"^403/$", default_views.permission_denied),
-        url(r"^404/$", default_views.page_not_found),
-        url(r"^500/$", default_views.server_error),
+        re_path(r"^400/$", default_views.bad_request),
+        re_path(r"^403/$", default_views.permission_denied),
+        re_path(r"^404/$", default_views.page_not_found),
+        re_path(r"^500/$", default_views.server_error),
     ] + static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)

 if "debug_toolbar" in settings.INSTALLED_APPS:

@@ -49,5 +48,5 @@ if settings.DEBUG:

     if "silk" in settings.INSTALLED_APPS:
         urlpatterns = [
-            url(r"^api/silk/", include("silk.urls", namespace="silk"))
+            re_path(r"^api/silk/", include("silk.urls", namespace="silk"))
         ] + urlpatterns

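The change repeated throughout these hunks is mechanical: `django.conf.urls.url()` was a deprecated alias of `django.urls.re_path()` and was removed in Django 4.0, so every call site is renamed without touching the route regexes. A pure-stdlib sketch (illustrative paths, no Django required) of why the matching behaviour is unchanged:

```python
import re

# url(r"^api/", ...) and re_path(r"^api/", ...) both treat the route as a
# regular expression matched against the request path (leading "/" removed),
# so renaming url -> re_path changes nothing about which URLs resolve.
def matches(pattern: str, path: str) -> bool:
    return re.match(pattern, path.lstrip("/")) is not None

assert matches(r"^api/", "/api/v1/tracks")
assert not matches(r"^api/", "/accounts/api/")
assert matches(r"^400/$", "/400/")
```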
@@ -1,4 +1,5 @@
-from django.conf.urls import include, url
+from django.conf.urls import include
+from django.urls import re_path

 from funkwhale_api.activity import views as activity_views
 from funkwhale_api.audio import views as audio_views

@@ -28,61 +29,61 @@ router.register(r"attachments", common_views.AttachmentViewSet, "attachments")
 v1_patterns = router.urls

 v1_patterns += [
-    url(r"^oembed/$", views.OembedView.as_view(), name="oembed"),
-    url(
+    re_path(r"^oembed/$", views.OembedView.as_view(), name="oembed"),
+    re_path(
         r"^instance/",
         include(("funkwhale_api.instance.urls", "instance"), namespace="instance"),
     ),
-    url(
+    re_path(
         r"^manage/",
         include(("funkwhale_api.manage.urls", "manage"), namespace="manage"),
     ),
-    url(
+    re_path(
         r"^moderation/",
         include(
             ("funkwhale_api.moderation.urls", "moderation"), namespace="moderation"
         ),
     ),
-    url(
+    re_path(
         r"^federation/",
         include(
             ("funkwhale_api.federation.api_urls", "federation"), namespace="federation"
         ),
     ),
-    url(
+    re_path(
         r"^providers/",
         include(("funkwhale_api.providers.urls", "providers"), namespace="providers"),
     ),
-    url(
+    re_path(
         r"^favorites/",
         include(("funkwhale_api.favorites.urls", "favorites"), namespace="favorites"),
     ),
-    url(r"^search$", views.Search.as_view(), name="search"),
-    url(
+    re_path(r"^search$", views.Search.as_view(), name="search"),
+    re_path(
         r"^radios/",
         include(("funkwhale_api.radios.urls", "radios"), namespace="radios"),
     ),
-    url(
+    re_path(
         r"^history/",
         include(("funkwhale_api.history.urls", "history"), namespace="history"),
     ),
-    url(
+    re_path(
         r"^",
         include(("funkwhale_api.users.api_urls", "users"), namespace="users"),
     ),
     # XXX: remove if Funkwhale 1.1
-    url(
+    re_path(
         r"^users/",
         include(("funkwhale_api.users.api_urls", "users"), namespace="users-nested"),
     ),
-    url(
+    re_path(
         r"^oauth/",
         include(("funkwhale_api.users.oauth.urls", "oauth"), namespace="oauth"),
     ),
-    url(r"^rate-limit/?$", common_views.RateLimitView.as_view(), name="rate-limit"),
-    url(
+    re_path(r"^rate-limit/?$", common_views.RateLimitView.as_view(), name="rate-limit"),
+    re_path(
         r"^text-preview/?$", common_views.TextPreviewView.as_view(), name="text-preview"
     ),
 ]

-urlpatterns = [url("", include((v1_patterns, "v1"), namespace="v1"))]
+urlpatterns = [re_path("", include((v1_patterns, "v1"), namespace="v1"))]

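Several of the routes above end in `/?$` (for example `rate-limit/?$`), which makes the trailing slash optional, the same idea behind the project's `OptionalSlashRouter`. A quick illustration:

```python
import re

# "/?$" matches with or without a trailing slash, but rejects anything after it.
pattern = r"^rate-limit/?$"

assert re.match(pattern, "rate-limit") is not None
assert re.match(pattern, "rate-limit/") is not None
assert re.match(pattern, "rate-limit/extra") is None
```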
@@ -1,4 +1,5 @@
-from django.conf.urls import include, url
+from django.conf.urls import include
+from django.urls import re_path

 from funkwhale_api.common import routers as common_routers

@@ -6,14 +7,14 @@ router = common_routers.OptionalSlashRouter()
 v2_patterns = router.urls

 v2_patterns += [
-    url(
+    re_path(
         r"^instance/",
         include(("funkwhale_api.instance.urls_v2", "instance"), namespace="instance"),
     ),
-    url(
+    re_path(
         r"^radios/",
         include(("funkwhale_api.radios.urls_v2", "radios"), namespace="radios"),
     ),
 ]

-urlpatterns = [url("", include((v2_patterns, "v2"), namespace="v2"))]
+urlpatterns = [re_path("", include((v2_patterns, "v2"), namespace="v2"))]

@@ -1,4 +1,5 @@
-from django.conf.urls import include, url
+from django.conf.urls import include
+from django.urls import re_path
 from rest_framework import routers
 from rest_framework.urlpatterns import format_suffix_patterns

@@ -8,7 +9,9 @@ subsonic_router = routers.SimpleRouter(trailing_slash=False)
 subsonic_router.register(r"rest", SubsonicViewSet, basename="subsonic")

 subsonic_patterns = format_suffix_patterns(subsonic_router.urls, allowed=["view"])
-urlpatterns = [url("", include((subsonic_patterns, "subsonic"), namespace="subsonic"))]
+urlpatterns = [
+    re_path("", include((subsonic_patterns, "subsonic"), namespace="subsonic"))
+]

 # urlpatterns = [
 #     url(

@@ -48,4 +48,5 @@ def get_activity(user, limit=20):
         ),
     ]
     records = combined_recent(limit=limit, querysets=querysets)
+
     return [r["object"] for r in records]

@@ -1,6 +1,6 @@
-from allauth.account.utils import send_email_confirmation
+from allauth.account.models import EmailAddress
 from django.core.cache import cache
-from django.utils.translation import ugettext as _
+from django.utils.translation import gettext as _
 from oauth2_provider.contrib.rest_framework.authentication import (
     OAuth2Authentication as BaseOAuth2Authentication,
 )

@@ -20,9 +20,13 @@ def resend_confirmation_email(request, user):
     if cache.get(cache_key):
         return False

-    done = send_email_confirmation(request, user)
+    # We send the confirmation by hand because we don't want to pass the request
+    # down to the email rendering, which would raise another UnverifiedEmail
+    # exception and restart the sending again and again
+    email = EmailAddress.objects.get_for_user(user, user.email)
+    email.send_confirmation()
     cache.set(cache_key, True, THROTTLE_DELAY)
-    return done
+    return True


 class OAuth2Authentication(BaseOAuth2Authentication):

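The hunk above keeps the cache-based throttle around the resend: a cache key suppresses further sends for `THROTTLE_DELAY` seconds, and the function now always reports success. A minimal sketch of that throttling pattern, using an in-memory stand-in for `django.core.cache` (class name and delay value are illustrative):

```python
import time

THROTTLE_DELAY = 600  # illustrative value; the real one lives in the module

class FakeCache:
    """In-memory stand-in for django.core.cache.cache."""

    def __init__(self):
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry and entry[1] > time.time():
            return entry[0]
        return None

    def set(self, key, value, timeout):
        self._store[key] = (value, time.time() + timeout)

cache = FakeCache()

def resend_confirmation_email(user_pk, send):
    cache_key = f"confirmation-email-{user_pk}"
    if cache.get(cache_key):
        return False  # throttled: a send already happened recently
    send()  # stand-in for EmailAddress.send_confirmation()
    cache.set(cache_key, True, THROTTLE_DELAY)
    return True

sent = []
assert resend_confirmation_email(1, lambda: sent.append("mail")) is True
assert resend_confirmation_email(1, lambda: sent.append("mail")) is False
assert sent == ["mail"]
```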
@@ -10,7 +10,7 @@ class Command(BaseCommand):

         self.help = "Helper to generate randomized testdata"
         self.type_choices = {"notifications": self.handle_notifications}
-        self.missing_args_message = f"Please specify one of the following sub-commands: { *self.type_choices.keys(), }"
+        self.missing_args_message = f"Please specify one of the following sub-commands: {*self.type_choices.keys(), }"

     def add_arguments(self, parser):
         subparsers = parser.add_subparsers(dest="subcommand")

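The one-character change above removes a stray space inside the f-string braces; `{*d.keys(),}` unpacks the dict keys into a tuple that is rendered into the message. For example:

```python
type_choices = {"notifications": None}

# *keys(), builds a tuple inside the f-string replacement field
message = f"Please specify one of the following sub-commands: {*type_choices.keys(),}"

assert message.endswith("('notifications',)")
```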
@@ -60,12 +60,12 @@ class NullsLastSQLCompiler(SQLCompiler):
 class NullsLastQuery(models.sql.query.Query):
     """Use a custom compiler to inject 'NULLS LAST' (for PostgreSQL)."""

-    def get_compiler(self, using=None, connection=None):
+    def get_compiler(self, using=None, connection=None, elide_empty=True):
         if using is None and connection is None:
             raise ValueError("Need either using or connection")
         if using:
             connection = connections[using]
-        return NullsLastSQLCompiler(self, connection, using)
+        return NullsLastSQLCompiler(self, connection, using, elide_empty)


 class NullsLastQuerySet(models.QuerySet):

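For context, the compiler above appends `NULLS LAST` so PostgreSQL sorts null values after non-null ones; the signature change just forwards the `elide_empty` argument that `get_compiler` gained in Django 4.0. The ordering rule itself, illustrated in plain Python:

```python
# Sort non-null values first (ascending), nulls last -- the effect of the
# injected "NULLS LAST" clause.
rows = [3, None, 1, None, 2]
ordered = sorted(rows, key=lambda v: (v is None, v if v is not None else 0))

assert ordered == [1, 2, 3, None, None]
```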
@@ -2,7 +2,7 @@ import json

 from django import forms
 from django.conf import settings
-from django.contrib.postgres.forms import JSONField
+from django.forms import JSONField
 from dynamic_preferences import serializers, types
 from dynamic_preferences.registries import global_preferences_registry

@@ -93,7 +93,6 @@ class SerializedPreference(types.BasePreferenceType):
     serializer
     """

     serializer = JSONSerializer
     data_serializer_class = None
     field_class = JSONField
     widget = forms.Textarea

@@ -5,8 +5,8 @@ import os
 import PIL
 from django.core.exceptions import ObjectDoesNotExist
 from django.core.files.uploadedfile import SimpleUploadedFile
-from django.utils.encoding import smart_text
-from django.utils.translation import ugettext_lazy as _
+from django.utils.encoding import smart_str
+from django.utils.translation import gettext_lazy as _
 from drf_spectacular.types import OpenApiTypes
 from drf_spectacular.utils import extend_schema_field
 from rest_framework import serializers

@@ -52,7 +52,7 @@ class RelatedField(serializers.RelatedField):
             self.fail(
                 "does_not_exist",
                 related_field_name=self.related_field_name,
-                value=smart_text(data),
+                value=smart_str(data),
             )
         except (TypeError, ValueError):
             self.fail("invalid")

@@ -1,6 +1,6 @@
 import django.dispatch

-mutation_created = django.dispatch.Signal(providing_args=["mutation"])
-mutation_updated = django.dispatch.Signal(
-    providing_args=["mutation", "old_is_approved", "new_is_approved"]
-)
+""" Required args: mutation """
+mutation_created = django.dispatch.Signal()
+""" Required args: mutation, old_is_approved, new_is_approved """
+mutation_updated = django.dispatch.Signal()

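`providing_args` was purely documentary and Django 4.0 removed it, which is why the hunk moves that information into docstrings. A minimal observer-pattern sketch of what such a signal does (a simplified stand-in, not Django's implementation):

```python
class Signal:
    """Receivers are called with whatever keyword args the sender provides;
    the 'required args' are documented, not enforced."""

    def __init__(self):
        self._receivers = []

    def connect(self, receiver):
        self._receivers.append(receiver)

    def send(self, sender, **named):
        return [receiver(sender=sender, **named) for receiver in self._receivers]

mutation_created = Signal()  # Required args: mutation

seen = []
mutation_created.connect(lambda sender, mutation: seen.append(mutation))
mutation_created.send(sender=None, mutation="m1")

assert seen == ["m1"]
```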
@@ -6,7 +6,7 @@ from django.core.exceptions import ValidationError
 from django.core.files.images import get_image_dimensions
 from django.template.defaultfilters import filesizeformat
 from django.utils.deconstruct import deconstructible
-from django.utils.translation import ugettext_lazy as _
+from django.utils.translation import gettext_lazy as _


 @deconstructible

@@ -1,168 +0,0 @@
-# Copyright (c) 2018 Philipp Wolfer <ph.wolfer@gmail.com>
-#
-# Permission is hereby granted, free of charge, to any person obtaining
-# a copy of this software and associated documentation files (the
-# "Software"), to deal in the Software without restriction, including
-# without limitation the rights to use, copy, modify, merge, publish,
-# distribute, sublicense, and/or sell copies of the Software, and to
-# permit persons to whom the Software is furnished to do so, subject to
-# the following conditions:
-#
-# The above copyright notice and this permission notice shall be
-# included in all copies or substantial portions of the Software.
-#
-# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
-# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
-# LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
-# OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
-# WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-import json
-import logging
-import ssl
-import time
-from http.client import HTTPSConnection
-
-HOST_NAME = "api.listenbrainz.org"
-PATH_SUBMIT = "/1/submit-listens"
-SSL_CONTEXT = ssl.create_default_context()
-
-
-class Track:
-    """
-    Represents a single track to submit.
-
-    See https://listenbrainz.readthedocs.io/en/latest/dev/json.html
-    """
-
-    def __init__(self, artist_name, track_name, release_name=None, additional_info={}):
-        """
-        Create a new Track instance
-        @param artist_name as str
-        @param track_name as str
-        @param release_name as str
-        @param additional_info as dict
-        """
-        self.artist_name = artist_name
-        self.track_name = track_name
-        self.release_name = release_name
-        self.additional_info = additional_info
-
-    @staticmethod
-    def from_dict(data):
-        return Track(
-            data["artist_name"],
-            data["track_name"],
-            data.get("release_name", None),
-            data.get("additional_info", {}),
-        )
-
-    def to_dict(self):
-        return {
-            "artist_name": self.artist_name,
-            "track_name": self.track_name,
-            "release_name": self.release_name,
-            "additional_info": self.additional_info,
-        }
-
-    def __repr__(self):
-        return f"Track({self.artist_name}, {self.track_name})"
-
-
-class ListenBrainzClient:
-    """
-    Submit listens to ListenBrainz.org.
-
-    See https://listenbrainz.readthedocs.io/en/latest/dev/api.html
-    """
-
-    def __init__(self, user_token, logger=logging.getLogger(__name__)):
-        self.__next_request_time = 0
-        self.user_token = user_token
-        self.logger = logger
-
-    def listen(self, listened_at, track):
-        """
-        Submit a listen for a track
-        @param listened_at as int
-        @param entry as Track
-        """
-        payload = _get_payload(track, listened_at)
-        return self._submit("single", [payload])
-
-    def playing_now(self, track):
-        """
-        Submit a playing now notification for a track
-        @param track as Track
-        """
-        payload = _get_payload(track)
-        return self._submit("playing_now", [payload])
-
-    def import_tracks(self, tracks):
-        """
-        Import a list of tracks as (listened_at, Track) pairs
-        @param track as [(int, Track)]
-        """
-        payload = _get_payload_many(tracks)
-        return self._submit("import", payload)
-
-    def _submit(self, listen_type, payload, retry=0):
-        self._wait_for_ratelimit()
-        self.logger.debug("ListenBrainz %s: %r", listen_type, payload)
-        data = {"listen_type": listen_type, "payload": payload}
-        headers = {
-            "Authorization": "Token %s" % self.user_token,
-            "Content-Type": "application/json",
-        }
-        body = json.dumps(data)
-        conn = HTTPSConnection(HOST_NAME, context=SSL_CONTEXT)
-        conn.request("POST", PATH_SUBMIT, body, headers)
-        response = conn.getresponse()
-        response_text = response.read()
-        try:
-            response_data = json.loads(response_text)
-        except json.decoder.JSONDecodeError:
-            response_data = response_text
-
-        self._handle_ratelimit(response)
-        log_msg = f"Response {response.status}: {response_data!r}"
-        if response.status == 429 and retry < 5:  # Too Many Requests
-            self.logger.warning(log_msg)
-            return self._submit(listen_type, payload, retry + 1)
-        elif response.status == 200:
-            self.logger.debug(log_msg)
-        else:
-            self.logger.error(log_msg)
-        return response
-
-    def _wait_for_ratelimit(self):
-        now = time.time()
-        if self.__next_request_time > now:
-            delay = self.__next_request_time - now
-            self.logger.debug("Rate limit applies, delay %d", delay)
-            time.sleep(delay)
-
-    def _handle_ratelimit(self, response):
-        remaining = int(response.getheader("X-RateLimit-Remaining", 0))
-        reset_in = int(response.getheader("X-RateLimit-Reset-In", 0))
-        self.logger.debug("X-RateLimit-Remaining: %i", remaining)
-        self.logger.debug("X-RateLimit-Reset-In: %i", reset_in)
-        if remaining == 0:
-            self.__next_request_time = time.time() + reset_in
-
-
-def _get_payload_many(tracks):
-    payload = []
-    for listened_at, track in tracks:
-        data = _get_payload(track, listened_at)
-        payload.append(data)
-    return payload
-
-
-def _get_payload(track, listened_at=None):
-    data = {"track_metadata": track.to_dict()}
-    if listened_at is not None:
-        data["listened_at"] = listened_at
-    return data

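The deleted client's rate-limit handling is worth noting because any replacement has to solve the same problem: when `X-RateLimit-Remaining` hits 0, the next request must wait `X-RateLimit-Reset-In` seconds. A sketch of that bookkeeping (hypothetical class name; times are passed explicitly for testability):

```python
class RateLimitTracker:
    """Tracks the earliest time the next request may be sent."""

    def __init__(self):
        self.next_request_time = 0.0

    def handle_response(self, remaining: int, reset_in: int, now: float) -> None:
        # Mirrors _handle_ratelimit above: only back off when the quota is gone.
        if remaining == 0:
            self.next_request_time = now + reset_in

    def delay(self, now: float) -> float:
        # Mirrors _wait_for_ratelimit: how long to sleep before the next call.
        return max(0.0, self.next_request_time - now)

tracker = RateLimitTracker()
tracker.handle_response(remaining=5, reset_in=10, now=100.0)
assert tracker.delay(now=100.0) == 0.0
tracker.handle_response(remaining=0, reset_in=10, now=100.0)
assert tracker.delay(now=104.0) == 6.0
```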
@@ -1,27 +1,31 @@
+import liblistenbrainz
+
 import funkwhale_api
 from config import plugins
+from funkwhale_api.favorites import models as favorites_models
+from funkwhale_api.history import models as history_models

-from .client import ListenBrainzClient, Track
+from . import tasks
 from .funkwhale_startup import PLUGIN


 @plugins.register_hook(plugins.LISTENING_CREATED, PLUGIN)
 def submit_listen(listening, conf, **kwargs):
     user_token = conf["user_token"]
-    if not user_token:
+    if not user_token and not conf["submit_listenings"]:
         return

     logger = PLUGIN["logger"]
     logger.info("Submitting listen to ListenBrainz")
-    client = ListenBrainzClient(user_token=user_token, logger=logger)
-    track = get_track(listening.track)
-    client.listen(int(listening.creation_date.timestamp()), track)
+    client = liblistenbrainz.ListenBrainz()
+    client.set_auth_token(user_token)
+    listen = get_lb_listen(listening)
+
+    client.submit_single_listen(listen)


-def get_track(track):
-    artist = track.artist.name
-    title = track.title
-    album = None
+def get_lb_listen(listening):
+    track = listening.track
     additional_info = {
         "media_player": "Funkwhale",
         "media_player_version": funkwhale_api.__version__,

@@ -36,7 +40,7 @@ def get_track(track):

     if track.album:
         if track.album.title:
-            album = track.album.title
+            release_name = track.album.title
         if track.album.mbid:
             additional_info["release_mbid"] = str(track.album.mbid)

@@ -47,4 +51,86 @@ def get_track(track):
     if upload:
         additional_info["duration"] = upload.duration

-    return Track(artist, title, album, additional_info)
+    return liblistenbrainz.Listen(
+        track_name=track.title,
+        artist_name=track.artist.name,
+        listened_at=listening.creation_date.timestamp(),
+        release_name=release_name,
+        additional_info=additional_info,
+    )
+
+
+@plugins.register_hook(plugins.FAVORITE_CREATED, PLUGIN)
+def submit_favorite_creation(track_favorite, conf, **kwargs):
+    user_token = conf["user_token"]
+    if not user_token or not conf["submit_favorites"]:
+        return
+    logger = PLUGIN["logger"]
+    logger.info("Submitting favorite to ListenBrainz")
+    client = liblistenbrainz.ListenBrainz()
+    track = track_favorite.track
+    if not track.mbid:
+        logger.warning(
+            "This tracks doesn't have a mbid. Feedback will not be submitted to Listenbrainz"
+        )
+        return
+    client.submit_user_feedback(1, track.mbid)
+
+
+@plugins.register_hook(plugins.FAVORITE_DELETED, PLUGIN)
+def submit_favorite_deletion(track_favorite, conf, **kwargs):
+    user_token = conf["user_token"]
+    if not user_token or not conf["submit_favorites"]:
+        return
+    logger = PLUGIN["logger"]
+    logger.info("Submitting favorite deletion to ListenBrainz")
+    client = liblistenbrainz.ListenBrainz()
+    track = track_favorite.track
+    if not track.mbid:
+        logger.warning(
+            "This tracks doesn't have a mbid. Feedback will not be submitted to Listenbrainz"
+        )
+        return
+    client.submit_user_feedback(0, track.mbid)
+
+
+@plugins.register_hook(plugins.LISTENING_SYNC, PLUGIN)
+def sync_listenings_from_listenbrainz(user, conf):
+    user_name = conf["user_name"]
+
+    if not user_name or not conf["sync_listenings"]:
+        return
+    logger = PLUGIN["logger"]
+    logger.info("Getting listenings from ListenBrainz")
+    try:
+        last_ts = (
+            history_models.Listening.objects.filter(user=user)
+            .filter(source="Listenbrainz")
+            .latest("creation_date")
+            .values_list("creation_date", flat=True)
+        ).timestamp()
+    except funkwhale_api.history.models.Listening.DoesNotExist:
+        tasks.import_listenbrainz_listenings(user, user_name, 0)
+        return
+
+    tasks.import_listenbrainz_listenings(user, user_name, last_ts)
+
+
+@plugins.register_hook(plugins.FAVORITE_SYNC, PLUGIN)
+def sync_favorites_from_listenbrainz(user, conf):
+    user_name = conf["user_name"]
+
+    if not user_name or not conf["sync_favorites"]:
+        return
+    try:
+        last_ts = (
+            favorites_models.TrackFavorite.objects.filter(user=user)
+            .filter(source="Listenbrainz")
+            .latest("creation_date")
+            .creation_date.timestamp()
+        )
+    except favorites_models.TrackFavorite.DoesNotExist:
+        tasks.import_listenbrainz_favorites(user, user_name, 0)
+        return
+
+    tasks.import_listenbrainz_favorites(user, user_name, last_ts)

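For reference, the `liblistenbrainz.Listen` built in `get_lb_listen` above corresponds to the submit-listens JSON documented by ListenBrainz. A plain-dict sketch of that payload shape (helper name and values are illustrative):

```python
def build_listen_payload(track_name, artist_name, listened_at, release_name, version):
    # Mirrors the fields get_lb_listen passes to liblistenbrainz.Listen,
    # arranged in the submit-listens JSON layout.
    return {
        "listened_at": listened_at,
        "track_metadata": {
            "track_name": track_name,
            "artist_name": artist_name,
            "release_name": release_name,
            "additional_info": {
                "media_player": "Funkwhale",
                "media_player_version": version,
            },
        },
    }

payload = build_listen_payload("Song", "Artist", 1700000000, "Album", "1.4.0")
assert payload["track_metadata"]["additional_info"]["media_player"] == "Funkwhale"
assert payload["listened_at"] == 1700000000
```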
@@ -3,7 +3,7 @@ from config import plugins
 PLUGIN = plugins.get_plugin_config(
     name="listenbrainz",
     label="ListenBrainz",
-    description="A plugin that allows you to submit your listens to ListenBrainz.",
+    description="A plugin that allows you to submit or sync your listens and favorites to ListenBrainz.",
     homepage="https://docs.funkwhale.audio/users/builtinplugins.html#listenbrainz-plugin",  # noqa
     version="0.3",
     user=True,

@@ -13,6 +13,45 @@ PLUGIN = plugins.get_plugin_config(
             "type": "text",
             "label": "Your ListenBrainz user token",
             "help": "You can find your user token in your ListenBrainz profile at https://listenbrainz.org/profile/",
-        }
+        },
+        {
+            "name": "user_name",
+            "type": "text",
+            "required": False,
+            "label": "Your ListenBrainz user name.",
+            "help": "Required for importing listenings and favorites with ListenBrainz \
+                but not to send activities",
+        },
+        {
+            "name": "submit_listenings",
+            "type": "boolean",
+            "default": True,
+            "label": "Enable listening submission to ListenBrainz",
+            "help": "If enabled, your listenings from Funkwhale will be imported into ListenBrainz.",
+        },
+        {
+            "name": "sync_listenings",
+            "type": "boolean",
+            "default": False,
+            "label": "Enable listenings sync",
+            "help": "If enabled, your listening from ListenBrainz will be imported into Funkwhale. This means they \
+                will be used along with Funkwhale listenings to filter out recently listened content or \
+                generate recommendations",
+        },
+        {
+            "name": "sync_favorites",
+            "type": "boolean",
+            "default": False,
+            "label": "Enable favorite sync",
+            "help": "If enabled, your favorites from ListenBrainz will be imported into Funkwhale. This means they \
+                will be used along with Funkwhale favorites (UI display, federation activity)",
+        },
+        {
+            "name": "submit_favorites",
+            "type": "boolean",
+            "default": False,
+            "label": "Enable favorite submission to ListenBrainz services",
+            "help": "If enabled, your favorites from Funkwhale will be submitted to ListenBrainz",
+        },
     ],
 )

@@ -0,0 +1,165 @@
+import datetime
+
+import liblistenbrainz
+from django.utils import timezone
+
+from config import plugins
+from funkwhale_api.favorites import models as favorites_models
+from funkwhale_api.history import models as history_models
+from funkwhale_api.music import models as music_models
+from funkwhale_api.taskapp import celery
+from funkwhale_api.users import models
+
+from .funkwhale_startup import PLUGIN
+
+
+@celery.app.task(name="listenbrainz.trigger_listening_sync_with_listenbrainz")
+def trigger_listening_sync_with_listenbrainz():
+    now = timezone.now()
+    active_month = now - datetime.timedelta(days=30)
+    users = (
+        models.User.objects.filter(plugins__code="listenbrainz")
+        .filter(plugins__conf__sync_listenings=True)
+        .filter(last_activity__gte=active_month)
+    )
+    for user in users:
+        plugins.trigger_hook(
+            plugins.LISTENING_SYNC,
+            user=user,
+            confs=plugins.get_confs(user),
+        )
+
+
+@celery.app.task(name="listenbrainz.trigger_favorite_sync_with_listenbrainz")
+def trigger_favorite_sync_with_listenbrainz():
+    now = timezone.now()
+    active_month = now - datetime.timedelta(days=30)
+    users = (
+        models.User.objects.filter(plugins__code="listenbrainz")
+        .filter(plugins__conf__sync_listenings=True)
+        .filter(last_activity__gte=active_month)
+    )
+    for user in users:
+        plugins.trigger_hook(
+            plugins.FAVORITE_SYNC,
+            user=user,
+            confs=plugins.get_confs(user),
+        )
+
+
+@celery.app.task(name="listenbrainz.import_listenbrainz_listenings")
+def import_listenbrainz_listenings(user, user_name, since):
+    client = liblistenbrainz.ListenBrainz()
+    response = client.get_listens(username=user_name, min_ts=since, count=100)
+    listens = response["payload"]["listens"]
+    while listens:
+        add_lb_listenings_to_db(listens, user)
+        new_ts = max(
+            listens,
+            key=lambda obj: datetime.datetime.fromtimestamp(
+                obj.listened_at, timezone.utc
+            ),
+        )
+        response = client.get_listens(username=user_name, min_ts=new_ts, count=100)
+        listens = response["payload"]["listens"]
+
+
+def add_lb_listenings_to_db(listens, user):
+    logger = PLUGIN["logger"]
+    fw_listens = []
+    for listen in listens:
+        if (
+            listen.additional_info.get("submission_client")
+            and listen.additional_info.get("submission_client")
+            == "Funkwhale ListenBrainz plugin"
+            and history_models.Listening.objects.filter(
+                creation_date=datetime.datetime.fromtimestamp(
+                    listen.listened_at, timezone.utc
+                )
+            ).exists()
+        ):
+            logger.info(
+                f"Listen with ts {listen.listened_at} skipped because already in db"
+            )
+            continue
+
+        mbid = (
+            listen.mbid_mapping
+            if hasattr(listen, "mbid_mapping")
+            else listen.recording_mbid
+        )
+
+        if not mbid:
+            logger.info("Received listening that doesn't have a mbid. Skipping...")
+
+        try:
+            track = music_models.Track.objects.get(mbid=mbid)
+        except music_models.Track.DoesNotExist:
+            logger.info(
+                "Received listening that doesn't exist in fw database. Skipping..."
+            )
+            continue
+
+        user = user
+        fw_listen = history_models.Listening(
+            creation_date=datetime.datetime.fromtimestamp(
+                listen.listened_at, timezone.utc
+            ),
+            track=track,
+            user=user,
+            source="Listenbrainz",
+        )
+        fw_listens.append(fw_listen)
+
+    history_models.Listening.objects.bulk_create(fw_listens)
+
+
+@celery.app.task(name="listenbrainz.import_listenbrainz_favorites")
+def import_listenbrainz_favorites(user, user_name, since):
+    client = liblistenbrainz.ListenBrainz()
+    response = client.get_user_feedback(username=user_name)
+    offset = 0
+    while response["feedback"]:
+        count = response["count"]
+        offset = offset + count
+        last_sync = min(
+            response["feedback"],
+            key=lambda obj: datetime.datetime.fromtimestamp(
+                obj["created"], timezone.utc
+            ),
+        )["created"]
+        add_lb_feedback_to_db(response["feedback"], user)
+        if last_sync <= since or count == 0:
+            return
+        response = client.get_user_feedback(username=user_name, offset=offset)
+
+
+def add_lb_feedback_to_db(feedbacks, user):
+    logger = PLUGIN["logger"]
+    for feedback in feedbacks:
+        try:
+            track = music_models.Track.objects.get(mbid=feedback["recording_mbid"])
+        except music_models.Track.DoesNotExist:
+            logger.info(
+                "Received feedback track that doesn't exist in fw database. Skipping..."
+            )
+            continue
+
+        if feedback["score"] == 1:
+            favorites_models.TrackFavorite.objects.get_or_create(
+                user=user,
+                creation_date=datetime.datetime.fromtimestamp(
+                    feedback["created"], timezone.utc
+                ),
+                track=track,
+                source="Listenbrainz",
+            )
+        elif feedback["score"] == 0:
+            try:
+                favorites_models.TrackFavorite.objects.get(
+                    user=user, track=track
+                ).delete()
+            except favorites_models.TrackFavorite.DoesNotExist:
+                continue
+        elif feedback["score"] == -1:
+            logger.info("Funkwhale doesn't support disliked tracks")

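The loop in `import_listenbrainz_favorites` above pages through feedback and stops once the oldest entry on a page is no newer than the last synced timestamp, or the page is empty. The stop condition in isolation:

```python
def should_stop(feedback_page, since, count):
    # Oldest "created" timestamp on this page; stop when we've reached data
    # we already synced, or when the page reported no items.
    last_sync = min(item["created"] for item in feedback_page)
    return last_sync <= since or count == 0

page = [{"created": 300}, {"created": 150}, {"created": 450}]
assert should_stop(page, since=200, count=3) is True
assert should_stop(page, since=100, count=3) is False
```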
@@ -0,0 +1,18 @@
+# Generated by Django 3.2.20 on 2023-12-09 14:25
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('favorites', '0001_initial'),
+    ]
+
+    operations = [
+        migrations.AddField(
+            model_name='trackfavorite',
+            name='source',
+            field=models.CharField(blank=True, max_length=100, null=True),
+        ),
+    ]

@@ -12,6 +12,7 @@ class TrackFavorite(models.Model):
     track = models.ForeignKey(
         Track, related_name="track_favorites", on_delete=models.CASCADE
     )
+    source = models.CharField(max_length=100, null=True, blank=True)

     class Meta:
         unique_together = ("track", "user")

@@ -4,6 +4,7 @@ from rest_framework import mixins, status, viewsets
 from rest_framework.decorators import action
 from rest_framework.response import Response

+from config import plugins
 from funkwhale_api.activity import record
 from funkwhale_api.common import fields, permissions
 from funkwhale_api.music import utils as music_utils

@@ -44,6 +45,11 @@ class TrackFavoriteViewSet(
         instance = self.perform_create(serializer)
         serializer = self.get_serializer(instance=instance)
         headers = self.get_success_headers(serializer.data)
+        plugins.trigger_hook(
+            plugins.FAVORITE_CREATED,
+            track_favorite=serializer.instance,
+            confs=plugins.get_confs(self.request.user),
+        )
         record.send(instance)
         return Response(
             serializer.data, status=status.HTTP_201_CREATED, headers=headers

@@ -76,6 +82,11 @@ class TrackFavoriteViewSet(
         except (AttributeError, ValueError, models.TrackFavorite.DoesNotExist):
             return Response({}, status=400)
         favorite.delete()
+        plugins.trigger_hook(
+            plugins.FAVORITE_DELETED,
+            track_favorite=favorite,
+            confs=plugins.get_confs(self.request.user),
+        )
         return Response([], status=status.HTTP_204_NO_CONTENT)

     @extend_schema(

@@ -1,4 +1,5 @@
-from django.conf.urls import include, url
+from django.conf.urls import include
+from django.urls import re_path
 from rest_framework import routers
 
 from . import views

@@ -23,6 +24,8 @@ music_router.register(r"tracks", views.MusicTrackViewSet, "tracks")
 index_router.register(r"index", views.IndexViewSet, "index")
 
 urlpatterns = router.urls + [
-    url("federation/music/", include((music_router.urls, "music"), namespace="music")),
-    url("federation/", include((index_router.urls, "index"), namespace="index")),
+    re_path(
+        "federation/music/", include((music_router.urls, "music"), namespace="music")
+    ),
+    re_path("federation/", include((index_router.urls, "index"), namespace="index")),
 ]

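The URL hunks throughout this compare swap `django.conf.urls.url()` for `django.urls.re_path()`. `url()` had been a deprecated alias for `re_path()` since Django 2.0 and was removed in Django 4.0, so the change is mechanical and the regex semantics are unchanged. A Django-free sketch (stdlib `re` only; the `LOGIN` pattern and `matches` helper are illustrative, not from the codebase) of the optional-trailing-slash routes these files use:

```python
import re

# Same style of regex later passed to re_path(); the "/?$" suffix makes the
# trailing slash optional, matching OptionalSlashRouter-style behaviour.
LOGIN = re.compile(r"^users/login/?$")


def matches(pattern, path):
    """True when the path would be routed by the given URL regex."""
    return pattern.match(path) is not None
```

Because `url()` simply forwarded to `re_path()`, none of the route regexes themselves needed rewriting during this migration.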
@@ -0,0 +1,18 @@
+# Generated by Django 3.2.20 on 2023-12-09 14:23
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('history', '0002_auto_20180325_1433'),
+    ]
+
+    operations = [
+        migrations.AddField(
+            model_name='listening',
+            name='source',
+            field=models.CharField(blank=True, max_length=100, null=True),
+        ),
+    ]

@@ -17,6 +17,7 @@ class Listening(models.Model):
         on_delete=models.CASCADE,
     )
     session_key = models.CharField(max_length=100, null=True, blank=True)
+    source = models.CharField(max_length=100, null=True, blank=True)
 
     class Meta:
         ordering = ("-creation_date",)

@@ -1,4 +1,4 @@
-from django.conf.urls import url
+from django.urls import re_path
 
 from funkwhale_api.common import routers
 

@@ -8,7 +8,7 @@ admin_router = routers.OptionalSlashRouter()
 admin_router.register(r"admin/settings", views.AdminSettings, "admin-settings")
 
 urlpatterns = [
-    url(r"^nodeinfo/2.0/?$", views.NodeInfo20.as_view(), name="nodeinfo-2.0"),
-    url(r"^settings/?$", views.InstanceSettings.as_view(), name="settings"),
-    url(r"^spa-manifest.json", views.SpaManifest.as_view(), name="spa-manifest"),
+    re_path(r"^nodeinfo/2.0/?$", views.NodeInfo20.as_view(), name="nodeinfo-2.0"),
+    re_path(r"^settings/?$", views.InstanceSettings.as_view(), name="settings"),
+    re_path(r"^spa-manifest.json", views.SpaManifest.as_view(), name="spa-manifest"),
 ] + admin_router.urls

@@ -1,7 +1,7 @@
-from django.conf.urls import url
+from django.urls import re_path
 
 from . import views
 
 urlpatterns = [
-    url(r"^nodeinfo/2.1/?$", views.NodeInfo21.as_view(), name="nodeinfo-2.1"),
+    re_path(r"^nodeinfo/2.1/?$", views.NodeInfo21.as_view(), name="nodeinfo-2.1"),
 ]

@@ -171,6 +171,9 @@ class NodeInfo21(NodeInfo20):
         if pref.get("federation__enabled"):
             data["features"].append("federation")
 
+        if pref.get("music__only_allow_musicbrainz_tagged_files"):
+            data["features"].append("onlyMbidTaggedContent")
+
         serializer = self.serializer_class(data)
         return Response(
             serializer.data, status=200, content_type=NODEINFO_2_CONTENT_TYPE

@@ -1,4 +1,5 @@
-from django.conf.urls import include, url
+from django.conf.urls import include
+from django.urls import re_path
 
 from funkwhale_api.common import routers
 

@@ -32,14 +33,16 @@ other_router.register(r"channels", views.ManageChannelViewSet, "channels")
 other_router.register(r"tags", views.ManageTagViewSet, "tags")
 
 urlpatterns = [
-    url(
+    re_path(
         r"^federation/",
         include((federation_router.urls, "federation"), namespace="federation"),
     ),
-    url(r"^library/", include((library_router.urls, "instance"), namespace="library")),
-    url(
+    re_path(
+        r"^library/", include((library_router.urls, "instance"), namespace="library")
+    ),
+    re_path(
         r"^moderation/",
         include((moderation_router.urls, "moderation"), namespace="moderation"),
     ),
-    url(r"^users/", include((users_router.urls, "instance"), namespace="users")),
+    re_path(r"^users/", include((users_router.urls, "instance"), namespace="users")),
 ] + other_router.urls

@@ -1,3 +1,4 @@
 import django.dispatch
 
-report_created = django.dispatch.Signal(providing_args=["report"])
+""" Required argument: report """
+report_created = django.dispatch.Signal()

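Django 4.0 removed the purely documentational `providing_args` argument from `Signal`, which is why these hunks drop it and keep the argument list as a docstring instead. A minimal pure-Python stand-in (not Django's actual implementation; an illustrative sketch of the semantics only):

```python
class Signal:
    """Minimal stand-in for django.dispatch.Signal. Since Django 4.0 the
    constructor takes no providing_args; senders simply pass keyword
    arguments to send() and receivers accept them as keyword parameters."""

    def __init__(self):
        self._receivers = []

    def connect(self, receiver):
        self._receivers.append(receiver)

    def send(self, sender, **named):
        # Returns (receiver, response) pairs, like Django's Signal.send().
        return [(r, r(sender=sender, signal=self, **named)) for r in self._receivers]


report_created = Signal()
""" Required argument: report """
```

Nothing changes for senders or receivers: `report_created.send(sender=..., report=...)` works exactly as before, the argument list is simply no longer declared up front.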
@@ -0,0 +1,61 @@
+from django.core.management.base import BaseCommand
+from django.db import transaction
+
+from funkwhale_api.music import models
+
+
+class Command(BaseCommand):
+    help = """Deletes any tracks not tagged with a MusicBrainz ID from the database. By default, any tracks that
+    have been favorited by a user or added to a playlist are preserved."""
+
+    def add_arguments(self, parser):
+        parser.add_argument(
+            "--no-dry-run",
+            action="store_true",
+            dest="no_dry_run",
+            default=False,
+            help="Disable dry run mode and apply pruning for real on the database",
+        )
+
+        parser.add_argument(
+            "--include-playlist-content",
+            action="store_true",
+            dest="include_playlist_content",
+            default=False,
+            help="Allow tracks included in playlists to be pruned",
+        )
+
+        parser.add_argument(
+            "--include-favorites-content",
+            action="store_true",
+            dest="include_favorited_content",
+            default=False,
+            help="Allow favorited tracks to be pruned",
+        )
+
+        parser.add_argument(
+            "--include-listened-content",
+            action="store_true",
+            dest="include_listened_content",
+            default=False,
+            help="Allow tracks with listening history to be pruned",
+        )
+
+    @transaction.atomic
+    def handle(self, *args, **options):
+        tracks = models.Track.objects.filter(mbid__isnull=True)
+        if not options["include_favorited_content"]:
+            tracks = tracks.filter(track_favorites__isnull=True)
+        if not options["include_playlist_content"]:
+            tracks = tracks.filter(playlist_tracks__isnull=True)
+        if not options["include_listened_content"]:
+            tracks = tracks.filter(listenings__isnull=True)
+
+        pruned_total = tracks.count()
+        total = models.Track.objects.count()
+
+        if options["no_dry_run"]:
+            self.stdout.write(f"Deleting {pruned_total}/{total} tracks…")
+            tracks.delete()
+        else:
+            self.stdout.write(f"Would prune {pruned_total}/{total} tracks")

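The command narrows the queryset with chained `isnull` filters, so a track without an MBID still survives whenever a protected relation exists and the matching `--include-*` flag is absent. A plain-Python sketch of that predicate (the `prunable` helper and the dict shape are hypothetical stand-ins for the queryset logic):

```python
def prunable(track, include_favorited=False, include_playlist=False,
             include_listened=False):
    """True when the track would be deleted: no MusicBrainz ID, and no
    protected relation (favorite, playlist, listening) keeps it."""
    if track["mbid"] is not None:
        return False  # mirrors the mbid__isnull=True base filter
    if track["favorited"] and not include_favorited:
        return False  # track_favorites__isnull=True
    if track["in_playlist"] and not include_playlist:
        return False  # playlist_tracks__isnull=True
    if track["listened"] and not include_listened:
        return False  # listenings__isnull=True
    return True
```

Each `--include-*` flag simply skips one of the protective filters, widening the set of deletable tracks.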
@@ -226,17 +226,18 @@ class TrackAlbumSerializer(serializers.ModelSerializer):
     )
 
 
-def serialize_upload(upload) -> object:
-    return {
-        "uuid": str(upload.uuid),
-        "listen_url": upload.listen_url,
-        "size": upload.size,
-        "duration": upload.duration,
-        "bitrate": upload.bitrate,
-        "mimetype": upload.mimetype,
-        "extension": upload.extension,
-        "is_local": federation_utils.is_local(upload.fid),
-    }
+class TrackUploadSerializer(serializers.Serializer):
+    uuid = serializers.UUIDField()
+    listen_url = serializers.URLField()
+    size = serializers.IntegerField()
+    duration = serializers.IntegerField()
+    bitrate = serializers.IntegerField()
+    mimetype = serializers.CharField()
+    extension = serializers.CharField()
+    is_local = serializers.SerializerMethodField()
+
+    def get_is_local(self, upload) -> bool:
+        return federation_utils.is_local(upload.fid)
 
 
 def sort_uploads_for_listen(uploads):

@@ -281,11 +282,14 @@ class TrackSerializer(OptionalDescriptionMixin, serializers.Serializer):
     def get_listen_url(self, obj):
         return obj.listen_url
 
-    @extend_schema_field({"type": "array", "items": {"type": "object"}})
+    # @extend_schema_field({"type": "array", "items": {"type": "object"}})
+    @extend_schema_field(TrackUploadSerializer(many=True))
     def get_uploads(self, obj):
         uploads = getattr(obj, "playable_uploads", [])
         # we put local uploads first
-        uploads = [serialize_upload(u) for u in sort_uploads_for_listen(uploads)]
+        uploads = [
+            TrackUploadSerializer(u).data for u in sort_uploads_for_listen(uploads)
+        ]
         uploads = sorted(uploads, key=lambda u: u["is_local"], reverse=True)
         return list(uploads)

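`get_uploads()` above serializes each upload and then orders local uploads first with a stable sort on the boolean `is_local` key. A small sketch with plain dicts (`order_local_first` is an illustrative name, not from the codebase):

```python
def order_local_first(uploads):
    """Put local uploads before remote ones, preserving relative order.

    sorted() is stable, so relative order within the local and remote groups
    is preserved; reverse=True puts True (local) before False (remote)."""
    return sorted(uploads, key=lambda u: u["is_local"], reverse=True)
```

Switching from the `serialize_upload` dict helper to `TrackUploadSerializer` keeps the same payload keys, so this sort key continues to work, while drf-spectacular can now derive a precise OpenAPI schema from the serializer fields.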
@@ -1,5 +1,4 @@
 import django.dispatch
 
-upload_import_status_updated = django.dispatch.Signal(
-    providing_args=["old_status", "new_status", "upload"]
-)
+""" Required args: old_status, new_status, upload """
+upload_import_status_updated = django.dispatch.Signal()

@@ -297,8 +297,6 @@ class LibraryViewSet(
         )
         instance.delete()
 
-    follows = action
-
     @extend_schema(
         responses=federation_api_serializers.LibraryFollowSerializer(many=True)
     )

@@ -1,4 +1,4 @@
-from django.conf.urls import url
+from django.urls import re_path
 
 from funkwhale_api.common import routers
 

@@ -7,22 +7,22 @@ from . import views
 router = routers.OptionalSlashRouter()
 router.register(r"search", views.SearchViewSet, "search")
 urlpatterns = [
-    url(
+    re_path(
         "releases/(?P<uuid>[0-9a-z-]+)/$",
         views.ReleaseDetail.as_view(),
         name="release-detail",
     ),
-    url(
+    re_path(
         "artists/(?P<uuid>[0-9a-z-]+)/$",
         views.ArtistDetail.as_view(),
         name="artist-detail",
     ),
-    url(
+    re_path(
         "release-groups/browse/(?P<artist_uuid>[0-9a-z-]+)/$",
         views.ReleaseGroupBrowse.as_view(),
         name="release-group-browse",
     ),
-    url(
+    re_path(
         "releases/browse/(?P<release_group_uuid>[0-9a-z-]+)/$",
         views.ReleaseBrowse.as_view(),
         name="release-browse",

@@ -1,7 +1,8 @@
-from django.conf.urls import include, url
+from django.conf.urls import include
+from django.urls import re_path
 
 urlpatterns = [
-    url(
+    re_path(
         r"^musicbrainz/",
         include(
             ("funkwhale_api.musicbrainz.urls", "musicbrainz"), namespace="musicbrainz"

@@ -38,14 +38,12 @@ def validate(config):
     return True
 
 
-def build_radio_queryset(patch, config, radio_qs):
-    """Take a troi patch and its arg, match the missing mbid and then build a radio queryset"""
-
-    logger.info("Config used for troi radio generation is " + str(config))
+def build_radio_queryset(patch, radio_qs):
+    """Take a troi patch, match the missing mbid and then build a radio queryset"""
 
     start_time = time.time()
     try:
-        recommendations = troi.core.generate_playlist(patch, config)
+        recommendations = patch.generate_playlist()
     except ConnectTimeout:
         raise ValueError(
             "Timed out while connecting to ListenBrainz. No candidates could be retrieved for the radio."

@@ -145,4 +143,4 @@ class TroiPatch:
     def get_queryset(self, config, qs):
         patch_string = config.pop("patch")
         patch = patches[patch_string]
-        return build_radio_queryset(patch(), config, qs)
+        return build_radio_queryset(patch(config), qs)

@@ -6,6 +6,10 @@ from rest_framework import renderers
 import funkwhale_api
 
 
+class TagValue(str):
+    """Use this for string values that must be rendered as tags instead of attributes in XML."""
+
+
 # from https://stackoverflow.com/a/8915039
 # because I want to avoid a lxml dependency just for outputting cdata properly
 # in a RSS feed

@@ -31,10 +35,14 @@ ET._serialize_xml = ET._serialize["xml"] = _serialize_xml
 
 def structure_payload(data):
     payload = {
+        # funkwhaleVersion is deprecated and will be removed in a future
+        # release. Use serverVersion instead.
         "funkwhaleVersion": funkwhale_api.__version__,
+        "serverVersion": funkwhale_api.__version__,
         "status": "ok",
         "type": "funkwhale",
         "version": "1.16.0",
+        "openSubsonic": "true",
     }
     payload.update(data)
     if "detail" in payload:

@@ -81,6 +89,10 @@ def dict_to_xml_tree(root_tag, d, parent=None):
             el = ET.Element(key)
             el.text = str(obj)
             root.append(el)
+        elif isinstance(value, TagValue):
+            el = ET.Element(key)
+            el.text = str(value)
+            root.append(el)
         else:
             if key == "value":
                 root.text = str(value)

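`dict_to_xml_tree` renders plain values as XML attributes; the new `TagValue` subclass of `str` marks values that must become child elements instead, which the Subsonic `artistInfo2` response needs for fields like `biography`. A condensed stdlib sketch of that dispatch (`dict_to_xml` here is a simplified stand-in, not the real renderer):

```python
import xml.etree.ElementTree as ET


class TagValue(str):
    """String values that must become child tags instead of attributes."""


def dict_to_xml(root_tag, data):
    root = ET.Element(root_tag)
    for key, value in data.items():
        if isinstance(value, TagValue):
            # TagValue: emit <key>value</key> as a child element
            child = ET.Element(key)
            child.text = str(value)
            root.append(child)
        else:
            # default: emit key="value" as an attribute on the root
            root.set(key, str(value))
    return root
```

Subclassing `str` keeps serializers free to build payloads out of ordinary string operations; only the renderer needs to distinguish the two cases, via `isinstance`.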
@@ -7,6 +7,8 @@ from funkwhale_api.history import models as history_models
 from funkwhale_api.music import models as music_models
 from funkwhale_api.music import utils as music_utils
 
+from .renderers import TagValue
+
 
 def to_subsonic_date(date):
     """

@@ -50,6 +52,7 @@ def get_artist_data(artist_values):
         "name": artist_values["name"],
         "albumCount": artist_values["_albums_count"],
         "coverArt": "ar-{}".format(artist_values["id"]),
+        "musicBrainzId": str(artist_values.get("mbid", "")),
     }
 

@@ -58,7 +61,7 @@ class GetArtistsSerializer(serializers.Serializer):
     payload = {"ignoredArticles": "", "index": []}
     queryset = queryset.with_albums_count()
     queryset = queryset.order_by(functions.Lower("name"))
-    values = queryset.values("id", "_albums_count", "name")
+    values = queryset.values("id", "_albums_count", "name", "mbid")
 
     first_letter_mapping = collections.defaultdict(list)
     for artist in values:

@@ -102,6 +105,23 @@ class GetArtistSerializer(serializers.Serializer):
     return payload
 
 
+class GetArtistInfo2Serializer(serializers.Serializer):
+    def to_representation(self, artist):
+        payload = {}
+        if artist.mbid:
+            payload["musicBrainzId"] = TagValue(artist.mbid)
+        if artist.attachment_cover:
+            payload["mediumImageUrl"] = TagValue(
+                artist.attachment_cover.download_url_medium_square_crop
+            )
+            payload["largeImageUrl"] = TagValue(
+                artist.attachment_cover.download_url_large_square_crop
+            )
+        if artist.description:
+            payload["biography"] = TagValue(artist.description.rendered)
+        return payload
+
+
 def get_track_data(album, track, upload):
     data = {
         "id": track.pk,

@@ -126,11 +146,13 @@ def get_track_data(album, track, upload):
         "albumId": album.pk if album else "",
         "artistId": album.artist.pk if album else track.artist.pk,
         "type": "music",
+        "mediaType": "song",
+        "musicBrainzId": str(track.mbid or ""),
     }
     if album and album.attachment_cover_id:
         data["coverArt"] = f"al-{album.id}"
     if upload.bitrate:
-        data["bitrate"] = int(upload.bitrate / 1000)
+        data["bitRate"] = int(upload.bitrate / 1000)
     if upload.size:
         data["size"] = upload.size
     if album and album.release_date:

@@ -149,13 +171,17 @@ def get_album2_data(album):
         "created": to_subsonic_date(album.creation_date),
         "duration": album.duration,
         "playCount": album.tracks.aggregate(l=Count("listenings"))["l"] or 0,
+        "mediaType": "album",
+        "musicBrainzId": str(album.mbid or ""),
     }
     if album.attachment_cover_id:
         payload["coverArt"] = f"al-{album.id}"
     if album.tagged_items:
+        genres = [{"name": i.tag.name} for i in album.tagged_items.all()]
         # exposes only first genre since the specification uses singular noun
-        first_genre = album.tagged_items.first()
-        payload["genre"] = first_genre.tag.name if first_genre else ""
+        payload["genre"] = genres[0]["name"] if len(genres) > 0 else ""
+        # OpenSubsonic full genre list
+        payload["genres"] = genres
     if album.release_date:
         payload["year"] = album.release_date.year
     try:

@@ -343,7 +369,7 @@ def get_channel_episode_data(upload, channel_id):
         "genre": "Podcast",
         "size": upload.size if upload.size else "",
         "duration": upload.duration if upload.duration else "",
-        "bitrate": upload.bitrate / 1000 if upload.bitrate else "",
+        "bitRate": upload.bitrate / 1000 if upload.bitrate else "",
         "contentType": upload.mimetype or "audio/mpeg",
         "suffix": upload.extension or "mp3",
         "status": "completed",

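`get_album2_data` now keeps the legacy singular Subsonic `genre` field while adding the OpenSubsonic `genres` list, both derived from the same tag names. A sketch of that payload shape (`genre_fields` is a hypothetical helper, not from the codebase):

```python
def genre_fields(tag_names):
    """Build both the legacy Subsonic `genre` (singular, first tag only)
    and the OpenSubsonic `genres` list from an album's tag names."""
    genres = [{"name": name} for name in tag_names]
    return {
        "genre": genres[0]["name"] if genres else "",
        "genres": genres,
    }
```

The same hunk group also renames `bitrate` to `bitRate`, matching the camelCase key the Subsonic API specification uses for song and episode entries.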
@@ -180,6 +180,19 @@ class SubsonicViewSet(viewsets.GenericViewSet):
         }
         return response.Response(data, status=200)
 
+    @action(
+        detail=False,
+        methods=["get", "post"],
+        url_name="get_open_subsonic_extensions",
+        permission_classes=[],
+        url_path="getOpenSubsonicExtensions",
+    )
+    def get_open_subsonic_extensions(self, request, *args, **kwargs):
+        data = {
+            "openSubsonicExtensions": [{"name": "formPost", "versions": [1]}],
+        }
+        return response.Response(data, status=200)
+
     @action(
         detail=False,
         methods=["get", "post"],

@@ -255,7 +268,9 @@ class SubsonicViewSet(viewsets.GenericViewSet):
     )
     @find_object(music_models.Artist.objects.all(), filter_playable=True)
     def get_artist_info2(self, request, *args, **kwargs):
-        payload = {"artist-info2": {}}
+        artist = kwargs.pop("obj")
+        data = serializers.GetArtistInfo2Serializer(artist).data
+        payload = {"artistInfo2": data}
 
         return response.Response(payload, status=200)
 

@@ -523,7 +538,7 @@ class SubsonicViewSet(viewsets.GenericViewSet):
             "search_fields": ["name"],
             "queryset": (
                 music_models.Artist.objects.with_albums_count().values(
-                    "id", "_albums_count", "name"
+                    "id", "_albums_count", "name", "mbid"
                 )
             ),
             "serializer": lambda qs: [serializers.get_artist_data(a) for a in qs],

@@ -36,7 +36,6 @@ def delete_non_alnum_characters(text):
 def resolve_recordings_to_fw_track(recordings):
     """
     Tries to match a troi recording entity to a fw track using the typesense index.
     It will save the results in the match_mbid attribute of the Track table.
-    For test purposes : if multiple fw tracks are returned, we log the information
     but only keep the best result in db to avoid duplicates.
     """

@@ -1,7 +1,7 @@
 from django import forms
 from django.contrib.auth.admin import UserAdmin as AuthUserAdmin
 from django.contrib.auth.forms import UserChangeForm, UserCreationForm
-from django.utils.translation import ugettext_lazy as _
+from django.utils.translation import gettext_lazy as _
 
 from funkwhale_api.common import admin

@@ -1,4 +1,4 @@
-from django.conf.urls import url
+from django.urls import re_path
 
 from funkwhale_api.common import routers
 

@@ -8,6 +8,6 @@ router = routers.OptionalSlashRouter()
 router.register(r"users", views.UserViewSet, "users")
 
 urlpatterns = [
-    url(r"^users/login/?$", views.login, name="login"),
-    url(r"^users/logout/?$", views.logout, name="logout"),
+    re_path(r"^users/login/?$", views.login, name="login"),
+    re_path(r"^users/logout/?$", views.logout, name="logout"),
 ] + router.urls

@@ -12,7 +12,7 @@ from django.db.models import JSONField
 from django.dispatch import receiver
 from django.urls import reverse
 from django.utils import timezone
-from django.utils.translation import ugettext_lazy as _
+from django.utils.translation import gettext_lazy as _
 from django_auth_ldap.backend import populate_user as ldap_populate_user
 from oauth2_provider import models as oauth2_models
 from oauth2_provider import validators as oauth2_validators

@@ -1,4 +1,4 @@
-from django.conf.urls import url
+from django.urls import re_path
 from django.views.decorators.csrf import csrf_exempt
 
 from funkwhale_api.common import routers

@@ -10,7 +10,9 @@ router.register(r"apps", views.ApplicationViewSet, "apps")
 router.register(r"grants", views.GrantViewSet, "grants")
 
 urlpatterns = router.urls + [
-    url("^authorize/$", csrf_exempt(views.AuthorizeView.as_view()), name="authorize"),
-    url("^token/$", views.TokenView.as_view(), name="token"),
-    url("^revoke/$", views.RevokeTokenView.as_view(), name="revoke"),
+    re_path(
+        "^authorize/$", csrf_exempt(views.AuthorizeView.as_view()), name="authorize"
+    ),
+    re_path("^token/$", views.TokenView.as_view(), name="token"),
+    re_path("^revoke/$", views.RevokeTokenView.as_view(), name="revoke"),
 ]

@@ -200,7 +200,7 @@ class AuthorizeView(views.APIView, oauth_views.AuthorizationView):
         return self.json_payload({"non_field_errors": ["Invalid application"]}, 400)
 
     def redirect(self, redirect_to, application):
-        if self.request.is_ajax():
+        if self.request.META.get("HTTP_X_REQUESTED_WITH") == "XMLHttpRequest":
             # Web client need this to be able to redirect the user
             query = urllib.parse.urlparse(redirect_to).query
             code = urllib.parse.parse_qs(query)["code"][0]

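`HttpRequest.is_ajax()` was removed in Django 4.0; the hunk above inlines the equivalent check against the `X-Requested-With` header. A sketch of that check over a `META`-style dict (the standalone `is_ajax` helper here is illustrative):

```python
def is_ajax(meta):
    """Equivalent of the removed HttpRequest.is_ajax(): checks the
    jQuery-era X-Requested-With header, which Django exposes in
    request.META under the HTTP_X_REQUESTED_WITH key."""
    return meta.get("HTTP_X_REQUESTED_WITH") == "XMLHttpRequest"
```

The semantics are identical to the removed method, so clients that set the header keep getting the JSON redirect payload.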
@@ -1,38 +1,38 @@
 from dj_rest_auth import views as rest_auth_views
-from django.conf.urls import url
+from django.urls import re_path
 from django.views.generic import TemplateView
 
 from . import views
 
 urlpatterns = [
     # URLs that do not require a session or valid token
-    url(
+    re_path(
         r"^password/reset/$",
         views.PasswordResetView.as_view(),
         name="rest_password_reset",
     ),
-    url(
+    re_path(
         r"^password/reset/confirm/$",
         views.PasswordResetConfirmView.as_view(),
         name="rest_password_reset_confirm",
     ),
     # URLs that require a user to be logged in with a valid session / token.
-    url(
+    re_path(
         r"^user/$", rest_auth_views.UserDetailsView.as_view(), name="rest_user_details"
     ),
-    url(
+    re_path(
         r"^password/change/$",
         views.PasswordChangeView.as_view(),
         name="rest_password_change",
     ),
     # Registration URLs
-    url(r"^registration/$", views.RegisterView.as_view(), name="rest_register"),
-    url(
+    re_path(r"^registration/$", views.RegisterView.as_view(), name="rest_register"),
+    re_path(
         r"^registration/verify-email/?$",
         views.VerifyEmailView.as_view(),
         name="rest_verify_email",
     ),
-    url(
+    re_path(
         r"^registration/change-password/?$",
         views.PasswordChangeView.as_view(),
         name="change_password",

@@ -47,7 +47,7 @@ urlpatterns = [
     # If you don't want to use API on that step, then just use ConfirmEmailView
     # view from:
     # https://github.com/pennersr/django-allauth/blob/a62a370681/allauth/account/views.py#L291
-    url(
+    re_path(
         r"^registration/account-confirm-email/(?P<key>\w+)/?$",
         TemplateView.as_view(),
         name="account_confirm_email",

@@ -340,4 +340,8 @@ class UserChangeEmailSerializer(serializers.Serializer):
             email=request.user.email,
             defaults={"verified": False, "primary": True},
         )
-        current.change(request, self.validated_data["email"], confirm=True)
+        if request.user.email != self.validated_data["email"]:
+            current.email = self.validated_data["email"]
+            current.verified = False
+            current.save()
+            current.send_confirmation()

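With the newer allauth/dj-rest-auth stack in this compare, the serializer no longer relies on `EmailAddress.change()`: it mutates the address by hand and only marks it unverified and re-sends confirmation when the address actually changed. A side-effect-free sketch of that branch (`apply_email_change` is a hypothetical helper):

```python
def apply_email_change(current_email, new_email):
    """Return (resulting_email, confirmation_sent). A confirmation e-mail is
    only re-sent when the address actually changes, mirroring the serializer
    branch above; an unchanged address is left untouched."""
    if current_email == new_email:
        return current_email, False
    # changed: store the new address (now unverified) and reconfirm it
    return new_email, True
```

Guarding on equality avoids needlessly flipping a verified address back to unverified when a user resubmits their current e-mail.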
@@ -1,6 +1,7 @@
 import json
 
 from allauth.account.adapter import get_adapter
+from allauth.account.utils import send_email_confirmation
 from dj_rest_auth import views as rest_auth_views
 from dj_rest_auth.registration import views as registration_views
 from django import http

@@ -11,7 +12,7 @@ from rest_framework import mixins, viewsets
 from rest_framework.decorators import action
 from rest_framework.response import Response
 
-from funkwhale_api.common import authentication, preferences, throttling
+from funkwhale_api.common import preferences, throttling
 
 from . import models, serializers, tasks

@@ -37,7 +38,7 @@ class RegisterView(registration_views.RegisterView):
         user = super().perform_create(serializer)
         if not user.is_active:
             # manual approval, we need to send the confirmation e-mail by hand
-            authentication.send_email_confirmation(self.request, user)
+            send_email_confirmation(self.request, user)
         if user.invitation:
             user.invitation.set_invited_user(user)

(One file's diff is too large to display and was omitted.)

@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "funkwhale-api"
-version = "1.4.0-rc1"
+version = "1.4.0"
 description = "Funkwhale API"
 
 authors = ["Funkwhale Collective"]

@@ -25,103 +25,104 @@ exclude = ["tests"]
 funkwhale-manage = 'funkwhale_api.main:main'
 
 [tool.poetry.dependencies]
-python = "^3.8,<3.12"
+python = "^3.8,<3.13"
 
 # Django
-dj-rest-auth = { extras = ["with_social"], version = "2.2.8" }
-django = "==3.2.23"
-django-allauth = "==0.42.0"
+dj-rest-auth = "5.0.2"
+django = "4.2.9"
+django-allauth = "0.55.2"
 django-cache-memoize = "0.1.10"
-django-cacheops = "==6.1"
-django-cleanup = "==6.0.0"
-django-cors-headers = "==3.13.0"
+django-cacheops = "==7.0.2"
+django-cleanup = "==8.1.0"
+django-cors-headers = "==4.3.1"
 django-dynamic-preferences = "==1.14.0"
 django-environ = "==0.10.0"
-django-filter = "==22.1"
+django-filter = "==23.5"
 django-oauth-toolkit = "2.2.0"
 django-redis = "==5.2.0"
 django-storages = "==1.13.2"
-django-versatileimagefield = "==2.2"
+django-versatileimagefield = "==3.1"
 djangorestframework = "==3.14.0"
-drf-spectacular = "==0.26.1"
+drf-spectacular = "==0.26.5"
 markdown = "==3.4.4"
 persisting-theory = "==1.0"
 psycopg2 = "==2.9.9"
-redis = "==4.5.5"
+redis = "==5.0.1"
 
 # Django LDAP
 django-auth-ldap = "==4.1.0"
-python-ldap = "==3.4.3"
+python-ldap = "==3.4.4"
 
 # Channels
 channels = { extras = ["daphne"], version = "==4.0.0" }
 channels-redis = "==4.1.0"
 
 # Celery
-kombu = "==5.2.4"
-celery = "==5.2.7"
+kombu = "5.3.4"
+celery = "5.3.6"
 
 # Deployment
-gunicorn = "==20.1.0"
+gunicorn = "==21.2.0"
 uvicorn = { version = "==0.20.0", extras = ["standard"] }
 
 # Libs
-aiohttp = "==3.8.6"
+aiohttp = "3.9.1"
 arrow = "==1.2.3"
 backports-zoneinfo = { version = "==0.2.1", python = "<3.9" }
-bleach = "==5.0.1"
+bleach = "==6.1.0"
 boto3 = "==1.26.161"
 click = "==8.1.7"
-cryptography = "==38.0.4"
+cryptography = "==41.0.7"
 feedparser = "==6.0.10"
 liblistenbrainz = "==0.5.5"
 musicbrainzngs = "==0.7.1"
 mutagen = "==1.46.0"
-pillow = "==9.3.0"
+pillow = "==10.2.0"
 pydub = "==0.25.1"
 pyld = "==2.0.3"
 python-magic = "==0.4.27"
-requests = "==2.28.2"
+requests = "==2.31.0"
 requests-http-message-signatures = "==0.3.1"
 sentry-sdk = "==1.19.1"
-watchdog = "==2.2.1"
-troi = { git = "https://github.com/metabrainz/troi-recommendation-playground.git", tag = "v-2023-10-30.0"}
-lb-matching-tools = { git = "https://github.com/metabrainz/listenbrainz-matching-tools.git", branch = "main"}
-unidecode = "==1.3.6"
-pycountry = "22.3.5"
+watchdog = "==4.0.0"
+troi = "==2024.1.26.0"
+lb-matching-tools = "==2024.1.25.0rc1"
+unidecode = "==1.3.7"
+pycountry = "23.12.11"
 
 # Typesense
 typesense = { version = "==0.15.1", optional = true }
 
 # Dependencies pinning
-ipython = "==7.34.0"
+ipython = "==8.12.3"
 pluralizer = "==1.2.0"
-service-identity = "==21.1.0"
+service-identity = "==24.1.0"
 unicode-slugify = "==0.1.5"
 
 [tool.poetry.group.dev.dependencies]
 aioresponses = "==0.7.6"
 asynctest = "==0.13.0"
-black = "==23.3.0"
-coverage = { version = "==6.5.0", extras = ["toml"] }
+black = "==24.1.1"
+coverage = { version = "==7.4.1", extras = ["toml"] }
 debugpy = "==1.6.7.post1"
 django-coverage-plugin = "==3.0.0"
-django-debug-toolbar = "==3.8.1"
+django-debug-toolbar = "==4.2.0"
 factory-boy = "==3.2.1"
-faker = "==15.3.4"
+faker = "==23.2.1"
 flake8 = "==3.9.2"
 ipdb = "==0.13.13"
-pytest = "==7.2.2"
+pytest = "==8.0.0"
 pytest-asyncio = "==0.21.0"
 prompt-toolkit = "==3.0.41"
 pytest-cov = "==4.0.0"
 pytest-django = "==4.5.2"
-pytest-env = "==0.8.1"
+pytest-env = "==1.1.3"
 pytest-mock = "==3.10.0"
 pytest-randomly = "==3.12.0"
-pytest-sugar = "==0.9.7"
+pytest-sugar = "==1.0.0"
 requests-mock = "==1.10.0"
-pylint = "==2.17.2"
-pylint-django = "==2.5.3"
+pylint = "==3.0.3"
+pylint-django = "==2.5.5"
 django-extensions = "==3.2.3"
 
 [tool.poetry.extras]

@@ -269,6 +269,7 @@ def test_throttle_calls_attach_info(method, mocker):
 
 
 def test_allow_request(api_request, settings, mocker):
     settings.THROTTLING_ENABLED = True
     settings.THROTTLING_RATES = {"test": {"rate": "2/s"}}
     ip = "92.92.92.92"
     request = api_request.get("/", HTTP_X_FORWARDED_FOR=ip)

@ -0,0 +1,333 @@
|
|||
import datetime
|
||||
import logging
|
||||
|
||||
import liblistenbrainz
|
||||
import pytest
|
||||
from django.urls import reverse
|
||||
from django.utils import timezone
|
||||
|
||||
from config import plugins
|
||||
from funkwhale_api.contrib.listenbrainz import funkwhale_ready
|
||||
from funkwhale_api.favorites import models as favorites_models
|
||||
from funkwhale_api.history import models as history_models
|
||||
|
||||
|
||||
def test_listenbrainz_submit_listen(logged_in_client, mocker, factories):
|
||||
config = plugins.get_plugin_config(
|
||||
name="listenbrainz",
|
||||
description="A plugin that allows you to submit or sync your listens and favorites to ListenBrainz.",
|
||||
conf=[],
|
||||
source=False,
|
||||
)
|
||||
handler = mocker.Mock()
|
||||
plugins.register_hook(plugins.LISTENING_CREATED, config)(handler)
|
||||
plugins.set_conf(
|
||||
"listenbrainz",
|
||||
{
|
||||
"sync_listenings": True,
|
||||
"sync_favorites": True,
|
||||
"submit_favorites": True,
|
||||
"sync_favorites": True,
|
||||
"user_token": "blablabla",
|
||||
},
|
||||
user=logged_in_client.user,
|
||||
)
|
||||
plugins.enable_conf("listenbrainz", True, logged_in_client.user)
|
||||
|
||||
track = factories["music.Track"]()
|
||||
url = reverse("api:v1:history:listenings-list")
|
||||
logged_in_client.post(url, {"track": track.pk})
|
||||
logged_in_client.get(url)
|
||||
listening = history_models.Listening.objects.get(user=logged_in_client.user)
|
||||
handler.assert_called_once_with(listening=listening, conf=None)
|
||||
|
||||
|
||||
def test_sync_listenings_from_listenbrainz(factories, mocker, caplog):
    logger = logging.getLogger("plugins")
    caplog.set_level(logging.INFO)
    logger.addHandler(caplog.handler)
    user = factories["users.User"]()

    factories["music.Track"](mbid="f89db7f8-4a1f-4228-a0a1-e7ba028b7476")
    track = factories["music.Track"](mbid="54c60860-f43d-484e-b691-7ab7ec8de559")
    factories["history.Listening"](
        creation_date=datetime.datetime.fromtimestamp(1871, timezone.utc), track=track
    )

    conf = {
        "user_name": user.username,
        "user_token": "user_tolkien",
        "sync_listenings": True,
    }

    listens = {
        "payload": {
            "count": 25,
            "user_id": "-- the MusicBrainz ID of the user --",
            "listens": [
                liblistenbrainz.Listen(
                    track_name="test",
                    artist_name="artist_test",
                    recording_mbid="f89db7f8-4a1f-4228-a0a1-e7ba028b7476",
                    additional_info={"submission_client": "not funkwhale"},
                    listened_at=-3124224000,
                ),
                liblistenbrainz.Listen(
                    track_name="test2",
                    artist_name="artist_test2",
                    recording_mbid="54c60860-f43d-484e-b691-7ab7ec8de559",
                    additional_info={
                        "submission_client": "Funkwhale ListenBrainz plugin"
                    },
                    listened_at=1871,
                ),
                liblistenbrainz.Listen(
                    track_name="test3",
                    artist_name="artist_test3",
                    listened_at=0,
                ),
            ],
        }
    }
    no_more_listen = {
        "payload": {
            "count": 25,
            "user_id": "Bilbo",
            "listens": [],
        }
    }

    mocker.patch.object(
        funkwhale_ready.tasks.liblistenbrainz.ListenBrainz,
        "get_listens",
        side_effect=[listens, no_more_listen],
    )

    funkwhale_ready.sync_listenings_from_listenbrainz(user, conf)

    assert history_models.Listening.objects.filter(
        track__mbid="f89db7f8-4a1f-4228-a0a1-e7ba028b7476"
    ).exists()

    assert "Listen with ts 1871 skipped because already in db" in caplog.text
    assert "Received listening that doesn't have a mbid. Skipping..." in caplog.text


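The mock above passes a list as `side_effect`, so consecutive calls return consecutive pages and the sync code keeps fetching until it gets an empty page. A minimal sketch of that pagination pattern (hypothetical names, not the real liblistenbrainz API):

```python
from unittest import mock

client = mock.Mock()
# Each call to get_listens() consumes the next element of the list:
# first a full page, then an empty page that terminates the loop.
client.get_listens.side_effect = [["listen-1", "listen-2"], []]

collected = []
while True:
    page = client.get_listens()
    if not page:
        break
    collected.extend(page)
```

After the loop, `collected` holds both listens from the first page; a third call would raise `StopIteration`, so the empty sentinel page must come last.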
def test_sync_favorites_from_listenbrainz(factories, mocker, caplog):
    logger = logging.getLogger("plugins")
    caplog.set_level(logging.INFO)
    logger.addHandler(caplog.handler)
    user = factories["users.User"]()
    # track lb fav
    factories["music.Track"](mbid="195565db-65f9-4d0d-b347-5f0c85509528")
    # random track
    factories["music.Track"]()
    # track lb neutral
    track = factories["music.Track"](mbid="c5af5351-dbbf-4481-b52e-a480b6c57986")
    favorite = factories["favorites.TrackFavorite"](track=track, user=user)
    # last_sync
    track_last_sync = factories["music.Track"](
        mbid="c878ef2f-c08d-4a81-a047-f2a9f978cec7"
    )
    factories["favorites.TrackFavorite"](track=track_last_sync, source="Listenbrainz")

    conf = {
        "user_name": user.username,
        "user_token": "user_tolkien",
        "sync_favorites": True,
    }

    feedbacks = {
        "count": 5,
        "feedback": [
            {
                "created": 1701116226,
                "recording_mbid": "195565db-65f9-4d0d-b347-5f0c85509528",
                "score": 1,
                "user_id": user.username,
            },
            {
                "created": 1701116214,
                "recording_mbid": "c5af5351-dbbf-4481-b52e-a480b6c57986",
                "score": 0,
                "user_id": user.username,
            },
            {
                # last sync
                "created": 1690775094,
                "recording_mbid": "c878ef2f-c08d-4a81-a047-f2a9f978cec7",
                "score": -1,
                "user_id": user.username,
            },
            {
                "created": 1690775093,
                "recording_mbid": "1fd02cf2-7247-4715-8862-c378ec1965d2",
                "score": 1,
                "user_id": user.username,
            },
        ],
        "offset": 0,
        "total_count": 4,
    }
    empty_feedback = {"count": 0, "feedback": [], "offset": 0, "total_count": 0}
    mocker.patch.object(
        funkwhale_ready.tasks.liblistenbrainz.ListenBrainz,
        "get_user_feedback",
        side_effect=[feedbacks, empty_feedback],
    )

    funkwhale_ready.sync_favorites_from_listenbrainz(user, conf)

    assert favorites_models.TrackFavorite.objects.filter(
        track__mbid="195565db-65f9-4d0d-b347-5f0c85509528"
    ).exists()
    with pytest.raises(favorites_models.TrackFavorite.DoesNotExist):
        favorite.refresh_from_db()


def test_sync_favorites_from_listenbrainz_since(factories, mocker, caplog):
    logger = logging.getLogger("plugins")
    caplog.set_level(logging.INFO)
    logger.addHandler(caplog.handler)
    user = factories["users.User"]()
    # track lb fav
    factories["music.Track"](mbid="195565db-65f9-4d0d-b347-5f0c85509528")
    # track lb neutral
    track = factories["music.Track"](mbid="c5af5351-dbbf-4481-b52e-a480b6c57986")
    favorite = factories["favorites.TrackFavorite"](track=track, user=user)
    # track should be not synced
    factories["music.Track"](mbid="1fd02cf2-7247-4715-8862-c378ec196000")
    # last_sync
    track_last_sync = factories["music.Track"](
        mbid="c878ef2f-c08d-4a81-a047-f2a9f978cec7"
    )
    factories["favorites.TrackFavorite"](
        track=track_last_sync,
        user=user,
        source="Listenbrainz",
        creation_date=datetime.datetime.fromtimestamp(1690775094),
    )

    conf = {
        "user_name": user.username,
        "user_token": "user_tolkien",
        "sync_favorites": True,
    }

    feedbacks = {
        "count": 5,
        "feedback": [
            {
                "created": 1701116226,
                "recording_mbid": "195565db-65f9-4d0d-b347-5f0c85509528",
                "score": 1,
                "user_id": user.username,
            },
            {
                "created": 1701116214,
                "recording_mbid": "c5af5351-dbbf-4481-b52e-a480b6c57986",
                "score": 0,
                "user_id": user.username,
            },
            {
                # last sync
                "created": 1690775094,
                "recording_mbid": "c878ef2f-c08d-4a81-a047-f2a9f978cec7",
                "score": -1,
                "user_id": user.username,
            },
            {
                "created": 1690775093,
                "recording_mbid": "1fd02cf2-7247-4715-8862-c378ec1965d2",
                "score": 1,
                "user_id": user.username,
            },
        ],
        "offset": 0,
        "total_count": 4,
    }
    second_feedback = {
        "count": 0,
        "feedback": [
            {
                "created": 0,
                "recording_mbid": "1fd02cf2-7247-4715-8862-c378ec196000",
                "score": 1,
                "user_id": user.username,
            },
        ],
        "offset": 0,
        "total_count": 0,
    }
    mocker.patch.object(
        funkwhale_ready.tasks.liblistenbrainz.ListenBrainz,
        "get_user_feedback",
        side_effect=[feedbacks, second_feedback],
    )

    funkwhale_ready.sync_favorites_from_listenbrainz(user, conf)

    assert favorites_models.TrackFavorite.objects.filter(
        track__mbid="195565db-65f9-4d0d-b347-5f0c85509528"
    ).exists()
    assert not favorites_models.TrackFavorite.objects.filter(
        track__mbid="1fd02cf2-7247-4715-8862-c378ec196000"
    ).exists()
    with pytest.raises(favorites_models.TrackFavorite.DoesNotExist):
        favorite.refresh_from_db()


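The `pytest.raises(...DoesNotExist)` plus `refresh_from_db()` idiom above is how these tests assert a row was deleted. A minimal, Django-free sketch of the same control flow (the classes here are hypothetical stand-ins):

```python
class DoesNotExist(Exception):
    """Mirrors Django's Model.DoesNotExist."""


class FakeFavorite:
    def __init__(self, store, pk):
        self.store = store
        self.pk = pk

    def refresh_from_db(self):
        # Django raises Model.DoesNotExist when the backing row is gone.
        if self.pk not in self.store:
            raise DoesNotExist(self.pk)


store = {1: "favorite"}
fav = FakeFavorite(store, 1)
fav.refresh_from_db()  # row still present: no exception

del store[1]  # simulate the sync deleting the unfavorited track
try:
    fav.refresh_from_db()
    deleted = False
except DoesNotExist:
    deleted = True
```

In the real tests, `pytest.raises` replaces the try/except: the test fails unless `refresh_from_db()` raises, proving the favorite really was removed.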
def test_submit_favorites_to_listenbrainz(factories, mocker, caplog):
    logger = logging.getLogger("plugins")
    caplog.set_level(logging.INFO)
    logger.addHandler(caplog.handler)
    user = factories["users.User"]()

    factories["music.Track"](mbid="195565db-65f9-4d0d-b347-5f0c85509528")

    factories["music.Track"](mbid="54c60860-f43d-484e-b691-7ab7ec8de559")
    track = factories["music.Track"](mbid="c5af5351-dbbf-4481-b52e-a480b6c57986")

    favorite = factories["favorites.TrackFavorite"](track=track)
    conf = {
        "user_name": user.username,
        "user_token": "user_tolkien",
        "submit_favorites": True,
    }

    patch = mocker.patch.object(
        funkwhale_ready.tasks.liblistenbrainz.ListenBrainz,
        "submit_user_feedback",
        return_value="Success",
    )
    funkwhale_ready.submit_favorite_creation(favorite, conf)

    patch.assert_called_once_with(1, track.mbid)


def test_submit_favorites_deletion(factories, mocker, caplog):
    logger = logging.getLogger("plugins")
    caplog.set_level(logging.INFO)
    logger.addHandler(caplog.handler)
    user = factories["users.User"]()

    factories["music.Track"](mbid="195565db-65f9-4d0d-b347-5f0c85509528")

    factories["music.Track"](mbid="54c60860-f43d-484e-b691-7ab7ec8de559")
    track = factories["music.Track"](mbid="c5af5351-dbbf-4481-b52e-a480b6c57986")

    favorite = factories["favorites.TrackFavorite"](track=track)
    conf = {
        "user_name": user.username,
        "user_token": "user_tolkien",
        "submit_favorites": True,
    }

    patch = mocker.patch.object(
        funkwhale_ready.tasks.liblistenbrainz.ListenBrainz,
        "submit_user_feedback",
        return_value="Success",
    )
    funkwhale_ready.submit_favorite_deletion(favorite, conf)

    patch.assert_called_once_with(0, track.mbid)

@@ -7,6 +7,7 @@ from funkwhale_api.music.management.commands import (
     check_inplace_files,
+    fix_uploads,
     prune_library,
     prune_non_mbid_content,
 )

 DATA_DIR = os.path.dirname(os.path.abspath(__file__))

@@ -204,3 +205,45 @@ def test_check_inplace_files_no_dry_run(factories, tmpfile):

     for u in not_prunable:
         u.refresh_from_db()
+
+
+def test_prune_non_mbid_content(factories):
+    prunable = factories["music.Track"](mbid=None)
+
+    track = factories["music.Track"](mbid=None)
+    factories["playlists.PlaylistTrack"](track=track)
+    not_prunable = [factories["music.Track"](), track]
+    c = prune_non_mbid_content.Command()
+    options = {
+        "include_playlist_content": False,
+        "include_listened_content": False,
+        "include_favorited_content": True,
+        "no_dry_run": True,
+    }
+    c.handle(**options)
+
+    with pytest.raises(prunable.DoesNotExist):
+        prunable.refresh_from_db()
+
+    for t in not_prunable:
+        t.refresh_from_db()
+
+    track = factories["music.Track"](mbid=None)
+    factories["playlists.PlaylistTrack"](track=track)
+    prunable = [factories["music.Track"](mbid=None), track]
+
+    not_prunable = [factories["music.Track"]()]
+    options = {
+        "include_playlist_content": True,
+        "include_listened_content": False,
+        "include_favorited_content": False,
+        "no_dry_run": True,
+    }
+    c.handle(**options)
+
+    for t in prunable:
+        with pytest.raises(t.DoesNotExist):
+            t.refresh_from_db()
+
+    for t in not_prunable:
+        t.refresh_from_db()

@@ -198,8 +198,8 @@ def test_can_get_pictures(name):
     cover_data = data.get_picture("cover_front", "other")
     assert cover_data["mimetype"].startswith("image/")
     assert len(cover_data["content"]) > 0
-    assert type(cover_data["content"]) == bytes
-    assert type(cover_data["description"]) == str
+    assert type(cover_data["content"]) is bytes
+    assert type(cover_data["description"]) is str


 @pytest.mark.parametrize(

@@ -245,7 +245,7 @@ def test_track_serializer(factories, to_api_date):
         "title": track.title,
         "position": track.position,
         "disc_number": track.disc_number,
-        "uploads": [serializers.serialize_upload(upload)],
+        "uploads": [serializers.TrackUploadSerializer(upload).data],
         "creation_date": to_api_date(track.creation_date),
         "listen_url": track.listen_url,
         "license": upload.track.license.code,

@@ -373,7 +373,7 @@ def test_manage_upload_action_publish(factories, mocker):
     m.assert_any_call(tasks.process_upload.delay, upload_id=draft.pk)


-def test_serialize_upload(factories):
+def test_track_upload_serializer(factories):
     upload = factories["music.Upload"]()

     expected = {

@@ -387,7 +387,7 @@ def test_serialize_upload(factories):
         "is_local": False,
     }

-    data = serializers.serialize_upload(upload)
+    data = serializers.TrackUploadSerializer(upload).data
     assert data == expected


@@ -135,9 +135,8 @@ def test_transcode_file(name, expected):

 def test_custom_s3_domain(factories, settings):
     """See #2220"""
-    settings.DEFAULT_FILE_STORAGE = "funkwhale_api.common.storage.ASCIIS3Boto3Storage"
     settings.AWS_S3_CUSTOM_DOMAIN = "my.custom.domain.tld"
+    settings.DEFAULT_FILE_STORAGE = "funkwhale_api.common.storage.ASCIIS3Boto3Storage"
     f = factories["music.Upload"].build(audio_file__filename="test.mp3")
-    print(f.audio_file.url)

     assert f.audio_file.url.startswith("https://")

@@ -24,7 +24,7 @@ def test_can_build_radio_queryset_with_fw_db(factories, mocker):
     mocker.patch("funkwhale_api.typesense.utils.resolve_recordings_to_fw_track")

     radio_qs = lb_recommendations.build_radio_queryset(
-        custom_factories.DummyPatch(), {"min_recordings": 1}, qs
+        custom_factories.DummyPatch({"min_recordings": 1}), qs
     )
     recommended_recording_mbids = [
         "87dfa566-21c3-45ed-bc42-1d345b8563fa",

@@ -46,7 +46,7 @@ def test_build_radio_queryset_without_fw_db(mocker):

     with pytest.raises(ValueError):
         lb_recommendations.build_radio_queryset(
-            custom_factories.DummyPatch(), {"min_recordings": 1}, qs
+            custom_factories.DummyPatch({"min_recordings": 1}), qs
         )

     assert resolve_recordings_to_fw_track.called_once_with(

@@ -67,7 +67,7 @@ def test_build_radio_queryset_with_redis_and_fw_db(factories, mocker):

     assert list(
         lb_recommendations.build_radio_queryset(
-            custom_factories.DummyPatch(), {"min_recordings": 1}, qs
+            custom_factories.DummyPatch({"min_recordings": 1}), qs
         )
     ) == list(Track.objects.all().filter(pk__in=[1, 2]))

@@ -84,14 +84,14 @@ def test_build_radio_queryset_with_redis_and_without_fw_db(factories, mocker):

     assert list(
         lb_recommendations.build_radio_queryset(
-            custom_factories.DummyPatch(), {"min_recordings": 1}, qs
+            custom_factories.DummyPatch({"min_recordings": 1}), qs
         )
     ) == list(Track.objects.all().filter(pk=1))


 def test_build_radio_queryset_catch_troi_ConnectTimeout(mocker):
     mocker.patch.object(
-        troi.core,
+        troi.core.Patch,
         "generate_playlist",
         side_effect=ConnectTimeout,
     )

@@ -99,18 +99,18 @@ def test_build_radio_queryset_catch_troi_ConnectTimeout(mocker):

     with pytest.raises(ValueError):
         lb_recommendations.build_radio_queryset(
-            custom_factories.DummyPatch(), {"min_recordings": 1}, qs
+            custom_factories.DummyPatch({"min_recordings": 1}), qs
         )


 def test_build_radio_queryset_catch_troi_no_candidates(mocker):
     mocker.patch.object(
-        troi.core,
+        troi.core.Patch,
         "generate_playlist",
     )
     qs = Track.objects.all()

     with pytest.raises(ValueError):
         lb_recommendations.build_radio_queryset(
-            custom_factories.DummyPatch(), {"min_recordings": 1}, qs
+            custom_factories.DummyPatch({"min_recordings": 1}), qs
         )

@@ -17,6 +17,8 @@ from funkwhale_api.subsonic import renderers
             "version": "1.16.0",
             "type": "funkwhale",
             "funkwhaleVersion": funkwhale_api.__version__,
+            "serverVersion": funkwhale_api.__version__,
+            "openSubsonic": "true",
             "hello": "world",
         },
     ),

@@ -30,6 +32,8 @@ from funkwhale_api.subsonic import renderers
             "version": "1.16.0",
             "type": "funkwhale",
             "funkwhaleVersion": funkwhale_api.__version__,
+            "serverVersion": funkwhale_api.__version__,
+            "openSubsonic": "true",
             "hello": "world",
             "error": {"code": 10, "message": "something went wrong"},
         },

@@ -41,6 +45,8 @@ from funkwhale_api.subsonic import renderers
             "version": "1.16.0",
             "type": "funkwhale",
             "funkwhaleVersion": funkwhale_api.__version__,
+            "serverVersion": funkwhale_api.__version__,
+            "openSubsonic": "true",
             "hello": "world",
             "error": {"code": 0, "message": "something went wrong"},
         },

@@ -59,6 +65,8 @@ def test_json_renderer():
             "version": "1.16.0",
             "type": "funkwhale",
             "funkwhaleVersion": funkwhale_api.__version__,
+            "serverVersion": funkwhale_api.__version__,
+            "openSubsonic": "true",
             "hello": "world",
         }
     }

@@ -71,9 +79,10 @@ def test_xml_renderer_dict_to_xml():
         "hello": "world",
         "item": [{"this": 1, "value": "text"}, {"some": "node"}],
         "list": [1, 2],
+        "some-tag": renderers.TagValue("foo"),
     }
     expected = """<?xml version="1.0" encoding="UTF-8"?>
-<key hello="world"><item this="1">text</item><item some="node" /><list>1</list><list>2</list></key>"""
+<key hello="world"><item this="1">text</item><item some="node" /><list>1</list><list>2</list><some-tag>foo</some-tag></key>"""  # noqa
     result = renderers.dict_to_xml_tree("key", payload)
     exp = ET.fromstring(expected)
     assert ET.tostring(result) == ET.tostring(exp)

@@ -81,8 +90,9 @@ def test_xml_renderer_dict_to_xml():

 def test_xml_renderer():
     payload = {"hello": "world"}
-    expected = '<?xml version="1.0" encoding="UTF-8"?>\n<subsonic-response funkwhaleVersion="{}" hello="world" status="ok" type="funkwhale" version="1.16.0" xmlns="http://subsonic.org/restapi" />'  # noqa
-    expected = expected.format(funkwhale_api.__version__).encode()
+    expected = '<?xml version="1.0" encoding="UTF-8"?>\n<subsonic-response funkwhaleVersion="{}" hello="world" openSubsonic="true" serverVersion="{}" status="ok" type="funkwhale" version="1.16.0" xmlns="http://subsonic.org/restapi" />'  # noqa
+    version = funkwhale_api.__version__
+    expected = expected.format(version, version).encode()

     renderer = renderers.SubsonicXMLRenderer()
     rendered = renderer.render(payload)

@@ -4,7 +4,7 @@ import pytest
 from django.db.models.aggregates import Count

 from funkwhale_api.music import models as music_models
-from funkwhale_api.subsonic import serializers
+from funkwhale_api.subsonic import renderers, serializers


 @pytest.mark.parametrize(

@@ -90,12 +90,14 @@ def test_get_artists_serializer(factories):
                     "name": artist1.name,
                     "albumCount": 3,
+                    "coverArt": f"ar-{artist1.id}",
                     "musicBrainzId": artist1.mbid,
                 },
                 {
                     "id": artist2.pk,
                     "name": artist2.name,
                     "albumCount": 2,
+                    "coverArt": f"ar-{artist2.id}",
                     "musicBrainzId": artist2.mbid,
                 },
             ],
         },

@@ -107,6 +109,7 @@ def test_get_artists_serializer(factories):
                     "name": artist3.name,
                     "albumCount": 0,
+                    "coverArt": f"ar-{artist3.id}",
                     "musicBrainzId": artist3.mbid,
                 }
             ],
         },

@@ -147,6 +150,24 @@ def test_get_artist_serializer(factories):
     assert serializers.GetArtistSerializer(artist).data == expected


+def test_get_artist_info_2_serializer(factories):
+    content = factories["common.Content"]()
+    artist = factories["music.Artist"](with_cover=True, description=content)
+
+    expected = {
+        "musicBrainzId": artist.mbid,
+        "mediumImageUrl": renderers.TagValue(
+            artist.attachment_cover.download_url_medium_square_crop
+        ),
+        "largeImageUrl": renderers.TagValue(
+            artist.attachment_cover.download_url_large_square_crop
+        ),
+        "biography": renderers.TagValue(artist.description.rendered),
+    }
+
+    assert serializers.GetArtistInfo2Serializer(artist).data == expected
+
+
 @pytest.mark.parametrize(
     "mimetype, extension, expected",
     [

@@ -184,6 +205,9 @@ def test_get_album_serializer(factories):
         "year": album.release_date.year,
         "coverArt": f"al-{album.id}",
         "genre": tagged_item.tag.name,
+        "genres": [{"name": tagged_item.tag.name}],
+        "mediaType": "album",
+        "musicBrainzId": album.mbid,
         "duration": 43,
         "playCount": album.tracks.aggregate(l=Count("listenings"))["l"] or 0,
         "song": [

@@ -200,13 +224,15 @@ def test_get_album_serializer(factories):
             "contentType": upload.mimetype,
             "suffix": upload.extension or "",
             "path": serializers.get_track_path(track, upload.extension),
-            "bitrate": 42,
+            "bitRate": 42,
             "duration": 43,
             "size": 44,
             "created": serializers.to_subsonic_date(track.creation_date),
             "albumId": album.pk,
             "artistId": artist.pk,
             "type": "music",
+            "mediaType": "song",
+            "musicBrainzId": track.mbid,
         }
     ],
 }

@@ -341,7 +367,7 @@ def test_channel_episode_serializer(factories):
         "genre": "Podcast",
         "size": upload.size,
         "duration": upload.duration,
-        "bitrate": upload.bitrate / 1000,
+        "bitRate": upload.bitrate / 1000,
         "contentType": upload.mimetype,
         "suffix": upload.extension,
         "status": "completed",

@@ -97,6 +97,23 @@ def test_ping(f, db, api_client):
     assert response.data == expected


+@pytest.mark.parametrize("f", ["xml", "json"])
+def test_get_open_subsonic_extensions(f, db, api_client):
+    url = reverse("api:subsonic:subsonic-get_open_subsonic_extensions")
+    response = api_client.get(url, {"f": f})
+
+    expected = {
+        "openSubsonicExtensions": [
+            {
+                "name": "formPost",
+                "versions": [1],
+            }
+        ],
+    }
+    assert response.status_code == 200
+    assert response.data == expected
+
+
 @pytest.mark.parametrize("f", ["json"])
 def test_get_artists(
     f, db, logged_in_api_client, factories, mocker, queryset_equal_queries

@@ -166,7 +183,11 @@ def test_get_artist_info2(
     artist = factories["music.Artist"](playable=True)
     playable_by = mocker.spy(music_models.ArtistQuerySet, "playable_by")

-    expected = {"artist-info2": {}}
+    expected = {
+        "artistInfo2": {
+            "musicBrainzId": artist.mbid,
+        }
+    }
     response = logged_in_api_client.get(url, {"id": artist.pk})

     assert response.status_code == 200

@@ -592,7 +613,7 @@ def test_search3(f, db, logged_in_api_client, factories):
     artist_qs = (
         music_models.Artist.objects.with_albums_count()
         .filter(pk=artist.pk)
-        .values("_albums_count", "id", "name")
+        .values("_albums_count", "id", "name", "mbid")
     )
     assert response.status_code == 200
     assert response.data == {

@@ -8,7 +8,27 @@ from funkwhale_api.moderation import tasks as moderation_tasks
 from funkwhale_api.users.models import User


-def test_can_create_user_via_api(preferences, api_client, db):
+def test_can_create_user_via_api(settings, preferences, api_client, db):
+    url = reverse("rest_register")
+    data = {
+        "username": "test1",
+        "email": "test1@test.com",
+        "password1": "thisismypassword",
+        "password2": "thisismypassword",
+    }
+    preferences["users__registration_enabled"] = True
+    settings.ACCOUNT_EMAIL_VERIFICATION = "mandatory"
+    response = api_client.post(url, data)
+    assert response.status_code == 201
+    assert response.data["detail"] == "Verification e-mail sent."
+
+    u = User.objects.get(email="test1@test.com")
+    assert u.username == "test1"
+
+
+def test_can_create_user_via_api_mail_verification_mandatory(
+    settings, preferences, api_client, db
+):
     url = reverse("rest_register")
     data = {
         "username": "test1",

@@ -18,7 +38,7 @@ def test_can_create_user_via_api(preferences, api_client, db):
     }
     preferences["users__registration_enabled"] = True
     response = api_client.post(url, data)
-    assert response.status_code == 201
+    assert response.status_code == 204

     u = User.objects.get(email="test1@test.com")
     assert u.username == "test1"

@@ -82,7 +102,7 @@ def test_can_signup_with_invitation(preferences, factories, api_client):
     }
     preferences["users__registration_enabled"] = False
     response = api_client.post(url, data)
-    assert response.status_code == 201
+    assert response.status_code == 204
     u = User.objects.get(email="test1@test.com")
     assert u.username == "test1"
     assert u.invitation == invitation

@@ -302,7 +322,7 @@ def test_creating_user_creates_actor_as_well(
     mocker.patch("funkwhale_api.users.models.create_actor", return_value=actor)
     response = api_client.post(url, data)

-    assert response.status_code == 201
+    assert response.status_code == 204

     user = User.objects.get(username="test1")

@@ -323,7 +343,7 @@ def test_creating_user_sends_confirmation_email(
     preferences["instance__name"] = "Hello world"
     response = api_client.post(url, data)

-    assert response.status_code == 201
+    assert response.status_code == 204

     confirmation_message = mailoutbox[-1]
     assert "Hello world" in confirmation_message.body

@@ -405,7 +425,7 @@ def test_signup_with_approval_enabled(
     }
     on_commit = mocker.patch("funkwhale_api.common.utils.on_commit")
     response = api_client.post(url, data, format="json")
-    assert response.status_code == 201
+    assert response.status_code == 204
     u = User.objects.get(email="test1@test.com")
     assert u.username == "test1"
     assert u.is_active is False

@@ -1,3 +0,0 @@
-
-Prohibit the creation of new users using django's `createsuperuser` command in favor of our own CLI
-entry point. Run `funkwhale-manage fw users create --superuser` instead. (#1288)

@@ -1 +0,0 @@
-Connect loglevel and debug mode (#1538)

@@ -1 +0,0 @@
-Make Artist ordering by name case insensitive

@@ -1 +0,0 @@
-Create a testing environment in production for ListenBrainz recommendation engine (troi-recommendation-playground) (#1861)

@@ -1 +0,0 @@
-Merge nginx configs for docker production and development setups (#1939)

@@ -1 +0,0 @@
-Rename CHANGELOG to CHANGELOG.md

@@ -0,0 +1 @@
+Add favorite and listening sync with Listenbrainz (#2079)

@@ -0,0 +1 @@
+Add CLI command to prune non-MBID content from the database (#2083)

@@ -1 +0,0 @@
-Only allow MusicBrainz-tagged files on a pod (#2083)

@@ -1 +0,0 @@
-Add NodeInfo 2.1 (#2085)

@@ -1 +0,0 @@
-Fixed development docker setup (#2102)

@@ -1 +0,0 @@
-Add typesense container and API client (#2104)

@@ -1 +0,0 @@
-Add a management command to generate dummy notifications for testing

@@ -1 +0,0 @@
-Cache radio queryset in Redis. The new radio track endpoint for API v2 is /api/v2/radios/sessions/{radiosessionid}/tracks (#2135)

@@ -1,2 +0,0 @@
-New management command to update uploads which were imported using --in-place and are now
-stored in S3 (#2156)

@@ -1 +0,0 @@
-Make sure embed codes generated before 1.3.0 still work

@@ -1 +0,0 @@
-Fixed embedded player crash when the API returns a relative listen URL (#2163)

@@ -1 +0,0 @@
-Cache pip packages in API docker builds (#2193)

@@ -1 +0,0 @@
-Fixed development docker setup (#2196)

@@ -1 +0,0 @@
-Fix missing og meta tags (#2208)

@@ -1 +0,0 @@
-Add custom logging functionality (#2155)