Bridgy Fed

Bridgy Fed connects your web site to Mastodon and the fediverse via ActivityPub, webmentions, and microformats2. Your site gets its own fediverse profile, posts and avatar and header and all. Bridgy Fed translates likes, reposts, mentions, follows, and more back and forth. See the user docs for more details.

https://fed.brid.gy/

Also see the original design blog posts.
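
Under the hood, other fediverse servers find a connected site via standard WebFinger, which fed.brid.gy serves on the site's behalf. As a rough sketch of what that looks like (example.com is a placeholder; see the user docs for the exact address format):

curl 'https://fed.brid.gy/.well-known/webfinger?resource=acct:example.com@example.com'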

License: This project is placed in the public domain.

Development

Pull requests are welcome! Feel free to ping me in #indieweb-dev with any questions.

First, fork and clone this repo. Then, install the Google Cloud SDK and run gcloud components install beta cloud-datastore-emulator to install the datastore emulator. Once you have them, set up your environment by running these commands in the repo root directory:

gcloud config set project bridgy-federated
python3 -m venv local
source local/bin/activate
pip install -r requirements.txt

Now, run the tests to check that everything is set up ok:

gcloud beta emulators datastore start --use-firestore-in-datastore-mode --no-store-on-disk --host-port=localhost:8089 --quiet < /dev/null >& /dev/null &
python3 -m unittest discover
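
To iterate on a single test file rather than the whole suite (with the datastore emulator still running), you can narrow discovery with a filename pattern; test_webmention.py here is just an illustrative example:

python3 -m unittest discover -p test_webmention.py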

Finally, run this in the repo root directory to start the web app locally:

GAE_ENV=localdev FLASK_ENV=development flask run -p 8080
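
The app will then be running locally at http://localhost:8080/.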

If you send a pull request, please include (or update) a test for the new functionality!

If you hit an error during setup, check out the oauth-dropins Troubleshooting/FAQ section.

You may need to change granary, oauth-dropins, mf2util, or other dependencies as well as Bridgy Fed. To do that, clone their repos locally, then install them in "source" mode with e.g.:

pip uninstall -y granary
pip install -e <path to granary>

To deploy to the production instance on App Engine - if @snarfed has added you as an owner - run:

gcloud -q beta app deploy --no-cache --project bridgy-federated *.yaml

Stats

I occasionally generate stats and graphs of usage and growth via BigQuery, like I do with Bridgy. Here's how.

  1. Export the full datastore to Google Cloud Storage. Include all entities except MagicKey. Check to see if any new kinds have been added since the last time this command was run.

    gcloud datastore export --async gs://bridgy-federated.appspot.com/stats/ --kinds Follower,Response
    

    Note that --kinds is required. From the export docs:

    Data exported without specifying an entity filter cannot be loaded into BigQuery.

  2. Wait for it to be done with gcloud datastore operations list | grep done.

  3. Import it into BigQuery:

    for kind in Follower Response; do
      bq load --replace --nosync --source_format=DATASTORE_BACKUP datastore.$kind gs://bridgy-federated.appspot.com/stats/all_namespaces/kind_$kind/all_namespaces_kind_$kind.export_metadata
    done
    
  4. Check the jobs with bq ls -j, then wait for them with bq wait.
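
    Once the jobs finish, a quick way to sanity check the import before running the full query is to count rows in one of the new tables, e.g. (a sketch; the table names match the load step above):

    bq query --use_legacy_sql=false 'SELECT COUNT(*) FROM datastore.Follower'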

  5. Run the full stats BigQuery query. Download the results as CSV.

  6. Open the stats spreadsheet. Import the CSV, replacing the data sheet.

  7. Check out the graphs! Save full size images with OS or browser screenshots, thumbnails with the Download Chart button.