author    | Anthony Wang | 2022-02-27 18:00:22 -0600
committer | Anthony Wang | 2022-02-27 18:00:22 -0600
commit    | 7bbd56155b0f2118971da9bd1d0737b1bc33a3af (patch)
tree      | c5cf137b3befbe075af1022e2029c72c11ac5d31
parent    | 1a13ba99092e55d5e236be46a7bf7f20ea57f599 (diff)
WIP: Add Pleroma and Misskey support
-rw-r--r-- | README.md | 8
-rw-r--r-- | bot.py    | 26
2 files changed, 24 insertions, 10 deletions
```diff
diff --git a/README.md b/README.md
@@ -2,15 +2,17 @@ Fediverse ebooks bot using neural networks
 
+
 ## Usage
 
-First, install Python dependencies with `pip install psycopg2 torch transformers datasets Mastodon.py`. Currently only Mastodon with PostgreSQL is supported, but it should be easy to add support for other platforms and databases.
+First, install Python dependencies using your distro's package manager or `pip`: [psycopg2](https://www.psycopg.org), [torch](https://pytorch.org/), [transformers](https://huggingface.co/docs/transformers/index), and [datasets](https://huggingface.co/docs/datasets/). Additionally, for Mastodon and Pleroma, install [Mastodon.py](https://mastodonpy.readthedocs.io/en/stable/), and for Misskey, install [Misskey.py](https://misskeypy.readthedocs.io/ja/latest/). If your database or platform isn't supported, don't worry! It's easy to add support for other platforms and databases, and contributions are welcome!
 
-Generate the training data from the Mastodon database using `python data.py -d 'dbname=test user=postgres password=secret'`. You can skip this step if you have training data from another source.
+Now generate the training data from your fediverse server's database using `python data.py -d 'dbname=test user=postgres password=secret host=localhost port=5432'`. You can skip this step if you have collected training data from another source.
 
 Next, train the network with `python train.py`, which may take several hours. It's a lot faster when using a GPU. If you need advanced features when training, you can also train using [run_clm.py](https://github.com/huggingface/transformers/blob/master/examples/pytorch/language-modeling/run_clm.py).
 
-Finally, create a Mastodon application for your bot account and run the bot with `python bot.py -i mastodon.instance -t access_token -n 'Initial text'`. You can create a cron job to run this periodically. Enjoy!
+Finally, create an application for your bot account and generate an access token. Run the bot with `python bot.py -b server_type -i fediverse.instance -t access_token`. You can omit `-b server_type` for Mastodon and Pleroma. To run the bot periodically, create a cron job. Enjoy!
+
 
 ## Resources
diff --git a/bot.py b/bot.py
@@ -1,11 +1,12 @@
 from argparse import ArgumentParser
 from random import randint, choice
 
-from mastodon import Mastodon
 from transformers import AutoTokenizer, AutoModelForCausalLM
 
 
 parser = ArgumentParser()
+parser.add_argument('-b', '--backend', choices=['mastodon', 'misskey'], default='mastodon',
+                    help='fediverse server type')
 parser.add_argument('-i', '--instance', help='Mastodon instance hosting the bot')
 parser.add_argument('-t', '--token', help='Mastodon application access token')
 parser.add_argument('-n', '--input', help='initial input text')
@@ -76,12 +77,23 @@ output = tokenizer.decode(model.generate(
 print(output)
 
 
-# Post it to Mastodon
-mastodon = Mastodon(
-    access_token=args.token,
-    api_base_url=args.instance
-)
+# Prepare the post
 post = output.split('\n')[0]
 if len(post) < 200:
     post = output.split('\n')[0] + '\n' + output.split('\n')[1]
-mastodon.status_post(post[:500])
+
+
+# Post it!
+if args.backend == 'mastodon':
+    from mastodon import Mastodon
+
+    mastodon = Mastodon(
+        access_token=args.token,
+        api_base_url=args.instance
+    )
+    mastodon.status_post(post[:500])
+elif args.backend == 'misskey':
+    from Misskey import Misskey
+
+    misskey = Misskey(args.instance, i=args.token)
+    misskey.notes_create(post[:500])
```
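For reference, the backend dispatch this patch adds to bot.py can be tried on its own. Below is a minimal, hypothetical standalone sketch (not part of the repository) that reuses the same Mastodon.py and Misskey.py calls shown in the diff; the post text is a placeholder where the real bot substitutes the model's generated output.

```python
from argparse import ArgumentParser

parser = ArgumentParser()
parser.add_argument('-b', '--backend', choices=['mastodon', 'misskey'], default='mastodon',
                    help='fediverse server type')
parser.add_argument('-i', '--instance', help='fediverse instance hosting the bot')
parser.add_argument('-t', '--token', help='application access token')
args = parser.parse_args()

# Placeholder text; the real bot uses the first line or two of the model output here
post = 'Hello, fediverse!'

if args.backend == 'mastodon':
    # Mastodon.py also covers Pleroma, which serves the Mastodon client API
    from mastodon import Mastodon

    mastodon = Mastodon(access_token=args.token, api_base_url=args.instance)
    mastodon.status_post(post[:500])  # Mastodon instances cap posts at 500 characters by default
elif args.backend == 'misskey':
    from Misskey import Misskey

    misskey = Misskey(args.instance, i=args.token)
    misskey.notes_create(post[:500])
```

Run as, say, `python sketch.py -b misskey -i misskey.example -t access_token` (names and instance are illustrative); with no `-b`, it falls back to Mastodon, matching the README's note that `-b server_type` can be omitted for Mastodon and Pleroma.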