author     Anthony Wang  2022-02-22 16:17:38 -0600
committer  Anthony Wang  2022-02-22 16:17:38 -0600
commit     a4dc9c238bad031da5fe478267c6e87a6e3dffa4 (patch)
tree       f5a775988d5debb4de8cba30ad3e2fac03d93c2e
parent     d191b6204f6179cdec06d1eec706c762d3ed3dce (diff)

Add some transformers documentation to README

-rw-r--r--  README.md  8
1 file changed, 6 insertions, 2 deletions
diff --git a/README.md b/README.md
index ae692e6..55a3092 100644
--- a/README.md
+++ b/README.md
@@ -1,10 +1,10 @@
# ebooks
-Fediverse ebooks bot using LSTM neural networks in PyTorch
+Fediverse ebooks bot using neural networks
## Usage
-First, make sure you have `psycopg2`, `Mastodon.py`, and PyTorch installed. Currently only Mastodon with PostgreSQL is supported, but it should be easy to add support for other platforms and databases.
+First, install Python dependencies with `pip install psycopg2 torch transformers datasets Mastodon.py`. Currently only Mastodon with PostgreSQL is supported, but it should be easy to add support for other platforms and databases.
Generate the training data from the Mastodon database using `python data.py -d 'dbname=test user=postgres password=secret'`.
@@ -17,3 +17,7 @@ Finally, create a Mastodon application for your bot account and run the bot with
- https://closeheat.com/blog/pytorch-lstm-text-generation-tutorial
- https://trungtran.io/2019/02/08/text-generation-with-pytorch/
+
+- https://huggingface.co/docs/transformers/training
+
+- https://huggingface.co/blog/how-to-generate
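
The two Hugging Face links added above cover the pieces the bot needs beyond the new `pip install` line: fine-tuning a pretrained model on the dump produced by `data.py`, then sampling new posts from it. Below is a minimal sketch of both steps, not the repository's actual training script; the base model `distilgpt2` and the dump file name `data.txt` are placeholder assumptions, since neither appears in this commit.

```python
# Sketch only: fine-tune a small causal LM on the dumped posts, then sample from it.
# "distilgpt2" and "data.txt" are placeholder choices, not this repository's actual ones.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

# One post per line in the dump; drop empty lines and tokenize
dataset = load_dataset("text", data_files={"train": "data.txt"})["train"]
dataset = dataset.filter(lambda row: len(row["text"].strip()) > 0)
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

# Causal-LM fine-tuning along the lines of the linked training tutorial
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="model", num_train_epochs=1),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

# Top-p sampling as described in the linked "how-to-generate" post
output = model.generate(do_sample=True, top_p=0.95, max_length=50,
                        pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Posting the sampled text to the Fediverse account is a separate step handled by the bot itself and is not touched by this commit.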