This is an unofficial PyTorch implementation of the paper: **Recommender Systems with Generative Retrieval** (TIGER).
Step 1: Decompress the downloaded 5-core reviews and metadata from Amazon Review 2014, which come as files such as reviews_Beauty_5.json.gz and meta_Beauty.json.gz. Use the command provided in TIGER/data/process.ipynb to perform the decompression.
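Both files are gzipped, line-delimited records. If you prefer to stream them directly instead of decompressing first, a minimal sketch (the `ast.literal_eval` fallback is an assumption about the loosely formatted 2014 metadata dumps, not part of this repo):

```python
import ast
import gzip
import json

def parse(path):
    """Stream records from a gzipped Amazon Review 2014 dump, one per line."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            try:
                yield json.loads(line)
            except json.JSONDecodeError:
                # Some 2014 metadata lines are Python-literal dicts, not strict JSON.
                yield ast.literal_eval(line)

for review in parse("reviews_Beauty_5.json.gz"):
    print(review["reviewerID"], review["asin"])
    break
```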
Step 2: Use main.py in the TIGER/rqvae folder to train an RQ-VAE model on the semantic embeddings obtained in Step 1.
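For orientation, here is a minimal sketch of the residual quantization an RQ-VAE performs, using a straight-through estimator and the standard VQ-VAE losses; the dimensions, number of levels, and loss weights are illustrative assumptions, not the settings in `main.py`:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RQVAE(nn.Module):
    """Sketch of a residual-quantized autoencoder over item embeddings."""

    def __init__(self, input_dim=768, latent_dim=32, num_levels=3, codebook_size=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, 512), nn.ReLU(),
                                     nn.Linear(512, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 512), nn.ReLU(),
                                     nn.Linear(512, input_dim))
        self.codebooks = nn.ModuleList(
            nn.Embedding(codebook_size, latent_dim) for _ in range(num_levels))

    def quantize(self, z):
        residual, quantized, codes = z, torch.zeros_like(z), []
        for codebook in self.codebooks:
            # Pick the nearest codebook entry for the current residual,
            # then quantize what remains at the next level.
            idx = torch.cdist(residual, codebook.weight).argmin(dim=-1)
            selected = codebook(idx)
            quantized = quantized + selected
            residual = residual - selected
            codes.append(idx)
        return quantized, torch.stack(codes, dim=-1)

    def forward(self, x):
        z = self.encoder(x)
        z_q, codes = self.quantize(z)
        # Straight-through estimator: gradients bypass the argmin.
        recon = self.decoder(z + (z_q - z).detach())
        loss = (F.mse_loss(recon, x)
                + F.mse_loss(z_q, z.detach())          # pull codebooks toward encoder outputs
                + 0.25 * F.mse_loss(z, z_q.detach()))  # commitment: encoder toward codebooks
        return loss, codes

model = RQVAE()
emb = torch.randn(64, 768)   # a batch of semantic embeddings
loss, codes = model(emb)     # codes: (64, num_levels) discrete IDs
loss.backward()
```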
Step 3: Use generate_code.py in the TIGER/rqvae folder to select the best checkpoint and generate discrete codes for the semantic embeddings from Step 1; a padding position is appended at the end of each code to resolve duplicates.
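The deduplication step amounts to appending one extra position that distinguishes items whose RQ-VAE codes collide. A minimal sketch of that idea (function and variable names are hypothetical):

```python
from collections import defaultdict

def deduplicate(item_codes):
    """Append a disambiguating index as the final code position: items that
    share the same code tuple receive 0, 1, 2, ... as an extra digit."""
    seen = defaultdict(int)
    unique_codes = {}
    for item_id, codes in item_codes.items():
        key = tuple(codes)
        unique_codes[item_id] = key + (seen[key],)
        seen[key] += 1
    return unique_codes

codes = deduplicate({"item_a": [12, 7, 200], "item_b": [12, 7, 200]})
# item_a -> (12, 7, 200, 0), item_b -> (12, 7, 200, 1)
```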
Step 4: Use main.py in the TIGER/model folder to train a T5 encoder-decoder model on the semantic codes.
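As a sketch of what such training can look like with Hugging Face `transformers`: the model's vocabulary is the set of semantic-code tokens rather than natural language, the input is a user's flattened interaction history, and the target is the code of the next item. The config values loosely follow the small Transformer described in the TIGER paper, but treat them, and the code layout, as assumptions rather than this repo's `main.py`:

```python
import torch
from transformers import T5Config, T5ForConditionalGeneration

vocab_size = 4 * 256 + 2  # 4 code levels x 256 entries, plus pad/eos tokens
config = T5Config(vocab_size=vocab_size, d_model=128, d_ff=1024,
                  num_layers=4, num_heads=6, d_kv=64,
                  pad_token_id=0, eos_token_id=1, decoder_start_token_id=0)
model = T5ForConditionalGeneration(config)  # trained from scratch, no pretrained weights

# input_ids: flattened semantic codes of the interaction history;
# labels: the 4 code positions of the next (target) item.
input_ids = torch.randint(2, vocab_size, (8, 20))
labels = torch.randint(2, vocab_size, (8, 4))

loss = model(input_ids=input_ids, labels=labels).loss
loss.backward()
```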
| Metric | Beauty (Ours) | Beauty (Paper) | Sports (Ours) | Sports (Paper) | Toys (Ours) | Toys (Paper) |
| --- | --- | --- | --- | --- | --- | --- |
| Recall@5 | 0.0392 | 0.0454 | 0.0233 | 0.0264 | 0.0396 | 0.0521 |
| Recall@10 | 0.0594 | 0.0648 | 0.0379 | 0.0400 | 0.0577 | 0.0712 |
| NDCG@5 | 0.0257 | 0.0321 | 0.0150 | 0.0181 | 0.0270 | 0.0371 |
| NDCG@10 | 0.0321 | 0.0384 | 0.0197 | 0.0225 | 0.0328 | 0.0432 |
References:
- Recommender Systems with Generative Retrieval (TIGER)
- Adapting Large Language Models by Integrating Collaborative Semantics for Recommendation (LC-Rec)