TIGER

This is an unofficial PyTorch implementation of the paper:

Recommender Systems with Generative Retrieval

Model Architecture

[Figure: overview of the TIGER architecture]

Data Preprocessing

Step 1: Decompress the downloaded 5-core reviews and metadata from Amazon Review 2014, which come as reviews_Beauty_5.json.gz and meta_Beauty.json.gz, using the commands provided in TIGER/data/process.ipynb.
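The gzipped review and metadata files are line-delimited JSON, so they can also be streamed directly without fully decompressing them first. A minimal sketch (field names follow the Amazon Review 2014 format; the sample file written here is a hypothetical stand-in for the real download):

```python
import gzip
import json

def parse_gz(path):
    """Yield one record per line from an Amazon Review 2014 *.json.gz file.
    The 5-core review files contain one JSON object per line."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            yield json.loads(line)

# Hypothetical demo: write a tiny gzipped sample and read it back.
sample = {"reviewerID": "A1", "asin": "B001", "overall": 5.0}
with gzip.open("reviews_sample.json.gz", "wt", encoding="utf-8") as f:
    f.write(json.dumps(sample) + "\n")

records = list(parse_gz("reviews_sample.json.gz"))
```

Streaming this way keeps memory flat even for the larger metadata files.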

Step 2: Use main.py in the TIGER/rqvae folder to train an RQ-VAE model on the semantic embeddings obtained in Step 1.
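The RQ-VAE learns its codebooks jointly with an encoder and decoder, but the quantization step at its core is residual: each codebook level encodes whatever the previous levels left unexplained. A pure-Python sketch of that step with toy codebooks (not the trained model or the repo's API):

```python
def residual_quantize(x, codebooks):
    """Quantize embedding x through a stack of codebooks (residual quantization).
    x: list of floats; codebooks: list of levels, each a list of codewords.
    Returns the per-level code indices and the quantized reconstruction."""
    residual = list(x)
    codes = []
    recon = [0.0] * len(x)
    for cb in codebooks:
        # Pick the nearest codeword at this level by squared Euclidean distance.
        idx = min(range(len(cb)),
                  key=lambda k: sum((r - c) ** 2 for r, c in zip(residual, cb[k])))
        codes.append(idx)
        # Add the codeword to the reconstruction, subtract it from the residual.
        recon = [a + b for a, b in zip(recon, cb[idx])]
        residual = [r - c for r, c in zip(residual, cb[idx])]
    return codes, recon

# Toy example: two levels, two codewords each.
codebooks = [
    [[1.0, 0.0], [0.0, 1.0]],   # level-1 codewords
    [[0.5, 0.0], [0.0, 0.5]],   # level-2 codewords
]
codes, recon = residual_quantize([1.0, 0.5], codebooks)
```

The tuple of indices in `codes` is the item's discrete semantic code.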

Step 3: Use generate_code.py in the TIGER/rqvae folder to pick the best checkpoint, generate discrete codes for the semantic embeddings from Step 1, and append an extra code at the last position to resolve duplicate codes.
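Because the RQ-VAE can map distinct items to the same code tuple, an extra position is appended so that every item's semantic ID is unique. One way to sketch that disambiguation (function name and layout are illustrative, not the repo's implementation):

```python
from collections import defaultdict

def disambiguate(item_codes):
    """Append a final position so items sharing the same RQ code tuple
    get distinct semantic IDs.
    item_codes: {item_id: (c1, c2, c3)} -> {item_id: (c1, c2, c3, k)}."""
    seen = defaultdict(int)  # how many items already claimed each code tuple
    out = {}
    for item, codes in item_codes.items():
        k = seen[codes]
        seen[codes] += 1
        out[item] = codes + (k,)
    return out

# itemA and itemB collide on (1, 2, 3); the appended position separates them.
ids = disambiguate({"itemA": (1, 2, 3), "itemB": (1, 2, 3), "itemC": (4, 5, 6)})
```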

Train a T5 encoder-decoder model

Use main.py in the TIGER/model folder to train a T5 encoder-decoder model on the semantic codes.
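Training casts next-item recommendation as sequence-to-sequence generation: the encoder consumes the semantic IDs of a user's interaction history, and the decoder is trained to emit the semantic ID of the next item. A sketch of how such training pairs can be built (the `<pos_code>` token format is an assumption for illustration, not the repo's exact vocabulary):

```python
def make_example(history, target):
    """Build one generative-retrieval training pair.
    history: list of semantic-ID tuples, oldest first; target: next item's tuple.
    Returns (encoder input tokens, decoder target tokens)."""
    # Flatten the history: one token per code position of each item.
    src = [f"<{pos}_{c}>" for item in history for pos, c in enumerate(item)]
    # The target is the next item's semantic ID, one token per position.
    tgt = [f"<{pos}_{c}>" for pos, c in enumerate(target)]
    return src, tgt

src, tgt = make_example([(1, 2), (3, 4)], (5, 6))
```

At inference time, beam search over these code tokens directly generates candidate items.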

Experimental Results

| Metric    | Beauty (Ours) | Beauty (Paper) | Sports (Ours) | Sports (Paper) | Toys (Ours) | Toys (Paper) |
|-----------|---------------|----------------|---------------|----------------|-------------|--------------|
| Recall@5  | 0.0392        | 0.0454         | 0.0233        | 0.0264         | 0.0396      | 0.0521       |
| Recall@10 | 0.0594        | 0.0648         | 0.0379        | 0.0400         | 0.0577      | 0.0712       |
| NDCG@5    | 0.0257        | 0.0321         | 0.0150        | 0.0181         | 0.0270      | 0.0371       |
| NDCG@10   | 0.0321        | 0.0384         | 0.0197        | 0.0225         | 0.0328      | 0.0432       |
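With a single ground-truth next item per test case, these metrics reduce to simple hit checks. A minimal sketch, assuming one relevant item per user as in the usual leave-one-out evaluation:

```python
import math

def recall_at_k(ranked, target, k):
    """Recall@K with one relevant item: 1 if the target is in the top K."""
    return 1.0 if target in ranked[:k] else 0.0

def ndcg_at_k(ranked, target, k):
    """NDCG@K with one relevant item: 1/log2(rank+1) if the target is in
    the top K, else 0 (the ideal DCG is 1, so no normalization is needed)."""
    if target in ranked[:k]:
        rank = ranked.index(target) + 1  # 1-based rank of the hit
        return 1.0 / math.log2(rank + 1)
    return 0.0
```

The reported numbers are these values averaged over all test users.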

References

Recommender Systems with Generative Retrieval

Adapting Large Language Models by Integrating Collaborative Semantics for Recommendation
