
Measuring What Makes You Unique: Difference-Aware User Modeling for Enhancing LLM Personalization


Yilun Qiu¹*, Xiaoyan Zhao²*, Yang Zhang¹, Yimeng Bai³, Wenjie Wang³, Hong Cheng², Fuli Feng³, Tat-Seng Chua³
¹National University of Singapore, ²The Chinese University of Hong Kong, ³University of Science and Technology of China
*Equal contribution, Corresponding author

This repository contains the implementation of the Difference-aware Personalization Learning (DPL) method proposed in our paper, accepted to Findings of ACL 2025.

(Figure: overview of the DPL framework)

📋 Catalogue

- ⚙️ Environment Setup
- 📚 Dataset Preprocess
- ⌛️ Quick Start
- 📊 Experimental Results
- 📖 Citation

⚙️ Environment Setup

conda create -n DPL python=3.11.11
conda activate DPL
pip install -r requirements.txt

📚 Dataset Preprocess

The dataset used in DPL is adapted from Amazon Reviews'23. We publish our processed datasets on Hugging Face. Alternatively, you can process the dataset yourself and store it locally with the following commands:

cd data/
./create.sh

⌛️ Quick Start

To run the DPL method, first fill in the required information in the .env file, then run:

./main.sh

You can modify main.sh to change the run parameters.
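The exact contents of the .env file depend on your setup; as a hypothetical sketch (the variable names below are illustrative, not taken from this repository — check the repository's .env template for the actual required fields), it might look like:

```shell
# Hypothetical .env sketch -- illustrative variable names only;
# the real keys required by DPL are listed in the repository's .env template.
OPENAI_API_KEY=your-api-key      # credential for the backbone LLM, if an API model is used
HF_TOKEN=your-hf-token           # Hugging Face token for downloading datasets/models
CUDA_VISIBLE_DEVICES=0           # GPU(s) visible to the run
```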

📊 Experimental Results

(Figure: experimental results; see the paper for full numbers)

📖 Citation

If you find our work useful, please cite our paper:

@article{qiu2025measuring,
  title={Measuring What Makes You Unique: Difference-Aware User Modeling for Enhancing LLM Personalization},
  author={Qiu, Yilun and Zhao, Xiaoyan and Zhang, Yang and Bai, Yimeng and Wang, Wenjie and Cheng, Hong and Feng, Fuli and Chua, Tat-Seng},
  journal={arXiv preprint arXiv:2503.02450},
  year={2025}
}
