"DxHF: Providing High-Quality Human Feedback for LLM Alignment with ..."

Danqing Shi et al. (2025)

DOI: 10.1145/3746059.3747600

access: closed

type: Conference or Workshop Paper

metadata version: 2025-10-15