"A Hitchhiker's Guide to Jailbreaking ChatGPT via Prompt Engineering."

Yi Liu et al. (2024)


DOI: 10.1145/3663530.3665021

access: closed

type: Conference or Workshop Paper

metadata version: 2024-08-02
