


20th ALTA 2022: Adelaide, Australia
- Pradeesh Parameswaran, Jennifer Biggs, David M. W. Powers:
Proceedings of the 20th Annual Workshop of the Australasian Language Technology Association, ALTA 2022, Adelaide, Australia, December 14-16, 2022. Association for Computational Linguistics 2022
Australasian Language Technology Association Workshop (2022)
- Steven Coats:
The Corpus of Australian and New Zealand Spoken English: A new resource of naturalistic speech transcripts. 1-5
- Manny Rayner, Belinda Chiera, Cathy Chua:
Using public domain resources and off-the-shelf tools to produce high-quality multimedia texts. 6-15
- Aleney Khoo, Maciej Rybinski, Sarvnaz Karimi, Adam G. Dunn:
The Role of Context in Vaccine Stance Prediction for Twitter Users. 16-21
- Fatemeh Shiri, Tongtong Wu, Yuan-Fang Li, Gholamreza Haffari:
TCG-Event: Effective Task Conditioning for Generation-based Event Extraction. 22-30
- Xiao-Yu Guo, Yuan-Fang Li, Gholamreza Haffari:
Complex Reading Comprehension Through Question Decomposition. 31-40
- Pradeesh Parameswaran, Andrew Trotman, Veronica Liesaputra, David M. Eyers:
Using Aspect-Based Sentiment Analysis to Classify Attitude-bearing Words. 41-51
- Zineddine Tighidet, Nicolas Ballier:
Fine-tuning a Subtle Parsing Distinction Using a Probabilistic Decision Tree: the Case of Postnominal "that" in Noun Complement Clauses vs. Relative Clauses. 52-61
- Ho Hung Lim, Tianyuan Cai, John S. Y. Lee, Meichun Liu:
Robustness of Hybrid Models in Cross-domain Readability Assessment. 62-67
- Rolf Schwitter:
Specifying Optimisation Problems for Declarative Programs in Precise Natural Language. 68-72
- Jinghui Liu, Daniel Capurro, Anthony N. Nguyen, Karin Verspoor:
Improving Text-based Early Prediction by Distillation from Privileged Time-Series Text. 73-83
- Junaid Rashid, Jungeun Kim, Usman Naseem, Amir Hussain:
A DistilBERTopic Model for Short Text Documents. 84-89
- Bryan Gregorius, Takeshi Okadome:
Generating Code-Switched Text from Monolingual Text with Dependency Tree. 90-97
- Susan Brown, Shunichi Ishihara:
Stability of Forensic Text Comparison System. 98-106
- Anurag Reddy Muthyala, Vikram Pudi:
Academic Curriculum Generation using Wikipedia for External Knowledge. 107-114
- Jiayi Dai, Mi-Young Kim, Randy Goebel:
Interactive Rationale Extraction for Text Classification. 115-121
- Rui Xing, Shraey Bhatia, Timothy Baldwin, Jey Han Lau:
Automatic Explanation Generation For Climate Science Claims. 122-129
- Yishan Huang, Gwendolyn Hyslop:
Zhangzhou Implosives and Their Variations. 122-129
- Gisela Vallejo, Timothy Baldwin, Lea Frermann:
Evaluating the Examiner: The Perils of Pearson Correlation for Validating Text Similarity Metrics. 130-138
- Crispin Almodovar, Fariza Sabrina, Sarvnaz Karimi, Salahuddin A. Azad:
Can Language Models Help in System Security? Investigating Log Anomaly Detection using BERT. 139-147
- Lucas C. F. Domingos, Paulo Santos:
A Semantics of Spatial Expressions for interacting with unmanned aerial vehicles. 148-155
- Abdul Aziz, Md. Akram Hossain, Abu Nowshed Chy:
Enhancing the DeBERTa Transformers Model for Classifying Sentences from Biomedical Abstracts. 156-160
- David Brock, Ali Y. Khan, Tam Doan, Alicia Lin, Yifan Guo, Paul Tarau:
Textstar: a Fast and Lightweight Graph-Based Algorithm for Extractive Summarization and Keyphrase Extraction. 161-169
- Thanh Tran, Maëlic Neau, Paulo Santos, David Powers:
Contrastive Visual and Language Learning for Visual Relationship Detection. 170-177
- Diego Mollá:
Overview of the 2022 ALTA Shared task: PIBOSO sentence classification, 10 years later. 178-182
- Shunichi Ishihara, Satoru Tsuge, Mitsuyuki Inaba, Wataru Zaitsu:
Estimating the Strength of Authorship Evidence with a Deep-Learning-Based Approach. 183-187
- Necva Bölücü, Pinar Uskaner Hepsag:
Automatic Classification of Evidence Based Medicine Using Transformers. 188-192
- Biaoyan Fang, Fajri Koto:
Context-Aware Sentence Classification in Evidence-Based Medicine. 193-198
