<p>While you're thinking about what to submit to the Call for Problems for the <a href="https://sigmoid.social/tags/ALTA2024" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ALTA2024</span></a> Shared Task (link below), we're sharing with you the 2nd-place winner of the <a href="https://sigmoid.social/tags/ALTA2023" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ALTA2023</span></a> Shared Task, where participants distinguished between <a href="https://sigmoid.social/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a>-generated and human-generated text. </p><p>Here, Yunhao Fang of <a href="https://sigmoid.social/tags/UniMelb" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>UniMelb</span></a> used <a href="https://sigmoid.social/tags/fineTuning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>fineTuning</span></a> and <a href="https://sigmoid.social/tags/EnsembleModels" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EnsembleModels</span></a> to achieve 99% accuracy.</p><p>🔗 Call for Problems for Shared Task: <a href="https://alta2024.alta.asn.au/calls" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">alta2024.alta.asn.au/calls</span><span class="invisible"></span></a></p><p>🔗 Paper: <a href="https://aclanthology.org/2023.alta-1.19.pdf" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">aclanthology.org/2023.alta-1.1</span><span class="invisible">9.pdf</span></a></p>