BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//ZurichNLP - ECPv6.0.12//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://zurich-nlp.ch
X-WR-CALDESC:Events for ZurichNLP
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Zurich
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20230326T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20231029T030000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Zurich:20230426T164500
DTEND;TZID=Europe/Zurich:20230426T164500
DTSTAMP:20260405T152115Z
CREATED:20230424T115953Z
LAST-MODIFIED:20230424T120432Z
UID:717-1682527500-1682527500@zurich-nlp.ch
SUMMARY:Modular and Parameter-Efficient Fine-Tuning for NLP Models
DESCRIPTION:Summary: State-of-the-art language models in NLP perform best when fine-tuned\, even on small datasets\, but due to their increasing size\, fine-tuning and downstream usage have become extremely compute-intensive. Being able to efficiently and effectively fine-tune the largest pre-trained models is thus key to reaping the benefits of the latest advances in NLP. In this tutorial\, we provide a comprehensive overview of parameter-efficient fine-tuning methods. We highlight their similarities and differences by presenting them in a unified view. We explore the benefits and usage scenarios of a neglected property of such parameter-efficient models (modularity)\, such as the composition of modules to deal with previously unseen data conditions. We finally highlight how both properties\, parameter efficiency and modularity\, can be useful in the real-world setting of adapting pre-trained models to under-represented languages and domains with scarce annotated data for several downstream applications. \nSpeaker: Jonas Pfeiffer is a Research Scientist at Google Research. He is interested in modular representation learning in multi-task\, multilingual\, and multi-modal contexts\, and in low-resource scenarios. He worked on his PhD at the Technical University of Darmstadt\, was a Visiting Researcher at New York University\, and was a Research Scientist Intern at Meta Research. Jonas received the IBM PhD Research Fellowship award for 2021/2022. He has given numerous invited talks in academia\, industry\, and at ML summer schools\, and has co-organized multiple workshops on multilinguality and multimodality. \nAlso available online: https://uzh.zoom.us/j/62400025916?pwd=SElUS2QzOWVuRi9KdVREK2xIQUk3dz09 \nAdditional Information: Guest lecture for the seminar course “Multimodal Multilingual Natural Language Processing”. We will have a small Apero after the talk in the CL coffee lounge\, with drinks and some snacks.
URL:https://zurich-nlp.ch/event/modular-and-parameter-efficient-fine-tuning-for-nlp-models/
LOCATION:University of Zurich\, AND Building (Room 2-02)\, Andreasstrasse 15\, Zurich\, 8050\, Switzerland
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Zurich:20230321T180000
DTEND;TZID=Europe/Zurich:20230321T200000
DTSTAMP:20260405T152115Z
CREATED:20230416T155124Z
LAST-MODIFIED:20240115T132806Z
UID:690-1679421600-1679428800@zurich-nlp.ch
SUMMARY:ZurichNLP Meetup #1
DESCRIPTION:We are holding our kick-off event on March 21st from 6:00 to 8:00 PM at Binzmühlestrasse 13\, 4th Floor\, 8050 Zürich. \nWe are very excited to be hosting two speakers: Luca Beurer-Kellner from the SRI Lab at ETH Zurich and Basil Mustafa from Google Brain Zurich. Looking forward to seeing you there!
URL:https://zurich-nlp.ch/event/zurich-nlp-meetup-1/
LOCATION:ETH Zurich\, Binzmühlestrasse 13\, Zurich\, 8050\, Switzerland
END:VEVENT
END:VCALENDAR