BEGIN:VCALENDAR
VERSION:2.0
PRODID:-// - ECPv6.0.12//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://zurich-nlp.ch
X-WR-CALDESC:Events for 
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Paris
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20230326T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20231029T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VTIMEZONE
TZID:Europe/Helsinki
BEGIN:DAYLIGHT
TZOFFSETFROM:+0200
TZOFFSETTO:+0300
TZNAME:EEST
DTSTART:20230326T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0300
TZOFFSETTO:+0200
TZNAME:EET
DTSTART:20231029T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20231207T170000
DTEND;TZID=Europe/Paris:20231207T190000
DTSTAMP:20260404T114848Z
CREATED:20231123T182717Z
LAST-MODIFIED:20240115T132659Z
UID:1239-1701968400-1701975600@zurich-nlp.ch
SUMMARY:ZurichNLP Meetup #6
DESCRIPTION:Meetup #6 is coming soon and will be our best one yet! It’s our pleasure to announce the following speakers: \n\nLucas Beyer (Google DeepMind/Brain)\nAna Kotarcic (University of Zurich)\n\nRSVP soon\, as spots are limited; the last few times we maxed out capacity. Hope to see you there!
URL:https://zurich-nlp.ch/event/zurich-nlp-meetup-6/
LOCATION:OAT ETH Zürich\, Andreasstrasse 5 (14th floor)\, Zurich\, 8050
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20231019T180000
DTEND;TZID=Europe/Paris:20231019T200000
DTSTAMP:20260404T114848Z
CREATED:20230925T170406Z
LAST-MODIFIED:20240115T132710Z
UID:1099-1697738400-1697745600@zurich-nlp.ch
SUMMARY:ZurichNLP Meetup #5
DESCRIPTION:Meetup #5 is fast approaching and will be our best one yet! It’s our pleasure to announce the following speakers: \n\nJonas Pfeiffer\, Research Scientist at DeepMind\, presenting “Modular Deep Learning”\nEiso Kant\, CTO and Co-founder of Poolside\, presenting “Roadmap for AI progress by learning from code”\n\nRSVP soon\, as spots are limited; last time we maxed out capacity. Hope to see you there!
URL:https://zurich-nlp.ch/event/zurich-nlp-meetup-5/
LOCATION:OAT ETH Zürich\, Andreasstrasse 5 (19th floor)\, Zurich\, 8050
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20231002T161500
DTEND;TZID=Europe/Paris:20231002T171500
DTSTAMP:20260404T114848Z
CREATED:20230925T165541Z
LAST-MODIFIED:20230925T165936Z
UID:1092-1696263300-1696266900@zurich-nlp.ch
SUMMARY:How Does the Brain Create Language?
DESCRIPTION:ETH Zurich Distinguished Computer Science Colloquium\nChristos Papadimitriou\, Columbia University\nHost: Emo Welzl\nABSTRACT:\nThere is little doubt that cognitive phenomena are the result of neural activity\, and yet there has been slow progress toward articulating a computational theory of how exactly this happens. I will discuss a simplified mathematical model of the brain\, which we call NEMO\, involving brain areas\, spiking neurons\, random synapses\, local inhibition\, Hebbian plasticity\, and long-range interneurons – crucially\, there is no backpropagation in NEMO. Emergent behaviors of the resulting dynamical system – established both analytically and through simulations – include assemblies of neurons\, sequence memorization\, one-shot learning\, and universal computation. NEMO is also a software-based neuromorphic system that can be simulated efficiently at the scale of tens of millions of neurons\, emulating certain high-level cognitive phenomena such as planning and parsing of natural language. I will describe current work aiming to create through NEMO a neuromorphic language organ: a neural tabula rasa which\, on input consisting of a modest amount of grounded language\, is capable of language acquisition: learning lexicon\, syntax\, semantics\, comprehension\, and generation. Finally\, on the plane of scientific methodology\, I will argue that experimenting with such brain-like devices\, devoid of backpropagation\, can reveal novel avenues to learning and may end up advancing AI.\nBIOGRAPHY:\nChristos Harilaos Papadimitriou is the Donovan Family Professor of Computer Science at Columbia University. Before joining Columbia in 2017\, he was a professor at UC Berkeley for 22 years\, and before that he taught at Harvard\, MIT\, NTU Athens\, Stanford\, and UCSD. He has written five textbooks and many articles on algorithms and complexity and their applications to optimization\, databases\, control\, AI\, robotics\, economics and game theory\, the Internet\, evolution\, and the brain. He holds a PhD from Princeton (1976) and nine honorary doctorates\, including from ETH Zurich\, the University of Athens\, EPFL\, and Université Paris Dauphine. He is a member of the US National Academy of Sciences\, the American Academy of Arts and Sciences\, and the National Academy of Engineering\, and he has received the Knuth Prize\, the Gödel Prize\, the Babbage Award\, the von Neumann Medal\, and the IEEE Computer Pioneer Award\, as well as the 2018 Harvey Prize from the Technion. In 2015 the President of the Hellenic Republic named him Commander of the Order of the Phoenix. He has also written three novels: “Turing\,” “Logicomix\,” and his latest\, “Independence.”
URL:https://zurich-nlp.ch/event/how-does-the-brain-create-language/
LOCATION:ETH Zurich CAB\, Universitätstrasse 6 (CAB G.61)\, Zürich\, 8006\, Switzerland
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20230926T180000
DTEND;TZID=Europe/Paris:20230926T200000
DTSTAMP:20260404T114848Z
CREATED:20230829T075122Z
LAST-MODIFIED:20240115T132715Z
UID:980-1695751200-1695758400@zurich-nlp.ch
SUMMARY:ZurichNLP Meetup #4
DESCRIPTION:We’re back with an exciting fourth meetup! We are thrilled to be hosting two talks: \n\n“Reinforced Active Learning for Text Classification” by Katya Mirylenka from IBM Research\n“AI Summarizers for Lawyers and the People” by Prof. Dr. Elliott Ash from ETH Zurich\n\nRSVP soon\, as spots are limited; we hope to see you there!
URL:https://zurich-nlp.ch/event/zurich-nlp-meetup-4/
LOCATION:OAT ETH Zürich\, Andreasstrasse 5 (19th floor)\, Zurich\, 8050
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20230627T180000
DTEND;TZID=Europe/Paris:20230627T200000
DTSTAMP:20260404T114848Z
CREATED:20230526T123814Z
LAST-MODIFIED:20240115T132720Z
UID:873-1687888800-1687896000@zurich-nlp.ch
SUMMARY:ZurichNLP Meetup #3
DESCRIPTION:Join us for our third meetup\, this time with an exciting fireside chat between Tim Scarfe and Zeerak Talat on the democratization of AI\, policy\, and ethics. The panel is being rescheduled to sometime this fall.\nPlease RSVP\, spots are limited!
URL:https://zurich-nlp.ch/event/zurich-nlp-meetup-3/
LOCATION:OAT ETH Zürich\, Andreasstrasse 5 (19th floor)\, Zurich\, 8050
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20230605T180000
DTEND;TZID=Europe/Paris:20230605T190000
DTSTAMP:20260404T114848Z
CREATED:20230517T201954Z
LAST-MODIFIED:20240115T132754Z
UID:867-1685988000-1685991600@zurich-nlp.ch
SUMMARY:ZurichNLP Visitors: Niket Tandon
DESCRIPTION:Niket Tandon from AI2 will be visiting Zurich in June and will give a talk titled “Commonsense injection in large language models”. Please RSVP below!\nAbstract: Large LMs\, while powerful\, are not immune to mistakes that are obvious to humans\, and retraining them can be prohibitively costly. Our goal is to effectively correct language model mistakes by injecting knowledge via user interactions with the system\, but without retraining. Our approach is a memory-augmented architecture in which user feedback is used to make the model generate a better answer or to correct errors post hoc so that the model does not repeat similar mistakes. We will discuss efficient solutions for designing this memory of knowledge and for leveraging it in the model. This is a step in the direction of never-ending learning\, and we will present a roadmap of the open research problems that need to be addressed to get to never-ending learning language models.\nBio: Niket Tandon is a Senior Research Scientist at the Allen Institute for AI in Seattle. His research interests are in commonsense reasoning and natural language guided reasoning. He works on the Aristo team\, responsible for creating an AI that aced science exams. He obtained his Ph.D. from the Max Planck Institute for Informatics in Germany in 2016\, where he was supervised by Professor Gerhard Weikum; his thesis produced the largest automatically extracted commonsense knowledge base at the time\, called WebChild. He is also the founder of PQRS research\, an organization providing research opportunities to undergraduate students from underrepresented institutions.\nHomepage: https://niket.tandon.info/
URL:https://zurich-nlp.ch/event/zurich-nlp-visitors-niket-tandon/
LOCATION:OAT ETH Zürich\, Andreasstrasse 5 (19th floor)\, Zurich\, 8050
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20230510T180000
DTEND;TZID=Europe/Paris:20230510T200000
DTSTAMP:20260404T114848Z
CREATED:20230415T193324Z
LAST-MODIFIED:20240115T132800Z
UID:548-1683741600-1683748800@zurich-nlp.ch
SUMMARY:ZurichNLP Meetup #2
DESCRIPTION:Coming off the success of our first meetup\, we welcome all NLP researchers\, engineers and enthusiasts to our second event. We have two really exciting guests presenting: Julian Eisenschlos from Google and Alex Warstadt from ETH Zurich. \nPlease RSVP\, spots are limited! \nFollow us on Twitter for updates.
URL:https://zurich-nlp.ch/event/zurich-nlp-meetup-2/
LOCATION:OAT ETH Zürich\, Andreasstrasse 5 (19th floor)\, Zurich\, 8050
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20230426T164500
DTEND;TZID=Europe/Paris:20230426T164500
DTSTAMP:20260404T114848Z
CREATED:20230424T115953Z
LAST-MODIFIED:20230424T120432Z
UID:717-1682527500-1682527500@zurich-nlp.ch
SUMMARY:Modular and Parameter-Efficient Fine-Tuning for NLP Models
DESCRIPTION:Summary: State-of-the-art language models in NLP perform best when fine-tuned\, even on small datasets\, but due to their increasing size\, fine-tuning and downstream usage have become extremely compute-intensive. Being able to efficiently and effectively fine-tune the largest pre-trained models is thus key to reaping the benefits of the latest advances in NLP. In this tutorial\, we provide a comprehensive overview of parameter-efficient fine-tuning methods. We highlight their similarities and differences by presenting them in a unified view. We explore the benefits and usage scenarios of a neglected property of such parameter-efficient models\, modularity\, such as the composition of modules to deal with previously unseen data conditions. We finally highlight how both properties\, parameter efficiency and modularity\, can be useful in the real-world setting of adapting pre-trained models to under-represented languages and domains with scarce annotated data for several downstream applications.\nSpeaker: Jonas Pfeiffer is a Research Scientist at Google Research. He is interested in modular representation learning in multi-task\, multilingual\, and multi-modal contexts\, and in low-resource scenarios. He did his PhD at the Technical University of Darmstadt\, was a visiting researcher at New York University\, and was a Research Scientist Intern at Meta Research. Jonas received the IBM PhD Research Fellowship award for 2021/2022. He has given numerous invited talks in academia\, industry\, and ML summer schools\, and has co-organized multiple workshops on multilinguality and multimodality.\nAlso available online: https://uzh.zoom.us/j/62400025916?pwd=SElUS2QzOWVuRi9KdVREK2xIQUk3dz09\nAdditional Information: Guest lecture for the seminar course “Multimodal Multilingual Natural Language Processing”. We will have a small Apero after the talk in the CL coffee lounge. There will be drinks and some snacks.
URL:https://zurich-nlp.ch/event/modular-and-parameter-efficient-fine-tuning-for-nlp-models/
LOCATION:University of Zurich AND (Room 2-02)\, Andreasstrasse 15\, Zurich\, 8050\, Switzerland
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20230321T180000
DTEND;TZID=Europe/Paris:20230321T200000
DTSTAMP:20260404T114848Z
CREATED:20230416T155124Z
LAST-MODIFIED:20240115T132806Z
UID:690-1679421600-1679428800@zurich-nlp.ch
SUMMARY:ZurichNLP Meetup #1
DESCRIPTION:We are holding our first kick-off event on March 21st from 6:00 – 8:00 PM at Binzmühlestrasse 13\, 4th Floor\, 8050 Zürich.\nWe are very excited to be hosting two speakers: Luca Beurer-Kellner from the SRI Lab at ETH Zurich and Basil Mustafa from Google Brain Zurich. Looking forward to seeing you there!
URL:https://zurich-nlp.ch/event/zurich-nlp-meetup-1/
LOCATION:ETH Zurich\, Binzmühlestrasse 13\, Zurich\, 8050\, Switzerland
END:VEVENT
END:VCALENDAR