BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Etis - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://www.etis-lab.fr
X-WR-CALDESC:Events for Etis
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Paris
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20240331T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20241027T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20250330T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20251026T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20260329T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20261025T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20250207T100000
DTEND;TZID=Europe/Paris:20250207T113000
DTSTAMP:20260422T024707Z
CREATED:20250124T072146Z
LAST-MODIFIED:20250124T072146Z
UID:8972-1738922400-1738927800@www.etis-lab.fr
SUMMARY:Training neural models using logic: results\, challenges\, and applications
DESCRIPTION:DATA&AI Team Seminar (Online) \nEfi Tsamoura\, Senior Researcher at Samsung AI \nTitle: Training neural models using logic: results\, challenges\, and applications\nAbstract: Neurosymbolic learning (NSL) vows to transform AI by combining the strong induction capabilities of neural models with the strong deduction capabilities of symbolic knowledge representation and reasoning techniques. This talk centers around an NSL problem that has received significant attention lately: training neural classifiers using supervision produced by logical theories. Empirical research has shown the advantages of this learning setting over end-to-end deep neural architectures in multiple aspects\, including accuracy and model complexity. Despite the extensive empirical research\, limited theoretical analysis has been dedicated to understanding if and under which conditions we can learn the underlying neural models. \nThis talk covers this gap by proposing necessary and sufficient conditions\, which ensure that we can learn the underlying models under rigorous guarantees. I will also discuss the relationship between this problem and other known problems in the machine learning literature. Furthermore\, I will present new challenges inherent to this NSL setting and propose solutions to overcome those challenges\, leading to models with substantially higher accuracy. I will conclude this talk with recent applied results and open challenges. \n  \nBio: Efi Tsamoura is a Senior Researcher at Samsung AI\, Cambridge\, UK. In 2016\, she was awarded a prestigious early career fellowship from the Alan Turing Institute\, UK\, for her work on logic and databases\, and before that\, she was a Postdoctoral Researcher in the Department of Computer Science of the University of Oxford. 
 Her main research interests lie in the areas of logic\, knowledge representation and reasoning\, and neurosymbolic learning\, while her recent outcomes involve scaling symbolic reasoning to billions of triples\, as well as addressing open problems in neuro-symbolic learning. Her research has been published in top-tier machine learning\, AI\, and database venues (NeurIPS\, ICML\, SIGMOD\, VLDB\, PODS\, AAAI\, IJCAI\, etc.).
URL:https://www.etis-lab.fr/event/training-neural-models-using-logic-results-challenges-and-applications/
LOCATION:Zoom
CATEGORIES:Seminar
ORGANIZER;CN="Vassilis Christophides":MAILTO:vassilis.christophides@ensea.fr
END:VEVENT
END:VCALENDAR