BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//pretalx//cfp.scipy.org//2023//talk//B9CHA7
BEGIN:VTIMEZONE
TZID:CST
BEGIN:STANDARD
DTSTART:20001029T020000
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=10;UNTIL=20061029T070000Z
TZNAME:CST
TZOFFSETFROM:-0500
TZOFFSETTO:-0600
END:STANDARD
BEGIN:STANDARD
DTSTART:20071104T020000
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=11
TZNAME:CST
TZOFFSETFROM:-0500
TZOFFSETTO:-0600
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20000402T020000
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=4;UNTIL=20060402T080000Z
TZNAME:CDT
TZOFFSETFROM:-0600
TZOFFSETTO:-0500
END:DAYLIGHT
BEGIN:DAYLIGHT
DTSTART:20070311T020000
RRULE:FREQ=YEARLY;BYDAY=2SU;BYMONTH=3
TZNAME:CDT
TZOFFSETFROM:-0600
TZOFFSETTO:-0500
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:pretalx-2023-B9CHA7@cfp.scipy.org
DTSTART;TZID=CST:20230710T080000
DTEND;TZID=CST:20230710T120000
DESCRIPTION:A privacy guarantee is **the** most crucial requirement when i
 t comes to analysing sensitive data. However\, data anonymisation techniq
 ues alone do not always provide complete privacy protection\; moreover\, 
 Machine Learning models can also be exploited to _leak_ sensitive data wh
 en _attacked_ and no counter-measure is applied. *Privacy-preserving mach
 ine learning* (PPML) methods promise to overcome these issues\, making i
 t possible to train machine learning models with full privacy guarantees
 . In this tutorial we will explore several methods for privacy-preservin
 g data analysis\, and how these techniques can be used to safely train M
 L models _without_ actually seeing the data.
DTSTAMP:20260417T063625Z
LOCATION:Classroom 203
SUMMARY:PPML: Machine Learning on data you cannot see - Valerio Maggio
URL:https://cfp.scipy.org/2023/talk/B9CHA7/
END:VEVENT
END:VCALENDAR
