SPLASH 2017
Sun 22 - Fri 27 October 2017 Vancouver, Canada
Sun 22 Oct 2017 11:37 - 12:00 at Regency A - Session 2 Chair(s): Nada Amin

Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a Python package that performs AD using source code transformation (SCT): it takes numeric functions written in a syntactic subset of Python and NumPy as input and transforms them into new Python functions that compute the derivative. This approach differs from that of existing packages popular in machine learning, such as TensorFlow and Autograd. Because Tangent emits gradient code as plain Python, the result is readable by the user and easy to understand and debug. Tangent also introduces a new syntax for injecting code into the generated gradient code, further improving usability.
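To make the SCT idea concrete, the sketch below shows, by hand, the kind of derivative function a source-code-transformation tool conceptually emits for a simple input function. This is an illustration of reverse-mode SCT in general, not Tangent's actual generated output.

```python
# Input function: f(x) = x^2 + 3x, written as straight-line Python.
def f(x):
    y = x * x
    z = y + 3.0 * x
    return z

# A hand-written stand-in for the generated derivative df/dx.
# Reverse-mode AD replays the forward computation, then walks the
# statements backwards, accumulating adjoints from the output seed.
def df(x):
    # Forward pass: recompute intermediates.
    y = x * x
    # Backward pass: seed the output gradient.
    dz = 1.0
    dy = dz                  # z = y + 3x  =>  dz/dy = 1
    dx = 3.0 * dz            # direct term: dz/dx = 3
    dx = dx + 2.0 * x * dy   # y = x*x     =>  dy/dx = 2x
    return dx
```

Because the result is an ordinary Python function, it can be read, stepped through in a debugger, or edited, which is the usability argument the abstract makes; analytically, f'(x) = 2x + 3.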

Sun 22 Oct
Times are displayed in time zone: (GMT-07:00) Tijuana, Baja California

10:30 - 12:00: Session 2 (DSLDI) at Regency A
Chair(s): Nada Amin (University of Cambridge)
10:30 - 10:52
Talk
DSLDI
Wode Ni (Columbia University), Katherine Ye, Joshua Sunshine (Carnegie Mellon University), Jonathan Aldrich (Carnegie Mellon University), Keenan Crane (Carnegie Mellon University)
10:52 - 11:15
Talk
DSLDI
Xiangqi Li (University of Utah), Matthew Flatt (University of Utah)
11:15 - 11:37
Talk
DSLDI
Christopher Simpkins (Georgia Institute of Technology), Spencer Rugaber (Georgia Institute of Technology), Charles Isbell, Jr. (Georgia Institute of Technology)
11:37 - 12:00
Talk
DSLDI
Bart van Merriënboer (University of Montreal), Alexander B. Wiltschko (Google Brain)