SPLASH 2017
Sun 22 - Fri 27 October 2017 Vancouver, Canada
Sun 22 Oct 2017 11:37 - 12:00 at Regency A - Session 2 Chair(s): Nada Amin

Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a Python package that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and transforms them into new Python functions that calculate a derivative. This approach to automatic differentiation differs from that of existing packages popular in machine learning, such as TensorFlow and Autograd: Tangent generates gradient code in plain Python that is readable by the user and easy to understand and debug. Tangent also introduces a new syntax for injecting code into the generated gradient code, further improving usability.
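
The sketch below illustrates the workflow the abstract describes: a small NumPy function is transformed into a new Python function that computes its gradient. It is a minimal sketch, assuming the tangent package is installed and exposes tangent.grad as its SCT entry point (with the gradient taken with respect to the first argument by default); the function name loss and the example inputs are illustrative, not taken from this page.

    # Minimal sketch of SCT-based AD with Tangent (assumed API: tangent.grad).
    import numpy as np
    import tangent

    def loss(w, x):
        # A numeric function written in the supported subset of Python + NumPy.
        y = np.dot(x, w)
        return np.sum(y * y)

    # Tangent rewrites the source of `loss` into a new Python function that
    # returns d(loss)/dw. The generated code is ordinary Python, so it can be
    # read, stepped through, and debugged like hand-written code.
    dloss_dw = tangent.grad(loss)

    w = np.array([1.0, 2.0, 3.0])
    x = np.eye(3)
    print(dloss_dw(w, x))  # gradient with respect to the first argument, w

Because the derivative is produced ahead of time as source code rather than traced at runtime, the generated function can also be inspected directly, which is the readability advantage the abstract emphasizes.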

Sun 22 Oct

DSLDI 2017
10:30 - 12:00: DSLDI 2017 - Session 2 at Regency A
Chair(s): Nada Amin (University of Cambridge)

10:30 - 10:52 | Talk
Wode Ni (Columbia University), Katherine Ye, Joshua Sunshine (Carnegie Mellon University), Jonathan Aldrich (Carnegie Mellon University), Keenan Crane (Carnegie Mellon University)
File attached

10:52 - 11:15 | Talk
Xiangqi Li (University of Utah), Matthew Flatt (University of Utah)
File attached

11:15 - 11:37 | Talk
Christopher Simpkins (Georgia Institute of Technology), Spencer Rugaber (Georgia Institute of Technology), Charles Isbell, Jr. (Georgia Institute of Technology)
File attached

11:37 - 12:00 | Talk
Bart van Merriënboer (University of Montreal), Alexander B. Wiltschko (Google Brain)
File attached