Colloquium on Artificial Intelligence Research and Optimization
Second Order Optimizations for Scalable Training of DNNs
Andrew Lumsdaine, Chief Scientist, Northwest Institute for Advanced Computing
Virtual (Zoom) - REGISTRATION REQUIRED (SEE ABSTRACT)
February 17, 2021, 1:00 pm
Abstract:
With the increasing availability of scalable computing platforms (including accelerators), there is an opportunity to develop and apply more sophisticated approaches to deep network training. Higher-order optimization methods (e.g., Hessian-based) can provide improved convergence rates compared to first-order approaches such as stochastic gradient descent, at the cost of increased resource requirements.
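The trade-off the abstract describes can be sketched on a toy quadratic objective. The following is a minimal illustration of the general first-order vs. second-order contrast, not the method presented in the talk; the matrix, learning rate, and iteration count are arbitrary assumptions.

```python
import numpy as np

# Toy quadratic f(w) = 0.5 * w^T A w - b^T w, whose Hessian is A.
# Values below are arbitrary; A is symmetric positive definite.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
b = np.array([1.0, -2.0])
w_star = np.linalg.solve(A, b)      # exact minimizer, for reference

def grad(w):
    return A @ w - b                # gradient of f

# First-order (SGD-style) updates: cheap per step, but the convergence
# rate degrades with the conditioning of the Hessian.
w_first = np.zeros(2)
for _ in range(50):
    w_first -= 0.1 * grad(w_first)  # fixed learning rate (assumed)

# Second-order (Newton/Hessian-based) update: solves a linear system with
# the Hessian and reaches the minimizer of a quadratic in one step, at the
# cost of forming and solving an n-by-n system.
w_second = np.zeros(2)
w_second -= np.linalg.solve(A, grad(w_second))

print("first-order error: ", np.linalg.norm(w_first - w_star))
print("second-order error:", np.linalg.norm(w_second - w_star))
```

For deep networks the Hessian is far too large to form or invert explicitly, which is what motivates the scalable approximations suggested by the talk's title.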
REGISTRATION IS FREE AND REQUIRED TO RECEIVE ZOOM ID:
Speaker's Bio: Andrew Lumsdaine is an internationally recognized expert in the area of high-performance computing who has made important contributions in many of the constitutive areas of HPC, including systems, programming languages, software libraries, and performance modeling. His work in HPC has been motivated by data-driven problems (e.g., large-scale graph analytics), as well as more traditional computational science problems. He has been an active participant in multiple standardization efforts, including the MPI Forum, the BLAS Technical Forum, the ISO C++ standardization committee, the oneAPI Technical Advisory Board, and the SYCL Advisory Panel. Open source software projects resulting from his work include the Matrix Template Library, the Boost Graph Library, and Open MPI.