
Dr. Oliver Hinder: Gradient Descent for Solving Linear Programs
About this content
Oliver Hinder is an Assistant Professor in the Industrial Engineering Department at the University of Pittsburgh. Before that, he was a visiting postdoc in the Optimization and Algorithms group at Google in New York, and he received his PhD in Management Science and Engineering from Stanford in 2019, working with Professor Yinyu Ye. He studies local optimization and gradient descent methods for both convex and nonconvex problems.
We chat about Oliver's move from New Zealand to the U.S. to start his PhD at Stanford; we talk about some of his recent work on gradient descent methods for solving LPs to high accuracy and how restarts can benefit such algorithms. Finally, we touch on automated parameter tuning in machine learning, especially in deep learning, where it is widely used across many applications.
Check it out!
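To give a flavor of the restart idea mentioned above, here is a minimal Python sketch of a restarted primal-dual first-order method for a linear program in the standard form min c'x subject to Ax = b, x >= 0. This is an illustrative sketch of the general technique (periodically restarting the iteration from the average iterate of the last epoch), not necessarily the exact algorithm discussed in the episode; the function name, step-size choice, and epoch parameters are assumptions made for the example.

```python
import numpy as np

def restarted_pdhg_lp(c, A, b, step=None, epoch_len=200, n_epochs=50):
    """Illustrative restarted primal-dual hybrid gradient (PDHG) sketch for
    the LP  min c^T x  s.t.  A x = b,  x >= 0.
    After each epoch, the iteration restarts from the epoch's average
    iterate, which tends to speed up convergence on LPs."""
    m, n = A.shape
    if step is None:
        step = 0.9 / np.linalg.norm(A, 2)  # primal/dual step size, tau*sigma*||A||^2 < 1
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(n_epochs):
        x_sum, y_sum = np.zeros(n), np.zeros(m)
        for _ in range(epoch_len):
            # projected primal gradient step (projection onto x >= 0)
            x_new = np.maximum(x - step * (c - A.T @ y), 0.0)
            # dual ascent step at the extrapolated primal point
            y = y + step * (b - A @ (2 * x_new - x))
            x = x_new
            x_sum += x
            y_sum += y
        # restart from the average iterate of the finished epoch
        x, y = x_sum / epoch_len, y_sum / epoch_len
    return x, y

# Toy usage: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0  (optimum at x = [1, 0])
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x_opt, y_opt = restarted_pdhg_lp(c, A, b)
print(x_opt)  # approximately [1, 0]
```

The plain (non-restarted) version of such methods only reaches moderate accuracy in a reasonable number of iterations; the restart-to-average scheme is one way to recover faster, higher-accuracy convergence on LPs.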