
MATH-IMS Seminar - Shared screen with speaker view
Zehui ZHOU
01:20:42
Prof. Michael Hinterm├╝ller, thank you very much for your interesting sharing! Could you please explain a little bit more about how to choose the activation functions (smooth or nonsmooth) based on the prior information?
xihaohe
01:23:49
Prof. Hintermueller, this was a really nice and interesting talk. Thank you very much. I have some questions. Why is the machine learning technique only used to approximate the optimization constraint, rather than the solution of the whole inverse/optimization problem? For the neural network, how large is the training data, and how is the training data generated to obtain the learning-based model?
Fuqun HAN
01:25:36
Prof. Hinterm├╝ller, thanks for the very inspiring talk.For the optimization framework, can the current method get better reconstruction results with regularization terms like TV or rank(X) for certain scenarios?
Fuqun HAN
01:31:41
Thank you very much for your explanation, and thanks again for the very nice talk!
Hok Shing Wong
01:31:54
Prof. Hintermueller, thank you for the wonderful talk. For the optimization framework, you mentioned that the dictionary space is a high-dimensional manifold. I wonder if there are any studies on the geometric properties of this space? As the objective function is relatively simple compared to the dictionary space, it seems possible to try a manifold optimization approach if we have some geometric understanding of the space.
Zhipeng ZHU
01:33:17
Thank you for your great talk. May I ask: after you train the network to approximate the optimization constraint, how do you incorporate it into the optimization process? Do you use algorithms such as ADMM?
Administrator
01:49:59
Thank you for your great talk.