DAVID STUTZ

TALKS

2022
Learning Optimal Conformal Classifiers, DELTA Lab, UCL (Invited Talk).

Learning Optimal Conformal Classifiers, Dataiku (Invited Talk).

2021
Relating Adversarially Robust Generalization to Flat Minima, MLSec – PraLab, University of Cagliari (Invited Talk). [Recording]

Conformal Training: Learning Optimal Conformal Classifiers, International Seminar on Distribution-Free Statistics (Invited Talk). [Recording]

Adversarial Robustness, Weight Robustness and Flatness, Math Machine Learning seminar MPI MiS + UCLA (Invited Talk). [Recording]

Relating Adversarial Robustness and Flat Minima, ICCV. [Recording]

Random Bit Errors for Energy-Efficient DNN Accelerators, CVPR CV-AML Workshop (Outstanding Paper Talk). [Recording]

Random Bit Errors for Energy-Efficient DNN Accelerators, MLSys. [Recording]

Random and Adversarial Bit Error Robustness of DNNs, TU Dortmund (Invited Talk). [Slides]

Confidence-Calibrated Adversarial Training and Bit Error Robustness for Energy-Efficient DNNs, Lorentz Center Workshop on Robust Artificial Intelligence (Invited Talk). [Recording]

2020
Bit Error Robustness for Energy-Efficient DNN Accelerators, IBM Research Workshop on the Future of Computing Architectures (Invited Talk). [Recording]

Confidence-Calibrated Adversarial Training / Mitigating Random Bit Errors in Quantized Weights, Qian Xuesen Laboratory (China Academy of Space Technology, Invited Talk).

Confidence-Calibrated Adversarial Training / Mitigating Random Bit Errors in Quantized Weights, Qualcomm (Invited Talk, Part of Qualcomm Innovation Fellowship). [Slides]

Confidence-Calibrated Adversarial Training, ICML Workshop on Uncertainty and Robustness in Deep Learning (Contributed Talk).

Confidence-Calibrated Adversarial Training, ICML. [Recording]

Confidence-Calibrated Adversarial Training, University of Tübingen (Invited Talk). [Slides]

Confidence-Calibrated Adversarial Training, Bosch Center for AI (Invited Talk). [Slides]

2019
Disentangling Adversarial Robustness and Generalization, ICML Workshop on Uncertainty and Robustness in Deep Learning (Spotlight).

2018
Weakly-Supervised Shape Completion, International Max Planck Research School for Computer Science.

Weakly-Supervised Shape Completion, ZF Friedrichshafen (Invited Talk, Part of MINT Award IT 2018, German).

2017
Benchmarking Superpixel Algorithms / Weakly-Supervised Shape Completion, Max Planck Institute for Informatics. [Slides]

Weakly-Supervised Shape Completion, Max Planck Institute for Intelligent Systems (Master Thesis Talk). [Slides]

Weakly-Supervised Shape Completion, RWTH Aachen University (Master Thesis Talk). [Slides]

ARTICLE

Final Talk “Superpixel Segmentation using Depth Information”

At RWTH Aachen University, every bachelor thesis concludes with a final talk on its topic. My bachelor thesis, “Superpixel Segmentation using Depth Information”, examines how depth information can enhance superpixel segmentation by extending the SEEDS superpixel algorithm [1]. This article presents the slides of my final talk.

ARTICLE

Introductory Talk “Superpixel Segmentation using Depth Information”

As part of my bachelor thesis at RWTH Aachen University, entitled “Superpixel Segmentation using Depth Information”, I prepared an introductory talk to present my work to the whole Computer Vision Group. This article provides the corresponding slides.

ARTICLE

Seminar Paper “Understanding Convolutional Neural Networks”

In my sixth semester at RWTH Aachen University, I am attending a seminar on “Current Topics in Computer Vision and Machine Learning” offered by the Computer Vision Group headed by Prof. Leibe. In the course of this seminar, I wrote a seminar paper entitled “Understanding Convolutional Neural Networks”. Both the seminar paper and the slides of the corresponding talk can be found here.

ARTICLE

Seminar Paper “Introduction to Neural Networks”

During my fifth semester at RWTH Aachen University, I participated in a seminar on “Selected Topics in Human Language Technology and Pattern Recognition” offered by the Chair of Computer Science 6. As a result, I wrote a seminar paper on neural networks for pattern recognition entitled “Introduction to Neural Networks”. At the end of the seminar, all participants gave a short talk on their topic. Both my seminar paper and the slides of my talk can be found here.
