Some TensorFlow Experiments on MNIST

As part of the online course Creative Applications of Deep Learning with TensorFlow, and to get started with TensorFlow, I implemented some experiments on MNIST. Specifically, I tested different architectures, activation functions, and initialization schemes. While these experiments are not systematic enough to draw reliable conclusions, they can serve as an introduction to TensorFlow. In this article, I want to share the code and the corresponding presentation.
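To give an impression of the kind of comparison involved, the sketch below trains a small fully-connected MNIST classifier while varying the activation function and the weight initialization scheme. This is not the original course code; the layer sizes, the particular activations and initializers, and the number of epochs are assumptions chosen purely for illustration, and the snippet uses the Keras API bundled with TensorFlow rather than the lower-level graph API used at the time of the course.

import tensorflow as tf

# Load MNIST and flatten the 28x28 images into 784-dimensional vectors in [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

def build_model(activation, initializer):
    """Two-layer fully-connected classifier; architecture chosen for illustration."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=(784,)),
        tf.keras.layers.Dense(128, activation=activation,
                              kernel_initializer=initializer),
        tf.keras.layers.Dense(10, activation="softmax",
                              kernel_initializer=initializer),
    ])

# Grid over a few activation functions and initialization schemes.
for activation in ["relu", "tanh", "sigmoid"]:
    for initializer in ["glorot_uniform", "he_normal", "random_normal"]:
        model = build_model(activation, initializer)
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        model.fit(x_train, y_train, epochs=5, batch_size=128, verbose=0)
        _, accuracy = model.evaluate(x_test, y_test, verbose=0)
        print(f"{activation:>8s} / {initializer:<15s}: test accuracy {accuracy:.4f}")

Such a grid makes it easy to see how strongly the choice of activation and initializer interact on even a simple dataset like MNIST, although, as noted above, a few runs per configuration are not enough for reliable conclusions.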

Both the code and the presentation of the project can be found on GitHub. Documentation is included in the form of Sphinx-style comments. The presentation can also be found below.

Code on GitHub
CADL-slides