Here is a collection of online resources, especially questions from Stack Exchange, that I find interesting. The list gets updated over time.
General Machine Learning
- When should linear regression be called “machine learning”?
- What is the difference between machine learning and statistics?
General Neural Networks
- Do scientists know what is happening inside artificial neural networks?
- What is the difference between an Embedding Layer and a Dense Layer?
- Tradeoff batch size vs. number of iterations to train a neural network
- What does 1x1 convolution mean in a neural network? (a short sketch follows this list)
- Where should I place dropout layers in a neural network?
- Why is softmax output not a good uncertainty measure for Deep Learning models?
- When to use GRU over LSTM?
- Universal approximation theorem
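Since the 1x1 convolution question trips up a lot of people, here is a minimal PyTorch sketch (my own illustration, not taken from the linked answer; the shapes and layer sizes are arbitrary) showing that a 1x1 convolution just mixes channels at each spatial position, i.e. it acts like a dense layer applied independently to every pixel.

```python
import torch
import torch.nn as nn

x = torch.randn(1, 64, 32, 32)                    # 1 image, 64 channels, 32x32 feature map

conv1x1 = nn.Conv2d(in_channels=64, out_channels=16, kernel_size=1)
y = conv1x1(x)
print(y.shape)                                    # torch.Size([1, 16, 32, 32]): spatial size unchanged

# The same computation as a Linear layer applied to each pixel's 64-dim channel vector.
linear = nn.Linear(64, 16)
with torch.no_grad():
    linear.weight.copy_(conv1x1.weight.view(16, 64))
    linear.bias.copy_(conv1x1.bias)
y_linear = linear(x.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)
print(torch.allclose(y, y_linear, atol=1e-5))     # True: 1x1 conv == per-pixel dense layer
```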
Training a NN
- Practical hyperparameter optimization: Random vs. grid search (a toy comparison follows this list)
- Does batch_size in Keras have any effects in results’ quality?
- Training Neural Nets on Larger Batches: Practical Tips for 1-GPU, Multi-GPU & Distributed setups
- An overview of gradient descent optimization algorithms (a sketch of a few update rules follows below)
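For the random-vs-grid-search question, a toy scikit-learn comparison (my own setup; the SVC, the parameter ranges, and the dataset are arbitrary choices, not from the linked post) that gives both methods the same budget of nine configurations:

```python
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Grid search: every combination on a fixed grid (3 x 3 = 9 candidates).
grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}, cv=3)
grid.fit(X, y)

# Random search: 9 samples from continuous distributions, which probes each
# individual hyperparameter at 9 distinct values instead of 3.
rand = RandomizedSearchCV(
    SVC(),
    {"C": loguniform(1e-2, 1e2), "gamma": loguniform(1e-3, 1e1)},
    n_iter=9, cv=3, random_state=0,
)
rand.fit(X, y)

print("grid   :", grid.best_params_, round(grid.best_score_, 3))
print("random :", rand.best_params_, round(rand.best_score_, 3))
```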
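And for the optimizer overview, a toy NumPy sketch (my own, using a one-dimensional quadratic loss purely for illustration) of the plain SGD, momentum, and Adam update rules:

```python
import numpy as np

grad = lambda w: 2.0 * (w - 3.0)          # gradient of L(w) = (w - 3)^2, minimum at w = 3

def sgd(w, lr=0.1, steps=100):
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def momentum(w, lr=0.1, beta=0.9, steps=100):
    v = 0.0
    for _ in range(steps):
        v = beta * v + grad(w)            # accumulate an exponentially decaying velocity
        w -= lr * v
    return w

def adam(w, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=100):
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g         # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * g * g     # second-moment (uncentered variance) estimate
        m_hat = m / (1 - b1 ** t)         # bias correction for the zero initialization
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

for name, opt in [("sgd", sgd), ("momentum", momentum), ("adam", adam)]:
    print(f"{name:9s} -> {opt(0.0):.4f}")  # each should end up near the minimum at 3.0
```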
Distances
Metrics
- Cohen’s kappa in plain English
- Micro Average vs Macro average Performance in a Multiclass classification setting (see the sketch after this list)
- Why F1 is not a good Metric
- How to interpret F-measure values?
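A tiny scikit-learn sketch for the metric questions (the labels are made up by me to be imbalanced; they are not from the linked answers), showing why micro and macro averaging disagree and how Cohen's kappa corrects raw agreement for chance:

```python
from sklearn.metrics import cohen_kappa_score, f1_score

# Class 0 dominates; this toy "model" does well on class 0 and poorly on classes 1 and 2.
y_true = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 2, 2]
y_pred = [0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 2]

print("micro F1:", f1_score(y_true, y_pred, average="micro"))   # ~0.83, dominated by the majority class
print("macro F1:", f1_score(y_true, y_pred, average="macro"))   # ~0.74, every class weighted equally
print("Cohen's kappa:", cohen_kappa_score(y_true, y_pred))      # ~0.60, agreement corrected for chance
```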
Loss
Preprocessing Images
- Why normalize images by subtracting dataset’s image mean, instead of the current image mean in deep learning? (a small NumPy sketch follows this list)
- Grayscale Image and keep 3 Dimensions
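For the two preprocessing questions, a small NumPy sketch (the shapes and the toy dataset are my own assumptions): the mean is computed once over the whole training set rather than per image, and a grayscale image is repeated along a new channel axis so it matches the 3-channel input a network expects:

```python
import numpy as np

images = np.random.randint(0, 256, size=(100, 32, 32, 3)).astype(np.float32)  # toy dataset

# Mean is computed per channel over the whole training set, not per image,
# so every image is shifted by the same constant.
dataset_mean = images.mean(axis=(0, 1, 2))               # shape (3,)
normalized = images - dataset_mean

gray = np.random.rand(32, 32).astype(np.float32)          # single-channel image, shape (32, 32)
rgb_like = np.repeat(gray[..., np.newaxis], 3, axis=-1)   # shape (32, 32, 3)
print(normalized.shape, rgb_like.shape)
```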