Deep Learning #3: More on CNNs & Handling Overfitting
Setting the decay value adjusts the learning rate so that it gradually decreases over the course of training (see the sketch after the optimizer line below).
- keras.optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)
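A minimal sketch of how the decay argument is typically used; the 784-input / 10-class model here is only an assumption for illustration, not from the original notes:

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam

# Hypothetical small dense network (e.g. flattened MNIST images, 10 classes).
model = Sequential([
    Dense(200, input_dim=784, activation='tanh'),
    Dense(10, activation='softmax'),
])

# With a non-zero decay, Keras rescales the learning rate every batch:
#   lr_t = lr * 1 / (1 + decay * iterations)
adam = Adam(lr=0.001, beta_1=0.9, beta_2=0.999, decay=1e-4)

model.compile(optimizer=adam,
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```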
Overfitting can also be reduced by adding regularization to a layer's weights, for example an L2 penalty on a Dense layer's kernel:
- model.add(Dense(units=200, input_dim=784, activation='tanh', kernel_regularizer=l2(0.01)))
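A minimal sketch of a full model built around that line; the second hidden layer and the layer sizes beyond the first are assumptions added for illustration:

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.regularizers import l2
from keras.optimizers import Adam

model = Sequential()
# kernel_regularizer=l2(0.01) adds 0.01 * sum(W**2) for this layer's weight
# matrix to the loss, penalising large weights and so reducing overfitting.
model.add(Dense(units=200, input_dim=784, activation='tanh', kernel_regularizer=l2(0.01)))
model.add(Dense(units=100, activation='tanh', kernel_regularizer=l2(0.01)))
model.add(Dense(units=10, activation='softmax'))

model.compile(optimizer=Adam(lr=0.001),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```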
Steps for reducing overfitting:
- Add more data
- Use data augmentation
- Use architectures that generalize well
- Add regularization, mostly dropout (L1/L2 regularization is also possible); see the sketch after this list
- Reduce architecture complexity
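To make two of these steps concrete, here is a hedged sketch combining dropout and data augmentation on a small CNN; the architecture, the MNIST-shaped input, and the augmentation ranges are all assumptions for illustration:

```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout
from keras.preprocessing.image import ImageDataGenerator

# Small CNN; Dropout randomly zeroes 50% of the dense activations during
# training, which acts as regularization.
model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    MaxPooling2D(pool_size=(2, 2)),
    Flatten(),
    Dense(128, activation='relu'),
    Dropout(0.5),
    Dense(10, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Data augmentation: generate shifted/rotated/zoomed variants of the training
# images on the fly instead of collecting more real data.
datagen = ImageDataGenerator(rotation_range=10,
                             width_shift_range=0.1,
                             height_shift_range=0.1,
                             zoom_range=0.1)

# Hypothetical training call; assumes x_train / y_train arrays exist:
# model.fit_generator(datagen.flow(x_train, y_train, batch_size=32), epochs=10)
```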