"Introduction to Computer Vision and Image Processing (Bilingual Edition)": Neural Network Activation Functions, ReLU vs Sigmoid
Article Outline
- Neural Network Module and Training Function
- Create the Dataset
- Define the Neural Network, Criterion Function, and Optimizer, and Train the Model
- Test Sigmoid and ReLU
- Analyze Results
- References and Learning Path
In this article, we test the Sigmoid and ReLU activation functions on the MNIST dataset using a neural network with two hidden layers.
Neural Network Rectified Linear Unit (ReLU) vs Sigmoid
The objectives are as follows:
1. Define several neural networks, a criterion function, and an optimizer.
2. Test Sigmoid and ReLU.
3. Analyze Results.
In this lab, you will test the Sigmoid and ReLU activation functions on the MNIST dataset using a network with two hidden layers.
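The two-hidden-layer network described above can be sketched as a single PyTorch module that takes the activation function as a parameter, so the same class serves both experiments. This is a minimal illustration, not the article's exact code; the layer sizes (28*28 inputs for flattened MNIST images, 50 hidden units, 10 output classes) are assumed values chosen for the example.

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    """Two-hidden-layer network with a pluggable activation function.

    Hidden size 50 is an illustrative assumption, not fixed by the article.
    """
    def __init__(self, in_dim=28 * 28, hidden=50, out_dim=10, activation=torch.relu):
        super().__init__()
        self.linear1 = nn.Linear(in_dim, hidden)
        self.linear2 = nn.Linear(hidden, hidden)
        self.linear3 = nn.Linear(hidden, out_dim)
        self.activation = activation  # torch.relu or torch.sigmoid

    def forward(self, x):
        x = self.activation(self.linear1(x))
        x = self.activation(self.linear2(x))
        # Return raw logits; nn.CrossEntropyLoss applies log-softmax internally.
        return self.linear3(x)

# Instantiate one model per activation to compare them under identical settings:
model_relu = Net(activation=torch.relu)
model_sigmoid = Net(activation=torch.sigmoid)
```

Training both models with the same criterion (e.g. `nn.CrossEntropyLoss`) and optimizer isolates the activation function as the only difference between the two runs.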
Estimated Time