- Understand the basics of MXNet
- Understand the basics of the NDArray module
- Understand the basics of the Symbol module
- Play with the code
- Get familiar with GitHub by using GitHub Desktop
- Be able to create a GitHub account
- Be able to create a repo
- Be able to push/pull code with GitHub Desktop
- Create a GitHub account
- Create a repo; try to pull and push code
- Play with the code of mxnet under mxnet-week1
- Work with PyCharm for the mxnet project
- Be able to use GitHub in an Ubuntu/Linux environment
- Set up an AWS GPU/non-GPU instance
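To make the NDArray-vs-Symbol distinction above concrete, here is a minimal framework-agnostic sketch in plain NumPy: imperative code (the NDArray style) runs each operation immediately, while symbolic code (the Symbol style) first declares the computation and only later binds inputs and executes. The `make_graph` helper is purely illustrative, not an MXNet API.

```python
import numpy as np

# Imperative style (like mx.nd): operations execute immediately.
a = np.ones((2, 3))
b = a * 2 + 1          # result is available right away
print(b.sum())         # -> 18.0

# Symbolic style (like mx.sym): first declare the computation,
# then "bind" concrete inputs and run the whole graph at once.
def make_graph():
    # here the "graph" is just a function of its placeholder input
    def forward(x):
        return x * 2 + 1
    return forward

net = make_graph()          # declaration: nothing computed yet
out = net(np.ones((2, 3)))  # binding + execution
print(out.sum())            # -> 18.0
```

In MXNet the symbolic version would use placeholders such as `mx.sym.Variable` and an executor to bind data, but the declare-then-run workflow is the same.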
Design/train an MLP in mxnet
- Finish the "TODO"s in "mxnet-week2/mlp_sym.py" and "train_mlp.py". Useful symbols: "mx.sym.Convolution", "mx.sym.Activation", "mx.sym.Pooling", "mx.symbol.FullyConnected", "mx.sym.SoftmaxOutput"
- Upload the network architecture you design to your repository: the figure file (pdf, jpg, png, etc.) generated by mxnet. Hint: try "mx.viz.plot_network"
- Compare the accuracy and the training time between the MLP and your CNN. Fill in the comparison table:
| Comparison                | MLP | Your Network |
|---------------------------|-----|--------------|
| Acc (5 epochs)            |     |              |
| CPU epoch time            |     |              |
| GPU epoch time (optional) |     |              |
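For the epoch-time rows, the simplest approach is to wall-clock one full pass over the data. A minimal sketch (the `fake_epoch` workload is a stand-in for your real training epoch; MXNet's fit loop also reports per-epoch time in its logs):

```python
import time
import numpy as np

def fake_epoch(n_batches=50, batch=64, dim=784, hidden=128):
    # Stand-in for one training epoch: a few matmuls per batch.
    rng = np.random.default_rng(0)
    w = rng.normal(size=(dim, hidden))
    for _ in range(n_batches):
        x = rng.normal(size=(batch, dim))
        _ = np.maximum(x @ w, 0)

t0 = time.perf_counter()
fake_epoch()
epoch_time = time.perf_counter() - t0
print(f"epoch time: {epoch_time:.3f} s")  # the number to put in the table
```

Run the same measurement once with a CPU context and once with a GPU context to fill both rows.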
1. Understand mxnet data IO; get/convert the data
2. Build the network, either from scratch or from others' work
3. Sanity-check by predicting a random image before training
4. Overfit a very small dataset
5. Do full-scale training
6. Test the final model
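Steps 3 and 4 above deserve emphasis: before training, predictions on a random input should look uninformed, and a correct model should be able to drive the loss near zero on a tiny dataset. A minimal NumPy sketch of the overfitting check, using logistic regression on 8 easily separable points (the dataset and model are illustrative, not from the homework):

```python
import numpy as np

# Tiny 2-class dataset: 8 points with a trivially learnable label.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 2))
y = (X[:, 0] > 0).astype(float)

w, b, lr = np.zeros(2), 0.0, 0.5

def loss(w, b):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

initial = loss(w, b)  # step 3: untrained model, loss near ln(2) ~ 0.69
for _ in range(500):  # step 4: gradient descent until it overfits
    p = 1 / (1 + np.exp(-(X @ w + b)))
    g = p - y                       # gradient of the logistic loss
    w -= lr * (X.T @ g) / len(y)
    b -= lr * g.mean()
print(initial, loss(w, b))          # the final loss should drop sharply
```

If your real network cannot overfit a handful of images, fix that bug before launching full-scale training.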
Homework is "https://github.com/BitTiger-MP/DS502-AI-Engineer/blob/master/DS502-1702/MXNET_course/mxnet-week3/HW3/Homework3.ipynb"
run "sh download_data.sh" to download the cifar 10 rec file.
Finish all the "???" and TODO in the ipynb file. Run the jpynb file succefully and post the link to issue.
(Optional) Fully train CIFAR-10 until it converges (if you have a GPU). Try different networks (VGG, ResNet, ...) and other hyperparameters (random search).
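Random search over hyperparameters just means sampling each setting independently (learning rate typically log-uniform) and keeping the best-scoring trial. A sketch, where `fake_eval` is a stand-in for "train for a few epochs and return validation accuracy":

```python
import math
import random

random.seed(0)

def sample_config():
    # Log-uniform learning rate; discrete choices for the rest.
    return {
        "lr": 10 ** random.uniform(-4, -1),
        "batch_size": random.choice([32, 64, 128]),
        "net": random.choice(["vgg-like", "resnet-like"]),
    }

def fake_eval(cfg):
    # Stand-in for real training; pretends lr near 1e-2 scores best.
    return 1 / (1 + abs(math.log10(cfg["lr"]) + 2))

trials = [sample_config() for _ in range(10)]
best = max(trials, key=fake_eval)
print(best)
```

In practice you would replace `fake_eval` with a short training run and log every trial's config and score so the search is reproducible.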