References & Links

This page collects the references cited in our project and the links to the project's material.

References:

J. Brownlee, “How to Visualize Filters and Feature Maps in Convolutional Neural Networks,” Machine Learning Mastery, Jul. 2019. [Online]. Available: https://machinelearningmastery.com/how-to-visualize-filters-and-feature-maps-in-convolutional-neural-networks/ [Accessed: Dec. 9, 2020].

C. H. Setjo, B. Achmad and Faridah, “Thermal image human detection using Haar-cascade classifier,” 2017 7th International Annual Engineering Seminar (InAES), Yogyakarta, 2017, pp. 1-6, doi: 10.1109/INAES.2017.8068554.

M. Hassan, “VGG16 – Convolutional Network for Classification and Detection,” Neurohive, Nov. 2018. [Online]. Available: https://neurohive.io/en/popular-networks/vgg16/ [Accessed: Dec. 9, 2020].

J. Redmon, S. Divvala, R. Girshick and A. Farhadi, “You Only Look Once: Unified, Real-Time Object Detection,” 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, 2016, pp. 779-788, doi: 10.1109/CVPR.2016.91.

S. Gupta, “Practical Use Cases of Facial Emotion Detection Using Artificial Intelligence,” DZone AI, 2018.

S. Li and W. Deng, “Deep Facial Expression Recognition: A Survey,” in IEEE Transactions on Affective Computing, doi: 10.1109/TAFFC.2020.2981446.

X. Zhang, J. Zou, K. He and J. Sun, “Accelerating Very Deep Convolutional Networks for Classification and Detection,” in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 38, no. 10, pp. 1943-1955, 1 Oct. 2016, doi: 10.1109/TPAMI.2015.2502579.

Links:

  1. Presentation video

  2. Slides

  3. Project Proposal

  4. Code