Interactions between the feature and attention masks of the Residual Attention Network [image referenced from the paper]
We used the PyTorch Lightning framework to implement a Residual Attention Network for image classification. As described in the abstract of the paper, a Residual Attention Network is a convolutional neural network that uses an attention mechanism and can incorporate state-of-the-art feed-forward network architectures.
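The core idea of the paper is attention residual learning: each attention module has a trunk branch that computes features T(x) and a soft-mask branch that outputs a mask M(x) in (0, 1), combined as H(x) = (1 + M(x)) * T(x), so the mask can emphasise features without erasing the trunk signal. Below is a minimal PyTorch sketch of this combination step; the module and parameter names are illustrative and not taken from our code.

```python
import torch
import torch.nn as nn

class AttentionResidual(nn.Module):
    """Sketch of attention residual learning: H(x) = (1 + M(x)) * T(x).

    `trunk` and `mask` stand in for the trunk branch and the soft-mask
    branch of an attention module; their internals follow the paper and
    are left abstract here.
    """

    def __init__(self, trunk: nn.Module, mask: nn.Module):
        super().__init__()
        self.trunk = trunk
        self.mask = mask  # expected to end in a sigmoid, so outputs lie in (0, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t = self.trunk(x)   # trunk features T(x)
        m = self.mask(x)    # soft attention mask M(x)
        return (1 + m) * t  # the identity term preserves good trunk features
```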
You can run the notebook to test our code.
To train a model, run train_pl.py.
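train_pl.py drives a standard PyTorch Lightning training loop. The sketch below shows the general pattern under our assumptions; `LitClassifier`, the stand-in backbone, and the hyperparameter values are illustrative placeholders, not the actual contents of train_pl.py.

```python
import pytorch_lightning as pl
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

class LitClassifier(pl.LightningModule):
    """Hypothetical wrapper: train_pl.py wraps the network in a
    LightningModule along these lines."""

    def __init__(self, backbone: nn.Module, lr: float = 0.1):
        super().__init__()
        self.backbone = backbone
        self.lr = lr

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self.backbone(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        # SGD, matching the optimiser used for the results table below
        return torch.optim.SGD(self.parameters(), lr=self.lr, momentum=0.9)

transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465),
                         (0.2470, 0.2435, 0.2616)),  # commonly used CIFAR-10 stats
])
train_set = datasets.CIFAR10("data", train=True, download=True, transform=transform)
train_loader = DataLoader(train_set, batch_size=128, shuffle=True, num_workers=4)

# Stand-in backbone so the sketch runs end to end; the real script
# would construct a Residual Attention Network here instead.
model = LitClassifier(backbone=nn.Sequential(nn.Flatten(),
                                             nn.Linear(3 * 32 * 32, 10)))
trainer = pl.Trainer(max_epochs=1)
trainer.fit(model, train_loader)
```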
Dataset | Architecture (Attention Type) | Optimiser | Image Size (px) | Training Loss | Test Loss |
---|---|---|---|---|---|
CIFAR-100 | Attention-92 | SGD | 32 | 1.26 | 1.58 |
CIFAR-10 | Attention-92 | SGD | 32 | 0.51 | 0.53 |
CIFAR-100 | Attention-56 | SGD | 224 | 1.42 | 1.80 |
CIFAR-10 | Attention-56 | SGD | 224 | 0.61 | 0.65 |
CIFAR-100 | Attention-92 | SGD | 224 | 2.95 | 2.90 |
CIFAR-10 | Attention-92 | SGD | 224 | 1.12 | 1.01 |
Saved checkpoints for some of the models can be downloaded from the drive.
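Assuming the checkpoints were saved by PyTorch Lightning, they can be restored with `load_from_checkpoint`. The class below is the hypothetical `LitClassifier` from the training sketch above, and the checkpoint filename is a placeholder.

```python
import torch
import torch.nn as nn

# Placeholder filename; the real checkpoint comes from the drive.
ckpt_path = "attention92_cifar10.ckpt"

# Constructor arguments that were not stored in the checkpoint
# (here, the backbone) must be supplied again when loading.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
model = LitClassifier.load_from_checkpoint(ckpt_path, backbone=backbone)
model.eval()

with torch.no_grad():
    logits = model.backbone(torch.randn(1, 3, 32, 32))  # dummy CIFAR-sized input
    prediction = logits.argmax(dim=1)
```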
We were only able to implement a few of the Residual Attention architectures, and only on two datasets, because of the computational power and time required to train the models on our machines. Areas we are looking to improve on and work on in the future:
- Implementing Attention-56 Architecture
- Implementing Attention-92 Architecture
- Implementing the Attention-128 and Attention-156 Architectures
- Implementing the paper using other Deep Learning frameworks like TensorFlow
ResidualAttentionNetwork-pytorch (GitHub)
Residual Attention Network (GitHub)
@inproceedings{wang2017residual,
title={Residual attention network for image classification},
author={Wang, Fei and Jiang, Mengqing and Qian, Chen and Yang, Shuo and Li, Cheng and Zhang, Honggang and Wang, Xiaogang and Tang, Xiaoou},
booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition},
pages={3156--3164},
year={2017}
}
Harshit Aggarwal | Kunal Mundada | Pranav B Kashyap