What are the top 10 AI research papers published this year?

As an AI language model, I don't have real-time access to papers published this year. However, I can list some AI research papers from recent years that have had a significant impact on the field. Here are ten notable papers:

1. “Language Models are Few-Shot Learners” (the GPT-3 paper) by Brown, T. B. et al. (2020)
2. “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding” by Devlin, J. et al. (2018)
3. “ImageNet Classification with Deep Convolutional Neural Networks” by Krizhevsky, A. et al. (2012)
4. “Deep Residual Learning for Image Recognition” by He, K. et al. (2016)
5. “Spatial Transformer Networks” by Jaderberg, M. et al. (2015)
6. “Generative Adversarial Networks” by Goodfellow, I. J. et al. (2014)
7. “DeepFace: Closing the Gap to Human-Level Performance in Face Verification” by Taigman, Y. et al. (2014)
8. “Attention Is All You Need” by Vaswani, A. et al. (2017)
9. “Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks” (CycleGAN) by Zhu, J.-Y. et al. (2017)
10. “YOLOv3: An Incremental Improvement” by Redmon, J. et al. (2018)

Please note that this list is not exhaustive and only represents a fraction of influential AI research papers.
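Several of the papers above (GPT-3, BERT, “Attention Is All You Need”) are built on the Transformer's scaled dot-product attention. As a minimal illustration of that core operation, here is a NumPy sketch; the function name and shapes are illustrative and not taken from any of the papers' released code:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V.

    Q: queries, shape (n_q, d_k); K: keys, shape (n_k, d_k);
    V: values, shape (n_k, d_v). Returns shape (n_q, d_v).
    """
    d_k = Q.shape[-1]
    # Similarity of each query to each key, scaled to keep gradients stable
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the keys, shifted by the row max for numerical stability
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a convex combination of the value rows
    return weights @ V
```

If every key row is identical, the attention weights are uniform and the output reduces to the mean of the value rows, which is a quick sanity check for an implementation like this.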