2014

  1. FitNets: Hints for thin deep nets
    Romero, Adriana, Ballas, Nicolas, Kahou, Samira Ebrahimi, Chassang, Antoine, Gatta, Carlo, and Bengio, Yoshua
    arXiv preprint arXiv:1412.6550 2014

2015

  1. Unsupervised domain adaptation by backpropagation
    Ganin, Yaroslav, and Lempitsky, Victor
    In ICML 2015
  2. Distilling the knowledge in a neural network
    Hinton, Geoffrey, Vinyals, Oriol, and Dean, Jeff
    arXiv preprint arXiv:1503.02531 2015
  3. Adversarial autoencoders
    Makhzani, Alireza, Shlens, Jonathon, Jaitly, Navdeep, Goodfellow, Ian, and Frey, Brendan
    arXiv preprint arXiv:1511.05644 2015
  4. SMPL: A skinned multi-person linear model
    Loper, Matthew, Mahmood, Naureen, Romero, Javier, Pons-Moll, Gerard, and Black, Michael J
    ACM Transactions on Graphics 2015

2016

  1. Paying more attention to attention: Improving the performance of convolutional neural networks via attention transfer
    Zagoruyko, Sergey, and Komodakis, Nikos
    arXiv preprint arXiv:1612.03928 2016
  2. Cross modal distillation for supervision transfer
    Gupta, Saurabh, Hoffman, Judy, and Malik, Jitendra
    In CVPR 2016
  3. Unsupervised domain adaptation with residual transfer networks
    Long, Mingsheng, Zhu, Han, Wang, Jianmin, and Jordan, Michael I
    In NeurIPS 2016

2017

  1. Cyberpsychology: An introduction to human-computer interaction
    Norman, Kent L
    Cambridge University Press 2017
  2. On human motion prediction using recurrent neural networks
    Martinez, Julieta, Black, Michael J, and Romero, Javier
    In CVPR 2017
  3. A gift from knowledge distillation: Fast optimization, network minimization and transfer learning
    Yim, Junho, Joo, Donggyu, Bae, Jihoon, and Kim, Junmo
    In CVPR 2017

2018

  1. Auto-conditioned recurrent networks for extended complex human motion synthesis
    Zhou, Yi, Li, Zimo, Xiao, Shuangjiu, He, Chong, Huang, Zeng, and Li, Hao
    In ICLR 2018
  2. Knowledge Transfer with Jacobian Matching
    Srinivas, Suraj, and Fleuret, François
    In ICML 2018
  3. Training deep networks with synthetic data: Bridging the reality gap by domain randomization
    Tremblay, Jonathan, Prakash, Aayush, Acuna, David, Brophy, Mark, Jampani, Varun, Anil, Cem, To, Thang, Cameracci, Eric, Boochoon, Shaad, and Birchfield, Stan
    In CVPR Workshops 2018
  4. Spatial temporal graph convolutional networks for skeleton-based action recognition
    Yan, Sijie, Xiong, Yuanjun, and Lin, Dahua
    In AAAI 2018
  5. Representation learning with contrastive predictive coding
    Oord, Aaron van den, Li, Yazhe, and Vinyals, Oriol
    arXiv preprint arXiv:1807.03748 2018
  6. Multi-modal cycle-consistent generalized zero-shot learning
    Felix, Rafael, Kumar, Vijay BG, Reid, Ian, and Carneiro, Gustavo
    In ECCV 2018
  7. Co-teaching: Robust training of deep neural networks with extremely noisy labels
    Han, Bo, Yao, Quanming, Yu, Xingrui, Niu, Gang, Xu, Miao, Hu, Weihua, Tsang, Ivor, and Sugiyama, Masashi
    In NeurIPS 2018
  8. Recovering accurate 3d human pose in the wild using imus and a moving camera
    von Marcard, Timo, Henschel, Roberto, Black, Michael J, Rosenhahn, Bodo, and Pons-Moll, Gerard
    In ECCV 2018

2019

  1. Correlation congruence for knowledge distillation
    Peng, Baoyun, Jin, Xiao, Liu, Jiaheng, Li, Dongsheng, Wu, Yichao, Liu, Yu, Zhou, Shunfeng, and Zhang, Zhaoning
    In ICCV 2019
  2. Similarity-preserving knowledge distillation
    Tung, Frederick, and Mori, Greg
    In ICCV 2019
  3. Knowledge distillation via instance relationship graph
    Liu, Yufan, Cao, Jiajiong, Li, Bing, Yuan, Chunfeng, Hu, Weiming, Li, Yangxi, and Duan, Yunqiang
    In CVPR 2019
  4. Relational knowledge distillation
    Park, Wonpyo, Kim, Dongju, Lu, Yan, and Cho, Minsu
    In CVPR 2019
  5. Real-time rendering
    Akenine-Möller, Tomas, Haines, Eric, and Hoffman, Naty
    A K Peters/CRC Press 2019
  6. A theoretical analysis of contrastive unsupervised representation learning
    Arora, Sanjeev, Khandeparkar, Hrishikesh, Khodak, Mikhail, Plevrakis, Orestis, and Saunshi, Nikunj
    arXiv preprint arXiv:1902.09229 2019
  7. Contrastive Representation Distillation
    Tian, Yonglong, Krishnan, Dilip, and Isola, Phillip
    arXiv preprint arXiv:1910.10699 2019
  8. Domain randomization and pyramid consistency: Simulation-to-real generalization without accessing target domain data
    Yue, Xiangyu, Zhang, Yang, Zhao, Sicheng, Sangiovanni-Vincentelli, Alberto, Keutzer, Kurt, and Gong, Boqing
    In ICCV 2019

2020

  1. Self-supervised learning of pretext-invariant representations
    Misra, Ishan, and Maaten, Laurens van der
    In CVPR 2020
  2. Knowledge Distillation Meets Self-Supervision
    Xu, Guodong, Liu, Ziwei, Li, Xiaoxiao, and Loy, Chen Change
    In ECCV 2020
  3. Distilling Cross-Task Knowledge via Relationship Matching
    Ye, Han-Jia, Lu, Su, and Zhan, De-Chuan
    In CVPR 2020

1998

  1. Retargetting motion to new characters
    Gleicher, Michael
    In SIGGRAPH 1998

1975

  1. Generalized procrustes analysis
    Gower, John C
    Psychometrika 1975