Bio.

Ph.D. in Cognitive Science.

Applied Scientist at Amazon Web Services.

Research Interests

Representation Learning: I am interested in efficiently forming and analysing the representations learned by machine learning models. These representations include vectors, matrices, and graphs.
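For a concrete flavour of what "analysing representations" can mean, here is a minimal sketch of linear CKA, one standard similarity measure between two representation matrices. The function name and the synthetic example are illustrative choices of mine, not taken from any specific paper:

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA similarity between two representation matrices.

    X: (n_samples, d1), Y: (n_samples, d2) -- activations of two
    models on the same n inputs. Returns a value in [0, 1].
    """
    # Center features so the score is invariant to mean offsets.
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # ||Y^T X||_F^2 normalised by the self-similarity norms.
    cross = np.linalg.norm(Y.T @ X, "fro") ** 2
    norm_x = np.linalg.norm(X.T @ X, "fro")
    norm_y = np.linalg.norm(Y.T @ Y, "fro")
    return cross / (norm_x * norm_y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 32))
# A rotated copy of X: linear CKA is invariant to orthogonal maps,
# so the score should be 1.
Q, _ = np.linalg.qr(rng.normal(size=(32, 32)))
print(round(linear_cka(X, X @ Q), 4))  # -> 1.0
```

The orthogonal-rotation check is the point: a useful representation similarity should not care about the arbitrary basis a network happens to use.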

Linear Models: At the end of the day, we are all using linear regression. Besides, linear models are interpretable.
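To make the point concrete, here is a minimal closed-form ridge regression sketch (the synthetic data and the helper name are illustrative assumptions, not code from any of my papers):

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X^T X + lam * I)^{-1} X^T y."""
    d = X.shape[1]
    # Solve the regularised normal equations instead of inverting.
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + 0.01 * rng.normal(size=200)

w = ridge_fit(X, y, lam=1e-3)
print(np.allclose(w, true_w, atol=0.05))  # -> True
```

Each recovered coefficient is directly readable as a feature's contribution, which is the interpretability argument in one line.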

Current Status

Newbie in differential privacy.

News

01/2022-present, Applied Scientist on the AWS Long-term Science Team.

05/2021-01/2022, Applied Scientist at AWS AI Labs.

02/2021, Ph.D. in Cognitive Science (mostly machine learning work).

02/2021, one paper accepted at AISTATS2021, and two papers rejected.

04/2017-02/2021, working with Prof. Virginia R. de Sa.

06/2020-09/2020, Research internship at Amazon in Seattle, WA. (Virtual)

04/2020, speaking at UCSD AI Seminar about my research on comparing neural networks with both features and gradients.

04/2020, two papers rejected by the International Conference on Machine Learning (ICML2020).

03/2020, a talk @ the Vision research group led by Leonid Sigal at the University of British Columbia.

06/2019-12/2019, Research internship at Amazon in Cambridge, UK.

11/2019, a talk @ CLunch at the University of Pennsylvania.

11/2019, a talk @ the Computational Linguistics group led by Paul Smolensky at Johns Hopkins University.

10/2019, a talk @ the Language Technology Lab at the University of Cambridge. (link to the abstract)

10/2019, a short paper with Wesley, Pablo, Andrew and Andreas accepted to MetaLearn workshop at NeurIPS 2019.

09/2019, three papers rejected by the Conference on Neural Information Processing Systems (NeurIPS2019).

06/2019, attending Amazon re:MARS 2019.

05/2019, a long paper accepted to the Annual Meeting of the Association for Computational Linguistics (ACL2019). (scores: 2, 4.5, 4.5 out of 5)

04/2019, two papers rejected by the International Conference on Machine Learning (ICML2019).

03/2019, attending the Amazon Grad Research Symposium in Seattle, USA.

02/2019, speaking at UCSD AI Seminar about my research on learning distributed representations of sentences. (slides)

12/2018, a talk at IRASL workshop @ NeurIPS 2018, slides are here.

12/2018, a long paper rejected by the International Conference on Learning Representations (ICLR2019). (scores: 5, 6, 7 out of 10, openreview.)

11/2018, two long papers, one with Virginia (poster) and the other with Paul and Virginia (oral), accepted to the IRASL workshop at NeurIPS2018. (scores: 4, 4, 4 out of 5 for both papers)

10/2018, invited to be an inaugural member of ACL Special Interest Group on Representation Learning (SIGREP).

06/2018-09/2018, research internship at Microsoft Research, Redmond, working with Prof. Paul Smolensky.

09/2018, Andrej Zukov-Gregoric and I met at ACL2018, and we decided to put our notes together in an organised file. Here it is.

09/2018, a long paper rejected by the Conference on Neural Information Processing Systems (NeurIPS2018). (scores: 5, 6, 6 out of 10)

08/2018, a short paper rejected by the Conference on Empirical Methods in Natural Language Processing (EMNLP2018). (scores: 2, 3, 3 out of 5)

05/2018, advanced to PhD candidate, committee members: Virginia de Sa, Ben Bergen, Eran Mukamel, Lawrence Saul, and Ndapa Nakashole. (slides)

05/2018, a long paper with Hailin, Chen, Zhaowen and Virginia, accepted to the 3rd Workshop on Representation Learning for NLP. (scores: 3, 4 out of 5)

04/2018, two papers rejected by the Annual Meeting of the Association for Computational Linguistics (ACL2018). (scores: 4, 4, 4 out of 6, and 4, 4, 4 out of 6)

01/2018, one paper rejected by the International Conference on Learning Representations (ICLR2018). (scores: 3, 6, 7 out of 10)

11/2017, speaking at UCSD AI Seminar about my research on sentence representation learning. (slides)

09/2017, one paper rejected by the annual conference on Neural Information Processing Systems (NIPS2017). (scores: 4, 5, 6 out of 10)

06/2017-09/2017, research internship at Adobe Research, working on text-location-based image search.

05/2017, a long paper with Hailin, Chen, Zhaowen, and Virginia, accepted to the 2nd Workshop on Representation Learning for NLP. (scores: 3, 4 out of 5)

09/2015-12/2017, Machine Learning, Perception, and Cognition Lab, working with Prof. Zhuowen Tu.

06/2016-09/2016, research internship at Adobe Research, working on unsupervised sentence representation learning.

01/2016-04/2016, lab rotation in Prof. Garrison W. Cottrell's lab, working on the recurrent attention model for fine-grained classification tasks.

01/2015-04/2015, B.Sc. in Information Science @ Zhejiang University, working with Prof. Zhiyu Xiang.

Publications

[0] Mahta Mousavi, Eric Lybrand, Shuangquan Feng, Shuai Tang, Rayan Saab, Virginia R. de Sa, "Improving Robustness in Motor Imagery Brain-Computer Interfaces", (DistShift, NeurIPS2021).

[1] Wesley J. Maddox, Shuai Tang, Pablo G. Moreno, Andrew G. Wilson, Andreas Damianou, "Fast Adaptation with Linearized Neural Networks", (AISTATS2021).

[2] Shuai Tang, Virginia R. de Sa, "Deep Transfer Learning with Ridge Regression", (ArXiv, 2020).

[3] Shuai Tang, Wesley J. Maddox, Charlie Dickens, Tom Diethe, Andreas Damianou, "Similarity of Neural Networks with Gradients", (ArXiv, 2020).

[4] Yinghao Li, Shuai Tang, Virginia R. de Sa, "Supervised Spike Sorting Using Deep Convolutional Siamese Network and Hierarchical Clustering", (2019).

[5] Shuai Tang, Mahta Mousavi, Virginia de Sa, "An Empirical Study on Post-processing Methods for Word Embeddings", (ArXiv, 2019).

[6] Shuai Tang, Virginia de Sa, "Exploiting Invertible Decoders for Unsupervised Sentence Representation Learning", (ACL2019).

[7b] Shuai Tang, Paul Smolensky, Virginia de Sa, "A Simple Recurrent Unit with Reduced Tensor Product Representations", (ArXiv, 2019).

[7a] Shuai Tang, Paul Smolensky, Virginia de Sa, "Learning Distributed Representations of Symbolic Structure Using Binding and Unbinding Operations", (IRASL, NeurIPS2018).

[8b] Shuai Tang, Virginia de Sa, "Improving Sentence Representations with Consensus Maximisation", (IRASL, NeurIPS2018).

[8a] Shuai Tang, Virginia de Sa, "Multi-view Sentence Representation Learning", (ArXiv, 2018).

[9] Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, and Virginia de Sa, "Speeding up Context-based Sentence Representation Learning with Non-autoregressive Convolutional Decoding", (RepL4NLP, ACL2018).

[10] Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, and Virginia de Sa, "Trimming and Improving Skip-thought Vectors", (ArXiv, 2017).

[11] Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, and Virginia de Sa, "Rethinking Skip-thought: A Neighborhood based Approach", (RepL4NLP, ACL2017).

[12] Patrick W. Gallagher, Shuai Tang, and Zhuowen Tu, "What Happened to My Dog in That Network: Unraveling Top-down Generators in Convolutional Neural Networks", (ArXiv, 2015).

Conference Notes

[0] Shuai Tang, Andrej Zukov-Gregoric, "Conference Notes / ACL2018".

Academic Services

  • Member
    2018: Inaugural member of ACL Special Interest Group on Representation Learning (SIGREP)
  • Reviewer
    2022: ICLR, AISTATS, ACL, Data-Centric Engineering, NAACL
    2021: ICLR, EACL, AAAI, AISTATS, NAACL, ICML, ACL, NeurIPS, EMNLP, CogSci
    2020: AAAI, ICLR, ACL, ICML, EMNLP, NeurIPS
    2019: TNNLS, ICWSM, ICML, CogSci, ACL, NeurIPS, EMNLP/IJCNLP, SciPy
    2018: CogSci, NeurIPS

  • Teaching Assistant
    COGS 118B Intro to Machine Learning II (2020 Fall)
    COGS 9 Intro to Data Science (2020 Winter)
    COGS 118B Intro to Machine Learning II (2018 Fall)
    COGS 108 Data Science in Practice (2018 Winter)
    COGS 118B Intro to Machine Learning II (2017 Fall)
    COGS 181 Neural Networks and Deep Learning (2017 Winter)
    COGS 118B Intro to Machine Learning II (2016 Fall)
    COGS 118A Intro to Machine Learning I (2016 Winter)