Bio

Ph.D. in Cognitive Science.

Applied Scientist at AWS.

Research Interests

Representation Learning: I am interested in efficiently forming and analysing representations learned by machine learning models. These representations take the form of vectors, matrices, and graphs.
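As an illustration of one such analysis, here is a minimal sketch of linear Centered Kernel Alignment (CKA), a standard way to compare the representation matrices of two models (the function and variable names below are my own, not tied to any specific project):

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between representation matrices X (n x d1) and Y (n x d2),
    where each row is one example's representation."""
    X = X - X.mean(axis=0)  # center each feature
    Y = Y - Y.mean(axis=0)
    # ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    num = np.linalg.norm(Y.T @ X, "fro") ** 2
    den = np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro")
    return num / den

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 32))        # representations from model A
Y = X @ rng.standard_normal((32, 16))     # a linear transform of X
print(round(linear_cka(X, X), 3))         # identical representations score 1.0
```

The score lies in [0, 1] and is invariant to orthogonal transforms and isotropic scaling of either representation, which makes it convenient for comparing layers of different widths.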

Linear Models: At the end of the day, we are all using linear regression.
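In that spirit, a minimal sketch of the closed-form ridge regression solution, w = (XᵀX + λI)⁻¹Xᵀy (illustrative code; the names are my own):

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: solve (X^T X + lam * I) w = X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
w_true = np.arange(1.0, 6.0)
y = X @ w_true + 0.01 * rng.standard_normal(200)
w_hat = ridge_fit(X, y, lam=1e-6)  # with tiny lam, close to ordinary least squares
```

Using `np.linalg.solve` rather than explicitly inverting XᵀX + λI is both faster and numerically safer.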

Current Status

Working on differential privacy and related topics.
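As a toy illustration of the area (not of any specific project), the Laplace mechanism releases a noisy query answer with ε-differential privacy; the sketch below assumes a counting query with L1 sensitivity 1:

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Release true_value plus Laplace noise with scale sensitivity/epsilon,
    giving epsilon-differential privacy for a query with that L1 sensitivity."""
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

rng = np.random.default_rng(0)
# Counting query: one person joining or leaving changes the count by at most 1.
exact_count = 1234
noisy_count = laplace_mechanism(exact_count, sensitivity=1.0, epsilon=0.5, rng=rng)
```

Smaller ε means stronger privacy but larger noise, since the noise scale grows as sensitivity/ε.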

News

04/2023, speaking at the Privacy Seminar at Google.

12/2022, speaking as a panelist at the SyntheticData4ML Workshop at NeurIPS.

09/2022, one paper accepted at NeurIPS2022, and one paper accepted at TSRML Workshop at NeurIPS2022.

05/2021, Applied Scientist at Amazon Web Services.

02/2021, Ph.D. in Cognitive Science. (mostly machine learning stuff)

02/2021, one paper accepted at AISTATS2021, and two papers rejected.

04/2017-02/2021, working with Prof. Virginia R. de Sa.

06/2020-09/2020, Research internship at Amazon in Seattle, WA. (Virtual)

04/2020, speaking at UCSD AI Seminar about my research on comparing neural networks with both features and gradients.

04/2020, two papers rejected by the International Conference on Machine Learning (ICML2020).

03/2020, a talk @ Vision research group led by Leonid Sigal at the University of British Columbia.

06/2019-12/2019, Research internship at Amazon in Cambridge, UK.

11/2019, a talk @ CLunch at the University of Pennsylvania.

11/2019, a talk @ Computational Linguistics group led by Paul Smolensky at the Johns Hopkins University.

10/2019, a talk @ Language Technology Lab at the University of Cambridge. (link to the abstract)

10/2019, a short paper with Wesley, Pablo, Andrew and Andreas accepted to MetaLearn workshop at NeurIPS 2019.

09/2019, three papers rejected by the Conference of Neural Information Processing Systems (NeurIPS2019).

06/2019, attending Amazon re:MARS 2019.

05/2019, a long paper accepted to the Annual Meeting of the Association for Computational Linguistics (ACL2019). (scores: 2, 4.5, 4.5 out of 5)

04/2019, two papers rejected by the International Conference on Machine Learning (ICML2019).

03/2019, attending the Amazon Grad Research Symposium in Seattle, USA.

02/2019, speaking at UCSD AI Seminar about my research on learning distributed representations of sentences. (slides)

12/2018, a talk at IRASL workshop @ NeurIPS 2018, slides are here.

12/2018, a long paper rejected by the International Conference on Learning Representations (ICLR2019). (scores: 5, 6, 7 out of 10, openreview.)

11/2018, two long papers, one with Virginia (poster) and the other one with Paul and Virginia (oral), accepted to IRASL workshop at NIPS2018. (scores: 4, 4, 4 out of 5 for both papers)

10/2018, invited to be an inaugural member of ACL Special Interest Group on Representation Learning (SIGREP).

06/2018-09/2018, research internship at Microsoft Research, Redmond, working with Prof. Paul Smolensky.

09/2018, Andrej Zukov-Gregoric and I met at ACL2018, and we decided to put our notes together in an organised file. Here it is.

09/2018, a long paper rejected by Neural Information Processing Systems (NIPS2018). (scores: 5, 6, 6 out of 10)

08/2018, a short paper rejected by Empirical Methods in Natural Language Processing (EMNLP2018). (scores: 2, 3, 3 out of 5)

05/2018, advanced to PhD candidate, committee members: Virginia de Sa, Ben Bergen, Eran Mukamel, Lawrence Saul, and Ndapa Nakashole. (slides)

05/2018, a long paper with Hailin, Chen, Zhaowen and Virginia, accepted to 3rd Workshop on Representation Learning for NLP. (scores: 3, 4 out of 5)

04/2018, two papers rejected by the annual meeting of the Association for Computational Linguistics (ACL2018). (scores: 4, 4, 4 out of 6, and 4, 4, 4 out of 6)

01/2018, one paper rejected by the International Conference on Learning Representations (ICLR2018). (scores: 3, 6, 7 out of 10)

11/2017, speaking at UCSD AI Seminar about my research on sentence representation learning. (slides)

09/2017, one paper rejected by the annual conference on Neural Information Processing Systems (NIPS2017). (scores: 4, 5, 6 out of 10)

06/2017-09/2017, research internship at Adobe Research, working on text-location based image search.

05/2017, a long paper with Hailin, Chen, Zhaowen, and Virginia, accepted to 2nd Workshop on Representation Learning for NLP. (scores: 3, 4 out of 5)

09/2015-12/2017, Machine Learning, Perception, and Cognition Lab, working with Prof. Zhuowen Tu.

06/2016-09/2016, research internship at Adobe Research, working on unsupervised sentence representation learning.

01/2016-04/2016, lab rotation in Prof. Garrison W. Cottrell's lab, working on the recurrent attention model for fine-grained classification tasks.

01/2015-04/2015, B.Sc. in Information Science @ Zhejiang University, working with Prof. Zhiyu Xiang.

Publications

"Improved Differentially Private Regression via Gradient Boosting", Shuai Tang, Sergul Aydore, Michael Kearns, Saeyoung Rho, Aaron Roth, Yichen Wang, Yu-Xiang Wang, Zhiwei Steven Wu.

"Private Synthetic Data for Multitask Learning and Marginal Queries", Giuseppe Vietri, Cedric Archambeau, Sergul Aydore, William Brown, Michael Kearns, Aaron Roth, Ankit Siva, Shuai Tang, Zhiwei Steven Wu, (NeurIPS 2022).

"Differentially Private Gradient Boosting on Linear Learners for Tabular Data", Saeyoung Rho, Cedric Archambeau, Sergul Aydore, Beyza Ermis, Michael Kearns, Aaron Roth, Shuai Tang, Yu-Xiang Wang, Steven Wu, (TSRML, NeurIPS 2022).

"Improving Robustness in Motor Imagery Brain-Computer Interfaces", Mahta Mousavi, Eric Lybrand, Shuangquan Feng, Shuai Tang, Rayan Saab, Virginia R. de Sa, (DistShift, NeurIPS2021).

"Fast Adaptation with Linearized Neural Networks", Wesley J. Maddox, Shuai Tang, Pablo G. Moreno, Andrew G. Wilson, Andreas Damianou, (AISTATS2021).

"Deep Transfer Learning with Ridge Regression", Shuai Tang, Virginia R. de Sa, (ArXiv, 2020)

"Similarity of Neural Networks with Gradients", Shuai Tang, Wesley J. Maddox, Charlie Dickens, Tom Diethe, Andreas Damianou, (ArXiv, 2020).

"Supervised Spike Sorting Using Deep Convolutional Siamese Network and Hierarchical Clustering", Yinghao Li, Shuai Tang, Virginia R. de Sa, (2019).

"An Empirical Study on Post-processing Methods for Word Embeddings", Shuai Tang, Mahta Mousavi, Virginia de Sa, (ArXiv, 2019).

"Exploiting Invertible Decoders for Unsupervised Sentence Representation Learning", Shuai Tang, Virginia de Sa, (ACL2019).

"A Simple Recurrent Unit with Reduced Tensor Product Representations", Shuai Tang, Paul Smolensky, Virginia de Sa, (ArXiv, 2019).

"Improving Sentence Representations with Consensus Maximisation", Shuai Tang, Virginia de Sa, (IRASL, NeurIPS2018).

"Speeding up Context-based Sentence Representation Learning with Non-autoregressive Convolutional Decoding", Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, and Virginia de Sa, (RepL4NLP, ACL2018).

"Trimming and Improving Skip-thought Vectors", Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, and Virginia de Sa, (ArXiv, 2017).

"Rethinking Skip-thought: A Neighborhood based Approach", Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, and Virginia de Sa, (RepL4NLP, ACL2017).

"What Happened to My Dog in That Network: Unraveling Top-down Generators in Convolutional Neural Networks", Patrick W. Gallagher, Shuai Tang, and Zhuowen Tu, (ArXiv, 2015).

Conference Notes

[0] Shuai Tang, Andrej Zukov-Gregoric, "Conference Notes / ACL2018".

Academic Services

  • Member
    2018: Inaugural member of ACL Special Interest Group on Representation Learning (SIGREP)
  • Reviewer
    ACL, NAACL, EACL, AACL, EMNLP,
    NeurIPS, ICML, ICLR, AISTATS, AAAI,
    TNNLS, TMLR

  • Teaching Assistant
    COGS 118B Intro to Machine Learning II (2020 Fall)
    COGS 9 Intro to Data Science (2020 Winter)
    COGS 118B Intro to Machine Learning II (2018 Fall)
    COGS 108 Data Science in Practice (2018 Winter)
    COGS 118B Intro to Machine Learning II (2017 Fall)
    COGS 181 Neural Networks and Deep Learning (2017 Winter)
    COGS 118B Intro to Machine Learning II (2016 Fall)
    COGS 118A Intro to Machine Learning I (2016 Winter)