# jemdoc: menu{MENU}{index.html}
# The first line of this file is a special command that tells jemdoc which menu
# entry in the file named MENU to associate this page with.
= Short Bio
Qi Lei is an assistant professor of Mathematics and Data Science at the [https://www.courant.nyu.edu/ Courant Institute of Mathematical Sciences] and the [https://cds.nyu.edu/ Center for Data Science] at [https://www.nyu.edu/ NYU]. Previously, she was an associate research scholar in the ECE department at Princeton University. She received her Ph.D. from the [https://www.oden.utexas.edu/ Oden Institute for Computational Engineering & Sciences] at UT Austin. She visited the Institute for Advanced Study (IAS)/Princeton for the Theoretical Machine Learning Program, and before that she was a research fellow at the Simons Institute for the Foundations of Deep Learning Program. Her research aims to develop sample- and computationally efficient machine learning algorithms and to bridge the gap between theory and practice in machine learning. Qi has received several awards, including the Outstanding Dissertation Award, the National Initiative for Modeling and Simulation Graduate Research Fellowship, the Computing Innovation Fellowship, and the Simons-Berkeley Research Fellowship.
== Research Interests:
Modern Machine Learning (ML) models are transforming applications across various domains. Pushing the limits of their potential relies on training more complex models, using larger data sets, and persistent hyper-parameter tuning. This procedure requires sophisticated user expertise, expensive equipment such as GPU machines, and extensive label-annotation costs. These requirements leave machine learning accessible only to specialized researchers and institutes. I aim to make machine learning more accessible to the general populace by developing *efficient* and *easily trainable* machine learning algorithms with *low computational cost*, *fewer security concerns*, and *a low requirement for labeled data*. Over the past seven years, I have focused on bringing more theoretical ideas and principles into algorithm design, working towards *efficient*, *robust*, and *few-shot* machine learning algorithms.
== Curriculum Vitae:
([cv_2.pdf Curriculum Vitae], [https://github.com/cecilialeiqi/ Github], [https://scholar.google.com/citations?user=kGOgaowAAAAJ&hl=en Google Scholar])
== Education
[http://www.utexas.edu/ University of Texas at Austin], Austin, TX
Ph.D. student in the [https://www.ices.utexas.edu/ Institute for Computational Engineering and Sciences] ~~~~~~ /August 2014 - May 2020/
[http://www.math.ias.edu/theoretical_machine_learning Institute for Advanced Study], Princeton, NJ
Visiting Graduate Student for the [https://www.math.ias.edu/sp/Optimization_Statistics_and_Theoretical_Machine_Learning Special Year on Optimization, Statistics and Theoretical Machine Learning] ~~~~~~ /September 2019 - May 2020/
[https://simons.berkeley.edu/programs/dl2019 Simons Institute], Berkeley, CA
Research Fellow for the [https://simons.berkeley.edu/programs/dl2019 Foundations of Deep Learning Program] ~~~~~ /May 2019 - August 2019/
[http://www.zju.edu.cn/english/ Zhejiang University], Zhejiang, China
B.S. in Mathematics ~~~~~~ /August 2010 - May 2014/
== Awards and Recognitions
- Computing Innovation Fellowship, CRA, 2020-2022
- Simons-Berkeley Research Fellowship, Simons Institute, 2019
- The National Initiative for Modeling and Simulation Research Fellowship ($225K for four years), UT Austin, 2014-2018
- Young Investigators Lecturer award, Caltech, 2021
- Outstanding Dissertation Award, Oden Institute, 2021
- Rising Star for EECS (An Academic Career Workshop for Women in EECS), UIUC, 2019; MIT, 2021
- Rising Star for Machine Learning, University of Maryland, 2021
- Rising Star for Computational and Data Science, UT Austin, 2020
- Meritorious Winner (First Prize) for the Mathematical Contest in Modeling (MCM), COMAP, 2014
- Gold medal (5th place) in China Girls Math Olympiad (CGMO, an international competition with a proof-based format similar to the International Math Olympiad), 2009
- First Prize for CMC (the mathematics competition for Chinese college students), China, 2012
- First Prize for the National Olympiad in Informatics in Provinces (NOIP), China, 2007 (perfect score) and 2008
== Experience
Facebook Visual Understanding Team, Menlo Park, CA
Software Engineering Intern ~~~~~~ /June 2018 - September 2018/
[https://www.a9.com/whatwedo/product-search/ Amazon/A9 Product Search Lab], Palo Alto, CA
Software Development Intern, Search Technologies ~~~~~ /May 2017 - August 2017/
Amazon Web Services (AWS) Deep Learning Team, Palo Alto, CA
Applied Scientist Intern ~~~~~~ /January 2017 - April 2017/
[https://www.research.ibm.com/labs/watson/ IBM Thomas J. Watson Research Center], Yorktown Heights, NY
Research Summer Intern ~~~~~~ /May 2016 - October 2016/
UCLA Biomath Department, Los Angeles, CA
Visiting Student ~~~~~~ /July 2013 - September 2013/
== Invited Talks
``Optimal Gradient-based Algorithms for Non-concave Bandit Optimization.''
- [http://bliss.eecs.berkeley.edu/Seminar/fa21/qi.html BLISS seminar], UC Berkeley, virtual 2021
- [https://simons.berkeley.edu/talks/optimal-gradient-based-algorithms-non-concave-bandit-optimization Sampling Algorithms and Geometries on Probability Distributions Workshop], Simons Institute, CA, 2021
``Few-Shot Learning via Learning the Representation, Provably.''
- [https://iclr.cc/media/Slides/iclr/2021/virtual(07-00-00)-07-00-00UTC-2535-few-shot_learni.pdf International Conference on Learning Representations], virtual, 2021
- IAS, Princeton, NJ, 2020
- Simons Institute Reunion, virtual, 2020
- UC Berkeley, virtual, 2020
``Predicting What You Already Know Helps: Provable Self-Supervised Learning.''
- Neural Information Processing Systems, virtual, 2021
- [https://www.ifml.institute/events Institute for Foundations of Machine Learning], virtual, 2020
- [https://www.oneworldml.org/past-events One-World ML seminar], virtual, 2020
- UW-Madison, virtual, 2020
``Provable representation learning.''
- [https://sites.google.com/view/slowdnn/ Young Researcher Spotlight Talk] at ``Seeking Low-dimensionality in Deep Learning'' workshop, virtual, 2020
- Microsoft Research (Redmond and NY), virtual, 2021
- [https://cms.caltech.edu/events/90169 Caltech Young Investigators Lecture Series], virtual, 2021
- [https://twitter.com/ml_umd/status/1458125253197058054 Rising star presentation] at the University of Maryland, virtual, 2021
- [https://talks-calendar.app.ist.ac.at/events/3329 ELLIS Talk at IST Austria], virtual, 2021
- [https://www.ece.gatech.edu/calendar/day/2021/12/03/108208?utm_source=ECE+Lists+2019&utm_campaign=a5a1732812-EMAIL_CAMPAIGN_2019_08_06_08_30_COPY_01&utm_medium=email&utm_term=0_0324b14402-a5a1732812-103287103 CSIP seminar at Georgia Tech], virtual, 2021
- USC Machine Learning Seminar, virtual, 2021
``SGD Learns One-Layer Networks in WGANs.''
- [https://papertalk.org/papertalks/6195 International Conference on Machine Learning] (ICML), virtual, 2020
- [https://simons.berkeley.edu/talks/sgd-learns-one-layer-networks-wgans Workshop on Learning and Testing in High Dimensions], Simons Institute, 2020
``Deep Generative models and Inverse Problems.''
- [http://faculty.smu.edu/sxu/SIAMTXLA19/submissions.html Minisymposium on Machine Learning for Solving Partial Differential Equations and Inverse Problems], 2019 SIAM Texas-Louisiana Section, Dallas, TX, 2019
- Princeton, virtual, 2020
- Google Research, virtual, 2021
``Similarity Preserving Representation Learning for Time Series Analysis.''
- [https://www.ijcai.org/proceedings/2019/394 The 28th International Joint Conference on Artificial Intelligence] (IJCAI), Macao, China, 2019
``Discrete Adversarial Attacks and Submodular Optimization with Applications to Text Classification.''
- Simons-Berkeley Fellows Talk, Berkeley, CA, 2019
- [https://www.youtube.com/watch?v=UnakqzVLVLI The Conference on Systems and Machine Learning] (SysML), Stanford, CA, 2019
``Recent Advances in Primal-Dual Coordinate Methods for ERM.''
- [https://meetings.siam.org/sess/dsp_programsess.cfm?SESSIONCODE=66077 Minisymposium on Recent Progress in Coordinate-wise Descent Methods], SIAM Conference on Computational Science and Engineering, Spokane, WA, 2019
- International Conference on Machine Learning (ICML), Sydney, 2017
``Coordinate Descent Methods for Matrix Factorization.''
- [https://archive.siam.org/meetings/an16/ Minisymposium on Recent Advances in Nonnegative Matrix Factorization], SIAM Annual Meeting, Boston, MA, 2016
== Service
*Conference Reviewer:* MLSys (19, 20, Meta-reviewer '21, TPC '22), COLT (21, 22), STOC (20), NeurIPS (16, 17, 18, 19, 20, 21), ICML (18, 19, 20, 21), ICLR (18, 19, 20, 21), AISTATS (18, 19, 20, 21), AAAI (20, 21), ACML (19), and more
*Journal Reviewer:* JSAIT (20), MOR (18, 19, 20), TNNLS (19, 20), TKDE (19), ISIT (17, 18), TIIS (17), IT (16, 17), and more
== Teaching
Theory of Deep Learning: Representation and Weakly Supervised Learning, Teaching Assistant, Fall 2020
Scalable Machine Learning, Teaching Assistant, Fall 2019
Mathematical Methods in Applied Engineering and Sciences, Instructor Intern, Spring 2016