2024
Awesome Sparse Autoencoders
GitHub
Awesome Representation Engineering
GitHub
Awesome Large Language Model Unlearning
GitHub
2022
Toward Disentangling Double Descent and Information Flow in Deep Neural Networks
Chris Yuhao Liu, Brendan King, Jing Gu
Paper, code, slides
Learning to Extract Compact Vector Representations from Weight Matrices
Chris Yuhao Liu
Paper, code, slides
Understanding Biased Datasets and Machines Requires Rethinking Bias From Scratch
Chris Yuhao Liu, Yuhang Gan, Zichao Li, Ruilin Zhou
2021
Sample Complexity Scaling Laws For Adversarial Training
Chris Yuhao Liu
Paper, code
2020
TAPT: Text Augmentation Using Pre-Trained Transformers With Reinforcement Learning
Chris Yuhao Liu
Code
Conditional Generation of Research Paper Abstracts
Chris Yuhao Liu
Paper, code
Sentiment Analysis With Transformers
Chris Yuhao Liu
Code
Others
During the early stage of my research career, in 2020 and 2021, I also worked on several interesting projects that did not result in publications. Here are some of them:
- Structural Risk Minimization for Deep Neural Networks: A new regularization approach that applies structural risk minimization to directly reduce the generalization gap.
- What Determines Sample Complexity Rate in Practice?: We empirically estimate the power-law exponents governing how classification error scales with training-set size for various model architectures, and study how a wide range of training conditions alter them (a minimal fitting sketch follows this list).
- Faster Sample Complexity Rates With Ensemble Filtering: We present a dataset filtering approach that uses a set of classifiers, similar to an ensemble, to identify noisy (non-realizable) examples and exclude them, so that a faster sample complexity rate becomes achievable in practice (a sketch of the filtering step also follows this list).
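
For the exponent-estimation project, the basic procedure can be summarized with a minimal sketch, assuming a simple power-law model err(n) ≈ c · n^(−α): train on nested subsets of increasing size, record test error, and fit a line in log-log space. The numbers below are synthetic and purely illustrative, not results from the project.

```python
import numpy as np

# Hypothetical measurements: training-set sizes and the test errors obtained at each size.
n_train = np.array([1_000, 2_000, 4_000, 8_000, 16_000, 32_000])
test_err = np.array([0.30, 0.22, 0.16, 0.12, 0.09, 0.065])

# If err(n) ~ c * n^(-alpha), then log(err) = log(c) - alpha * log(n),
# so a linear fit in log-log space recovers the exponent.
slope, intercept = np.polyfit(np.log(n_train), np.log(test_err), deg=1)
alpha = -slope  # estimated power-law exponent

print(f"estimated sample-complexity exponent: {alpha:.2f}")
```

For the ensemble-filtering project, here is a minimal sketch of the idea under my own assumptions: train several classifiers on bootstrap resamples, count how many of them misclassify each example, and drop the examples that many models get wrong as likely noisy. The helper name, base classifier, and thresholds below are illustrative choices, not the project's actual implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ensemble_filter(X, y, n_models=10, vote_threshold=0.5, seed=0):
    """Return a boolean mask marking examples the ensemble considers clean."""
    rng = np.random.default_rng(seed)
    n = len(y)
    wrong_counts = np.zeros(n)
    for _ in range(n_models):
        idx = rng.choice(n, size=n, replace=True)           # bootstrap resample
        clf = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        wrong_counts += (clf.predict(X) != y)                # vote: did this model err here?
    # Keep examples misclassified by fewer than `vote_threshold` of the models;
    # the rest are treated as noisy (non-realizable) and excluded before training.
    return wrong_counts / n_models < vote_threshold

# Usage sketch: mask = ensemble_filter(X, y); train on X[mask], y[mask].
```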
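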