This year marks the 9th annual International Conference on Learning Representations (ICLR), taking place in a fully virtual format from May 4th through May 8th, 2021. ICLR is a premier academic…
The empirical success of deep learning has posed significant challenges to machine learning theory: Why can we efficiently train neural networks with gradient descent despite the highly non-convex optimization landscape? Why do over-parametrized…
TL;DR: We propose controllable counterfactuals (CoCo) to evaluate dialogue state tracking (DST) models on novel scenarios, which results in a significant performance drop of up to 30.8% for state-of-the-art DST models. Using CoCo for…
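As a rough illustration of the counterfactual idea behind CoCo (the actual generation pipeline is described in the post), the hypothetical Python sketch below edits a single slot value in a dialogue turn and checks whether a deliberately brittle toy tracker still recovers the gold state. The `toy_tracker` function, slot keys, and utterance are illustrative assumptions, not part of CoCo.

```python
# A minimal, hypothetical sketch of counterfactual DST evaluation: edit one
# slot value in a turn and test whether the tracker still predicts the
# correct dialogue state. The toy tracker and slot names below are
# illustrative stand-ins, not the CoCo pipeline or any real DST model.

def toy_tracker(utterance: str) -> dict:
    """A deliberately brittle tracker that only recognizes 'training' values."""
    state = {}
    for people in ("two", "four"):          # memorized training-set values
        if people in utterance:
            state["restaurant-book_people"] = people
    if "18:30" in utterance:
        state["restaurant-book_time"] = "18:30"
    return state

def make_counterfactual(utterance, state, slot, old_value, new_value):
    """Swap one slot value in both the utterance and the gold state."""
    return utterance.replace(old_value, new_value), {**state, slot: new_value}

utterance = "I need a table for two people at 18:30."
gold = {"restaurant-book_people": "two", "restaurant-book_time": "18:30"}

cf_utt, cf_gold = make_counterfactual(
    utterance, gold, "restaurant-book_people", "two", "nine"
)

print(toy_tracker(utterance) == gold)   # True: familiar scenario
print(toy_tracker(cf_utt) == cf_gold)   # False: the novel value exposes the gap
```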
We are proud to announce the 2020 winners of our Salesforce AI Research Grant. Each of our winners will receive a $50K grant to advance their work and help us shape the future of AI.
TL;DR: We find that current self-supervised learning approaches suffer from poor visual grounding and receive improper supervisory signals when trained on complex scene images. We introduce CAST to improve visual grounding during…
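To see why complex scenes can yield an improper supervisory signal, the hedged sketch below shows the standard contrastive recipe such approaches build on, where two random crops of one image are treated as a positive pair; on a cluttered scene, the two crops may contain entirely different objects. The encoder, transforms, and random image are toy stand-ins, and this is not the CAST method itself.

```python
# Standard contrastive setup (two random crops of one image form a "positive"
# pair) -- the setting the post critiques, not the CAST method.
import numpy as np
import torch
import torch.nn.functional as F
import torchvision.transforms as T
from PIL import Image

crop = T.Compose([T.RandomResizedCrop(224, scale=(0.2, 1.0)), T.ToTensor()])

def positive_pair(image):
    """Two independent random crops of the same image are treated as a match."""
    return crop(image), crop(image)

def pair_similarity(encoder, v1, v2, temperature=0.1):
    """Cosine similarity between the two crop embeddings, scaled by temperature."""
    z1 = F.normalize(encoder(v1.unsqueeze(0)), dim=-1)
    z2 = F.normalize(encoder(v2.unsqueeze(0)), dim=-1)
    return (z1 * z2).sum() / temperature

# Toy "complex scene" and a toy encoder standing in for a real backbone.
scene = Image.fromarray(np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8))
encoder = torch.nn.Sequential(torch.nn.AdaptiveAvgPool2d(1), torch.nn.Flatten(),
                              torch.nn.Linear(3, 128))
v1, v2 = positive_pair(scene)
print(pair_similarity(encoder, v1, v2))
```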
This year marks the 34th annual conference on Neural Information Processing Systems (NeurIPS [https://neurips.cc/]) reimagined for the first time ever in a fully virtual format. NeurIPS is a leading conference in the area…
TL;DR: We propose a new semi-supervised learning method that achieves state-of-the-art performance by learning jointly-evolved class probabilities and image representations. What are the existing semi-supervised learning methods? Semi-supervised learning aims to leverage…
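For context on what an existing semi-supervised method looks like, here is a hedged sketch of a common pseudo-labeling baseline (confidence-thresholded self-training), not the jointly-evolved method proposed in the post; the toy model, random tensors, and 0.95 threshold are illustrative assumptions.

```python
# A generic pseudo-labeling baseline: supervised loss on the labeled batch plus
# a self-training loss on confident predictions for the unlabeled batch.
import torch
import torch.nn.functional as F

def semi_supervised_loss(model, x_labeled, y_labeled, x_unlabeled, threshold=0.95):
    # Supervised cross-entropy on the small labeled batch.
    sup_loss = F.cross_entropy(model(x_labeled), y_labeled)

    # Pseudo-labels: the model's own confident predictions on unlabeled data.
    with torch.no_grad():
        probs = F.softmax(model(x_unlabeled), dim=1)
        conf, pseudo_y = probs.max(dim=1)
        mask = conf.ge(threshold).float()

    # Unsupervised term: train toward pseudo-labels, but only where confident.
    unsup = F.cross_entropy(model(x_unlabeled), pseudo_y, reduction="none")
    unsup_loss = (mask * unsup).mean()

    return sup_loss + unsup_loss

# Toy usage with random tensors standing in for images and labels.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
x_l, y_l = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
x_u = torch.randn(32, 3, 32, 32)
loss = semi_supervised_loss(model, x_l, y_l, x_u)
loss.backward()
```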
This year marks the 24th annual Conference on Empirical Methods in Natural Language Processing (EMNLP), reimagined for the first time ever in a fully virtual format. EMNLP is a leading conference in the area…