NeurIPS is to Machine Learning (ML) researchers what the WEF Davos conference is to policymakers. Founded in 1987, the annual NeurIPS conference brings together the brightest minds of the ML world to debate new ideas, present the latest research and findings, and share perspectives on current and upcoming industry trends.

Borealis AI is a proud sponsor of the NeurIPS conference. As we attend each year, we aim to contribute by publishing and supporting cutting-edge research.

This year, the 2022 Borealis AI Fellowship winner, Amin Rakhsha, will present a paper on Operator Splitting Value Iteration. In the conference workshops, former intern and current employee Saghar Irandoust will present a paper on training vision transformers at the Has It Trained Yet workshop; Mahmoud Salem, a former Borealis AI intern, has an accepted paper on Gumbel-Softmax Selective Networks, which will be presented at the Challenges in Deploying and Monitoring Machine Learning Systems workshop; and Leo Feng, a current Borealis AI intern, will present a paper on Efficient Queries Transformer Neural Processes at the MetaLearn workshop.

Here’s a summary of what those papers cover:

Conference Paper

Amin Rakhsha’s conference paper on Operator Splitting Value Iteration.

This paper introduces new planning and reinforcement learning algorithms for discounted MDPs that utilize an approximate model of the environment to accelerate the convergence of the value function. Inspired by the splitting approach in numerical linear algebra, the paper describes the Operator Splitting Value Iteration (OS-VI) algorithm for both Policy Evaluation and Control problems and demonstrates that OS-VI can achieve a much faster convergence rate when the model is accurate enough.
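To give a flavor of the splitting idea on a toy problem: in policy evaluation, instead of the standard fixed-point update against the true dynamics, one can solve exactly against an approximate model and iterate only on the model error. The sketch below is illustrative, with a made-up random MDP; it is not the paper's algorithm in full or its experimental setup.

```python
import numpy as np

# Toy policy-evaluation problem: true dynamics P, approximate model P_hat.
# (Random MDP for illustration only.)
rng = np.random.default_rng(0)
n, gamma = 5, 0.9

P = rng.dirichlet(np.ones(n), size=n)      # true transition matrix
noise = rng.dirichlet(np.ones(n), size=n)
P_hat = 0.99 * P + 0.01 * noise            # approximate model, close to P
r = rng.random(n)                          # rewards under the policy

# Exact solution of V = r + gamma * P V, for reference.
V_true = np.linalg.solve(np.eye(n) - gamma * P, r)

V_vi = np.zeros(n)   # standard value iteration: V <- r + gamma * P V
V_os = np.zeros(n)   # splitting-style update: solve against P_hat,
                     # iterate on the model error (P - P_hat)
A = np.eye(n) - gamma * P_hat
for _ in range(20):
    V_vi = r + gamma * P @ V_vi
    V_os = np.linalg.solve(A, r + gamma * (P - P_hat) @ V_os)

print(np.abs(V_vi - V_true).max())  # standard VI error after 20 iterations
print(np.abs(V_os - V_true).max())  # splitting-based error (far smaller here)
```

When the model error is small, the effective contraction factor of the splitting update is driven by that error rather than by the discount factor alone, which is why it converges so much faster in this toy run.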

Workshop Papers

Saghar Irandoust’s workshop paper on Training a Vision Transformer.

Transformers have become central to recent advances in computer vision. However, training a vision Transformer (ViT) model from scratch can be resource-intensive and time-consuming. This paper introduces some algorithmic improvements to enable training a ViT model from scratch with limited hardware (1 GPU) and time (24 hours) resources. Using a new variant of the popular ImageNet1k benchmark, the paper demonstrates significant performance improvements given the proposed training budget.

Mahmoud Salem’s workshop paper on Gumbel-Softmax Selective Networks.

Selective neural networks are trained with an integrated option to abstain, allowing them to learn to recognize and optimize for the subset of the data distribution for which confident predictions can be made. However, optimizing selective networks is challenging due to the non-differentiability of the binary selection function (the discrete decision of whether to predict or abstain). This paper presents a general method for training selective networks that leverages the Gumbel-softmax reparameterization trick to make the selection decision differentiable, enabling end-to-end training.
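The core of the trick can be shown in a few lines: add Gumbel noise to the selection logits and take a temperature-controlled softmax, yielding a "soft" select/abstain sample that gradients can flow through. This is a minimal NumPy sketch of the reparameterization itself, not the paper's training setup or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=0.5):
    """Draw a relaxed one-hot sample that is differentiable w.r.t. logits."""
    # Gumbel(0, 1) noise via inverse transform sampling
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    y = np.exp(y - y.max())       # numerically stable softmax
    return y / y.sum()

# Two logits: [select, abstain]. The soft sample gates the prediction loss,
# so the otherwise discrete select/abstain decision admits gradients.
logits = np.array([2.0, 0.0])     # the network currently prefers "select"
sample = gumbel_softmax(logits)
p_select = sample[0]              # relaxed "probability" of predicting
print(sample, p_select)
```

At low temperatures the sample approaches a hard one-hot decision; at higher temperatures it stays smooth and easier to optimize.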

Leo Feng’s workshop paper on Efficient Queries Transformer Neural Processes.

This paper proposes Efficient Queries Transformer Neural Processes (EQTNPs), a more computationally efficient variant of Neural Processes (NPs). NPs are a popular method for estimating predictive uncertainty on target data points by conditioning on a context dataset, but current approaches require computation quadratic in the number of context data points per query, limiting their applications. The paper empirically shows that EQTNPs achieve results competitive with the state of the art.

Every year, the NeurIPS conference helps catalyze new approaches, ideas and collaborations within the AI and Machine Learning ecosystem. We are proud of the contribution that Borealis AI is able to make to this endeavor, both as a participant and as a member of the research community.

Dr. Foteini Agrafioti

Head of Borealis AI

The NeurIPS Conference runs from November 28th through December 9th, with in-person sessions from November 28th to December 1st. To learn more about the event, click here.

To find out what else the Borealis AI team is getting up to during #NeurIPS2022, click here.

Join our Prism Trading Competition!

The Prism Trading Competition invites aficionados of forecasting, trading, or adversarial attacks to join this fun, lighthearted challenge, test their models' capabilities, and win prizes. Prepare to buy high and sell low... wait, inverse that. The Prism arena awaits.

Learn more