Research

Contextual Recurrent Neural Networks

There is an implicit assumption that by unfolding recurrent neural networks (RNNs) in finite time, the misspecification of choosing a zero value for the initial hidden state is mitigated by later time steps. This assumption has been shown to work in practice, and alternative initializations, while occasionally suggested, are often overlooked. In this paper, we propose a method of parameterizing the initial hidden state of an RNN. The resulting architecture, referred to as a Contextual RNN, can be trained end-to-end. Performance on an associative retrieval task improves when the RNN's initial hidden state is conditioned on contextual information from the input sequence. Furthermore, we propose a novel method of conditionally generating sequences using the hidden state parameterization of the Contextual RNN.
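
As a rough illustration of the idea (a minimal sketch, not the paper's exact architecture), the code below conditions a GRU's initial hidden state on a summary of the input sequence instead of starting from zero. The mean-pooled context vector, the layer sizes, and the `ContextualRNN` name are illustrative assumptions:

```python
import tensorflow as tf

class ContextualRNN(tf.keras.Model):
    """Sketch: condition the RNN's initial hidden state on the input.

    Hypothetical illustration only; the context summary (mean pooling)
    and layer sizes are assumptions, not the paper's architecture.
    """

    def __init__(self, units=64):
        super().__init__()
        # Maps a context vector to a learned initial hidden state h0.
        self.h0_net = tf.keras.layers.Dense(units, activation="tanh")
        self.rnn = tf.keras.layers.GRU(units, return_sequences=True)

    def call(self, inputs):
        # inputs: (batch, time, features)
        context = tf.reduce_mean(inputs, axis=1)   # crude sequence summary
        h0 = self.h0_net(context)                  # input-conditioned h0
        return self.rnn(inputs, initial_state=h0)  # rather than h0 = 0

model = ContextualRNN(units=64)
outputs = model(tf.random.normal([8, 20, 16]))     # shape (8, 20, 64)
```

A standard RNN would amount to replacing `h0` with `tf.zeros_like(h0)`; the learned, input-conditioned state is what the abstract refers to as parameterizing the initial hidden state.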

Numerai Competition

Numerai is a hedge fund that runs a weekly competition to source predictions it uses to make…


TensorFlow from Node.js

A tutorial on loading a TensorFlow graph in Node.js, with techniques that carry over to other host languages.


What could you do as an AI-powered company?