
Neural Arithmetic Logic Unit


Neural networks are being widely adopted across disciplines for computational tasks. One fundamental flaw, however, is that they are unable to count: they do not learn to represent and manipulate numbers in a systematic way. The neural arithmetic logic unit (NALU) was created to extend the neural accumulator (NAC) to allow for better arithmetic computation.
Neural networks have been shown to compute very well within the numerical range of the training sets they are given. Outside of that range, however, even a task as simple as learning the scalar identity function (f(x) = x) fails to extrapolate. With the introduction of NACs and NALUs, neural networks can learn and retain arithmetic behavior that holds beyond their original training sets.
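For reference, the NAC and NALU are defined in the original paper (Trask et al., 2018) by a small set of equations, where x is the input, a is the additive NAC output, m is the multiplicative path, and g is the learned gate:

```latex
a = W x, \qquad W = \tanh(\hat{W}) \odot \sigma(\hat{M}) \quad \text{(NAC)}
m = \exp\!\big(W \log(|x| + \epsilon)\big), \qquad g = \sigma(G x)
y = g \odot a + (1 - g) \odot m \quad \text{(NALU)}
```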
Building a NALU involves first building a NAC. A NAC outputs a linear transformation of its inputs and performs simple arithmetic operations such as addition and subtraction. In the Python code below, W is the effective transformation matrix, constructed from two learned parameter matrices Ŵ and M̂; a is the accumulation vector; and G parameterizes the learned sigmoid gate used by the full NALU. For some freedom, one can choose the standard deviation used to initialize the layers. Note that one will need TensorFlow and NumPy for the implementation:
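Since the original listing is not reproduced here, the following is a minimal sketch written against the TensorFlow 2 API; the function names nac_layer and nalu_layer and the init_stddev default are illustrative assumptions, not names taken from the article:

```python
import numpy as np
import tensorflow as tf

def nac_layer(x, out_units, init_stddev=0.02):
    # Neural accumulator: a linear layer whose effective weights are
    # biased toward {-1, 0, 1} by the tanh * sigmoid construction.
    in_units = int(x.shape[-1])
    # Learned parameter matrices W_hat and M_hat; init_stddev is the
    # free "standard deviation" knob mentioned above (assumed default).
    W_hat = tf.Variable(tf.random.truncated_normal(
        [in_units, out_units], stddev=init_stddev))
    M_hat = tf.Variable(tf.random.truncated_normal(
        [in_units, out_units], stddev=init_stddev))
    # Effective weight matrix: W = tanh(W_hat) * sigmoid(M_hat)
    W = tf.tanh(W_hat) * tf.sigmoid(M_hat)
    # Accumulation vector: a = xW (no bias, no nonlinearity)
    return tf.matmul(x, W), W

def nalu_layer(x, out_units, eps=1e-7, init_stddev=0.02):
    # NALU: gate between the additive NAC path and a multiplicative
    # path computed in log-space with the same weight matrix W.
    in_units = int(x.shape[-1])
    a, W = nac_layer(x, out_units, init_stddev)
    # m = exp(log(|x| + eps) W) realizes multiplication and division
    m = tf.exp(tf.matmul(tf.math.log(tf.abs(x) + eps), W))
    # Learned sigmoid gate: g = sigmoid(xG)
    G = tf.Variable(tf.random.truncated_normal(
        [in_units, out_units], stddev=init_stddev))
    g = tf.sigmoid(tf.matmul(x, G))
    return g * a + (1.0 - g) * m

# Quick smoke test on random inputs
x = tf.constant(np.random.rand(4, 2), dtype=tf.float32)
print(nalu_layer(x, out_units=1).numpy())
```

In a real training setup the variables would be created once, for example inside a Keras layer, rather than on every call; they are inlined here only to keep the sketch compact.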