• Created from MIT

  • A type of neural network based on a closer mathematical model of brain synapses, including the probabilistic firing of neurons

  • It models each neuron's state with a dynamical system, i.e. an ordinary differential equation (ODE), rather than a static activation

  • Because the model is continuous in time, it can be sampled at any frequency needed

  • The problem is that it is slow: both training and inference are bottlenecked by the numerical ODE solver
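The solver bottleneck above can be illustrated with a toy sketch (not the authors' code): a single liquid-time-constant-style neuron whose state follows an ODE, integrated with a fixed-step Euler solver. The parameter names (`tau`, `w`, `b`, `A`) and the specific equation are illustrative assumptions; the point is that every output sample costs many small solver substeps.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ltc_step(x, I, tau=1.0, w=1.0, b=0.0, A=1.0, dt=0.01, substeps=100):
    """Advance one neuron governed by dx/dt = -x/tau + f(I) * (A - x).

    A toy liquid-time-constant-style dynamic: the effective time constant
    depends on the input I through the gate f. One 'output' sample needs
    `substeps` tiny Euler steps -- this inner loop is the speed bottleneck.
    """
    for _ in range(substeps):
        f = sigmoid(w * I + b)          # input-dependent gate
        dx = -x / tau + f * (A - x)     # the neuron's ODE right-hand side
        x = x + dt * dx                 # explicit Euler update
    return x

# Driving the neuron with a short input sequence:
x = 0.0
for I in [1.0, 1.0, 0.0]:
    x = ltc_step(x, I)
```

Each call to `ltc_step` runs 100 solver iterations just to produce one state sample, which is why an adaptive or higher-order ODE solver dominates the runtime of the ODE-based variant.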

  • The same authors found a mathematical workaround: a closed-form approximation to the dynamic equation that yields one-shot estimates of future time steps, significantly speeding up both training and inference

    • The improved solution is called “Closed-form Continuous-time (CfC) networks”
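The one-shot idea can be sketched as follows. This is a hedged toy version, not the paper's parameterization: the learned sub-networks are replaced by fixed scalar functions (`wf`, `wg`, `wh` are made-up toy weights). The key structure is that the state at an arbitrary time `t` is computed directly by blending two branches with a time-decaying gate, so no solver loop is needed.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cfc_state(I, t, wf=1.0, wg=1.0, wh=0.5):
    """Toy closed-form continuous-time state at time t, in one shot.

    Structure: x(t) = sigma(-f(I) * t) * g(I) + (1 - sigma(-f(I) * t)) * h(I)
    f, g, h stand in for small learned networks; here they are fixed
    toy functions for illustration only.
    """
    gate = sigmoid(-(wf * I) * t)   # time-decaying gate
    g = np.tanh(wg * I)             # branch that dominates at small t
    h = np.tanh(wh * I)             # branch that dominates at large t
    return gate * g + (1.0 - gate) * h

# The state at t = 5.0 costs the same as at t = 0.1: no solver substeps.
x_early = cfc_state(1.0, 0.1)
x_late = cfc_state(1.0, 5.0)
```

Compared with the Euler loop of the ODE variant, evaluating any time horizon here is a single expression, which is where the training and inference speedup comes from.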
  • They have demonstrated the usefulness of these networks for robotics applications:

    • driving and navigation
    • drones that follow a target
  • In both cases, they simply swapped out the fully-connected sections of the neural network for their liquid neural network and inspected the attention maps

    • The attention maps are more aligned with how humans perceive our environment
    • It also generalizes well
      • In the drone example, they trained the drone to navigate a forest in summer conditions, and it reportedly still performed well in fall and winter.
      • In fact, they adapted the drone to follow someone carrying a red bag in an urban setting too
  • The other benefit of this network is that it is extremely compact, needing only 19 liquid neurons instead of hundreds of thousands of traditional neurons

    • By extension, this makes it efficient to run, and more interpretable
  • The reason it’s dubbed “liquid” is that the synaptic connections are dynamic and can adapt to new situations / perturbations.

  • The authors have multiple GitHub repos on the topic. The most relevant one thus far is ncps, which is compatible with both PyTorch and TensorFlow.