Cédric Goemaere
An outlet for my brain
  • Predictive Coding as backprop with local losses

    Deep Learning has been massively successful in recent years, in part because of the efficiency of its learning algorithm: backpropagation. However, the brain is also pretty good at learning (citation needed), and it’s definitely not doing backprop. One theory of how the brain learns is Predictive Coding (PC), which has recently... [Read More]
  • Why I stopped working on Hopfield networks

    The first year of my PhD eventually became a full deep dive into Hopfield networks. These neural networks, inspired by biological memory, promise a new direction for AI that is more biologically plausible than traditional Deep Learning — and significantly more energy-efficient. And yet, after that first year, I shifted... [Read More]
  • Classifiers as generators: ditch the softmax!

    This post was inspired by Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One. It made me rethink how image classifiers work: they’re much more nuanced than you might suspect. [Read More]