Tool

DensEMANN, Self-structuring algorithms

Author

Antonio Garcia Diaz (ULB), with Prof. Hugues Bersini (IRIDIA-CoDE, ULB)

Link

Description

Self-structuring algorithms are a distinctive family of algorithms for neural architecture search and neural network optimization. Given an initial (sometimes minimal) neural network architecture and a task to learn, a self-structuring algorithm modifies this architecture, by adding (growing) and/or removing (pruning) elements in it, while the architecture is simultaneously being trained on the task at hand. They thereby make it possible to automatically design small, efficient and specialized neural networks in a relatively short amount of time, for a very wide range of applications.
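
To make the grow-while-training idea concrete, here is a small illustrative sketch in Python with PyTorch. It is not DensEMANN itself: the toy network, the grow_hidden helper and the plateau threshold are all invented for illustration. A two-unit hidden layer is widened whenever training stalls, while the already-trained weights are kept.

    # Toy self-structuring loop (illustrative only, not DensEMANN itself):
    # grow a hidden layer while the network trains, keeping learned weights.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    torch.manual_seed(0)
    X = torch.randn(512, 20)                       # synthetic inputs
    y = (X[:, 0] * X[:, 1] > 0).long()             # synthetic binary labels

    fc1, fc2 = nn.Linear(20, 2), nn.Linear(2, 2)   # start minimal: 2 hidden units

    def grow_hidden(fc1, fc2, n_new=2):
        """Widen the hidden layer by n_new units, copying the trained weights."""
        h = fc1.out_features
        g1 = nn.Linear(fc1.in_features, h + n_new)
        g2 = nn.Linear(h + n_new, fc2.out_features)
        with torch.no_grad():
            g1.weight[:h] = fc1.weight
            g1.bias[:h] = fc1.bias
            g2.weight[:, :h] = fc2.weight
            g2.bias.copy_(fc2.bias)
            g1.weight[h:] *= 0.01                  # new units start near zero
            g2.weight[:, h:] *= 0.01
        return g1, g2

    for phase in range(4):                         # train, then grow if learning stalls
        opt = torch.optim.Adam(list(fc1.parameters()) + list(fc2.parameters()), lr=1e-2)
        for step in range(200):
            loss = F.cross_entropy(fc2(torch.relu(fc1(X))), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
        print(f"phase {phase}: hidden units = {fc1.out_features}, loss = {loss.item():.3f}")
        if loss.item() > 0.05:                     # crude plateau test (arbitrary threshold)
            fc1, fc2 = grow_hidden(fc1, fc2)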

DensEMANN is an algorithm that Antonio is co-developing with Prof. Hugues Bersini (IRIDIA-CoDE, ULB), and that belongs to the self-structuring family. It can automatically generate very small and efficient neural network architectures virtually from scratch, for data classification tasks (mainly image classification and equivalent problems). This makes DensEMANN well suited to designing neural networks for embedded systems and edge computing: instead of running the neural network on a large and powerful computer (often in the cloud), one can run it directly on smaller hardware such as a drone, smartphone or tablet.
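
As a pointer for the edge-computing use case, one common deployment route (assuming PyTorch and the ONNX format, neither of which DensEMANN itself prescribes) is to export the small trained network so that a lightweight runtime can execute it on a phone, drone or single-board computer:

    # Illustrative only: exporting a small trained network to ONNX so a
    # lightweight runtime can run it on edge hardware.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(20, 4), nn.ReLU(), nn.Linear(4, 2))
    print("parameters:", sum(p.numel() for p in model.parameters()))  # tiny model

    dummy_input = torch.randn(1, 20)               # example input shape
    torch.onnx.export(model, dummy_input, "small_net.onnx")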

Keywords: neural architecture search, neural networks, optimization 

Experience with the tool: 

Since the beginning of his PhD (2018), Antonio has been developing DensEMANN together with his supervisor, Prof. Hugues Bersini (IRIDIA-CoDE, ULB). He is currently co-writing with Prof. Bersini his fourth paper on this algorithm. Antonio also has experience with the neural network pruning techniques provided by the Python library “Fasterai”, developed by fellow researcher Nathan Hubens (UMONS), which also covers self-structuring.
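
For readers unfamiliar with pruning, here is a minimal sketch of magnitude pruning using PyTorch's built-in utilities. It shows the general family of techniques that Fasterai provides; it is not Fasterai's own API. The smallest-magnitude weights of a layer are zeroed out, shrinking the effective network.

    # Illustrative magnitude pruning with PyTorch's built-in utilities
    # (the general technique family; not Fasterai's own API).
    import torch
    import torch.nn as nn
    import torch.nn.utils.prune as prune

    layer = nn.Linear(20, 10)
    prune.l1_unstructured(layer, name="weight", amount=0.5)  # zero the 50% smallest weights
    prune.remove(layer, "weight")                            # make the pruning permanent
    print(f"sparsity: {(layer.weight == 0).float().mean().item():.0%}")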

To discuss further applications

Feel free to contact: antonio.garcia.diaz@ulb.be