We are looking for a Master’s level intern for a period of at least 6 months.
Who we are
ETIS is a joint research laboratory between CY Cergy Paris University, ENSEA Graduate School of Electrical Engineering, and CNRS (Sciences Informatiques).
Context
In embedded computing, the hardware constraints on Deep Neural Networks (DNNs) can be very strict, especially on microcontrollers or FPGAs, making most DNN architectures unusable due to their large size.
This can lead to suboptimal resource utilization, a problem that is even more pressing in hardware using different types of resources for DNN inference (e.g., for FPGAs: LUTs, DSPs, and BRAM).
Neural Architecture Search (NAS) is a solution to adapt neural networks to specific hardware constraints.
The proposed internship complements previous work on distributing DNNs across SoC-FPGA boards, which can be found on the HAL repository.
Internship Goals
- Conduct a state-of-the-art review of Neural Architecture Search for TinyML.
- Propose innovative solutions to this problem.
- Develop and implement a proof of concept for a Neural Architecture Search pipeline.
Your Profile
- Strong knowledge of deep learning.
- Some experience with Python and AI frameworks (specifically, PyTorch and/or TensorFlow).
- An understanding of embedded systems (e.g., FPGAs and the FINN framework) would be a plus.
Contact persons
- Stéphane Zuckerman, stephane.zuckerman@etis-lab.fr
- Mathieu Hannoun, mathieu.hannoun@ensea.fr