Research

DVS vision sensors

We develop neuromorphic hardware, network models, and algorithms aimed at mimicking and understanding the computation underlying biological sensors and brain processing. This research falls within neuromorphic engineering, a field inspired by the way the brain computes.

Our neuromorphic sensors, for example, capture the operating principles of the biological retina and cochlea. The retina pixels or cochlea channels output events when they detect a feature. This asynchronous detection process leads to a naturally compressed, sparse output with low latency and high temporal resolution. Furthermore, the retina sensors operate over a very high dynamic range of background intensity.
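As a rough illustration of this event-based output, the sketch below models a single retina event as a timestamped (x, y, polarity) tuple and accumulates a batch of events into a frame for visualization. The names DVSEvent and events_to_frame, the field layout, and the microsecond timestamp convention are illustrative assumptions, not the sensor's actual interface.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DVSEvent:
    """A single address-event from one pixel (illustrative format).

    Each pixel independently emits an event only when it detects a local
    change, so the stream is sparse and finely timestamped instead of
    being a dense sequence of frames.
    """
    timestamp_us: int   # event time in microseconds
    x: int              # pixel column
    y: int              # pixel row
    polarity: bool      # True = ON (brightness increase), False = OFF

def events_to_frame(events: List[DVSEvent], width: int, height: int):
    """Accumulate a batch of events into a 2D histogram for visualization.

    Note that this discards the fine timing that makes the event stream
    useful for low-latency processing; it is only a convenient way to
    inspect the data.
    """
    frame = [[0] * width for _ in range(height)]
    for ev in events:
        frame[ev.y][ev.x] += 1 if ev.polarity else -1
    return frame
```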


Research Topics


  • Event-driven sensors 
    • Dynamic and Active Pixel Vision Sensor (DAVIS) and Dynamic Vision Sensor (DVS) retinas
    • Dynamic Audio Sensor (DAS) cochlea
  • Event-driven visual and auditory signal processing algorithms, e.g. noise filtering (sketched after this list), motion, orientation, optical flow, tracking, sound recognition, object recognition, and localization
  • Sensory-fusion deep neural models and algorithms
  • Spiking deep networks and their hardware implementations, including on logic hardware platforms
  • Low-latency sensors and algorithms for robotics (e.g. motion systems for microflyers)
  • Sensory-motor robotic applications (e.g. pen-balancing robot)
  • Logic and custom VLSI neural architectures for event-driven algorithms
    • active dendritic architectures
    • logic architectures for sensory processing
    • inference hardware
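
As one concrete example of the event-driven processing listed above, the sketch below implements a simple background-activity noise filter: an event is kept only if another event occurred nearby in space and time, otherwise it is treated as isolated noise. The tuple format, function name, and default time window are assumptions for illustration, not the group's actual implementation. Because each event touches only a small neighbourhood of one timestamp map, the cost scales with the event rate rather than with a frame rate.

```python
import numpy as np

def background_activity_filter(events, width, height, dt_us=10_000):
    """Keep an event only if a pixel in its 3x3 neighbourhood fired within
    the last dt_us microseconds; otherwise drop it as uncorrelated noise.

    `events` is an iterable of (timestamp_us, x, y, polarity) tuples sorted
    by timestamp. Returns the list of events that pass the filter.
    """
    last_ts = np.full((height, width), -np.inf)  # last event time per pixel
    kept = []
    for t, x, y, p in events:
        x0, x1 = max(0, x - 1), min(width, x + 2)
        y0, y1 = max(0, y - 1), min(height, y + 2)
        # Supported if any pixel in the neighbourhood was recently active.
        if (t - last_ts[y0:y1, x0:x1]).min() <= dt_us:
            kept.append((t, x, y, p))
        last_ts[y, x] = t  # update after the check
    return kept
```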

Workshops


We help organize the annual Telluride Neuromorphic Engineering Workshop and regularly participate in or co-organize tracks at the CapoCaccia Neuromorphic Cognition Workshop and the IEEE International Symposium on Circuits and Systems (ISCAS).