SNN Workshop 2020
Recent years have seen many exciting new developments in training spiking neural networks to perform complex information processing. This online workshop brings together researchers in the field to present their work and discuss ways of translating these findings into a better understanding of neural circuits. Topics include artificial and biologically plausible learning algorithms, and the dissection of trained spiking circuits toward a better understanding of neural processing. The programme has a manageable number of talks, with ample time for discussion.

The workshop was organised by Dan Goodman and Friedemann Zenke.

Registration

Registration is now closed. You can watch a replay of the talks on Crowdcast, or the talks plus one of the recorded discussions on YouTube. Note that Claudia Clopath's talk and the day 2 discussion were not recorded.

Agenda

Talks are 45 minutes long (a 30-minute presentation plus 15 minutes of questions and discussion). Hover over a talk title to see its abstract (where available).
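Times in the agenda below are given in Central European Summer Time (UTC+2). As a convenience, here is a minimal JavaScript sketch for converting a slot to your own time zone; the `localTime` helper is illustrative and not part of the repository:

```javascript
// Convert a workshop slot given in CEST (UTC+2) to the reader's local time.
function localTime(isoDate, hhmm) {
  // Build an ISO 8601 timestamp with an explicit +02:00 offset so the
  // Date constructor interprets the slot unambiguously.
  const slot = new Date(`${isoDate}T${hhmm}:00+02:00`);
  return slot.toLocaleString();
}

// Example: on a machine set to US Eastern time, the first line prints
// "8/31/2020, 8:00:00 AM".
console.log(localTime('2020-08-31', '14:00')); // first session of day 1
console.log(localTime('2020-09-01', '18:10')); // closing discussion of day 2
```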

August 31st

Time (CEST, UTC+2)   Session
14:00                Welcome by the organisers
14:10                Sander Bohte (CWI): Effective and Efficient Computation with Multiple-timescale Spiking Recurrent Neural Networks
14:55                Iulia M. Comsa (Google Research): On temporal coding in spiking neural networks with alpha synaptic function
15:40                Break (30 min)
16:10                Franz Scherr (TUG): E-prop: A biologically inspired paradigm for learning in recurrent networks of spiking neurons
16:55                Emre Neftci (UC Irvine): Synthesizing Machine Intelligence in Neuromorphic Computers with Differentiable Programming
17:40                Break (30 min)
18:10                Discussion (can continue as long as needed): Current technical constraints and bottlenecks. We can train spiking neural networks. What now?

September 1st

Time (CEST, UTC+2)   Session
14:00                Welcome by the organisers
14:10                Timothee Masquelier (CNRS Toulouse): Back-propagation in spiking neural networks
14:55                Claudia Clopath (Imperial College): Training spiking neurons with FORCE to uncover hippocampal function
15:40                Break (30 min)
16:10                Richard Naud (U Ottawa): Burst-dependent synaptic plasticity can coordinate learning in hierarchical circuits
16:55                Julian Goeltz (Uni Bern): Fast and deep neuromorphic learning with time-to-first-spike coding
17:40                Break (30 min)
18:10                Discussion (can continue as long as needed): Why spiking?

Discussion topics

Monday, August 31

We will discuss two topics:

Current technical constraints and bottlenecks.

How large do SNNs need to be, and how do we train them, to support our research? What are the next steps the field should take to move forward swiftly?

We can train spiking neural networks. What now?

How do we leverage this methodological advance to gain a better understanding of how the brain processes information? What constitutes a conceptual advance? And how do we compare trained spiking neural networks to biology?

Tuesday, September 1

Why spiking?

Neurons communicate via precisely timed, discrete pulses rather than by analogue signals. Why? Is there a computational advantage to this mode of communication, or is it just a way to save energy? With the recent advances in our ability to train spiking neural networks discussed in this workshop, can we throw new light on this age-old discussion and outline a programme to resolve it?
