The call for applications for the 2017 Telluride Neuromorphic Cognition Engineering Workshop is open. This year's workshop is centered on the theme of Neuromorphic Autonomous Agents. Please see the workshop website for details on how to apply. The application deadline is April 2, 2017.

Feb. 15 Application Deadline - 2017 Nengo Summer School

[All details about this school can be found online.]

The Centre for Theoretical Neuroscience at the University of Waterloo is inviting applications for our 4th annual summer school on large-scale brain modeling. This two-week school will teach participants how to use the Nengo software package to build state-of-the-art cognitive and neural models to run in simulation and on neuromorphic hardware. Nengo has been used to build what is currently the world's largest functional brain model, Spaun [1], and provides users with a versatile and powerful environment for designing cognitive and neural systems to run in simulated and real environments. For a look at last year's summer school, check out this short video:

We welcome applications from all interested graduate students, research associates, postdocs, professors, and industry professionals. No specific training in the use of modeling software is required, but we encourage applications from active researchers with a relevant background in psychology, neuroscience, cognitive science, robotics, neuromorphic engineering, computer science, or a related field.

[1] Eliasmith, C., Stewart, T. C., Choo, X., Bekolay, T., DeWolf, T., Tang, Y., & Rasmussen, D. (2012). A large-scale model of the functioning brain. Science, 338(6111), 1202–1205. DOI: 10.1126/science.1225266.

*Application Deadline: February 15, 2017*

Format: A combination of tutorials and project-based work. Participants are encouraged to bring their own ideas for projects, which may focus on testing hypotheses, modeling neural or cognitive data, implementing specific behavioural functions with neurons, expanding past models, or providing a proof-of-concept of various neural mechanisms. Hands-on tutorials, work on individual or group projects, and talks from invited faculty members will make up the bulk of day-to-day activities. A project demonstration event will be held on the last day of the school, with prizes for strong projects!

Topics Covered: Participants will have the opportunity to learn how to: build perceptual, motor, and sophisticated cognitive models using spiking neurons; model anatomical, electrophysiological, cognitive, and behavioural data; use a variety of single-cell models within a large-scale model; integrate machine learning methods into biologically oriented models; interface Nengo with various kinds of neuromorphic hardware (e.g. SpiNNaker); interface Nengo with cameras and robotic systems; implement modern nonlinear control methods in neural models; and much more…
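
For a flavour of the spiking-neuron building blocks such models are made of, here is a minimal leaky integrate-and-fire (LIF) neuron in plain Python. This is a hand-rolled sketch for illustration only, not Nengo code; the function name and all parameter values are our own invention.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane voltage
# decays toward rest, integrates the input current, and emits a spike
# (then resets) whenever it crosses threshold.

def lif_spike_times(current, dt=0.001, tau=0.02, v_th=1.0, v_reset=0.0):
    """Simulate one LIF neuron driven by a list of input currents.

    Returns the indices of the time steps at which the neuron spiked.
    """
    v = 0.0
    spikes = []
    for step, i_in in enumerate(current):
        # Euler integration of dv/dt = (i_in - v) / tau
        v += dt * (i_in - v) / tau
        if v >= v_th:
            spikes.append(step)
            v = v_reset
    return spikes

# A constant supra-threshold current makes the neuron fire regularly.
spikes = lif_spike_times([1.5] * 1000)
print(len(spikes), "spikes in 1 s of simulated time")
```

With a constant input, the interspike intervals settle to a fixed value; Nengo-style models compose thousands of such units and decode functions from their collective activity.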

Date and Location: June 4th to June 16th, 2017 at the University of Waterloo, Ontario, Canada.

Applications: Please visit the school's website, where you can find more information regarding costs, travel, and lodging, along with an application form listing required materials.

If you have any questions about the school or the application process, please contact Peter Blouw. We look forward to hearing from you!

Dear all,

** Apologies for cross-posting

I would like to draw your attention to a new special issue of JETCAS on "Low-Power, Adaptive Neuromorphic Systems: Devices, Circuits, Architectures and Algorithms" that I am co-editing with colleagues. The manuscript submission deadline is April 30, 2017. I look forward to receiving your contributions.


Low-Power, Adaptive Neuromorphic Systems: Devices, Circuits, Architectures and Algorithms

Guest Editors:
• Arindam Basu (corresponding), Nanyang Technological University
• Tanay Karnik
• Hai Li, Duke University
• Elisabetta Chicca, Bielefeld University
• Meng-Fan Chang, National Tsing Hua University
• Jae-sun Seo, Arizona State University

Scope and Purpose
The recent success of deep neural networks (DNNs) has renewed interest in bio-inspired machine learning algorithms. A DNN is a neural network with multiple layers (typically two or more) whose neurons are interconnected by tunable weights. Although these architectures are not new, the availability of large datasets, abundant computing power, and new training techniques that prevent the networks from over-fitting (such as unsupervised initialization, rectified linear units as the neuronal nonlinearity, and regularization using dropout or sparsity) has led to their great success in recent times. DNNs have been applied to a variety of fields, such as object and face recognition in images, word recognition in speech, and even natural language processing, and their success stories keep growing every day.
However, common training methods in deep learning, such as backpropagation, tune the weights of the network based on the gradient of an error function, which requires a known output value for every input. Such supervised learning methods are difficult to apply to real-time sensory input data, which are mostly unlabeled. In addition, the training and classification phases of deep neural networks are typically separated: training occurs in the cloud or on high-end graphics processing units, while the weights or synapses are fixed once the network is deployed for classification. This makes it difficult for the network to continuously adapt to input or environmental changes in real-world applications. By adopting the unsupervised and semi-supervised learning rules found in biological nervous systems, we anticipate enabling adaptive neuromorphic systems for many real-time applications with large amounts of unlabeled data, similar to how humans analyze and associate sensory input. Energy-efficient hardware implementation of these adaptive neuromorphic systems is particularly challenging because of the intensive computation, memory, and communication required for online, real-time learning and classification. Cross-layer innovations in algorithms, architectures, circuits, and devices are needed to enable adaptive intelligence, especially on embedded systems with severe power and area constraints.
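
To make the contrast concrete, here is a toy single-weight sketch (our own illustration, with invented names and values) of the two kinds of update rule discussed above: a backpropagation-style gradient step, which needs a labelled target for every input, next to a simple Hebbian rule, which adapts from the input alone.

```python
# Toy single-weight "network": y = w * x.

def supervised_step(w, x, target, lr=0.1):
    """Gradient step on the squared error 0.5 * (w*x - target)**2.
    Requires a labelled target for the input x."""
    error = w * x - target
    return w - lr * error * x

def hebbian_step(w, x, lr=0.01):
    """Hebbian update: strengthen the weight in proportion to the
    correlation of input and output. Needs no label at all.
    (Real rules add normalization or decay to keep w bounded.)"""
    y = w * x
    return w + lr * x * y

w_sup, w_heb = 0.5, 0.5
for x, target in [(1.0, 2.0), (2.0, 4.0), (1.5, 3.0)]:
    w_sup = supervised_step(w_sup, x, target)   # uses the label
    w_heb = hebbian_step(w_heb, x)              # ignores the label

print(round(w_sup, 3), round(w_heb, 3))
```

The supervised weight moves toward the value implied by the labels (here 2.0); the Hebbian weight adapts from the data stream alone, which is the kind of label-free, always-on learning the special issue targets.
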
Topics of Interest
This special issue invites submissions relating to all aspects of adaptive neuromorphic systems across algorithms, devices, circuits, and architectures. Possible scalability to human brain-scale computing level with energy-efficient online learning is desired. Submissions are welcome in the following topics or other related topics:
• Spin-mode adaptive neuromorphics with devices such as spin-transfer nano-oscillators, domain-wall memory, tunneling magnetoresistance, the inverse spin Hall effect, etc.
• Memristive-technology-based learning synapses and neurons
• Neuromorphic implementations of synaptic plasticity, short-term adaptation and homeostatic mechanisms
• Self-learning synapses (STDP and variants) and self-adaptive neuromorphic systems
• High fan-in scalable interconnect fabric technologies mimicking brain-scale networks
• Circuits and systems for efficient interfacing with post-CMOS memory based learning synapses
• Design methodology and design tools for adaptive neuromorphic systems with post-CMOS devices
• Algorithm, device, circuit, and architecture co-design for energy-efficient adaptive neuromorphic hardware
Important Dates
1. Manuscript submissions due: April 30, 2017
2. First decision: July 15, 2017
3. Revised manuscripts due: August 15, 2017
4. Final Decision: October 15, 2017
5. Final manuscripts due: November 15, 2017
Request for Information
Arindam Basu

Best regards,

A spike-based neuromorphic stereo-vision architecture that shows how spike timing helps resolve open problems (e.g. false negatives in stereo correspondence):
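
The core idea — that near-coincident spike times on the same epipolar line disambiguate left/right matches — can be sketched as follows. This is a simplified toy, not the paper's actual algorithm; the event format, function name, and coincidence window are invented for illustration.

```python
# Toy event-based stereo matcher: an event is (timestamp_s, row, col).
# Two events from the left and right cameras are candidate matches only
# if they lie on the same row (epipolar constraint) AND occur within a
# short coincidence window -- spike timing prunes wrong correspondences.

def match_events(left, right, window=0.001):
    """Greedily pair left/right events by row and temporal coincidence.
    Returns (col_left, col_right) pairs; disparity = col_left - col_right."""
    matches = []
    used = set()
    for t_l, row_l, col_l in left:
        for idx, (t_r, row_r, col_r) in enumerate(right):
            if idx in used:
                continue
            if row_l == row_r and abs(t_l - t_r) <= window:
                matches.append((col_l, col_r))
                used.add(idx)
                break
    return matches

left_events  = [(0.0100, 5, 40), (0.0200, 7, 12)]
right_events = [(0.0101, 5, 30),   # coincident with the first left event
                (0.0500, 7, 11)]   # same row but too late: rejected
print(match_events(left_events, right_events))  # -> [(40, 30)]
```

Note how the second right event is on the correct row but falls outside the coincidence window, so it is rejected: timing information rules out a pairing that a purely spatial matcher might accept.
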

I’ve accumulated some interesting(?) links over the last few years while following Deep Learning's development. FYI: the lists aren’t prioritized in any manner, nor are they complete; if you know of additional ones that should be noted, please let me know about them. I also apologize in advance if I've associated anyone with the wrong university or company; some of these change faster than I can keep up with, and I've slowed down in tracking them over the last few months.

Researcher websites:
• Yoshua Bengio, Deep Learning and AI researcher at the University of Montreal
• Yann LeCun, Deep Learning (especially CNNs) researcher at NYU; leads Facebook's AI research
• Yann LeCun's thoughts on IBM's TrueNorth chip
• Juergen Schmidhuber, Deep Learning (RNNs, LSTMs) and AI researcher at IDSIA in Switzerland
• Eugenio Culurciello, Deep Learning hardware developer with a focus on image processing
• AI researcher at UC Berkeley
• Geoffrey Hinton, Deep Learning and AI pioneer at the University of Toronto and Google
• Demis Hassabis, AI researcher at Google's DeepMind
• Eugene Izhikevich, researcher at Brain Corp, part of Qualcomm now
• Andrew Ng, Stanford researcher, now heading Baidu's research
• Bruno Olshausen, neuroscience researcher at UC Berkeley
• Ilya Sutskever, Deep Learning and AI researcher at OpenAI
• Gail Carpenter, neural modeling researcher
• Stephen Grossberg, neural modeling researcher

Some Companies:
• Focusing on medical applications
• Data Science Competitions
• Part of Intel now
• Demis Hassabis' company (part of Google)
• Hughes Research Labs, participated in the DARPA SyNAPSE program
• IBM TrueNorth site
• Qualcomm Zeroth Processor site
• Numenta, focuses on Hierarchical Temporal Memory (HTM) architectures (generative model, anomaly detection)
• Eugenio Culurciello's company
• Eugene Izhikevich's company (part of Qualcomm?)
• Facebook's Engineering page
• Microsoft's Distributed Machine Learning Toolkit page
• Juergen Schmidhuber's company
• OpenAI website
• Microsoft Research page
• Microsoft's Cognitive Toolkit page
• Google Research page
• Google TensorFlow site
• Amazon Cloud Machine Learning site
• Google Cloud Data Science site
• Google Cloud Machine Learning site

Some Research Centers:
• University of Brno speech research group
• NYU's Computational & Biological Learning Lab, where Yann LeCun works
• SpiNNaker website, modeling the human brain
• Swiss AI Lab IDSIA, where Schmidhuber works
• Stanford's research website

Software Websites and Repositories:
• Caffe website
• Baidu's speech recognition repository
• Deep Learning resources for computer vision
• Microsoft Computational Network Toolkit for Deep Learning and Machine Learning, runs on Windows
• Keras GitHub repository, Deep Learning for Python
• matconvnet GitHub repository, CNNs for MATLAB
• GitHub site for the RNNSharp toolkit
• RNNLIB repository by Alex Graves (Google DeepMind) for RNNs
• CURRENNT repository for RNNs
• Theano website
• Pylearn2, a machine learning library built on Theano
• Machine learning library for Python
• JHU HLT COE open-source ASR project, Kaldi
• Kaldi GitHub repository

Tutorials:
• A tutorial on Neural Nets, good introduction to concepts
• A tutorial on Neural Nets, good introduction to concepts
• A tutorial on Neural Nets, good introduction to concepts
• RNN tutorial
• Open Source Deep Learning Curriculum
• Stanford Machine Learning class
• JHU Machine Learning class

Other Worthwhile sites:
• About the SyNAPSE project
• Brain Architecture Project
• Institute of Neuromorphic Engineering website
• Human Brain Project
• MIT Tech Review article on DARPA SyNAPSE
• Critique of Deep Learning hype
• Singapore university Extreme Learning Machines website
• Deep Learning clearing house, website with links to lots of good stuff
• Deep Learning software links
• NVIDIA CUDA site
• NVIDIA Deep Learning hardware and software site
• CEVA, specialized hardware for DNN processing in mobile devices
• Open-source personal assistant website from the University of Michigan
• Allen Institute for AI
• Online Deep Learning book by Goodfellow, Bengio, and Courville


The Second Misha Mahowald Prize for Neuromorphic Engineering.

The Misha Mahowald Prize recognizes outstanding research in neuromorphic engineering in a broad sense: neurally-inspired hardware, but also neuromorphic software, algorithms, and architectures can compete for the award.

The Prize is awarded by a jury of international experts and carries a cash prize of USD 3000.

The inaugural prize was awarded in 2016 to IBM Research - Almaden for their ground-breaking project on the neuromorphic processor TrueNorth.

The competition is open to any individual or research group worldwide. A description of any type of neurally-inspired hardware, software, or algorithm may be submitted. The award is for an original, ground-breaking contribution to neuromorphic engineering. The work of individuals and groups will be considered equally. Only one winner is announced each year. There are no runners-up. Revised resubmissions are encouraged.

To apply:

Send an extended abstract in English of up to two DIN A4 pages, containing:

• Applicant(s) and affiliation(s)
• Contact person information
• Project title
• Brief description of the work, its novelty, and its potential impact, including images/tables/original paper links
• Link to a video, if applicable (authors must arrange for unrestricted online viewing of video)

Send the document as a PDF file (max. size 2 MB) to

If a video is included in the submission, a download link to the original source file should be included.

The submission deadline is February 1, 2017.

2017 Jury:

• Prof. Dr. Steve Furber, University of Manchester
• Dr. Dan Hammerstrom, DARPA
• Prof. Dr. Christof Koch, CSO, Allen Institute for Brain Science
• Dr. Dharmendra Modha, IBM Research - Almaden
• Dr. Eric Ryu, Master, Samsung Electronics
• Prof. Dr. Terrence Sejnowski, Salk Institute (head of the Jury)

The Prize is sponsored by iniLabs, a technology company based in Switzerland that invents, produces, and sells neuromorphic technologies for research. iniLabs plays no role in selecting the nominee.

The award is named for Misha Mahowald, a creative and influential pioneer, who passed away before she could see the field flourishing. She created some of the first neuromorphic circuits including the silicon retina and the silicon neuron.

How the brain recognizes faces: machine-learning system spontaneously reproduces aspects of human neurology.

Dear colleagues,

I am very pleased to let you know that our open-access article for the Wiley Encyclopedia of EEE is now available online.

Thanks to everybody for getting us to this point, and special thanks to those who pushed to have it open access.
Best wishes to everybody,


New PhD open positions for the NeuroAgents project are available at INI:
