Neocortical circuits, as large heterogeneous recurrent networks, can in principle operate on and process signals at multiple timescales, yet they appear differentially tuned to operate within particular temporal receptive windows. The modular and hierarchical organization of this selectivity mirrors anatomical and physiological relationships throughout the cortex and is likely determined by regional electrochemical composition. Because it is consistently patterned and actively regulated, the expression of molecules involved in synaptic transmission constitutes the most significant source of laminar and regional variability. Owing to their complex kinetics and adaptability, synapses are therefore a natural primary candidate for the substrate of this regional temporal selectivity. The ability of cortical networks to reflect the temporal structure of the sensory environment can thus be shaped by both evolutionary and experience-dependent processes.
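
The claim that synaptic kinetics set a circuit's temporal receptive window can be made concrete with a short sketch (ours, not the post's; the time constants below are illustrative assumptions): a first-order synapse acts as a low-pass filter, so slower synaptic kinetics make a network integrate over a longer temporal window.

```python
import numpy as np

def synaptic_filter(x, tau, dt=1e-3):
    """First-order exponential synapse, ds/dt = (x - s) / tau,
    i.e., a low-pass filter with cutoff near 1 / (2 * pi * tau)."""
    s = np.zeros_like(x)
    for i in range(1, len(x)):
        s[i] = s[i - 1] + dt * (x[i] - s[i - 1]) / tau
    return s

t = np.arange(0.0, 1.0, 1e-3)
x = np.sin(2 * np.pi * 2 * t) + np.sin(2 * np.pi * 40 * t)  # slow + fast components

fast_syn = synaptic_filter(x, tau=0.005)  # 5 ms (AMPA-like): both components largely survive
slow_syn = synaptic_filter(x, tau=0.100)  # 100 ms (NMDA/GABA_B-like): the 40 Hz
                                          # component is almost entirely filtered out
```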

International Journal of Computational & Neural Engineering

Source: goo.gl/k2cAmM

#SciDocPublishers #ComputationalNeuroscience

Neural network architectures, with original references (a minimal perceptron sketch follows this list):

- Feedforward neural networks (FF or FFNN) and perceptrons (P)
Rosenblatt, Frank. “The perceptron: a probabilistic model for information storage and organization in the brain.” Psychological Review 65.6 (1958): 386-408.
http://www.ling.upenn.edu/courses/cogs501/Rosenblatt1958.pdf

- Radial basis function networks (RBF)
Broomhead, David S., and David Lowe. Radial basis functions, multi-variable functional interpolation and adaptive networks. No. RSRE-MEMO-4148. Royal Signals and Radar Establishment, Malvern (United Kingdom), 1988.
http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA196234

- Hopfield networks (HN)
Hopfield, John J. “Neural networks and physical systems with emergent collective computational abilities.” Proceedings of the National Academy of Sciences 79.8 (1982): 2554-2558.
https://bi.snu.ac.kr/Courses/g-ai09-2/hopfield82.pdf

- Markov chains (MC), or discrete-time Markov chains (DTMC)
Hayes, Brian. “First links in the Markov chain.” American Scientist 101.2 (2013): 252.
http://www.americanscientist.org/libraries/documents/201321152149545-2013-03Hayes.pdf

- Generative adversarial networks (GAN)
Goodfellow, Ian, et al. “Generative adversarial nets.” Advances in Neural Information Processing Systems. 2014.
https://arxiv.org/pdf/1406.2661v1.pdf

- Boltzmann machines (BM)
Hinton, Geoffrey E., and Terrence J. Sejnowski. “Learning and relearning in Boltzmann machines.” Parallel Distributed Processing: Explorations in the Microstructure of Cognition 1 (1986): 282-317.
https://www.researchgate.net/profile/Terrence_Sejnowski/publication/242509302_Learning_and_relearning_in_Boltzmann_machines/links/54a4b00f0cf256bf8bb327cc.pdf

- Restricted Boltzmann machines (RBM)
Smolensky, Paul. Information processing in dynamical systems: Foundations of harmony theory. No. CU-CS-321-86. University of Colorado at Boulder, Dept. of Computer Science, 1986.
http://www.dtic.mil/cgi-bin/GetTRDoc?Location=U2&doc=GetTRDoc.pdf&AD=ADA620727

- Autoencoders (AE)
Bourlard, Hervé, and Yves Kamp. “Auto-association by multilayer perceptrons and singular value decomposition.” Biological Cybernetics 59.4-5 (1988): 291-294.
https://pdfs.semanticscholar.org/f582/1548720901c89b3b7481f7500d7cd64e99bd.pdf

- Sparse autoencoders (SAE)
Ranzato, Marc’Aurelio, Christopher Poultney, Sumit Chopra, and Yann LeCun. “Efficient learning of sparse representations with an energy-based model.” Advances in Neural Information Processing Systems. 2007.
https://papers.nips.cc/paper/3112-efficient-learning-of-sparse-representations-with-an-energy-based-model.pdf

- Variational autoencoders (VAE)
Kingma, Diederik P., and Max Welling. “Auto-encoding variational Bayes.” arXiv preprint arXiv:1312.6114 (2013).
https://arxiv.org/pdf/1312.6114v10.pdf

- Denoising autoencoders (DAE)
Vincent, Pascal, et al. “Extracting and composing robust features with denoising autoencoders.” Proceedings of the 25th International Conference on Machine Learning. ACM, 2008.
http://machinelearning.org/archive/icml2008/papers/592.pdf

- Deep belief networks (DBN)
Bengio, Yoshua, et al. “Greedy layer-wise training of deep networks.” Advances in Neural Information Processing Systems 19 (2007): 153.
https://papers.nips.cc/paper/3048-greedy-layer-wise-training-of-deep-networks.pdf

- Deconvolutional networks (DN)
Zeiler, Matthew D., et al. “Deconvolutional networks.” 2010 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2010.
http://www.matthewzeiler.com/pubs/cvpr2010/cvpr2010.pdf

- Deep convolutional inverse graphics networks (DCIGN)
Kulkarni, Tejas D., et al. “Deep convolutional inverse graphics network.” Advances in Neural Information Processing Systems. 2015.
https://arxiv.org/pdf/1503.03167v4.pdf
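
To make the first entry concrete, here is a minimal sketch of Rosenblatt-style perceptron learning. This is an illustrative toy, not code from any of the papers above; the dataset (logical OR) and the learning rate are assumptions chosen for brevity.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Rosenblatt's error-driven rule: w += lr * (target - prediction) * x."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a constant bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, target in zip(Xb, y):
            pred = 1 if xi @ w > 0 else 0      # hard threshold unit
            w += lr * (target - pred) * xi     # weights change only on errors
    return w

# Toy linearly separable problem: logical OR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])
w = train_perceptron(X, y)
print([1 if np.append(x, 1.0) @ w > 0 else 0 for x in X])  # -> [0, 1, 1, 1]
```

The perceptron convergence theorem guarantees that this rule finds a separating hyperplane whenever one exists, which is why the linearly separable OR problem works here while XOR would not.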

Credit: asimovorg (The Asimov Institute)


Voluntary muscle movement is known to coincide with changes in 20-30 Hz (beta-band) oscillations in electrical recordings of both the brain (EEG) and the muscles (EMG). This computational model, published in Neurocomputing, shows how the descending motor tract can translate patterns of oscillation in motor cortex into patterns of muscle contraction in a simulated limb. It offers a new perspective on the functional role of oscillations in motor control; a toy illustration follows the link below.

http://www.sciencedirect.com/science/article/pii/S0925231215008814
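
As a toy illustration only (this is not the model from the paper; the 25 Hz carrier, the desynchronization window, and the 50 ms muscle time constant are all assumed values), one simple way a downstream pathway could turn a cortical beta-band oscillation into a contraction signal is to rectify it and low-pass it with a muscle-like time constant:

```python
import numpy as np

dt = 1e-3
t = np.arange(0.0, 2.0, dt)

# Toy "motor cortex" signal: a 25 Hz oscillation whose amplitude drops
# between 0.8 s and 1.2 s (mimicking movement-related beta desynchronization).
amp = np.where((t > 0.8) & (t < 1.2), 0.2, 1.0)
beta = amp * np.sin(2 * np.pi * 25 * t)

# Toy readout: rectify, then low-pass with an assumed 50 ms muscle time constant.
tau = 0.05
drive = np.abs(beta)
activation = np.zeros_like(drive)
for i in range(1, len(t)):
    activation[i] = activation[i - 1] + dt * (drive[i] - activation[i - 1]) / tau
# 'activation' now tracks the beta envelope, so changes in oscillation
# amplitude map directly onto changes in simulated contraction level.
```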

2017 Summer School on Large-Scale Brain Modelling

[All details about this school can be found online at http://www.nengo.ca/summerschool]

The Centre for Theoretical Neuroscience at the University of Waterloo is inviting applications for our 4th annual summer school on large-scale brain modeling. This two-week school will teach participants how to use the Nengo software package to build state-of-the-art cognitive and neural models to run in simulation and on neuromorphic hardware. Nengo has been used to build what is currently the world's largest functional brain model, Spaun [1], and provides users with a versatile and powerful environment for designing cognitive and neural systems to run in simulated and real environments. For a look at last year's summer school, check out this short video: https://goo.gl/EkhWCJ
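
For a flavor of what working with Nengo looks like, here is a minimal sketch using the public Nengo Python API (this example is ours, not part of the announcement): a population of spiking neurons representing a one-dimensional sine signal.

```python
import numpy as np
import nengo

model = nengo.Network(label="minimal example")
with model:
    stim = nengo.Node(lambda t: np.sin(2 * np.pi * t))  # 1 Hz input signal
    ens = nengo.Ensemble(n_neurons=100, dimensions=1)   # spiking population
    nengo.Connection(stim, ens)                         # encode the input into spikes
    probe = nengo.Probe(ens, synapse=0.01)              # decoded, filtered output

with nengo.Simulator(model) as sim:
    sim.run(1.0)  # simulate one second of model time

# sim.data[probe] holds the population's decoded estimate of the sine wave.
```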

We welcome applications from all interested graduate students, research associates, postdocs, professors, and industry professionals. No specific training in the use of modeling software is required, but we encourage applications from active researchers with a relevant background in psychology, neuroscience, cognitive science, robotics, neuromorphic engineering, computer science, or a related field.

[1] Eliasmith, C., Stewart T. C., Choo X., Bekolay T., DeWolf T., Tang Y., Rasmussen, D. (2012). A large-scale model of the functioning brain. Science. Vol. 338 no. 6111 pp. 1202-1205. DOI: 10.1126/science.1225266. [http://nengo.ca/publications/spaunsciencepaper]

**Application Deadline: February 15, 2017**

Format: A combination of tutorials and project-based work. Participants are encouraged to bring their own ideas for projects, which may focus on testing hypotheses, modeling neural or cognitive data, implementing specific behavioural functions with neurons, expanding past models, or providing a proof-of-concept of various neural mechanisms. Hands-on tutorials, work on individual or group projects, and talks from invited faculty members will make up the bulk of day-to-day activities. A project demonstration event will be held on the last day of the school, with prizes for strong projects!

Topics Covered: Participants will have the opportunity to learn how to: build perceptual, motor, and sophisticated cognitive models using spiking neurons; model anatomical, electrophysiological, cognitive, and behavioural data; use a variety of single-cell models within a large-scale model; integrate machine learning methods into biologically oriented models; interface Nengo with various kinds of neuromorphic hardware (e.g. SpiNNaker); interface Nengo with cameras and robotic systems; implement modern nonlinear control methods in neural models; and much more…

Date and Location: June 4th to June 16th, 2017 at the University of Waterloo, Ontario, Canada.

Applications: Please visit http://www.nengo.ca/summerschool, where you can find more information regarding costs, travel, and lodging, along with an application form listing the required materials.

If you have any questions about the school or the application process, please contact Peter Blouw (pblouw@uwaterloo.ca). We look forward to hearing from you!

Death and rebirth of neural activity in sparse inhibitory networks

David Angulo-Garcia, Stefano Luccioli, Simona Olmi, Alessandro Torcini
doi: http://dx.doi.org/10.1101/082974

Computational Theory of Mind


Thank you, can't wait to dig in!