Schedule 2013

Lecture slides are here:
http://www.illc.uva.nl/LaCo/clas/fncm13/lectures


Week 1: Dynamical Systems
28/10-1/11

Reading:

  • Sejnowski et al. 
  • Math or Neuroscience refreshers (if needed)
  • Extra: The dynamical systems theory covered in lectures 1-3 is also described in the (unpublished) reader Theoretical Biology by Rob de Boer (Utrecht University), chapter 13 (Mathematical Background), or in more detail in the (unpublished) reader Mathematics for Biologists by Kirsten ten Tusscher and Alexander Panfilov (Utrecht University).
Classes:
  1. L1. Introduction, Marr's Levels, Introduction to Dynamical Systems (differential and difference equations of 1 variable)
  2. L2. Matrix algebra, differential equations with two variables, types of equilibria
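As a taste of what L1-L2 cover, here is a minimal sketch (illustration only, not course code) of the two kinds of one-variable systems: a difference equation iterated in discrete time, and a differential equation integrated numerically with the forward-Euler method. The logistic map and the decay equation are assumed examples, not prescribed reading.

```python
def iterate_map(f, x0, n):
    """Iterate the difference equation x_{t+1} = f(x_t) for n steps."""
    xs = [x0]
    for _ in range(n):
        xs.append(f(xs[-1]))
    return xs

def euler(dxdt, x0, dt, steps):
    """Forward-Euler integration of the differential equation dx/dt = dxdt(x)."""
    x = x0
    for _ in range(steps):
        x += dt * dxdt(x)
    return x

# Logistic map x_{t+1} = r x (1 - x): for r = 2 it settles on the
# stable equilibrium x* = 1 - 1/r = 0.5.
traj = iterate_map(lambda x: 2.0 * x * (1.0 - x), x0=0.1, n=50)

# dx/dt = -x decays toward the stable equilibrium x* = 0.
x_end = euler(lambda x: -x, x0=1.0, dt=0.01, steps=1000)
```

Finding equilibria and checking their stability by iterating from nearby starting points is the kind of analysis these lectures prepare for.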


Week 2: Models of Single Neurons
4-8/11

Readings:


Classes:

  1. L. Nonlinear dynamical systems
  2. C. Computer lab: Analysis of differential equations in R, FitzHugh-Nagumo, Izhikevich
  3. W. Exercises on dynamical systems, discussion of literature. L. Hodgkin-Huxley, FitzHugh-Nagumo and Izhikevich models
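The lab itself works in R; purely as an illustration, here is a sketch of what simulating the FitzHugh-Nagumo model amounts to, using forward-Euler integration and the common textbook parameter values (assumed here, not prescribed by the course).

```python
def fitzhugh_nagumo(I, a=0.7, b=0.8, eps=0.08, dt=0.01, steps=100_000):
    """Integrate dv/dt = v - v^3/3 - w + I and dw/dt = eps*(v + a - b*w)."""
    v, w = -1.0, -0.5
    vs = []
    for _ in range(steps):
        dv = v - v**3 / 3 - w + I
        dw = eps * (v + a - b * w)
        v += dt * dv
        w += dt * dw
        vs.append(v)
    return vs

# With sustained input current the fixed point is unstable and the
# model fires repeatedly (a relaxation-oscillation limit cycle):
vs = fitzhugh_nagumo(I=0.5)
```

Plotting v against time shows the spike train; plotting v against w shows the limit cycle between the nullclines, which is the phase-plane analysis the week's exercises are about.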


Week 3: Neural networks I (Perceptron, Feedforward networks, Backpropagation)
11-15/11

Readings:
  • Kröse, Ben, and Patrick van der Smagt (1996), "An introduction to neural networks", chapters 2, 3, 4 (up to section 4.5). (pdf)
  • Marcus et al. (1999), Science; and replies in TICS.
Classes:
  1. L. Perceptron, delta rule, backpropagation
  2. C. Computer lab: Perceptron, delta rule, backpropagation
  3. W/L. Exercises, discussion of literature
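As a pointer to what the lab covers, here is a small sketch (not the lab code) of a perceptron trained with the delta rule on the linearly separable AND function; the learning rate and epoch count are illustrative choices.

```python
def step(z):
    """Threshold activation of the perceptron."""
    return 1 if z > 0 else 0

def train_perceptron(data, lr=0.1, epochs=20):
    """Delta rule: w <- w + lr * (target - output) * input.

    data is a list of ((x1, x2), target) pairs; the last weight is the bias.
    """
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for (x1, x2), t in data:
            y = step(w[0] * x1 + w[1] * x2 + w[2])
            err = t - y                  # the delta term
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            w[2] += lr * err             # bias input fixed at 1
    return w

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = train_perceptron(AND)
preds = [step(w[0] * x1 + w[1] * x2 + w[2]) for (x1, x2), _ in AND]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop finds separating weights; XOR, famously, would not converge, which motivates the multi-layer networks and backpropagation also covered this week.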


Week 4: Neural networks II (Hopfield Networks, Hebbian learning)
18-22/11

Readings:
  • Recommended: Kevin Gurney, An Introduction to Neural Networks - Lecture Notes (chapter 5 and chapter 6, up to section 2) (see chapter 5, chapter 6)
  • Alternatively: Kröse, Ben, and Patrick van der Smagt (1996), "An introduction to neural networks", chapter 5 (pdf)
Classes:
  1. L. Hopfield networks, Hebbian learning
  2. C. Computer lab: Hopfield networks
  3. L. Symbolic models (in particular: context-free grammars)
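To make the week's two ideas concrete, here is a toy sketch (illustration only) of a Hopfield network: Hebbian (outer-product) learning stores a bipolar pattern in the weights, and thresholded updates recover it from a corrupted cue.

```python
def train_hopfield(patterns, n):
    """Hebbian storage: W[i][j] = sum over patterns of x_i * x_j, zero diagonal."""
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j]
    return W

def recall(W, state, steps=10):
    """Synchronous updates s_i <- sign(sum_j W[i][j] * s_j)."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        s = [1 if sum(W[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s

pattern = [1, -1, 1, -1, 1, -1]
W = train_hopfield([pattern], n=6)
noisy = [-1, -1, 1, -1, 1, -1]      # first unit flipped
recovered = recall(W, noisy)
```

The stored pattern acts as an attractor: the corrupted cue falls back into it, which is the content-addressable-memory behaviour the lab explores at larger scale.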

Week 5: The binding problem (linking symbolic and neural models)
25-29/11

Readings:
  • Von der Malsburg, Christoph. "The what and why of binding: the modeler's perspective." Neuron 24.1 (1999): 95-104. (article)
  • Altmann, E. M. and Trafton, J. G. (1999). Memory for goals: An architectural perspective. In Proceedings of the twenty-first annual conference of the Cognitive Science Society (pp. 19-24). Hillsdale, NJ: Erlbaum.
    pdf
  • Terrence C. Stewart and Chris Eliasmith (2011). Neural Cognitive Modelling: A Biologically Constrained Spiking Neuron Model of the Tower of Hanoi Task. In Proceedings of the thirty-third annual conference of the Cognitive Science Society. Hillsdale, NJ: Erlbaum.
    pdf
Classes:
  1. L. Guest lecture - Gideon Borensztajn (ILLC, UvA): Dynamic Binding
  2. C. Computer lab - Recursion in the Tower of Hanoi and language
  3. W/L. Discussion of literature
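The lab contrasts recursion in the Tower of Hanoi with recursion in language; for reference, here is the classic recursive solution (2^n - 1 moves), sketched in Python rather than in the lab's own materials.

```python
def hanoi(n, source, target, spare, moves=None):
    """Move n disks from source to target, collecting (disk, from, to) moves."""
    if moves is None:
        moves = []
    if n > 0:
        hanoi(n - 1, source, spare, target, moves)   # clear the way
        moves.append((n, source, target))            # move the largest disk
        hanoi(n - 1, spare, target, source, moves)   # rebuild on top of it
    return moves

moves = hanoi(3, "A", "C", "B")
```

The point of the week is exactly the gap this sketch ignores: the recursion here lives in the call stack, and the readings ask how (or whether) a neural system could implement such stacked goals.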


Week 6: Miniprojects
2-6/12

Classes:
  1. L. Guest lecture - Leendert van Maanen (Psychology, UvA): Evidence accumulator models of decision making
  2. C. Computer lab - Projects
  3. W. Student presentations: groups 2 and 3. L. Introduction Bayesian Modelling
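For those new to the guest lecture's topic, here is a minimal sketch of an evidence accumulator (a random-walk / drift-diffusion style model): noisy evidence is summed until it crosses a decision threshold, yielding a choice and a reaction time. Drift, noise, and threshold values are illustrative assumptions.

```python
import random

def accumulate(drift=0.1, noise=1.0, threshold=10.0, max_steps=10_000, rng=None):
    """Accumulate noisy evidence until a decision boundary is crossed.

    Returns (choice, reaction_time): choice is +1 for the upper boundary,
    -1 for the lower; reaction_time is the number of accumulation steps.
    """
    rng = rng or random.Random()
    x, t = 0.0, 0
    while abs(x) < threshold and t < max_steps:
        x += drift + rng.gauss(0.0, noise)   # signal plus noise per step
        t += 1
    return (1 if x >= threshold else -1), t

rng = random.Random(0)
results = [accumulate(rng=rng) for _ in range(200)]
upper = sum(1 for choice, _ in results if choice == 1)
```

With a positive drift most trials end at the upper boundary, and harder decisions (smaller drift) produce slower, more error-prone responses: the qualitative pattern these models are fit to.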


Week 7: Bayesian Models
9-13/12

Readings:
  • Griffiths, T., Chater, N., Kemp, C., Perfors, A., & Tenenbaum, J. (2010). Probabilistic models of cognition: exploring representations and inductive biases Trends in Cognitive Sciences, 14 (8), 357-364 DOI: 10.1016/j.tics.2010.05.004 
  • McClelland, J., Botvinick, M., Noelle, D., Plaut, D., Rogers, T., Seidenberg, M., & Smith, L. (2010). Letting structure emerge: connectionist and dynamical systems approaches to cognition Trends in Cognitive Sciences, 14 (8), 348-356 DOI: 10.1016/j.tics.2010.06.002 
  • Extra: a nice blog post by Replicated Typo
  • Extra: a tutorial on Bayesian models in cognitive neuroscience by O'Reilly & Mars (in: Model-based cognitive neuroscience)
Classes:
  1. W. Student presentations: groups 4,5,6,7.
  2. C. Computer lab
  3. L. Modelling: from neuron to behavior
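A minimal sketch of the inference scheme at the heart of the Griffiths et al. reading: a posterior over hypotheses computed with Bayes' rule. The coin hypotheses and numbers below are made up for illustration.

```python
def posterior(priors, likelihoods):
    """Bayes' rule: P(h|d) is proportional to P(d|h) * P(h), normalised over h."""
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

# Two hypotheses about a coin: fair, or biased toward heads (P(heads) = 0.9).
priors = {"fair": 0.5, "biased": 0.5}
# Likelihood of observing three heads in a row under each hypothesis:
likelihoods = {"fair": 0.5 ** 3, "biased": 0.9 ** 3}
post = posterior(priors, likelihoods)
```

Three heads shift belief sharply toward the biased hypothesis; the debate in the two TiCS papers is about whether this level of probabilistic description, or the emergent-structure view, is the right starting point for cognitive modelling.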

Additional Reading
  • Dehaene, Stanislas, and Jean-Pierre Changeux. "A hierarchical neuronal network for planning behavior." Proceedings of the National Academy of Sciences 94.24 (1997): 13293-13298. (article) - another neural model of a Towers task: the Towers of London
  • Maass, Wolfgang, Thomas Natschläger, and Henry Markram. "Real-time computing without stable states: A new framework for neural computation based on perturbations." Neural computation 14.11 (2002): 2531-2560. (full article from publisher's site) - heavy mathematics in exciting paper introducing a new framework based on spiking networks and reservoir computing
  • Jeff Hawkins & Sandra Blakeslee, On Intelligence (2004), New York: Owl Books - popular science book, reviewing developments in AI and neuroscience and arriving at a new integration called the Memory Prediction Framework
  • Eliasmith, Chris, et al. "A large-scale model of the functioning brain." Science (2012), 338:1202-1205. (article) - paper on building a massive simulation of an artificial brain solving a variety of tasks - skips over all mathematical details
