Notes on readings related to reinforcement learning, neural nets, cognitive architectures, AI, and related topics.
See the live version at https://nickjalbert.github.io/reading/
- Start spellcheck: :set spell
- Move between misspelled words: ]s and [s
- Get suggestions: z=
- Add word to dictionary: zg
- Mark a word as incorrect: zw
- Jekyll theme:
/usr/local/lib/ruby/gems/2.7.0/gems/minima-2.5.1
- Serve the site locally:
./scripts/serve
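For reference, a minimal sketch of what serving the site locally involves, assuming a standard Bundler-based Jekyll setup (the actual contents of ./scripts/serve may differ):

```shell
# Install the gems pinned in the Gemfile, then serve with auto-regeneration.
bundle install
bundle exec jekyll serve --livereload
# The site is then available at http://127.0.0.1:4000/
```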
- Strang Lecture 14: Orthogonal Vectors and Subspaces
- Read and give feedback on Andy's agentos proposal
- Start mouse simulator work: implement inference on two+ timescales (e.g. phi vs v_p).
- Strang Lecture 9: Independence, Basis and Dimension
- Strang Lecture 10: The Four Fundamental Subspaces
- Discuss mouse simulator next steps
- Figure out where the dynamical equations from exercise 3 come from.
- Finish section 2 of the tutorial.
- Discuss what more applied engineering here looks like.
- A tutorial on the free-energy framework for modelling perception and learning: exercise 3
- 3.1 Matrix Transformations
- Strang Lecture 8: Solving Ax = b: Row Reduced Form R
- Surfing Uncertainty: Chapter 3
- Strang Lecture 6: Column Space and Nullspace
- The free-energy principle: a unified brain theory?
- Introduction to Bayesian data analysis - Part 1: What is Bayes?
- Introduction to Bayesian data analysis - Part 2: Why use Bayes?
- Introduction to Bayesian data analysis - Part 3: How to do Bayes?
- Strang Lecture 14: Orthogonal Vectors and Subspaces
- Surfing Uncertainty: Chapter 2
- Six views of embodied cognition
- Surfing Uncertainty: Chapter 1
- Memory engrams: Recalling the past and imagining the future
- Surfing Uncertainty: Introduction
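The two-timescale inference mentioned in the mouse-simulator item above can be sketched with the gradient updates from Bogacz's free-energy tutorial: a fast update doing inference on phi, and a much slower update of the prior mean v_p. The generative model g(phi) = phi^2 and all constants below are illustrative assumptions, not the simulator's actual setup.

```python
# Sketch of predictive-coding inference on two timescales (fast phi, slow v_p),
# following the gradient updates in Bogacz's free-energy tutorial.
# Assumed model: observation u = g(phi) + noise, with g(phi) = phi**2,
# and a Gaussian prior on phi with mean v_p.

def simulate(u=2.0, phi=3.0, v_p=3.0, sigma_p=1.0, sigma_u=1.0,
             lr_phi=0.01, lr_v=0.001, steps=20000):
    """Fast gradient inference on phi; slow learning of the prior mean v_p."""
    for _ in range(steps):
        eps_p = (phi - v_p) / sigma_p       # prediction error against the prior
        eps_u = (u - phi ** 2) / sigma_u    # prediction error against the observation
        phi += lr_phi * (eps_u * 2 * phi - eps_p)  # fast timescale: inference
        v_p += lr_v * eps_p                        # slow timescale: learning
    return phi, v_p

phi, v_p = simulate()  # both should approach sqrt(u) once the prior is learned
```

With the prior mean learned, both phi and v_p settle where the sensory prediction error vanishes, i.e. near sqrt(u); separating the two learning rates is what makes the timescales distinct.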