Hippocampal codes, hippocampal function

Added an excellent article preview by Howard Eichenbaum [cite source=’pubmed’]23522037[/cite]. The piece is ostensibly about the finding of spatial replays in the hippocampus at choice points in a W-maze: replays occur for multiple remembered paths, and replays of the correct path are predictive of success on the task. More than this, however, Eichenbaum pulls together a range of recent findings to synthesize two traditions of hippocampus research: spatial navigation (O’Keefe and Nadel: the hippocampus as a cognitive map) and memory (Squire and Zola-Morgan: the hippocampus as a memory processing center). This synthesis depends critically on the ongoing work of György Buzsáki, part of which is conveniently summarized in an accompanying perspective [cite source=’pubmed’]23522038[/cite] on the theta-gamma neural code (which, I must admit, I still don’t fully understand). Anyways, I can remember reading Eichenbaum’s work as an undergrad, which was already drawing parallels between navigation and declarative memory (with a clever olfactory memory paradigm, if I recall correctly). This paper is an excellent overview of an emerging, unified view of what exactly the hippocampus might be doing.

CLARITY – new technique for clearing brain tissue

The new CLARITY technique seems capable of leaching away lipids in the CNS while holding intact protein structure in place within a transparent hydrogel matrix [cite source=’pubmed’]23575631[/cite]. The upshot is a way of turning the brain almost completely transparent, allowing imaging of proteins in intact tissue. This is pretty amazing stuff. Not clear if it is better/worse than Scale, which was recently reported in Nature Neuroscience [cite source=’pubmed’]21878933[/cite].

Neuroscientists dream of electric dream decoders…

This is the popular-press writeup of a paper out in Science on using fMRI (with EEG to catch sleep onset) to try to decode dreams. In practice, it’s less sexy and useful than it sounds: “They could identify the type of object a subject had seen: it could predict that a man had dreamt about a car, not that he’d been cruising around in a Maserati. And the decoder only worked when the researchers gave it a pair of possible objects to choose from (whether it was a man or a chair, for example).”

Will technology like this grow incrementally better to the point where it truly reaches ‘mind reading’? Or is the information collected too poor, and the complexities of thought too subtle, to ever allow this to progress beyond trivial and constrained types of decoding?
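
To make that forced-choice caveat concrete, here is a minimal sketch of what paired decoding amounts to. Everything in it is a hypothetical stand-in (random data, arbitrary labels), not the authors’ actual pipeline; the point is only that the classifier is ever asked “which of these two?”, never “what is this?”:

```python
# Hypothetical sketch of paired, forced-choice decoding; fake data only.
# X is a (trials x features) matrix of brain activity, y the object
# category present on each trial.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))      # fake brain-activity features
y = rng.integers(0, 10, size=200)   # fake object-category labels (10 classes)

def forced_choice_decode(X_train, y_train, x_test, cand_a, cand_b):
    """Answer 'is it cand_a or cand_b?' -- never an open-ended 'what is it?'."""
    mask = np.isin(y_train, [cand_a, cand_b])          # keep only the two candidates
    clf = LogisticRegression().fit(X_train[mask], y_train[mask])
    return clf.predict(x_test.reshape(1, -1))[0]

# e.g. 'did he dream of a car (3) or a chair (7)?'
print(forced_choice_decode(X[:199], y[:199], X[199], cand_a=3, cand_b=7))
```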

Nicolelis day – Part 2

Second installment of Nicolelis day. Here, the Nicolelis lab develops a cortical sensory prosthesis which allows rats to detect and respond to infrared light [cite source=’pubmed’]23403583[/cite]. An IR detector is mounted on the head, and the detected intensity determines the level of stimulation delivered over a whisker barrel of somatosensory cortex. With training, the rats become quite proficient at using the prosthesis: they sweep their heads back and forth to scan the IR field and can reliably choose a lever cued by an IR signal. Importantly, the natural input to S1 is not ablated before the prosthesis is switched on, and neurons end up tuned to *both* IR and touch. This is at an early stage, and one limitation is the use of a simple point source as the sensation (no spatial dimension to the new sense). Very clever, and they give an appropriate shout-out to the pioneering work of Paul Bach-y-Rita on sensory substitution.
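
For the curious, a toy sketch of what such an intensity-to-stimulation mapping might look like. The threshold, ceiling, and linear transfer function here are my assumptions for illustration, not the paper’s actual parameters:

```python
# Toy model of the prosthesis mapping: detected IR intensity sets the pulse
# rate of microstimulation delivered to the S1 whisker barrel. All numbers
# are assumptions for illustration only.

def ir_to_stim_rate(ir_intensity: float,
                    threshold: float = 0.05,
                    max_rate_hz: float = 400.0) -> float:
    """Map a normalized IR reading in [0, 1] to a stimulation rate in Hz."""
    if ir_intensity <= threshold:
        return 0.0  # below threshold: no stimulation, nothing to 'feel'
    # Above threshold the rate grows with intensity, so sweeping the head
    # across the IR field yields a graded, touch-like signal.
    return max_rate_hz * (ir_intensity - threshold) / (1.0 - threshold)

for reading in (0.0, 0.1, 0.5, 1.0):
    print(f"IR {reading:.1f} -> {ir_to_stim_rate(reading):6.1f} Hz")
```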

Nicolelis day – Part 1

It’s Nicolelis day, with another incredible report from this lab [cite source=’pubmed’]23448946[/cite]. In this case, it’s a brain-to-brain interface (BTBI): a real-time flow of information from one brain to another. In practical detail, it’s not quite as exciting as one might think. One rat (the encoder) makes a decision between two levers distinguished by texture. A multi-electrode array records cortical activity related to the choice, codes it as texture 1 or texture 2, and then plays one of two patterns into a stimulating array in another rat (the receiver). The receiver rat can then make a behavioral response based on which of the two patterns it has received. As you can see, the heavy lifting here is done by the interface between the two rats. In fact, both rats can do the task independently (and were trained up that way). It’s like having one rat press a button to play one of two sounds, while the other rat has been trained to respond based on the sound it hears; really it’s just a fancy way of passing the information between the two.

The researchers note changes in response latency when working as a dyad, which they take to indicate a complex interrelationship forming between the rats. I didn’t dig too deeply into the data, but it seems to me that this is just evidence for the task being a bit more difficult (and different) when working as a true dyad, especially in terms of the timing of the broadcast. It really is impressive, but not quite as revolutionary as some press coverage has implied. It also seems to be a missed opportunity: the real chances for inter-brain communication come from true interaction. For example, perhaps amygdala activation from one rat could be broadcast to the cortices of the others, enabling them to know when it is afraid. Delgado once did something like this, where monkeys in a colony could press buttons that calmed or agitated another member of the colony, and they quickly began using these to exert social control over one another! Anyways, this is a cool and fascinating start, but probably just the tip of the iceberg of how direct inter-brain interaction could be used.
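
Here is a schematic sketch of the information path as I read it. Every name, number, and the nearest-template classifier are hypothetical stand-ins (no real acquisition or stimulation hardware); the point is how little actually crosses the interface, namely one bit per trial:

```python
# Schematic of the BTBI channel: the encoder's population activity is
# reduced to a single bit (texture 1 vs texture 2), and that bit selects
# one of two fixed stimulation patterns for the receiver. All values are
# made up for illustration.
import numpy as np

def decode_choice(spike_counts, template_1, template_2):
    """Classify the encoder's response by nearest stored template."""
    d1 = np.linalg.norm(spike_counts - template_1)
    d2 = np.linalg.norm(spike_counts - template_2)
    return 1 if d1 <= d2 else 2

def stim_pattern(choice):
    """One fixed pulse train per choice; the receiver simply learns to
    tell these two patterns apart."""
    return {1: (1, 0, 1, 0, 1, 0), 2: (1, 1, 1, 1, 1, 1)}[choice]

rng = np.random.default_rng(1)
template_1 = rng.poisson(5.0, size=32).astype(float)  # fake 'texture 1' response
template_2 = rng.poisson(9.0, size=32).astype(float)  # fake 'texture 2' response
trial = template_1 + rng.normal(0, 1, size=32)        # noisy encoder activity
print(stim_pattern(decode_choice(trial, template_1, template_2)))
```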

Neuro classic – Physical control of the mind

To finish out Nicolelis day, here is a scan of kindred-spirit JM Delgado’s masterpiece “Physical Control of the Mind”. I found this online, typed out by hand with the photos scanned in. It compares pretty well with my original copy, but some typos are evident. Also, it seems to be missing the last four chapters, which contain some of Delgado’s most provocative ideas about freedom, the self, and the future. Despite that, free is a good bargain, especially when originals now go on Amazon from $40 up to $289! (I’m glad to have gotten mine a couple of years ago for $4!)