TFNN

The TFNN project has turned into SynthNet – you can read more about it here.

TFNN, short for “Temporal Frame Neural Network”, is an ongoing artificial intelligence research project I started a number of years ago. The ultimate goal is a true, functional model of a biological neural network in software. While this is an incredibly lofty goal, the project serves more as a learning opportunity for me (and anyone else interested).

Currently, the Temporal Frame Neural Network demonstrates the following abilities:

  1. Associative learning (via Hebbian plasticity; see the first sketch after this list)
  2. Non-associative learning (habituation and sensitization; see the second sketch after this list)
  3. Increased or decreased transmitter effectiveness via virtual neuromodulators
  4. Connectivity via axodendritic, axosomatic, and axoaxonic synaptic connections
  5. Cell growth and death due to virtual neurotrophins
  6. Geographic representation of the neural network, allowing for spatially dependent connections (see the third sketch after this list)
  7. Parallel functionality, allowing for a more accurate simulation
  8. A “visual fMRI” engine to display activity within a specific matrix
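
The first item above, associative learning through Hebbian plasticity, boils down to “cells that fire together wire together”: a synapse is strengthened in proportion to correlated activity on both sides of it. A minimal sketch of that rule (the function name, learning rate, and activity values are illustrative, not taken from the TFNN source):

```python
def hebbian_update(weight, pre_activity, post_activity, learning_rate=0.01):
    """Strengthen a synapse in proportion to correlated pre- and post-synaptic activity."""
    return weight + learning_rate * pre_activity * post_activity

# Example: when the pre- and post-synaptic cells are active together, the connection grows.
w = 0.5
w = hebbian_update(w, pre_activity=1.0, post_activity=1.0)  # w is now 0.51
```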
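
The second item, non-associative learning, covers habituation (a response weakens with repeated, innocuous stimulation) and sensitization (a response strengthens after a strong or noxious stimulus). A rough illustration of the idea, with made-up names and constants:

```python
class Synapse:
    """Toy synapse whose effectiveness drifts with its history of stimulation."""

    def __init__(self, strength=1.0):
        self.strength = strength

    def habituate(self, decay=0.95):
        # Repeated, harmless stimulation weakens the response.
        self.strength *= decay

    def sensitize(self, boost=1.2, ceiling=2.0):
        # A strong or noxious stimulus amplifies subsequent responses, up to a cap.
        self.strength = min(self.strength * boost, ceiling)
```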
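
The sixth item, the geographic representation, means each cell has a position in space, so whether two cells can form a synapse depends on how far apart they are. A hedged sketch of that kind of distance check (the threshold and names are hypothetical):

```python
import math

def within_reach(pos_a, pos_b, max_distance=5.0):
    """Return True if two cells are close enough in 3-D space to form a connection."""
    return math.dist(pos_a, pos_b) <= max_distance

# Example: only nearby cells are candidates for a new synapse.
print(within_reach((0, 0, 0), (1, 2, 2)))   # True  (distance 3.0)
print(within_reach((0, 0, 0), (10, 0, 0)))  # False (distance 10.0)
```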

Some (very lofty) things left to do:

  1. Include more accurate support for neurotransmitters and neuromodulators, with specific interactions between them
  2. Functionality to grow neural pathways as dictated by virtual DNA
  3. Engine to take real DNA data (sea slug, etc.) and convert it to virtual DNA
  4. Include the ability to take advantage of multi-core processors

Again, this is a learning adventure for me, so if you have any knowledge or ideas to contribute, please don’t hesitate to share them.