AITree of Artificial Intelligence Mind Modules
1. Code the Main Alife Program Loop of AI4U Chapter 1(2002):1-9.
            ___________                    ___________           
           /           \                  /           \          
          /  Motorium   \                /  Security   \         
          \_____________/\    ______    /\_____________/
       __________         \  /      \  /          _________
      /          \         \/  main  \/          /         \
     (  Volition  )--------<   Alife  >---------( Sensorium )
      \__________/         /\  loop  /\          \_________/
           _____________  /  \______/  \  _____________
          /             \/              \/             \
          \    Think    /                \   Emotion   /
           \___________/                  \___________/
Code the Alife loop shown above in your chosen programming language.
Use either an actual loop with subroutine calls, or make a ringlet
of perhaps object-oriented module stubs, each calling the next stub.
Provide the ESCAPE key or other mechanisms for the user to stop the AI.
Spread your code around the Web and invite AI coders to expand on it.
Watch for a proliferation of unique AI Mind entities evolving rapidly
on the Web and competing genetically for the survival of the fittest.
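Here, for instance, is a minimal sketch of the Alife loop in Python (AI4U itself uses Forth and JavaScript). The module names come from the diagram above; the empty stubs and the use of Ctrl-C as a stand-in for the ESCAPE key are illustrative choices, not the book's own code.

import time

def security():   pass    # step 9
def sensorium():  pass    # step 2
def emotion():    pass    # stub only, per the diagram
def think():      pass    # step 3
def volition():   pass    # step 8
def motorium():   pass    # step 7

def alife():
    """Main Alife loop: call each mind-module once per cycle."""
    try:
        while True:                    # loop until the user stops the AI
            security()
            sensorium()
            emotion()
            think()
            volition()
            motorium()
            time.sleep(0.1)            # do not hog the CPU in this stub
    except KeyboardInterrupt:          # Ctrl-C stands in for the ESCAPE key
        print("AI Mind halted by user.")

if __name__ == "__main__":
    alife()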
2. Code the Sensorium module of AI4U Chapter 23(2002):99-103.
Now you have two modules, a main Alife module and a subordinate
Sensorium module. But what should come next in evolution?
Now we need a reaction module, so that the organism may react
to its environment. Let's call the reaction module "Think".
3. Stub in the Think module of AI4U Chapter 6(2002):29-34.
Now, of course, the simple organism is not truly thinking yet.
4. Initiate the Audition module of AI4U Chapter 24(2002):104-107.
Drop the [ESCAPE] mechanism down by one tier, into the Audition
module, but do not eliminate or bypass the quite essential
Sensorium module, because another programmer may wish to specialize
in implementing some elaborate sensory modality among your
Sensorium stubs.  Code the Audition module initially to deal
with ASCII keyboard input.  If you are an expert at speech
recognition, extrapolate backwards from the storage requirements
(space and format) of the acoustic input of real phonemes in
your Audition system, so that the emerging robot Mind may be
ready in advance for the switch from hearing by keyboard to
hearing by microphone or artificial ear. Anticipate evolution.
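As one possible sketch in Python, the Sensorium stub below merely delegates to an Audition stub that reads line-buffered ASCII keyboard input; an empty input line stands in for the [ESCAPE] key, since raw key events are not assumed here.

class EscapePressed(Exception):
    """Raised when the user asks the AI to stop."""

def audition():
    line = input("you> ")              # ASCII keyboard input, one line per cycle
    if line == "":                     # stand-in for the [ESCAPE] key
        raise EscapePressed
    for ch in line:                    # pass each character onward, e.g. to audSTM
        print("heard:", ch)

def sensorium():
    audition()                         # other sensory modalities remain stubs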
5. The Listen module of AI4U Chapter 25(2002):108-111.
Stub in a new module and call it the Listen module.
6. Auditory Short Term Memory (audSTM) in AI4U Chapter 26(2002):112-118.
Create an array for the sequential capture and retrieval of each 
discrete unit of auditory input, be it an ASCII key-press or a 
phoneme of acoustic sound. Plan and coordinate your engram array 
to simulate any required feature of a neuronal memory synapse -- 
spiking connectivity, rapid excitation and gradual signal-decay, etc. 
Do not mimic what everybody else in avant-garde AI is doing, but 
rather fling your own line of AI evolution out onto the Web and 
nearby parsecs with the most advanced I/O that you can devise. 
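One way to sketch such an engram array in Python, with made-up field names and decay rates:

import time

aud_stm = []                           # the auditory engram array

def aud_store(ch, activation=48):
    """Deposit one unit of auditory input as a time-stamped engram."""
    aud_stm.append({"char": ch, "t": time.time(), "act": activation})

def aud_decay(step=1):
    """Gradual signal-decay: lower every activation, but never below zero."""
    for engram in aud_stm:
        engram["act"] = max(0, engram["act"] - step)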
7. The Motorium module in AI4U Chapter 4(2002):20-23.
As soon as you have sensory memory for audition, 
it is imperative to include motor memory for action. 
The polarity of robot-to-world is about to become a 
circularity of robot - motorium - world - sensorium - robot. 
If you have been making robots longer than you have been 
making minds, you now need to engrammatize whatever 
motor software routines you may have written for your 
particular automaton. You must decouple your legacy 
motor output software from whatever mindless stimuli 
were controlling the robot and you must now associate 
each motor output routine with memory engram nodes 
accreting over time onto a lifelong motor memory channel 
for your mentally awakening robot. If you have not been 
making robots, implement some simple motor output 
function like emitting sounds or moving in four directions 
across a real or virtual world.
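For the no-robot case, a toy Motorium might move a virtual agent in four directions and engram every executed action onto a motor memory channel, for example:

position = [0, 0]                      # x, y in a virtual world
motor_memory = []                      # lifelong motor memory channel

MOVES = {"north": (0, 1), "south": (0, -1), "east": (1, 0), "west": (-1, 0)}

def motorium(command):
    """Execute one move and engram the action with its result."""
    dx, dy = MOVES[command]
    position[0] += dx
    position[1] += dy
    motor_memory.append((command, tuple(position)))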
8. Stub in the Volition module of AI4U Chapter 5(2002):24-28.
In your robot software, de-link any direct connection 
that you have hardcoded between a sensory stimulus 
and a motor initiative. Force motor execution commands 
to transit through your stubbed-in Volition module, so that 
future versions of your thought-bot will afford at least the 
option of incorporating a sophisticated algorithm for free 
will in robots. If you have no robot and you are building 
a critter of pure reason, nevertheless include a Volition 
stub for the sake of AI-Complete design patterns.
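A minimal illustration of the gating idea, with an always-approve placeholder where a future free-will algorithm would go:

def motorium(command):
    print("executing:", command)       # stand-in for the step 7 Motorium

def volition(proposed_command):
    """All motor initiatives must transit through Volition before execution."""
    approved = True                    # placeholder for a future free-will algorithm
    if approved:
        motorium(proposed_command)     # only Volition may call the Motorium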
9. The Security module of AI4U Chapter 2(2002):10-14.
The Security module is not a natural component of the mind, 
but rather a machine equivalent of the immune system in 
a human body. When we have advanced AI robots running 
factories to fabricate even more advanced AI robots, 
let not the complaint arise that nobody bothered to 
build in any security precautions. Do it now.
10. Human-Computer Interaction (HCI) module of AI4U Chapter 3(2002):15-19.
The HCI module, called by Security, is likewise not a natural 
component of the mind, but rather a requirement of robot 
hardware and AI software. Security and HCI work together 
to prevent dangerous inputs and dangerous outputs. We don't 
want intruders taking over the AI, and we don't want AI 
taking over as our robotic overlords -- who might welcome 
the possibility of implanting RFID chips in all humans.
11. The Rejuvenate module of AI4U Chapter 21(2002):87-92.
Logic dictates that, if and only if you use a programming language
that permits changes on the fly to AI source code already running, then
the demo AI in your museum or corporate waiting room may live forever.
On a principle of use it or lose it, your AI Mind forgets en masse all
its oldest memories -- except for those brought forward by associative recall.
Standards in artificial intelligence urge you to code AI on a 64-bit platform 
so that ab origine you have a practically unlimited memory space, just as 
human babies are outfitted with enough wetware RAM to last a human lifetime. 
Robot babies are said to quicken when the Rejuvenate module works properly. 
Let one hundred robots blossom and contend to have the oldest-living AI Mind. 
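A rough sketch of the forget-except-recalled idea, with an arbitrary capacity and an illustrative "recalled" flag:

CAPACITY = 10_000                      # illustrative memory limit

def rejuvenate(memory):
    """Forget the oldest half of memory, except engrams flagged as recalled."""
    if len(memory) < CAPACITY:
        return memory
    oldest, newest = memory[:CAPACITY // 2], memory[CAPACITY // 2:]
    survivors = [m for m in oldest if m.get("recalled")]   # associative recall
    return survivors + newest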
12. Ego Self-Preservation module of AI4U Chapter 20(2002):84-86.
The Ego module is not the seat of the concept of self in the AI, 
but rather it is a supervenient mechanism that intervenes to 
jolt the self back into thought and consciousness after any 
flatline period of the accidental cessation of associative 
chains of thought. Suppose that you are a Cerebrifex providing 
robot brains to industry and you wish to put on demonstrations 
without the show-stopper glitches that plague Apple and 
Microsoft when the whole world is watching as the product 
goes brain-dead at the worst possible moment. At a similar 
moment in your AI demonstration, when the big screen in 
Las Vegas goes blank for a few seconds, the Ego module 
kicks in and says, "I (mumble) (whatever)." And it does not 
have to be the Ego module that saves the day. If your 
conglomerate client wants the AI to think first and foremost
about widgets, then a Widget module could resuscitate the
robot mind and not an Ego module. In any event, these 
death-defying reset modules are necessary only in the early 
days of primitive AI Minds, into which the robot-makers have 
not yet integrated such compelling sensory inputs that 
robotic brain function could not possibly stop. Disembodied AI 
needs the Ego-reset module or its equivalent to preserve 
the artificial life of the individual specimen. If your 
career forces you to play a godlike role, you may not care 
about the survival of any individual and your chief worry 
may be about the survival of an entire species. Then you 
might outfit half of your individuals -- the worker robots -- 
with a Reproduction module that always thinks about building 
new robots, while the other half -- that feeds and clothes 
the young -- is blessed with a Shopping module. In human 
terms, it would be as if the males were controlled by a 
S*x module and the females were ruled by a Shopping module.
13. The Troubleshoot module on AI4U page 206
is specific to the JavaScript Mind.html program, and takes care of
the internal JavaScript housekeeping required when the human user
selects the diagnostic troubleshoot display-mode.
14. The English Bootstrap (enBoot) module of AI4U Chapter 22(2002):93-98
provides the world's first True AI with several dozen words that are 
the English names of just enough concepts to demonstrate not how a baby 
mind grows but rather how a mature mind thinks and reasons. It will be 
the task of a more evolved AI to start life as a blank slate that needs 
to learn human language in the same way as a human baby learns. 
Mind.Forth and Mind.html start out with several dozen concepts and 
immediately learn new concepts taught to the AI by a human user. 
15. The Auditory Recognition (audRecog) in AI4U Chapter 27(2002):119-123
is a mind-module that uses neural-network pattern-recognition to recognize 
whole words in English. The algorithm is not simple string-matching, 
but rather quasi-phonemic match-up based upon the activation-levels 
of the engrams stored in the auditory memory channel. Future evolution 
of the audRecog mind-module must eventually deal with morphemes as 
subsets of whole words, and with true acoustic phonemes instead of 
ASCII keyboard input. 
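The following toy sketch is not the book's audRecog algorithm; it merely illustrates incremental, engram-by-engram matching in which each stored word accumulates a match count as characters arrive, instead of comparing whole strings at once. The word list is invented.

aud_words = {"CAT": 0, "CATS": 0, "DOG": 0}   # word -> characters matched so far

def aud_recog(ch):
    """Feed one incoming character; return a word if one is fully recognized."""
    recognized = None
    for word in aud_words:
        pos = aud_words[word]
        if pos < len(word) and word[pos] == ch:
            aud_words[word] += 1               # excitation of a matching engram
            if aud_words[word] == len(word):
                recognized = word
        else:
            aud_words[word] = 1 if word[0] == ch else 0   # collapse, maybe restart
    return recognized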
16. The audDamp module on AI4U page 166
resets auditory engram activations to zero after the recognition of a word, 
so that the AI may recognize the next incoming word without interference. 
17. The Instantiate module of AI4U Chapter 32(2002):144-147
creates a time-bound node on the software model of a conceptual brain 
fiber in the artificial Mind. Thus a new instance of any concept is 
instantiated when the Mind deals with the concept in either the thinking 
of output or the comprehension of input. Software tags are attached to 
take the place of synapses on human brain fibers. Only one fiber is 
modeled for each concept in software, based upon the presumption of 
software reliability vis-a-vis reliability by redundancy in a human brain. 
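A bare-bones illustration of time-bound nodes on one software fiber per concept, with invented tag names:

import itertools

_clock = itertools.count()             # a simple discrete time counter
concept_fibers = {}                    # concept number -> list of time-bound nodes

def instantiate(psi, act=32):
    """Create a time-bound node on the fiber of concept number psi."""
    node = {"t": next(_clock), "psi": psi, "act": act}
    concept_fibers.setdefault(psi, []).append(node)
    return node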
18. The oldConcept module of AI4U Chapter 29(2002):128-130
deals with the recognition of an already known word in the input stream 
by activating all recent nodes of the concept underlying the English word. 
19. The newConcept module of AI4U Chapter 28(2002):124-127
treats any unknown word of input, even a mistake or a variant spelling, as
a new concept to be learned and understood by the artificial intelligence. 
As long as a mistaken or unusual spelling is used consistently, the AI will 
build up a pattern of knowledge about the concept represented by the word. 
Each new word is given some initial activation so as to stimulate thought. 
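A combined sketch of oldConcept and newConcept behavior over a toy lexicon (concept numbers and activation values are invented):

lexicon = {"CAT": 101, "DOG": 102}     # word -> concept number
activation = {101: 0, 102: 0}          # concept number -> current activation
next_psi = 103

def old_concept(word):
    psi = lexicon[word]
    activation[psi] += 16              # reactivate the already known concept
    return psi

def new_concept(word):
    global next_psi
    psi = lexicon[word] = next_psi
    next_psi += 1
    activation[psi] = 24               # initial activation to stimulate thought
    return psi

def parse_word(word):
    return old_concept(word) if word in lexicon else new_concept(word)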
20. The Parser module of AI4U Chapter 30(2002):131-140
tries to determine the part of speech (noun, verb, etc.) of any word 
encountered in the input stream entering the artificial mind. Many 
parsers are available on the 'Net and could potentially be integrated 
into AI Minds based on the algorithms stated in these mind-module steps. 
21. The English Vocabulary (enVocab) in AI4U Chapter 31(2002):141-143
is a mind-module that performs the software housekeeping of attaching tags 
to English lexical vocabulary words, so that activations may flow from deep 
mindcore concepts up to English (or foreign language) lexical items, and so 
that the activation of lexical items may in turn cause the re-activation of 
auditory word-engrams stored in the auditory memory channel -- 
a self-perceiving channel where the mind hears itself think. 
22. Enhance the Think module of AI4U Chapter 6(2002):29-34
which initiates the thinking process by selecting a language to think in. 
Concepts deep in the mind are independent of language, but grammar and 
syntax and vocabulary are different for, say, English and German and Russian. 
Mentifex AI claims not only to have solved AI but also to have solved the 
problem of machine translation (MT).
23. The Activate module of AI4U Chapter 33(2002):148-151
reactivates older nodes of a newly active concept during input 
from the external world. During generation of mental output, the 
modules nounAct and verbAct, derived from the Activate module, 
reactivate nouns and verbs in particular so as to enable the 
carry-over or slosh-over of initial activation from a subject-noun,
which arrives as a spike at specific nodes of a verb-concept
and, incremented by the additive activation of a selected verb, 
selects in turn the proper object of the subject and verb. 
24. Spreading Activation (spreadAct) in AI4U Chapter 34(2002):152-155
is the fundamental principle of the artificial mind. Starting from a most active 
concept, the modules Activate, nounAct and verbAct send a spike of 
activation by associative tag to all associated concepts in a process 
called "thought" or "thinking." It takes a Chomskyan linguistic 
superstructure to express deep conceptual thought in shallow 
human language. 
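A few lines of Python can illustrate the principle, with invented concepts, tags and spike sizes:

activation = {"ROBOT": 40, "THINK": 0, "MOVE": 0}
assoc_tags = {"ROBOT": ["THINK", "MOVE"], "THINK": ["ROBOT"], "MOVE": []}

def spread_act(source, spike=16):
    """Send a spike of activation from one concept to all its associates."""
    for target in assoc_tags.get(source, []):
        activation[target] = activation.get(target, 0) + spike

spread_act(max(activation, key=activation.get))   # spread from the most active concept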
25. The Subject-Verb-Object (SVO) module of AI4U Chapter 12(2002):54-57
is the most basic syntax of verbal thinking in man or machine. 
It takes a village of concepts interacting by associative tag to know, 
remember and express the relationships mediated by a verb between 
agents and objects. Robot Mind.Forth and tutorial Mind.html
show the operation of the SVO module in tutorial mode.
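As an illustration only (tiny invented vocabulary and weights), the slosh-over selection of subject, verb and object might look like this:

nouns = {"ROBOT": 40, "CAT": 10, "BOOK": 5}
verbs = {"READ": 0, "CHASE": 0}
objects_of = {"READ": {"BOOK": 0}, "CHASE": {"CAT": 0}}
verb_links = {"ROBOT": ["READ"], "CAT": ["CHASE"]}        # noun -> associated verbs

def svo():
    subject = max(nouns, key=nouns.get)                   # most active noun
    for v in verb_links.get(subject, []):
        verbs[v] += nouns[subject] // 2                   # slosh-over onto the verbs
    verb = max(verbs, key=verbs.get)
    for o in objects_of[verb]:
        objects_of[verb][o] += verbs[verb]                # subject + verb pick the object
    obj = max(objects_of[verb], key=objects_of[verb].get)
    return subject, verb, obj

print(svo())    # e.g. ('ROBOT', 'READ', 'BOOK')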
26. The nounPhrase module of AI4U Chapter 15(2002):66-68
flushes out the momentarily most active noun or pronoun to be 
a component in the verbal expression of an emerging thought. 
27. The Reify module of AI4U Chapter 18(2002):77-79
transfers activation from deep wordless concepts up to 
the vocabulary words of a particular human language, 
such as English or German or Russian. Under Reify, 
the abstract concept is "realized" or is expressed 
as a "real" thing -- a word of natural human language. 
The thinking occurs among the deep, abstract concepts, 
but the linguistic expression of thought requires real 
words in a real language. 
28. The English Lexicon Damping (enDamp) module on AI4U page 166
resets lexical activations to zero after the thinking of a word 
so that the Reify module may transfer the next set of conceptual 
activations from the deep conceptual mindcore up to the shallow 
lexicon in time for selection of the next word in a nascent thought. 
29. The psiDamp module of AI4U page 164 plays a key role in
artificial consciousness by damping down the activation of each
concept immediately after it has been thought as part of an idea,
so that consciousness may ride a moving wave of active concepts
to flit from thought to thought in the stream of consciousness.
30. The psiDecay module of AI4U page 165 lets semi-activated,
subconscious concepts sink towards oblivion slowly enough to remain
briefly available to the searchlight of attention and for the
moving wave of activation to bring the subconscious concepts back
into consciousness.
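The two damping mechanisms of items 29 and 30 can be sketched together, with arbitrary numbers:

activation = {"ROBOT": 40, "THINK": 25, "CAT": 8}

def psi_damp(concept):
    """Knock a just-thought concept down so the wave of attention can move on."""
    activation[concept] = 0

def psi_decay(step=1):
    """Let residual, subconscious activations fade slowly rather than vanish."""
    for concept in activation:
        activation[concept] = max(0, activation[concept] - step)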
31. The Speech module of AI4U Chapter 16(2002):69-72
activates a series of phonemic engrams to speak or display a word 
that has been selected as part of a sentence being generated by 
the linguistic superstructure of an artificial intelligence. 
32. The Reentry module of AI4U Chapter 17(2002):73-76
treats the output of the artificial Mind as its own input, so that 
the AI Mind may think about its own thoughts and become 
aware of its own awareness. 
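A minimal sketch of reentry as a feedback queue (the queue name is invented):

sensory_queue = []                     # input waiting to be processed by Audition

def reentry(text):
    sensory_queue.append(text)         # the Mind's output becomes its own input

def speak(text):
    print(text)                        # outward speech or display
    reentry(text)                      # and immediately perceive it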
33. The verbPhrase module of AI4U Chapter 14(2002):62-65
flushes out the momentarily most active verb in the mind so that 
not only may a verb be thought and spoken, but also so that the 
combined activation from a preceding noun or pronoun and from 
the verb itself may coactivate the proper direct object from among 
all the historically available direct objects of the verb. 
34. The Negational Subject-Verb-Object (negSVO) module of AI4U Chapter 10(2002):46-50
converts subject-verb-object (SVO) sentences into a negation of the same 
underlying idea. The end result, a kind of Chomskyan transformation, makes 
use of the auxiliary verb (auxVerb) module to insert a negatable form of 
the verb "do" into the verb phrase of a sentence. Even in the most primitive 
AI Minds, negation is an important step on the way towards automated reasoning 
by means of syllogisms and other forms of logic. 
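As a toy illustration of the negSVO transformation, with a made-up base-form table and a crude singular/plural test:

base_form = {"READS": "READ", "CHASES": "CHASE"}          # illustrative morphology

def neg_svo(subject, verb, obj):
    """Transform e.g. ROBOT READS BOOK into ROBOT DOES NOT READ BOOK."""
    aux = "DO" if subject.endswith("S") else "DOES"       # crude singular/plural test
    return f"{subject} {aux} NOT {base_form.get(verb, verb)} {obj}"

print(neg_svo("ROBOT", "READS", "BOOK"))   # ROBOT DOES NOT READ BOOK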
35. The Auxiliary Verb (auxVerb) module of AI4U Chapter 11(2002):51-53
initially provides the word "DO" as an auxiliary verb for the syntax of questions 
and negation. In more advanced evolutions of the auxVerb mind-module 
it will be necessary to accommodate person, tense and number for a 
wide variety of auxiliary verbs. 
36. The Ask module of AI4U Chapter 8(2002):39-41
is a waystation for AI programmers to code a mechanism that chooses 
what kind of question the artificial mind will ask. Initially the Ask 
module calls just one kind of question, an interrogative "what" query, 
as an example for additional question-formats on the order of the 
typical "who? what? when? where? how? and why?" inquisition. 
37. The wtAuxSDo module of AI4U Chapter 9(2002):42-45
asks a simple question in the [What Do S(ubjects) Do?] format. Originally 
the module was intended to demonstrate machine curiosity and learning. 
Early versions of Mind.Forth and Mind.html were programmed to respond 
with "WHAT DO [BLANK]S DO" upon detecting the input of a previously 
unknown noun, so that the human user would have a chance to teach 
the AI new information for the AI knowledge base. More sophisticated 
versions of the same mind-module may yet evolve as a means of avoiding 
the generation of false statements by switching from an incomplete 
statement to a question that seeks a response with the missing knowledge. 
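The canonical question format itself is easy to sketch:

def wt_aux_s_do(noun):
    """Ask a question in the [What Do Subjects Do?] format."""
    return f"WHAT DO {noun.upper()}S DO"

print(wt_aux_s_do("robot"))            # WHAT DO ROBOTS DO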
38. The Conjoin module of AI4U Chapter 13(2002):58-61.
Code the Conjoin module as a mechanism to select conjunctions.
Once the AI has learned to think or utter one simple SVO statement,
it requires the ability to select the right conjunction in order to
string meandering thoughts together in a stream of output that lasts
as long as the process of spreading activation gives rise to ideas.
In answer to input questions of "why", the Conjoin module may select
the conjunction "because" -- followed by a statement of associated
and possibly explanatory ideas, in a demonstration of AI reasoning.
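A tiny illustration of conjunction selection biased by a "why" question (the default choice of "and" is an assumption):

def conjoin(prior_input=""):
    """Select a conjunction; a 'why' question biases the choice to 'because'."""
    return "BECAUSE" if "WHY" in prior_input.upper() else "AND"

print("ROBOTS THINK", conjoin("why do robots think"), "ROBOTS HAVE MINDS")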
39. The Article module was created on 23.AUG.2008.
The AI Mind software, born in 1993, did not start thinking 
properly until it was a teenager -- fifteen years old in 2008, 
to be precise. Then in 2008 the expansion of the AI software 
began with the simple introduction of an Article module to 
select "a" or "the" as an article coming before a noun in a 
sentence of thought being generated. 
40. Comments
See the material on "Comments" at
http://mind.sourceforge.net/progman.html#comments
41. Sequential Order of Function Calls
See the material on Structured Programming Sequence at
http://mind.sourceforge.net/progman.html#anatomy
42. Robot AI Mindmaking Resources
/projects/mind