AI4U Textbook Algorithm Steps: Code the Sensorium Mind-Module

1. Theory of a Sensorium Mind-Module
_______________________________________________________________
| The Environment |
| ____________ _____________ |
| _____________| The Senses |______________| The Muscles |___ |
| | \ \ \ \ \ The Body | | | | | | |
| | ________\ \ \ \ \__________________|_|_|_|_|____ | |
| | | \ \ \ \ \ | Cerebellum || | |
| | | \ \ \ \ \ | (Motor Habituation) || | |
| | | \ \ \ \ \ \___________________/ | | |
| | | \ \ \ \ \ / / / / / | | |
| Feature Extraction: | | | | | The Brain / / / / / | | |
| | |------------+-+-+-+-+----------+-+-+-+-+--------| | |
| | Oldest |Memories: S| |M| |C| |||||||| |M| |M| |C | | |
| | | e| |e| |h| |Concept |o| |e| |h | | |
| | | n| |m| |a| |||||||| |t| |m| |a | | |
| | | s| |o| |n| |Fibers| |o| |o| |n | | |
| | | o| |r| |n| |||||||| |r| |r| |n | | |
| | | r| |y| |e| |as the| | | |y| |e | | |
| | Newest |Memories: y| | | |l| |||||||| | | | | |l | | |
| | | | | | |s| | Core | | | | | | | | |
| | | | | | | | |||||||| | | | | | | | |
| | | | | | | | |of the| | | | | | | | |
| | | | | | | | |||||||| | | | | | | | |
| | | | | | | | | Mind | | | | | | | | |
| |(Future |Memories:) | | | | | |||||||| | | | | | | | |
| | |________________________________________________| | |
| |___________________________________________________________| |
|_______________________________________________________________|
See http://mind.sourceforge.net/diagrams.html for theory-of-mind diagrams.
The Sensorium module should contain at the very least one call-stub
in advance of implementing the Audition module. However, you are
well advised to stub in a complete set of basic human senses:
audition, vision, touch, gustation and olfaction. And since you
are designing a robot AI and not a human being, consider coding
additional stubs for exotic robot sensory channels such as sonar,
radar, infrared and GPS.
Remember, you are shaping the future evolution of Mind. It may be
extra work for you to code all these exotic sensory stubs when you
are at first going to use only the Audition module for hearing or
perhaps the Vision module for seeing, but you are an AI pioneer
in the XYZ programming environment. A prophetic stub that you
include now may launch the engineering career of an individual,
or a new industry previously unknown to humanity, or the engine
of prosperity for a national or global economy.
2. Practice of Coding a Sensorium Module
Whether you have coded the main Alife mindloop yourself, or you
have simply found the main AI loop on the Web, the second step in
coding a robot AI is to create a subroutine or function typically
called the "Sensorium" module, as shown in the following diagram.
      ___________
     /           \
    / motorium    \
    \_____________/\       ______          __________
                    \     /      \        /          \
      ________       \   /        \      /            \
     /        \       \ /   MAIN   \    /              \
    ( volition )-------<   ALIFE    >--(    SECURITY    )
     \__________/      /\   LOOP   /\   \______________/
     _____________    /  \________/  \   _____________
    /             \  /        |       \ /             \
    \    THINK    \/     ____V_____    \/  SENSORIUM  /
     \___________/      /          \    \___________/
                       (  emotion   )
                        \__________/
Move the ESCAPE-key handler, or whatever other mechanism stops the
AI, out of the main Alife loop and into the Sensorium stub.
Keep in mind that the Mind must always retain a quit-mechanism.
Flesh out the Sensorium stub into a working mind-module that
calls its own stubs, such as Audition, Vision, Touch, Gustation
and Olfaction. Test the embryonic robot mind by demonstrating
that the main Alife loop either waits briefly for Sensorium input
during each cycle, or generates an event-driven response to
input detected by a Sensorium module. The proper response will
be to keep cycling upon normal input or to terminate execution
upon halt [ESCAPE] input. Share your code on the Web.
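The test described above can be sketched as follows. This is a minimal
illustration, not the AI4U listing itself: all function names are
hypothetical, and 27 is the ASCII code of the ESCAPE key.

```javascript
function audition(key) { /* stub: hearing channel */ }
function vision(key)   { /* stub: sight channel */ }
function touch(key)    { /* stub: haptic channel */ }

// Returns true to keep cycling, false to halt the Mind.
function sensorium(key) {
  if (key === 27) {      // ESCAPE: the quit-mechanism moved here
    return false;        // out of the main Alife loop.
  }
  audition(key);         // Only Audition will be live at first;
  vision(key);           // the others remain call-stubs awaiting
  touch(key);            // future implementation.
  return true;           // Normal input: continue the mindloop.
}

// A miniature Alife loop driven by a queue of simulated keystrokes.
function alife(inputs) {
  let cycles = 0;
  for (const key of inputs) {
    cycles++;
    if (!sensorium(key)) break;  // Terminate upon halt [ESCAPE] input.
  }
  return cycles;
}
```

Feeding the loop a few normal keystrokes followed by ESCAPE demonstrates
both required behaviors: cycling on normal input, termination on halt.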
Whatever user-input code you have or find in the cycling Alife
module should now be transferred to the Sensorium module. Then
you should convert the Sensorium stub in the Alife module to
active code that actually calls the Sensorium module. When you
run your xyzMind program, it should continue to cycle through the
mindloop, but it should no longer pause for input at any stub.
Instead, the main Alife loop will actually call the Sensorium
module and wait for the user to press the ENTER (RETURN) key as a
bare minimum of user input. Note that the request for pressing
the RETURN key will automatically validate almost any other input
that a human user mistakenly enters before pressing the RETURN key.
All superfluous input will be lost when the user presses the RETURN
key, and the Mind program will continue to loop. Your code will be
robust in the sense that a lot of wrong input will not derail the
program. Keep in mind the value of validating input as you code AI.
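The RETURN-key validation described above can be sketched as a small
input gate. This is a hypothetical illustration (the names are not from
the AI4U listings): keystrokes accumulate in a buffer, and pressing
RETURN flushes the buffer, so superfluous input is lost and cannot
derail the mindloop.

```javascript
function makeInputGate() {
  let buffer = "";
  return {
    // Called once per keystroke; returns true when RETURN releases the loop.
    press(ch) {
      if (ch === "\r" || ch === "\n") {
        buffer = "";     // Discard all superfluous input...
        return true;     // ...and let the Mind continue to loop.
      }
      buffer += ch;      // Accumulate input until RETURN arrives.
      return false;      // Still waiting for the RETURN key.
    },
    pending() { return buffer; }
  };
}
```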
When you have finished the current DIY AI step in coding the Mind
that contains a main Alife loop and a primitive Sensorium module,
you should publish the free AI source code on the Web as a file
separate and distinct from the basic Alife loop -- in case other
coders may want to do things differently from you. As you expand
and improve the AI, you may want to keep the original Alife code
unchanged on the Web, while you replace each previous version of
the evolving AI with a new and most current release. If you are a
stickler for record-keeping and archiving, you may devise unique
HTML or XML filenames for each succeeding version and you may keep
them all available on the Web as the evolutionary history of your
own species of Mind in the XYZ programming language.
Remember that we do not want to standardize, freeze or ossify the
evolutionary pathway of artificial intelligence. It's all about
the survival of the fittest, Charles. In the comment area of your
Sensorium and other mind-modules, you may date-stamp your creative
work with a formulation such as "CRD 13aug1884" to commemorate the
fact that Charles Robert Darwin was the origin of this species of
mind on such and such a date in history. If other avatars of AI
enshrine your initialed work in their derivative contribution to
science, your initials in the ancient code will be your monument.
Therefore publish your AI code and share with others on the 'Net.
In the vast reaches of cyberspace, the genome of AI will prevail.
The AI Mind uses the Sensorium module to pay attention to such
primordial senses as touch and hearing, while waiting and evolving
towards a transhuman mind perceiving reality not only through the
traditional human senses but also through more exotic robot senses.
As a collective marshalling yard for sensory inputs, the Sensorium
complements the motorium of motor outputs under the control of volition.
In the Sensorium module, the Robot AI Mind is set to call
one by one the sensory input devices which really ought to be
called simultaneously in massively parallel processing (MPP).
For consciousness to emerge as an epiphenomenon above and beyond
the knowledge that a mind has about its environment, a Mind must
become aware, through sensory perception, of itself as an entity
distinct from other entities and capable of exercising volitional
motor initiatives with respect to its own position in the world.
Shining the searchlight of attention upon its world, the AI Mind
will use now one sensory modality and now another, all the while
letting its self-concept grow by the accretion of sensory data,
uttering now and then an "ouch!" in pain or a "whee!" in pleasure.
There is no specific Mind-module for this emergent phenomenon of
consciousness, and coders will be unable to program consciousness into
an AI until the high priesthood of the Singularity (as vested and
ordained in robot-makers everywhere) makes self-awareness in robots
possible by implementing the necessary sensory avenues into an AI Mind
either residing in or visiting a particular robot.
See http://mind.sourceforge.net/ai4u_157.html for an overview of the Mind.
// Sensorium() is being kept in reserve because
// currently the human user usurps the Sensorium()
// function by forcing a reaction to keyboard entry.
function Sensorium() {  // ID & date.
  if (life == true) {
    document.forms[0].ear.focus();  // Display blinking cursor.
  }
}  // End of Sensorium() stub.
\ SENSORIUM handles the input of sensory perception.
: SENSORIUM  \ ATM 21jul2002; or your ID & date.
  ( SMELL    -- normal sensory stub for later implementation )
  ( VISION   -- normal sensory stub for seed AI expansion )
  ( TOUCH    -- normal haptics stub for cybernetic organisms )
  ( TASTE    -- normal sensory stub for cyborg alife )
  ( SYNAESTHESIA -- an option in a multisensory AI )
  AUDITION   ( for entry or reentry of phonemic ASCII )
  ( COMPASS  -- exotic sensory stub for use in robots )
  ( GEIGER   -- exotic: Geiger counter )
  ( GPS      -- exotic: Global Positioning System )
  ( INFRARED -- exotic )
  ( RADAR    -- exotic: RAdio Detection And Ranging )
  ( SONAR    -- exotic: SOund Navigation And Ranging )
  ( VSA      -- exotic: Voice Stress Analyzer lie detector )
  ( Wi-Fi    -- exotic: 802.11 wireless fidelity )
;  \ Return to ALIFE or to the reentry process.
5. Analysis of the Modus Operandi
In the source code of the AI Mind, only the sense of Audition
is actually called by the Sensorium module, while other normal
and exotic senses have merely been stubbed in to show that the
Sensorium module would invoke them.
The inclusion here of so many sensory stubs is not so much an
instance of vaporware as a case of careful preparation for the
future evolution and speciation of the Robot AI Mind.
Since robots are not limited to the traditional human senses,
some rather exotic sensory options have been stubbed in here
so as to fire the imagination of any robot-maker who would like
to implement a normal or exotic robot sensor.
Please note that any sensory module called by the Sensorium must
include not only the capture of the stream of input from the sensor,
but also the storage of samples of the input as engrams in a sensory
memory channel fully integrated with all the other memory channels
lying in parallel up and down the time-dimension of the mindgrid.
Associative tags are the key to such integration of a new sense
with the pre-existing mindgrid. The AI programmer coding either in
Forth or other languages must arrange for the automatic creation
and reactivation of the associative tags for any new sensory channel.
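Such engram storage with associative tags can be sketched as follows.
This is a hypothetical illustration, not code from the AI4U listings:
every sense writes into its own memory channel at a shared time-index,
and an associative tag ties each engram to a concept for later
reactivation.

```javascript
let t = 0;             // Shared time-dimension of the mindgrid.
const channels = {};   // One parallel memory channel per sense.

// Store one sensory sample as an engram at the current time-point.
function storeEngram(sense, sample, conceptTag) {
  if (!channels[sense]) channels[sense] = [];
  channels[sense][t] = { sample: sample, tag: conceptTag };
}

function advanceTime() { t++; }  // All channels share the same index.

// Reactivation: find the time-points where a concept was tagged
// in a given sensory channel.
function recall(sense, conceptTag) {
  const hits = [];
  (channels[sense] || []).forEach(function (engram, time) {
    if (engram && engram.tag === conceptTag) hits.push(time);
  });
  return hits;
}
```

Because every channel is indexed by the same time-dimension, an engram
recalled in one channel immediately locates the parallel engrams laid
down in all the other channels at the same moment.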
Luckily, the Robot AI Mind algorithm is so modular that the AI coder
may conceive of and design a new sensory input channel as one option
among many for implementation in a robotic brain-mind.
6. Troubleshooting and Robotic Psychosurgery
6.1.a. Symptom: It is not clear whether the Sensorium module is being
called by the main Alife loop or module-ringlet of the artificial robot mind.
6.1.b. Solution: Insert one or more temporary diagnostic messages to
inform the user/programmer that the Sensorium module is indeed being called.
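A diagnostic of the kind suggested above can be as simple as the
following hypothetical wrapper: the Sensorium announces and counts each
call, so the programmer can verify that the main Alife loop really
invokes it. Remove the message once satisfied.

```javascript
let sensoriumCalls = 0;   // Tally of invocations, for diagnosis only.

function sensorium() {
  sensoriumCalls++;
  console.log("DIAGNOSTIC: Sensorium called; cycle " + sensoriumCalls);
}
```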
6.2.a. Symptom: An exotic sensory device is attached to my robot,
and I need to integrate the device with my own species of robot AI mind.
6.2.b. Solution: Around the exotic sensory input channel, construct
a general robot AI architecture that will allow other users to adopt the
same sensory device into their own robot AI minds. Where you might otherwise
have only the sensory device as input and a small repertoire of motor outputs,
generalize your robot software to make it conform roughly with good AI design.
For your exotic sensory input device, implement an exotic sensory mind-module
to be either event-driven or called by a standard robot Sensorium mind-module.
Instead of a direct coupling between your sensory input device and whatever
motor output routines you may have coded, expand the behavioral control system
sufficiently to include, at the very least, stubs for a motorium mind-module
and a volition mind-module, so that in future your software may become a mind.
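The generalized architecture urged above can be sketched as follows.
This is a hypothetical illustration with made-up names (a "sonar"
device standing in for any exotic sensor): the reading passes through
a standard Sensorium, a volition stub chooses a motor initiative, and
a motorium stub acts, instead of the sensor being wired directly to a
motor routine.

```javascript
function sonar(reading) {        // Exotic sensory mind-module.
  return { sense: "sonar", range: reading };
}
function sensorium(reading) {    // Standard entry point for all senses.
  return sonar(reading);
}
function volition(percept) {     // Stub: the free-will decision point.
  return percept.range < 10 ? "retreat" : "advance";
}
function motorium(command) {     // Stub: the motor output channel.
  return "motor:" + command;
}
function cycle(reading) {        // One pass of the behavioral loop.
  return motorium(volition(sensorium(reading)));
}
```

Other users may then adopt the same sensory device by replacing only
the sonar() module, leaving the Sensorium-volition-motorium chain intact.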
7. Sensorium Resources for Seed AI Germination and Evolution