Cyborg AI Minds are true concept-based artificial intelligences with natural language understanding, simple at first and lacking robot embodiment, and expandable all the way to human-level intelligence and beyond.

Saturday, February 11, 2012

feb11ruai

Artificial Intelligence in Russian

1. Thurs.9.FEB.2012 -- Unspoken Be-Verbs as a Default

The Russian-speaking artificial intelligence Dushka needs a default BeVerb module that will silently assert itself as the automatic carrier of thought until a non-be-verb takes over from the provisional default. In our coding of a Russian mind, we will assume that any noun or pronoun, beginning a thought in the nominative case, is automatically the subject of a putative BeVerb until proven otherwise. In this way, our cognitive software will prepare for a BeVerb and switch automatically when a non-be-verb occurs.

We should work first on the comprehension of putative be-verbs and second on their generation, so that what we learn in comprehending be-verbs may be used in generating thoughts involving a BeVerb. So we type into the AI a Russian sentence to see if the software can understand it.

Human: душка робот

Robot: ДУШКА ЧТО ДУШКА ТАКОЕ

We said "Dushka is a robot" but the AI responded only, "Dushka -- what is Dushka?" We need to implement a default BeVerb in the comprehension of a sentence that lacks a visible BeVerb.

In the InStantiate module, we can trap for the input of a "c==32" space-bar when the "seqneed" is set to "8" for want of an incoming verb. We may then do something outrageous, but normal for Russian. From InStantiate we may provisionally send into AudMem a space-bar character with an "audpsi" of "800" for the verb БЫТЬ ("to be"), so that the AI is ready to record any noun coming in as a predicate nominative in conjunction with the be-verb. Now, if we implement such an outrageous step, it is possible that our AI memory-banks will become replete with quasi-spurious engrams of infinitive be-verbs that typically do not materialize. It could be that the presence of a spurious be-verb engram will not matter, if the cancellation of the default occurs as soon as some actual verb comes in. Then cancelling the spurious default will involve removing or nullifying any associative tags laid down momentarily during the enactment of the default.
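A minimal sketch of how such a trap might look is given below. Only the space-bar value "c==32", the "seqneed" value of "8" and the concept number "800" for БЫТЬ come from the description above; the array layouts, field names and the helper function are illustrative assumptions, not the actual Dushka code.

```javascript
// Sketch only: provisionally instantiating an unspoken be-verb.
var BE_VERB = 800;            // concept number of БЫТЬ "to be" (from the text)
var psiArray = [];            // conceptual memory (layout assumed)
var audArray = [];            // auditory memory (layout assumed)
var defaultBeVerbTime = -1;   // where the provisional engram was stored

// Called from InStantiate-like code as each character comes in.
function trapDefaultBeVerb(c, seqneed, t) {
  if (c === 32 && seqneed === 8) {               // space-bar while a verb is awaited
    audArray[t] = { pho: " ", audpsi: BE_VERB }; // pseudo auditory engram
    psiArray[t] = { psi: BE_VERB };              // pseudo concept node
    defaultBeVerbTime = t;                       // remember it for later cancellation
  }
}
```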

2. Fri.10.FEB.2012 -- Instantiating Imaginary Be-Verbs

In the InStantiate module we will now experiment with code to create in auditory memory a pseudo-engram of a non-existent be-verb after the perception of a nominative noun or pronoun. Since the Russian-speaking mind waits for a predicate nominative, it needs at least an imaginary be-verb as the holder of associative links between subject and predicate nominative.

Now inside InStantiate we have assembled the code that creates a be-verb pseudo-engram in the three memory arrays for "Psi" concepts, Russian words and auditory engrams. The Psi node automatically receives a "pre" tag that links the pseudo-verb back to its subject. We still need code that finishes the intermediation of the unspoken Russian BeVerb between its subject and the predicate nominative, and that cancels or uninstalls the imaginary BeVerb if a real verb occurs instead of the provisionally expected BeVerb.
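The cancellation step might look something like the sketch below; the array layouts, field names and the nullification approach are assumptions layered onto the description above.

```javascript
// Sketch only: cancelling the imaginary be-verb when a real verb arrives.
var psiArray = [];
var audArray = [];
var defaultBeVerbTime = 7;                // pretend the pseudo-engram sits at t = 7
psiArray[7] = { psi: 800, pre: 0 };       // "pre" would hold the subject's concept
audArray[7] = { pho: " ", audpsi: 800 };  // the imaginary auditory engram of БЫТЬ

function cancelDefaultBeVerb(subjectTime) {
  if (defaultBeVerbTime > -1) {
    psiArray[defaultBeVerbTime] = null;   // uninstall the pseudo Psi node
    audArray[defaultBeVerbTime] = null;   // uninstall the pseudo auditory engram
    if (psiArray[subjectTime]) {
      psiArray[subjectTime].seq = 0;      // undo any tag laid down for the default
    }
    defaultBeVerbTime = -1;               // the default is no longer in force
  }
}

cancelDefaultBeVerb(6);                   // called when a real verb comes in
```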

3. Sat.11.FEB.2012 -- Integration of Default Be-Verbs

We have the AI pretending that a BeVerb comes in after a nominative subject, and now we need to create the "seq" tag from the subject to the default BeVerb. First in the InStantiate module we insert a line of code declaring that the pseudo-be-verb is indeed a verb with respect to its part of speech, so that the following code will try to reach backwards to the subject engram and install a "seq" tag referring to the now not-so-imaginary BeVerb.

We run the Dushka AI and we type in, ты робот -- which is Russian for "You are a robot", but without the be-verb. We are puzzled when Dushka answers, Я ЧТО Я ТАКОЕ ("I -- WHAT AM I?") and that's all she wrote. It may indicate that her concept of self has been activated by the input referring to "you", but she does not seem to have understood the input. We check the diagnostic display, and we see that her concept of self now has a "seq" tag referring right back to herself instead of to the default Russian BeVerb.

What went wrong? We look at the JavaScript source code again, and we see that it was not enough to set the part-of-speech as a verb. We go ahead and we set the Psi concept-number to be that of the Russian be-verb. Then we run the Russian AI again with the same input and we sit there in shock when the AI announces to us: Я РОБОТ. Dushka has just said to us, "I AM A ROBOT" in Russian. From the diagnostic display we discover that the same changes that made Dushka able to understand the idea, made her able to think the idea.
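The two fixes might be rendered roughly as in the following sketch. The concept number 800 for БЫТЬ comes from the earlier entry; the array layout, the subject's concept number and the part-of-speech code for verbs are assumptions for illustration only.

```javascript
// Sketch only: the pseudo-verb node needs both a verb part-of-speech and
// the concept number of БЫТЬ before the backward reach installs the "seq" tag.
var BE_VERB = 800;
var psiArray = [
  { psi: 701, pos: 7, seq: 0 },   // t = 0: nominative subject (numbers assumed)
  { psi: 0,   pos: 0, seq: 0 }    // t = 1: slot for the imaginary be-verb
];

function installDefaultBeVerb(t) {
  psiArray[t].pos = 8;                   // declare it a verb (pos code assumed)
  psiArray[t].psi = BE_VERB;             // not just "a verb" but specifically БЫТЬ
  psiArray[t - 1].seq = psiArray[t].psi; // subject now points forward to the be-verb
}

installDefaultBeVerb(1);                 // the subject's "seq" becomes 800, not itself
```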

Saturday, February 04, 2012

feb4ruai

Artificial Intelligence in Russian

Fri.3.FEB.2012 -- Recognizing Inflections

For the Russian-thinking Dushka AI Mind, we have perhaps stumbled upon a way to avoid the hard-coding of noun paradigms and instead to let the Russian AI learn the inflected endings of Russian nouns from its own experience. For example, right now the Russian artificial intelligence (RuAi) fails to recognize the Psi concept #501 БОГ in the following exchange.

Human: я уважаю бога ("I honor God.")
Robot: ТЫ УВАЖАЕШЬ БОГА ("You honor God.")

Robot: ЧТО БОГА ТАКОЕ ("What is God?")

The diagnostic display reveals that the software has almost recognized the word for God.

559. Б 0 * 1 1 0
560. О 0 * 0 1 0
561. Г 0 * 0 1 501
562. А 0 * 0 0 902

Aha! Suddenly it becomes clear that two things are happening. The Psi concept #501 is indeed being recognized at first, but perhaps the provisional-recognition "prc" variable is not being set, and so AudInput calls NewConcept as if the AI were learning a new word instead of recognizing an old word.
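The suspected repair can be sketched as follows; apart from the variable name "prc" and the concept number #501, everything here is an illustrative assumption rather than the actual AudRecog and AudInput code.

```javascript
// Sketch only: if AudRecog matches a known stem but the word carries an
// extra inflectional character, "prc" should still be set, so the input
// routine treats the word as an old concept instead of calling NewConcept.
var prc = 0;                          // provisional recognition

function audRecogStem(stemConcept) {
  if (stemConcept > 0) {
    prc = stemConcept;                // e.g. 501 for the stem БОГ inside БОГА
  }
  return prc;
}

function audInput(word) {
  var psi = audRecogStem(501);        // pretend the stem of "word" matched #501
  return (psi > 0) ? "OldConcept" : "NewConcept";
}

audInput("БОГА");                     // now routes to OldConcept, not NewConcept
```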

Sat.4.FEB.2012 -- Learning Russian Like a Human Child

Now in a very rough way we have trapped for "zad1" in the AudRecog module so as to recognize a noun (БОГА) with one character of inflection added onto it. Because the noun was indeed recognized, the InStantiate "seqneed" mechanism tagged the noun in the "ruLexicon" with a "dba" of "4" to indicate a direct-object accusative case. In other words, the Russian AI learned a new noun-form as a human child would learn it, that is, from the speech patterns of another speaker of Russian.


Wednesday, February 01, 2012

feb1ruai

Artificial Intelligence in Russian

Tues.31.JAN.2012 -- Generating and Recognizing Verbs

In our Dushka Russian AI we have the problem that new verb-forms generated on the fly by the VerbGen module are not being recognized and tagged with critical parameters as they settle into auditory memory. However, it looks as though a verb does get recognized if the "audpsi" tags for the verb in auditory memory extend far enough back to cover the stem of the verb. Therefore, instead of devising ways to bypass the chain of ReEntry calling AudMem, which in turn calls AudRecog, we should perhaps implement a "backfill" of any verb generated in the VerbGen module, letting the "audpsi" tags extend back to the last "pho(neme)" of the verb-stem. Then the "provisional recall" mechanism in AudRecog ought to recognize the verb-form generated by the VerbGen module.
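The proposed backfill might look like the sketch below. The auditory-array layout, the example verb-form ДУМАЮ and its concept number 840 are assumptions for illustration; only the idea of extending "audpsi" back to the last phoneme of the stem comes from the text.

```javascript
// Sketch only: after VerbGen deposits a generated verb-form in auditory
// memory, extend the "audpsi" tag backwards over the tail of the stem so
// that provisional recall in AudRecog can recognize the form later.
var audMemory = [
  { pho: "Д", audpsi: 0 },
  { pho: "У", audpsi: 0 },
  { pho: "М", audpsi: 0 },
  { pho: "А", audpsi: 0 },       // last character of the stem
  { pho: "Ю", audpsi: 840 }      // only the ending carried the tag (840 is made up)
];

function backfillAudpsi(verbConcept, endTime, stemEndTime) {
  for (var t = endTime; t >= stemEndTime; t--) {
    audMemory[t].audpsi = verbConcept;   // stamp the concept back onto the stem
  }
}

backfillAudpsi(840, 4, 3);       // tag both Ю and the final stem character А
```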

We created a "vip" variable to hold the value of "motjuste" when VerbPhrase calls VerbGen and to transfer the known concept-number of the verb, near the end of the stem in VerbGen, into the provisional "prc" variable for AudRecog. In this way, we got the AI internally to recognize and record verb-forms generated internally by the VerbGen module. However, to get the AI to call the correct verb-forms, we had to modify some recent OldConcept code for deciding what "dba" value to store with a lexical item. Now we have a problem with tagging the "dba" of a simple word like МЕНЯ when it comes in.

We cannot rely on the form of МЕНЯ to tell us its "dba" because it could be genitive or accusative. We need to extract clues from the incoming sentence in order to assign the proper "dba" during the storage of МЕНЯ.

Wed.1.FEB.2012 -- Tagging Engrams with Parameters

We can perhaps rely on the "seqneed" mechanism of InStantiate to provide the "dba" parameter for a noun or pronoun entering the mind as user input. (Perhaps the "seqneed" variable should change to a "seqseek" variable for greater clarity.) We may be able to strengthen the use of "seqneed" by adding a kind of "pass-over" when a preposition is encountered, so that the software continues to look for a direct-object noun when a preposition-plus-noun combination is detected and skipped.

Where the InStantiate module tests for a "seqneed" of "5" and encounters a satisfying noun or pronoun to become a "seq" for the verb, we make the assumption that the time "t" identifies the temporal location of the noun or pronoun in both the Psi array and the "ruLexicon" array. We insert two lines of code to first "examine" the Russian lexical array and then to substitute a numeric "4" for the "ru4" flag of the "dba" value. Since the noun or pronoun is going to be the "seq" of the verb, that same noun or pronoun warrants a "dba" of "4" as a direct object that should be in the accusative case. However, we may need to make other arrangements if the verb is intransitive and the noun must be in the nominative as a predicate nominative.
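The two inserted lines might amount to something like the sketch below. The "seqneed" value of 5, the "dba" of 4 and the word МЕНЯ come from the text; the array layout, the stored time point and the helper function are assumptions for illustration.

```javascript
// Sketch only: when InStantiate finds a noun or pronoun satisfying a
// "seqneed" of 5 at time t, it reaches into the Russian lexical array at
// that same t and forces the dba flag to 4 (accusative direct object).
var ruLexicon = [];
ruLexicon[23] = { word: "МЕНЯ", dba: 0 };   // lexical entry stored at t = 23

function stampAccusative(t) {
  var entry = ruLexicon[t];     // "examine" the Russian lexical array at time t
  if (entry) {
    entry.dba = 4;              // direct object of the verb => accusative case
  }
}

stampAccusative(23);            // МЕНЯ is now tagged with dba = 4
```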