Title :
The short-context priority of emergent representations in unsupervised learning
Author :
Ping Gan ; Juyang Weng
Author_Institution :
Sch. of Comput. Sci. & Inf. Eng., Shanghai Inst. of Technol., Shanghai, China
Abstract :
The more books a child reads in his mother tongue, the better he understands the text, even though he does not deliberately set out to find the meanings of new words; that is, he uses unsupervised learning with previously established constants. We believe this is because the semantics and syntax of the language emerge in his brain through sufficient past exposure to the language and past interactions with the environment using that language. In this largely theoretical work, we are interested in how internal neurons represent temporal contexts using incrementally constructed circuits. We analyze how the feature neurons of the Developmental Network (DN) represent earlier contexts. Although bottom-up and top-down inputs both contribute to the pre-action values of the top-winner feature neurons, the earlier an input text appears in the representation, the less it contributes to the pre-action values of those neurons. Furthermore, the contribution of an early context decreases exponentially in the number of time frames that have since passed. Thus, the self-organization of the internal feature neurons in a DN has a desirable short-context priority. This property allows the later competition-based self-organization process to gracefully discount the early initialization of the internal neurons. Statistically, the data-driven initialization of the feature neurons, and the order and kind of text that the DN reads, become less and less relevant as the amount of text read increases.
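The exponential short-context priority described above can be sketched with a toy linear recurrence. This is an illustrative assumption, not the paper's actual DN update rule: a neuron's pre-action value mixes the current bottom-up input with its own previous response, so the share attributable to an early frame shrinks geometrically in the number of frames that have since passed.

```python
def frame_contributions(inputs, a=0.5):
    """Toy recurrence z_t = (1 - a) * x_t + a * z_{t-1}, with z_0 = x_0,
    where a is the (hypothetical) top-down mixing weight.

    Returns the final value z_T, the share of z_T contributed by each
    frame k, and the share-weighted sum of the inputs (which equals z_T).
    The share of frame k is a**T for k == 0 and (1 - a) * a**(T - k)
    otherwise, so early frames decay exponentially with T - k.
    """
    T = len(inputs) - 1
    shares = [a ** T] + [(1 - a) * a ** (T - k) for k in range(1, T + 1)]
    z = inputs[0]
    for x in inputs[1:]:
        z = (1 - a) * x + a * z
    # The pre-action value decomposes exactly into per-frame shares.
    mix = sum(s * x for s, x in zip(shares, inputs))
    return z, shares, mix
```

For four frames with a = 0.5, the shares are 0.125, 0.125, 0.25, 0.5: the oldest frame's influence has already fallen below that of the most recent one, and the shares sum to 1, so later frames statistically dominate the representation, mirroring the abstract's claim that the initialization and early reading order matter less and less.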
Keywords :
text analysis; unsupervised learning; DN; bottom-up input; competition-based self-organization process; data-driven initialization; developmental network; emergent representations; incrementally constructed circuits; internal feature neuron self-organization; short-context priority; syntax; temporal contexts; time frames; top-down input; top-winner feature neurons; Computer science; Context; Educational institutions; Natural languages; Neurons; Syntactics; Vectors;
Conference_Titel :
2014 10th International Conference on Natural Computation (ICNC)
Conference_Location :
Xiamen
Print_ISBN :
978-1-4799-5150-5
DOI :
10.1109/ICNC.2014.6975805