Bimodal integration of phonemes and letters: an application of multimodal self-organizing networks
2006 (English). In: International Joint Conference on Neural Networks: IJCNN '06, Piscataway, NJ: IEEE Communications Society, 2006, pp. 312-318. Conference paper (refereed).
Multimodal integration of sensory information has clear advantages for survival: events that can be sensed in more than one modality are detected more quickly and accurately, and if the sensory information is corrupted by noise, classification of the event is more robust in multimodal percepts than in unisensory ones. It is shown that a Multimodal Self-Organizing Network (MuSON), consisting of several interconnected Kohonen Self-Organizing Maps (SOMs), can simulate the bimodal integration of phonemes (the auditory elements of language) and letters (the visual elements of language). The robustness of the bimodal percepts against noise in both the auditory and visual modalities is clearly demonstrated.
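The building block of the MuSON described in the abstract is the Kohonen Self-Organizing Map. The following is a minimal, illustrative sketch of SOM training in pure Python; the function names, grid size, and decay schedules are assumptions for illustration and are not taken from the paper, which interconnects several such maps into a multimodal network.

```python
import math
import random

def train_som(data, rows=4, cols=4, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small Kohonen SOM on a list of feature vectors.

    Illustrative sketch only: parameters and decay schedules are
    assumptions, not those of the paper's MuSON.
    Returns the grid of weight vectors (rows x cols x dim).
    """
    rng = random.Random(seed)
    dim = len(data[0])
    # Initialize weight vectors randomly in [0, 1)
    weights = [[[rng.random() for _ in range(dim)] for _ in range(cols)]
               for _ in range(rows)]

    for epoch in range(epochs):
        # Linearly decay learning rate and neighborhood radius over time
        frac = epoch / epochs
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 0.5
        for x in data:
            # Find the best-matching unit (BMU): closest weight vector
            bmu = min(((r, c) for r in range(rows) for c in range(cols)),
                      key=lambda rc: sum((w - xi) ** 2 for w, xi in
                                         zip(weights[rc[0]][rc[1]], x)))
            # Pull the BMU and its grid neighbors toward the input,
            # weighted by a Gaussian neighborhood function
            for r in range(rows):
                for c in range(cols):
                    d2 = (r - bmu[0]) ** 2 + (c - bmu[1]) ** 2
                    h = math.exp(-d2 / (2 * sigma * sigma))
                    weights[r][c] = [w + lr * h * (xi - w)
                                     for w, xi in zip(weights[r][c], x)]
    return weights

def bmu_of(weights, x):
    """Return the grid coordinates of the best-matching unit for input x."""
    rows, cols = len(weights), len(weights[0])
    return min(((r, c) for r in range(rows) for c in range(cols)),
               key=lambda rc: sum((w - xi) ** 2 for w, xi in
                                  zip(weights[rc[0]][rc[1]], x)))
```

In a bimodal arrangement of the kind the abstract describes, one such map would receive auditory (phoneme) features and another visual (letter) features, with their activations feeding a further integrating map.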
Research subject: Industrial Electronics
Identifiers
URN: urn:nbn:se:ltu:diva-34134
DOI: 10.1109/IJCNN.2006.246697
Local ID: 84039b50-965f-11db-8975-000ea68e967b
ISBN: 0-7803-9490-9
OAI: oai:DiVA.org:ltu-34134
DiVA: diva2:1007384
IEEE World Congress on Computational Intelligence, 16/07/2006 - 21/07/2006