Kopp S, Bergmann K. Using Cognitive Models to Understand Multimodal Processes: The Case for Speech and Gesture Production. In: Oviatt S, Schuller B, Cohen P, Krüger A, eds. Handbook of Multimodal-Multisensor Interfaces. Vol. 1: Foundations, User Modeling, and Common Modality Combinations. ACM Books, Vol. 14. New York, NY: ACM / Morgan & Claypool; 2017:239-276.
Kopp S, Bergmann K, Kahl S. A spreading-activation model of the semantic coordination of speech and ...
In this chapter, we present a plea for a stronger inclusion of two strands of research in Cognitive ...
Proceedings of the NODALIDA 2009 workshop Multimodal Communication — from Human Behaviour to ...
Bergmann K, Kahl S, Kopp S. How is information distributed across speech and gesture? A cognitive mo...
Kopp S, Bergmann K, Wachsmuth I. Multimodal Communication from Multimodal Thinking - Towards an Inte...
Multimodal signals allow us to gain insights into internal cognitive processes of a person, for exam...
Bergmann K, Kopp S. Multimodal Content Representation for Speech and Gesture Production. In: Theune ...
Kopp S. Giving interaction a hand - Deep Models of Co-speech Gesture in Multimodal Systems. In: Asso...
Since the beginnings of psycholinguistics, gestures were considered as significant parts of the mult...
Bergmann K, Kahl S, Kopp S. Modeling the semantic coordination of speech and gesture under cognitive...
In this paper, I will give an overview of some well-studied multimodal signals that humans produce w...
Bergmann K, Kopp S. Co-expressivity of Speech and Gesture: Lessons for Models of Aligned Speech and ...
One of the implicit assumptions of multi-modal interfaces is that human-computer interaction is sign...