Get The FACS Fast

September 16th, 2009

Brick, T.R., Hunter, M.D., & Cohn, J.F. Get The FACS Fast: Automated FACS face analysis benefits from the addition of velocity. 2009 International Conference on Affective Computing & Intelligent Interaction (ACII 2009)

This article reports that the accuracy of automatic recognition of FACS codes from video data can be improved by including the velocity and acceleration of tracked facial points from the Cohn-Kanade database.
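As a rough illustration of the idea (not the authors’ actual pipeline; the function name and 30 fps frame rate are our own assumptions), velocity and acceleration features can be derived from tracked point coordinates by numerical differentiation:

```python
import numpy as np

def add_motion_features(points, dt=1.0 / 30.0):
    """Append finite-difference velocity and acceleration features to a
    (frames x coordinates) array of tracked facial points.

    points: array of shape (n_frames, n_coords)
    dt: time between video frames in seconds (30 fps assumed here)
    """
    velocity = np.gradient(points, dt, axis=0)        # first derivative
    acceleration = np.gradient(velocity, dt, axis=0)  # second derivative
    return np.hstack([points, velocity, acceleration])

# 100 frames of 68 (x, y) landmarks -> 136 coordinates per frame
frames = np.random.rand(100, 136)
features = add_motion_features(frames)
```

Each frame’s feature vector triples in length (position, velocity, acceleration), giving a classifier the kind of dynamic information the article argues is beneficial.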

The article can be downloaded as a PDF.

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author’s copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.

OpenMx Software Begins Beta Testing

August 4th, 2009

The first beta test for OpenMx began today, as the software was released to its first 16 beta testers some twenty months after the project began. OpenMx is free and open source structural equation modeling (SEM) software for use with the statistical software R. The project is led by Steven Boker and currently includes 14 core developers at UVa and other universities, including Virginia Commonwealth University, the University of Chicago, the University of Houston, McMaster University, and the University of Edinburgh. The open source project currently comprises over 25,000 lines of code and documentation. OpenMx will begin public beta testing in October.

Learn more about the OpenMx software, documentation, and forums at the OpenMx web site.

Effects of Damping Head Movement and Facial Expression in Dyadic Conversation Using Real-Time Facial Expression Tracking and Synthesized Avatars

July 9th, 2009

Boker, S. M., Cohn, J. F., Theobald, B.-J., Matthews, I., Brick, T. & Spies, J. (in press) Effects of Damping Head Movement and Facial Expression in Dyadic Conversation Using Real-Time Facial Expression Tracking and Synthesized Avatars. Philosophical Transactions of the Royal Society B.

This article reports an experiment in which research assistants’ head movements and facial expressions were motion tracked during videoconference conversations, an avatar face was reconstructed in real time, and naive participants spoke with the avatar face. Research assistants’ facial expressions, vocal inflections, and head movements were attenuated at one-minute intervals in a fully crossed experimental design. Attenuated head movements led to increased head nods and lateral head turns, and attenuated facial expressions led to increased head nodding in both naive participants and confederates. The results are consistent with the hypothesis that the dynamics of head movements in dyadic conversation include a shared equilibrium.

(a) Four facial expressions. (b) The same four facial expressions, attenuated.

The manuscript of this article accepted for publication can be downloaded as a PDF. This preprint may not exactly replicate the final version published in the journal. It is not the copy of record.

Time Delay Embedding Increases Estimation Precision of Models of Intraindividual Variability

July 1st, 2009

von Oertzen, T. & Boker, S. (in press) Time Delay Embedding Increases Estimation Precision of Models of Intraindividual Variability. Psychometrika

An article describing a surprising result for estimating dynamical systems models was recently accepted for publication by Psychometrika. Publication rules for Psychometrika prohibit us from quoting excerpts or distributing the manuscript on the web, but we can send a preprint if you send one of the authors an email request.
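For readers unfamiliar with the technique, a time delay embedding simply rearranges a single time series into a matrix of overlapping lagged segments. A minimal sketch (function name and defaults are our own, not taken from the paper):

```python
import numpy as np

def time_delay_embed(x, dim, lag=1):
    """Arrange a univariate series into overlapping lagged rows.

    Each row holds `dim` successive observations spaced `lag` samples
    apart, so a dynamical systems model can be fit to many short
    overlapping segments of the same series at once.
    """
    x = np.asarray(x)
    n_rows = len(x) - (dim - 1) * lag
    return np.column_stack(
        [x[i * lag : i * lag + n_rows] for i in range(dim)]
    )

series = np.sin(np.linspace(0.0, 10.0, 50))
embedded = time_delay_embed(series, dim=5)
```

The embedded matrix has one row per overlapping segment, which is what allows a model to borrow strength across segments and, per the article’s result, improve estimation precision.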

Spatiotemporal Symmetry and Multifractal Structure of Head Movements

July 1st, 2009

Ashenfelter, K. T., Boker, S. M., Waddell, J. R., & Vitanov, N. (in press). Spatiotemporal Symmetry and Multifractal Structure of Head Movements during Dyadic Conversation. Journal of Experimental Psychology: Human Perception & Performance

This study examined the influence of sex, social dominance, and context on motion-tracked head movements during dyadic conversations. Windowed cross-correlation analyses found high peak correlations between conversants’ head movements over short (2-second) intervals, along with a high degree of nonstationarity. Nonstationarity in head movements was related to the gender of the participants. Multifractal analysis found small-scale fluctuations to be persistent and large-scale fluctuations to be antipersistent. These results are consistent with the view that symmetry forms between conversants over short intervals and is broken at longer, irregular intervals.
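A windowed cross-correlation of the kind described can be sketched as follows (window, step, and lag parameters here are illustrative, not the values used in the study):

```python
import numpy as np

def windowed_cross_correlation(x, y, window=60, step=30, max_lag=10):
    """Peak cross-correlation between two series in sliding windows.

    Returns a list of (peak_r, peak_lag) pairs, one per window; large
    changes in the peak lag from one window to the next are a simple
    indicator of nonstationarity in the coordination.
    """
    peaks = []
    for start in range(max_lag, len(x) - window - max_lag, step):
        a = x[start : start + window]
        best_r, best_lag = -np.inf, 0
        for lag in range(-max_lag, max_lag + 1):
            b = y[start + lag : start + lag + window]
            r = np.corrcoef(a, b)[0, 1]
            if r > best_r:
                best_r, best_lag = r, lag
        peaks.append((best_r, best_lag))
    return peaks
```

If one series is a delayed copy of the other, each window’s peak lag recovers the delay; in real conversational data the peak lag drifts across windows, which is the nonstationarity the study reports.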

(a) Coordination between dancers is mostly stationary. (b) Coordination during conversation is highly nonstationary.

The manuscript of this article accepted for publication can be downloaded as a PDF. This article may not exactly replicate the final version published in the APA journal. It is not the copy of record.

Describing Intraindividual Variability at Multiple Time Scales Using Derivative Estimates

July 1st, 2009

Deboeck, P. R., Montpetit, M. A., Bergeman, C. S. & Boker, S. M. (in press). Describing Intraindividual Variability at Multiple Time Scales Using Derivative Estimates. Psychological Methods.

Studying intraindividual variability can be made more productive by examining variability of interest at specific time scales, rather than considering the variability of entire time series. Examination of variance in observed scores may not be sufficient, as this neglects the time-scale-dependent relationships between observations. This article outlines a method to examine intraindividual variability through estimates of the variance and other distributional properties at multiple time scales using estimated derivatives.
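One simple way to see the idea (a sketch only; the article’s actual estimator is more general): estimate the first derivative by fitting a line within sliding windows, where the window width sets the time scale, and then summarize the variance of those estimates:

```python
import numpy as np

def derivative_variance(x, dt=1.0, window=5):
    """Variance of first-derivative estimates at one time scale.

    The derivative at each point is estimated as the slope of a
    least-squares line through a sliding window of `window` samples;
    widening the window shifts the estimate toward slower time scales.
    """
    t = np.arange(window) * dt
    slopes = [
        np.polyfit(t, x[start : start + window], 1)[0]
        for start in range(len(x) - window + 1)
    ]
    return float(np.var(slopes))

# A slow oscillation plus a faster one: narrow windows pick up the
# fast fluctuations, while wide windows average them away.
n = np.arange(400)
x = np.sin(0.1 * n) + 0.5 * np.sin(1.0 * n)
```

Comparing `derivative_variance(x, window=3)` with `derivative_variance(x, window=15)` on this series shows the derivative variance shrinking as the window widens, i.e., the variability concentrating at faster time scales.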

The manuscript of this article accepted for publication can be downloaded as a PDF. This article may not exactly replicate the final version published in the APA journal. It is not the copy of record.

Mapping and Manipulating Facial Expression

July 1st, 2009

Theobald, B., Matthews, I., Mangini, M., Spies, J., Brick, T., Cohn, J. F., & Boker, S. (2009) Mapping and Manipulating Visual Prosody. Language and Speech 52:2, 369-386.

This article, just published in Language and Speech, describes the process by which we are mapping facial expressions from one individual to another in real time.

The manuscript as it was submitted can be downloaded as a PDF.

Something in the Way We Move - JEP:HPP 2009 Demo Video

May 21st, 2009

This is the demonstration video accompanying the publication linked here.

Download in MOV format.

Something in the Way We Move - JEP:HPP 2009 Article

May 21st, 2009

Boker, S. M., Cohn, J. F., Theobald, B., Matthews, I., Mangini, M., Spies, J. R., Ambadar, Z., & Brick, T. R. (in press). Something in the Way We Move: Motion Dynamics, not Perceived Sex, Influence Head Movements in Conversation. Journal of Experimental Psychology: Human Perception and Performance.

The manuscript of this article accepted for publication can be downloaded as a PDF. This article may not exactly replicate the final version published in the APA journal. It is not the copy of record.

WIAMIS 2009

May 15th, 2009

The WIAMIS 2009 paper:

Brick, T. R., Spies, J. R., Theobald, B-J., Matthews, I., and Boker, S. M. (2009). High-Presence, Low-Bandwidth, Apparent 3D Video-Conferencing With a Single Camera. In Proceedings of the 10th Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS 2009).

The paper can be downloaded as a PDF.
