Studying early interaction is essential for understanding development and psychopathology. Automatic computational methods offer the possibility of analysing the social signals and behaviours of several partners simultaneously and dynamically. Here, 20 mother-infant dyads (10 extremely high-risk and 10 low-risk), with infants aged 13 to 36 months, were videotaped during mother-infant interaction using two-dimensional (2D) and three-dimensional (3D) sensors. From the 2D+3D data and a 3D space reconstruction, we extracted individual parameters (quantity of movement and motion activity ratio for each partner) and dyadic parameters related to the dynamics of the partners' head distance (contribution to head distance), to the focus of mutual engagement (percentage of time spent face to face or oriented to the task), and to the dynamics of motion activity (synchrony ratio, overlap ratio, pause ratio). These features were compared with a blind global rating of the interaction using the Coding Interactive Behavior (CIB) scale. We found that individual and dyadic 2D+3D motion features correlated perfectly with the rated CIB maternal and dyadic composite scores. A Support Vector Machine classifier using all 2D+3D motion features assigned 100% of the dyads to their correct group, indicating that motion behaviours are sufficient to distinguish high-risk from low-risk dyads. The proposed method may offer a promising, low-cost methodology that uses automated technology to detect meaningful features of human interaction, and it may have several implications for studying dyadic behaviours in psychiatry. Combining global rating scales with computerized methods may enable a continuum of time scales, from a summary of an entire interaction to second-by-second dynamics.
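The classification step described above could be sketched as follows. This is an illustrative reconstruction, not the authors' actual pipeline: the feature values are synthetic placeholders, and the use of leave-one-out cross-validation (a common choice for a sample of 20 dyads) is an assumption, as the abstract does not specify the evaluation scheme.

```python
# Hedged sketch: SVM classification of dyads (high-risk vs low-risk)
# from motion features, evaluated with leave-one-out cross-validation.
# All data here are invented for illustration only.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

# 20 dyads x 7 motion features, standing in for the parameters named in
# the text (quantity of movement, motion activity ratio, contribution to
# head distance, face-to-face percentage, synchrony/overlap/pause ratios).
n_features = 7
X_low = rng.normal(0.0, 1.0, size=(10, n_features))   # 10 low-risk dyads
X_high = rng.normal(1.5, 1.0, size=(10, n_features))  # 10 high-risk dyads (shifted for illustration)
X = np.vstack([X_low, X_high])
y = np.array([0] * 10 + [1] * 10)  # 0 = low-risk, 1 = high-risk

# Standardize features, then fit an RBF-kernel SVM; score each held-out dyad.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.2f}")
```

With 20 samples, leave-one-out cross-validation trains on 19 dyads and tests on the remaining one, repeated 20 times; the mean of the per-dyad scores is the reported accuracy.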