Biblio

2018
Fedorov LA, Chang DS, Giese MA, Bülthoff HH, de la Rosa S.  2018.  Adaptation aftereffects reveal representations for encoding of contingent social actions. PNAS. 115:7515–7520.
Sperber C, Christensen A, Ilg W, Giese MA, Karnath H-O.  2018.  Apraxia of object-related action does not depend on visual feedback. Cortex. 99:103–117.
Balasubramanian S, Garcia-Cossio E, Birbaumer N, Burdet E, Ramos-Murguialday A.  2018.  Is EMG a Viable Alternative to BCI for Detecting Movement Intention in Severe Stroke? IEEE Transactions on Biomedical Engineering. 65:2790–2797.
Li Y, Ganesh G, Jarrasse N, Haddadin S, Albu-Schaeffer A, Burdet E.  2018.  Force, Impedance, and Trajectory Learning for Contact Tooling and Haptic Identification. IEEE Transactions on Robotics. 34:1170–1182.
Takagi A, Usai F, Ganesh G, Sanguineti V, Burdet E.  2018.  Haptic communication between humans is tuned by the hard or soft mechanics of interaction. PLOS Computational Biology. 14:e1005971.
Ivanenko Y, Gurfinkel VS.  2018.  Human Postural Control. Frontiers in Neuroscience. 12
Gopinathan S, Mohammadi P, Steil JJ.  2018.  Improved Human-Robot Interaction: A manipulability based approach.
Kodl J, Christensen A, Dijkstra TMH, Giese MA.  2018.  Intent Perception of Human and Non-human Agent During Ball Throwing Task in Virtual Reality. Perception.
Burdet E, Li Y, Kager S, Chua Sui Geok K, Hussain A, Campolo D.  2018.  Interactive robot assistance for upper-limb training. Rehabilitation Robotics. Technology and application. :137–148.
Fedorov LA, Dijkstra TMH, Giese MA.  2018.  Lighting-from-above prior in biological motion perception. Scientific Reports. 8
Chiovetto E, Huber ME, Sternad D, Giese MA.  2018.  Low-dimensional organization of angular momentum during walking on a narrow beam. Scientific Reports. 8
Hovaidi Ardestani M, Saini N, Martinez A, Giese MA.  2018.  Neural model for the visual recognition of animacy and social interaction. LNCS, vol. 11141, 27th Int. Conf. on Artificial Neural Networks, ICANN 2018, Rhodes, Greece, October 4-7, 2018, Proceedings, Part III. :168–177.
Donadio A, Whitehead K, Gonzalez F, Wilhelm E, Formica D, Meek J, Fabrizi L, Burdet E.  2018.  A novel sensor design for accurate measurement of facial somatosensation in pre-term infants. PLOS ONE. 13:e0207145.
Chiovetto E, Curio C, Endres D, Giese MA.  2018.  Perceptual integration of kinematic components in the recognition of emotional facial expressions. Journal of Vision. 18 (4):1–19.
Mohammadi P, Malekzadeh M, Kodl J, Mukovskiy A, Wigand DL, Giese MA, Steil JJ.  2018.  Real-time Control of Whole-body Robot Motion and Trajectory Generation for Physiotherapeutic Juggling in VR. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).
Ceccarelli F, La Scaleia B, Russo M, Cesqui B, Gravano S, Mezzetti M, Moscatelli A, d’Avella A, Lacquaniti F, Zago M.  2018.  Rolling motion along an incline: Visual sensitivity to the relation between acceleration and slope. Frontiers in Neuroscience. 12
de la Rosa S, Fademrecht L, Bülthoff HH, Giese MA, Curio C.  2018.  Two ways to facial expression recognition? Motor and visual information have different effects on facial expression recognition. Psychological Science. 29:1257–1269.