Semester Project: Embodied perception of brushstrokes

The brushstroke, in its various manifestations, is the singular tool of communication encountered in paintings and drawings across all epochs. The media artist and painter Liat Grayver has been investigating the value of incorporating computational and robotic methods into traditional painting techniques, aiming to create a library of gestural brushstrokes for robotic painting that can serve time-based human-machine collaborative creative work (http://www.liatgrayver.com/).
The overall goal of this project is to develop an autonomous active-perception system for creative generative processes, using a dynamic vision sensor (DVS) that can eventually be connected in a closed-loop setup with a robotic arm, providing in-the-loop visual feedback and driving the artistic process.
The initial goal of this project is to map the spatial and temporal aspects of brushstrokes into streams of DVS events as they are being generated, and to associate images of different types of brushstrokes with their corresponding event streams. The DVS will be set up behind a semi-translucent sheet of paper so that it can detect the temporal changes in ink application on the paper. A data set will be created that stores pairs of artist gestures and DVS outputs, so that the two patterns can be associated with each other (see the sketch below).
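As a rough illustration of what one entry of such a data set could contain, the Python sketch below pairs a raw DVS event stream with a gesture label and an image of the finished stroke. The 240×180 sensor resolution, the NumPy/.npz storage format, and the save_labelled_sample helper are illustrative assumptions, not part of any existing camera API or a prescribed implementation.

```python
# Minimal sketch; assumes a 240x180 DVS whose driver delivers events as
# (timestamp, x, y, polarity) tuples. The helper below is a hypothetical
# placeholder, not an existing API.
import numpy as np

# One DVS recording: a stream of address-events, ordered by timestamp.
event_dtype = np.dtype([
    ("t", np.int64),   # timestamp in microseconds
    ("x", np.int16),   # pixel column
    ("y", np.int16),   # pixel row
    ("p", np.int8),    # polarity: +1 brightness increase, -1 decrease
])

def save_labelled_sample(path, events, gesture_label, stroke_image):
    """Store one data-set entry: the event stream recorded while the stroke
    was painted, the artist-gesture label, and an image of the dried stroke."""
    np.savez_compressed(
        path,
        events=np.asarray(events, dtype=event_dtype),
        gesture_label=gesture_label,   # e.g. "flat brush, left-to-right sweep"
        stroke_image=stroke_image,     # grayscale scan of the finished stroke
    )

# Example with synthetic placeholder data:
fake_events = np.zeros(1000, dtype=event_dtype)
fake_image = np.zeros((180, 240), dtype=np.uint8)
save_labelled_sample("sample_0001.npz", fake_events, "dry-brush flick", fake_image)
```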
The recorded DVS data and the corresponding gesture "labels" can then be used to train an artificial neural network to recognize and classify images of different brushstrokes.
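One possible realization, sketched below, accumulates the events of each recording into per-polarity count images and trains a small convolutional network on them, with the gesture labels as targets. The histogram representation, the PyTorch network layout, and the number of classes are assumptions made for illustration; choosing and evaluating the actual representation and architecture is part of the project.

```python
# Minimal sketch; assumes events are accumulated into 2x180x240 count images
# (one channel per polarity). Network layout and class count are illustrative.
import numpy as np
import torch
import torch.nn as nn

def events_to_histogram(events, height=180, width=240):
    """Collapse an event stream into a 2-channel count image (on/off polarity)."""
    hist = np.zeros((2, height, width), dtype=np.float32)
    for ch, pol in enumerate((1, -1)):
        sel = events["p"] == pol
        np.add.at(hist[ch], (events["y"][sel], events["x"][sel]), 1.0)
    return torch.from_numpy(hist)

class StrokeClassifier(nn.Module):
    """Small CNN that maps an event histogram to a brushstroke class."""
    def __init__(self, num_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# One supervised training step on a labelled (events, gesture class) pair:
model = StrokeClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

events = np.zeros(1000, dtype=[("t", np.int64), ("x", np.int16),
                               ("y", np.int16), ("p", np.int8)])
inputs = events_to_histogram(events).unsqueeze(0)   # batch of one recording
target = torch.tensor([0])                          # class index of the gesture
loss = criterion(model(inputs), target)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```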
In the long term, the outcome of this project will be linked to a parallel project, currently under way, in which a robot arm is controlled to create artificial brushstrokes; the perceptual feedback developed here will then be used to drive the robot arm towards more natural brushstrokes.
The project raises questions about how the development of contemporary production and the creation of novel technologies in the realms of robotics and computer graphics have impacted, and been informed by, contemporary painting practices: How does the practice and perception of painting change when given over to deterministic processes and mechanical action? What forms of tactile expression are possible with the limited degrees of freedom of a robotic system? What new kinds of coupling between a human and a robotic system are possible in the act of painting? Given recent advances in these fields, it is timely to foster a critical cultural discussion about how these tools can serve as an emancipatory platform through which to form a new understanding of visual æsthetics.

Background reading
• Berio, D. et al. Generating Calligraphic Trajectories with Model Predictive Control in Proceedings of Graphics Interface (Canadian Human-Computer Communications Society, Edmonton, Canada, May 2017).
• Berio, D. et al. Learning dynamic graffiti strokes with a compliant robot in Proc. IEEE/RSJ Intl Conf. on Intelligent Robots and Systems (IROS) (Daejeon, Korea, Oct. 2016).
• Boden, M. The Creative Mind: Myths and Mechanisms (London: Routledge, 2004).
• Jean-Pierre, G. & Saïd, Z. The artist robot: A robot drawing like a human artist in 2012 IEEE International Conference on Industrial Technology (2012), 486–491.
• Lubart, T. How can computers be partners in the creative process: classification and commentary on the special issue. International Journal of Human-Computer Studies 63, 365–369 (2005).
• Plamondon, R. et al. Recent Developments in the Study of Rapid Human Movements with the Kinematic Theory. Traitement du Signal 26, 377–394 (2009).
• Brooks, R. A. New approaches to robotics. Science 253, 1227–1232 (1991).
• Saunders, R. et al. Curious Whispers: An Embodied Artificial Creative System. in ICCC (2010), 100–109.
• Saunders, R. et al. Evaluating human-robot interaction with embodied creative systems in Proceedings of the fourth international conference on computational creativity (2013), 205–209.
• Savery, R. et al. Shimon the Rapper: A Real-Time System for Human-Robot Interactive Rap Battles. arXiv preprint arXiv:2009.09234 (2020).

Contact

For further information contact Giacomo Indiveri [giacomo (at) ini.uzh.ch] and Liat Grayver [griver (at) collegium.ethz.ch]

© 2024 Institut für Neuroinformatik