should robots blush?

2023

Project Type

Individual Project

Timeframe

1 month

Genre

Interactive Design & Arduino

Role

UX Designer, Product Designer

Charles Darwin devoted Chapter 13 of his 1872 book The Expression of the Emotions in Man and Animals to complex emotional states, including self-attention, shame, shyness, modesty and blushing. Using emotions as a stepping stone toward emotional and physical intelligence can help robots learn intuitively how to interact with humans, and vice versa. This project explores the potential of AI systems to form emotional connections by proposing a blush module as one possible direction for humanoid robot skins. Even at a basic level, such a module can evoke feelings akin to empathy.

Background

Design Brief

Today, robots are predominantly deployed for repetitive jobs. Robotic arms, for instance, solder the small capacitors, resistors and chips onto mobile-phone PCBAs efficiently and accurately. Robots also take on jobs with occupational hazards, such as workshops exposed to toxic gases, dust or explosion-prone polishing, where their use greatly simplifies management. However, when people are dissatisfied with the work a robot completes, they tend to sulk at it, and efficiency and output suffer. If humans and robots are to understand one another's needs and cooperate on more complex tasks, we must foster emotional connections and meaningful communication between them.

Should robots blush?

How do humanoid robots express their feelings?

Light

Pepper, about four feet tall, can gesture with its arms and roll around on its wheeled base. For instance, when a guest asks for directions, it can turn to literally point the right way. Its eyes light up in different colours to express understanding, confusion, delight and other human emotions.

Facial Expression

Sophia is marketed as a "social robot" that can mimic social behaviour and induce feelings of love in humans. Sophia's face is created with Frubber®, a proprietary nanotech skin that mimics real human musculature and skin. This allows her to exhibit high-quality expressions and interactivity, simulating humanlike facial features and expressions.

After watching videos of Sophia performing in public, I noticed that her skin colour never changes, even when she appears excited or embarrassed. This is unlike people, who often blush to signal a shift in what they are feeling. Moreover, when we are working out whom to trust, it makes sense to choose people who would feel guilty if they did something wrong. Usefully, an embarrassment signal such as a blush both acknowledges a problem and typically enlists sympathy and assistance in resolving it. We should therefore consider how robots might imitate this expression, which could enhance their acceptability and open up opportunities for interactive learning.

So, how do robots blush?

Research

Inspiration + Technology

The echolocation mask allows the wearer to navigate a space without sight by sensing how far away other objects are.

  • Chameleon-Inspired Smart Skin

Chameleons change colour by dispersing or concentrating pigment granules within their melanophore cells. Research based on this mechanism provides a general framework to guide the future design of artificial smart skins. For example, a research team at Tianjin University introduced cholesteric liquid crystal elastomers (CLCEs) that are mechanochromic, shape-programmable and self-healing, synthesized by introducing dynamic covalent boronic ester bonds into the main-chain CLCE polymer networks. Additionally, a research team at Emory University has developed a stretchable, strain-accommodating smart skin (SASS), which maintains near-constant size during chromatic shifting.

  • Interface by Jose Chavarría

I found Interface a very interesting project: it uses Arduino and various sensors to change human perception of reality. Each mask swaps a human sense for a non-human sense inspired by a different animal, letting us embody the way these other creatures perceive the world.

Design

Ideation

The geomagnetoception mask lets the user feel different latitude and longitude coordinates around the world.

I want to design a blush module for humanoid robots so they can signal the emotional states they pass through when interacting with humans. I considered three ways to convey the design idea: a film, a sound-controlled blush module and a touch-sensor blush module. I ultimately chose the touch-sensor blush module.

1. Film

A film can be used to demonstrate how humanoid robots can utilise the future design of artificial smart skins. More specifically, under what conditions does the robot blush, and how will it interact with humans?

2. Blush Module (Touch)

An Arduino board (programmed in the Arduino IDE), an LED light ring and a digital pressure sensor can be combined into a touch-sensor blush module. The aim is to make people more aware of the feelings and dignity of humanoid robots. For instance, the robot feels shy when someone touches it and angry when someone hits it; the module illuminates to indicate the change in the robot's emotional state.

3. Blush Module (Sound)

The sound-controlled blush module shares the same goal as the touch-sensor module. The difference is that pre-recorded phrases are fed into the module, which illuminates when certain words are detected or when the sound exceeds a decibel threshold.
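The loudness half of this trigger can be sketched as a simple level check: convert a buffer of microphone samples to an RMS level in decibels and compare it against a cutoff. This is an illustrative model only; the -20 dBFS threshold is an assumed placeholder, not a value from the project, and word detection is out of scope here.

```cpp
#include <cmath>
#include <vector>

// Compute the RMS level of normalized samples (range -1..1) in decibels
// relative to full scale (dBFS). Quieter signals give more negative values.
double rmsDb(const std::vector<double>& samples) {
    double sumSquares = 0.0;
    for (double s : samples) sumSquares += s * s;
    double rms = std::sqrt(sumSquares / samples.size());
    return 20.0 * std::log10(rms);
}

// Decide whether the blush module should light up. The -20 dBFS threshold
// is an arbitrary placeholder for "the decibel level is too high".
bool shouldBlush(const std::vector<double>& samples, double thresholdDb = -20.0) {
    return rmsDb(samples) > thresholdDb;
}
```

On real hardware the samples would come from a microphone module's analog output rather than a prebuilt vector, but the thresholding logic stays the same.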

Looks-like & Works-like

Inspired by the chameleon's skin, I designed small modules that can each be controlled individually. They can be assembled into various shapes and placed wherever the robot needs them. The "works-like" blush module prototype on the right differs from the original design in order to demonstrate its function: it is built as a wearable for the robot and scaled up to accommodate the LED light ring and the wires.

Make

Feasibility test

I tested the LED light and the digital pressure sensor using the code shown on the left and the following equipment: an LED ring lamp (WS2812 5050 RGB), a force sensor (7.5 mm, 0-10 kg), an ELEGOO R3 controller board and a breadboard.
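The core of the touch-sensor logic can be modelled as a mapping from the force sensor's 10-bit analog reading (0-1023) to an emotional state, and from that state to an RGB colour for the LED ring. The thresholds and colours below are illustrative assumptions, not measured values from the project; board-specific calls (analogRead, the WS2812 driver) are omitted so the logic stands alone.

```cpp
#include <cstdint>

// RGB colour sent to the LED ring.
struct Colour { uint8_t r, g, b; };

enum class Mood { Neutral, Shy, Angry };

// Map a 10-bit force-sensor reading to a mood.
// Thresholds are placeholders chosen for illustration.
Mood moodFromReading(int reading) {
    if (reading > 700) return Mood::Angry;  // hard hit -> angry
    if (reading > 200) return Mood::Shy;    // gentle touch -> shy
    return Mood::Neutral;                   // no meaningful contact
}

// Map a mood to the colour shown by the blush module.
Colour colourFor(Mood mood) {
    switch (mood) {
        case Mood::Shy:   return {255, 105, 180};  // pink blush
        case Mood::Angry: return {255, 0, 0};      // red
        default:          return {0, 0, 0};        // LEDs off
    }
}
```

On the Arduino itself, each loop iteration would read the sensor, pass the value through moodFromReading, and push the resulting colour to the WS2812 ring.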

Prototype Making

The final model is prototyped using UV resin mixed with a translucent colourant. I made and tested different support structures in order to fit the prototype onto the model's face.

WORKS-LIKE Prototype
