
Robotics

2023

April 5

Robots predict human intention for faster builds

  • The researchers aim to teach robots how to predict human preferences in assembly tasks, so they can one day help with everything from building a satellite to setting a table.

  • When working with people, a robot needs to constantly guess what the person will do next

  • In this new study, however, the researchers found similarities in how an individual will assemble different products. For instance, if you start with the hardest part when building an Ikea sofa, you are likely to use the same tactic when putting together a baby's crib.

  • So, instead of "showing" the robot their preferences in a complex task, they created a small assembly task (called a "canonical" task) that people can easily and quickly perform.

  • The robot "watched" the human complete the task using a camera placed directly above the assembly area, looking down. ... Then, the system used machine learning to learn a person's preference based on their sequence of actions in the canonical task.

  • By helping each person in their preferred way, robots can reduce their work, save time and even build trust with them.

  • The researchers hope this work will lead to significant improvements in the safety and productivity of assembly workers in human-robot hybrid factories.
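The canonical-task idea above can be sketched in a few lines. This is a hypothetical toy, not the study's model: it assumes each part has a known difficulty score, infers whether a person prefers starting with harder or easier parts from one observed sequence, and applies that preference to a new task.

```python
# Toy sketch of canonical-task preference learning (hypothetical scoring).
def infer_preference(observed_order, difficulty):
    """Return 'hard_first' or 'easy_first' from one observed sequence."""
    scores = [difficulty[p] for p in observed_order]
    mid = len(scores) // 2
    first, rest = scores[:mid], scores[mid:]
    return "hard_first" if sum(first) / len(first) >= sum(rest) / len(rest) else "easy_first"

def predict_order(parts, difficulty, preference):
    """Order the parts of a new task according to the inferred preference."""
    return sorted(parts, key=lambda p: difficulty[p],
                  reverse=(preference == "hard_first"))

# Canonical task: the person assembled the hardest pieces first.
canon_difficulty = {"frame": 3, "panel": 2, "screw": 1}
pref = infer_preference(["frame", "panel", "screw"], canon_difficulty)

# New task (e.g. the crib): apply the same preference.
crib_difficulty = {"rail": 2, "base": 4, "bolt": 1}
plan = predict_order(["rail", "base", "bolt"], crib_difficulty, pref)
print(pref, plan)  # hard_first ['base', 'rail', 'bolt']
```

The real system learned preferences with machine learning from camera observations; the point here is only that a short canonical task can reveal an ordering preference that transfers to a larger one.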

2022

Mar 13     

Where is the human-driven "unnatural evolution of life" headed? (David Farrier)

  • Xenobots

2021

 

Nov 29            

Xenobots: Team builds first living robots that can reproduce

  • Now scientists have discovered an entirely new form of biological reproduction—and applied their discovery to create the first-ever, self-replicating living robots.

  • With the right design, they will spontaneously self-replicate.

Nov 4              

A new machine-learning system helps robots understand and perform certain social interactions

  • In a simulated environment, a robot watches its companion, guesses what task it wants to accomplish, and then helps or hinders this other robot based on its own goals.

  • Enabling robots to exhibit social skills could lead to smoother and more positive human-robot interactions. For instance, a robot in an assisted living facility could use these capabilities to help create a more caring environment for elderly individuals. The new model may also enable scientists to measure social interactions quantitatively, which could help psychologists study autism or analyze the effects of antidepressants.

  • The robot is rewarded for actions it takes that get it closer to accomplishing its goals. If a robot is trying to help its companion, it adjusts its reward to match that of the other robot; if it is trying to hinder, it adjusts its reward to be the opposite.
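The reward adjustment described above is simple to sketch. A minimal toy, not the paper's implementation: the observer guesses which goal its companion is heading for (the one its last move got closer to), then adopts that goal's reward when helping, or its negative when hindering.

```python
# Hypothetical sketch of the help/hinder reward idea on a grid.
def guess_goal(prev_pos, pos, goals):
    """Pick the goal whose Manhattan distance decreased most after the move."""
    def dist(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])
    return max(goals, key=lambda g: dist(prev_pos, g) - dist(pos, g))

def social_reward(companion_reward, stance):
    """Helping shares the companion's reward; hindering negates it."""
    return companion_reward if stance == "help" else -companion_reward

# Companion at (0, 0) steps to (1, 0); candidate goals at (5, 0) and (0, 5).
goal = guess_goal((0, 0), (1, 0), [(5, 0), (0, 5)])
print(goal)                          # (5, 0)
print(social_reward(2.0, "help"))    # 2.0
print(social_reward(2.0, "hinder"))  # -2.0
```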

Oct 6              

A new model to synthesize emotional speech for companion robots

  • Researchers at Hitachi R&D Group and University of Tsukuba in Japan have developed a new method to synthesize emotional speech that could allow companion robots to imitate the ways in which caregivers communicate with older adults or vulnerable patients.

  • a speech synthesis method for imitating the emotional states in human speech

  • the researchers trained a machine-learning model on a dataset of human voice recordings gathered at different points during the day. During training, the emotion recognition component of the model learned to recognize emotions in human speech.

  • The results are highly promising, as they suggest that their emotional speech synthesizer can effectively produce caregiver-like speech aligned with the circadian rhythms of most elderly users.

Oct 4              

Study explores how a robot's inner speech affects a human user's trust

  • an experiment using a robot that can talk to itself out loud in a way that resembles humans' inner speech.

  • In their previous work, the researchers showed that a robot's performance can improve when it talks to itself.

  • In their new work, they set out to investigate whether this ability to talk to itself can affect how users perceive a robot's trustworthiness and anthropomorphism

  • based on well-known inner speech scales (such as the Self Talk Scale)

Aug 27            

Expanding human-robot collaboration in manufacturing by training AI to detect human intention

  • 'training' robots to detect arm movement intention before humans articulate the movements

  • by interfacing with the frontal lobe activity of the human brain.

  • training an AI system to recognize the pre-movement patterns from an electroencephalogram (EEG) 

  • The experimental data shows that the AI system can detect when a human is about to move an arm up to 513 milliseconds (ms) before they move, and on average, around 300ms prior to actual execution.

  • The researchers hope this technology could contribute to a closer, symbiotic human-robot collaboration, which still requires a large amount of research and engineering work to be fully established.
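The detection step above rests on spotting a pre-movement pattern in the EEG before the arm actually moves. An illustrative sketch only, far simpler than the trained AI system: it thresholds the mean of a sliding window over a synthetic one-channel trace, a crude stand-in for detecting the slow negative drift (readiness potential) that precedes voluntary movement.

```python
# Toy windowed detector on a synthetic EEG-like trace (illustrative only).
def detect_onset(signal, window=5, threshold=-0.5):
    """Return the first index where the windowed mean drops below threshold,
    else None."""
    for i in range(len(signal) - window + 1):
        if sum(signal[i:i + window]) / window < threshold:
            return i
    return None

# Synthetic trace: flat baseline, then a slow negative drift before movement.
baseline = [0.0] * 20
drift = [-0.2 * k for k in range(1, 11)]   # -0.2 ... -2.0
trace = baseline + drift

idx = detect_onset(trace)
print(idx)  # 20
```

The real system classifies multichannel EEG with a learned model; the sliding window is only meant to show why detection can fire hundreds of milliseconds before the movement itself.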

Aug 5              

Robot uses tactile sign language to help deaf-blind people communicate independently

  • tactile sign language robot, which she has named TATUM, Tactile ASL Translational User Mechanism

  • People who are both deaf and blind often need an interpreter present with them in person when interacting with others who do not know American Sign Language, so they can feel the shapes the interpreter's hands are making.

  • She sees the robot as potentially helpful at home, at a doctor's office, or in other settings where someone might want to have private communication or an interpreter might not be readily available.

  • Johnson is focusing on the letters of the American Manual Alphabet, and training the robot to finger-spell some basic words.

July 16            

Neuro-evolutionary robotics: A gap between simulation and reality

  • neuro-evolutionary robotics is not yet routinely adopted in real-world applications

  • These methods use computer simulations to generate a neural network appropriate for the specific mission that the robots must accomplish. Once the neural network is generated (in simulation), it is installed on the physical robots and tested.

  • the result will be more general, less specialized to the simulator and therefore more likely to generalize well to reality. The simpler the better.
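The simulate-then-deploy loop above can be illustrated with a deliberately tiny example. This is a toy under stated assumptions, not a real pipeline: it evolves the two weights of a one-neuron "controller" against a stand-in simulator (fitness is just closeness of the output to a target), using mutation and truncation selection.

```python
# Toy neuro-evolution loop: evolve (w, b) for output = w*x + b.
import random

random.seed(0)

def controller(weights, x):
    """One linear neuron."""
    w, b = weights
    return w * x + b

def fitness(weights):
    """Stand-in simulator: reward outputs close to 1.0 for input 0.5."""
    return -abs(controller(weights, 0.5) - 1.0)

def evolve(generations=200, pop_size=20, sigma=0.1):
    pop = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]               # keep the best quarter
        pop = elite + [
            (w + random.gauss(0, sigma), b + random.gauss(0, sigma))
            for w, b in random.choices(elite, k=pop_size - len(elite))
        ]
    return max(pop, key=fitness)

best = evolve()
print(round(controller(best, 0.5), 2))  # close to 1.0
```

Real neuro-evolutionary robotics evolves full networks against a physics simulator before installing the result on hardware; the reality gap arises exactly because the evolved network can overfit the simulator's quirks.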

July 2              

Tuning collagen threads for biohybrid robots

  • biohybrid robots: incorporating actual muscles or neurons into a robotic system

  • renewable, biodegradable robots

  • connecting living muscle actuators to the robot, helping it to walk, jump, or swim.

  • you might need a material that is more muscle-like or more tendon-like.

June 22           

Smart elastomers are making the robots of the future more touchy-feely

  • smart materials

  • soft robotics

  • Our technology is based on smart polymer systems and enables us to create novel soft robotic tools that are lighter, more maneuverable and more flexible than the rigid components in use today

  • 'dielectric elastomer.'

  • these new robot tentacles are free to move in almost any direction

  • Our robot arms don't need to be driven by motors or by hydraulic or pneumatic systems—they can be powered simply by the application of an electric current. The elastomer muscles can also be produced in shapes that meet the requirements of a particular application. And they consume very little electric power.

May 27           

Researchers create robot that smiles back

  • Building trust

  • creating a convincing robotic face has been a formidable challenge for roboticists.

  • EVA can express the six basic emotions of anger, disgust, fear, joy, sadness, and surprise, as well as an array of more nuanced emotions, by using artificial "muscles" (i.e. cables and motors)

  • "I was minding my own business one day when EVA suddenly gave me a big, friendly smile," Lipson recalled. "I knew it was purely mechanical, but I found myself reflexively smiling back."

  • EVA uses deep learning artificial intelligence to "read" and then mirror the expressions on nearby human faces.

  • After several refinements and iterations, EVA acquired the ability to read human face gestures from a camera, and to respond by mirroring that human's facial expression.

May 19           

MOBLOT: A theoretical model that describes molecular oblivious robots

  • A theoretical model that is often used in robotics studies is OBLOT, an approach that represents robots as simple systems, all identical, without a memory and unable to communicate with each other.

  • is inspired by the ways in which atoms naturally arrange themselves to form matter.

  • The acronym stands for Molecular OBLivious robOTs

  • Robots can move to form more complex computational units (also called molecules in the model), which have spatial extent and different capabilities from individual robots.

  • Our end goal is to model a robotic matter that can change shape algorithmically. So far, no such theoretical model has been considered.
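The OBLOT abstraction that MOBLOT extends is easy to demonstrate. A minimal sketch, not from the paper: each robot is identical, memoryless, and on every activation computes its move purely from a snapshot of all current positions. Here the rule is the classic "step toward the centroid" used for gathering, run in fully synchronous rounds.

```python
# Oblivious robots: no memory, no communication, only a position snapshot.
def step(positions, i):
    """Robot i moves halfway toward the centroid of the snapshot.
    No state survives between activations."""
    cx = sum(p[0] for p in positions) / len(positions)
    cy = sum(p[1] for p in positions) / len(positions)
    x, y = positions[i]
    return ((x + cx) / 2, (y + cy) / 2)

robots = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]
for _ in range(10):                       # fully synchronous rounds
    robots = [step(robots, i) for i in range(len(robots))]
spread = max(abs(x - 2.0) + abs(y - 2.0) for x, y in robots)
print(spread < 0.01)  # True
```

MOBLOT's addition is that such robots can assemble into "molecules" with extent and richer capabilities; the snapshot-only rule above is the baseline those molecules build on.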

April 22          

Biohybrid soft robot with self-stimulating skeleton outswims other biobots

  • skeletal-muscle-based biobots

  • involved the use of skeletal or cardiac muscles

  • 3D printed the skeleton (which was made of a polymer called PDMS) and used it as a scaffold for growing skeletal muscles.

  • hybrid robots

April 13          

A robot that teaches itself to walk using reinforcement learning

  • the researchers began with a simulation of a robot in a virtual world

  • She learned how to keep from falling when slipping slightly, and how to recover when shoved from the side. She also learned to compensate when two of her motors were damaged.
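The trial-and-error learning described above can be shown at toy scale. This is a hypothetical illustration, not the paper's controller: a tabular agent learns, from reward alone, to push against whichever way a simplified robot is leaning (states -1, 0, +1), which is the essence of learning balance corrections by reinforcement.

```python
# Toy tabular reinforcement learning on a 1-D "balance" task.
import random

random.seed(0)
STATES = (-1, 0, 1)              # leaning left, upright, leaning right
ACTIONS = (-1, 0, 1)             # push left, do nothing, push right
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def reward(state, action):
    """+1 for the corrective push (or standing still when upright), else -1."""
    return 1.0 if action == -state else -1.0

alpha, epsilon = 0.2, 0.2
for _ in range(2000):
    s = random.choice(STATES)
    if random.random() < epsilon:                  # explore
        a = random.choice(ACTIONS)
    else:                                          # exploit
        a = max(ACTIONS, key=lambda x: Q[(s, x)])
    Q[(s, a)] += alpha * (reward(s, a) - Q[(s, a)])

policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
print(policy)  # {-1: 1, 0: 0, 1: -1}
```

The real robot learned a far richer policy in a physics simulation before transferring to hardware, but the mechanism is the same: actions that keep it upright earn reward and get reinforced.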

March 2          

A world first: A robot able to 'hear' through the ear of a locust

  • biological systems have a huge advantage over technological systems—both in terms of sensitivity and in terms of energy consumption.

  • The principle we have demonstrated can be used and applied to other senses, such as smell, sight and touch.

2020

 

August 12       

The Trials of BINA 48

“A Documentary Exploring the Legal Rights and Responsibilities of a Conscious Artificial Intelligence.”

                      
