Chapter 2: Android (robot)
An android is a humanoid robot, that is, a robot designed to resemble a human in form.
According to the Oxford English Dictionary, the term was first used (as "Androides") in Ephraim Chambers' Cyclopaedia in 1728, in reference to an automaton that Saint Albertus Magnus was said to have built. The word "android" was applied to small humanoid toy automatons in United States patents as early as 1863. A being that blends biological and mechanical elements, by contrast, is more commonly called a "cyborg" (short for "cybernetic organism") or a "bionic man."
The term "droid," popularized by George Lucas in the first Star Wars film and now used extensively in science fiction, originated as an abbreviation of "android." However, Lucas and others have used it to mean any robot, including machines with distinctly non-human forms such as R2-D2. The term "android" also appeared in the Star Trek: The Original Series episode "What Are Little Girls Made Of?". The disparaging nickname "andy" was coined by Philip K. Dick in his novel Do Androids Dream of Electric Sheep? and has since seen occasional further use, for example in the television series Total Recall 2070.
According to Eric G. Wilson, who refers to an android as a "synthetic human being," there are three distinct varieties of androids that may be categorized by the materials that make up their bodies:
the mummy type, made of "dead objects" or "stiff, inanimate, natural material," such as mummies, puppets, dolls, and sculptures;
the golem type, formed from flexible, presumably living material, including golems and homunculi;
the automaton type, a mixture of dead and living parts, including robots and automatons.
Although human morphology is not necessarily the ideal form for a working robot, the fascination with developing robots that mimic it can be traced historically to the assimilation of two concepts: simulacra (devices that exhibit likeness) and automata (devices that have independence).
Several initiatives have been started, and others are now under way, with the intention of developing humanoid robots that mimic human characteristics in appearance, behavior, and speech to some extent.
Japanese robotics has been at the forefront of the field since the 1970s.
The Intelligent Robotics Lab at Osaka University, directed by Hiroshi Ishiguro, and the Kokoro company demonstrated the Actroid at Expo 2005 in Aichi Prefecture, Japan, and later launched the Telenoid R1 in 2010. In 2006, Kokoro developed a new android, DER 2. The human-body portion of DER2 stands 165 centimeters tall and has 47 points of articulation. DER2 can not only change its facial expression but also move its hands and feet and rotate its body. The actuators are driven by the "air servosystem" that Kokoro originally developed, in which a servo system controls each actuator precisely using air pressure; as a direct result, the movement is very smooth and produces little noise. By using smaller-diameter cylinders, DER2 achieved a slimmer body than its predecessor and looks better proportioned from the outside. Compared with the previous model, DER2 also has slimmer arms and a wider repertoire of facial expressions. Once programmed, it can choreograph its motions and gestures along with its voice.
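A pneumatic "air servosystem" of the kind described implies a closed feedback loop: the controller repeatedly measures the actuator's position and adjusts the valve in proportion to the remaining error, which is what yields smooth, quiet convergence on a target pose. The following is a minimal, hypothetical sketch of such a loop; the function names, gain, and simplified plant model are invented for illustration and are not Kokoro's actual design.

```python
# Hypothetical sketch of a pneumatic servo feedback loop: the controller
# measures position each tick and issues a valve command proportional to
# the remaining error. Gains and the plant model are illustrative only.

def servo_settle(target, position=0.0, gain=2.0, dt=0.01, steps=300):
    """Proportional position control of a simplified pneumatic actuator."""
    for _ in range(steps):
        error = target - position          # measured tracking error
        position += gain * error * dt      # valve command nudges the piston
    return position

print(round(servo_settle(1.0), 2))  # settles very close to the 1.0 target
```

Because the correction shrinks with the error, the actuator decelerates as it approaches the target rather than banging to a stop, which is one reason servo-controlled pneumatics can be both smooth and quiet.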
An android head called Saya was developed by the Intelligent Mechatronics Lab at the Tokyo University of Science, directed by Hiroshi Kobayashi, and was exhibited at the Robodex 2002 convention in Yokohama, Japan. A number of humanoid research and development projects are currently under way around the world, and they are expected to yield a broader range of realized technologies in the near future. Saya now works as a guide at the Tokyo University of Science.
Waseda University in Japan, in collaboration with NTT Docomo's manufacturers, has developed a shape-shifting robot called WD-2 that can change the appearance of its face. The designers first chose the locations of the points needed to convey a person's profile, including the eyes, nose, and other facial features; the robot expresses a face by shifting all of these points to the chosen positions. The first prototype was built and tested in 2003, and about a year later the team made a number of significant adjustments and improvements to the design. The robot's elastic mask is modeled on a standard head dummy, and the drive system uses a three-degrees-of-freedom unit. WD-2 can alter its facial features by selectively actuating points on the mask; each point has three degrees of freedom, allowing the robot to take on a wide variety of expressions, and its 17 facial points give it 56 degrees of freedom in all. The facemask itself is made of Septom, a highly elastic polymer with steel wool particles mixed in for added strength. Behind the mask, a DC motor with a simple pulley and a sliding screw drives a shaft to the required position at each facial point. The researchers can also shape the mask to match real human faces: a 3D scanner determines the positions of an individual's 17 facial points, which are then driven into place by a laptop and 56 motor control boards, effectively "copying" the face.
When a photograph of a person's face is superimposed onto a 3D mask, the researchers note that the shape-shifting robot is even capable of displaying the person's facial hair and skin tone.
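The face-copying workflow described above amounts to a mapping problem: for each named facial point, compare its current position on the mask with the position measured by the 3D scan, and command the corresponding motors to close the gap. The sketch below illustrates that idea; the point names, coordinates, and data structures are invented for illustration and do not reflect WD-2's actual software.

```python
# Hypothetical sketch of WD-2-style face copying: each facial point has
# three degrees of freedom, and scanned target positions determine the
# per-axis displacements the motors must produce. All names/coordinates
# here are illustrative, not the real WD-2 interface.

from dataclasses import dataclass

@dataclass
class FacialPoint:
    name: str
    x: float
    y: float
    z: float  # depth of the shaft pushing against the elastic mask

def motor_commands(current, scanned):
    """Return per-axis displacements that move each mask point from its
    current position to the position measured by the 3D scan."""
    commands = {}
    for name, target in scanned.items():
        p = current[name]
        commands[name] = (target[0] - p.x, target[1] - p.y, target[2] - p.z)
    return commands

# Neutral mask pose (illustrative coordinates, in millimetres)
mask = {"nose_tip": FacialPoint("nose_tip", 0.0, 0.0, 10.0)}
# Target from a hypothetical 3D scan of a real face
scan = {"nose_tip": (0.0, -2.0, 14.0)}

print(motor_commands(mask, scan))  # {'nose_tip': (0.0, -2.0, 4.0)}
```

With 17 such points, each displacement tuple would be dispatched to the motor control boards driving that point's shafts, which is consistent with the laptop-plus-56-boards setup the researchers describe.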
Prof. Nadia Thalmann of Nanyang Technological University led the Institute for Media Innovation and the School of Computer Engineering in creating the social robot Nadine. The software that drives Nadine is much like that behind Apple's Siri and Microsoft's Cortana. In the future, Nadine may work as a personal assistant in offices and homes, or as a companion for people of all ages, including children and the elderly.
Associate Professor Gerald Seet, of the School of Mechanical and Aerospace Engineering and the BeingThere Centre, led the three-year research and development of EDGAR. A remote user can control EDGAR, with the user's face and expressions displayed on the robot's face in real time while the robot mimics the user's upper-body movements.
EveR-1 is an android interpersonal-interaction model researched and developed by KITECH. It can replicate human emotional expression through facial "musculature" and can hold basic conversation with a vocabulary of around 400 words. At 160 centimeters and 50 kilograms, she matches the average height and weight of a Korean woman in her twenties. The name EveR-1 combines the biblical figure Eve with the letter r for robot. EveR-1's powerful computational processing supports speech recognition and voice synthesis while simultaneously handling lip synchronization and visual identification through 90-degree micro-CCD cameras equipped with face-recognition technology. An independent microchip inside her artificial brain manages her facial expressions, bodily coordination, and emotional expression. Her whole body is constructed of a highly advanced synthetic jelly silicon, and with sixty artificial joints in her face, neck, and lower body she can display realistic facial expressions and sing and dance at the same time. South Korea's Ministry of Information and Communication announced an ambitious plan to place a robot in every home by the year 2020.
Great Moments with Mr. Lincoln, built by Walt Disney and his team of Imagineers, was an attraction first shown at the 1964 New York World's Fair.
Maria Bot is a virtual-being android with sophisticated facial expressions and head movement that engages users in conversation on a wide range of topics. She resembles a human from the shoulders up. She uses artificial intelligence to analyze and synthesize information in order to make her own judgments about how to talk to and interact with people. She gathers information through conversations, direct data inputs such as books or articles, and online sources.
A worldwide high-tech corporation created Maria Bot for Barry with the goal of helping to improve educational standards and eradicate educational deprivation. Students gain new opportunities to engage with and explore the ethical concerns raised by the growing presence of...