Are employees really less upset losing their jobs to robots than to other humans?


Study assesses psychological impact of job losses through technology

Generally speaking, most people find the idea of workers being replaced by robots or software worse than the idea of those jobs being taken over by other workers. But when their own jobs are at stake, people would rather be replaced by robots than by another employee. That is the conclusion of a study by the Technical University of Munich (TUM) and Erasmus University Rotterdam.

Over the coming decades, millions of jobs will be threatened by robotics and artificial intelligence. Despite intensive academic debate on these developments, there has been little study of how workers react to being replaced by technology. To find out, business researchers at TUM and Erasmus University Rotterdam conducted 11 scenario studies and surveys with more than 2,000 people from several countries in Europe and North America. Their findings have now been published in the renowned journal Nature Human Behaviour.

Human replacements pose greater threat to feeling of self-worth

The study shows that, in principle, most people view it more favorably when workers are replaced by other people than by robots or intelligent software. This preference reverses, however, when people’s own jobs are at stake: the majority of workers then find it less upsetting to see their own jobs go to robots than to other employees. In the long term, however, the same people see machines as the greater threat to their future role in the workforce. These effects can also be observed among people who have recently become unemployed.

The researchers were also able to identify the causes behind these seemingly paradoxical results: people tend to compare themselves less with machines than with other people. Consequently, being replaced by a robot or by software poses less of a threat to their feeling of self-worth. This reduced self-threat was observed even when participants assumed that they were being replaced by other employees who relied on technological abilities such as artificial intelligence in their work.

Weaker organized resistance?

“Even when unemployment results from the introduction of new technologies, people still judge it in a social context,” says Christoph Fuchs, a professor at the TUM School of Management and one of the authors of the study. “It is important to understand these psychological effects when trying to manage the massive changes in the working world to minimize disruptions in society.”

For example, the insights could help to design better programs for the unemployed. “For people who have lost their job to a robot, boosting their self-esteem will be less of a priority,” says Fuchs. “In that case it is more important to teach them new skills that will reduce their concerns about losing out to robots in the long term.”

The study could also serve as a starting point for further research on other economic topics, says Fuchs: “It is conceivable that employee representatives’ responses to job losses attributed to automation will tend to be weaker than when other causes are involved, for example outsourcing.”

Learn more: Employees less upset at being replaced by robots than by other people




Introducing ‘Operator 4.0,’ a tech-augmented human worker

Technology can help workers in many ways.
Romero, Stahre, Wuest, et al., CC BY-ND

Thorsten Wuest, West Virginia University; David Romero, Instituto Tecnológico y de Estudios Superiores de Monterrey, and Johan Stahre, Chalmers University of Technology

The Fourth Industrial Revolution has arrived. The first was driven by the steam engine; the second brought the innovations of Henry Ford’s assembly line; the third put microelectronics and computer power on factory floors. Now, manufacturing businesses are beginning to integrate robotics, automation and other data-driven technologies into their workflows.

Robots have taken over difficult, dangerous and repetitive physical tasks, improving factory safety, worker comfort and product quality. The next phase of labor innovation will do the same thing for cognitive work, removing mentally stressful and repetitive tasks from people’s daily routines.

Human work will become more versatile and creative. Robots and people will work more closely together than ever before. People will use their unique abilities to innovate, collaborate and adapt to new situations. They will handle challenging tasks with knowledge-based reasoning. Machines enabled by the technologies that are now becoming commonplace – virtual assistants like Siri and Alexa, wearable sensors like FitBits and smart watches – will take care of tedious work details.

People will still be essential on the factory floors, even as robots become more common. Future operators will have technical support and be super-strong, super-informed, super-safe and constantly connected.

We call this new generation of tech-augmented human workers, both on factory floors and in offices, “Operator 4.0.” There are several types of enhancements available, which can be used individually or in combination to put humans at the heart of this technological revolution.

Super strong

This Hyundai wearable robot can help a human worker lift very heavy items.

One straightforward enhancement would let workers wear robotic exoskeletons to enhance their strength. A “super-strength operator” could let a human truly control the physical power of a large robot. In today’s warehouses and construction sites, workers risk injury and exhaustion by handling heavy objects themselves. Or they are forced to compromise, using a more powerful tool with less adaptability, like a forklift.

The benefits go well beyond the workplace. Of course, a worker in a powered robotic suit could easily handle extremely heavy objects without losing the flexibility of natural human movements. The worker would also be far less likely to suffer severe injuries from accidents or overwork. And at the end of a day, a super-strength worker could take off the exoskeleton and still have energy to play with the kids or spend time with friends.

Super informed

Fighter pilots use heads-up displays, which provide them with crucial information right on the cockpit windshield and directly in their line of sight. This is “augmented reality,” because it displays information within a live view of the world. It used to be very specialized and expensive technology. Now, Microsoft’s HoloLens makes it available for consumers.

An “augmented operator” can get directions or assistance without interrupting the task he or she is working on. Often, when new equipment or processes are developed, trainers need to travel long distances to factories, staying for weeks to teach workers what to do. Designers do the same, getting feedback for refinements and improvements. All that travel takes up a huge amount of time and is extremely expensive. With augmented reality available, it is often unnecessary.

Augmented reality on the job.

A worker wearing a set of smart glasses can receive individualized, step-by-step instructions displayed right in front of their eyes, no matter where they are looking. With earbuds and a microphone, they could talk directly to trainers in real time.

Super safe

Many manufacturing environments are hazardous, involving heavy equipment, caustic chemicals and other dangers that can maim and kill human workers. A “healthy operator” may be equipped with wearable sensors tracking pulse rate, body temperature, chemical exposure or other factors that indicate risks of injury.

This type of system is already available: Truck drivers can wear the Maven Co-Pilot, a hands-free headset that detects fatigue symptoms, like head-bobbing movements. It can also ensure drivers check their rear-view mirrors regularly to stay aware of nearby traffic. It can even provide reminders to take scheduled breaks. This helps keep the truck’s driver safe and improves everyone else’s road safety.
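A fatigue monitor of this kind boils down to watching a stream of sensor readings for a telltale pattern. The sketch below shows one simple way such head-bobbing detection could work: count threshold-crossing pitch dips inside a sliding window of samples. All function names, thresholds, and window sizes here are illustrative assumptions, not the Maven Co-Pilot’s actual algorithm.

```python
from collections import deque

def head_bob_alert(pitch_stream, window=20, threshold_deg=15.0, min_dips=3):
    """For each sample, check how many readings in the recent window show
    the head pitched forward past a threshold; flag fatigue when enough
    dips accumulate. Thresholds are illustrative, not the device's."""
    recent = deque(maxlen=window)   # sliding window of recent pitch angles
    alerts = []
    for pitch in pitch_stream:
        recent.append(pitch)
        dips = sum(1 for p in recent if p > threshold_deg)
        alerts.append(dips >= min_dips)
    return alerts
```

A real device would of course filter sensor noise and combine several signals (eye closure, steering input), but the windowed-threshold pattern is a common starting point.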

And beyond…

Possibilities are limitless. An “analytical operator” would wear a monitor showing real-time data and analytics, such as information on chemicals in a sewage treatment plant or pollutants at an incinerator. A “collaborative operator” may be linked to collaborative robots, or co-bots, like the assembly assistant YuMi. A “smarter operator” could be equipped with an intelligent virtual personal assistant, like an advanced Siri or Alexa.

There does not have to be conflict between robots and humans, with machines taking people’s jobs and leaving them unemployed. Technology should be designed with collaboration in mind. That way, companies and workers alike will be able to capitalize on the respective strengths of both human and machine. What’s more, the inherent flexibility of “Operator 4.0” workers will also help to ensure workplaces of the future that can change and adapt. That means getting ever more efficient and safer, as new technologies emerge.

Thorsten Wuest, Assistant Professor & J. Wayne and Kathy Richards Faculty Fellow in Engineering, West Virginia University; David Romero, Professor of Advanced Manufacturing, Instituto Tecnológico y de Estudios Superiores de Monterrey, and Johan Stahre, Professor of Production Systems, Chalmers University of Technology

This article was originally published on The Conversation. Read the original article.




No humans would be harmed in the operation of these robots

Elastic machines: Membranes surrounding sealed, air-filled chambers can be used as actuators, facilitating risk-free contact between humans and robots. Compliant electrodes are attached to each side of the membrane and cause it to stretch when voltage is applied. The membranes are bistable, meaning that they can enclose two different volumes at the same air pressure. A membrane switches from its more compact state to its stretched state when voltage is applied to its electrodes. Even in the case of three or more linked, bubble-shaped chambers, one can be controlled in this way so that it inflates to a larger volume, thereby exerting force.
Photo: Alejandro Posada

A soft actuator using electrically controllable membranes could pave the way for machines that are no danger to humans

In interacting with humans, robots must first and foremost be safe. If a household robot, for example, encounters a human, it should not continue its movements regardless, but rather give way in case of doubt. Researchers at the Max Planck Institute for Intelligent Systems in Stuttgart are now presenting a motion system – a so-called elastic actuator – that is compliant and can be integrated in robots thanks to its space-saving design.

The actuator works with hyperelastic membranes that surround air-filled chambers. The volume of the chambers can be controlled by means of an electric field at the membrane. To date, elastic actuators that exert a force by stretching air-filled chambers have always required connection to pumps and compressors to work. A soft actuator such as the one developed by the Stuttgart-based team means that such bulky payloads or tethers may now be superfluous.

Many robots have become indispensable, and it is accepted that they may be dangerous to humans in their workspace. In the automotive industry, for example, they assemble cars with speed and reliability, but are well shielded from direct contact with humans. These robots go through their motions precisely and relentlessly, and anyone who gets in the way could be seriously injured.

Robots with soft actuators that cannot harm humans, on the other hand, are tethered by pneumatic hoses and so their radius of motion is restricted. This may be about to change. “We have developed an actuator that makes large changes in form possible without an external supply of compressed air”, says Metin Sitti, Director at the Max Planck Institute for Intelligent Systems.

The new device consists of a dielectric elastomer actuator (DEA): a membrane made of hyperelastic material like a latex balloon, with flexible (or ‘compliant’) electrodes attached to each side. The stretching of the membrane is regulated by means of an electric field between the electrodes, as the electrodes attract each other and squeeze the membrane when voltage is applied. By attaching multiple such membranes, the place of deformation can be shifted controllably in the system.

Air is displaced between two chambers

The researchers are helped in this by the fact that their membrane material has two stable states. In other words, it can hold two different volumes at a given pressure without tending to shrink back from the larger one. This is a little like letting the air out of an inflated balloon; it does not return to its original size, but remains significantly larger. Thanks to this bistability, the researchers are able to move air between a more highly inflated chamber and a less inflated one. They do this by applying a voltage to the membrane of the smaller chamber, which responds by stretching and sucking air out of the other bubble. When the power supply is switched off the membrane contracts, but not to its original volume; it remains larger, corresponding to its stretched state.
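The bookkeeping behind this air transfer is easy to state: the energized chamber inflates at the expense of the others while the total enclosed air stays constant. The toy model below sketches that behaviour for a chain of linked chambers; the transfer fraction and the perfectly lossless accounting are simplifying assumptions, not measured membrane physics.

```python
def actuate(chambers, target, transfer=0.8):
    """Toy model of the linked-chamber actuator: 'applying voltage' to
    chamber `target` stretches its membrane, sucking air out of every
    other chamber. Volumes are in arbitrary units; total air is
    conserved. The transfer fraction is an illustrative assumption.
    """
    moved = 0.0
    for i, volume in enumerate(chambers):
        if i != target:
            delta = volume * transfer * 0.5  # each donor gives up part of its air
            chambers[i] -= delta
            moved += delta
    chambers[target] += moved                # energized chamber inflates
    return chambers
```

Running it on three equal chambers shows the controlled chamber growing while the donors shrink, with the total volume unchanged.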

“It is important to find suitable hyperelastic polymers that will enable strong and fast deformation and be durable,” points out Metin Sitti. With this in mind, the team has tested different membrane materials and also used models to systematically record the behaviour of the elastomer in the actuator.

Thus far, the elastomers tested by Sitti’s team have each had a mix of advantages and disadvantages. Some show strong deformation, but at a slow rate. Others work fast, but their deformation is more limited. “We will combine different materials with a view to combining different properties in a single membrane,” says Sitti. This is, however, just one of the next steps he and his team have in mind. They also plan to integrate their actuator in a robot so that it can, for instance, move its legs but still give way if it happens to come across a human. Only then can machine-human interactions be risk-free.

Learn more: Gentle strength for robots





Robots and Humans Working Together: Radar with 360° Vision

W-Band-Radar Demonstrator via Fraunhofer is smaller than a pack of cigarettes

W-Band-Radar Demonstrator via Fraunhofer is smaller than a pack of cigarettes

Nowadays it is impossible to imagine industry without robots. Safety laser scanners mostly safeguard dangerous areas and protect people from collisions. But optical sensors have their limitations, for instance when plastic surfaces, dust or smoke obstruct their line of sight. Fraunhofer researchers have developed a new, high-frequency radar scanner that cuts through these obstacles. It can monitor its environment in a 360-degree radius, making it ideal for safety applications wherever people and robots work together.

Increasing connectivity of production systems in “smart” industry 4.0 operations is driving the interaction between people and machines. The trend is moving towards industrial robots that operate without protective barriers. A prerequisite for this level of co-working is that people must not be endangered at any time – but that is precisely the Achilles’ heel of collaboration between people and robots. Currently, laser scanners are used to monitor the danger zone around machinery, and to stop the machine as soon as a person enters the zone. However, optical sensors do not always achieve reliable results under changing light conditions. They also do not work if smoke, dust or fog limits visibility.

Researchers at the Fraunhofer Institute for Applied Solid State Physics IAF have developed a compact modular 360-degree radar scanner that is superior to optical sensors in many respects. This makes it a perfect choice for safety applications for human-machine collaboration. The radar works with millimeter waves that are reflected by the objects to be observed, such as people (see box: Radar with 360-degree vision). Transmitted and received signals are processed and evaluated using numerical algorithms. Based on the calculations, it is possible to determine the distance, position and speed of the objects. If several radar units are used, an object’s location in the room can also be determined as can the direction in which it is moving.
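The quantities the article mentions follow from two textbook radar relations: distance from the echo’s round-trip time, and radial speed from the Doppler shift of the reflected wave. The sketch below applies both; the 94 GHz carrier is an assumed typical W-band value, since the article does not state the exact frequency.

```python
C = 3.0e8          # speed of light, m/s
F_CARRIER = 94e9   # assumed W-band carrier frequency, Hz (not from the article)

def target_range(round_trip_s):
    """Distance to the target: the wave travels out and back, so the
    range is half the round-trip time multiplied by the speed of light."""
    return C * round_trip_s / 2.0

def radial_speed(doppler_shift_hz):
    """Radial velocity from the Doppler shift of the reflection:
    v = c * f_d / (2 * f_carrier)."""
    return C * doppler_shift_hz / (2.0 * F_CARRIER)
```

For example, an echo returning after 200 nanoseconds corresponds to a target 30 meters away; position and direction of motion then come from combining such measurements across several radar units, as the article describes.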

“Our radar is not focused on one point. Instead, it sends out millimeter waves in a broad, club-shaped beam. Unlike a laser scanner, the signals are reflected even when visibility is obstructed by an object,” explains IAF scientist Christian Zech. A laser scanner can reliably measure the distance and position of a target – a person, for instance – only if the target is in an unobstructed line of sight. However, IAF’s 360-degree radar can penetrate optically opaque material (see box), which means it can identify the employee even if there are boxes, cardboard walls or other obstacles in the way.

High-frequency board technology for cost-effective systems

Previous millimeter wave radar systems – based on waveguides – are bulky and expensive. IAF’s scanner has a diameter of only 20 centimeters and is 70 centimeters high. The high-frequency module featuring indium gallium arsenide semiconductor technology is no larger than a pack of cigarettes and is located in the base of the scanner. “These days, millimeter wave applications are dominated by waveguides that are extremely expensive to produce. Thanks to a cost-effective mounting and interconnection technology as well as specially developed circuit boards, we can replace the waveguides with our high-frequency module that has been integrated onto a board measuring just 78 x 42 x 28 millimeters,” says Zech. The high-frequency module, which is the key component of the radar scanner, was developed by IAF researchers in close collaboration with the Fraunhofer Institutes for Reliability and Microintegration IZM and for Manufacturing Engineering and Automation IPA.

In addition to the signal processor, the complete system comprises a transmitting and receiving antenna with a dielectric – that is, electrically non-conducting – lens. A rotating mirror mounted at a 45-degree angle deflects the millimeter waves and sweeps them around the entire room. Thanks to the use of a dielectric antenna, the angle of aperture can be freely selected. That means nearby objects as small as a centimeter can be detected as easily as large surfaces far away. The system’s range depends on the application and can be up to several hundred meters.

The scanner includes an Ethernet interface and is therefore suitable for industry 4.0 applications.

Precise distance measurement

In order to evaluate the measurement accuracy and reliability of the 360-degree radar, the researchers carried out hundreds of measurements in the lab. Maximum deviation from the mean was less than a micrometer; standard deviation was 0.3 micrometers.
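Figures like these are computed from the set of repeated measurements in the usual way: the largest absolute deviation from the sample mean, and the standard deviation across the sample. A minimal helper, shown with made-up sample data rather than the lab’s actual measurement series:

```python
import statistics

def deviation_stats(measurements_um):
    """Maximum deviation from the mean, and population standard
    deviation, for a series of repeated distance measurements
    (values in micrometers). Sample data here is illustrative."""
    mean = statistics.fmean(measurements_um)
    max_dev = max(abs(m - mean) for m in measurements_um)
    return max_dev, statistics.pstdev(measurements_um)
```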

The researchers will present a system demonstrator at Hannover Messe (Hall 2, Booth C16/C22) from April 25-29, 2016 and again at the SENSOR+TEST in Nuremberg (Hall 5, Booth 5-248) from May 10-12, 2016.

Learn more: Radar with 360° vision





Open-source software that turns a human’s motions into instructions a robot can understand

Madeline Gannon via CMU

Madeline Gannon via CMU

Madeline Gannon (A 2016), a Ph.D. candidate in Carnegie Mellon University’s School of Architecture, has put the power of interacting with robots into our hands — literally.

Now programming robots is not just for those with years of coding knowledge, it’s for anyone who wants to experience what it’s like to simply wave at a robot and have it wave back.

Gannon designed Quipt, open-source software that turns a human’s motions into instructions a robot can understand. She designed it while in residence at Autodesk Pier 9 in San Francisco.

When she left for her residency, she had been working with industrial robots at Carnegie Mellon University for a few years. She was close to making a big change.

“I wanted to invent better ways to talk with machines that can make things. Industrial robots are some of the most adaptable and useful to do that,” she said.

But they are also some of the most dangerous. The U.S. Department of Labor has a special website devoted to “Industrial Robots and Robot System Safety.” These robots are big, and they have to be programmed by people with years of training.

That programming takes place “basically with a joystick,” according to Gannon. Programmers move the robot to a place, record a point and iteratively build up a motion path for the robots to remember.

“Then the robot will repeat that task 24/7. That is their world,” Gannon said.

But not anymore. Quipt replaces the joystick technique. Its software links the robot to a motion capture system: cameras that observe the space and let the robot see where it is.

“I gave this robot — this big, powerful dumb robot — eyes into the environment,” Gannon said.

When the robot looks with its motion-capture eyes, it sees tracking markers on a person’s hand or clothes. Now it can track a person while remaining a certain distance away, it can mirror a movement, or it can be told to avoid markers.
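The “track a person while remaining a certain distance away” behaviour can be pictured as a simple control loop: each frame, move the robot along the line toward the marker in proportion to how far the separation deviates from the target distance. Quipt’s actual control law is not published in the article, so the proportional controller below is only an illustrative sketch, with all parameter names and values assumed.

```python
import math

def follow_step(robot_xy, marker_xy, keep_distance=0.5, gain=0.2):
    """One step of a follow-at-a-distance behaviour: nudge the robot
    along the line to the tracked marker until their separation equals
    keep_distance. Gain and distances are illustrative assumptions."""
    dx = marker_xy[0] - robot_xy[0]
    dy = marker_xy[1] - robot_xy[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return robot_xy                  # on top of the marker; no direction
    error = dist - keep_distance         # positive: too far away, move closer
    step = gain * error
    return (robot_xy[0] + step * dx / dist,
            robot_xy[1] + step * dy / dist)
```

Iterating this step converges smoothly on the keep-distance, which is the qualitative behaviour described above: the robot maintains a safe, fixed standoff from the tracked person.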

This means these robots are potentially a lot safer, and a lot smarter. Gannon imagines a world where they aren’t just welding parts on an assembly line.

“I think what’s really exciting is taking these machines off of control settings and taking them into live environments, like classrooms or construction sites,” Gannon said.

Gannon collaborated with visiting artist Addie Wagenknecht and the Frank-Ratchye STUDIO for Creative Inquiry to develop a robot that could rock a baby’s cradle according to the sound of the baby’s cry.

This software is a cousin to another of Gannon’s projects that makes technology more hands-on: last year Gannon released Tactum, which takes the software guesswork out of 3-D printing. Tactum projects an image directly onto your body, and with your own hands you can manipulate the image until it fits or looks exactly how you like. A projector produces the image on your skin, a sensor detects how you’re touching it, and the software updates the 3-D model you’re creating. When you’re ready to print, you simply close your hand and your design goes to the 3-D printer.

Gannon was drawn to CMU’s College of Fine Arts when the School of Architecture added new fabrication equipment.

“I felt like I had the keys to the candy shop,” she said.

“My research is really playing in the field of computer science and robotics, but the questions I’m able to ask those specific domains is conditioned by my architectural background. It’s really a spatial answer, how to control or interact with a robot. That, in my mind, is an architectural answer to this problem,” she said.

Golan Levin, director of the Frank-Ratchye STUDIO for Creative Inquiry at CMU, is one of Gannon’s doctoral thesis advisors. He thinks her work could change how people design architecture, clothing and furniture, as well as influence industrial design and the arts.



