It was only a matter of time before our machines became not only a functional but also an emotional part of our lives. As devices become more and more personalized, we are drawn toward them in the way we are drawn to the people in our lives. “It knows me,” this personalization seems to say. And as these devices gain recognition features, where do we draw the line between functional robotics and love? After all, don’t robots exist to fill a gap in our lives or make them more efficient? And aren’t humans sometimes unpredictable and unreliable, unlike our machines?

The Artificial Intelligence and Robotics Technology Laboratory (AIART) in Taiwan has been developing a lovotics robot to further explore the human-robot relationship. This development involves an understanding of the physiology behind love, which of course is a complex combination of factors including hormones, affect, and emotion. According to their website, the artificial intelligence in the lovotics lab mimics the different human systems involved in love and includes the development of an Artificial Endocrine System (physiology), a Probabilistic Love Assembly (psychology), and an Affective State Transition (emotions).
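To make the idea of an artificial endocrine system adjusting to feedback concrete, here is a toy sketch. To be clear: this is not AIART’s actual Lovotics model; the hormone names, update rule, and numbers are all hypothetical, chosen only to illustrate levels that respond to interaction and drift back toward a baseline.

```python
# Toy illustration of a feedback-driven affective model -- NOT AIART's
# actual Lovotics implementation. Hormone names and update rules here
# are hypothetical, chosen only to show the general idea of an
# artificial endocrine system adjusting to input and feedback.

class ToyEndocrineSystem:
    def __init__(self):
        # Normalized "hormone" levels in [0, 1], starting at a neutral baseline.
        self.levels = {"dopamine": 0.5, "oxytocin": 0.5, "serotonin": 0.5}

    def update(self, stimulus, decay=0.05):
        """Nudge each level according to the stimulus (-1..1), then decay toward baseline."""
        for name, level in self.levels.items():
            level += 0.1 * stimulus          # feedback from the interaction
            level += decay * (0.5 - level)   # drift back toward the baseline
            self.levels[name] = min(1.0, max(0.0, level))

    def affection(self):
        """A single affective output derived from the hormone mix."""
        return sum(self.levels.values()) / len(self.levels)

robot = ToyEndocrineSystem()
for _ in range(10):
    robot.update(stimulus=1.0)   # repeated positive interaction
print(round(robot.affection(), 2))  # rises above the 0.5 baseline
```

The point of the sketch is the feedback loop: the “emotional” output is not fixed but shifts with the history of interaction, which is the behavior the lab describes.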

The lab has worked on mimicking a myriad of human hormones and on evaluating gestures and expressions. The psychological unit has looked into numerous parameters such as proximity, similarity, attachment, attraction, and reciprocal liking, among others. The robot not only models these human components, but adjusts them based on input and feedback. It’s amazing: is love as mysterious as we like to think, or a controllable environment of many components? This opens the window to questions about our future: will relationships as we know them change? See the video below.

And a video with more explanation:

For those who prefer independent living but require assistance with daily tasks such as eating, Swedish company Bestic has developed a robotic arm that assists with eating. It can be programmed for height, speed, and type of food, and allows the user to eat alongside others without requiring the assistance of another person. For those with musculoskeletal or neurological injuries and diseases that leave the upper body weak or with tremor, such as polio, MS, Parkinson’s, or ataxia, this allows for some normalcy at mealtime.

Those who are interested in trying the device can contact the makers through their website.

Watch the video below:

Controlling our environment with just our brains always seemed so far away, but since our brains produce electrical signals, it was only a matter of time before these could be converted for use in technology. Muse by InteraXon is a brain-sensing headband that uses EEG to detect changes in brainwaves and convert them into digital signals. The headband features six sensors, and using a tablet or PC, the changes in brainwaves can be monitored for mental acuity and relaxation exercises. In a time when our brains can easily fatigue from the constant multitasking and refreshing of the technology at hand, this could prove very valuable, allowing us to improve our concentration and get feedback should we lose our focus.
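The standard way EEG headbands turn raw voltage samples into a “calm vs. focused” signal is band-power analysis: relaxation is associated with alpha waves (roughly 8-12 Hz) and active concentration with beta waves (roughly 13-30 Hz). The sketch below shows that generic technique on a simulated signal; it is not Muse’s proprietary algorithm, and the sampling rate and thresholds are illustrative assumptions.

```python
# Generic sketch of EEG band-power analysis -- the standard technique
# behind brainwave "relaxation" feedback, not Muse's actual algorithm.
import numpy as np

def band_power(samples, fs, low, high):
    """Average spectral power of `samples` between `low` and `high` Hz."""
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    power = np.abs(np.fft.rfft(samples)) ** 2
    mask = (freqs >= low) & (freqs < high)
    return power[mask].mean()

fs = 256                     # Hz, a typical EEG sampling rate (assumed)
t = np.arange(fs * 2) / fs   # two seconds of data

# Simulated signal dominated by a 10 Hz alpha rhythm (associated with calm),
# with a weaker 20 Hz beta component mixed in.
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * np.sin(2 * np.pi * 20 * t)

alpha = band_power(eeg, fs, 8, 12)    # 8-12 Hz: relaxation
beta = band_power(eeg, fs, 13, 30)    # 13-30 Hz: active concentration
print("calm" if alpha > beta else "focused")  # prints "calm"
```

A real headband would run this continuously on short windows of data from each sensor, smoothing the result before feeding it back to the user.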

Future implications given on the website include controlling music, playing games, and changing home environments.

Home units can be pre-ordered now for $299. These headbands come in black or white, and include a Calm app and free basic software development kit.

Watch the video about the product below:

Our bodies have an amazing ability to recognize something as self or foreign, harmful or beneficial. However, interpreting this data and pinpointing specific diseases leads to the sometimes complicated world of diagnostics. The process of finding what disease, organism, or bacteria is present in the body, then recognizing and analyzing it, involves multiple systems. Funded by the U.S. National Science Foundation (NSF) and the U.K. Engineering and Physical Sciences Research Council, scientists are developing a small ‘biohybrid’ robot called the Cyberplasm, which uses living cells and technology to find and interact with bacteria and cells within our own bodies. Once something is identified, it can be reported back to an engineered nervous system for interpretation. Based on the form and function of the sea lamprey, a simple sea creature pictured above, the small robot will be able to swim through our bodies to record data and to find and identify disease.

This is no small feat. To achieve it, the Cyberplasm is equipped with synthetic muscle to propel it through the body, which requires the biological conversion of sugar to energy. Synthetic sensors scope the environment and report back to an electronic nervous system. This is all part of an engineering discipline called “synthetic biology,” in which man-made devices mimic life’s functions. Optoelectronic interfaces are being developed to adapt and respond to the changing environment as the robot swims through the body. The robot will be powered by microbial fuel cells, a renewable energy source in which bacteria convert chemical energy into electric current.


Researchers at Johns Hopkins University have spent years developing an amazing prosthetic limb called the Modular Prosthetic Limb, which mimics upper body movement controlled by thought. The project was funded by DARPA in 2005 to assist veterans who had sustained injuries and amputations to their arms. Actually using the device involves rerouting the body’s electrical signals to the prosthetic limb, and requires mental imagery exercises by the clients in order to develop the networks, similar to how we build them to control the limbs of our own bodies.

Mimicking upper body movement is incredibly difficult and a feat in and of itself. The fine movement and precision of an arm is much different from that of a leg. While the lower extremities are primarily used for ambulation and mobility, the main purpose of the hand and upper extremity is dexterity: grasping and reaching for objects, helping us eat, dress, and complete most daily tasks. The Modular Prosthetic Limb features more than 100 sensors and 26 degrees of freedom (variations in movement). It can open and close the hand, differentiate grasp as a human hand does, and has the strength of a human arm. In an incredibly complicated system, the device communicates with the brain as a natural limb would. The details of the development are discussed in their paper An Overview of the Developmental Process of the Modular Prosthetic Limb, a look into the arduous process of developing this product.
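The core idea of myoelectric control — reading the electrical activity of a muscle and mapping it to a limb command — can be sketched very simply. This toy is a drastic simplification: the real Modular Prosthetic Limb decodes far richer neural signals across 26 degrees of freedom, while this illustrative example (with made-up thresholds and gains) maps one simulated muscle signal to a single hand-closure command.

```python
# Drastically simplified sketch of myoelectric control: mapping the
# envelope of a muscle's electrical signal to a hand-closure command.
# Thresholds, gains, and the single degree of freedom are illustrative
# assumptions, not the Modular Prosthetic Limb's actual control scheme.
import math

def signal_envelope(samples, window=8):
    """Rectify the raw signal and smooth it with a moving average."""
    rect = [abs(s) for s in samples]
    return [sum(rect[max(0, i - window + 1): i + 1]) / min(window, i + 1)
            for i in range(len(rect))]

def closure_command(envelope_value, threshold=0.1, gain=2.0):
    """Map envelope amplitude to a grip closure in [0, 1] (proportional control)."""
    return max(0.0, min(1.0, gain * (envelope_value - threshold)))

# Simulated "muscle" signal: quiet at rest, then a strong contraction burst.
raw = [0.02 * math.sin(i) for i in range(40)] + \
      [0.6 * math.sin(2.5 * i) for i in range(40)]

env = signal_envelope(raw)
print(f"at rest: {closure_command(env[30]):.2f}")     # below threshold -> hand open
print(f"contracting: {closure_command(env[-1]):.2f}")  # strong signal -> hand closing
```

A real system repeats this decode-and-command loop continuously for every joint, which is what makes the mental imagery training described above necessary: the user must learn to produce signals the decoder can separate.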


Revolutionizing Prosthetics 2009 Modular Prosthetic Limb–Body Interface: Overview of the Prosthetic Socket Development (Johns Hopkins APL Technical Digest, Volume 30, Issue 3, pp. 240–249, 2011)

There are many situations, such as emergencies, where safety limits our ability to cross land, and normal vehicles have difficulty navigating uneven terrain. Google recently acquired Boston Dynamics, which describes itself as ‘an engineering company that specializes in building dynamic robots and software for human simulation.’ The company lists 9 robots on its website, each with a different shape, function, and speed for performing a variety of tasks. The machines are impressive in size and in their ability to handle uneven terrain and different inclines. One of these robots is “BigDog,” a crude-looking beast that weighs in at 240 lbs and stands about 2.5 feet tall, designed to mimic the size of a large dog or mule. This robotic animal ‘runs at 4 mph, climbs slopes up to 35 degrees, walks across rubble, climbs muddy hiking trails, walks in snow and water, and carries 340 lb load.’ Unlike a car, which is limited on uneven terrain, this machine can stand, walk, trot, run, crawl, and gallop. Components include a heat exchanger, engine, computer, actuators, leg springs, and force sensors. Simulated animal joints include the hip, knee, ankle, and foot. Though this machine is still being improved for noise, righting, autonomy, and navigating even rougher terrain, it could serve many purposes where we are currently limited.

Knightscope is a Silicon Valley-based company that has been developing a personal security robot for use in streets, schools, and other public areas. The five-foot surveillance robot includes facial recognition, a laser imaging sensor that can map an area in 3D, and thermal imaging, among other capabilities. After public tragedies occur, we often wish there had been something to predict and recognize that something was amiss. Among the many limitations of human security are our field of vision, our inability to see through crowds, and our sleep cycles. A machine that can scan crowds for security and raise flags the human eye cannot detect may help prevent small- and large-scale tragedies.

A mock control center displays some of the capabilities that this 500 lb robot has:

These machines are not for sale yet, and will be tested at the 2014 World Cup in Brazil.

ReWalker Oliver, Berlin, Germany


The purpose of this blog is to connect robotics to industry. However, as a physical therapist, I have to say I am personally invested in the amazing products that have been developed for rehabilitation in the past few years. The ReWalk is another great product that allows for the mobility of those who are otherwise wheelchair bound. A bionic exoskeleton with forearm crutches allows those with lower body impairments to stand, walk, and see others ‘face to face,’ as their website points out. Additionally, the ReWalk allows clients to participate in exercise that is otherwise unfeasible due to their physical limitations. Their models are available at a number of rehabilitation centers throughout the US, Europe, and Israel.

Update: The ReWalk has now been cleared by the FDA for personal use outside of rehabilitation centers.

This won’t be a very in-depth post, other than to say that I went to Barbot, an event that is part of the Bay Area Science Festival. While I can’t deny that actual bartenders are an integral part of socializing and the bar experience, it is pretty awesome to be served a cocktail by a robot. Perhaps it’s something for weddings and parties to entertain the crowd. I have to say, the robots served some stiff drinks. Some were programmed to make multiple cocktails adjusted for strength and flavor. My favorite was a Mai Tai robot. The photo does not do it justice.


As an adult, you are expected to enter situations with a certain composure. Hyperventilating or crying while getting your blood drawn is frowned upon, though many of us naturally have this reaction as someone is prodding our blood vessels, looking to extract the blood our body works so hard to make.

Veebot looks to make this a more predictable, efficient, and accurate process, which may quell the fears of the young and not-so-young alike when going in for a common blood draw. According to their website, 20-25% of phlebotomy procedures fail to draw blood on the first stick. That’s 20-25 out of every 100 procedures. Not very efficient.
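A quick back-of-the-envelope check puts that failure rate in perspective. Assuming each stick is an independent attempt (an assumption for illustration), the number of sticks per successful draw follows a geometric distribution, so the expected count is 1 divided by the per-stick success rate:

```python
# Expected sticks per successful blood draw, assuming independent
# attempts with the 20-25% per-stick failure rate quoted above.
for fail_rate in (0.20, 0.25):
    success = 1 - fail_rate
    expected_sticks = 1 / success  # mean of a geometric distribution
    print(f"{fail_rate:.0%} failure -> {expected_sticks:.2f} sticks on average")
# 20% failure -> 1.25 sticks on average
# 25% failure -> 1.33 sticks on average
```

In other words, roughly one extra stick for every three to four patients, which is the inefficiency Veebot is targeting.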

Veebot’s team has developed a robotic device with a viewing system that identifies and selects the best insertion site. Once a site is selected using lighting and ultrasound imaging, the machine can insert the needle. According to sources, the process takes about a minute. A technician must still be present to oversee the procedure, so, to assuage some fears, it will not just be you and a needle-sticking robot alone in a room together.