By: Jason Eberhard, Contributing writer

What has been the most challenging part of trying to develop a robot that can actually sit in front of me? The original problem was one of size: my hands are tiny, and a robot like this simply could not fit in one, which was enough to make me want to leap out of my chair.

Well, that turned out to be a short-lived problem.

Progress is happening much more rapidly now. One of the most exciting companies I have worked with lately is Neurala. If you’ve ever been to Neurala’s booth at an industry conference, you know the company is eager to introduce its neural network technology to the public.

Neurala’s hackable robot is built around a high-performance programmable computer that can control its own body movements and interact with human partners via video and audio almost instantaneously. The robot could potentially serve as a workstation or as a voice-based device for medical diagnostics.

In addition to its Siri-like voice recognition capabilities, the neural-network-based computer can also recognize a person’s unique touch and voice patterns. That could let the robot work interchangeably with just about anyone.
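The article doesn’t describe how that recognition works under the hood, but the general idea, learning to tell people apart from characteristic patterns, can be illustrated with a toy sketch. Everything here is hypothetical: the two-user setup, the made-up “voice features” and the tiny network come from me, not from Neurala.

```python
# Illustrative sketch only: a tiny neural-network classifier that learns to
# tell two hypothetical users apart from simple "voice feature" vectors.
# A real system would extract such features from audio; these are synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Pretend features: [average pitch (Hz), speaking rate (syllables/sec), loudness]
user_a = rng.normal(loc=[120.0, 3.5, 0.6], scale=[8.0, 0.3, 0.05], size=(50, 3))
user_b = rng.normal(loc=[210.0, 4.2, 0.4], scale=[10.0, 0.3, 0.05], size=(50, 3))

X = np.vstack([user_a, user_b])
y = np.array([0] * 50 + [1] * 50)   # 0 = user A, 1 = user B

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)

# A new utterance with user-B-like features should come back labeled 1.
print(clf.predict([[205.0, 4.0, 0.41]]))
```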

I may have a full set of hands and legs, but any robot that acts as a roommate would be challenging for one reason alone: your hands are tiny.

Scientists at the University of California-San Diego robotics laboratory have trained their robot by tapping into the specific activity patterns of neurons that are involved in decision-making.
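The column doesn’t spell out the lab’s training method, so take the following only as a loose illustration of how decision-making can be learned from reward feedback: a simple value-update rule, of the kind used in reinforcement learning, figures out which of two actions pays off more often. The scenario and all the numbers are invented for the example.

```python
# Illustrative sketch, not the UCSD lab's method: a robot-like agent learns
# which of two actions ("reach left" vs. "reach right") succeeds more often,
# using a simple value-update rule driven by reward feedback.
import numpy as np

rng = np.random.default_rng(1)

# True (hidden) probability that each action succeeds; invented for the demo.
success_prob = [0.3, 0.8]

q = np.zeros(2)     # estimated value of each action
alpha = 0.1         # learning rate
epsilon = 0.1       # exploration rate

for step in range(2000):
    # Occasionally explore; otherwise pick the action currently valued highest.
    if rng.random() < epsilon:
        action = int(rng.integers(2))
    else:
        action = int(np.argmax(q))
    reward = 1.0 if rng.random() < success_prob[action] else 0.0
    # Nudge the estimate toward the observed reward.
    q[action] += alpha * (reward - q[action])

print("learned action values:", q)   # should end up roughly near [0.3, 0.8]
```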

Thanks to its unusual structure and adaptability, the neural network has been invaluable in developing the company’s robot, as well as in designing the hardware it runs on.

The system has already served its intended purpose. In early projects it was used, for example, to map images of proteins onto target proteins and to drive a gripper-based system with electrical impulses on infected cells, and its developers are only beginning to harness the neural network’s full potential.

While the neural network had been used in prototypes for nearly three years, it was taken off the market so the company could focus more directly on its ultimate goal: a gizmo that adapts its movements, navigates around objects and even provides emotional support.

This is particularly good news for those of us in fields where robots perform difficult, repetitive and unpredictable tasks, such as hauling or guiding carts.

Neurala has extensive expertise in interfacing with robots through an architecture that ordinary computers simply cannot match. For example, streaming and processing raw video of the robot’s actual movements requires a huge leap in computing power, which would be prohibitively expensive for the rest of us.
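To put a rough number on why raw video is so demanding, here is a back-of-the-envelope calculation; the resolution, frame rate and color depth are my own assumptions, not figures from Neurala.

```python
# Back-of-the-envelope sketch: how much data an uncompressed video stream
# produces. Resolution, frame rate and color depth are assumptions.
width, height = 1280, 720      # pixels per frame
bytes_per_pixel = 3            # 8-bit RGB
fps = 30                       # frames per second

bytes_per_second = width * height * bytes_per_pixel * fps
print(f"{bytes_per_second / 1e6:.0f} MB/s")          # ~83 MB/s
print(f"{bytes_per_second * 8 / 1e6:.0f} Mbit/s")    # ~664 Mbit/s
```

Even at modest resolution, that is hundreds of megabits per second before the robot has done any actual thinking about what the footage shows.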

The neural network’s input rate, an integral part of its functionality, is highly variable. When a raw visual feed such as video does not work well, the network learns to encode the video into a much leaner metadata stream instead.
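The article doesn’t say what that metadata stream contains, so here is a deliberately crude sketch of the idea: each frame is boiled down to a couple of summary numbers instead of being shipped whole. The chosen statistics, and the noise frames standing in for a camera feed, are assumptions for illustration only.

```python
# Toy illustration of "video to metadata": instead of shipping full frames,
# each frame is reduced to a handful of summary numbers. The frames here are
# random noise standing in for a camera feed; the statistics are arbitrary.
import numpy as np

rng = np.random.default_rng(2)
frames = rng.integers(0, 256, size=(10, 720, 1280), dtype=np.uint8)  # 10 grayscale frames

prev = frames[0].astype(np.float32)
for t, frame in enumerate(frames[1:], start=1):
    cur = frame.astype(np.float32)
    record = {
        "frame": t,
        "mean_brightness": float(cur.mean()),
        # Mean absolute pixel change vs. the previous frame, a crude motion cue.
        "motion": float(np.abs(cur - prev).mean()),
    }
    prev = cur
    print(record)   # a few numbers per frame instead of ~900,000 pixels
```

A handful of numbers per frame is trivially cheap to transmit and to feed into a network, which is the whole appeal of the approach the column gestures at.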

The workflow involved in these experiments requires dedicated, specialized machines that can handle high-quality sound.

The Silicon Valley company is currently valued at around $300 million, according to Sohmee.com, a China-based tech site. That’s a remarkable valuation, considering that the company sells its robots at an annual price of $150,000 to $200,000, though some have quipped that it is still less expensive to ship in a robot than it is to buy a dog.

Be that as it may, Neurala’s successes in prosthetics and robotics add further fuel to the idea that neural networks, and the more intelligent systems built on them, are no longer an idle research area; they must be fully realized as a way to enhance our social lives.

This micro-revolution would bring massive changes to our sense of order.

And from the look of things, we would all be more efficient.

Got a question about mechanical engineering and artificial intelligence? Send it to jaseberhard@purdue.edu.