EPFL

Prototype prosthesis proffers proper proprioceptive properties

Posted by | EPFL, Gadgets, hardware, Health, medtech, prosthesis, Prosthetics, robotics, science, TC | No Comments

Researchers have created a prosthetic hand that offers its users the ability to feel where it is and how the fingers are positioned — a sense known as proprioception. The headline may be in jest, but the advance is real and may help amputees more effectively and naturally use their prostheses.

Prosthesis rejection is a real problem for amputees, and many choose to simply live without these devices, electronic or mechanical, as they can complicate as much as they simplify. Part of that is the simple fact that, unlike their natural limbs, artificial ones have no real sensation — or if there is any, it’s nowhere near the level someone had before.

Touch and temperature detection are important, of course, but what’s even more critical to ordinary use is simply knowing where your limb is and what it’s doing. If you close your eyes, you can tell where each digit is, how many you’re holding up, whether they’re gripping a small or large object and so on. That’s currently impossible with a prosthesis, even one that’s been integrated with the nervous system to provide feedback — meaning users have to watch what they’re doing at all times. (That is, if the arm isn’t watching for you.)

This prosthesis, built by Swiss, Italian and German neurologists and engineers, is described in a recent issue of Science Robotics. It takes the existing concept of sending touch information to the brain through electrodes patched into the nerves of the arm, and adapts it to provide real-time proprioceptive feedback.

“Our study shows that sensory substitution based on intraneural stimulation can deliver both position feedback and tactile feedback simultaneously and in real time. The brain has no problem combining this information, and patients can process both types in real time with excellent results,” explained Silvestro Micera, of the École Polytechnique Fédérale de Lausanne, in a news release.

It’s been the work of a decade to engineer and demonstrate this possibility, which could be of enormous benefit. Having a natural, intuitive understanding of the position of your hand, arm or leg would likely make prostheses much more useful and comfortable for their users.

Essentially, the robotic hand relays its telemetry to the brain through the nerve pathways that would normally carry touch signals from that area. Unfortunately it’s rather difficult to actually recreate the proprioceptive pathways, so the team used what’s called sensory substitution instead. This uses other pathways, like ordinary touch, to present different sense modalities.

(Diagram modified from original to better fit, and to remove some rather bloody imagery.)

A simple example would be a machine that touched your arm in a different location depending on where your hand is. In the case of this research it’s much finer, but still essentially presenting position data as touch data. It sounds weird, but our brains are actually really good at adapting to this kind of thing.
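To make that idea concrete, here’s a toy sketch of a sensory-substitution encoder. Everything in it (the channel count, the linear mapping, the function name) is hypothetical — the study’s actual intraneural encoding is far finer-grained — but it shows the principle of presenting a position as a discrete touch site:

```python
def position_to_touch_channel(joint_angle_deg, n_channels=4, max_angle=90.0):
    """Toy sensory-substitution encoder: quantize a finger joint angle
    into one of n_channels stimulation sites (hypothetical layout)."""
    clipped = min(max(joint_angle_deg, 0.0), max_angle)
    # Linearly map [0, max_angle] onto channels 0..n_channels-1
    return int(clipped / max_angle * (n_channels - 1) + 0.5)

print(position_to_touch_channel(0))   # 0: finger fully open
print(position_to_touch_channel(90))  # 3: finger fully closed
```

The brain’s job is then to relearn that “touch at site 3” means “finger closed” — exactly the kind of remapping we turn out to be good at.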

As evidence, witness that after some training two amputees using the system were able to tell the difference between four differently shaped objects being grasped, with their eyes closed, with 75 percent accuracy. Chance would be 25 percent, of course, meaning the sensation of holding objects of different sizes came through loud and clear — clear enough for a prototype, anyway. Amazingly, the team was able to add actual touch feedback to the existing pathways and the users were not overly confused by it. So there’s precedent now for multi-modal sensory feedback from an artificial limb.

The study has well-defined limitations, such as the number and type of fingers it was able to relay information from, and the granularity and type of that data. And the “installation” process is still very invasive. But it’s pioneering work nevertheless: this type of research is very iterative and global, progressing by small steps until, all of a sudden, prosthetics as a science has made huge strides. And the people who use prosthetic limbs will be making strides, as well.

Powered by WPeMatico

These hyper-efficient solar panels could actually live on your roof soon

Posted by | EPFL, Gadgets, GreenTech, hardware, Insolight, science, solar cells, Solar Power | No Comments

The clean energy boffins in their labs are always upping the theoretical limit on how much power you can get out of sunshine, but us plebes actually installing solar cells are stuck with years-old tech that’s not half as good as what they’re seeing. This new design from Insolight could be the one that changes all that.

Insolight is a spinoff from the École Polytechnique Fédérale de Lausanne, where they’ve been working on this new approach for a few years — and it’s almost ready to hit your roof.

Usually solar cells collect sunlight on their entire surface, converting it to electricity at perhaps 15-19 percent efficiency — meaning more than 80 percent of the energy is lost in the process. There are more efficient cells out there, but they’re generally expensive and special-purpose, or use some exotic material.

One place people tend to spare no expense, however, is in space. Solar cells on many satellites are more efficient but, predictably, not cheap. But that’s not a problem if you use only a tiny number of them and concentrate the sunlight onto them; that’s the Insolight insight.

Small but very high-efficiency cells are laid down on a grid, and above that is placed a honeycomb-like lens array that takes light and bends it into a narrow beam concentrated only on the tiny cells. As the sun moves, the cell layer moves ever so slightly, keeping the beams on target. They’ve achieved as high as 37 percent efficiency in tests, and 30 percent in consumer-oriented designs. That means half again to twice the power from the same area as ordinary panels.
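That power claim is just the ratio of efficiencies over the same roof area; a quick check against the figures above:

```python
def relative_gain(new_eff, old_eff):
    """Relative power from the same area when panel efficiency changes."""
    return new_eff / old_eff

# Insolight's ~30% consumer design vs. typical 15-19% panels
print(round(relative_gain(0.30, 0.19), 2))  # 1.58 -> roughly "half again"
print(round(relative_gain(0.30, 0.15), 2))  # 2.0  -> "twice the power"
```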

Certainly this adds a layer or two of complexity to the current mass-manufactured arrays that are “good enough” but far from state of the art. But the resulting panels aren’t much different in size or shape, and don’t require special placement or hardware, such as a concentrator or special platform. And a recently completed pilot test on an EPFL roof passed with flying colors.

“Our panels were hooked up to the grid and monitored continually. They kept working without a hitch through heat waves, storms and winter weather,” said Mathieu Ackermann, the company’s CTO, in an EPFL news release. “This hybrid approach is particularly effective when it’s cloudy and the sunlight is less concentrated, since it can keep generating power even under diffuse light rays.”

The company is now in talks with solar panel manufacturers, whom they are no doubt trying to convince that it’s not that hard to integrate this tech with their existing manufacturing lines — “a few additional steps during the assembly stage,” said Ackermann. Expect Insolight panels to hit the market in 2022 — yeah, it’s still a ways off, but maybe by then we’ll all have electric cars too and this will seem like an even better deal.

Let’s save the bees with machine learning

Posted by | artificial intelligence, bees, conservation, EPFL, Gadgets, GreenTech, science, TC | No Comments

Machine learning and all its related forms of “AI” are being used to work on just about every problem under the sun, but even so, stemming the alarming decline of the bee population still seems out of left field. In fact it’s a great application for the technology and may help both bees and beekeepers keep hives healthy.

The latest threat to our precious honeybees is the Varroa mite, a parasite that infests hives and sucks the blood from both bees and their young. While it rarely kills a bee outright, it can weaken it and cause young to be born similarly weak or deformed. Over time this can lead to colony collapse.

The worst part is that unless you’re looking closely, you might not even see the mites — being mites, they’re tiny: a millimeter or so across. So infestations often go on for some time without being discovered.

Beekeepers, caring folk at heart obviously, want to avoid this. But the solution has been to put a flat surface beneath a hive and pull it out every few days, inspecting all the waste, dirt and other hive junk for the tiny bodies of the mites. It’s painstaking and time-consuming work, and of course if you miss a few, you might think the infestation is getting better instead of worse.

Machine learning to the rescue!

As I’ve had occasion to mention about a billion times before this, one of the things machine learning models are really good at is sorting through noisy data, like a surface covered in random tiny shapes, and finding targets, like the shape of a dead Varroa mite.

Students at the École Polytechnique Fédérale de Lausanne in Switzerland created an image recognition agent called ApiZoom trained on images of mites that can sort through a photo and identify any visible mite bodies in seconds. All the beekeeper needs to do is take a regular smartphone photo and upload it to the EPFL system.
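ApiZoom itself is a trained image-recognition model, not public code. But the underlying task — picking small dark bodies out of a lighter, noisy background — can be illustrated with a much cruder classical approach. This sketch (synthetic image, made-up threshold, nothing to do with ApiZoom’s actual method) just thresholds and counts connected dark regions with SciPy:

```python
import numpy as np
from scipy import ndimage

def count_dark_blobs(img, threshold=0.3):
    """Count connected dark regions (candidate mite bodies) in a
    grayscale image with values in [0, 1]."""
    mask = img < threshold            # True where pixels are dark
    _, n_blobs = ndimage.label(mask)  # group adjacent dark pixels into blobs
    return n_blobs

# Synthetic "hive board": light background with two dark specks
board = np.ones((20, 20))
board[2:4, 2:4] = 0.1
board[10:12, 15:17] = 0.1
print(count_dark_blobs(board))  # 2
```

A learned model earns its keep precisely where this falls apart: real boards are littered with wax, pollen and debris that no fixed threshold can distinguish from mites.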

The project started back in 2017, and since then the model has been trained with tens of thousands of images, achieving a detection success rate of about 90 percent — which the project’s Alain Bugnon told me is roughly at parity with humans. The plan now is to distribute the app as widely as possible.

“We envisage two phases: a web solution, then a smartphone solution. These two solutions make it possible to estimate the rate of infestation of a hive — and, if the application is used on a large scale, of a region,” Bugnon said. “By collecting automatic and comprehensive data, it is not impossible to make new findings about a region or atypical practices of a beekeeper, and also possible mutations of the Varroa mites.”

That kind of systematic data collection would be a major help for coordinating infestation response at a national level. ApiZoom is being spun out as a separate company by Bugnon; hopefully this will help get the software to beekeepers as soon as possible. The bees will thank them later.

This drone shrinks to fit

Posted by | drones, EPFL, Gadgets, TC | No Comments

Researchers at the University of Zurich and EPFL have created a robot that shrinks to fit through gaps, a feature that could make it perfect for search and rescue missions. The researchers initially created a drone that could assess man-made gaps and squeeze through in seconds using only one camera. This extra feature — a scissor-like system to shrink the drone in flight — makes it even more versatile and allows these drones to react to larger or smaller gaps in nature.

“The idea came up after we worked on quadrotor flight through narrow gaps,” said PhD candidate Davide Falanga. “The goal of our lab is to develop drones which can in the future be used in the aftermath of a disaster — an earthquake, for example — in order to enter collapsed buildings through small cracks or apertures to look for survivors. Our previous approach required a very aggressive maneuver, so we looked into alternative solutions to accomplish a task such as passing through a very narrow gap without having to fly at high speed. The solution we came up with is the foldable drone, a quadrotor which can change its shape to adapt to the task.”

The system measures the gap and changes its shape without outside processing, a feat that is quite exciting. All of the processing is done on board, and it could be turned into an autonomous system if necessary. The team built the drone with off-the-shelf and 3D-printed parts.

“The main difference between conventional drones and our foldable drone is in the way the arms are connected to the body: each arm is connected through a servo motor, which can change the relative position between the main body and the arm. This allows the robot to literally fold the arms around the body, which means that potentially any morphology can be obtained. An adaptive controller is aware of the drone’s morphology and adapts to it in order to guarantee stable flight at all times, independently of the configuration,” said Falanga.
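That quote suggests why the controller must be morphology-aware: the control-allocation matrix that maps rotor thrusts to body torques depends on where the servos have put the arms. The geometry and torque model below are assumed for illustration (planar X-configuration, yaw from rotor drag ignored) — this is a sketch of the principle, not the paper’s actual controller:

```python
import numpy as np

def arm_positions(arm_length, fold_angles_deg):
    """Rotor positions in the body frame for a quadrotor whose arms pivot
    about the body center by a per-arm servo angle (assumed geometry).
    Nominal X-configuration arms sit at 45, 135, 225 and 315 degrees."""
    nominal = np.deg2rad([45, 135, 225, 315])
    angles = nominal + np.deg2rad(fold_angles_deg)
    return np.stack([arm_length * np.cos(angles),
                     arm_length * np.sin(angles)], axis=1)

def allocation_matrix(positions):
    """Maps the four rotor thrusts to [total thrust, roll torque, pitch torque].
    Roll torque scales with each rotor's y-offset, pitch with its -x-offset."""
    x, y = positions[:, 0], positions[:, 1]
    return np.stack([np.ones(4), y, -x])

# Fully extended X shape vs. arms folded inward to squeeze through a gap
wide = allocation_matrix(arm_positions(0.2, [0, 0, 0, 0]))
folded = allocation_matrix(arm_positions(0.2, [-40, 40, 40, -40]))
```

Folding the arms shrinks the lever arms, so the same rotor thrusts produce less roll and pitch torque while total thrust is unchanged — the adaptive controller has to compensate for exactly this shift.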

The team published a report on their findings in Robotics and Automation Letters. As IEEE notes, this is no flying drone dragon, but it is a far simpler, cooler and more effective product.

TWIICE One Exoskeleton furthers the promise of robotic mobility aids

Posted by | EPFL, exoskeleton, Gadgets, hardware, robotics, Wearables | No Comments

Few things in the world of technology can really ever be said to be “done,” and certainly exoskeletons are not among their number. They exist, but they are all works in progress, expensive, heavy, and limited. So it’s great to see this team working continuously on their TWIICE robotic wearable, improving it immensely with the guidance of motivated users.

TWIICE made its debut in 2016, and like all exoskeletons it was more promise made than promise kept. It’s a lower-half exoskeleton that supports and moves the legs of someone with limited mobility, while they support themselves on crutches. It’s far from ideal, and the rigidity and weight of systems like this make them too risky to deploy at scale for now.

But two years of refinement have made a world of difference. The exoskeleton weighs the same (which doesn’t matter since it carries its own weight), but supports heavier users while imparting more force with its motors, which have been integrated into the body itself to make it far less bulky.

Perhaps most importantly, however, the whole apparatus can now be donned and activated by the user all by herself, as Swiss former acrobat and now handcycling champion Silke Pan demonstrated in a video. She levers herself from her wheelchair into the sitting exoskeleton, attaches the fasteners on her legs and trunk, then activates the device and stands right up.

She then proceeds to climb more stairs than I’d care to attempt. She is an athlete, after all.

That kind of independence is often crucially important for the physically disabled for a multitude of reasons, and clearly achieving the capability has been a focus for the TWIICE team.

Although the exoskeleton has been worked on as a research project within the École Polytechnique Fédérale de Lausanne (EPFL), the plan is to spin off a startup to commercialize the tech as it approaches viability. The more they make and the more people use these devices — despite their limitations — the better future versions will be.

BlindPAD’s tablet makes visual information tactile for the vision-impaired

Posted by | accessibility, Disabilities, EPFL, Gadgets, haptics, science, TC | No Comments

It’s truly amazing, the wealth of information we all have at our fingertips — that is, of course, unless your fingertips are how you have to access that information. An innovative new tablet that uses magnetically configurable bumps may prove to be a powerful tool for translating information like maps and other imagery to a modality more easily accessed by the visually impaired.

Swiss system ups security and reliability of finger-based biometrics

Posted by | biometrics, EPFL, Gadgets, science, Security, TC | No Comments

Biometrics may not be the perfect solution for security, but they can be useful — as long as they’re robust and well thought out. TouchID is all well and good, but you wouldn’t secure a nuclear site with it. Well, movies aside, you probably shouldn’t secure a nuclear site with a fingerprint, regardless. But this new system from Swiss researchers is a step in the right direction.

Having conquered Chess and Go, the robots move to master Foosball

Posted by | artificial intelligence, EPFL, foosball, Gadgets, Gaming, robotics, robots, Sports, TC | No Comments

We’ve come a long way since the days of selecting a CPU player for the other Pong paddle, tank or hand-to-hand combatant. Now the computers are taking it to us, in meatspace, and seemingly no tabletop activity is safe from their depredations. The latest to succumb to computer domination? Foosball.

The Pleurobot robo-salamander crawls and swims like a real amphibian

Posted by | artificial intelligence, biomimesis, biomimetic, EPFL, Gadgets, research, robotics, robots, science, TC, university | No Comments

The mad roboticists at the École Polytechnique Fédérale de Lausanne have produced another biomimetic mechanoid — this one based on the lithe locomotion of the salamander. “Pleurobot” imitates the amphibian’s ambulation with its own articulated vertebrae, allowing it to slither along on land or at sea.
