robotics

Deploy the space harpoon


Watch out, starwhales. There’s a new weapon undergoing testing as we speak for the interstellar dwellers whom you threaten with your planet-crushing gigaflippers. This small-scale version may only be good for removing dangerous orbital debris, but in time it will pierce your hypercarbon hides and irredeemable sun-hearts.

Literally a space harpoon. (Credit: Airbus)

However, it would be irresponsible of me to speculate beyond what is possible today with the technology, so let a summary of the harpoon’s present capabilities suffice.

The space harpoon is part of the RemoveDEBRIS project, a multi-organization European effort to create and test methods of reducing space debris. There are thousands of little pieces of who knows what clogging up our orbital neighborhood, ranging in size from microscopic to potentially catastrophic.

There are as many ways to take down these rogue items as there are sizes and shapes of space junk; perhaps it’s enough to use a laser to edge a small piece down toward orbital decay, but larger items require more hands-on solutions. And seemingly all nautical in origin: RemoveDEBRIS has a net, a sail and a harpoon. No cannon?

You can see how the three items are meant to operate here:

The harpoon is meant for larger targets, for example full-size satellites that have malfunctioned and are drifting from their orbit. A simple mass driver could knock them toward the Earth, but capturing them and managing their descent is a far more controlled technique.

While an ordinary harpoon would simply be hurled by the likes of Queequeg or Daggoo, in space it’s a bit different. Sadly it’s impractical to suit up a harpooner for EVA missions, so the whole thing has to be automated. Fortunately the organization is also testing computer vision systems that can identify and track targets. From there it’s just a matter of firing the harpoon at the target and reeling it in, which is what the satellite demonstrated today.

This Airbus-designed little item is much like a toggling harpoon, which has a piece that flips out once it pierces the target. Obviously it’s a single-use device, but it’s not particularly large and several could be deployed on different interception orbits at once. Once reeled in, a drag sail (seen in the video above) could be deployed to hasten reentry. The whole thing could be done with little or no propellant, which greatly simplifies operation.
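That capture sequence (track, fire, reel in, deploy the drag sail) maps naturally onto a small state machine. Here’s a minimal sketch in Python; the phase names and trigger conditions are illustrative, not Airbus’s actual control logic:

```python
from enum import Enum, auto

class Phase(Enum):
    TRACK = auto()      # computer vision keeps the target centered
    FIRE = auto()       # harpoon launched once the lock is stable
    REEL = auto()       # tether winched in, toggle barb holding fast
    DRAG_SAIL = auto()  # sail deployed to hasten reentry
    DONE = auto()

def capture_step(phase: Phase, target_locked: bool, tether_taut: bool) -> Phase:
    """Advance the (hypothetical) capture sequence by one step."""
    if phase is Phase.TRACK and target_locked:
        return Phase.FIRE
    if phase is Phase.FIRE:
        return Phase.REEL  # single-use harpoon: there is no retry path
    if phase is Phase.REEL and tether_taut:
        return Phase.DRAG_SAIL
    if phase is Phase.DRAG_SAIL:
        return Phase.DONE
    return phase  # otherwise hold the current phase
```

The one-way structure mirrors the hardware: since the harpoon can’t be re-fired, everything hinges on the tracking phase holding a stable lock before launch.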

Obviously it’s not yet a threat to the starwhales. But we’ll get there. We’ll get those monsters good one day.


The Opportunity Mars rover’s greatest shots and discoveries


Opportunity’s mission is complete, and the rover that was supposed to last 90 days closes the book on 15 years of exploration. It’s sad, but it’s also a great time to look back on the mission and see some of its greatest hits. Here are 25 images showing where it came from, where it went, and what it discovered on its marathon-length journey.


Robin’s robotic mowers now have a patented doggie door just for them


Back in 2016 we had Robin up onstage demonstrating the possibility of a robotic mower as a service rather than just something you buy. They’re still going strong, and just introduced and patented what seems in retrospect a pretty obvious idea: an automatic door that lets the mower pass through fences between front and back yards.

It’s pretty common, after all, to have a back yard isolated from the front lawn by a wood or chain link fence so dogs and kids can roam freely with only light supervision. And if you’re lucky enough to have a robot mower, it can be a pain to carry it from one side to the other. Isn’t the whole point of the thing that you don’t have to pick it up or interact with it in any way?

The solution Justin Crandall and his team at Robin came up with is simple and straightforward: an automatic mower-size door that opens only to let it through.

“In Texas over 90 percent of homes have a fenced in backyard, and even in places like Charlotte and Cleveland it’s roughly 25-30 percent, so technology like this is critical to adoption,” Crandall told me. “We generally dock the robots in the backyard for security. When it’s time to mow the front yard, the robots drive to the door we place in the fence. As it approaches the door, the robot drives over a sensor we place in the ground. That sensor unlocks the door to allow the mower access.”

Simple, right? It uses a magnetometer rather than a wireless or IR sensor, since those introduce the possibility of false positives. And it costs around $100-$150, easily less than a second robot or base, and probably pays for itself in goodwill around the third or fourth time you realize you didn’t have to carry your robot around.
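Crandall didn’t detail the trigger logic, but a magnetometer gate of this kind usually comes down to a debounced field-strength threshold. A minimal sketch, with assumed field values and read counts (not Robin’s actual firmware):

```python
# Hypothetical sketch of the magnetometer-triggered gate: unlock only on a
# strong, sustained field spike, which is what makes false positives a
# non-issue compared with wireless or IR triggers.
FIELD_THRESHOLD_UT = 400.0  # microtesla; magnet in mower chassis (assumed value)
CONSECUTIVE_READS = 5       # debounce: require a sustained reading

def should_unlock(readings_ut: list[float]) -> bool:
    """Unlock only if the last N readings all exceed the threshold."""
    recent = readings_ut[-CONSECUTIVE_READS:]
    return len(recent) == CONSECUTIVE_READS and all(
        r > FIELD_THRESHOLD_UT for r in recent
    )

# Example: ambient field ~50 uT; the mower passing overhead spikes it to ~600 uT.
ambient = [48.0, 52.0, 50.0]
mower_overhead = [610.0, 605.0, 598.0, 612.0, 603.0]
assert not should_unlock(ambient)
assert should_unlock(ambient + mower_overhead)
```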

It’s patented, but rivals (like iRobot, which recently introduced its own mower) could certainly build one of their own if it were sufficiently different.

Robin has expanded to several states and a handful of franchises (its plan from the start) and maintains that its all-inclusive robot-as-a-service method is better than going out and buying one for yourself. Got a big yard and no teenage kids who can mow it for you? See if Robin’s available in your area.


Aibo hangs out with some (real) dogs


I had this fun idea to make a video called “Real dog vs. robot dog,” where Henri (my Maltese Shih Tzu) and Aibo would go head to head performing a dozen tricks like high-five, bark and play dead. Aibo arrived, however, when I was simultaneously battling a cold and dog-sitting for my best friend. Because two days with Aibo didn’t allow for much time to teach him tricks, I decided to give him free rein to explore the apartment while I followed him around with my iPhone in a Theraflu haze.

As expected, Aibo is incredibly cute, does a bunch of tricks and can learn new ones with practice and patience. Brian goes in-depth here.

Pack life

The (real) dogs were curious at first and would sniff Aibo (his butt, rather), but shortly afterward they would ignore him, despite his numerous attempts to engage with them. If this were elementary school, Aibo would be the smelly new kid no one wanted to play with.

Like a real dog

We were told each Aibo was programmed to have a unique personality. The Aibo we received was a defiant little one that would obey orders only half the time. He was also needy and would constantly try to get my attention. Unlike a real dog, though, I could tell him to go to his charging station or turn him off.

Although his OLED eyes were meant to be expressive and help mimic a puppy’s endearing personality, they can be creepy at times, especially when he does the side-eye or when his pupils dilate.

Room for improvement

Aibo’s impressive for a robot companion dog, but with a $2,899 price tag, I’d like to suggest the following features for the next iteration:

  • Fur. Aibo isn’t very cuddly, and it’s a bit more difficult to get emotionally attached to a cold, shiny object rather than, say, a teddy bear.
  • Better movement. He would get stuck between rooms or at the edge of the rug and hardwood floors. He’s also quite slow. Watching him performing certain tricks and getting settled into his docking station was like waiting for a .jpg to load on dial-up.
  • The ability to read your expressions, so he knows when you’re sad and can act accordingly.
  • A fart feature, so you can blame your farts on Aibo.

Sony plans to roll out a security package in Japan that uses Aibo’s on-board sensors to keep your home safe. I’m not quite sure what that entails, but if Aibo’s eyeballs could be used as cameras to stream video footage to your smartphone while you’re not home, and to alert you when someone’s there, that alone would justify the price tag.

The U.S. version, however, is available now.


Don’t worry, this rocket-launching Chinese robo-boat is strictly for science


It seems inevitable that the high seas will eventually play host to a sort of proxy war as automated vessels clash over territory for the algae farms we’ll soon need to feed the growing population. But this rocket-launching robo-boat is a peacetime vessel concerned only with global weather patterns.

The craft is what’s called an unmanned semi-submersible vehicle, or USSV, and it functions as a mobile science base — and now, a rocket launch platform. For meteorological sounding rockets, of course, nothing scary.

It solves a problem we’ve seen addressed by other seagoing robots like the Saildrone: that the ocean is very big, and very dangerous — so monitoring it properly is equally big and dangerous. You can’t have a crew out in the middle of nowhere all the time, even if it would be critical to understanding the formation of a typhoon or the like. But you can have a fleet of robotic ships systematically moving around the ocean.

In fact this is already done in a variety of ways and by numerous countries and organizations, but much of the data collection is both passive and limited in range. A solar-powered buoy drifting on the currents is a great resource, but you can’t exactly steer it, and it’s limited to sampling the water around it. And weather balloons are nice, too, if you don’t mind flying them out to where they need to be first.

A robotic boat, on the other hand, can go where you need it and deploy instruments in a variety of ways, dropping or projecting them deep into the water or, in the case of China’s new USSV, firing them 20,000 feet into the air.

“Launched from a long-duration unmanned semi-submersible vehicle, with strong mobility and large coverage of the sea area, rocketsonde can be used under severe sea conditions and will be more economical and applicable in the future,” said Jun Li, a researcher at the Chinese Academy of Sciences, in a news release.

The 24-foot craft, which has completed a handful of near-land cruises in Bohai Bay, was announced in a newly published paper. You may wonder what “semi-submersible” means. Essentially, the designers put as much of the craft as possible under the water, with only instruments, hatches and other necessary items above the surface. That minimizes the effect of rough weather on the craft — but it is still self-righting in case it capsizes in major wave action.

The USSV’s early travels

It runs on a diesel engine, so it’s not exactly the latest tech there, but for a large craft going long distances, solar is still a bit difficult to manage. The diesel on board will last it about 10 days and take it around 3,000 km, or 1,800 miles.

The rocketsondes are essentially small rockets that shoot up to a set altitude and then drop a “driftsonde,” a sensor package that descends slowly under a balloon, parachute or some other drag device. The craft can carry up to 48 of these, meaning it could launch one every few hours for its entire 10-day cruise.
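A quick back-of-envelope check of those figures bears out the “every few hours” cadence:

```python
# Figures from the article: 48 rocketsondes, a 10-day cruise, ~3,000 km of range.
cruise_days, rockets, range_km = 10, 48, 3000

hours_between_launches = cruise_days * 24 / rockets  # 5.0 hours between launches
km_per_launch = range_km / rockets                   # 62.5 km of track per sounding
range_miles = range_km * 0.621371                    # ~1,864 miles, i.e. "around 1,800"

print(f"one launch every {hours_between_launches:.1f} h "
      f"(~{km_per_launch:.0f} km apart); range {range_miles:.0f} mi")
```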

The researchers’ findings were published in the journal Advances in Atmospheric Sciences. This is just a prototype, but its success suggests we can expect a few more at the very least to be built and deployed. I’ve asked Li a few questions about the craft and will update this post if I hear back.


Autonomous subs spend a year cruising under Antarctic ice


The freezing waters underneath Antarctic ice shelves and the underside of the ice itself are of great interest to scientists… but who wants to go down there? Leave it to the robots. They won’t complain! And indeed, a pair of autonomous subs have been nosing around the ice for a full year now, producing data unlike any other expedition ever has.

The mission began way back in 2017, with a grant from the late Paul Allen. With climate change affecting sea ice around the world, precise measurement and study of these frozen climes are more important than ever. And fortunately, robotic exploration technology had reached a point where long-term missions under and around ice shelves were possible.

The project would use a proven autonomous seagoing vehicle called the Seaglider, which has been around for some time but had been redesigned to perform long-term operations in these dark, sealed-over environments. One of the craft’s co-creators, UW’s Chris Lee, said of the mission at the time: “This is a high-risk, proof-of-concept test of using robotic technology in a very risky marine environment.”

The risks seem to have paid off, as an update on the project shows. The modified craft have traveled hundreds of miles during a year straight of autonomous operation.

It’s not easy to stick around for a long time on the Antarctic coast for a lot of reasons. But leaving robots behind to work while you go relax elsewhere for a month or two is definitely doable.

“This is the first time we’ve been able to maintain a persistent presence over the span of an entire year,” Lee said in a UW news release today. “Gliders were able to navigate at will to survey the cavity interior… This is the first time any of the modern, long-endurance platforms have made sustained measurements under an ice shelf.”

You can see the paths of the robotic platforms below as they scout around near the edge of the ice and then dive under in trips of increasing length and complexity:

They navigate in the dark by monitoring their position relative to a pair of underwater acoustic beacons fixed in place by cables. The blue dots are floats that drift along with the natural currents to travel long distances on little or no power. Both are equipped with sensors to monitor the shape of the ice above, the temperature of the water, and other interesting data points.
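In two dimensions, ranging off a pair of fixed beacons amounts to intersecting two circles, with the previous fix used to break the mirror-image ambiguity. Here’s a toy version of that geometry (real acoustic navigation also has to contend with sound-speed variation and noisy ranges):

```python
import math

def two_beacon_fix(b1, b2, r1, r2, prev):
    """Intersect two range circles (toy 2D version of acoustic ranging).

    b1, b2: (x, y) beacon positions; r1, r2: measured ranges in meters;
    prev: previous position estimate, used to pick between the two
    mirror-image intersection points. Returns the chosen (x, y) fix.
    """
    (x1, y1), (x2, y2) = b1, b2
    d = math.hypot(x2 - x1, y2 - y1)
    a = (r1**2 - r2**2 + d**2) / (2 * d)   # along-baseline distance from b1
    h = math.sqrt(max(r1**2 - a**2, 0.0))  # perpendicular offset to each solution
    mx, my = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    ux, uy = -(y2 - y1) / d, (x2 - x1) / d  # unit normal to the baseline
    candidates = [(mx + h * ux, my + h * uy), (mx - h * ux, my - h * uy)]
    return min(candidates, key=lambda p: math.dist(p, prev))

# A glider 3 km from one beacon and 4 km from another beacon 5 km away:
print(two_beacon_fix((0, 0), (5000, 0), 3000, 4000, prev=(1500, 2000)))
# -> (1800.0, 2400.0)
```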

It isn’t the first robotic expedition under the ice shelves by a long shot, but it’s definitely the longest term and potentially the most fruitful. The Seagliders are smaller, lighter, and better equipped for long-term missions. One went 87 miles in a single trip!

The mission continues, and two of the three initial Seagliders are still operational and ready to continue their work.


pi-top’s latest edtech tool doubles down on maker culture


London-based edtech startup pi-top has unboxed a new flagship learn-to-code product, demoing the “go anywhere” Pi-powered computer at the Bett Show education fair in London today.

Discussing the product with TechCrunch ahead of launch, co-founder and CEO Jesse Lozano talked up the skills the company hopes students in the target 12-to-17 age range will develop and learn to apply by using sensor-based connected tech, powered by its new pi-top 4, to solve real world problems.

“When you get a pi-top 4 out of the box you’re going to start to learn how to code with it, you’re going to start to learn and understand electronic circuits, you’re going to understand sensors from our sensor library. Or components from our components library,” he told us. “So it’s not: ‘I’m going to learn how to create a robot that rolls around on wheels and doesn’t knock into things’.

“It’s more: ‘I’m going to learn how a motor works. I’m going to learn how a distance sensor works. I’m going to learn how to properly hook up power to these different sensors. I’m going to learn how to apply that knowledge… take those skills and [keep making stuff].”

The pi-top 4 is a modular computer that’s designed to be applicable, well, anywhere: up in the air, with the help of a drone attachment; powering a sensing weather balloon; acting as the brains for a rover-style wheeled robot; or attached to sensors planted firmly in the ground to monitor local environmental conditions.

The startup was already dabbling in this area, via earlier products — such as a Pi-powered laptop that featured a built-in rail for breadboarding electronics. But the pi-top 4 is a full step outside the usual computing box.

The device has a built-in mini OLED screen for displaying project info, along with an array of ports. It can be connected to and programmed via one of pi-top’s other Pi-powered computers, or any PC, Mac or Chromebook, with the company also saying it easily connects to existing screens, keyboards and mice. Versatility looks to be the name of the game for pi-top 4.

pi-top’s approach to computing and electronics is flexible and interoperable, meaning the pi-top 4 can be extended with standard electronics components, or even with the more manageable bits and bobs of littleBits-style kits.

pi-top is also intending to sell a few accessories of its own (such as the drone add-on, pictured above) to help get kids’ creative project juices flowing — and has launched a range of cameras, motors and sensors to “allow creators of all ages to start learning by making straight out of the box”.

But Lozano emphasizes its platform play is about reaching out to a wider world, not seeking to lock teachers and kids into buying proprietary hardware. (Which would be all but impossible, in any case, given the Raspberry Pi core.)

“It’s really about giving people that breadth of ability,” says Lozano, discussing the sensor-based skills he wants the product to foster. “As you go through these different projects you’re learning these specific skills but you also start to understand how they would apply to other projects.”

He mentions various things the pi-top can be used to make, like a music synth or wheeled robot, but says the point isn’t making any specific connected thing; it’s encouraging kids to come up with project ideas of their own.

“Once that sort of veil has been pierced in students and in teachers we see some of the best stuff starts to be made. People make things that we had no idea they would integrate it into,” he tells us, pointing by way of example to a solar car project from a group of U.S. schoolkids. “These fifteen year olds are building solar cars and they’re racing them from Texas to California — and they’re using pi-tops to understand how their cars are performing to make better race decisions.”

pi-top’s new device is a modular programmable computer designed for maker projects

“What you’re really learning is the base skills,” he adds, with a gentle sideswipe at the flood of STEM toys now targeting parents’ wallets. “We want to teach you real skills. And we want you to be able to create projects that are real. That it’s not block-based coding. It’s not magnetized, clipped in this into that and all of a sudden you have something. It’s about teaching you how to really make things. And how the world actually works around you.”

The pi-top 4 starts at $199 for a foundation bundle, which includes a Raspberry Pi 3B+, a 16GB SD card and a power pack, along with a selection of sensors and add-on components for starter projects.

Additional educational bundles will also launch down the line, at a higher price, including more add-ons, access to premium software and a full curriculum for educators to support budding makers, according to Lozano.

The startup has certainly come a long way from its founders’ first luridly green 3D-printed laptop, which caught our eye back in 2015. Today it employs more than 80 people globally, with offices in the UK, US and China, while its creative learning devices are in the hands of “hundreds of thousands” of schoolkids across more than 70 countries at this stage. And Lozano says they’re gunning to pass the million mark this year.

So while the ‘learn to code’ space has erupted into a riot of noise and color over the past half decade, with all sorts of connected playthings now competing for kids’ attention and pestering parents with quasi-educational claims, pi-top has kept its head down and focused firmly on building a serious edtech business with STEM learning at its core. That discipline, as Lozano tells it, has saved the company from chasing fickle consumer fads.

“Our relentless focus on real education is something that has differentiated us,” he responds, when asked how pi-top stands out in what’s now a very crowded marketplace. “The consumer market, as we’ve seen with other startups, it can be fickle. And trying to create a hit toy all the time — I’d rather leave that to Mattel… When you’re working with schools it’s not a fickle process.”

Part of that focus includes helping educators acquire the skills they need to teach what is always a fast-evolving area of study. So schools signing up to pi-top’s subscription product get support materials and guides to help them create a maker space and understand all the ins and outs of the pi-top platform. It also provides a classroom management backend that lets teachers track students’ progress.

“If you’re a teacher that has absolutely no experience in computer science or engineering or STEM based learning or making then you’re able to bring on the pi-top platform, learn with it and with your student, and when they’re ready they can create a computer science course — or something of that ilk — in their classroom,” says Lozano.

pi-top wants kids to use tech to tackle real-world problems

“As with all good things it takes time, and you need to build up a bank of experience. One of the things we’ve really focused on is giving teachers that ability to build up that bank of experience, through an after school club, or through a special lesson plan that they might do.

“For us it’s about augmenting that teacher and helping them become a great educator with tools and with resources. There’s some edtech stuff they want to replace the teacher — they want to make the teacher obsolete. I couldn’t disagree with that viewpoint more.”

“Why aren’t teachers just buying textbooks?” he adds. “It takes 24 months to publish a textbook. So how are you supposed to teach computer science with those technology-based skills with something that’s by design two years out of date?”

Last summer pi-top took in $16M in Series B funding, led by existing investors Hambro Perks and Committed Capital. It’s been using the financing to bring the pi-top 4 to market while also investing heavily in its team over the past 18 months, expanding in-house expertise in designing learning products and selling into the education sector via a number of hires, including the former director of learning at Apple, Dr. William Rankin.

The founders’ philosophy is to combine academic expertise in education with “excellence in engineering”. “We want the learning experience to be something we’re 100% confident in,” says Lozano. “You can go into pi-top and immediately start learning with our lesson plans and the kind of framework that we provide.”

“[W]e’ve unabashedly focused on… education. It is the pedagogy,” he adds. “It is the learning outcome that you’re going to get when you use the pi-top. So one of the big changes over the last 18 months is we’ve hired a world class education team. We have over 100 years of pedagogical experience on the team now producing an enormous amount of — we call them learning experience designers.”

He reckons that focus will stand pi-top in good stead as more educators turn their attention to how to arm their pupils with the techie skills of the future.

“There’s loads of competition but now the schools are looking they’re [asking] who’s the team behind the education outcome that you’re selling me?” he suggests. “And you know what if you don’t have a really strong education team then you’re seeing schools and districts become a lot more picky — because there is so much choice. And again that’s something I’m really excited about. Everybody’s always trying to do a commercial brand partnership deal. That’s just not something that we’ve focused on and I do really think that was a smart choice on our end.”

Lozano is also excited about a video the team has produced to promote the new product — which strikes a hip, urban note as pi-top seeks to inspire the next generation of makers.

“We really enjoy working in the education sector and I really, really enjoy helping teachers and schools deliver inspirational content and learning outcomes to their students,” he adds. “It’s genuinely a great reason to wake up in the morning.”


Robots learn to grab and scramble with new levels of agility


Robots are amazing things, but outside of their specific domains they are incredibly limited. So flexibility — not physical, but mental — is a constant area of research. A trio of new robotic setups demonstrate ways they can evolve to accommodate novel situations: using both “hands,” getting up after a fall, and understanding visual instructions they’ve never seen before.

The robots, all developed independently, are gathered together today in a special issue of the journal Science Robotics dedicated to learning. Each shows an interesting new way in which robots can improve their interactions with the real world.

On the other hand…

First there is the question of using the right tool for a job. As humans with multi-purpose grippers on the ends of our arms, we’re pretty experienced with this. We understand from a lifetime of touching stuff that we need to use this grip to pick this up, we need to use tools for that, this will be light, that heavy, and so on.

Robots, of course, have no inherent knowledge of this, which can make things difficult; it may not understand that it can’t pick up something of a given size, shape, or texture. A new system from Berkeley roboticists acts as a rudimentary decision-making process, classifying objects as able to be grabbed either by an ordinary pincer grip or with a suction cup grip.

A robot, wielding both simultaneously, decides on the fly (using depth-based imagery) what items to grab and with which tool; the result is extremely high reliability even on piles of objects it’s never seen before.

It’s done with a neural network that consumed millions of data points on items, arrangements, and attempts to grab them. If you attempted to pick up a teddy bear with a suction cup and it didn’t work the first ten thousand times, would you keep on trying? This system learned to make that kind of determination, and as you can imagine such a thing is potentially very important for tasks like warehouse picking for which robots are being groomed.
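Conceptually, the choice reduces to scoring candidate grasps under both quality models and dispatching whichever tool scores highest. Here’s a schematic sketch of that selection step, with the learned networks replaced by caller-supplied functions (an illustration of the idea, not Dex-Net’s actual API):

```python
from typing import Callable, List, Tuple

Grasp = Tuple[float, float, float]  # simplified grasp pose: (x, y, angle)

def pick_tool_and_grasp(
    depth_image: object,
    candidates: List[Grasp],
    suction_quality: Callable[[object, Grasp], float],  # stand-in for learned model
    pincer_quality: Callable[[object, Grasp], float],   # stand-in for learned model
) -> Tuple[str, Grasp, float]:
    """Score every candidate grasp under both models; return the winner."""
    scored = [("suction", g, suction_quality(depth_image, g)) for g in candidates]
    scored += [("pincer", g, pincer_quality(depth_image, g)) for g in candidates]
    return max(scored, key=lambda entry: entry[2])  # highest predicted quality wins
```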

Interestingly, because of the “black box” nature of complex neural networks, it’s difficult to tell exactly what Dex-Net 4.0 is basing its choices on, although there are some obvious preferences, explained Berkeley’s Ken Goldberg in an email.

“We can try to infer some intuition but the two networks are inscrutable in that we can’t extract understandable ‘policies,’ ” he wrote. “We empirically find that smooth planar surfaces away from edges generally score well on the suction model and pairs of antipodal points generally score well for the gripper.”

Now that reliability and versatility are high, the next step is speed; Goldberg said that the team is “working on an exciting new approach” to reduce computation time for the network, to be documented, no doubt, in a future paper.

ANYmal’s new tricks

Quadrupedal robots are already flexible in that they can handle all kinds of terrain confidently, even recovering from slips (and of course cruel kicks). But when they fall, they fall hard. And generally speaking they don’t get up.

The way these robots have their legs configured makes it difficult to do things in anything other than an upright position. But ANYmal, a robot developed by ETH Zurich (and which you may recall from its little trip to the sewer recently), has a more versatile setup that gives its legs extra degrees of freedom.

What could you do with that extra movement? All kinds of things. But it’s incredibly difficult to figure out the exact best way for the robot to move in order to maximize speed or stability. So why not use a simulation to test thousands of ANYmals trying different things at once, and use the results from that in the real world?

This simulation-based learning doesn’t always work, because it isn’t possible right now to accurately simulate all the physics involved. But it can produce extremely novel behaviors or streamline ones humans thought were already optimal.

At any rate that’s what the researchers did here, and not only did they arrive at a faster trot for the bot (above), but they also taught it an amazing new trick: getting up from a fall. Any fall. Watch this:

It’s extraordinary that the robot has come up with essentially a single technique to get on its feet from nearly any likely fall position, as long as it has room and the use of all its legs. Remember, people didn’t design this — the simulation and evolutionary algorithms came up with it by trying thousands of different behaviors over and over and keeping the ones that worked.
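The underlying recipe is simple to state, even if the physics simulation doing the heavy lifting is not: try a population of candidate behaviors, keep what works, perturb, repeat. A toy skeleton of that loop, assuming a simulator-backed `evaluate` and a `mutate` operator are supplied (ETH’s actual training setup is far more sophisticated):

```python
import random

def evolve_policy(evaluate, mutate, seed_params, population=64, generations=100):
    """Toy evolutionary search of the kind described above.

    evaluate(params) -> float   fitness from a physics-sim rollout
    mutate(params)   -> params  randomly perturbed copy of the parameters
    """
    pop = [mutate(seed_params) for _ in range(population)]
    for _ in range(generations):
        scored = sorted(pop, key=evaluate, reverse=True)
        elite = scored[: population // 4]  # keep the top quarter
        pop = elite + [
            mutate(random.choice(elite))   # refill with mutants of survivors
            for _ in range(population - len(elite))
        ]
    return max(pop, key=evaluate)
```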

Ikea assembly is the killer app

Let’s say you were given three bowls, with red and green balls in the center one. Then you’re given this on a sheet of paper:

As a human with a brain, you take this paper as instructions, and you understand that the green and red circles represent balls of those colors, and that red ones need to go to the left, while green ones go to the right.

This is one of those things where humans apply vast amounts of knowledge and intuitive understanding without even realizing it. How did you decide that the circles represent the balls? Because of the shape? Then why don’t the arrows refer to “real” arrows? How do you know how far to go to the right or left? How do you know the paper even refers to these items at all? All questions you would resolve in a fraction of a second, and any of which might stump a robot.
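To see how much of the difficulty lives in those questions, note that once the symbol-grounding is assumed away, the leftover logic is trivial. A toy sketch with a hypothetical scene format (nothing like Vicarious’s actual system):

```python
# Toy rendering of the sorting task in the example: map each diagram symbol
# (a colored circle plus an arrow) to a motion command for the matching
# real-world ball. The hard part described above, i.e. deciding that circles
# stand for balls at all, is assumed away here.
INSTRUCTIONS = {"red": "left", "green": "right"}  # assumed parse of the diagram

def plan_moves(scene: list[dict]) -> list[str]:
    """scene: detected objects, e.g. {'color': 'red', 'bowl': 'center'}."""
    moves = []
    for obj in scene:
        target = INSTRUCTIONS.get(obj["color"])
        if target and obj["bowl"] == "center":
            moves.append(f"move {obj['color']} ball to {target} bowl")
    return moves

print(plan_moves([{"color": "red", "bowl": "center"},
                  {"color": "green", "bowl": "center"}]))
```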

Researchers have taken some baby steps towards being able to connect abstract representations like the above with the real world, a task that involves a significant amount of what amounts to a sort of machine creativity or imagination.

Making the connection between a green dot on a white background in a diagram and a greenish roundish thing on a black background in the real world isn’t obvious, but the “visual cognitive computer” created by Miguel Lázaro-Gredilla and his colleagues at Vicarious AI seems to be doing pretty well at it.

It’s still very primitive, of course, but in theory it’s the same toolset that one uses to, for example, assemble a piece of Ikea furniture: look at an abstract representation, connect it to real-world objects, then manipulate those objects according to the instructions. We’re years away from that, but it wasn’t long ago that we were years away from a robot getting up from a fall or deciding a suction cup or pincer would work better to pick something up.

The papers and videos demonstrating all the concepts above should be available at the Science Robotics site.


Watch the ANYmal quadrupedal robot go for an adventure in the sewers of Zurich


There’s a lot of talk about the many potential uses of multi-legged robots like Cheetahbot and Spot — but in order for those to come to fruition, the robots actually have to go out and do stuff. And to train for a glorious future of sewer inspection (and helping rescue people, probably), this Swiss quadrupedal bot is going deep underground.

(Credit: ETH Zurich / Daniel Winkler)

The robot is called ANYmal, and it’s a long-term collaboration between the Swiss Federal Institute of Technology, abbreviated there as ETH Zurich, and a spin-off from the university called ANYbotics. Its latest escapade was a trip to the sewers below that city, where it could eventually aid or replace the manual inspection process.

ANYmal isn’t brand new — like most robot platforms, it’s been under constant revision for years. But it’s only recently that cameras and sensors like lidar have gotten good enough and small enough that real-world testing in a dark, slimy place like sewer pipes could be considered.

Most cities have miles and miles of underground infrastructure that can only be checked by expert inspectors. This is dangerous and tedious work — perfect for automation. Imagine if, instead of yearly inspections by people, robots were swinging by once a week. If anything looks off, the robot calls in the humans. It could also enter areas rendered inaccessible by disasters or simply too small for people to navigate safely.

But of course, before an army of robots can inhabit our sewers (where have I encountered this concept before? Oh yeah…) the robot needs to experience and learn about that environment. First outings will be only minimally autonomous, with more independence added as the robot and team gain confidence.

“Just because something works in the lab doesn’t always mean it will in the real world,” explained ANYbotics co-founder Peter Fankhauser in the ETHZ story.

Testing the robot’s sensors and skills in a real-world scenario provides new insights and tons of data for the engineers to work with. For instance, when the environment is completely dark, laser-based imaging may work, but what if there’s a lot of water, steam or smoke? ANYmal should also be able to feel its surroundings, its creators decided.

(Credit: ETH Zurich / Daniel Winkler)

So they tested both sensor-equipped feet (with mixed success) and the possibility of ANYmal raising its “paw” to touch a wall, to find a button or determine temperature or texture. This latter action had to be manually improvised by the pilots, but clearly it’s something it should be able to do on its own. Add it to the list!

You can watch “Inspector ANYmal’s” trip beneath Zurich in the video below.


Drones ground flights at UK’s second largest airport


Mystery drone operator/s have grounded flights at the U.K.’s second largest airport, disrupting the travel plans of hundreds of thousands of people hoping to get away over the festive period.

The BBC reports that Gatwick Airport’s runway has been shut since Wednesday night on safety grounds, after drones were spotted being flown repeatedly over the airfield.

It says airlines have been advised to cancel all flights up to at least 16:00 GMT, with the airport saying the runway would not open “until it was safe to do so.”

More than 20 police units are reported to be searching for the drone operator/s.

The U.K. amended existing legislation this year to make it illegal to fly a drone within 1km of an airport, after a planned drone bill got delayed.

The safety-focused tweak to the law five months ago also restricted drone flight height to 400 ft. A registration scheme for drone owners is also set to be introduced next year.
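Those two rules are easy to express programmatically. A minimal sketch, assuming approximate Gatwick coordinates and using the 1km/400 ft limits described above (an illustration, not any official geofencing implementation):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius, meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

GATWICK = (51.1537, -0.1821)  # approximate coordinates (assumed for illustration)
EXCLUSION_M = 1000            # 1km exclusion zone around airports
CEILING_M = 400 * 0.3048      # 400 ft flight ceiling, converted to meters

def flight_permitted(lat, lon, alt_m):
    """True only if outside the exclusion zone and below the ceiling."""
    clear_of_airport = haversine_m(lat, lon, *GATWICK) >= EXCLUSION_M
    below_ceiling = alt_m <= CEILING_M
    return clear_of_airport and below_ceiling
```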

Under current U.K. law, a drone operator who is charged with recklessly or negligently acting in a manner likely to endanger an aircraft or a person in an aircraft can face a penalty of up to five years in prison or an unlimited fine, or both.

Although, in the Gatwick case, it’s not clear whether simply flying a drone near a runway would constitute an attempt to endanger an aircraft under the law, the incident has clearly caused major disruption to travelers as the safety-conscious airport takes no chances.

Further adding to the misery of disrupted passengers today, the Civil Aviation Authority told the BBC it considered the event to be an “extraordinary circumstance” — meaning airlines aren’t obligated to pay financial compensation.

There’s been a marked rise in U.K. aircraft incidents involving drones over the past five years, with more than 100 recorded so far this year, according to data from the U.K. Airprox Board.

Aviation minister Baroness Sugg faced a barrage of questions about the Gatwick disruption in the House of Lords today, including accusations the government has dragged its feet on bringing in technical specifications that might have avoided the disruption.

“These drones are being operated illegally… It seems that the drones are being used intentionally to disrupt the airport, but, as I said, this is an ongoing investigation,” she told peers, adding: “We changed the law earlier this year, bringing in an exclusion zone around airports. We are working with manufacturers and retailers to ensure that the new rules are communicated to those who purchase drones.

“From November next year, people will need to register their drone and take an online safety test. We have also recently consulted on extending police powers and will make an announcement on next steps shortly.”

The minister was also pressed on what the government had done to explore counterdrone technology, which could be used to disable drones, with one peer noting they’d raised the very issue two years ago.

“My Lords, technology is rapidly advancing in this area,” responded Sugg. “That is absolutely something that we are looking at. As I said, part of the consultation we did earlier this year was on counterdrone technology and we will be announcing our next steps on that very soon.”

Another peer wondered whether techniques he said had been developed by the U.K. military and spy agency GCHQ — to rapidly identify the frequency a drone is operating on, and either jam it or take control and land it — will be “given more broadly to various airports”?

“All relevant parts of the Government, including the Ministry of Defence, are working on this issue today to try to resolve it as quickly as possible,” the minister replied. “We are working on the new technology that is available to ensure that such an incident does not happen again. It is not acceptable that passengers have faced such disruption ahead of Christmas and we are doing all we can to resolve it as quickly as possible.”
