robotics

Calling all hardware startups! Apply to Hardware Battlefield @ TC Shenzhen

Got hardware? Well then, listen up, because our search continues for boundary-pushing, early-stage hardware startups to join us in Shenzhen, China for an epic opportunity: launch your startup on a global stage and compete in Hardware Battlefield at TC Shenzhen on November 11-12.

Apply here to compete in TC Hardware Battlefield 2019. Why? It’s your chance to demo your product to the top investors and technologists in the world. Hardware Battlefield, cousin to Startup Battlefield, focuses exclusively on innovative hardware because, let’s face it, it’s the backbone of technology. From enterprise solutions to agtech advancements, medical devices to consumer product goods — hardware startups are in the international spotlight.

If you make the cut, you’ll compete against 15 of the world’s most innovative hardware makers for bragging rights, plenty of investor love, media exposure and $25,000 in equity-free cash. Just participating in a Battlefield can change the whole trajectory of your business in the best way possible.

We chose to bring our fifth Hardware Battlefield to Shenzhen because of its outstanding track record of supporting hardware startups. The city achieves this through a combination of accelerators, rapid prototyping and world-class manufacturing. What’s more, TC Hardware Battlefield 2019 takes place as part of the larger TechCrunch Shenzhen that runs November 9-12.

Creativity and innovation know no boundaries, and that’s why we’re opening this competition to any early-stage hardware startup from any country. While we’ve seen amazing hardware in previous Battlefields — like robotic arms, food testing devices, malaria diagnostic tools, smart socks for diabetics and e-motorcycles — we can’t wait to see the next generation of hardware, so bring it on!

Meet the minimum requirements, and we’ll consider your startup:

Here’s how Hardware Battlefield works. TechCrunch editors vet every qualified application and pick 15 startups to compete. Those startups receive six rigorous weeks of free coaching. Forget stage fright. You’ll be prepped and ready to step into the spotlight.

Teams have six minutes to pitch and demo their products, followed immediately by an in-depth Q&A with the judges. If you make it to the final round, you’ll repeat the process in front of a new set of judges.

The judges will name one outstanding startup the Hardware Battlefield champion. Hoist the Battlefield Cup, claim those bragging rights and the $25,000. This nerve-wracking thrill-ride takes place in front of a live audience, and we capture the entire event on video and post it to our global audience on TechCrunch.

Hardware Battlefield at TC Shenzhen takes place on November 11-12. Don’t hide your hardware or miss your chance to show us — and the entire tech world — your startup magic. Apply to compete in TC Hardware Battlefield 2019, and join us in Shenzhen!

Is your company interested in sponsoring or exhibiting at Hardware Battlefield at TC Shenzhen? Contact our sponsorship sales team by filling out this form.

These robo-ants can work together in swarms to navigate tricky terrain

While the agility of a Spot or Atlas robot is something to behold, there’s a special merit reserved for tiny, simple robots that work not as a versatile individual but as an adaptable group. These “tribots” are built on the model of ants, and like them can work together to overcome obstacles with teamwork.

Developed by EPFL and Osaka University, tribots are tiny, light and simple, moving more like inchworms than ants, but able to fling themselves up and forward if necessary. The bots themselves and the system they make up are modeled on trap-jaw ants, which alternate between crawling and jumping, and work (as do most other ants) in fluid roles like explorer, worker and leader. Each robot is not itself very intelligent, but they are controlled as a collective that deploys their abilities intelligently.

In this case a team of tribots might be expected to get from one end of a piece of complex terrain to another. An explorer could move ahead, sensing obstacles and relaying their locations and dimensions to the rest of the team. The leader can then assign worker units to head over to try to push the obstacles out of the way. If that doesn’t work, an explorer can try hopping over it — and if successful, it can relay its telemetry to the others so they can do the same thing.
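
To make the division of labor concrete, here is a minimal sketch of how such role-based coordination might be structured. It is illustrative only: the role names, the height threshold and the report format are assumptions for the example, not the EPFL team’s actual control scheme (that is described in their paper).

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    position_m: float  # distance along the route, meters (hypothetical units)
    height_cm: float   # obstacle height, centimeters

class Tribot:
    def __init__(self, role):
        self.role = role  # "explorer", "leader" or "worker"

class Swarm:
    """Toy model of leader-directed task allocation among tribots."""
    def __init__(self, bots):
        self.bots = bots

    def handle_report(self, obstacle: Obstacle):
        # Leader's decision rule (assumed): low obstacles get pushed
        # aside by workers; taller ones are hopped over by explorers,
        # which then relay their telemetry to the rest of the team.
        if obstacle.height_cm < 2.0:
            return [b for b in self.bots if b.role == "worker"]
        return [b for b in self.bots if b.role == "explorer"]

swarm = Swarm([Tribot("explorer"), Tribot("leader"), Tribot("worker")])
assigned = swarm.handle_report(Obstacle(position_m=1.5, height_cm=3.2))
print(f"{len(assigned)} bot(s) assigned to hop the obstacle")
```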

Fly, tribot, fly!

It’s all done quite slowly at this point — you’ll notice that in the video, much of the action is happening at 16x speed. But rapidity isn’t the idea here; similar to Squishy Robotics’ creations, it’s more about adaptability and simplicity of deployment.

The little bots weigh only 10 grams each, and are easily mass-produced, as they’re basically PCBs with some mechanical bits and grip points attached — “a quasi-two-dimensional metamaterial sandwich,” according to the paper. If they only cost (say) a buck each, you could drop dozens or hundreds on a target area and over an hour or two they could characterize it, take measurements and look for radiation or heat hot spots, and so on.

If they moved a little faster, the same logic and a modified design could let a set of robots emerge in a kitchen or dining room to find and collect crumbs or scoot plates into place. (Ray Bradbury called them “electric mice” or something in “There Will Come Soft Rains,” one of my favorite stories of his. I’m always on the lookout for them.)

Swarm-based bots have the advantage of not failing catastrophically when something goes wrong — when a robot fails, the collective persists, and it can be replaced as easily as a part.

“Since they can be manufactured and deployed in large numbers, having some ‘casualties’ would not affect the success of the mission,” noted EPFL’s Jamie Paik, who co-designed the robots. “With their unique collective intelligence, our tiny robots can demonstrate better adaptability to unknown environments; therefore, for certain missions, they would outperform larger, more powerful robots.”

It raises the question, in fact, of whether the sub-robots themselves constitute a sort of uber-robot. (This is more of a philosophical question, raised first in the case of the Constructicons and Devastator. Transformers was ahead of its time in many ways.)

The robots are still in prototype form, but even as they are, they constitute a major advance over other “collective” type robot systems. The team documents its advances in a paper published in the journal Nature.

Luminar eyes production vehicles with $100M round and new Iris lidar platform

Luminar is one of the major players in the new crop of lidar companies that have sprung up all over the world, and it’s moving fast to outpace its peers. Today the company announced a new $100 million funding round, bringing its total raised to more than $250 million — as well as a perception platform and a new, compact lidar unit aimed at inclusion in actual cars. Big day!

The new hardware, called Iris, looks to be about a third of the size of the test unit Luminar has been sticking on vehicles thus far. That one was about the size of a couple of hardbacks stacked up, and Iris is more like a really thick sandwich.

Size is very important, of course, as few cars just have caverns of unused space hidden away in prime surfaces like the corners and windshield area. Other lidar makers have lowered the profiles of their hardware in various ways; Luminar seems to have compactified in a fairly straightforward fashion, getting everything into a package smaller in every dimension.

Test model, left, Iris on the right.

Photos of Iris put it in various positions: below the headlights on one car, attached to the rear-view mirror in another and high up atop the cabin on a semi truck. It’s small enough that it won’t have to displace other components too much, although of course competitors are aiming to make theirs even more easy to integrate. That won’t matter, Luminar founder and CEO Austin Russell told me recently, if they can’t get it out of the lab.

“The development stage is a huge undertaking — to actually move it towards real-world adoption and into true series production vehicles,” he said (among many other things). The company that gets there first will lead the industry, and naturally he plans to make Luminar that company.

Part of that is of course the production process, which has been vastly improved over the last couple of years. These units can be made quickly enough that they can be supplied by the thousands rather than dozens, and the cost has dropped precipitously — by design.

Iris will cost less than $1,000 per unit for production vehicles seeking serious autonomy, and for $500 you can get a more limited version for more limited purposes like driver assistance, or ADAS. Luminar says Iris is “slated to launch commercially on production vehicles beginning in 2022,” but that doesn’t necessarily mean they’re shipping to customers right now. The company is negotiating more than a billion dollars in contracts at present, a representative told me, and 2022 would be the earliest that vehicles with Iris could be made available.

The Iris units are about a foot below the center of the headlight units here. Note that this is not a production vehicle, just a test one.

Another part of integration is software. The signal from the sensor has to go somewhere, and while some lidar companies have indicated they plan to let the carmaker or whoever deal with it their own way, others have opted to build up the tech stack and create “perception” software on top of the lidar. Perception software can be a range of things: something as simple as drawing boxes around objects identified as people would count, as would a much richer process that flags intentions and gaze directions, characterizes motions, predicts likely next actions and so on.
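
To picture the “simple” end of that spectrum, here is a generic sketch of what a boxes-and-labels perception output might look like. This is not Luminar’s actual API; the class and field names are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class BoundingBox3D:
    # Generic perception output: a labeled box in sensor coordinates.
    label: str         # e.g. "pedestrian", "vehicle"
    center: tuple      # (x, y, z) position, meters
    size: tuple        # (length, width, height), meters
    confidence: float  # detection confidence, 0.0-1.0

def detections_near(boxes, max_range_m):
    """Filter detections within a given range of the sensor."""
    def dist(b):
        x, y, z = b.center
        return (x * x + y * y + z * z) ** 0.5
    return [b for b in boxes if dist(b) <= max_range_m]

boxes = [BoundingBox3D("pedestrian", (4.0, 1.2, 0.0), (0.5, 0.5, 1.8), 0.91)]
print(detections_near(boxes, max_range_m=10.0))
```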

Luminar has opted to build into perception, or rather has revealed that it has been working on it for some time. It now has 60 people on the task, split between Palo Alto and Orlando, and has hired a new VP of software: Christoph Schroder, formerly head of Daimler’s robo-taxi program.

What exactly will be the nature and limitations of Luminar’s perception stack? There are dangers waiting if you decide to take it too far, because at some point you begin to compete with your customers, carmakers that have their own perception and control stacks that may or may not overlap with yours. The company gave very few details as to what specifically would be covered by its platform, but no doubt that will become clearer as the product itself matures.

Last and certainly not least is the matter of the $100 million in additional funding. This brings Luminar to a total of over a quarter of a billion dollars in the last few years, matching its competitor Innoviz, which has made similar decisions regarding commercialization and development.

The list of investors has gotten quite long, so I’ll just quote Luminar here:

G2VP, Moore Strategic Ventures, LLC, Nick Woodman, The Westly Group, 1517 Fund / Peter Thiel, Canvas Ventures, along with strategic investors Corning Inc, Cornes, and Volvo Cars Tech Fund.

The board has also grown, with former Broadcom exec Scott McGregor and G2VP’s Ben Kortlang joining the table.

We may have already passed “peak lidar” as far as sheer number of deals and startups in the space, but that doesn’t mean things are going to cool down. If anything, the opposite, as established companies battle over lucrative partnerships and begin eating one another to stay competitive. Seems like Luminar has no plans to become a meal.

NASA picks a dozen science and tech projects to bring to the surface of the Moon

With the Artemis mission scheduled to put boots on lunar regolith as soon as 2024, NASA has a lot of launching to do — and you can be sure none of those launches will go to waste. The agency just announced 12 new science and technology projects to send to the Moon’s surface, including a new rover.

The 12 projects are being sent up as part of the Commercial Lunar Payload Services program, which is — as NASA Administrator Jim Bridenstine has emphasized strongly — part of an intentional increase in reliance on private companies. If a company already has a component or rover or craft ready to go that meets a program’s requirements, why should NASA build one from scratch at great cost?

In this case, the selected projects cover a wide range of origins and intentions. Some are repurposed or spare parts from other missions, like the Lunar Surface Electromagnetics Experiment. LuSEE is related to the Parker Solar Probe and STEREO/Waves instruments and pieces from MAVEN, re-engineered to make observations and measurements on the Moon.

Others are quite new. Astrobotic, which was also recently awarded an $80 million contract to develop its Peregrine lunar lander, will now also be putting together a rover, which it calls MoonRanger (no relation to the NES game). This little bot will autonomously traverse the landscape within half a mile or so of its base and map it in 3D.

The new funding from NASA amounts to $5.6 million, which isn’t a lot to develop a lunar rover from scratch — no doubt it’s using its own funds and working with its partner, Carnegie Mellon University, to make sure the rover isn’t a bargain-bin device. With veteran rover engineer Red Whittaker on board, it should be a good one.

“MoonRanger offers a means to accomplish far-ranging science of significance, and will exhibit an enabling capability on missions to the Moon for NASA and the commercial sector. The autonomy techniques demonstrated by MoonRanger will enable new kinds of exploration missions that will ultimately herald in a new era on the Moon,” said Whittaker in an Astrobotic news release.

The distance to the lunar surface isn’t so far that controlling a rover directly from Earth is nearly impossible, like it is on Mars, but if it can go from here to there without someone in Houston twiddling a joystick, why shouldn’t it?
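
The latency numbers back that up. Round-trip light time sets a hard floor on teleoperation delay, and a quick calculation (using the approximate average Earth-Moon distance and Mars at closest approach) shows the Moon is merely awkward where Mars is prohibitive:

```python
C = 299_792_458  # speed of light, m/s

def round_trip_seconds(distance_m):
    """Minimum command-and-response delay over a given distance."""
    return 2 * distance_m / C

moon = 384_400e3     # average Earth-Moon distance, meters
mars_close = 54.6e9  # Mars at closest approach, meters

print(f"Moon round trip: {round_trip_seconds(moon):.1f} s")             # ~2.6 s
print(f"Mars round trip: {round_trip_seconds(mars_close) / 60:.1f} min")  # ~6.1 min
```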

To be clear, this is different from the upcoming CubeRover project and others that are floating around in Astrobotic and Whittaker’s figurative orbits.

“MoonRanger is a 13 kg microwave-sized rover with advanced autonomous capabilities,” Astrobotic’s Mike Provenzano told me. “The CubeRover is a 2 kg shoebox-sized rover developed for light payloads and geared for affordable science and exploration activities.”

While both have flight contracts, CubeRover is scheduled to go up on the first Peregrine mission in 2021, while MoonRanger is TBD.

Another NASA selection is the Planetary Science Institute’s Heimdall, a new camera system that will point downward during the lander’s descent and collect super-high-resolution imagery of the regolith before, during and after landing.

“The camera system will return the highest resolution images of the undisturbed lunar surface yet obtained, which is important for understanding regolith properties. We will be able to essentially video the landing in high resolution for the first time, so we can understand how the plume behaves – how far it spreads, how long particles are lofted. This information is crucial for the safety of future landings,” said the project’s R. Aileen Yingst in a PSI release.

The regolith is naturally the subject of much curiosity, since if we’re to establish a semi-permanent presence on the Moon we’ll have to deal with it one way or another. So projects like Honeybee’s PlanetVac, which can suck up and test materials right at landing, or the Regolith Adherence Characterization, which will see how the stuff sticks to various materials, will be invaluable.

RadSat-G deployed from the ISS for its year-long mission to test radiation tolerance on its computer systems

Several of the projects continue existing work that is a great fit for lunar missions. For example, the lunar surface is constantly being bombarded with all kinds of radiation, since the Moon lacks any kind of atmosphere. That’s not a problem for machinery like wheels or even solar cells, but for computers, radiation can be highly destructive. So Brock LaMere’s work in radiation-tolerant computers will be highly relevant to landers, rovers and payloads.

LaMere’s work has already been tested in space via the Nanoracks facility aboard the International Space Station, and the new NASA funding will allow it to be tested on the lunar surface. If we’re going to be sending computers up there that people’s lives will depend on, we’d better be completely sure they aren’t going to crash because of a random EM flux.
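
The articles here don’t detail LaMere’s specific designs, but a classic illustration of radiation tolerance in general is triple modular redundancy: compute a result three times (or on three units) and take a majority vote, so a single radiation-induced bit flip gets outvoted. A minimal sketch of the voting idea, not a description of the actual flight hardware:

```python
from collections import Counter

def tmr_vote(a, b, c):
    """Majority vote across three redundant results.

    A single upset corrupts at most one copy, so the majority value
    survives; if all three disagree, flag a multi-fault condition.
    """
    value, count = Counter([a, b, c]).most_common(1)[0]
    if count >= 2:
        return value
    raise RuntimeError("no majority -- multiple faults detected")

# One copy hit by a bit flip; the vote still yields the right answer.
print(tmr_vote(42, 42, 43))  # -> 42
```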

The rest of the projects are characterized here, with varying degrees of detail. No doubt we’ll learn more soon as the funding disbursed by NASA over the next year or so helps flesh them out.

Team studies drone strikes on airplanes by firing them into a wall at 500 MPH

Bird strikes are a very real danger to planes in flight, and consequently aircraft are required to undergo bird strike testing — but what about drones? With UAV interference at airports on the rise, drone strike testing may soon be likewise mandatory, and if it’s anything like what these German researchers are doing, it’ll involve shooting the craft out of air cannons at high speed.

The work being done at Fraunhofer EMI in Freiburg is meant to establish some basic parameters for how these things ought to be tested.

Bird strikes, for example, are tested by firing a frozen bird, such as a chicken or turkey, out of an air cannon. It’s not pretty, but it has to be done. Even so, it’s not a very good analogue to a drone strike.

“From a mechanical point of view, drones behave differently to birds and also weigh considerably more. It is therefore uncertain whether an aircraft that has been successfully tested against bird strike would also survive a collision with a drone,” explained Fraunhofer’s Sebastian Schopferer in a news release.

The team chose to load an air cannon with drone batteries and motors, since those make up most of any given UAV’s mass. The propellers and arms on which they’re mounted are generally pretty light and will break easily — compared with a battery weighing the better part of a kilogram, they won’t add much to the damage.

The remains of a drone motor and battery after being propelled into the plate on the left at hundreds of miles per hour

The drones were fired at speeds from 250 to 570 miles per hour (115 to 255 meters per second by their measurement) at aluminum plates up to 8 millimeters thick. Unsurprisingly, there was “substantial deformation” of the plates, and the wingless drones were “completely destroyed.” Said destruction was recorded by a high-speed camera, though unfortunately the footage was not made available.
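
To get a feel for the energies involved, consider a battery pack at the tests’ top speed. A rough calculation, with the pack mass assumed at 0.7 kg (“the better part of a kilogram” from above):

```python
def kinetic_energy_joules(mass_kg, speed_m_s):
    """Kinetic energy E = 1/2 * m * v^2."""
    return 0.5 * mass_kg * speed_m_s ** 2

battery_kg = 0.7   # assumed pack mass
top_speed = 255.0  # m/s, the tests' upper bound

e = kinetic_energy_joules(battery_kg, top_speed)
print(f"{e / 1000:.1f} kJ")  # ~22.8 kJ, several times a rifle round's muzzle energy
```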

It’s necessary to do a variety of tests to determine what’s practical and what’s unnecessary or irrelevant — why spend the extra time and money firing the drones at 570 mph when 500 does the same level of damage? Does including the arms and propellers make a difference? At what speed is the plate in danger of being pierced, necessitating additional protective measures? And so on. A new rig is being constructed that will allow acceleration (and deceleration) of larger UAVs.

With enough testing the team hopes that not only could such things be standardized, but simulations could be built that would allow engineers to virtually test different surfaces or materials without a costly and explosive test rig.

Startups at the speed of light: Lidar CEOs put their industry in perspective

As autonomous cars and robots loom over the landscapes of cities and jobs alike, the technologies that empower them are forming sub-industries of their own. One of those is lidar, which has become an indispensable tool to autonomy, spawning dozens of companies and attracting hundreds of millions in venture funding.

But like all industries built on top of fast-moving technologies, lidar and the sensing business is by definition built somewhat upon a foundation of shifting sands. New research appears weekly advancing the art, and no less frequently are new partnerships minted, as car manufacturers like Audi and BMW scramble to keep ahead of their peers in the emerging autonomy economy.

To compete in the lidar industry means not just to create and follow through on difficult research and engineering, but to be prepared to react with agility as the market shifts in response to trends, regulations, and disasters.

I talked with several CEOs and investors in the lidar space to find out how the industry is changing, how they plan to compete, and what the next few years have in store.

Their opinions and predictions sometimes synced up and at other times diverged completely. For some, the future lies manifestly in partnerships they have already established and hope to nurture, while others feel that it’s too early for automakers to commit, and they’re stringing startups along one non-exclusive contract at a time.

All agreed that the technology itself is obviously important, but not so important that investors will wait forever for engineers to get it out of the lab.

And while some felt a sensor company has no business building a full-stack autonomy solution, others suggested that’s the only way to attract customers navigating a strange new market.

It’s a flourishing market but one, they all agreed, that will experience a major consolidation in the next year. In short, it’s a wild west of ideas, plentiful money, and a bright future — for some.

The evolution of lidar

I’ve previously written an introduction to lidar, but in short, lidar units project lasers out into the world and measure how they are reflected, producing a 3D picture of the environment around them.
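
The underlying measurement is simple: a pulse travels out and back at the speed of light, so range is half the round-trip time multiplied by c. In code:

```python
C = 299_792_458  # speed of light, m/s

def range_from_tof(round_trip_s):
    """Target range in meters from a time-of-flight measurement."""
    return C * round_trip_s / 2

# A return arriving ~667 nanoseconds after the pulse left
# corresponds to a target about 100 meters away.
print(f"{range_from_tof(667e-9):.1f} m")
```

Everything else in a lidar design is in service of making that one measurement fast, dense and reliable.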

NASA’s Dragonfly will fly across the surface of Titan, Saturn’s ocean moon

NASA has just announced its next big interplanetary mission: Dragonfly, which will deliver a Mars rover-sized flying vehicle to the surface of Titan, a moon of Saturn with tantalizing life-supporting qualities. The craft will fly from place to place, sampling the delicious organic surface materials and sending high-resolution pictures back to Earth.

Dragonfly will launch in 2026, taking eight years to reach Titan and land (if all goes well) in 2034. So there will be plenty more updates after this one!

The craft will parachute through Titan’s hazy atmosphere and land among its dune-filled equatorial region. It’s equipped with drills and probes to investigate the surface, and of course cameras to capture interesting features and the surrounding alien landscape, flying from place to place using a set of rotors like a drone’s.

We’ve observed Titan from above via the Cassini mission, and we’ve even touched down on its surface briefly with the Huygens probe — which for all we know is still sitting there. But this will be a much more in-depth look at this fascinating moon.

Titan is a weird place. With rivers, oceans, and abundant organic materials on the surface, it’s very like Earth in some ways — but you wouldn’t want to live there. The rivers are liquid methane, for one thing, and if you’re familiar with methane, you’ll know that means it’s really cold there.

Nevertheless, Titan is still an interesting analogue to early Earth.

“We know that Titan has rich organic material, very complex organic material on the surface; there’s energy in the form of sunlight; and we know there’s been water on the surface in the past. These ingredients that we know are necessary for the development of life as we know it are sitting on the surface of Titan,” said principal investigator Elizabeth Turtle. “They’ve been doing chemistry experiments, basically, for hundreds of millions of years, and Dragonfly is designed to go pick up the results of those experiments.”

Don’t expect a flourishing race of methane-dwelling microbes, though. It’s more like going back in time to pre-life Earth to see what conditions may have resulted in the earliest complex self-replicating molecules: the origin of the origin of life, if you will.

Principal investigator Elizabeth Turtle shows off a 1/4 scale model of the Dragonfly craft.

To do so, Dragonfly, true to its name, will be flitting around the surface to collect data from many different locations. It may seem that something the size of a couch would have trouble lifting off, but as Turtle explained, it’s actually a lot easier to fly around Titan than to roll. With a far thicker atmosphere (mostly nitrogen, like ours) and a fraction of Earth’s gravity, it’ll be more like traveling through water than air.

That explains why its rotors are so small — for something that big on Earth, you’d need huge powerful rotors working full time. But even one of these little rotors can shift the craft if necessary (though they’ll want all eight for lift and redundancy).
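
Actuator-disk theory makes the comparison concrete: ideal hover power scales with gravity to the 3/2 power and inversely with the square root of atmospheric density. A back-of-the-envelope estimate with textbook values for Titan (and the same craft mass and rotor area on both worlds):

```python
# Ideal (actuator-disk) hover power: P = sqrt((m*g)**3 / (2 * rho * A)).
# For a fixed craft, the Titan/Earth power ratio reduces to the terms below.

g_earth, g_titan = 9.81, 1.35        # surface gravity, m/s^2
rho_earth, rho_titan = 1.225, 5.43   # atmospheric density, kg/m^3

ratio = (g_titan / g_earth) ** 1.5 * (rho_earth / rho_titan) ** 0.5
print(f"Titan hover power ~ {ratio:.1%} of Earth's")  # roughly 2-3%
```

Hovering the same vehicle on Titan takes only a few percent of the power it would on Earth, which is why those stubby rotors suffice.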

We’ll learn more soon, no doubt. This is just the opening salvo from NASA on what will surely be years of further highlights, explanations, and updates on Dragonfly’s creation and launch.

“It’s remarkable to think of this rotorcraft flying miles and miles across the organic sand dunes of Saturn’s largest moon, exploring the processes that shape this extraordinary environment,” said NASA associate administrator for science Thomas Zurbuchen. “Titan is unlike any other place in the solar system, and Dragonfly is like no other mission.”

Tiny Robobee X-Wing powers its flight with light

We’ve seen Harvard’s Robobee flying robot evolve for years: After first learning to fly, it learned to swim in 2015, then to jump out of the water again in 2017 — and now it has another trick up its non-existent sleeve. The Robobee X-Wing can fly using only the power it collects from light hitting its solar cells, making it possible to stay in the air indefinitely.

Achieving flight at this scale is extremely hard. You might think that being small, it would be easy to take off and maintain flight, like an insect does. But self-powered flight actually gets much harder the smaller you go, which puts insects among the most bafflingly marvelous feats of engineering we have encountered in nature.

Oh, it’s easy enough to fly when you have a wire feeding you electricity to power a pair of tiny wings — and that’s how the Robobee and others flew before. It’s only very recently that researchers have accomplished meaningful flight using on-board power or, in one case, a laser zapping an attached solar panel.

The new Robobee X-Wing (named for its four-wing architecture) achieves a new milestone with the ability to fly with no battery and no laser — only plain full-spectrum light coming from above. Brighter than sunlight, to be fair — but close to real-world conditions.

The team at Harvard’s Microrobotics Laboratory accomplished this by making the power conversion and wing mechanical systems incredibly lightweight — the whole thing weighs about a quarter of a gram, or about half a paper clip. Its power consumption is likewise lilliputian:

Consuming only 110–120 milliwatts of power, the system matches the thrust efficiency of similarly sized insects such as bees. This insect-scale aerial vehicle is the lightest thus far to achieve sustained untethered flight (as opposed to impulsive jumping or liftoff).

That last bit is some shade thrown at its competitors, which by nature can’t quite achieve “sustained untethered flight,” though what constitutes that isn’t exactly clear. After all, this Dutch flapping flyer can go a kilometer on battery power. If that isn’t sustained, I don’t know what is.
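
Quibbles aside, the reported figures pencil out to a specific power in the hundreds of watts per kilogram, which helps explain why the solar cells need brighter-than-sunlight illumination. Quick arithmetic from the numbers above:

```python
mass_g = 0.25     # vehicle mass, about a quarter of a gram
power_mw = 115.0  # midpoint of the reported 110-120 mW

# Specific power: watts of input per kilogram of vehicle.
specific_power = (power_mw / 1000) / (mass_g / 1000)
print(f"{specific_power:.0f} W/kg")  # ~460 W/kg
```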

In the video of the Robobee you can see that when it is activated, it shoots up like a bottle rocket. One thing they don’t really have space for on the robot’s little body (yet) is sophisticated flight control electronics and power storage that could let it use only the energy it needs, flapping in place.

That’s probably the next step for the team, and it’s a non-trivial one: adding weight and new systems completely changes the device’s flight profile. But give them a few months or a year and this thing will be hovering like a real dragonfly.

The Robobee X-Wing is exhaustively described in a paper published in the journal Nature.

This robot crawls along wind turbine blades looking for invisible flaws

Wind turbines are a great source of clean power, but their apparent simplicity — just a big thing that spins — belies complex systems that wear down like any other, and can fail with disastrous consequences. Sandia National Labs researchers have created a robot that can inspect the enormous blades of turbines autonomously, helping keep our green power infrastructure in good shape.

The enormous towers that collect energy from wind currents are often only in our view for a few minutes as we drive past. But they must stand for years through inclement weather, temperature extremes, and naturally — being the tallest things around — lightning strikes. Combine that with normal wear and tear and it’s clear these things need to be inspected regularly.

But such inspections can be both difficult and superficial. The blades themselves are among the largest single objects manufactured on the planet, and they’re often installed in distant or inaccessible areas, like the many you see offshore.

“A blade is subject to lightning, hail, rain, humidity and other forces while running through a billion load cycles during its lifetime, but you can’t just land it in a hangar for maintenance,” explained Sandia’s Joshua Paquette in a news release. In other words, not only do crews have to go to the turbines to inspect them, but they often have to do those inspections in place — on structures hundreds of feet tall and potentially in dangerous locations.

Using a crane is one option, but the blade can also be oriented downwards so an inspector can rappel along its length. Even then the inspection may be no more than eyeballing the surface.

“In these visual inspections, you only see surface damage. Often though, by the time you can see a crack on the outside of a blade, the damage is already quite severe,” said Paquette.

Obviously better and deeper inspections are needed, and that’s what the team decided to work on, with partners International Climbing Machines and Dophitech. The result is this crawling robot, which can move along a blade slowly but surely, documenting it both visually and using ultrasonic imaging.

A visual inspection will see cracks or scuffs on the surface, but the ultrasonics penetrate deep into the blades, making them capable of detecting damage to interior layers well before it’s visible outside. And it can do it largely autonomously, moving a bit like a lawnmower: side to side, bottom to top.
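
That scan pattern (sweep across, step up, sweep back) is the classic boustrophedon coverage path. A minimal generator, with the grid dimensions as stand-in parameters rather than anything from Sandia’s actual system:

```python
def lawnmower_path(cols, rows):
    """Yield (col, row) cells in a side-to-side, bottom-to-top sweep.

    Even rows go left to right, odd rows right to left, so the
    crawler never retraces ground between passes.
    """
    for row in range(rows):
        cells = range(cols) if row % 2 == 0 else reversed(range(cols))
        for col in cells:
            yield col, row

# e.g. a coarse 4-wide, 3-high grid over one blade section
print(list(lawnmower_path(4, 3)))
```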

Of course at this point it does it quite slowly and requires human oversight, but that’s because it’s fresh out of the lab. In the near future teams could carry around a few of these things, attach one to each blade, and come back a few hours or days later to find problem areas marked for closer inspection or scanning. Perhaps a crawler robot could even live onboard the turbine and scurry out to check each blade on a regular basis.

Another approach the researchers took was drones — a natural enough solution, since the versatile fliers have been pressed into service for inspection of many other structures that are dangerous for humans to get around: bridges, monuments, and so on.

These drones would be equipped with high-resolution cameras and infrared sensors that detect the heat signatures in the blade. The idea is that as warmth from sunlight diffuses through the material of the blade, it will do so irregularly in spots where damage below the surface has changed its thermal properties.

As automation of these systems improves, the opportunities open up: A quick pass by a drone could let crews know whether any particular tower needs closer inspection, then trigger the live-aboard crawler to take a closer look. Meanwhile the humans are on their way, arriving with a better picture of what needs to be done and no need to risk life and limb just to take a look.

Sense Photonics flashes onto the lidar scene with a new approach and $26M

Lidar is a critical part of many autonomous cars and robotic systems, but the technology is also evolving quickly. A new company called Sense Photonics just emerged from stealth mode today with a $26M A round, touting a whole new approach that allows for an ultra-wide field of view and (literally) flexible installation.

Still in the prototype phase but clearly far enough along to attract eight figures of investment, Sense Photonics’ lidar doesn’t look dramatically different from others at first, but the changes are both under the hood and, in a way, on both sides of it.

Early popular lidar systems like those from Velodyne use a spinning module that emits and detects infrared laser pulses, finding the range of the surroundings by measuring the light’s time of flight. Subsequent ones have replaced the spinning unit with something less mechanical, like a DLP-type mirror or even metamaterials-based beam steering.

All these systems are “scanning” systems in that they sweep a beam, column, or spot of light across the scene in some structured fashion — faster than we can perceive, but still piece by piece. Few companies, however, have managed to implement what’s called “flash” lidar, which illuminates the whole scene with one giant, well, flash.

That’s what Sense has created, and it claims to have avoided the usual shortcomings of such systems — namely limited resolution and range. Not only that, but by separating the laser emitting part and the sensor that measures the pulses, Sense’s lidar could be simpler to install without redesigning the whole car around it.

I talked with CEO and co-founder Scott Burroughs, a veteran engineer of laser systems, about what makes Sense’s lidar a different animal from the competition.

“It starts with the laser emitter,” he said. “We have some secret sauce that lets us build a massive array of lasers — literally thousands and thousands, spread apart for better thermal performance and eye safety.”

These tiny laser elements are stuck on a flexible backing, meaning the array can be curved — providing a vastly improved field of view. Lidar units (except for the 360-degree ones) tend to be around 120 degrees horizontally, since that’s what you can reliably get from a sensor and emitter on a flat plane, and perhaps 50 or 60 degrees vertically.

“We can go as high as 90 degrees vertical, which I think is unprecedented, and as high as 180 degrees horizontal,” said Burroughs proudly. “And that’s something automakers we’ve talked to have been very excited about.”

Here it is worth mentioning that lidar systems have also begun to bifurcate into long-range, forward-facing lidar (like those from Luminar and Lumotive) for detecting things like obstacles or people 200 meters down the road, and more short-range, wider-field lidar for more immediate situational awareness — a dog behind the vehicle as it backs up, or a car pulling out of a parking spot just a few meters away. Sense’s devices are very much geared toward the second use case.

These are just prototype units, but they work and you can see they’re more than just renders.

Particularly because of the second interesting innovation they’ve included: the sensor, normally part and parcel with the lidar unit, can exist totally separately from the emitter, and is little more than a specialized camera. That means the emitter can be integrated into a curved surface like the headlight assembly, while the tiny detectors can be stuck in places where there are already traditional cameras: side mirrors, bumpers and so on.

The camera-like architecture is more than convenient for placement; it also fundamentally affects the way the system reconstructs the image of its surroundings. Because the sensor they use is so close to an ordinary RGB camera’s, images from the former can be matched to the latter very easily.

The depth data and traditional camera image correspond pixel-to-pixel right out of the system.

Most lidars output a 3D point cloud, the result of the beam finding millions of points with different ranges. This is a very different form of “image” than a traditional camera, and it can take some work to convert or compare the depths and shapes of a point cloud to a 2D RGB image. Sense’s unit not only outputs a 2D depth map natively, but that data can be synced with a twin camera so the visible light image matches pixel for pixel to the depth map. It saves on computing time and therefore on delay — always a good thing for autonomous platforms.

Sense Photonics’ unit also can output a point cloud, as you see here.
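
As for the pixel-aligned output described above: when depth arrives as an image on the same grid as the camera frame, fusion can be as cheap as stacking arrays, with no reprojection or point-cloud matching step. A NumPy sketch, with shapes chosen purely for illustration:

```python
import numpy as np

h, w = 480, 640                             # illustrative sensor resolution
rgb = np.zeros((h, w, 3), dtype=np.uint8)   # camera frame
depth = np.zeros((h, w), dtype=np.float32)  # pixel-aligned depth map, meters

# Pixel-for-pixel correspondence means depth can simply become a
# fourth channel alongside the color data.
rgbd = np.dstack([rgb.astype(np.float32), depth])
print(rgbd.shape)  # (480, 640, 4)
```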

The benefits of Sense’s system are manifest, but of course right now the company is still working on getting the first units to production. To that end it has of course raised the $26 million A round, “co-led by Acadia Woods and Congruent Ventures, with participation from a number of other investors, including Prelude Ventures, Samsung Ventures and Shell Ventures,” as the press release puts it.

Cash on hand is always good. But it has also partnered with Infineon and others, including an unnamed tier-1 automotive company, which is no doubt helping shape the first commercial Sense Photonics product. The details will have to wait until later this year when that offering solidifies, and production should start a few months after that — no hard timeline yet, but expect this all before the end of the year.

“We are very appreciative of this strong vote of investor confidence in our team and our technology,” Burroughs said in the press release. “The demand we’ve encountered – even while operating in stealth mode – has been extraordinary.”
