Intellectual Ventures

Gates-backed Lumotive upends lidar conventions using metamaterials


Pretty much every self-driving car on the road, not to mention many a robot and drone, uses lidar to sense its surroundings. But useful as lidar is, it also involves physical compromises that limit its capabilities. Lumotive is a new company with funding from Bill Gates and Intellectual Ventures that uses metamaterials to exceed those limits, perhaps setting a new standard for the industry.

The company is just now coming out of stealth, but it's been in the works for a long time. I actually met with the team back in 2017, when the project was very hush-hush and operating under a different name at IV's startup incubator. If the terms "metamaterials" and "Intellectual Ventures" tickle something in your brain, it's because IV has spawned several startups that use intellectual property developed there, building on the work of materials scientist David Smith.

Metamaterials are essentially specially engineered surfaces with microscopic structures embedded in them — in this case, tunable antennas — that work together as a single device.

Echodyne is another company that used metamaterials to great effect, shrinking radar arrays to pocket size by engineering a radar transceiver that’s essentially 2D and can have its beam steered electronically rather than mechanically.

The principle works for pretty much any wavelength of electromagnetic radiation — i.e. you could use X-rays instead of radio waves — but until now no one has made it work at optical wavelengths (Lumotive's lasers are near-infrared, just beyond the visible). That's the company's advance, and the reason it works so well.

Flash, 2D and 1D lidar

Lidar basically works by bouncing light off the environment and measuring how and when it returns; this can be accomplished in several ways.
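However the scanning is done, the underlying measurement is simple time-of-flight arithmetic: distance is the round-trip time multiplied by the speed of light, halved. A minimal sketch (my own illustration, not any vendor's code):

```python
# Time-of-flight in a nutshell: how far away was the surface that
# bounced this pulse back? Real systems also handle noise, pulse
# shape and ambiguous returns; this is just the core arithmetic.

C = 299_792_458.0  # speed of light, m/s

def distance_m(round_trip_s: float) -> float:
    """The pulse travels out and back, so halve the round trip."""
    return C * round_trip_s / 2.0

print(distance_m(667e-9))  # a return after ~667 ns means roughly 100 m
```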

Flash lidar sends out a single pulse that illuminates the whole scene with near-infrared light (905 nanometers, most likely) at once. This provides a quick measurement of the whole scene, but limited range, since the emitted power is spread across the entire scene rather than focused into a beam.

2D or raster scan lidar takes an NIR laser and plays it over the scene incredibly quickly, left to right, down a bit, then does it again, again and again… scores or hundreds of times. Focusing the power into a beam gives these systems excellent range, but similar to a CRT TV with an electron beam tracing out the image, it takes rather a long time to complete the whole scene. Turnaround time is naturally of major importance in driving situations.
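To put rough numbers on that slowness: a raster scanner has to fire one pulse per point, so frame time is simply point count divided by pulse rate. Both figures below are illustrative assumptions, not any particular product's specs:

```python
# Why raster scanning struggles with turnaround time: one laser shot
# per point, so frame rate is capped by pulse rate.
# Both numbers below are assumptions for illustration.

points_per_frame = 1000 * 256   # a scene at the resolution discussed later
pulses_per_second = 1_000_000   # an assumed 1 MHz pulse rate

frame_time_s = points_per_frame / pulses_per_second
print(f"{frame_time_s * 1000:.0f} ms per frame, {1 / frame_time_s:.1f} Hz")
# -> 256 ms per frame, 3.9 Hz: a lot can happen on a road in a quarter second.
```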

1D or line scan lidar strikes a balance between the two, using a vertical line of laser light that only has to go from one side to the other to complete the scene. This sacrifices some range and resolution but significantly improves responsiveness.

Lumotive offered a diagram comparing the three approaches, which helps visualize the trade-offs, although obviously labels like "suitability," "too short" and "too slow" are somewhat subjective.

The main problem with the latter two is that they rely on a mechanical platform to actually move the laser emitter or mirror from place to place. It works fine for the most part, but there are inherent limitations. For instance, it’s difficult to stop, slow or reverse a beam that’s being moved by a high-speed mechanism. If your 2D lidar system sweeps over something that could be worth further inspection, it has to go through the rest of its motions before coming back to it… over and over.

This is the primary advantage offered by a metamaterial system over existing ones: electronic beam steering. In Echodyne’s case the radar could quickly sweep over its whole range like normal, and upon detecting an object could immediately switch over and focus 90 percent of its cycles tracking it in higher spatial and temporal resolution. The same thing is now possible with lidar.
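As a toy sketch of what that reallocation might look like in software (the class and numbers here are mine, not Echodyne's or Lumotive's control logic):

```python
# Toy model of foveated beam scheduling: once something interesting is
# detected, most beam cycles chase it while a few keep sweeping the
# whole scene. Purely illustrative; not anyone's actual control logic.

from dataclasses import dataclass

@dataclass
class Region:
    name: str
    share: float  # fraction of beam cycles this region gets

def allocate(cycles_per_second: int, regions: list[Region]) -> dict[str, int]:
    assert abs(sum(r.share for r in regions) - 1.0) < 1e-9
    return {r.name: round(cycles_per_second * r.share) for r in regions}

budget = 5_120_000  # points/s implied by the specs quoted below

print(allocate(budget, [Region("full scene", 1.0)]))   # normal sweep
print(allocate(budget, [Region("target", 0.9),         # object found:
                        Region("full scene", 0.1)]))   # 90% on tracking
```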

Imagine a deer jumping out around a blind curve. Every millisecond counts, because the earlier a self-driving system knows the situation, the more options it has to accommodate it. All other things being equal, an electronically steered lidar system would detect the deer at the same time as a mechanically steered one, or perhaps a bit sooner; upon noticing the movement, it could not just allot more time to evaluating it on the next "pass," but a microsecond later could back the beam up and target just the deer with the majority of its resolution.

[Image: just for illustration; the beam isn't some big red thing that comes out.]

Targeted illumination would also improve the estimation of direction and speed, further improving the driving system's knowledge and options; meanwhile, the beam can still dedicate a portion of its cycles to watching the road, with no complicated mechanical hijinks required. And the system has an enormous aperture, allowing high sensitivity.

Exact specs depend on many factors, but if the beam is just sweeping normally across its 120×25 degree field of view, the standard unit will have about a 20Hz frame rate at 1000×256 resolution. That's comparable to competitors, but keep in mind that the advantage is in the ability to change that field of view and frame rate on the fly. In the example of the deer, it may maintain a 20Hz refresh for the scene at large but concentrate more beam time on a 5×5 degree area, giving that patch a much faster refresh rate (some back-of-envelope numbers below).
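Here is what that foveation buys, using the quoted specs; the 5×5 degree patch follows the deer example, and the arithmetic is mine:

```python
# What foveation buys: refresh rate for a small patch, given the
# quoted specs (1000x256 points, 20 Hz, 120x25 degree field of view).

point_budget = 1000 * 256 * 20    # ~5.12M points/s in total

points_per_degree_h = 1000 / 120  # ~8.3
points_per_degree_v = 256 / 25    # ~10.2

patch_points = (5 * points_per_degree_h) * (5 * points_per_degree_v)
print(f"{point_budget / patch_points:.0f} Hz")  # ~2400 Hz for the patch
# In practice the patch shares the budget with the 20 Hz scene sweep,
# but even a fraction of 2400 Hz is a huge jump over 20 Hz.
```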

Meta doesn’t mean mega-expensive

Naturally one would assume that such a system would be considerably more expensive than existing ones. Pricing is still a ways out — Lumotive just wanted to show that its tech exists for now — but this is far from exotic tech.

[Image: CG render of a lidar metamaterial chip.]

The team told me in an interview that their engineering process was tricky specifically because they designed it for fabrication using existing methods. It's silicon-based, meaning it can use cheap and ubiquitous 905nm lasers rather than the rarer 1550nm, and its fabrication isn't much more complex than making an ordinary display panel.

CTO and co-founder Gleb Akselrod explained: “Essentially it’s a reflective semiconductor chip, and on the surface we fabricate these tiny antennas to manipulate the light. It’s made using a standard semiconductor process, then we add liquid crystal, then the coating. It’s a lot like an LCD.”
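The beam steering itself rests on the same textbook principle as any phased array: impose a linear phase gradient across the surface and the reflected wavefront tilts. A sketch of that principle (generic physics, not Lumotive's proprietary design):

```python
# Generic phase-gradient beam steering: give element i a phase of
# (2*pi/lambda) * i * spacing * sin(theta) and the outgoing wavefront
# tilts by theta. Textbook physics, not Lumotive's actual design.

import math

WAVELENGTH = 905e-9  # the cheap NIR lasers mentioned above, in meters

def element_phases(n: int, spacing_m: float, steer_deg: float) -> list[float]:
    k = 2 * math.pi / WAVELENGTH
    theta = math.radians(steer_deg)
    return [(k * i * spacing_m * math.sin(theta)) % (2 * math.pi)
            for i in range(n)]

# Steer 10 degrees with eight antennas at half-wavelength pitch.
print([round(p, 2) for p in element_phases(8, WAVELENGTH / 2, 10.0)])
```

In Lumotive's chip, per Akselrod's description, the liquid crystal presumably provides that per-antenna phase tunability, which is what makes the steering entirely electronic.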

An additional bonus of the metamaterial basis is that it works the same regardless of the size or shape of the chip. While an inch-wide rectangular chip is best for automotive purposes, Akselrod said, they could just as easily make one a quarter the size for robots that don’t need the wider field of view, or a larger or custom-shape one for a specialty vehicle or aircraft.

The details, as I said, are still being worked out. Lumotive has been working on this for years and decided it was time to just get the basic information out there. "We spend an inordinate amount of time explaining the technology to investors," noted CEO and co-founder Bill Colleran. He, it should be noted, is a veteran innovator in this field, having most recently headed Impinj and before that worked at Broadcom, but he is perhaps best known as CEO of Innovent when it created the first CMOS Bluetooth chip.

Right now the company is seeking investment after running on a 2017 seed round funded by Bill Gates and IV, which (as with other metamaterial-based startups it has spun out) is granting Lumotive an exclusive license to the tech. There are partnerships and other things in the offing, but the company wasn’t ready to talk about them; the product is currently in prototype but very showable form for the inevitable meetings with automotive and tech firms.


EarthNow promises real-time views of the whole planet from a new satellite constellation


A new space imaging startup called EarthNow aims to provide not just pictures of the planet on demand, but real-time video anywhere a client desires. Its ambition is matched only by its pedigree: Bill Gates, Intellectual Ventures, Airbus, SoftBank and OneWeb founder Greg Wyler are all backing the play.

Its promise is a constellation of satellites that will provide video of anywhere on Earth with latency of about a second. You won’t have to wait for a satellite to come into range, or worry about leaving range; at least one will be able to view any area at any given time, so they can pass off the monitoring task to the next satellite over if necessary.

Initially aimed at "high value enterprise and government customers," EarthNow lists applications like monitoring storms, spotting illegal fishing vessels (or even pirates), tracking forest fires and whales, and watching conflicts in real time. Space imaging is turning into quite a crowded field — if all these constellations actually launch, anyway.

The company is in the earliest stages right now, having just been spun out from years of work by founder and CEO Russell Hannigan at Intellectual Ventures under the Invention Science Fund. Early enough, in fact, that there’s no real timeline for prototyping or testing. But it’s not just pie in the sky.

Wyler’s OneWeb connection means EarthNow will be built on a massively upgraded version of that company’s satellite platform. Details are few and far between, but the press release promises that “Each satellite is equipped with an unprecedented amount of onboard processing power, including more CPU cores than all other commercial satellites combined.”

Presumably a large portion of that will be video processing and compression hardware, since they’ll want to minimize bandwidth and latency but don’t want to skimp on quality. Efficiency is important, too; satellites have extremely limited power, so running multiple off-the-shelf GPUs with standard compression methods probably isn’t a good idea. Real-time, continuous video from orbit (as opposed to near-real-time stills or clips) is as much a software problem as it is hardware.
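For a sense of scale on the bandwidth problem (every number here is an assumption of mine; EarthNow has published no video specs):

```python
# Rough downlink arithmetic for continuous video. Resolution, frame
# rate, bit depth and compression ratio are all assumed here.

width, height, fps = 1920, 1080, 30
bits_per_pixel = 12              # e.g. a 4:2:0 chroma-subsampled source

raw_bps = width * height * fps * bits_per_pixel
compressed_bps = raw_bps / 200   # aggressive, modern-codec-class ratio

print(f"raw: {raw_bps / 1e6:.0f} Mbit/s, "
      f"compressed: {compressed_bps / 1e6:.1f} Mbit/s")
# -> raw: 746 Mbit/s, compressed: 3.7 Mbit/s -- hence onboard compression.
```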

Machine learning also figures in, of course: the company plans to do onboard analysis of the imagery, though to what extent isn’t clear. It really makes more sense to me to do this on the ground, but perhaps a first pass by the satellite’s hardware will help move things along.

Airbus will do its part by actually producing the satellites, in Toulouse and Florida. The release doesn’t say how many will be built, but full (and presumably redundant) Earth coverage means dozens at the least. But if they’re mass-manufactured standard goods, that should keep the price down, relatively speaking anyway.
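A quick sanity check on "dozens at the least" (the altitude and viewing-angle figures below are my assumptions; the release specifies neither):

```python
# How many satellites for whole-Earth, always-on coverage? Divide the
# sphere by one satellite's footprint. Altitude and elevation mask are
# assumed; the press release gives neither.

import math

R = 6371.0  # Earth radius, km
h = 500.0   # assumed orbital altitude, km

def footprint_fraction(min_elevation_deg: float) -> float:
    """Fraction of Earth's surface one satellite sees above the mask."""
    eps = math.radians(min_elevation_deg)
    lam = math.acos(R * math.cos(eps) / (R + h)) - eps  # central angle
    return (1 - math.cos(lam)) / 2  # spherical-cap fraction

for mask in (0, 30):
    print(f"{mask:>2} deg mask: >= {math.ceil(1 / footprint_fraction(mask))} satellites")
# -> 28 looking all the way to the horizon, ~300 with a usable viewing
# angle, ignoring overlap. "Dozens at the least" checks out.
```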

No word on the actual amount raised by the company in January, but with the stature of the investors and the high costs involved in the industry, I can’t imagine it’s less than a few tens of millions.

Hannigan himself calls EarthNow “ambitious and unprecedented,” which could be taken as an admission of great risk, but it’s clear that the company has powerful partners and plenty of expertise; Intellectual Ventures doesn’t tend to spin something off unless it’s got something special going. Expect more specifics as the company grows, but I doubt we’ll see anything more than renders for a year or so.


Echodyne’s pocket-sized radar may be the next must-have tech for drones (and drone hunters)


Just because we live in an age of many sensors doesn't mean we always have the right one for the job. One in particular we've been lacking is a radar system that can detect obstacles and aircraft hundreds of meters out, yet fit comfortably on a small drone. The laws of physics, it seemed, prevented it — but Echodyne made it work anyway.
