
As autonomy stalls, lidar companies learn to adapt


Lidar sensors are likely to be essential to autonomous vehicles, but if there are none of the latter, how can you make money with the former? Among the industry executives I spoke with, the outlook is optimistic as they unhitch their wagons from the sputtering star of self-driving cars. As it turns out, a few years of manic investment does wonders for those who have the wisdom to apply it properly.

The show floor at CES 2020 was packed with lidar companies exhibiting in larger spaces, seemingly in greater numbers than before. That seemed at odds with reports that 2019 had been a sort of correction year for the industry, so I met with executives and knowledgeable types at several companies to hear their take on the sector’s transformation over the last couple of years.

As context, 2017 was perhaps peak lidar, nearing the end of several years of nearly feverish investment in a variety of companies. It was less a gold rush than a speculative land rush: autonomous vehicles were purportedly right around the corner and each would need a lidar unit… or five. The race to invest in a winner was on, leading to an explosion of companies claiming ascendancy over their rivals.

Unfortunately, as many will recall, autonomous cars seem to be no closer today than they were then, as the true difficulty of the task dawned on those undertaking it.

Powered by WPeMatico

Baraja’s unique and ingenious take on lidar shines in a crowded industry


It seems like every company making lidar has a new and clever approach, but Baraja takes the cake. Its method is not only elegant and powerful, but fundamentally avoids many issues that nag other lidar technologies. But it’ll need more than smart tech to make headway in this complex and evolving industry.

To understand how lidar works in general, consult my handy introduction to the topic. Essentially a laser emitted by a device skims across or otherwise very quickly illuminates the scene, and the time it takes for that laser’s photons to return allows it to quite precisely determine the distance of every spot it points at.
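That timing math is the heart of any lidar. A minimal sketch of the round-trip calculation (the factor of 0.5 accounts for the out-and-back path):

```python
# Time-of-flight ranging: a pulse travels out and back, so the one-way
# distance is half the round trip multiplied by the speed of light.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """One-way distance in meters for a measured round-trip time."""
    return 0.5 * C * t_seconds

# A return arriving ~667 nanoseconds after emission puts the target
# about 100 meters away.
target_m = distance_from_round_trip(667e-9)
```

Nanosecond-scale timing is why the distances come out so precise: a single nanosecond of round trip corresponds to roughly 15 centimeters.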

But to picture how Baraja’s lidar works, you need to picture the cover of Pink Floyd’s “Dark Side of the Moon.”


Imagine a flashlight shooting through a prism like that, illuminating the scene in front of it — now imagine you could focus that flashlight by selecting which color came out of the prism, sending more light to the top part of the scene (red and orange) or middle (yellow and green). That’s what Baraja’s lidar does, except naturally it’s a bit more complicated than that.

The company has been developing its tech for years with the backing of Sequoia and Australian VC outfit Blackbird, which led a $32 million round late in 2018 — Baraja only revealed its tech the next year and was exhibiting it at CES, where I met with co-founder and CEO Federico Collarte.

“We’ve stayed in stealth for a long, long time,” he told me. “The people who needed to know already knew about us.”

The idea for the tech came out of the telecommunications industry, where Collarte and co-founder Cibby Pulikkaseril thought of a novel use for a fiber optic laser that could reconfigure itself extremely quickly.

“We thought if we could set the light free, send it through prism-like optics, then we could steer a laser beam without moving parts. The idea seemed too simple — we thought, ‘if it worked, then everybody would be doing it this way,’” he told me, but they quit their jobs and worked on it for a few months with a friends and family round, anyway. “It turns out it does work, and the invention is very novel and hence we’ve been successful in patenting it.”

Rather than send a coherent laser at a single wavelength (1550 nanometers, well into the infrared, is the lidar standard), Baraja uses a set of fixed lenses to refract that beam into a spectrum spread vertically over its field of view. Yet it isn’t one single beam being split but a series of coded pulses, each at a slightly different wavelength that travels ever so slightly differently through the lenses. It returns the same way, the lenses bending it the opposite direction to return to its origin for detection.

It’s a bit difficult to grasp this concept, but once one does, it’s hard to see it as anything but astonishingly clever. Not just because of the fascinating optics (something I’m partial to, if it isn’t obvious), but because it obviates a number of serious problems other lidars are facing or about to face.

First, there are almost no moving parts in the entire Baraja system. Spinning lidars like the popular early devices from Velodyne are largely being replaced by ones using metamaterials, MEMS, and other methods that don’t have bearings or hinges that can wear out.

Baraja’s “head” unit, connected by fiber optic to the brain.

In Baraja’s system, there are two units, a “dumb” head and an “engine.” The head has no moving parts and no electronics; it’s all glass, just a set of lenses. The engine, which can be located nearby or a foot or two away, produces the laser and sends it to the head via a fiber-optic cable (and some kind of proprietary mechanism that rotates slowly enough that it could theoretically work for years continuously). This means it’s not only very robust physically, but its volume can be spread out wherever is convenient in the car’s body. The head itself also can be resized more or less arbitrarily without significantly altering the optical design, Collarte said.

Second, the method of diffracting the beam gives the system considerable leeway in how it covers the scene. Different wavelengths are sent out at different vertical angles; a shorter wavelength goes out toward the top of the scene and a slightly longer one goes a little lower. But the band of 1550 +/- 20 nanometers allows for millions of fractional wavelengths that the system can choose between, giving it the ability to set its own vertical resolution.

It could for instance (these numbers are imaginary) send out a beam every quarter of a nanometer in wavelength, corresponding to a beam going out every quarter of a degree vertically, and by going from the bottom to the top of its frequency range cover the top to the bottom of the scene with equally spaced beams at reasonable intervals.

But why waste a bunch of beams on the sky, say, when you know most of the action is taking place in the middle part of the scene, where the street and roads are? In that case you can send out a few high frequency beams to check up there, then skip down to the middle frequencies, where you can then send out beams with intervals of a thousandth of a nanometer, emerging correspondingly close together to create a denser picture of that central region.
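The wavelength-to-angle scheme described above can be sketched in a few lines. Everything here is illustrative: the linear mapping, the field of view, and the step sizes are assumptions in the spirit of the article's "imaginary" numbers, not Baraja's actual design.

```python
# Each wavelength in the 1550 +/- 20 nm band maps to a fixed vertical angle,
# so choosing which wavelengths to emit sets the vertical sampling density.

BAND_LO, BAND_HI = 1530.0, 1570.0    # nm
TOP_DEG, BOTTOM_DEG = 10.0, -10.0    # shorter wavelength -> higher in the scene

def angle_for_wavelength(wl_nm: float) -> float:
    """Assumed linear map from wavelength to vertical beam angle."""
    frac = (wl_nm - BAND_LO) / (BAND_HI - BAND_LO)
    return TOP_DEG + frac * (BOTTOM_DEG - TOP_DEG)

def scan_plan(coarse_step_nm: float, fine_step_nm: float) -> list:
    """Sparse beams near the band edges (sky/ground), dense in the middle."""
    wavelengths = []
    wl = BAND_LO
    while wl <= BAND_HI:
        wavelengths.append(wl)
        middle = abs(wl - (BAND_LO + BAND_HI) / 2) < 5.0  # central 10 nm
        wl += fine_step_nm if middle else coarse_step_nm
    return wavelengths

# Quarter-nm steps at the edges, thousandth-of-a-nm steps in the middle,
# echoing the article's imaginary numbers.
plan = scan_plan(coarse_step_nm=0.25, fine_step_nm=0.001)
```

The payoff is visible in the plan: the central band ends up with orders of magnitude more beams than the sky and ground, with no mechanical change at all, just a different list of wavelengths.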

If this is making your brain hurt a little, don’t worry. Just think of Dark Side of the Moon and imagine if you could skip red, orange and purple, and send out more beams in green and blue — and because you’re only using those colors, you can send out more shades of green-blue and deep blue than before.

Third, the method of creating the spectrum beam guards against interference from other lidar systems. It is an emerging concern that similar lidar systems could inadvertently send or reflect beams into one another, producing noise and hindering normal operation. Most companies are attempting to mitigate this by some means or another, but Baraja’s method avoids the possibility altogether.

“The interference problem — they’re living with it. We solved it,” said Collarte.

The spectrum system means that for a beam to interfere with the sensor it would have to be both a perfect frequency match and come in at the precise angle at which that frequency emerges from and returns to the lens. That’s already vanishingly unlikely, but to make it astronomically so, each beam from the Baraja device is not a single pulse but a coded set of pulses that can be individually identified. The company’s core technology and secret sauce is the ability to modulate and pulse the laser millions of times per second, and it puts this to good use here.
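Baraja's actual pulse coding is proprietary, but the general principle (match returns against a known code, reject anything that doesn't correlate) can be shown with a toy binary code. The 16-chip code below is made up for illustration.

```python
import random

# Each beam is a coded set of pulses the sensor can individually identify.
# A classic way to exploit that: count how many chip positions of a received
# pulse train agree with the transmitted code, and reject poor matches.

CODE = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]

def matches(received, code=CODE):
    """Number of chip positions where the received train agrees with the code."""
    return sum(1 for r, c in zip(received, code) if r == c)

own_return = list(CODE)                        # our own reflection: perfect match
stray = [random.randint(0, 1) for _ in CODE]   # another lidar's uncoded pulse

# Our return matches on every chip; a random stray agrees on only about half
# of them, so a threshold near the full code length filters interference.
```

Combined with the wavelength-and-angle requirement described above, a stray beam would have to defeat both filters at once to register as a false return.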

Collarte acknowledged that competition is fierce in the lidar space, but not necessarily competition for customers. “They have not solved the autonomy problem,” he points out, “so the volumes are too small. Many are running out of money. So if you don’t differentiate, you die.” And some have.

Instead companies are competing for partners and investors, and must show that their solution is not merely a good idea technically, but that it is a sound investment and reasonable to deploy at volume. Collarte praised his investors, Sequoia and Blackbird, but also said that the company will be announcing significant partnerships soon, both in automotive and beyond.


Echodyne steers its high-tech radar beam on autonomous cars with EchoDrive


Echodyne set the radar industry on its ear when it debuted its pocket-sized yet hyper-capable radar unit for drones and aircraft. But these days all the action is in autonomous vehicles — so they reinvented their technology to make a unique sensor that doesn’t just see things but can communicate intelligently with the AI behind the wheel.

EchoDrive, the company’s new product, is aimed squarely at AVs, looking to complement lidar and cameras with automotive radar that’s as smart as you need it to be.

The chief innovation at Echodyne is the use of metamaterials, or highly engineered surfaces, to create a radar unit that can direct its beam quickly and efficiently anywhere in its field of view. That means that it can scan the whole horizon quickly, or repeatedly play the beam over a single object to collect more detail, or anything in between, or all three at once for that matter, with no moving parts and little power.

But the device Echodyne created for release in 2017 was intended for aerospace purposes, where radar is more widely used, and its capabilities were suited for that field: a range of kilometers but a slow refresh rate. That’s great for detecting and reacting to distant aircraft, but not at all what’s needed for autonomous vehicles, which are more concerned with painting a detailed picture of the scene within a hundred meters or so.

“They said they wanted high-resolution, automotive bands [i.e. radiation wavelengths], high refresh rates, wide field of view, and still have that beam-steering capability — can you build a radar like that?” recalled Echodyne co-founder and CEO Eben Frankenberg. “And while it’s taken a little longer than I thought it would, the answer is yes, we can!”

The EchoDrive system meets all the requirements set out by the company’s automotive partners and testers, with up to 60 Hz refresh rates, higher resolution than any other automotive radar and all the other goodies.

An example of some raw data — note that Doppler information lets the system tell which objects are moving which direction.

The company is focused specifically on level 4-5 autonomy, meaning their radar isn’t intended for basic features like intelligent cruise control or collision detection. But radar units on cars today are intended for that, and efforts to juice them up into more serious sensors are dubious, Frankenberg said.

“Most ADAS [advanced driver assist system] radars have relatively low resolution in a raw sense, and do a whole lot of processing of the data to make it clearer and make it more accurate as far as the position of an object,” he explained. “The level 4-5 folks say, we don’t want all that processing because we don’t know what they’re doing. They want to know you’re not doing something in the processing that’s throwing away real information.”

More raw data, and less processing — but Echodyne’s tech offers something more. Because the device can change the target of its beam on the fly, it can do so in concert with the needs of the vehicle’s AI.

Say an autonomous vehicle’s brain has integrated the information from its suite of sensors and can’t be sure whether an object it sees a hundred meters out is a moving or stationary bicycle. It can’t tell its regular camera to get a better image, or its lidar to send more lasers. But it can tell Echodyne’s radar to focus its beam on that object for a bit longer or more frequently.
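Echodyne has not published an interface for this, so here is a hypothetical sketch of the idea of a planner steering sensor attention. Every name and parameter (`BeamRequest`, `CognitiveRadar`, the dwell times) is invented for illustration.

```python
from dataclasses import dataclass, field

# Sketch of "cognitive radar": the vehicle's planner asks the radar to spend
# extra dwell time on an object it is unsure about, instead of the radar
# sweeping uniformly on its own.

@dataclass
class BeamRequest:
    azimuth_deg: float
    elevation_deg: float
    dwell_ms: float
    priority: int = 0

@dataclass
class CognitiveRadar:
    requests: list = field(default_factory=list)

    def request_dwell(self, req: BeamRequest) -> None:
        """Planner-side call: ask for a longer or repeated look at a region."""
        self.requests.append(req)

    def next_beam(self) -> BeamRequest:
        """Serve the highest-priority request, else fall back to a scan beam."""
        if self.requests:
            self.requests.sort(key=lambda r: r.priority, reverse=True)
            return self.requests.pop(0)
        return BeamRequest(azimuth_deg=0.0, elevation_deg=0.0, dwell_ms=1.0)

radar = CognitiveRadar()
# Planner can't tell whether an object ~100 m out is moving: ask for more dwell.
radar.request_dwell(BeamRequest(12.5, 0.0, dwell_ms=5.0, priority=10))
focused = radar.next_beam()
```

The key design point is that prioritization lives in the brain, not the sensor, which is exactly the division of labor Frankenberg argues for below.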

The two-way conversation between sensor and brain, which Echodyne calls cognitive radar or knowledge-aided measurement, isn’t really an option yet — but it will have to be if AVs are going to be as perceptive as we’d like them to be.

Some companies, Frankenberg pointed out, are putting on the sensors themselves the responsibility for deciding which objects or regions need more attention — a camera may very well be able to decide where to look next in some circumstances. But on the scale of a fraction of a second, and involving the other resources available to an AV — only the brain can do that.

EchoDrive is currently being tested by Echodyne’s partner companies, which it would not name but which Frankenberg indicated are running level 4+ AVs on public roads. Given the growing number of companies that fit those once-narrow criteria, it would be irresponsible to speculate on their identities, but it’s hard to imagine an automaker not getting excited by the advantages Echodyne claims.


Tesla vehicles are getting Stardew Valley


Tesla owners have seen a bunch of games added to their car’s in-dash display over the last few months, from a handful of Atari classics to the painfully hard Cuphead. Next up? The oh-so-sweet farm-living RPG, Stardew Valley.

In addition to just being kind of cool, the in-dash games (playable only when parked, because duh) are meant to help Tesla owners kill time while at superchargers. Stardew Valley is… sort of the perfect game for this. Beyond being so charming that it hurts, it’s an absolutely incredible way to burn 30 minutes in the blink of an eye. Oh, you’ve got 10 minutes to waste? Why not tend to your crops? Or explore the mines? Or catch some fish?

Word of the addition comes from a Musk tweet:

Tesla holiday software update has FSD sneak preview, Stardew Valley, Lost Backgammon & a few other things

— Elon Musk (@elonmusk) December 20, 2019

Musk also mentions a full self-driving (FSD) “sneak preview”… which would normally be a headline on its own, except that no one really knows exactly what it’ll entail, or just how much the “preview” will actually include. But it’s coming!

The tweet also mentions “Lost Backgammon,” which I can only assume is a game of backgammon where JJ Abrams continuously throws in all sorts of mysterious new stuff only to end the whole thing without really explaining 90% of it.

Musk doesn’t get specific about when this update will land. Seeing as he calls it the “holiday” update, though, it’ll presumably be sometime soon.


BMW says ‘ja’ to Android Auto


BMW today announced that it is finally bringing Android Auto to its vehicles, starting in July 2020. With that, it will join Apple’s CarPlay in the company’s vehicles.

The first live demo of Android Auto in a BMW will happen at CES 2020 next month. After that, it will become available as an update to drivers in 20 countries with cars that feature BMW OS 7.0. BMW will only support Android Auto over a wireless connection, though, which somewhat limits its compatibility.

Only two years ago, the company said that it wasn’t interested in supporting Android Auto. At the time, Dieter May, who was then the senior VP for Digital Services and Business Model, explicitly told me that the company wanted to focus on its first-party apps in order to retain full control over the in-car interface and that he wasn’t interested in seeing Android Auto in BMWs. May has since left the company, though it’s also worth noting that Android Auto itself has become significantly more polished over the course of the last two years.

“The Google Assistant on Android Auto makes it easy to get directions, keep in touch and stay productive. Many of our customers have pointed out the importance to them of having Android Auto inside a BMW for using a number of familiar Android smartphone features safely without being distracted from the road, in addition to BMW’s own functions and services,” said Peter Henrich, senior vice president of Product Management at BMW, in today’s announcement.

With this, BMW will also finally offer support for the Google Assistant after early bets on Alexa, Cortana and the BMW Assistant (which itself is built on top of Microsoft’s AI stack). The company has long said it wants to offer support for all popular digital assistants. For the Google Assistant, the only way to make that work, at least for the time being, is Android Auto.

In BMWs, Android Auto will see integrations into the car’s digital cockpit, in addition to BMW’s Info Display and the heads-up display (for directions). That’s a pretty deep integration, which goes beyond what most car manufacturers feature today.

“We are excited to work with BMW to bring wireless Android Auto to their customers worldwide next year,” said Patrick Brady, vice president of engineering at Google. “The seamless connection from Android smartphones to BMW vehicles allows customers to hit the road faster while maintaining access to all of their favorite apps and services in a safer experience.”


Ghost wants to retrofit your car so it can drive itself on highways in 2020


A new autonomous vehicle company is on the streets — and unbeknownst to most, has been since 2017. Unlike the majority in this burgeoning industry, this new entrant isn’t trying to launch a robotaxi service or sell a self-driving system to suppliers and automakers. It’s not aiming for autonomous delivery, either.

Ghost Locomotion, which emerged Thursday from stealth with $63.7 million in investment from Keith Rabois at Founders Fund, Vinod Khosla at Khosla Ventures and Mike Speiser at Sutter Hill Ventures, is targeting your vehicle.

Ghost is developing a kit that will allow privately owned passenger vehicles to drive autonomously on highways. And the company says it will deliver in 2020. A price has not been set, but the company says it will be less than what Tesla charges for its Autopilot package that includes “full self-driving” or FSD. FSD currently costs $7,000.

This kit isn’t going to give a vehicle a superior advanced driving assistance system. The kit will let human drivers hand control of their vehicle over to a computer, allowing them to do other activities such as look at their phone or even doze off.

The idea might sound similar to what Comma.ai is working on, Tesla hopes to achieve or even the early business model of Cruise. Ghost CEO and co-founder John Hayes says what they’re doing is different.

A different approach

The biggest players in the industry — companies like Waymo, Cruise, Zoox and Argo AI — are trying to solve a really hard problem, which is driving in urban areas, Hayes told TechCrunch in a recent interview.

“It didn’t seem like anyone was actually trying to solve driving on the highways,” said Hayes, who previously founded Pure Storage in 2009. “At the time, we were told that this is so easy that surely the automakers will solve this any day now. And that really hasn’t happened.”

Hayes noted that automakers have continued to make progress in advanced driver assistance systems. The more advanced versions of these systems provide what the SAE describes as Level 2 automation, which means two primary control functions are automated. Tesla’s Autopilot system is a good example of this; when engaged, it automatically steers and has traffic-aware cruise control, which maintains the car’s speed in relation to surrounding traffic. But like all Level 2 systems, the driver is still in the loop.

Ghost wants to take the human out of the loop when they’re driving on highways.

“We’re taking, in some ways, a classic startup attitude to this, which is ‘what is the simplest product that we can perfect, that will put self driving in the hands of ordinary consumers?’ ” Hayes said. “And so we take people’s existing cars and we make them self-driving cars.”

The kit

Ghost is tackling that challenge with software and hardware.

The kit involves hardware like sensors and a computer that is installed in the trunk and connected to the controller area network (CAN) of the vehicle. The CAN bus is essentially the nervous system of the car and allows various parts to communicate with each other.

Vehicles must have a CAN bus and electronic steering to be able to use the kit.
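To make "connected to the CAN bus" concrete, here is a sketch of decoding one signal from a raw frame. The arbitration ID, byte layout and scaling are invented for illustration; real layouts vary by automaker and are usually proprietary.

```python
import struct
from typing import Optional

# A kit like Ghost's reads raw CAN frames and decodes vehicle signals such
# as steering angle; with electronic steering, commands flow the other way
# over the same bus.

STEERING_FRAME_ID = 0x25  # hypothetical arbitration ID

def decode_steering_angle(frame_id: int, data: bytes) -> Optional[float]:
    """Return steering angle in degrees, or None for unrelated frames."""
    if frame_id != STEERING_FRAME_ID or len(data) < 2:
        return None
    raw = struct.unpack_from(">h", data)[0]   # signed 16-bit, big-endian
    return raw * 0.1                          # assumed 0.1 degree-per-bit scaling

# A frame carrying the raw value 123 decodes to 12.3 degrees.
angle = decode_steering_angle(STEERING_FRAME_ID, struct.pack(">h", 123))
```

This is why the CAN bus and electronic steering are hard requirements: without a digital path to read state and actuate the wheel, there is nothing for the computer in the trunk to drive.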

The camera sensors are distributed throughout the vehicle. Cameras are integrated into what looks like a license plate holder at the back of the vehicle, as well as another set that are embedded behind the rearview mirror.

A third device with cameras is attached to the frame around the window of the door (see below).

Initially, this kit will be an aftermarket product; the company is starting with the 20 most popular car brands and will expand from there.

Ghost intends to set up retail spaces where a car owner can see the product and have it installed. But eventually, Hayes said, he believes the kit will become part of the vehicle itself, much as GPS or satellite radio did.

While hardware is the most visible piece of Ghost, the company’s 75 employees have dedicated much of their time to the driving algorithm. It’s here, Hayes says, that Ghost stands apart.

How Ghost is building a driver

Unlike nearly every other AV company, Ghost is not testing its self-driving system on public roads. There are 63 companies in California that have received permits from the Department of Motor Vehicles to test autonomous vehicle technology (always with a human safety driver behind the wheel) on public roads.

Ghost’s entire approach is based on an axiom that the human driver is fundamentally correct. It begins by collecting mass amounts of video data from kits that are installed on the cars of high-mileage drivers. Ghost then uses models to figure out what’s going on in the scene and combines that with other data, including how the person is driving by measuring the actions they take.

It doesn’t take long or much data to model ordinary driving: actions like staying in a lane, braking and changing lanes on a highway. But that doesn’t “solve” self-driving on highways, because the hard part is building a driver that can handle odd occurrences, such as swerving, and correct for bad behavior.

Ghost’s system uses machine learning to find more interesting scenarios in the reams of data it collects and builds training models based on them.
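As a toy version of that mining step, one could flag moments whose lateral motion deviates sharply from the trace's norm. The data, the choice of signal and the threshold are all invented for illustration; Ghost's actual models are not public.

```python
import statistics

# Scan a log for "interesting" samples: anything whose lateral acceleration
# is a statistical outlier relative to the rest of the trace.

def interesting_indices(lateral_accel, z_thresh=2.5):
    """Indices whose z-score exceeds the threshold."""
    mu = statistics.fmean(lateral_accel)
    sigma = statistics.stdev(lateral_accel)
    if sigma == 0:
        return []
    return [i for i, a in enumerate(lateral_accel)
            if abs(a - mu) / sigma > z_thresh]

# Mostly calm driving, with one hard swerve at index 5.
trace = [0.1, 0.0, -0.1, 0.05, 0.0, 4.0, 0.1, -0.05, 0.0, 0.1]
flagged = interesting_indices(trace)
```

The flagged moments are exactly the rare events worth turning into training examples; the routine lane-keeping around them carries little new information.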

The company’s kits are already installed on the cars of high-mileage drivers like Uber and Lyft drivers and commuters. Ghost has recruited dozens of drivers and plans to have its kits in hundreds of cars by the end of the year. By next year, Hayes says the kits will be in thousands of cars, all for the purpose of collecting data.

The background of the executive team, including co-founder and CTO Volkmar Uhlig, as well as the rest of their employees, provides some hints as to how they’re approaching the software and its integration with hardware.

Employees are data scientists and engineers, not roboticists. A dive into their resumes on LinkedIn shows that not one comes from another autonomous vehicle company, which is unusual in this era of talent poaching.

For instance, Uhlig, who started his career at IBM Watson Research, co-founded Adello and was the architect behind the company’s programmatic media trading platform. Before that, he built Teza Technologies, a high-frequency trading platform. While earning his PhD in computer science, he was part of a team that architected the L4 Pistachio microkernel, which is commercially deployed in more than 3 billion mobile Apple and Android devices.

If Ghost is able to validate its system — which Hayes says is baked into its entire approach — privately owned self-driving cars could be on the highways by next year. While the National Highway Traffic Safety Administration could potentially step in, Ghost’s approach, like Tesla’s, hits a sweet spot of non-regulation. It’s a space, Hayes notes, that the government has not yet chosen to regulate.


BMW’s magical gesture control finally makes sense as touchscreens take over cars


BMW has been equipping its cars with in-air gesture control for several years and I never paid attention to it. It seemed redundant. Why wave your hand in the air when there are dials, buttons and touchscreens to do the same thing? Until this week, that is, when I took delivery of a BMW 850i loaner equipped with the tech. This is about the future.

I didn’t know the 850i used gesture control, because, frankly, I had forgotten BMW had this technology; I stumbled upon it. Just make a motion in the air to control the volume or tell the navigation to send you home. Now, in 2019, with giant touchscreens set to take over cars, I find BMW’s gesture control smart and a great solution for a future devoid of buttons.

It’s limited in use right now. There are only a few commands: volume, nav, recent calls and turning on and off the center screen. It’s easy to see additional functions added in the future. It’s sorely missing the ability to step back a screen. I want that function the most.

Here’s how it works: To control the volume, take one finger and spin it in the air above the center stack. Anywhere. The range is impressive; a person can do this right next to the screen or two feet away, arm resting on the center armrest, just lifting a finger and twirling it. Bam, it controls the volume. Put two fingers up – not spinning, like a flat peace sign – and the screen turns on or off. Make a fist and open it twice to load the navigation or phone (the user picks the function).

After using the system for several days, I never had a false positive. The volume control took about 10 minutes to master, while the other gestures worked the first time.

In this car, these commands work in conjunction with physical buttons, dials and a touchscreen. The gestures are optional. A user can turn off the function in the settings, too.

I found the in-air control a lovely addition to the buttons, though. At night, in the rain, they’re great as they do not require the driver to remove their focus from the road. Just twirl your fingers to turn down the volume.

I’m not convinced massive touchscreens are better for the driver. The lack of actual, tactile response along with burying options in menus can lead drivers to take their eyes off the road. For the automaker, using touchscreens is less expensive than developing, manufacturing and installing physical buttons. Instead of having rows of plastic buttons and dials along with the mechanical bits behind them, automakers can use a touchscreen and program everything to be on-screen. Tesla did it first; Ram, Volvo and now Ford are following.

In-air gesture control could improve the user experience with touchscreens. When using BMW’s system, I didn’t have to take my eyes off the road to find the volume — something that I have to do occasionally, even in my car. Instead, I just made a circle in the air with my right hand. Likewise, BMW’s system lets the user call up the nav and navigate to a preset destination (like work or home) by just making another gesture.

As touchscreens take over cars, automakers will likely look to similar systems to supplement the lack of physical buttons. While gestures aren’t as good as buttons, they’re better than a silly touchscreen alone.


Sense Photonics brings its fancy new flash lidar to market


There’s no shortage of lidar solutions available for autonomous vehicles, drones and robots — theoretically, anyway. But getting a lidar unit from theory to mass production might be harder than coming up with the theory in the first place. Sense Photonics appears to have made it past that part of the journey, and is now offering its advanced flash lidar for pre-order.

Lidar comes in a variety of form factors, but the spinning type we’ve seen so much of is on its way out, and more compact, reliable planar types are on the way in; Luminar is making moves to get ahead, but Sense Photonics isn’t sitting still — and anyway, the two companies have different strengths.

While Luminar and some other companies aim to create a forward-facing lidar that can detect shapes hundreds of feet ahead in a relatively narrow field of view, Sense is going after the short-range, wide-angle side of things. And because they sync up with regular cameras, it’s easy as pie to map depth onto the RGB image:

Sense Photonics makes it easy to match traditional camera views with depth data
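As a rough illustration of that depth-onto-RGB mapping, here is a toy version that assumes the two sensors share a pixel grid. Real fusion needs calibrated camera intrinsics and extrinsics; this sketch skips all of that.

```python
# Drape depth onto an RGB image that is already aligned and synchronized
# with the lidar: push each pixel's red channel up in proportion to how
# near that pixel's return is.

def colorize_by_depth(rgb_rows, depth_rows, near=0.0, far=40.0):
    """Blend each RGB pixel toward red based on nearness (far -> unchanged)."""
    out = []
    for rgb_row, depth_row in zip(rgb_rows, depth_rows):
        row = []
        for (r, g, b), d in zip(rgb_row, depth_row):
            nearness = max(0.0, min(1.0, 1.0 - (d - near) / (far - near)))
            row.append((int(r + (255 - r) * nearness), g, b))
        out.append(row)
    return out

rgb = [[(10, 20, 30), (10, 20, 30)]]
depth = [[0.0, 40.0]]  # one pixel very near, one at the 40 m maximum
fused = colorize_by_depth(rgb, depth)
```

Because the sensors are synced, no temporal alignment is needed either; each depth sample corresponds to the same instant as the camera frame.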

These are lidars that you’d want mounted on the rear or sides of the vehicle, able to cover a wide slice of the surroundings and detect things like animals, kids and bikes quickly and accurately. But I went through all this when they came out of stealth.

The news today is that these units have gone from prototype to production design. The devices have been ruggedized so they can be attached outside of enclosures even in dusty or rainy environments. And performance has been improved, bumping the maximum range in some cases out to more than 40 meters, well over what was promised before.

The base price of $2,900 covers a unit with an 80×30 degree field of view, but others cover wider areas, up to 95×75 degrees — a large amount by lidar standards, and in higher fidelity than other flash lidars out there. You do give up some other properties in return for the wide view, though. The proprietary tech created by the company lets the lidar’s detector be located elsewhere than the laser emitter, too, which makes designing around the things easier (if not exactly easy).

Obviously, if people are meant to order these online from the company, they are not going to appear in next year’s autonomous vehicles. No, it’s more for bulk purchases by companies doing serious testing in industrial settings.

Whether the Sense Photonics kit or some other lucky lidar company’s ends up on the robo-fleets of tomorrow is up in the air, but it does help for your product to actually exist. You can find out more about the company’s lidar platform here.


Every angle of Volvo’s first electric vehicle, the XC40 Recharge


Volvo Cars on Wednesday introduced the XC40 Recharge, an all-electric vehicle that CTO Henrik Green described as “a car of firsts and a car of the future.”

The XC40 Recharge is hardly the first electric vehicle on the market. But for Volvo, the XC40 is a “car of firsts.” This is the company’s first all-electric vehicle. It’s also the first Volvo with an infotainment system powered by Google’s Android operating system, and the first able to receive over-the-air software updates.

Before we move on to the photos, here are some of the specs.

The XC40 Recharge is equipped with an all-wheel-drive powertrain and a 78 kilowatt-hour battery that can travel more than 400 kilometers (248 miles) on a single charge under WLTP. The WLTP, or Worldwide Harmonised Light Vehicle Test Procedure, is the European standard for measuring energy consumption and emissions, and tends to be more generous than U.S. EPA estimates. EPA estimates are not yet available, but the XC40 Recharge will likely land somewhere around 200 miles.
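The arithmetic behind those figures, using a rough 80% WLTP-to-EPA rule of thumb (an informal approximation, not an official conversion):

```python
KM_PER_MILE = 1.609344  # exact definition of the international mile

wltp_km = 400
wltp_miles = wltp_km / KM_PER_MILE   # just over 248 miles under WLTP
epa_estimate = wltp_miles * 0.8      # rule of thumb: EPA comes in around 80% of WLTP

print(int(wltp_miles), round(epa_estimate))  # 248 199
```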

That would put the range of the Volvo XC40 Recharge below the Tesla Model 3, Chevy Bolt EV, Kia Niro and Hyundai Kona.

The vehicle’s electric motor produces the equivalent of 408 horsepower and 442 pound-feet of torque, enough to take the vehicle from zero to 60 mph in 4.8 seconds. The battery charges to 80% of its capacity in 40 minutes on a fast-charging system.

The XC40 Recharge is expected to go on sale in the U.S. in late 2020.

Here’s what this car of “many firsts” looks like.


Volvo unveils its first electric car, the XC40 Recharge

Posted by | Android, automotive, electric vehicles, Environmental Protection Agency, Google, Google Assistant, Google Play Store, Google-Maps, green vehicles, linux, plug-in hybrid, Polestar, Recharge, sensus, transport, volvo, Volvo XC40 | No Comments

Volvo Cars on Wednesday introduced the XC40 Recharge, its first electric car under a new EV-focused brand that kicks off a company-wide shift toward electrification.

“It’s a car of firsts and it’s a car of the future,” CTO Henrik Green said.

The Volvo XC40 Recharge is the first electric vehicle in the automaker’s portfolio. It’s also the first Volvo with an infotainment system powered by Google’s Android operating system, and the first able to receive over-the-air software updates.

This is also the first vehicle under Volvo’s new Recharge brand. Recharge, announced this week, will be the overarching name for all chargeable Volvos with a fully electric or plug-in hybrid powertrain, according to the company.

The all-electric vehicle is based on Volvo’s popular XC40 small SUV. However, this is not a retrofit of a gas-powered vehicle.

The XC40 Recharge is equipped with an all-wheel-drive powertrain and a 78 kilowatt-hour battery that can travel more than 400 kilometers (248 miles) on a single charge under WLTP. The WLTP, or Worldwide Harmonised Light Vehicle Test Procedure, is the European standard for measuring energy consumption and emissions, and tends to be more generous than U.S. EPA estimates. EPA estimates are not yet available, but the XC40 Recharge will likely land somewhere around 200 miles.

That would put the range of the Volvo XC40 Recharge below the Tesla Model 3, Chevy Bolt EV, Kia Niro and Hyundai Kona.

However, Volvo did make a vehicle with impressive horsepower and fast-charging capability, which could attract buyers. The vehicle’s electric motor produces the equivalent of 408 horsepower and 442 pound-feet of torque, enough to take the vehicle from zero to 60 mph in 4.8 seconds. The battery charges to 80% of its capacity in 40 minutes on a fast-charging system.
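Those charging numbers imply an average charging power of roughly 94 kW. A quick sketch of the arithmetic (real charging curves taper off near full, so peak power would be higher than this average):

```python
battery_kwh = 78
charged_fraction = 0.80
minutes = 40

energy_kwh = battery_kwh * charged_fraction  # 62.4 kWh delivered
hours = minutes / 60                         # two-thirds of an hour
avg_power_kw = energy_kwh / hours            # average power over the session

print(round(avg_power_kw, 1))  # 93.6
```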

Volvo XC40 Recharge 1

Android-powered infotainment

The infotainment system in the all-electric Volvo XC40 will be powered by an automotive version of Android OS and, as a result, will bring embedded Google services such as Google Assistant, Google Maps and the Google Play Store into the car.

This Android-powered infotainment system is the product of a years-long partnership between the automaker and Google. In 2017, Volvo announced plans to incorporate a version of Google’s Android operating system into its car infotainment systems. A year later, the company said it would embed voice-controlled Google Assistant, the Google Play Store, Google Maps and other Google services into its next-generation Sensus infotainment system.

The Android-powered infotainment system is fully integrated with Volvo On Call, the company’s digital connected-services platform. Plug-in hybrid drivers using Volvo On Call will be able to track how much time they spend driving on electric power.

Volvo XC40 infotainment system

The infotainment system in the Polestar 2, the new vehicle from Volvo’s standalone performance brand, also is powered by Android OS.

Android Automotive OS shouldn’t be confused with Android Auto, which is a secondary interface that lies on top of an operating system. Android Automotive OS is modeled after Google’s open-source mobile operating system, which runs on Linux. But instead of running smartphones and tablets, Google modified it so it could be used in cars.

Volvo isn’t the only automaker to partner with Google to bring Android OS into its vehicles. GM said in September that Google will provide in-vehicle voice, navigation and other apps in its Buick, Cadillac, Chevrolet and GMC vehicles starting in 2021.

Over-the-air software updates

The electric XC40 is also the first Volvo that will receive software and operating system updates over the air. Over-the-air, or wireless, software updates were popularized by Tesla, which has used the capability to improve its vehicles over time. Tesla has used the OTAs to fix software bugs, roll out new features in its infotainment system and improve performance.

Volvo intends to use OTAs for the operating system and other software inside the vehicle, Green said. Other automakers, with the exception of Tesla, have slowly inched toward OTAs, but have minimized their use and limited them to the infotainment system.

“So now the XC40 will stay as fresh as your phone or tablet, and no longer will a car’s best day be the day it leaves the factory,” Green said.
