• Enabling the in-car cocktail party

    | May 09, 2018

    It's not what you think – we're not talking about drinking and driving! Creating an automotive speech recognition system that works well under all conditions has always been a very challenging proposition. The car’s acoustic environment has lots of loud, unpredictable sounds that compete with the driver’s voice – like wind noise, road noise, and traffic. You may be able to guess another big noise source from inside the car, especially if you’re a parent – the other occupants. Whether it’s children, companions, or colleagues, it’s not always easy to stop all chatter just so the car has a chance of recognizing what you say to it. Recognizing one speaker in a crowd is the so-called “cocktail party problem,” and it’s been a very difficult one for computers to solve, especially within the car.

    Friends talking in car

    This is why we’ve developed technology that can distinguish and separate multiple simultaneous voices, making the vehicle’s speech recognition more robust, useful, and accurate. Allowing the car to recognize hands-free commands without requiring other conversations to stop provides more natural speech interaction and a better user experience for everyone in the car. And since this solution only needs a single microphone, it doesn’t introduce additional hardware expense to the car. Great – but according to the US Census Bureau’s latest data, over 76 percent of commuters drive alone and therefore don’t need to worry about competing voices.

    Yes, but that’s today. In a few short years, perhaps even within one or two traditional automotive design cycles, mobility will be drastically transformed. Mobility as a Service (MaaS) is taking off and promises the ability for people to dynamically choose their mode of transport, with on-demand vehicle subscriptions and multiple car- or ride-sharing models.  Far from removing the need to talk to your car, we believe self-driving technology will result in even more opportunity to talk to, direct, and control your car. The future is multi-passenger – and conversational.

    Digital assistants like Amazon Alexa and Google Assistant are the hottest thing since sliced bread. As people gain exposure to, and comfort with, reliable digital assistants, they will increasingly rely on them to perform tasks in the car – tasks that don’t require their eyes or hands. That’s why we don’t just need digital assistants in our cars, we need ones that can listen to each one of us, even when we’re all talking.


    Jacek Spiewla
    User-Experience Manager, Advanced Development

  • Speaking to Smarter Cars

    | May 03, 2018

    If you or I were eavesdropping on someone’s conversation we’d probably be able to tell if the speaker was a man or a woman, if they were young or old, maybe even where they lived or their first language. Computers, on the other hand, have no such ability; in their attempts to understand human speech they often overlook contextual details and user characteristics that people naturally absorb.


    Speech recognition is one of the most challenging areas of computer science. While improvements have been slow to materialize, the technology is finally beginning to become a useful tool – as anyone who uses a virtual assistant can tell you. That said, the next major challenge in computerized speech recognition is understanding the contextual details of speech and the personal characteristics of those speaking.


    What would this advancement mean for cars? We believe it would be a solid building block for improving the in-car user experience. Let’s look at a few cases where a car can help its users once it can “listen between the lines.”


    • Juan asks his car to “Enciende la radio” (“Turn on the radio”). Since the car doesn’t understand that command, it checks it against a few different languages in its library and determines that the request is in Spanish. It then turns on the radio as requested and automatically switches the infotainment system, speech recognition, and instrument cluster settings to Spanish.


    • Olivia uses voice recognition to authorize herself as a new driver of her parents’ sedan. The car recognizes her and auto-sets the “safe-driving” profile her parents have configured for her. Because the car recognizes her voice as that of a young woman, it fine-tunes its speech recognition models for a higher-pitched voice to improve accuracy, while switching the default satellite radio presets from classic rock to electronic dance music.


    • Josephine flies from Montreal to Atlanta and starts up her rental car. The car detects her French accent and prompts her to see if she would like to switch to French. As Josephine is very comfortable in English, she replies “No”. The car then asks if she would prefer metric measurements and she readily answers “Yes”, so the car switches its units to metric.


    • As a Brooklyn native, Tony has never before owned a car. His brand-new car detects his New York accent and asks if he would like to enable the “urban native” features – street-parking warnings, parking-lot pricing display, and automatic traffic-avoidance suggestions. Tony knows the city but not how to navigate it by car, so he enables them right away.
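
    The language-fallback idea in Juan’s scenario can be sketched in a few lines. This is purely illustrative, not a production pipeline: the per-language recognizers below are hypothetical stubs standing in for real acoustic models, and the confidence threshold is an assumption.

```python
# Hedged sketch: fall back through candidate languages when a voice command
# isn't recognized in the current locale. Recognizer stubs are hypothetical.

def detect_language(audio, recognizers, threshold=0.6):
    """Try each language's recognizer on the audio; return the (language,
    transcript) pair with the highest confidence above the threshold, or
    None if no language matches well enough."""
    best = None
    for lang, recognize in recognizers.items():
        text, confidence = recognize(audio)
        if confidence >= threshold and (best is None or confidence > best[2]):
            best = (lang, text, confidence)
    return (best[0], best[1]) if best else None

# Stub recognizers standing in for real per-language models.
recognizers = {
    "en-US": lambda audio: ("", 0.10),
    "es-MX": lambda audio: ("enciende la radio", 0.92),
    "fr-CA": lambda audio: ("", 0.25),
}

detect_language(b"<audio frames>", recognizers)
# → ("es-MX", "enciende la radio"), so the car can switch its locale settings
```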

    These are just a few examples of how determining voice characteristics can help improve the user experience in a car. We at Mitsubishi Electric are looking into this along with how voice context can be merged with information from social networks to let the car guide preferences in music, shopping, restaurants, or even the in-car environment. With the appropriate smarts, the car can be an amazing companion in delivering a consistently great experience.

    Jacek Spiewla
    User-Experience Manager, Advanced Development

  • Let It Snow

    | Feb 16, 2018
    Snowy winter driving

    If you live in the Snow Belt, you might be tempted to take autonomous driving technology with a dose of (road) salt. You’ve driven in winter and you know what it’s about. Sometimes the road is covered with black ice, snow flies off another car’s roof, or chunks of ice litter the road – sometimes it’s hard to see the road at all, making you wonder how today’s ADAS features function in inclement weather. You may even have fond memories as a young driver practicing your winter driving skills in an empty snow-covered parking lot with Mom or Dad. If you do, you know that self-driving cars are going to need their own snowy driving drills before they are up for the task.


    That’s exactly what we’re doing with our self-driving technology at Mitsubishi Electric. Our high-precision mapping technology allows us to always know where the road is – even if it’s covered in a white blanket of snow. This is crucial for self-driving cars in winter and has been a major stumbling block for the industry to date. We’ve already tested it with our own “parking lot” test and road tests are currently in progress.


    If you’ve been watching the PyeongChang Winter Olympics like I have, it’s hard not to be astounded at the world’s best athletes shredding the slopes. For the more adventurous among you, those amazing skiing and boarding skills might have awakened the urge to hit the hill a bit yourself. Consider this: By the time the 2022 Winter Olympics start in Beijing, your car will be able to take you to your local ski resort and back – all by itself.

    Gareth Williams
    Executive Director of Advanced Development

  • Portrait Displays and the 2019 RAM 1500

    | Jan 17, 2018
    CNET 2019 RAM 1500 Infotainment

    If you had a chance to see some of the more exciting announcements coming out of the Detroit Auto Show (aka NAIAS or the North American International Auto Show), you may have seen the new 2019 RAM 1500. If you did, I’m sure you didn’t miss the beautiful new 12” portrait-mode uConnect infotainment system. Here’s a glowing review from CNET Roadshow.

    2014 Cadillac CTS demo with FlexConnect


    In that video, the reviewer Antuan Goodwin compares the RAM 1500 display to one in the Tesla Model S. That particular comment got us thinking – at Mitsubishi Electric, we’ve been doing huge portrait displays for a long time now. We were building portrait mode infotainment systems with our FlexConnect platform before the Model S made luxurious portrait displays the hot new thing. Here’s a 2014 Cadillac CTS outfitted with a portrait FlexConnect (conveniently parked in a cafeteria for our press event that year).




    FCA Police Charger w/ FlexConnect

    We think portrait mode displays are the future – and Praveen Chandrasekar at Frost & Sullivan agrees in his piece about the Volvo XC90 display. As another example, we supply the mobile command center for a different FCA product we can talk about: the Law Enforcement Dodge Charger. 


    We love the 2019 RAM 1500 infotainment system, and we’re really happy it’s getting rave reviews. We’ve always loved trucks, and we’re glad they’re getting some long overdue attention as one of the coolest cockpits at NAIAS.


    Gareth Williams
    Executive Director of Advanced Development

  • CES 2018 - Day three highlights: The battle of the digital assistant

    | Jan 12, 2018

    CES 2018 banner

    Moving out of the automotive-centric CES north hall into the south and central halls, one thing becomes clear. This is the year for digital assistants – primarily Amazon Alexa and Google Assistant – and they’re fighting for dominance in the technology landscape. Voice technology has finally reached the point where it’s usable by the general public without the hassle and pain of earlier solutions. For the car, that’s great news as voice assistants are more than just a cool feature – they can actually improve safety while increasing functionality and productivity.

    Hey Google Jeep dialog @ CES 2018

    Amazon is ahead in the number of announced integrations and they seem to be working hard to be an easy integration partner. Alexa-based rollouts in the works now include Toyota and Lexus, with previously announced members Ford, Volkswagen, Hyundai, and Volvo. Alexa has also made a big bid in the home, with ecosystem partners on the computer side like Acer, Asus and HP, as well as appliance manufacturers like Whirlpool and Kohler. As we see more home/car integrations, this ecosystem dominance will strongly tie the two together. We can immediately think of several use cases.


    This isn’t to say that Google is standing still. Partners include Honda, Hyundai, GM, and Kia. In the home, they’re going into LG ThinQ, JBL speakers, and Sony TV. Google is clearly making a push to try to unseat Amazon’s market lead.

    Some products incorporate both Google and Amazon digital assistants. Will they both be active simultaneously, dependent on keyword, or will they be enabled exclusively? Time will tell but we may end up seeing a couple of different approaches to handling multiple digital assistants.
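
    One plausible approach for products hosting both assistants is exclusive keyword dispatch: each assistant wakes only on its own trigger phrase. Here’s a minimal sketch of that idea – the wake-word table and the dispatch rule are our own assumptions for illustration:

```python
# Sketch of keyword-based dispatch between co-resident digital assistants:
# whichever wake word starts the utterance claims the whole command.

WAKE_WORDS = {
    "alexa": "Amazon Alexa",
    "hey google": "Google Assistant",
}

def route_utterance(utterance):
    """Return (assistant, command) if the utterance begins with a known
    wake word, else None (no assistant activates)."""
    lowered = utterance.lower()
    for wake, assistant in WAKE_WORDS.items():
        if lowered.startswith(wake):
            command = utterance[len(wake):].strip(" ,")
            return assistant, command
    return None

route_utterance("Alexa, unlock the doors")
# → ("Amazon Alexa", "unlock the doors")
```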

    Of course, CES’s new auto-related tech isn’t all voice assistants. LIDAR companies claiming the fastest, cheapest, smallest, or otherwise best technology – from Velodyne, Quanergy, AEye, Luminar, LeddarTech, and Innoviz – are all jockeying for a position in a market that’s about to explode.

    Yamaha drone

    And while there are a huge number of drone companies with use cases ranging from photography and security patrols to emergency supply delivery and medical assistance, curiously none of them are showing automotive use cases like we’re demoing in our booth.

    It’s been a packed three days – it’s almost time to look forward to next year!


    Gareth Williams
    Executive Director of Advanced Development

  • CES 2018 – Day two highlights: Bringing it home

    | Jan 11, 2018
    Ford 'living street' @ CES2018

    We’ve been saying it for some time and it’s our theme for the show: Automakers no longer want to be defined by the traditional (and narrow) definition of automotive but want to be seen as mobility providers. Moreover, they want to influence not only how we commute but also our lifestyle. Nowhere was this more evident on the CES show floor than with Ford. This Motor City giant has an impressive vision for the connected city. Their booth is in fact a “living street”, complete with a self-driving delivery car that can cut down on traffic and parking hassles by allowing groceries, dry cleaners, and other businesses to share delivery vehicles. This makes more room for green living spaces – and people. Definitely worth a visit.


    Jeep made a solid lifestyle play on their booth with a Home-to-Car demo that features Amazon Alexa and Hey Google. Owners of the 2018 Jeep Cherokee Latitude with the optional “Tech Connect” package can now ask their favorite digital assistant to remotely start and stop the engine, lock and unlock the doors, monitor vehicle vitals, and more.


    Honda autonomous ATV @ CES2018

    Honda rules the roost with robotics but another standout is their autonomous ATV, built on their existing rugged ATV chassis, and with heavy-duty rails on top to accept multiple different accessories. It’s not only interesting for things like fire and rescue, and construction, but Honda also imagines this for personal use. Want extra help around the ranch – feeding cattle or cutting weeds? What about plowing the snow out of your driveway, hauling rocks, or raking leaves? Program your rugged D18 to take care of it.


    Mercedes MBUX @ CES2018

    Mercedes had some very cool steering-wheel haptic HMIs that showed off their new MBUX user interface with natural language integration – CNET has a great detailed review. It’s almost hard to look at the HMI as it’s right next to their gorgeous AMG electric supercar – 0-200 km/h in under 6 seconds – but we managed and so should you.


    Although things continue to be extremely busy on the Mitsubishi Electric booth, we hope to get further afield tomorrow to check out the drone and robotics technology. Once again, stay tuned!


    Gareth Williams
    Executive Director of Advanced Development

  • CES 2018 – Day one highlights: Industry synergy

    | Jan 10, 2018
    We spent most of our time in the North Hall on day one where the majority of automotive tech is focused – sandwiched in-between other meetings, of course. What did we see?


    Toyota 'beloved intelligence' @ CES2018

    As expected, AI and autonomous driving were focal points for many. Toyota showed their “beloved intelligence” concept car – a vehicle that learns what its owner wants and tries to deliver more of the same, much like a favorite pooch. Kia had something similar in their Niro EV – watching drivers’ faces to predict their music and playlists. Both of these concepts are things we’ve discussed as predictive HMIs (and of which we’re demoing our own flavor). Something we hadn’t seen before: Nissan and their “mind reading” technology. Like a Muse headband but for the car, the idea is to make the car a partner in the driving activity rather than a replacement – predicting early on what drivers might do and helping the car take action or warn of trouble situations. It’s definitely an intriguing use of wearable technology, but it’ll be interesting to see if it has any legs when it comes to consumer usability.


    Byton @ CES2018

    EV continues to gain traction; every automaker we saw had more EV models in the pipeline – Kia announced 16 new EV models by 2025 – and some like Hyundai are even looking at hydrogen fuel cells as an alternative option. Chargepoint had a great booth showcasing their latest charging and power stations. And taking the place of last year’s buzz around Faraday Future was the buzz over Byton, another Chinese-funded, cleanly designed, and incredibly sexy EV. It’s the next competitor in a long line (the Karma, NextEV, Lucid, etc.) attempting to unseat Tesla. Will they be successful? Check back next year to see.


    Notable too was the surge of 5G-related technology and announcements – Qualcomm, Baidu, Verizon, and Nokia are all throwing their hats in the ring. Think of a fully mobile network with Wi-Fi data speeds and you’ll sense the excitement. Qualcomm also discussed C-V2X, or cellular V2X – using car-to-car LTE for fast, low-latency communication in a way that could finally make V2X a reasonable proposition. We’ll have to wait for the network operators to make it ubiquitous before the automakers jump on board, but they’ve already started dipping in their toes … it won’t be long.


    Hyundai home @ CES2018

    Finally, we saw automakers stretching their market muscle into non-automotive applications of their technology like Hyundai with their hydrogen powered home or Toyota with their personal mobility “scooter”. We’ll expect to be seeing a lot more of this trend as soon as we get the chance to cover more of the floor! Stay tuned.


    Gareth Williams
    Executive Director of Advanced Development

  • Autonomous reality check: The pressing need for highly accurate maps

    | Jan 09, 2018
    Five years ago, the most sophisticated in-car navigation system displayed roads using GPS to locate a vehicle within five yards, providing fairly reliable turn-by-turn driving directions. However, the system could be wrong by 50 yards in densely populated areas with urban canyons, and fail completely in tunnels. Too many of us have horror stories of ending up in a sand pit instead of a campground or someone’s driveway instead of a parking lot. The proliferation of Google Maps on smartphones somewhat improved the situation in that most maps were updated far more frequently than their in-car cousins, but as a GPS-based system its accuracy was still unreliable in many situations.


    For the coming generation of autonomous vehicles, the convenience of low-precision maps is no longer enough. Centimeter-level accuracy becomes critical.


    A few years ago, some automakers had hoped that autonomous vehicles might be able to position themselves using low-definition maps and high-powered sensors. With clear road markings, visual sensors could keep cars safely within lanes and spot the dotted lines indicative of exits.


    The problem is that a fully driverless car needs to operate safely in all environments and under all conditions. LIDAR has an effective range of around 50 yards, but that can dwindle significantly in a snowstorm or when other vehicles obscure objects. Even the smartest car travelling on a freeway can only “see” ahead of itself about a second and a half. Self-driving cars need to be able to anticipate turns and junctions far beyond their sensors’ horizons. More importantly, perhaps, they also need to locate themselves precisely, as an error of a couple of yards could place a driverless car in oncoming traffic.
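
    That “second and a half” figure follows directly from the sensor range once you assume a freeway speed – 70 mph is our assumption here:

```python
# Quick back-of-the-envelope check: how many seconds of look-ahead does a
# 50-yard sensor range buy at freeway speed? (70 mph is an assumed speed.)

SENSOR_RANGE_YD = 50
FREEWAY_MPH = 70

yards_per_second = FREEWAY_MPH * 1760 / 3600  # 1 mile = 1,760 yards
horizon_s = SENSOR_RANGE_YD / yards_per_second

print(round(horizon_s, 2))  # ≈ 1.46 seconds – about a second and a half
```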


    That’s why we’re working to help advance autonomous technology on two fronts: building highly accurate maps and building highly accurate position determination into cars.


    Mitsubishi Electric MMS scanning system

    To create high-precision 3D maps, we’ve developed the Mobile Mapping System (MMS), a self-contained sensor-studded platform with multiple cameras, LIDAR, GNSS receivers, and processors mounted on a vehicle. The system scans a road segment to create a comprehensive and highly accurate digital representation of that road – one that is used for both training autonomous systems as well as for creating extremely accurate maps.

    For high-accuracy position determination, we provide a number of technologies that are integrated into the car’s sensor network. We combine inputs from several satellite positioning systems to improve the traditional accuracy that comes from using only one. (Our experience in satellites includes the latest satellite positioning network sponsored by the Japanese government.) We also have technology that improves the accuracy of standard dead reckoning systems – augmenting wheel tick sensors and rough directionality with camera images that track road position and speed – resulting in far better position estimations in the absence of satellite signals.
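
    As a rough illustration of the dead-reckoning piece – a deliberately simplified sketch, since real systems fuse many more sensors and model error growth – wheel-tick distance can be integrated along the vehicle’s heading between satellite fixes. The encoder resolution below is an assumed value:

```python
# Minimal dead-reckoning sketch: integrate wheel-tick distance along the
# current heading to propagate a position estimate between GNSS fixes.
# TICKS_PER_METER is an assumed wheel-encoder resolution.

import math

TICKS_PER_METER = 20.0

def dead_reckon(x, y, samples):
    """Advance an (x, y) position in meters from (wheel_ticks, heading_deg)
    samples; heading is measured clockwise from north."""
    for ticks, heading_deg in samples:
        distance = ticks / TICKS_PER_METER
        heading = math.radians(heading_deg)
        x += distance * math.sin(heading)  # east component
        y += distance * math.cos(heading)  # north component
    return x, y

# Drive 10 m north (200 ticks at heading 0°), then 10 m east (heading 90°).
dead_reckon(0.0, 0.0, [(200, 0.0), (200, 90.0)])
# → approximately (10.0, 10.0)
```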

    We’re working closely with industry mapping experts to further augment our technology. Visit our website for more information and then drop by our booth at CES to learn more.



    Mark Rakoski

    Vice President of Sales, Engineering, and R&D Global Business Development

  • Harmonizing the home and the hatchback

    | Jan 05, 2018

    The smart home market is growing almost as rapidly as the connected car market but so far the two have yet to fully converge and what their consolidated future will look like is still up for grabs. When most people envision the connection between car and home, they often think of integrated media, navigation, or energy management. We at Mitsubishi Electric think of much more. In fact, we see cars working in harmony with smart homes in a way that offers exciting new conveniences and cost savings in a seamless and intuitive manner.

    Car-to-home integration
    Predictive integration
    It’s easy enough to imagine a driver manually using a vehicle interface to turn up their home’s heating on the way home from work. Or using a smart home assistant to check on the car’s charge status or fuel level. But imagine the benefits that could result from the car and home working together on their occupants’ behalf without continuous human intervention, using a blend of home-to-car connectivity and predictive HMI technology.

    For example, the car could automatically notify the home at the end of the work day as it gets progressively closer so that the home could bring its temperature up or down according to owner preferences and time of year. Once the car signals to the home that it’s pulling into the driveway, the home could turn on the lights for a welcoming arrival. Conversely, as the last occupant leaves for the day, the car could alert the home that everyone was gone, asking it to adjust the temperature, turn off the lights, lock the door, and arm the security system.
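
    A minimal sketch of that progressive-notification logic follows – the event names, the thresholds, and the notify_home callback are all hypothetical:

```python
# Illustrative sketch: as the car's estimated time to home shrinks, emit
# coarse events the smart home can act on. Thresholds are assumptions.

PREHEAT_LEAD_S = 15 * 60   # start conditioning the home 15 minutes out
ARRIVAL_RADIUS_M = 50      # close enough to count as "pulling in"

def check_proximity(distance_to_home_m, speed_mps, notify_home):
    if distance_to_home_m <= ARRIVAL_RADIUS_M:
        notify_home("arriving")   # lights on, door unlocked, etc.
    elif speed_mps > 0 and distance_to_home_m / speed_mps <= PREHEAT_LEAD_S:
        notify_home("inbound")    # adjust temperature ahead of arrival

events = []
check_proximity(9000, 15, events.append)  # 9 km out at 15 m/s → 600 s ETA
# events is now ["inbound"]
```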

    Voice-activated integration
    Digital assistants with conversational speech interfaces can provide another key technology to bridge the car and home. Market researcher Ovum forecasts the digital-assistant market will grow from 3.6 billion devices in 2016 to 7.5 billion by 2021. This means there will be almost as many digital assistants in 2021 as there are humans on the planet today, creating a tremendous opportunity for the automotive industry.

    Virtual assistants like Amazon Alexa, Apple Siri, or Microsoft Cortana could provide a sense of delight that keeps car ownership sexy. Imagine your customer’s surprise the very first time their assistant asks, “I see you have an appointment across town in an hour; would you like me to warm up the car?”

    New mobility integration
    Home-to-car integration may be just as useful in new mobility scenarios; add in digital assistants and you’ll have some killer applications. For example, vehicle owners in car sharing pools could use a virtual assistant to find out what’s happening with their car – where it is, who’s driving it, who’s in the passenger seat, and how fast it’s going. And, if they need to take their car back for the day, they could find out when it’s expected to complete its current trip and arrive home.

    Mitsubishi Electric has one of the best-integrated infotainment offerings, FLEXConnect.AI, which makes it perfectly positioned for developing car-to-home and home-to-car use cases. We’re also experts at virtual assistant integration. We’ll be showing some innovative ways to take advantage of both trends in our CES 2018 demo car. Visit our CES web page for more information and then make an appointment to see us next week in Vegas.

    Jacek Spiewla
    User-Experience Manager, Advanced Development

  • Biometrics: The key to your car

    | Jan 03, 2018

    The same advanced technology that makes cars harder to steal also makes car keys harder to copy. So while losing a car key has never been fun, these days it’s an expensive proposition that most consumers would rather avoid. One solution is to let wearables act as car keys. With at least a quarter of American adults owning a wearable and estimates on wearable adoption continuing to rise, this may be the ideal way for cars to recognize and authenticate their owners. At Mitsubishi Electric, we’ve been investigating a biometric wearable strategy for vehicle authentication and have come up with the following ideas we’d like to share.


    Lifestyle branding

    Car-aware wearables could provide automakers with the perfect way to extend a premium brand onto an owner’s wrist. A Cadillac, Lexus, Mercedes, or Porsche wrist strap would allow consumers to enter their cars while acting as an extension of those same brand qualities into other mobility spaces. Rather than a single-purpose widget, an OEM-branded biometric wearable could interface with a smart home, favorite apps, personal devices, or enterprise equipment, giving OEMs a much broader mobility presence. This capability could also take advantage of consumers’ existing wearables with branded applications on FitBit, Apple Watch, or Android Wear.


    Car-sharing convenience

    Biometric wearable unlocking car

    Biometric wearables could make a great accessory for car sharing. By authenticating the owner through their biometric identification, a car sharing service could guarantee proper access to whatever vehicles an individual is allowed to use without needing to worry about pass codes or physical keys. That freedom could extend to car subscription models where someone swaps cars on a dealer lot at will, with a person’s biometric data providing the link to an online account, payment, and insurance details.


    In addition to replacing the physical key, car rentals, car sharing, and ride sharing would all benefit from the car’s ability to confidently and uniquely recognize an individual and personalize itself accordingly. Biometric identification would allow a car to automatically access and provide many convenience features like smartphone pairing, calendar syncing, and routing to preset locations from either personal mobile or cloud repositories. People’s preferred traveling identities would follow them wherever they go without worry about smartphone loss or theft.


    Remote control

    With a biometric wearable such as a smartwatch, people would have the ability to remotely control their car from anywhere, with more identity assurance than a physical key fob provides. Your customer could roll down the windows and unlock the car for the kids while still inside packing lunches. Or they could open the trunk to let the neighbor borrow a scissor jack while they’re on vacation.


    Increased security

    Accessing cars via biometric prints means that consumers can’t lose their “keys”. It also means that a thief can’t use a stolen key to take someone else’s car without the appropriate and unique fingerprint, iris, voice, or ECG. Biometric security also makes it far simpler for fleet or business owners to provision vehicles since transferring cars could take place via a few mouse clicks instead of a cumbersome exchange of key fobs.


    Improved UX
    Biometrics could identify everyone in the vehicle, not just the owner (or whoever’s got the key fob). This fine-grained information about a car’s occupants could provide a much improved experience, from individual mobile payments to per-seat personalization. With individual identification, the infotainment system could automatically adapt to the driver and front-seat passenger, and rear-seat entertainment could show everyone’s favorite shows.


    These are just some of the reasons we think biometrics will gain traction within automotive. We’ll be showcasing a subset of these use cases on our booth at CES 2018 – be sure to make an appointment to see them and our other innovations.

    Gareth Williams
    Executive Director of Advanced Development

  • Augmented reality: Amping up personalization and car sharing

    | Dec 20, 2017

    Imagine this: A city street crowded with cars and a sidewalk teeming with dozens of people, all looking at their phones. Except these people aren’t looking down at their phones, isolated in their personal conversations. Instead, they’re holding their phones up at street level, using them as augmented reality (AR) portals, viewing cars on the street. What do they see?

    Yellow car with AR car sharing tags
    It’s an interesting thought experiment, one that we’ll be answering in our demo car at CES 2018. The idea of cars having an AR presence may seem like a fun concept without a lot of practical use. At Mitsubishi Electric we’ve imagined some fun as well as serious and eminently practical ways an AR portal could be used to delight your customers.

    • Personalization – At the playful end of the spectrum, people could dictate a text message that hovers over the vehicle as a refreshable bumper sticker, broadcasting a person’s passions. This could also be a graphical image, letting creative people take car customization to the next level and graffiti artists tag their own transportation.

    • Family car – For a shared family car, a private AR message could be anything from “Low on gas, sorry Dad!” to “Pick up Emma at ballet @ 2pm”. This type of messaging becomes even more effective if the smartphone is also used to unlock and start the car.

    • Taxi – Taxis could show their availability status – as well as their rates – in big bubbles floating over their cars. As soon as someone steps in for a ride, the bubble would disappear. This could make it super easy to identify which taxis are in service, even from some distance away.

    • Car sharing – For car-sharing services like Zipcar, car2go, or Turo, a bubble over a shared car could advertise its availability, hours, prices, pick-up points, or any number of useful things. Instead of sharable cars only being advertised online, they’d be promoting themselves everywhere they go.

    • Lyft pickups – It’s a common scenario: you’re scouting for your Uber or Lyft car that says it’s arrived … only you can’t find it. Ride shares could show big tags with customer names – visible only to the relevant customers – letting people use their phones to scan pickup points and easily find their rides.

    These are just some of the concepts we’ve got for mixing AR and cars based on our many years of experience in infotainment and UX. We think this is a rich new way to let drivers communicate to others and we’ve got a prototype that shows it working. Make an appointment to stop by our CES booth in January 2018 and we’ll show it to you in action!

    Gareth Williams
    Executive Director of Advanced Development

  • Changing the way we pay

    | Dec 12, 2017

    Are consumers ready to put their wallets and their smartphones aside, and use their cars to make everyday purchases? While most people still feel best using a credit or debit card (or even cash), this scenario is not as far off as some may think.

    “Mobile payments” is a big term that can mean anything from buying something in person to sending someone an e-check. Most people are familiar with the point-of-sale payment commonly known as tap to pay. Lots of stores have systems in place, and some retailers – fast food chains, gas stations, coffee shops, and the like – have what’s called closed-loop mobile payments, which are company specific. Similarly, many people have either used PayPal, Google Wallet, or Apple Pay on their phones or know someone who has.

    Buying coffee with car+mobile payments

    As cars increasingly become smartphones on wheels, it’s logical if not inevitable that they will soon be taking care of purchases too. In-vehicle payments make a lot of sense – who wants to fumble with a credit card to pay for gas in the middle of a snowstorm? It’s easy to envision other scenarios as well: drive-thru windows, parking lots, car washes, etc. But the benefits of in-vehicle payments don’t stop there.

    Cars could be used as trusted payment providers to consolidate individual orders and payments, making complex transactions trivial. For example, a car could be used in a drive-thru situation where each person has an individual order. It could also be used to fairly distribute payments for transportation-related costs (like toll roads or bridges, gas, and parking).

    How would this work? The car creates a trust fabric – a number of nodes that reciprocally trust each other – from individually associated devices. Those devices are smartphones or wearables (biometric devices) – things that can uniquely identify a person with a guarantee of authenticity. The trust fabric gives payment systems access to selected attributes (such as payment methods) of the authorized individuals within the car. The car basically acts as a broker to external entities, making a single consolidated order and payment from several individual transactions.
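    To make the broker idea concrete, here’s a minimal Python sketch of a car consolidating per-passenger orders into one merchant payment. Every name here (`CarBroker`, `Device`, the payment tokens) is hypothetical – a toy model of the trust-fabric concept, not a real payment API.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Device:
        """A passenger device (phone or wearable) that can authenticate its owner."""
        owner: str
        payment_token: str  # opaque token exposed to the broker, never raw card data

    @dataclass
    class CarBroker:
        """The car as trust fabric: only enrolled devices may place orders."""
        devices: list = field(default_factory=list)
        orders: dict = field(default_factory=dict)  # owner -> amount owed

        def enroll(self, device: Device) -> None:
            self.devices.append(device)

        def place_order(self, owner: str, amount: float) -> None:
            if not any(d.owner == owner for d in self.devices):
                raise PermissionError(f"{owner} is not in the trust fabric")
            self.orders[owner] = self.orders.get(owner, 0.0) + amount

        def consolidated_payment(self) -> float:
            """One payment goes to the merchant; per-owner amounts settle internally."""
            return round(sum(self.orders.values()), 2)

    car = CarBroker()
    car.enroll(Device("driver", "tok-1"))
    car.enroll(Device("passenger", "tok-2"))
    car.place_order("driver", 4.50)
    car.place_order("passenger", 3.25)
    print(car.consolidated_payment())  # 7.75
    ```

    The key design point is that the merchant sees a single trusted payer (the car), while the fair per-person split stays inside the trust fabric.
    
    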

    Imagine placing an in-vehicle coffee order for four. The driver and front passenger use the IVI system while the rear-seat passengers use their phones or an RSE interface. The car locates the nearest retail store and plots a route to it while sending in the complete order. It also exchanges payment information and informs the store of the car’s anticipated arrival time. Once the car pulls up to the drive-thru, it communicates its trusted ID to the coffee shop's order and payment system – which lets the person at the window verify the order, assemble it, and serve it – all without a coin or card having been exchanged, making the whole experience more pleasant while saving the driver time.

    This is the type of value-added service that FLEXConnect.AI makes possible. We’re demonstrating this and other innovations at CES in just over a month. Be sure to book an appointment to come see us if you’re going to be at the show. 


    Mohammad Horani     Mohammad Horani
    Manager, Advanced Development

    Go comment!
  • The Silicon Valet

    | Dec 05, 2017

    In Iron Man, Tony Stark relies on Jarvis, his holographic, always-available virtual assistant, for everything from passcodes to propulsion. What makes an AI like Jarvis so amazingly handy isn’t just its capabilities but its ability to predict what Tony needs – most of the time even before his needs are expressed.

    Your robot assistant is coming soon

    While today's car isn’t going to help build a fist-size nuclear fusion reactor anytime soon, it certainly could be a bit more helpful than it is right now. What if, like Jarvis, a car could predict people's needs and offer helpful, timely suggestions or subtly change its behavior to anticipate them? What if a car could save time in accomplishing a driver's most frequent and repetitive tasks? We’re exploring this in our latest CES demo car by extending our FLEXConnect.AI infotainment system with the concept of a predictive HMI (human-machine interface). What might this look like?

    • Your car looks at your linked phone and sees you’ve got “Pick up eyeglasses” on your to-do list. When you step in the car, the car offers, “I see you need to pick up your eyeglasses. Dr. Scott’s office opens tomorrow morning at 7:30 AM, so you’d have plenty of time to pick them up before your first meeting at 10:00 AM. Would you like me to remind you tomorrow?” When you say yes, the car alerts you in the morning before work and routes you to the optometrist avoiding the morning rush-hour traffic.
    • Friday is the family’s pizza and movie night, and your car notices that every Friday evening you seem to head over to Newport Pizza. On your way home from work the next Friday, it asks, “Would you like to get your standard pizza from Newport’s for a 6:30 PM pick-up as usual? I can also dial them for you if you’d like a different pizza tonight.”
    • Your car sees that you almost always prefer to roll down the windows instead of using climate control. It also notices that you usually listen to news radio during the morning commute but your iPhone music at other times. Accordingly, it reconfigures a single programmable display dashboard button to show “Heat” when it’s below 45 degrees outside, “Cool” when above 85 degrees, “WJR” in the morning, and “iPhone” for afternoon or evening.
    • Your car tells you, “It’s Anna’s birthday tomorrow. Would you like to get her a gift? There are a few stores within ten minutes of here that have items from her wish list.”
    • You’re on a long road trip, and the car notices that your head is nodding a bit. “You seem a bit sleepy. Would you like a coffee? There is a Starbucks, a Tim Hortons, and a McDonald's all within five minutes of here.”
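    The adaptive dashboard button in the third example above boils down to a handful of context rules. Here’s a toy sketch – the thresholds and labels come straight from the scenario, but the function itself is purely illustrative:

    ```python
    def button_label(outside_temp_f: float, hour: int) -> str:
        """Pick the label for a single programmable dashboard button.

        Climate overrides take priority; otherwise fall back to
        time-of-day listening habits (news radio in the morning,
        the driver's iPhone music later in the day).
        """
        if outside_temp_f < 45:
            return "Heat"
        if outside_temp_f > 85:
            return "Cool"
        return "WJR" if hour < 12 else "iPhone"

    print(button_label(30, 8))   # Heat
    print(button_label(70, 8))   # WJR
    print(button_label(70, 18))  # iPhone
    ```

    In practice the rules themselves would be learned from observed behavior rather than hard-coded, but the HMI’s job is the same: surface the one control the driver most likely wants right now.
    
    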

    Cars currently have an amazing amount of information available to them, from linked devices and cloud accounts to driving history. With the power of big data and machine learning, they can correlate all this data, learning over time what discrete events occur in people's lives and building relationships between those events to understand what drivers might want. Like an online product recommendation, a predictive HMI can also compare patterns to those of other drivers to help understand what to expect next.

    For predictions to be unobtrusive and improve over time, an HMI needs feedback on its suggestions. Feedback lets the HMI explain why it offered a suggestion in the first place and correct itself under future circumstances. Here’s a dialog illustrating this point.

    Car: “Do you want to call Mike?”
    Driver: “Why?”
    Car: “Because he’s left two voice mails.”
    Driver: “No thanks!”
    Car: “No problem. Should I delete those voice mails?”
    Driver: “Sure.”
    Car: “Done.”
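    One simple way feedback like the “No thanks!” above could be folded in is a running per-suggestion score, so a suggestion the driver keeps declining stops being offered. This is a deliberately simplified sketch of the idea; the class and threshold are hypothetical, not a description of our implementation:

    ```python
    class SuggestionModel:
        """Track accept/reject feedback and decide what to keep offering."""

        def __init__(self):
            self.scores = {}  # suggestion id -> running score

        def feedback(self, suggestion: str, accepted: bool) -> None:
            delta = 1.0 if accepted else -1.0
            self.scores[suggestion] = self.scores.get(suggestion, 0.0) + delta

        def should_offer(self, suggestion: str, threshold: float = -2.0) -> bool:
            # Stop offering a suggestion the driver has repeatedly declined.
            return self.scores.get(suggestion, 0.0) > threshold

    model = SuggestionModel()
    for _ in range(3):
        model.feedback("call_mike_about_voicemail", accepted=False)
    print(model.should_offer("call_mike_about_voicemail"))  # False
    ```

    A real system would weight feedback by context (time of day, who is in the car), but even this crude scoring captures the core loop: suggest, observe the reaction, adjust.
    
    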

    Even with all the data in the world, a super-predictive HMI won’t always get everything right – just like a human assistant. But making it a smart, adaptable, and natural user experience – whether through a touch screen or a voice-based virtual assistant – allows it to become an indispensable part of tomorrow’s mobility solutions.

    As mentioned, we’ll be showing innovative ways to exploit this predictive HMI technology in our CES 2018 demo car. Visit our CES web page for more information on this and other innovations that we'll be showing in our booth. And don’t forget to make an appointment – times are filling up.

    Voratima Orawannukul     Voratima Orawannukul
    User-Experience Lead Engineer, Advanced Development

    Go comment!
  • The Crisis of Cool – and Hybrid Haptics

    | Nov 29, 2017

    A modern user interface communicates through subtle shadings, gestures, interactivity, and animation. While a rich, visual design language is in demand for phones and desktops, it’s not yet part of the automotive experience. Driver distraction concerns over the last several years have led to today’s simplified textures, high-contrast fonts, and static displays that fly in the face of the current UX paradigm, resulting in a stodgy, dated interface.


    What can a car UX designer do to make cars cool yet still keep them safe? One approach is to use what we at Mitsubishi Electric call hybrid haptics: touch screens with tactile components.


    A hybrid haptic begins with a physical knob, button, dial, or other tangible control. Whether the control is built into a screen bezel, attached directly onto a screen, or nomadically placed on various surfaces, a physical mechanism for a digital control takes advantage of our tactile nature. (Touch is the first sense we develop as infants and remains critical to our health and happiness throughout our lives.) By exposing select features through a fixed button or knob, a hybrid haptic lets users find frequently used functions instead of searching for them in levels of menus.


    However, a hybrid haptic is much more than just a traditional push-button or dial – it’s a physical control enriched with a touch screen (which can be either beside, underneath, or on top of the control) that displays its function, action, and state, and also allows for the control to be changed as circumstances dictate. Let’s look at a few examples.


    Microsoft Surface Dial

    Probably one of the more intriguing applications of hybrid haptics is the Microsoft Surface Dial. While intended for consumer devices – rather than automotive – this handy little gadget shows what’s possible with the creative use of a number of coordinating technologies. The freely placed Surface Dial connects to a tablet through Bluetooth, senses where it’s placed, and interacts with applications to dynamically control content. For example, while working in a drawing application, a digital artist can rotate the dial and select a color rather than navigate through floating tool bars to make a color selection. While the technology is definitely cool, you’re probably not going to see arbitrarily placed dials around the interior of a car anytime soon.


    Mitsubishi Electric hybrid haptics

    Our next example is strictly automotive – it’s a Mitsubishi Electric proof-of-concept in a Jaguar F-Pace demo car. To illustrate how hybrid haptics can make the car UX both more tactile and exciting as well as less distracting to the driver, we’ve mounted two large dials directly on the car’s touch screen. These dials give a user control of common rotary functions such as volume control, temperature adjustment, radio preset selection, and map zoom. What’s more, the dials can be reconfigured on demand – through simple voice commands – as the user’s needs change. Because the dials are mounted on the screen, the user still gets the physical sensation of turning them to directly engage with content – and because it's a touch screen, that content can dynamically change as needed.

    Art Lebedev Optimus Aux keyboard

    Our last example is an OLED switch; this particular example is from the Art Lebedev Optimus Aux keyboard. These cool little devices are push buttons with miniaturized screens on their faces. A handful of these rich graphical buttons could really liven up a center stack display while adapting to the user’s whim or the current car context by showing new functionality when it’s needed. Unfortunately, we don’t (yet) have a real-life example of these in automotive use.


    With hybrid haptics, you can really increase the cool factor, decrease the driver’s mental workload, and create an overall superior user experience. What’s not to like? To see these innovations in person, make an appointment to see us at CES 2018 in January.

    Jacek Spiewla     Jacek Spiewla
    User-Experience Manager, Advanced Development

    Go comment!
  • Letting urban drones do the hunting

    | Nov 17, 2017

    Nothing ruins a good trip – whether it’s an amusement park visit for the kids, getting the jump on Christmas shopping, or attending a Roger Waters concert – more than prolonged hunting for parking. Tom Vanderbilt, the author of Traffic, thinks that parking lot circling regresses us to a primitive state, saying “The way humans hunt for parking and the way animals hunt for food are not as different as you might think.”

    At Mitsubishi Electric, we’ve made it our mission to deliver a premium user experience to everyone in the car. It’s safe to say that the frazzled nerves and meaningless anxiety caused by parking lot stalking amounts to the exact opposite experience. So we started thinking, what if we could somehow use the car’s infotainment system to help drivers calmly find and reserve the nearest open parking spot? We think we’ve got a solution: Drones tied to the in-car infotainment system. Urban drones, to be precise, that act as personal parking-lot scouts and attendants. Here’s how it works.

    Drone holds parking spot

    As a driver approaches the entrance of a parking lot, the car’s drone (which is docked on the roof of the car) takes to the air. It rises above the crowd of cars and, seeing that there are no open spots in the current lot, flies to the next. With its camera, the drone zooms in to see a car pulling out of spot #26. It sends video of the newly available space through a wireless link to the FLEXConnect.AI system in the car. At that point, the driver reviews the proposed spot and clicks OK. The drone, having received the green light, flies down to the spot and hovers there to reserve it. Meanwhile, the nav system reroutes the car with a map and turn-by-turn directions to the safely held acquisition.
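    Under the hood, this scenario is a small state machine: the drone moves from docked, to scanning, to awaiting the driver’s confirmation, to holding the spot. A schematic Python sketch – the states and allowed transitions here are illustrative only, not a real drone control API:

    ```python
    from enum import Enum, auto

    class DroneState(Enum):
        DOCKED = auto()                 # stowed on the car roof
        SCANNING = auto()               # airborne, surveying lots
        AWAITING_CONFIRMATION = auto()  # spot found, driver must click OK
        HOLDING_SPOT = auto()           # hovering over the reserved spot

    # Legal transitions in the parking-scout workflow
    TRANSITIONS = {
        DroneState.DOCKED: {DroneState.SCANNING},
        DroneState.SCANNING: {DroneState.AWAITING_CONFIRMATION, DroneState.DOCKED},
        DroneState.AWAITING_CONFIRMATION: {DroneState.HOLDING_SPOT, DroneState.SCANNING},
        DroneState.HOLDING_SPOT: {DroneState.DOCKED},
    }

    def step(state: DroneState, nxt: DroneState) -> DroneState:
        if nxt not in TRANSITIONS[state]:
            raise ValueError(f"illegal transition {state.name} -> {nxt.name}")
        return nxt

    s = DroneState.DOCKED
    for nxt in (DroneState.SCANNING,
                DroneState.AWAITING_CONFIRMATION,
                DroneState.HOLDING_SPOT):
        s = step(s, nxt)
    print(s.name)  # HOLDING_SPOT
    ```

    Modeling the workflow explicitly like this makes the safety-relevant cases – a rejected spot sends the drone back to scanning, and a held spot can only end with the drone re-docking – easy to enforce.
    
    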

    Imagine how much calmer people will feel with cars and drones doing all the heavy lifting. Make an appointment to visit our booth at CES 2018 in January for a demo that will actually make your customers look forward to their next mega-parking-lot experience.

    Gareth Williams     Gareth Williams
    Executive Director of Advanced Development

    1 Comment