Automotive windscreen heads-up displays

The IoT, with its data-gathering, analysis and communications capabilities, has enormous potential to entertain us, inform us and keep us safe throughout our car journeys. However, how do we access and interact with this information safely and legally while we’re preoccupied by driving? Mobile phones and satnav systems are obvious conduits for such information, but users risk both accidents and prosecution if they use them while driving. Even changing channels on a radio can incur penalties if it leads to careless driving or an accident.

One solution now being introduced by car manufacturers and aftermarket suppliers is the head-up (or heads-up) display, or HUD. An HUD is any transparent display that presents data without requiring users to look away from their usual viewpoints. HUDs were originally developed to allow pilots to view information with the head positioned ‘up’ and looking forward, instead of looking down at lower instruments. Another advantage is that the pilot’s eyes do not need to refocus to view outside after looking at the closer instruments.

The first military HUDs were extensions of pippers or PIP (Predicted Impact Point) markers. These displayed a probable point of impact or location for a bomb, missile or bullet. As technology advanced, displays expanded to include ballistic variables such as aircraft velocity, target velocity, target elevation, distance to target, drag and gravity forces on the projectile and others.

In the 1970s, HUD deployment extended to commercial as well as military aircraft, and in 1988, the Oldsmobile Cutlass Supreme became the first production car with a heads-up display. Since then, other car manufacturers have also adopted the technology on their sports or luxury vehicles. The first European automaker to offer an HUD interface was BMW. Aftermarket HUD systems also exist; these project the display onto a glass combiner mounted above or below the windscreen.


Fig.1: BMW heads-up display – Image via Wikimedia Commons

Heads-up display operating principle

A heads-up display system usually comprises a combiner, a projector unit and a video generation computer. The combiner is the surface onto which the data is projected, located directly in front of the pilot or driver; it can be implemented by special treatment of the windscreen glass. It may be either concave or flat, and carries a special coating (most commonly phosphor) which reflects the monochromatic light from the projector unit while allowing all other wavelengths of light to pass through. This creates a highly visible fluorescent or phosphorescent image that overlays the driver’s real view.

Combiners also set the distance of the HUD’s virtual image. In test situations, a projected HUD image which appears near the nose of the vehicle is said to produce the fastest response times and best situational awareness on the part of the driver, as well as facilitating better driving quality.
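The virtual image distance mentioned above follows from basic mirror optics: a concave combiner behaves like a concave mirror, and placing the projected source inside its focal length produces a magnified virtual image that appears out beyond the windscreen. The sketch below illustrates this with the standard mirror equation; the focal length and source distance are assumed example values, not any manufacturer’s design data.

```python
# Illustrative optics sketch: a concave combiner acts like a concave mirror.
# With the source inside the focal length, the mirror equation
# 1/f = 1/d_o + 1/d_i gives a negative d_i, i.e. a magnified virtual image.

def virtual_image(focal_length_m, source_distance_m):
    """Return (image_distance_m, magnification) from the mirror equation.

    A negative image distance means the image is virtual, appearing
    |image_distance_m| beyond the combiner from the driver's viewpoint.
    """
    image_distance = 1.0 / (1.0 / focal_length_m - 1.0 / source_distance_m)
    magnification = -image_distance / source_distance_m
    return image_distance, magnification

# Assumed example: 0.4 m focal length, source panel 0.3 m from the combiner.
d_i, m = virtual_image(0.4, 0.3)
print(f"virtual image {abs(d_i):.1f} m beyond combiner, magnified {m:.1f}x")
```

With these assumed values the virtual image sits 1.2 m beyond the combiner at 4× magnification, which is why the display appears to float near the nose of the vehicle.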

The projector unit uses one of three types of light-emitting source to project the image: a cathode ray tube (CRT), a light-emitting diode (LED) or a liquid-crystal display (LCD). In the early days of HUDs the image was relayed through refractive optics, but modern HUDs use reflective optics for improved readability.

Heads-up display electronics

The projection system’s quality and functionality are clearly essential to developing HUD technology, but the system also depends on increasingly integrated and powerful electronics to process and deliver the data it needs.

High-profile examples of these include graphics and CPU chips such as Texas Instruments’ “Jacinto” family of infotainment processors which enhance digital car interior integration, and Fujitsu’s MB86R11 “Emerald-L” 2D/3D graphics System-on-a-Chip (SoC).

The Jacinto 6 Ex, for example, offers two embedded vision engines (EVEs) for simultaneous informational Advanced Driver Assistance Systems (ADAS) and infotainment functionalities without compromising the performance of either system. Informational ADAS describes capabilities such as object and pedestrian detection, augmented-reality navigation and driver identification, leveraging cameras both inside and outside the car to enhance the driving experience without actively controlling the vehicle. These capabilities can be used in the centre stack, programmable cluster and heads-up display systems.

Fujitsu’s MB86R11 “Emerald-L” is a powerful graphics SoC with an integrated graphics display controller (GDC) and GPU. Designed for high-end embedded graphical applications in the automotive market, the MB86R11 “Emerald-L” manages cluster, centre information display, navigation and in-car multimedia graphics applications.

The Emerald-L SoC comprises a 400MHz ARM Cortex-A9™ processor and a powerful, custom-built graphics core capable of cutting-edge 3D and 2D graphics. The device supports Fujitsu’s 360° WrapAround Video Imaging Technology. A Visibility Enhancement feature performs adjacent pixel comparison to reproduce images with natural colours and great detail. A flexible Signature Unit offers automotive system developers a powerful way to verify data integrity and enhance safety.

However, for these big chips and their peripherals to function reliably in a vehicle’s electrically challenging environment, the power distribution system must be carefully managed: the right power must be delivered where it’s needed, while noise and spikes are efficiently filtered out. A useful document on this subject, the Automotive Product Guide, has been prepared by Maxim and is available on Farnell’s website.

The Guide provides specifications and design information for devices used in HUD systems as well as infotainment, navigation, driver assistance, lighting and other in-car applications. Products include automotive-qualified step-down regulators, buck converters, PMICs for car batteries, USB protectors and wide operating range regulators. Other supporting products such as Class D amplifiers, LVDS and GMSL serial links, video decoders, GPS, GLONASS, Compass, and Galileo Front-Ends, thermocouple open/short monitors and other functions are also included.
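As a back-of-envelope illustration of the noise filtering mentioned above, a simple LC low-pass filter on a supply rail attenuates noise above its cut-off frequency, fc = 1/(2π√(LC)). The component values below are assumed for illustration only, not taken from the Guide.

```python
import math

def lc_cutoff_hz(inductance_h, capacitance_f):
    """Cut-off (resonant) frequency of a simple LC low-pass filter."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Assumed example input filter: 10 µH series inductor, 10 µF shunt capacitor.
fc = lc_cutoff_hz(10e-6, 10e-6)
print(f"cut-off ≈ {fc / 1000:.1f} kHz")
```

With these assumed values the cut-off lands near 15.9 kHz, so switching noise well above that frequency is strongly attenuated before it reaches the sensitive graphics and processor rails.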

Ongoing enhancements to the HUD components

Continued development in several technologies is allowing manufacturers to introduce increasingly sophisticated upgrades to these basic HUD concepts.

The most capable HUDs – such as Jaguar’s Urban Windscreen, described below – will be based on the entire windscreen becoming smart glass; this can display both text and images, in colour, and possibly in 3D. Such displays need images that are both large and high-resolution.

Such images, with a larger field of view, higher contrast and an extended range of reproducible colours could be generated by blue and green semiconductor laser diodes developed by Japanese semiconductor supplier Nichia Corp. These diodes, according to Nichia, are designed specifically for automotive HUDs. The company says that they also offer improved brightness, colour reproducibility, contrast ratio, viewing angle and power efficiency compared with LEDs.
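The “extended range of reproducible colours” can be quantified as the area of the triangle the three primaries span in CIE 1931 xy chromaticity space. The sketch below compares the standard sRGB primaries (real, published values) against a hypothetical set of laser primaries – the laser coordinates are illustrative assumptions, not Nichia data.

```python
def gamut_area(primaries):
    """Shoelace area of the triangle spanned by three (x, y) chromaticities."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return 0.5 * abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))

# Standard sRGB primaries in CIE 1931 xy (IEC 61966-2-1).
srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
# Hypothetical near-monochromatic laser primaries -- illustrative values only.
laser = [(0.70, 0.29), (0.17, 0.80), (0.16, 0.02)]

print(f"laser gamut is {gamut_area(laser) / gamut_area(srgb):.2f}x sRGB")
```

Because laser light is nearly monochromatic, its primaries sit close to the spectral locus, which is why the resulting triangle – and hence the range of displayable colours – is substantially larger than an LED-backlit display’s.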

The systems will also need cameras that can monitor the driver’s head position so that the projector can align the images with the real objects they are augmenting. Additionally, the system must be driven by an extremely fast processor to deliver images with latency as close to zero as possible. This would make it easier to paint arrows and lines on the road ahead, making navigation more intuitive. Virtual highway signs could also be created. As a safety feature, the system could also paint virtual brake lights on a car in front if it were decelerating sharply without showing its own brake lights – for example, if it were using regenerative braking.
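The geometric core of this head-tracked alignment is simple: the overlay must be drawn where the line of sight from the driver’s eye to the real object crosses the display plane. The minimal sketch below illustrates the idea; the coordinate frame and all distances are assumptions for illustration, not any vendor’s implementation.

```python
def screen_point(eye, target, screen_z):
    """Intersect the eye->target sight line with the display plane z = screen_z.

    Coordinates are assumed: x right, y up, z forward from a fixed origin
    near the driver's head. Returns the (x, y) point where the overlay must
    be drawn so it appears superimposed on the target from this eye position.
    """
    ex, ey, ez = eye
    tx, ty, tz = target
    t = (screen_z - ez) / (tz - ez)  # parametric position along the sight line
    return (ex + t * (tx - ex), ey + t * (ty - ey))

# Assumed example: eye at 1.2 m height, pedestrian 20 m ahead at 1.0 m height,
# display plane 1.0 m in front of the eye.
x, y = screen_point((0.0, 1.2, 0.0), (0.0, 1.0, 20.0), 1.0)
```

Recomputing this point as the tracked eye position moves is exactly the parallax correction the head-monitoring cameras enable: shift the eye 10 cm sideways and the correct overlay position shifts too, which is why a fixed drawing position would smear the augmentation off its target.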

Improvements to how drivers interact with the HUD are also possible. Gesture-based control of functions like sun blinds, rear screen wipers and air conditioning could be achieved through sensors. Drivers could wave at these rather than having to search for buttons on the dash.

Further development would always be possible; the HUD can be regarded as an intelligent projector waiting to run new applications and upgrades as they become available.

The role of HUDs

To a greater or lesser extent, HUDs have now become part of the automotive marketplace; solutions are available both from car manufacturers and from aftermarket suppliers.

But what role do they fulfil, and what is the extent of their functionality?

At the most general level, HUDs contribute to driver wellbeing. To be more specific, we can break this down into several components:

  • Improving safety by assisting drivers in adverse driving conditions
  • Improving safety by increasing drivers’ situational awareness, both of their vehicle’s performance and of what’s happening on the road around them
  • Reducing stress through efficiently presented satnav information
  • Reducing stress by advising on traffic conditions, local parking possibilities and nearby availability of fuel
  • Enriching the travelling experience with information on nearby restaurants, other entertainment opportunities and special offers
  • Providing entertainment in the form of radio channels, podcasts or music
  • Allowing hands-free communications – staying connected

This ever-increasing functionality is being matched – and will continue to be matched – by better methods of interaction, all aimed at minimising distraction from driving. These include voice commands via Google Voice and other channels, mobile phone integration, and gesture control.

Car HUDs were originally simple, limited-function devices designed to increase safety and situational awareness. Driving in low visibility, for example, is a major cause of accidents in both developed and developing countries. Low visibility is caused by rain, fog, heavy snow or darkness, and is exacerbated by poor street lighting. According to the World Health Organization (WHO), 350 people worldwide die every day in road accidents attributable to low-visibility driving conditions.

The WHO also refers to further studies from around the world that report on traffic accidents related to poor visibility:

  • In the state of Victoria, Australia, poor visibility was a factor in 65% of crashes between cars and motorized two-wheelers, and the sole cause in 21% of them.
  • Nearly 5% of severe truck crashes in Germany can be traced back to poor nighttime visibility of the truck or its trailer.
  • Motorized two-wheelers, because of their size and shape, are harder to see than other motor vehicles, even during the daytime. For example, most motorcycle crashes in Malaysia occur during daylight hours.
  • European research found that one third of pedestrian casualties had difficulty seeing the vehicle that had struck them, while two fifths of drivers had difficulty seeing the pedestrian.
  • A large proportion of pedestrian and cyclist collisions in low-income countries occur around dusk, dawn or at night, possibly because of poor visibility. However, research in this area is limited.

HUDs can mitigate these problems while allowing drivers to focus on the road, without being distracted by audio commands or complicated maps and symbols.


Fig.2: HUD displays can mitigate traffic accidents related to poor visibility – Image via Wikimedia Commons

However, HUDs have been continuously evolving, and will continue to do so, for four reasons:

  • Driving conditions are becoming more challenging, with greater traffic density, more competition for parking spaces, more complex road signage, and more time pressure.
  • Users’ expectations, fuelled by their experience with mobile phones, tablets and other IoT devices, are raised.
  • Improvements in display, communications, computing, software and IoT technologies are providing opportunities to develop more advanced HUD platforms to meet these evolving requirements and expectations.
  • The HUD is increasingly designed as an integrated component of the car’s electronic systems, rather than as a standalone add-on.

Below, we give examples of how this evolving demand is being met now – and will be met in the future – by solutions being offered or proposed by various car manufacturers and aftermarket HUD suppliers.

A simple solution - HUDWAY

HUDWAY offers a solution based simply on a mobile phone app, available for Android and iOS. No special hardware is needed; as Fig.3 shows, it works by placing the phone on the car dashboard.

HUDWAY displays current vehicle speed, distance to the next sharp curve, and where it is best to slow down. All dangerous turns are displayed in red, and marks overlaid on the road ahead help the driver to gauge distance visually; the marks are spaced 50 metres (around 165 feet) apart. This heads-up information is supplemented with voice assistance. Although an Internet connection is required when setting up the route, it is not necessary during navigation, as the app works from information pre-loaded into the phone.
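Advising “where it is best to slow down” comes down to standard braking kinematics: the distance needed to shed speed at a comfortable deceleration is d = (v² − v_safe²) / 2a. The sketch below illustrates the calculation; it is not HUDWAY’s actual algorithm, and the deceleration figure is an assumed comfort value.

```python
def slowdown_distance_m(speed_kmh, safe_speed_kmh, decel_ms2=2.5):
    """Distance needed to brake from speed to safe_speed, from v1^2 = v0^2 - 2ad.

    decel_ms2 is an assumed 'comfortable' deceleration, well below an
    emergency stop (~8 m/s^2 on dry tarmac).
    """
    v0 = speed_kmh / 3.6  # convert km/h to m/s
    v1 = safe_speed_kmh / 3.6
    return (v0 * v0 - v1 * v1) / (2.0 * decel_ms2)

# Assumed example: approaching a sharp curve, braking from 108 to 54 km/h.
d = slowdown_distance_m(108, 54)
print(f"start slowing about {d:.0f} m before the hazard")
```

With these values the advice point lands 135 m ahead of the curve – under three of the app’s 50-metre road marks, which shows why such marks are a useful visual distance scale.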

More futuristic HUDs #1: Jaguar Land Rover ‘Urban Windscreen’

The potential for HUDs is growing continuously, driven by advances in computing, image capture and display, communications, and the increasing availability of cloud-based information pertaining to navigation, safety, and other useful or entertaining information.

One vision of the HUD future is offered by Jaguar Land Rover (JLR), with their ‘Urban Windscreen’ technology concept, announced in December 2014. It is so called because, instead of confining itself to the relatively small area typical of current HUDs, it uses video projection to turn a normally static windscreen into a virtual moving-image display. For example, it can project the image of a ghost car onto the screen; this appears to be driving in front of your vehicle, so that you can follow it as it leads you to your destination. This is more intuitive and easier to use than traditional satnav audio and video instructions, which can leave you unsure whether you should be turning at this exit or the next one.

The screen will also integrate displays of useful information, such as prices at the nearest petrol station, or availability of parking spaces in a convenient car park. While doing so, the technology will make effective use of the windscreen’s real estate. When cameras on the front of the car spot an obstacle such as a pedestrian or cyclist, the HUD draws a red square around it. If you drive past a point of interest, a floating info box will appear with more data, such as a restaurant’s rating.

Dr Wolfgang Epple, Director of Research and Technology, said of the technology: “Driving on city streets can be a stressful experience, but imagine being able to drive across town without having to look at road signs, or be distracted trying to locate a parking space.”

Another aspect of this JLR technology is the ability to provide safe 360° vision and eliminate blind spots. Screens can be embedded in the car’s A, B and C pillars. (A pillars hold either side of the windscreen in place; B pillars start where the driver and passenger-side windows end as you look backward along the length of the car. C pillars hold the sides of the car's rear window in place.)

The pillar screens are fed from cameras in each blind spot location. When the driver signals a turn, then moves their head to make the manoeuvre, the relevant screen will immediately display a moving image of what’s in that blind spot. These screens, together with the windscreen HUD, make up 360° vision, and should make urban and motorway driving much easier and safer.
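The activation logic described above – a turn signal plus a matching head movement – can be sketched as a few lines of decision code. This is a hypothetical illustration of the behaviour JLR describes, not their implementation; the function name, inputs and threshold are all assumptions.

```python
def pillar_feed(indicator, head_yaw_deg, threshold_deg=20.0):
    """Decide which pillar screen (if any) should show its camera feed.

    indicator: 'left', 'right' or None (turn-signal state).
    head_yaw_deg: head rotation from a tracking camera; positive = turned right.
    Both the signal AND a matching head movement are required, as described
    for the Urban Windscreen concept.
    """
    if indicator == "left" and head_yaw_deg < -threshold_deg:
        return "left_pillar"
    if indicator == "right" and head_yaw_deg > threshold_deg:
        return "right_pillar"
    return None
```

Requiring both cues avoids flashing the pillar displays on every casual glance, while still responding immediately once the driver commits to the manoeuvre.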

Dr Epple explains that the tech is designed to improve visibility and give drivers the right information at the right time. He comments: “If we can keep the driver’s eyes on the road ahead and present information in a non-distracting way, we can help them make better decisions in the most demanding and congested driving environments.”

It’s not yet possible to buy cars with this technology, but some commentators expect to see it in Jaguars and Land Rovers by 2020.

More futuristic HUDs #2: Carrobot HUD auto accessory

In contrast to the Urban Windscreen, the Carrobot auto accessory is an aftermarket product – but a very powerful one. Their second-generation device can be considered a personal assistant that includes HUD functionality, rather than simply an HUD; with AI capability, it integrates many functions and technologies, acting as a central platform for all the driver’s smart devices. Its features include:

  • High-resolution display
  • Smart voice interaction
  • Projection of mobile phone apps onto the windscreen
  • Fatigue and distraction detection and warnings
  • Smart navigation
  • Mobile app
  • Wireless connectivity

An HUD startup example

Farnell, as a partner in an organisation called Startupbootcamp IoT, is providing support to an HUD startup company called HUDlog.

Startupbootcamp IoT’s aim is to make the journey of building a connected-hardware startup clearer, shorter and more successful for entrepreneurs. This is done through a three-month acceleration programme run once a year in London, which gives up to 10 startups access to a global network of business mentors, hardware professionals, corporate partners, potential customers and investors.

HUDlog’s approach is to offer a solution designed for commercial fleets, based on their Atlas One aftermarket HUD. They pitch this to fleet operators by promoting Atlas One’s commercial benefits: they claim collisions are reduced by 30%, fuel savings of up to £670 per van per year, and insurance savings of up to 15%.

Technically, Atlas One provides a harsh driving indicator, a speed limit indicator to promote fuel savings and prevent speeding fines, and turn-by-turn navigation. This only displays essential directions, with no extra visual clutter. The HUD connects directly to the mobile network, eliminating the need for mobile phones and allowing fleet managers to retain control.
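A “harsh driving indicator” of the kind described can, in principle, be derived from a logged speed trace by flagging samples where deceleration exceeds a threshold. The sketch below is a hypothetical illustration – not HUDlog’s algorithm – with an assumed 1 Hz sample rate and threshold.

```python
def harsh_events(speeds_ms, dt_s=1.0, decel_limit_ms2=3.0):
    """Return sample indices where deceleration exceeds the assumed limit.

    speeds_ms: vehicle speed samples in m/s, taken every dt_s seconds.
    A deceleration beyond ~3 m/s^2 is treated as 'harsh' braking here.
    """
    events = []
    for i in range(1, len(speeds_ms)):
        accel = (speeds_ms[i] - speeds_ms[i - 1]) / dt_s
        if accel < -decel_limit_ms2:
            events.append(i)
    return events

# Synthetic trace: steady 20 m/s cruise, then a hard stop from sample 4 on.
trace = [20.0, 20.0, 19.5, 19.0, 12.0, 6.0, 0.0]
print(harsh_events(trace))
```

Counting such events per journey is one plausible way a fleet system could score driving style and feed the HUD’s real-time warning, though production systems would typically fuse accelerometer data rather than rely on speed samples alone.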

HUDlog’s objective is to provide a visual driving assistant that optimises presentation of essential information without distracting the driver.

Further development: from HUDs to augmented reality and driverless vehicles

With Ford announcing mass production of driverless cars for 2021, and driverless truck convoys already a reality, issues such as automated safety, navigation and infotainment – including all the functions described above – become ever more critical. Car and aftermarket manufacturers gathering in forums like CES 2017 in Las Vegas are talking in terms of augmented reality, reflecting increasingly sophisticated and integrated systems that include the HUD as a key component.

Augmented reality systems and HUDs will provide stepping stones on the route to driverless vehicles. For example, driving in today’s environment can be improved by better and more precise mapping functions – but this high-quality mapping will be essential to the success of driverless vehicles.

Similarly, driverless cars will be dependent on large arrays of sensors that ensure safety by collecting data on other nearby vehicles, cyclists, pedestrians, signage and other items; and this data is already being used in AR systems and HUDs. It’s also possible that it could be used for deep learning to provide better insight into the current situation, and how to react. Additionally, cloud-based services could gather big data from large fleets of cars to improve understanding of the big picture for issues such as traffic flow.

An opportunity for developers

HUDs represent an expanding market with opportunities for car system developers. However, developing an OEM solution requires talent with specialised skillsets and domain experience. In taking on an HUD development project, an OEM and/or automotive supplier may encounter the following pain points:

  • Heavy investment in R&D and technology
  • Increased time-to-market, with loss of competitive advantage

Embitel Technologies, a company whose expertise includes embedded design services, cloud, mobility and IoT solutions for automotive, smart home and smart factory applications, has developed a reference design for a car HUD system that they claim can cut the time to develop advanced features and customisation from an average of 2½ years to six months, with an associated reduction in development costs.


This article has shown how HUDs have evolved from standalone devices of limited functionality into sophisticated components of vehicles’ overall electronic communications and management systems. The trend now is to use the entire windscreen as a projection area, opening the display options as widely as possible.

HUDlog’s contribution to the debate is interesting, as they quantify the commercial benefits of using HUDs.

While fulfilling an increasingly valuable role today, HUDs are ultimately set to become redundant, or at least to shift their role towards infotainment rather than safety and navigation, as vehicles become self-driving.


Automotive windscreen heads-up displays. Date published: 15th February 2018 by Farnell