Making Sensor Technology Less Blurry | Bench Talk

Bench Talk for Design Engineers | The Official Blog of Mouser Electronics

Making Sensor Technology Less Blurry

Paul Golata

(Source: gece33/GettyImages)

Many people wear glasses to improve their vision. Without glasses, a driver may be unable to see hazards clearly, such as stop signs, and none of us wants to enter an intersection and get hit by another driver. Anyone who needs glasses should wear them behind the wheel so they can best recognize the hazards on the road ahead (Figure 1).

Figure 1: A pair of glasses on an eye chart. (Source: flaviuz)

The ability to detect and identify an object is at the heart of nearly every vision application. However, there are almost as many ways to accomplish this as there are reasons for doing it. This blog looks at several standard sensors used to detect objects.


Cameras

In vision applications, cameras may be one part of a larger electronic vision system, serving a role analogous to that of human eyesight (Figure 2).

Figure 2: Indoor security camera and motion detection system. (Source: Vittaya_25)

Several pros of using cameras in vision systems include:

  • May be positioned to “see” the entire context, or at least what is deemed sufficient
  • Potential to “outsee” humans in various contexts
    • Through the application of invisible light sources, that is, light sources outside the roughly 380nm–750nm spectrum that the human eye responds to
  • May employ learning via:
    • Artificial intelligence (AI)
    • Machine learning (ML)
    • Neural networks (NN)
  • Often less costly
  • Handles weather conditions roughly as well as human eyesight
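The “invisible light” point above can be made concrete: a camera paired with an infrared illuminator responds to wavelengths well outside the 380nm–750nm band the human eye sees. A minimal sketch (the band edges come from the text; the classification helper is my own illustration, not any camera API):

```python
# Classify a wavelength relative to the human visible band (~380-750 nm).
VISIBLE_MIN_NM = 380
VISIBLE_MAX_NM = 750

def classify_wavelength(nm: float) -> str:
    """Return 'ultraviolet', 'visible', or 'infrared' for a wavelength in nm."""
    if nm < VISIBLE_MIN_NM:
        return "ultraviolet"
    if nm > VISIBLE_MAX_NM:
        return "infrared"
    return "visible"

# A 940 nm IR illuminator (common in security cameras) is invisible to people,
# yet a silicon image sensor still responds to it.
print(classify_wavelength(940))   # infrared
print(classify_wavelength(550))   # visible
```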


Radar

As electronic engineers, many of us are aware that other sensing technologies allow us to “see” what is not normally visible to the human eye. One example is radar (Figure 3). Radar is an acronym for radio detection and ranging. Radar employs radio waves (3MHz–110GHz) to ascertain the distance (ranging), angle, or velocity of objects. Many of us are familiar with radar from the way aircraft are tracked as they fly across the skies: It provides a way to “see” where each airplane is. For applications in our current purview (vehicles and robotics), mmWave radar (30GHz–300GHz) is often employed.
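Radar ranging boils down to timing an echo: a pulse travels to the target and back at the speed of light, so distance is c·t/2, and the Doppler shift of the return reveals radial velocity (f_d = 2·v·f_c/c). A simplified sketch of those two relationships, not tied to any particular radar chipset:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_echo(round_trip_s: float) -> float:
    """Target distance in meters from a round-trip echo delay."""
    return C * round_trip_s / 2

def velocity_from_doppler(doppler_hz: float, carrier_hz: float) -> float:
    """Radial velocity in m/s from the Doppler shift (f_d = 2*v*fc/c)."""
    return doppler_hz * C / (2 * carrier_hz)

# An echo arriving 1 microsecond later puts the target ~150 m away.
print(round(range_from_echo(1e-6), 1))               # 149.9
# A 77 GHz radar seeing a 5134 Hz Doppler shift -> ~10 m/s closing speed.
print(round(velocity_from_doppler(5134, 77e9), 2))   # 9.99
```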

Figure 3: Radar. (Source: your123)

Several pros of using radars in vision systems include:

  • Small package size and antenna
  • Large bandwidth
  • High Doppler frequency
  • High levels of integration
  • Reliable
  • Affordable

mmWave Sensors

mmWave sensors from Texas Instruments are radar solutions offered in both industrial (IWR) and automotive (AWR) options. Sensing is simplified using the mmWave SDK, which lets designers evaluate sensing projects in less than thirty minutes. Spatial and velocity resolution are up to three times higher than traditional solutions. CMOS single-chip sensors shrink design size by integrating an RF front end with a DSP and MCU.
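The “large bandwidth” advantage listed earlier translates directly into spatial resolution: an FMCW radar can distinguish two targets separated by roughly c/(2B), where B is the swept bandwidth. A quick sketch of that relationship (the bandwidth figures are illustrative, not taken from any TI datasheet):

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Minimum separable target spacing for an FMCW radar sweeping this bandwidth."""
    return C / (2 * bandwidth_hz)

# A wider mmWave sweep resolves finer detail: 4 GHz -> ~3.7 cm, 1 GHz -> ~15 cm.
for bw in (1e9, 4e9):
    print(f"{bw / 1e9:.0f} GHz -> {range_resolution_m(bw) * 100:.1f} cm")
```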


The AWR series of automotive mmWave sensors enhances the driving experience, making it safer and easier, by analyzing and reacting to the nearby environment.


The IWR series of industrial mmWave sensors provides unprecedented accuracy and robustness by detecting the range, velocity, and angle of objects.


LiDAR

The same concept can be applied to light waves instead of radio waves. Light detection and ranging (LiDAR) employs electromagnetic light pulses to determine the distance (ranging), angle, or velocity of objects.

The pros to employing LiDAR may include:

  • Accuracy
  • Precision
  • 3D imaging
  • Immunity to external lighting conditions
  • Relatively low computational power required

Time of Flight (ToF)

Time-of-flight (ToF) is a measurement of how long an object, particle, or wave takes to travel a distance through a medium. Analyzing this data can reveal quantities such as velocity or path length, as well as the properties of a particle or medium. ToF applications include proximity sensing and gesture recognition in robotics and human-machine interfaces (HMI).
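The timing described above can be sketched in a few lines: distance follows from d = c·t/2, and inverting the same formula shows how precisely a ToF sensor must time the echo to hit a given range accuracy. The function names below are my own illustration, not a specific sensor API:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance implied by a ToF sensor's round-trip light travel time."""
    return C * round_trip_s / 2

def timing_for_accuracy_s(distance_accuracy_m: float) -> float:
    """Round-trip timing precision needed for a given range accuracy."""
    return 2 * distance_accuracy_m / C

# A 1 ns echo puts the target ~15 cm away -- handy for proximity sensing.
print(f"{tof_distance_m(1e-9) * 100:.1f} cm")          # 15.0 cm
# Millimeter-level ranging demands picosecond-class timing:
print(f"{timing_for_accuracy_s(1e-3) * 1e12:.2f} ps")  # 6.67 ps
```

The second function makes clear why ToF front ends need very fast timing circuits: every millimeter of accuracy costs only about 6.7 picoseconds of round-trip delay.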


We have covered different types of object-detection sensors and examined the advantages of each. As we learned, not all sensors are created equal; much comes down to the application. We also learned that developers are making key advances with new sensors that closely mimic the human eye’s ability to perceive changes in its visual field. In some cases, we’re using sensor technologies that go beyond what our eyes can see. Perhaps one day our need for prescription glasses will become obsolete.


Paul Golata joined Mouser Electronics in 2011. As a Senior Technology Specialist, Paul contributes to Mouser’s success through driving strategic leadership, tactical execution, and the overall product-line and marketing directions for advanced technology related products. He provides design engineers with the latest information and trends in electrical engineering by delivering unique and valuable technical content that facilitates and enhances Mouser Electronics as the preferred distributor of choice.

Before joining Mouser Electronics, Paul served in various manufacturing, marketing, and sales related roles for Hughes Aircraft Company, Melles Griot, Piper Jaffray, Balzers Optics, JDSU, and Arrow Electronics. He holds a BSEET from the DeVry Institute of Technology (Chicago, IL); an MBA from Pepperdine University (Malibu, CA); an MDiv w/BL from Southwestern Baptist Theological Seminary (Fort Worth, TX); and a PhD from Southwestern Baptist Theological Seminary (Fort Worth, TX).
