RISC is short for reduced instruction set computer. First developed at the University of California, Berkeley in 2010, RISC-V (pronounced "risk-five") is a free and open instruction set architecture (ISA) that enables processor innovation through open standard collaboration. This makes RISC-V open source, usable in academic settings, and available for any hardware or software design without royalties.
Who doesn’t like royalty-free open source?
For developers, this means RISC-V is ideal for embedded applications, from IoT devices to computing hardware to automotive systems and controllers. With RISC-V, developers are creating custom processors designed to handle the requirements of newer workloads in artificial intelligence, machine learning, the Internet of Things, virtual reality, and augmented reality.
The development possibilities are endless. This week's New Tech Tuesdays examines three products in the RISC-V personal computer ecosystem that developers should embrace if they don’t already.
The Seeed Studio Sipeed Longan Nano Development Board is ideal for anyone who wants to tinker with RISC-V processors. The board is based on GigaDevice's GD32VF103CBT6 MCU with a 32-bit RISC-V core. The board is tiny (46.1mm x 20mm), but it packs plenty of functionality. It includes a 24mm RGB LCD screen, which means you're not going to want to watch a summer movie blockbuster on it. The board has a double-row pin layout with 700mil row spacing, so it can be inserted directly into a breadboard. It also has an onboard 8MHz passive crystal oscillator, a 32.768kHz RTC low-speed crystal oscillator, a microSD (TF) card slot, and a USB Type-C interface.
Microchip Technology PolarFire® SoC FPGAs are the first system-on-chip (SoC) field-programmable gate arrays (FPGAs) with a RISC-V CPU cluster. That's a lot of acronyms. But automotive, defense, aerospace, and industrial designers who need automotive- and defense-grade programmable logic solutions will find the devices ideal for connected systems and smart applications, including IoT devices. More good news for developers: PolarFire SoC FPGAs offer on-chip security features that enable secure communication, an encrypted bitstream, and a cryptographically secured supply chain, ensuring tamper-proof solutions.
SiFive HiFive Unmatched Linux Development Platform is a development tool built on a standard desktop Mini-ITX form factor (170mm x 170mm) for embedded platforms. SiFive calls its product the world's fastest native RISC-V development platform. The HiFive Unmatched features the Freedom U740 Linux-capable, multi-core, 64-bit dual-issue RISC‑V processor. The motherboard comes with a complete development environment where developers can create RISC-V-based applications from bare-metal to Linux-based systems.
The simple, fixed base ISA and modular standard extensions have made it easy for researchers, teachers, and students to use RISC-V to learn and push the boundaries of design. This is always a good sign.
Traditionally, we've been annoyed by backseat drivers—those people who dispense unwanted driving advice. They can be intrusive, misinformed, and can sometimes actually create the hazard they're trying to prevent. They’re no fun.
Thanks to the development of driver monitoring systems, driving advice has come full circle. Our backseat drivers have become assets because they're better informed; they're smart driving partners, like having an additional set of eyes, ears, and brains for those doing the actual driving. That sounds a bit more fun.
Driver monitoring systems are among the interrelated parts of advanced driver assistance systems (ADAS). They comprise cameras, sensors, and processors fitted into modules in our vehicles that run algorithms on sensor data to enhance a driver's awareness. Fully autonomous vehicles, whose technology has yet to be perfected, have also entered the development discussion. On top of that, design engineers face growing integration possibilities with ultra-wideband, next-generation wireless connectivity (Wi-Fi 6), and 5G edge-access technologies.
But before these projects can be practically applied, they require development and secure processing connections.
In this week's New Tech Tuesdays, we'll look at development tools from Basler and Arduino and processors from NXP Semiconductors that are essential for automotive safety systems.
Computer vision applications must be tested, of course. The Basler Embedded Vision Kit with NVIDIA Jetson Nano features a plug-and-play package for rapid prototyping of computer vision applications. The kit includes a dart camera module with an S-mount lens, an NVIDIA Jetson Nano developer board, a special adapter board, and cabling to connect these components. The kit has an integrated camera driver and sample reference applications that leverage the NVIDIA Jetson platform's capabilities, giving developers a ready-to-use development package for edge artificial intelligence use cases. The NVIDIA Jetson lineup also features support for cloud-native technologies. This support helps manufacturers and developers implement improvements and use the latest features with Jetson-based AI edge devices. Two Add-on Camera Kits are also available to provide the appropriate vision extension for developers already working with a Jetson Nano processor board.
The Arduino Portenta Vision Shield is a hardware add-on for the Arduino Portenta H7 development board. The add-on brings vision and sound to edge computing projects. The device can run high-level code alongside real-time tasks, making it ideal for always-on machine vision and audio applications. The shield is available with on-board Ethernet or wireless Long Range (LoRa) connectivity. Both versions share an on-board Himax low-power camera module, two microphones, and a microSD card slot for local data storage. The shield also supports voice recognition and audio-event detection with ultra-compact, omnidirectional audio sensors.
Let's not forget the smallest components, which are nonetheless the essential brains of these systems. NXP Semiconductors i.MX 8M Applications Processors feature up to four 1.5GHz Arm® Cortex®-A53 cores plus a Cortex-M4 core for connected streaming audio/video devices, scanning and imaging devices, and other connected products. The quad-core processors also address new requirements for streaming media and 3D graphics. The i.MX 8M processors support full 4K High Dynamic Range (HDR) video and pro audio fidelity with up to 20 audio channels and DSD512 audio. They also include flexible memory options and high-speed connectivity in a small form factor (0.65mm pitch). NXP details its smart mobility technologies in the Smart Mobility and the Technologies Paving the Way eBook.
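As a quick back-of-the-envelope illustration (our own arithmetic, not a figure from NXP's documentation), the raw bandwidth implied by those audio specs can be estimated: DSD512 is a 1-bit stream running at 512 times the 44.1kHz CD sample rate, so 20 such channels carry roughly 450Mbps of raw audio data.

```python
# Rough estimate of the aggregate audio bandwidth implied by
# "20 audio channels and DSD512" -- an illustrative calculation,
# not a figure from NXP documentation.

CD_RATE_HZ = 44_100                  # CD audio sample rate
DSD512_RATE_HZ = 512 * CD_RATE_HZ    # DSD512 runs at 512x the CD rate
BITS_PER_SAMPLE = 1                  # DSD is a 1-bit delta-sigma stream
CHANNELS = 20

per_channel_bps = DSD512_RATE_HZ * BITS_PER_SAMPLE
total_bps = per_channel_bps * CHANNELS

print(f"DSD512 per channel: {per_channel_bps / 1e6:.2f} Mbps")  # 22.58 Mbps
print(f"20 channels total:  {total_bps / 1e6:.2f} Mbps")        # 451.58 Mbps
```

Numbers like these explain why high-speed on-chip interconnect matters even for "just audio" workloads.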
Developers are creating driver monitoring technologies that help drivers become aware of events outside the range of their own perception, such as pedestrians or vehicles approaching from the opposite direction. The applications broaden daily when you consider the continued development of fully autonomous vehicles. But before these technologies are put into application, they require development tools and processing systems to make sure they stay on task.
Vision Artificial Intelligence (Vision AI) is a field of computer science that trains computers to replicate the human vision system.
Designers develop devices, such as face detectors, QR code scanners, etc., to identify and process objects in images and videos in the way that humans do.
It's not exactly a new technology. Computer vision can trace its roots back to the 1950s, when neural networks (NN) were designed for pattern recognition. In the 1970s, optical character recognition (OCR) was able to interpret printed text for the visually impaired. Since 2010, deep learning has helped computers train themselves and improve over time.
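To make those 1950s pattern-recognition roots concrete, here is a toy, self-contained sketch of the era's core idea: a single perceptron that learns the logical AND function. It is purely illustrative and unrelated to any specific product discussed here; modern vision AI stacks millions of such units.

```python
# A single perceptron -- the 1950s-era building block of neural
# pattern recognition -- trained on the logical AND function.

samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]   # weights, one per input
b = 0.0          # bias
lr = 0.1         # learning rate

def predict(x):
    """Fire (1) when the weighted sum of inputs exceeds the threshold."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):                      # a few training epochs
    for x, target in samples:
        err = target - predict(x)        # classic perceptron update rule
        w[0] += lr * err * x[0]
        w[1] += lr * err * x[1]
        b += lr * err

print([predict(x) for x, _ in samples])  # prints [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop settles on a correct set of weights.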
Today, computer vision has expanded with designers using the technology for automotive, healthcare, retail, smartphones, and more. Security and surveillance are also a big part of the technology, with AI automating the data gathering and security assessments.
In this week's New Tech Tuesdays, we'll look at microprocessors from Renesas Electronics and kits from Basler and Xilinx for Vision AI development.
Renesas Electronics RZ/V2L High Precision Entry-Level AI MPUs are designed for basic AI-enabled applications, including home appliances, smart doorbells, surveillance cameras, point-of-sale (POS) terminals, and robot vacuum cleaners. The MPUs incorporate Renesas' AI accelerator, DRP-AI (Dynamically Reconfigurable Processor), to make embedded AI design easier and more power-efficient. The good news for designers: The RZ/V2L is package- and pin-compatible with the existing RZ/G2L general-purpose MPUs, allowing RZ/G2L users to easily upgrade to the RZ/V2L for additional AI functions without needing to modify the system configuration. Renesas also offers a complimentary DRP-AI Translator, a tool that automatically converts AI models into an executable format. Developers can leverage the DRP-AI while using familiar tools that allow them to immediately start using the RZ/V2L to evaluate AI design based on learning data.
The Basler AI Vision Solution Kit offers full cloud-connectivity integration for rapid prototyping of AI-based Internet of Things applications. The kit lets developers test applications on an optimized vision system and access cloud services; connecting to the cloud makes it easier to load or train neural networks. The kit also significantly reduces the complexity of, and the need for expert knowledge in, embedded hardware and software technology. The kit pairs a dart Mobile Industry Processor Interface (MIPI) camera module, using Basler's proprietary BCON for MIPI interface, with an NVIDIA® Jetson Nano™ development board, plus a lens and cabling. It also includes all the necessary drivers and software for cloud support, making it a complete plug-and-play design-in package.
Designers will find the Xilinx Kria™ KV260 Vision AI Starter Kit an ideal out-of-the-box platform for vision application development without requiring complex hardware design knowledge. Developers of all levels can get applications up and running in under an hour with no field-programmable gate array (FPGA) experience needed. At that point, customization and differentiation can be added via preferred design environments, at any level of abstraction, from application software to AI model to FPGA design.
The carrier card allows various interfacing options and includes a built-in power solution, network connectors, and interfaces for camera, display, and microSD card. Target applications for the kit include smart city and machine vision, security cameras, retail analytics, and other industrial applications.
Like any artificial intelligence development, the technology opens multiple possibilities for consumers and businesses. Much of Vision AI is already being deployed in autonomous vehicles, cashier-less checkouts, medical diagnosis, and image labeling. The technology is still in its development and learning phases, needing data to make humanlike determinations.
Advantech is applying its industrial-edge artificial intelligence system design experience to meet the needs of AI developers. Its MIC-710AIL-DVA1 NVIDIA® Jetson Nano™ Development Kit makes the power of modern AI available for embedded developers, learners, and makers.
The MIC-710AIL-DVA1 leverages the NVIDIA Jetson Nano, a powerful, small computer that lets you run multiple neural networks in parallel for image classification, object detection and segmentation, and speech recognition, with support for deep-learning trained models.
In this week's New Tech Tuesdays, we look at the features of the MIC-710AIL-DVA1 NVIDIA Jetson Nano Development Kit.
Advantech's NVIDIA Jetson developer kit, the MIC-710AIL-DVA1, is a powerful tool with flexible I/O, varied peripherals, and accommodation for board support packages (BSPs). Its multiple I/O options and various peripherals provide excellent design flexibility.
The MIC-710AIL-DVA1 I/Os include:
The MiniPCIe interface permits customization of the I/O modules, including 4G/LTE/Wi-Fi/5G communication modules, CANBus controllers, and Power over Ethernet (PoE) modules for IP cameras. Furthermore, the exclusive "boot from external devices" feature enables adaptable software development alongside the hardware. Using secondary BSP development, users can boot from external devices such as eMMC, microSD, or NVMe without encountering issues related to the Jetson Nano's limited onboard storage.
The MIC-710AIL-DVA1 NVIDIA Jetson Nano Development Kit is a flexible and easy-to-use solution that enables embedded developers, learners, and makers to achieve cutting-edge AI development.
With Advantech industrial design services, users can convert the MIC-710AIL-DVA1 development kit into an edge system after the software development is complete.
Modern automotive technology was speeding along, making significant advancements with its tech systems. Then, the sudden interest in electric vehicles was—ironically—like someone suddenly stepped on the gas and hit another gear.
First, Tesla broke through with the development and delivery of its EV lineup, making strides that hadn’t been seen before. Other EV manufacturers emerged and traditional automakers soon announced they were making their leaps into electrification. Governments provided tax incentives and vowed to upgrade the charging infrastructure. The interest was expected, but certain socio-political developments triggered a greater surge.
As a result, the now-souped-up smart mobility industry created increased demands for microchip processors, sensors, components, power systems, infotainment options, regenerative braking systems, and, of course, advanced motor technology.
How is everyone, particularly design engineers, keeping up to speed? How do they sort through the deluge of information and analysis?
Mouser Electronics worked with suppliers to develop e-books covering the latest trends and technologies in modern automotive design. They're viewable on desktop and optimized for mobile.
In this week’s New Tech Tuesdays, we’ll review e-books provided by Vishay, Microchip Technology, ROHM Semiconductor, and NXP for design engineers to get up to speed on automotive design.
Vishay: An Automotive Grade Above: Learn more about Vishay’s Automotive Grade standard for its electronic components in a 28-page e-book. The e-book details Vishay’s Automotive Grade diodes and rectifiers, MOSFETs, optoelectronics, resistors, inductors, and capacitors. The content is presented in three categories: 48V systems, EV battery-charging management, and dashboard sensors.
Microchip: Enabling the Future of Mobility: Microchip offers a close look at its automotive-grade solutions. In a long-form article with images, charts, and video, Microchip reviews the evolution of sensors, their applications, and how the data has led to software-controlled features that increase safety, comfort, and connectedness while driving.
ROHM Semiconductor: Driving the Future of Automotive Solutions: ROHM focuses on the power aspects of EVs in this e-book. In 37 pages, the e-book looks at power components and other complementary devices that ROHM provides for automotive power applications. Those include shunt resistors, gate drivers, low-dropout (LDO) regulators, DC-DC converters, and LED drivers.
NXP: Smart Mobility and the Technologies Paving the Way: Today's modern dashboards look cool with their dynamic screens. In its 28-page e-book, NXP Semiconductors details how its connected solutions and products are paving the way for smart mobility. NXP provides a close look at its advanced driver-assistance systems (ADAS), radio detection and ranging (radar), vehicle networks, and electrification.
Ever-evolving modern automotive technology means design engineers have to keep up with the industry’s innovations and increased demands. Automotive technology e-books offer design engineers a convenient way to keep up with the latest trends, products, and innovations.
The power of eyewear has come a long way since its inception. The first eyeglasses were invented in Italy in the late 13th century, revolutionizing the way people with vision impairments interacted with the world. These early glasses were simple convex lenses mounted on frames primarily used to correct farsightedness. Over the centuries, eyeglasses evolved, with improvements in lens technology and frame design enhancing both vision correction and comfort.
Now, what we can expect from a pair of lenses goes far beyond vision correction. The concept of smart glasses marked a significant leap in eyewear technology. Leading the way was Google Glass, or simply Glass (Figure 1), which was introduced in 2013. Glass was one of the first to merge traditional eyeglasses with modern technology. When released, Glass resembled something the Borg would wear, for those Star Trek aficionados, displaying information for the user on a head-up display (HUD) much like those found in many of today's vehicles.
Figure 1: Google Glass can be controlled using the touchpad built into the side of the device. (Source: https://commons.wikimedia.org/wiki/File:A_Google_Glass_wearer.jpg)
Glass's journey unfortunately didn't align with consumer readiness and market expectations, leading to its decline. In short, consumers were not ready for Glass. However, the evolving integration of advanced technologies is now fueling a renewed interest in the smart glasses sector.
Fast forward to today, and despite the setbacks faced by Google Glass, smart glasses have evolved into more practical and stylish wearables. Companies like Ray-Ban and Oakley have entered the market, focusing on aesthetics and functionality. This interest can be attributed to technological advancements and convergence that have allowed for more stylish and less obtrusive designs, potentially overcoming one of the significant hurdles faced by Google Glass. Furthermore, there's a growing interest in wearable technology as it becomes more integrated into daily life.
Additionally, advancements in augmented reality (AR) and artificial intelligence (AI) could transform how we interact with our environment, offering real-time information overlays and immersive experiences. The vast potential for medical, educational, and business applications indicates that smart glasses may eventually become prevalent in our daily lives.
Today's smart glasses are not only fashionable but also significantly more functional than their predecessors. Smart glasses are being designed for portability and daily use to enhance and interact with the real world. With smaller displays integrated into the lenses, they can overlay digital information without obstructing the user’s vision when displaying notifications, navigation, or camera functions. Also, smart glasses are generally more lightweight and designed to be worn like regular glasses, making them more suitable for continuous wear and everyday activities.
Meanwhile, AR/VR headsets continue to be bulkier, as they are not intended for use while moving around or performing other tasks. These devices are primarily designed for immersive gaming experiences, offering a fully virtual environment that replaces the user's real-world surroundings with a wider field of view. In short, VR headsets isolate the user from their physical environment, while smart glasses are designed to interact with and augment the real world.
Unfortunately, there are privacy concerns surrounding smart glasses, which in part affected the success of Google Glass, and these issues have not necessarily been resolved. Smart glasses present unique privacy concerns compared to other technologies, such as smartphones. They can record audio and video more discreetly, that is, without the visible actions required by smartphones, such as holding up the device. This discretion makes it difficult for others to detect when they are being recorded. Additionally, smart glasses can continuously capture data while worn. Although some smart glasses have security features like file encryption, these do not fully address the issue of covert recording in public or private spaces. Furthermore, while the public is generally aware of smartphones' recording capabilities, smart glasses are newer and less understood, leading to heightened privacy concerns.
This week, we highlight two innovative components from FRAMOS and PUI Audio, renowned for their dedication to quality and forward-thinking design. These components represent the pinnacle of modern technology, meticulously engineered for the emerging field of next-generation wearable devices, including advanced smart glasses.
The FRAMOS Sensor Module (FSM) featuring the Sony IMX296 sensor is a compact, high-performance module measuring just 26.5mm x 26.5mm. It is equipped with a global shutter sensor offering 1.6MP native resolution, a 1/2.9 optical format, and a 3.45μm x 3.45μm pixel size. The module supports a MIPI CSI-2 interface with one data lane. Designed for seamless integration into various processing platforms, the FSM family demonstrates remarkable modularity, utilizing standardized connectors and mechanical parts, and spans resolutions from 0.4MP to 24MP with options for both rolling and global shutters, addressing a broad range of imaging needs. Ideal for sensor evaluation in early-stage design, the FSM facilitates comparative analysis and is easily integrated into third-party processor boards, enhancing its utility in diverse technological applications.
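Those specifications are easy to sanity-check with a little arithmetic. The active-array dimensions used below (1456 x 1088) are our assumption based on commonly published IMX296 figures, so verify them against the sensor datasheet:

```python
import math

# Sanity check of the FSM-IMX296 sensor geometry.
# The 1456 x 1088 active-array size is an assumption based on commonly
# published IMX296 figures -- confirm against the official datasheet.

PIXEL_PITCH_UM = 3.45          # pixel size quoted by FRAMOS
H_PIXELS, V_PIXELS = 1456, 1088

megapixels = H_PIXELS * V_PIXELS / 1e6
width_mm = H_PIXELS * PIXEL_PITCH_UM / 1000
height_mm = V_PIXELS * PIXEL_PITCH_UM / 1000
diagonal_mm = math.hypot(width_mm, height_mm)

print(f"Resolution:  {megapixels:.2f} MP")                    # ~1.58 MP ("1.6MP")
print(f"Active area: {width_mm:.2f} x {height_mm:.2f} mm")    # ~5.02 x 3.75 mm
print(f"Diagonal:    {diagonal_mm:.2f} mm")                   # ~6.27 mm
```

The ~6.3mm diagonal is consistent with a 1/2.9-type optical format, and the pixel count rounds to the quoted 1.6MP.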
Design engineers focusing on smart glasses development can benefit significantly from the FSM-IMX296 Sensor Module and its additional key advantages.
The PUI Audio Piezo Haptic Benders, comprising three distinct models, offer significant advantages for innovative wearable designs, including smart glasses. The AB1270A-LW100 model is notable for its high-temperature resistance, enduring extreme conditions from -40°C to +85°C, making it suitable for wearables exposed to harsh outdoor environments or used in automotive settings. Meanwhile, the HD-PAB2001-LW100 and HD-PAB2701-1 models stand out with their low-profile design combined with high voltage and displacement capabilities, catering to demanding applications like transmission systems and medical devices such as blood pressure or insulin pumps. These versatile haptic benders, compliant with RoHS/REACH standards, are ideal for integration into wearable applications, offering robust performance in various conditions.
The evolution from traditional eyeglasses to smart glasses showcases remarkable technological and design progress. Google Glass, despite its initial setbacks, catalyzed renewed interest in this domain. Modern smart glasses, leveraging augmented reality and artificial intelligence, blend style with functionality, marking a significant leap in wearable technology. However, privacy issues, notably around discreet recording capabilities, persist as a major challenge. Addressing these concerns is essential for the broader acceptance and integration of smart glasses into daily life.
In this evolving landscape, suppliers like FRAMOS and PUI Audio are playing a pivotal role in the development of next-generation wearables.
The global demand for sustainable and efficient farming methods is driving the adoption of drone technology in smart farming and livestock management. Agricultural drones offer tremendous benefits in seed planting, crop monitoring, and precision agriculture. To further support agricultural drone designs, Würth Elektronik released its new Evaluation Board for the WSEN-ISDS IMU 6-Axis Sensor. Let's explore how this evaluation board, combined with drone technology, is transforming smart farming and plantation practices.
The Würth Elektronik Evaluation Board for the WSEN-ISDS Inertial Measurement Unit (IMU) 6-Axis Sensor is designed to provide developers with an opportunity to verify sensor performance and develop prototypes using an extension board, such as a Sensor Shield for Arduino. This evaluation board offers convenient integration with the sensor shield through the mounted I2C and SPI interface pins. Additionally, it can be mounted on a breadboard using the through-hole pin header connections. With its 16-bit digital ultra-low-power and high-performance MEMS sensor, which includes a 3-axis linear accelerometer and a 3-axis gyroscope, the evaluation board is ideal for various applications such as drones, industrial IoT, connected devices, robotics, and automation.
The Würth Elektronik Evaluation Board offers the following features and benefits for agricultural drone design.
Sensor Characteristics: The 6-axis IMU sensor on the evaluation board incorporates a 3-axis gyroscope and a 3-axis accelerometer with a fully calibrated 16-bit output. The adjustable full scales for acceleration and gyroscope allow customization based on specific drone requirements, ensuring optimal performance in various agricultural applications.
Communication Interface: The board supports both I2C and SPI interfaces for seamless communication with the sensor, offering flexibility in integrating it with various microcontrollers and development platforms.
Fast Prototyping: The evaluation board expedites the development process, allowing engineers and designers to quickly iterate and refine their agricultural drone designs. It provides a convenient platform for testing and validating the WSEN-ISDS IMU 6-Axis sensor.
Sensor Performance Verification: The evaluation board enables precise verification of the sensor's performance, ensuring accurate data collection and confirming the sensor's suitability for specific application requirements such as crop monitoring, aerial imaging, and navigation. This leads to enhanced decision-making and optimized farming practices.
Easy Integration: The evaluation board is designed for seamless integration with other hardware components. It can be directly plugged into a sensor shield for Arduino or a sensor FeatherWing, simplifying the connection process and facilitating rapid development.
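To make the sensor characteristics above concrete, here is a generic sketch of how a drone controller might turn the IMU's signed 16-bit samples into physical units and a tilt angle. The +/-2g full scale and the axis conventions are illustrative assumptions, not WSEN-ISDS register-level driver code; consult the datasheet for actual sensitivities:

```python
import math

# Generic illustration of 16-bit IMU scaling and tilt computation.
# Register maps and exact sensitivities are sensor-specific -- see the
# WSEN-ISDS datasheet; the +/-2 g full scale here is just an example.

FULL_SCALE_G = 2.0     # example accelerometer full scale (+/-2 g)
COUNTS = 32768         # signed 16-bit range: -32768..32767

def raw_to_g(raw):
    """Convert a signed 16-bit sample to acceleration in g."""
    return raw * FULL_SCALE_G / COUNTS

def pitch_deg(ax_g, ay_g, az_g):
    """Tilt about the Y axis, derived from gravity on the accelerometer."""
    return math.degrees(math.atan2(-ax_g, math.hypot(ay_g, az_g)))

# A level drone: gravity entirely on the Z axis (raw ~16384 at +/-2 g)
level = pitch_deg(raw_to_g(0), raw_to_g(0), raw_to_g(16384))
# Nose pitched down 45 degrees: gravity split equally between X and Z
tilted = pitch_deg(raw_to_g(11585), raw_to_g(0), raw_to_g(11585))

print(f"level:  {level:.1f} deg")   # ~0.0
print(f"tilted: {tilted:.1f} deg")  # ~-45.0
```

In a real flight controller, this accelerometer-derived angle would be fused with the gyroscope's rate data (for example, in a complementary or Kalman filter) to reject vibration.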
Target applications for the Würth Elektronik Evaluation Board include drones, industrial IoT, connected devices, robotics, and automation.
The global challenges of food supply and environmental sustainability are being addressed by the transformative power of drone technology in agriculture. Drones equipped with advanced sensors and capabilities are revolutionizing smart farming practices.
Rwanda, located in East Africa, has a rich and complex history dating back thousands of years. Marred by conflict and tragedy, the country has shown resilience and determination in overcoming its challenges. Today, Rwanda is striving to build a prosperous and inclusive society for its citizens and has made remarkable progress in various sectors, including farming, healthcare, education, infrastructure, and technology. The country is committed to prioritizing economic development and poverty reduction.
According to a December 2020 article1, authors Sylvere Nshimiyimana and Jean D'Amour Rukundo from Bogor Agricultural University (IPB University) explain that the Rwandan government has outlined Vision 2050, a long-term development plan aimed at transforming the country into a knowledge-based, middle-income economy. The review states that Rwanda is embracing modern agricultural technologies to transition from subsistence farming to commercial agriculture.
The review explains that the incorporation of various strategies, policies, and modern agricultural technologies has greatly transformed agricultural activities in Rwanda. Among these advancements, the use of drones for precision agriculture stands out as a significant contributor. Alongside initiatives such as the Information and Communication Technology for Agriculture (ICT4Ag) Strategy and the implementation of the Crop Intensification Program (CIP), drones have revolutionized farming practices in Rwanda. By harnessing the power of drone technology, farmers can now achieve unprecedented precision and accuracy in their agricultural operations. Equipped with advanced imaging sensors, GPS technology, and data analytics capabilities, drones gather detailed information on crop health, soil conditions, and pest infestations. This invaluable data empowers farmers to make informed decisions regarding irrigation, fertilization, and pest control, leading to optimized resource allocation and improved yields.
Moreover, drones can enable efficient monitoring and management of large-scale agricultural fields that would otherwise be challenging to survey manually. The aerial perspective provided by drones facilitates the identification of potential issues, such as nutrient deficiencies or water stress, allowing farmers to take timely action and mitigate crop losses.
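As one concrete example of the crop-health analysis described above, drones with multispectral cameras commonly compute the Normalized Difference Vegetation Index (NDVI) per pixel from near-infrared and red reflectance. The formula is standard in remote sensing; the sample reflectance values below are illustrative, not measured data:

```python
# NDVI (Normalized Difference Vegetation Index), a standard measure of
# crop health from multispectral drone imagery:
#     NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to +1.
# Healthy vegetation reflects strongly in near-infrared and absorbs red,
# so it scores high. The reflectance values below are made-up samples.

def ndvi(nir, red):
    """Per-pixel NDVI; returns 0.0 when both bands are zero."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# (NIR, Red) reflectance pairs for three hypothetical field patches
patches = {
    "healthy crop": (0.60, 0.08),
    "stressed crop": (0.40, 0.30),
    "bare soil": (0.25, 0.20),
}

for name, (nir, red) in patches.items():
    print(f"{name:14s} NDVI = {ndvi(nir, red):+.2f}")
```

Mapping NDVI across a field is one way aerial data turns into the irrigation and fertilization decisions mentioned above.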
An article by The New Times2 discusses the innovative use of drones to enhance pig breeding in Rwanda. Drone technology is being employed to improve pig productivity and address challenges faced by farmers in the country.
The pilot project is revolutionizing pig breeding for smallholders. Instead of traveling long distances to hire a male stud or collect pig semen, farmers now receive swine semen via drone delivery. Zipline, in collaboration with the Rwanda Agriculture and Animal Resources Development Board (RAB), has operated this initiative successfully since its launch. The project addresses the challenges faced by farmers and offers a convenient and efficient solution for pig breeding in the country. The initiative aims to increase pig production and enhance the livelihoods of farmers, ultimately contributing to Rwanda's agricultural development. Under the Rwanda Livestock Master Plan, launched in December 2017, the pig industry is expected to be a major contributor to Rwanda's meat production.
The adoption of drone technology in smart farming and plantation practices is driven by the global demand for sustainable and efficient farming methods. In the case of Rwanda, a country that has embraced drone technology in its efforts to transition from subsistence farming to commercial agriculture, drones have been instrumental in achieving precision agriculture and improving agricultural activities in the country. By harnessing the power of drones, farmers in Rwanda can make informed decisions regarding irrigation, fertilization, and pest control, leading to optimized resource allocation and improved yields. Drones are also being used to enhance pig breeding in Rwanda, offering a convenient and efficient solution for smallholders and contributing to the country's agricultural development.
The Würth Elektronik IMU Sensor Evaluation Board, combined with drone technology, is revolutionizing smart farming, plantations, and requirements such as crop monitoring. Its fast-prototyping capabilities and the verification of sensor performance are invaluable for agricultural drone design. By leveraging the advanced features of the evaluation board, design engineers can develop smarter, faster, and more efficient agricultural drones, contributing to global food security and environmentally friendly farming practices. Experience the future of smart farming with Würth Elektronik's IMU Sensor Evaluation Board.
Copyright ©2024 Mouser Electronics, Inc.