Humanoid Robot Market Set for a 14-Fold Expansion in 5 Years

Humanoid robots, widely perceived as embodied AI, are projected to see rapid market growth over the next ten years. This surge is fueled by major players such as Tesla and BYD, which plan to expand humanoid deployment in their factories more than tenfold between 2025 and 2026 while targeting a cost reduction of over 25% per humanoid robot. With 2025 seen as the industry’s take-off year, IDTechEx anticipates that the market for humanoid robot sensory components, including LiDAR, encoders, torque sensors, 6-axis sensors, IMUs, MEMS sensors, and cameras, will reach approximately US$10 billion within ten years. This growth presents major opportunities for component suppliers. IDTechEx’s recent research report, “Humanoid Robots 2025-2035: Technologies, Markets and Opportunities”, details the market opportunities, pain points, and the manufacturing, technical, commercial, and regulatory challenges surrounding these components.

Sensors are essential to the functionality of humanoid robots, serving a wide range of purposes. They enable navigation and object detection (e.g., LiDAR and cameras), force control (e.g., torque and tactile sensors), and position and stability management (e.g., IMUs). This article focuses primarily on tactile sensors and navigation sensors, particularly LiDAR and cameras.

Tactile sensors

Tactile sensors are critical components in humanoid robots, especially in the hands, where they enable tasks such as picking and placing objects. These sensors are complex systems that integrate inputs such as force, slip, pressure, and torque to generate data that guides the robot’s hand movements based on an object’s shape, stiffness, and softness. As one of the most valuable subsystems in humanoid robots, tactile sensors significantly enhance grip control and object manipulation. They allow for real-time adjustment of grip forces to prevent slippage and enable object handling without relying on visual input.
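The grip-control behavior described above can be sketched as a simple feedback loop: when the tactile array reports slip, the controller raises grip force until the object is stable. This is a minimal illustration of the concept, not any vendor's actual control stack; the `adjust_grip` function, its thresholds, and its gain are hypothetical.

```python
# Minimal sketch of a slip-aware grip-control loop.
# All thresholds, gains, and interfaces here are hypothetical
# illustrations, not a real tactile-sensor API.

def adjust_grip(current_force, slip_rate, max_force=20.0,
                slip_threshold=0.01, gain=5.0):
    """Increase grip force in proportion to detected slip.

    current_force: current grip force in newtons
    slip_rate: tangential slip speed reported by the tactile array (m/s)
    """
    if slip_rate > slip_threshold:
        # Object is slipping: tighten, but never exceed the safe maximum.
        return min(current_force + gain * slip_rate, max_force)
    # No slip detected: hold the current grip.
    return current_force

print(adjust_grip(5.0, 0.05))  # slipping -> force raised to 5.25
print(adjust_grip(5.0, 0.0))   # stable   -> force unchanged at 5.0
```

A real controller would run this loop at high frequency and fuse slip with force and pressure channels, but the core idea, tighten only as much as slip demands, is what lets the hand hold objects without visual input.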

Looking ahead, tactile sensors will also play a role in object identification by detecting surface properties and determining material characteristics. Several technologies underpin tactile sensing, including capacitive, optical, and magnetic sensors. Among these, optical tactile sensors typically offer the highest precision, but such accuracy is often unnecessary for current humanoid applications; capacitive and magnetic sensors generally provide sufficient resolution for most tasks. IDTechEx sees flexible capacitive tactile sensors as a promising direction due to their adaptability, though their sensitivity to humidity and temperature limits their use in dynamic environments. For more complex measurements, such as 6D force sensing, magnetic and optical solutions may still be required. Tactile sensors typically cost between US$0.4/mm² and US$1.2/mm², depending on factors such as volume, supplier, and technology type. More details on tactile sensors and their commercial applications in humanoids are available in the report “Humanoid Robots 2025-2035: Technologies, Markets and Opportunities”.
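As a quick sanity check on these per-area costs, the quoted US$0.4-1.2/mm² range implies a wide spread even for a small sensing pad. The pad dimensions below are a made-up example, not a figure from the report:

```python
# Rough cost estimate from the US$0.4-1.2/mm^2 range quoted above.
# The 15 x 15 mm pad size is an invented example for illustration.

def pad_cost_range(width_mm, height_mm, low_per_mm2=0.4, high_per_mm2=1.2):
    """Return the (low, high) cost in US$ for a rectangular tactile pad."""
    area_mm2 = width_mm * height_mm
    return area_mm2 * low_per_mm2, area_mm2 * high_per_mm2

low, high = pad_cost_range(15, 15)  # 225 mm^2, roughly fingertip-sized
print(f"US${low:.0f} to US${high:.0f} per pad")  # US$90 to US$270 per pad
```

With multiple pads per hand, this is why cost-effective integration of tactile sensing matters so much to overall system economics.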

LiDAR and cameras

LiDAR and cameras are two essential sensory technologies that enable humanoid robots to perform navigation, collision avoidance, and object detection. As of 2025, most humanoid robots use a combination of LiDAR and cameras, with Tesla’s Optimus a notable exception in relying solely on cameras. The integration of both technologies is largely driven by the growing complexity of the environments these robots must navigate: vision alone often falls short in the unpredictable, dynamic conditions of real-world applications.

According to Hesai, a leading LiDAR supplier, vision-based systems may be adequate in controlled environments like production lines, where tasks and lighting conditions are stable. However, humanoid robots are increasingly expected to operate in diverse and unstructured settings, interacting with people, adapting to changing light levels, and navigating physically complex spaces. In scenarios with bright sunlight, low light, or rapidly shifting conditions, camera-only systems can struggle with reliability and raise safety concerns in human-robot interactions.

LiDAR offers crucial complementary capabilities. It enables precise path planning, real-time 3D environmental mapping, and reliable obstacle detection, even in poorly lit environments. These features are essential for humanoids performing tasks like placing objects into confined spaces, avoiding unexpected obstacles in dim conditions, or navigating areas where lighting is limited, such as after factory hours. In high-risk settings like mining or tunnel exploration, LiDAR further enhances safety and accuracy, allowing robots to scan and navigate hazardous terrain with confidence.
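The kind of lighting-independent obstacle check LiDAR enables can be illustrated in a few lines over a raw point cloud: flag any return that falls inside a safety radius around the robot. The point coordinates and radius below are invented for illustration only.

```python
# Illustrative sketch of obstacle detection on a LiDAR point cloud:
# flag any return closer than a safety radius, regardless of lighting.
# The cloud data and 0.5 m radius are hypothetical examples.

import math

def nearest_obstacle(points, safety_radius=0.5):
    """Return (closest distance, violation flag).

    points: iterable of (x, y, z) returns in metres, robot at the origin.
    """
    closest = min(math.dist((0.0, 0.0, 0.0), p) for p in points)
    return closest, closest < safety_radius

cloud = [(2.0, 0.1, 0.0), (0.3, 0.2, 0.1), (1.5, -0.8, 0.2)]
dist, too_close = nearest_obstacle(cloud)
print(f"closest return: {dist:.2f} m, stop: {too_close}")
# closest return: 0.37 m, stop: True
```

Because every point carries a direct range measurement, the same check works in darkness, where a camera-only pipeline would first have to recover depth from degraded images.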

Looking ahead, the rapid scale-up in humanoid robot deployment is expected to drive substantial growth in sensory components, with the market projected to exceed US$10 billion by 2035. As demand for dexterous capabilities increases, tactile sensors will need to be integrated cost-effectively, enabling fine manipulation without significantly raising overall system expenses.

For LiDAR and cameras, the evolving requirement for full 360-degree spatial awareness in humanoids will influence future design trends. Unlike autonomous vehicles, which typically rely on long-range, forward-facing LiDAR for lane detection and obstacle avoidance, humanoid robots must operate in diverse environments that require comprehensive spatial coverage. As a result, next-generation LiDAR systems will likely prioritize wide field-of-view (FOV) designs with minimal blind spots, especially suited for indoor applications where precise, all-directional awareness is critical and operational speeds are lower.

Further insights into sensory demands and market opportunities can be found in IDTechEx’s report, “Humanoid Robots 2025-2035: Technologies, Markets and Opportunities”. For more information on this report, including downloadable sample pages, please visit IDTechEx.com/HumanoidRobotics, or for the full portfolio of robotics-related research available from IDTechEx, see IDTechEx.com/Research/Robotics.
