
What Is a 3D Sensor and How Do Robots Use One?

  • Apr 6
  • 5 min read

Updated: Apr 13

A robot arm without a sensor is working blind. It follows a fixed program, moves to a pre-taught position, and picks or places whatever it expects to find there. If something shifts by a few millimeters, or a part arrives in a different orientation, the arm either misses entirely or grabs incorrectly.


A 3D sensor changes that. It gives the robot a real-time map of its environment: not just a flat image, but a full spatial picture with depth. The arm knows where the object is, how it is oriented, and how far away it sits. It can adapt its approach accordingly, without being reprogrammed every time something changes.


This is why 3D sensors have become one of the most important enabling technologies in practical industrial robotics. This post explains what a 3D sensor is, how the main types work, which tasks they unlock, and which Blue Sky Robotics cobots are built to use them.


What a 3D Sensor Is


A 3D sensor is any device that captures spatial information about the physical world. In addition to the X and Y coordinates a flat camera provides, its output includes depth: the distance from the sensor to each surface in the scene.


The output is usually a point cloud: a dense collection of data points, each representing a location in three-dimensional space. Vision software processes that point cloud to identify objects, calculate their position and orientation, measure their dimensions, and pass precise coordinates to the robot controller.
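The "calculate their position and orientation" step can be sketched in a few lines of NumPy. The toy cloud and the PCA approach below are illustrative only, not how any particular vision package (including Blue Argus) is implemented internally:

```python
import numpy as np

def estimate_pose(points: np.ndarray):
    """Estimate a part's position and orientation from a point cloud.

    points: (N, 3) array of XYZ samples from a 3D sensor.
    Returns the centroid (position) and principal axes (orientation).
    """
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Principal component analysis: eigenvectors of the covariance
    # matrix give the part's dominant axes in sensor coordinates.
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))
    # Order the axes from longest to shortest extent.
    order = np.argsort(eigvals)[::-1]
    return centroid, eigvecs[:, order]

# Toy cloud: a thin rectangular part lying flat, long side along X.
rng = np.random.default_rng(0)
part = rng.uniform([-0.05, -0.02, 0.0], [0.05, 0.02, 0.002], size=(500, 3))
position, axes = estimate_pose(part)
```

The centroid gives the pick point, and the first principal axis recovers the part's long direction, which is exactly the kind of result a vision pipeline hands to the robot controller.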


The critical difference from a standard 2D camera is that a 3D sensor tells the robot where things actually are in space rather than just what they look like in an image. That spatial awareness is what allows robots to handle variability: parts in different positions, bins that are never filled the same way twice, pallets with mixed case heights, all without breaking down.
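For the robot to act on that spatial awareness, points detected in the sensor's frame must be converted into the robot's base frame using a calibrated transform (commonly obtained from hand-eye calibration). A minimal sketch with purely illustrative mounting values:

```python
import numpy as np

# Hypothetical hand-eye calibration result: pose of the camera in the
# robot base frame. These numbers are illustrative, not from a real cell.
T_base_cam = np.eye(4)
T_base_cam[:3, 3] = [0.40, 0.00, 0.60]   # camera 40 cm out, 60 cm up
T_base_cam[:3, :3] = [[1,  0,  0],
                      [0, -1,  0],
                      [0,  0, -1]]       # camera looking straight down

def to_base_frame(p_cam):
    """Convert a point from camera coordinates to robot base coordinates."""
    p = np.append(p_cam, 1.0)            # homogeneous coordinates
    return (T_base_cam @ p)[:3]

# A part detected 55 cm below the camera, slightly off-center.
target = to_base_frame([0.05, 0.02, 0.55])
```

The resulting coordinates are what actually gets sent to the robot controller as a motion target.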


The Main Types of 3D Sensors Used in Robotics


Three sensor technologies dominate industrial robotics applications. Each works differently and suits different use cases.


Structured light sensors project a known pattern of light onto the scene, typically a grid or a series of stripes, and measure how that pattern deforms across the surfaces it hits. The deformation data is processed into a dense, accurate 3D point cloud. Structured light sensors produce some of the highest-quality depth data available and handle a wide range of surfaces including reflective metal parts, dark objects, and complex geometries. Mech-Mind's Mech-Eye industrial cameras use this approach and are widely used in bin picking, palletizing, and precision inspection applications.


Stereo vision sensors use two cameras offset from each other, similar to how human eyes work, to calculate depth from the difference between the two images. These sensors are compact and relatively affordable, making them a practical choice for cobot applications. The Intel RealSense D435 and Luxonis OAK-D-Pro-PoE are two of the most common stereo cameras in cobot deployments. UFactory's open-source vision SDK supports both cameras natively across the full xArm and Lite 6 lineup.
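The underlying geometry is simple triangulation: depth Z = f · B / d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity of the same feature between the left and right images. The numbers below are illustrative, not the specs of any particular camera:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from stereo disparity: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# A feature with 40 px of disparity, seen by a camera pair with a
# 5 cm baseline and a 640 px focal length, sits 0.8 m away.
z = stereo_depth(focal_px=640.0, baseline_m=0.05, disparity_px=40.0)
```

The formula also shows the tradeoff: as disparity shrinks for distant objects, small pixel errors translate into large depth errors, which is why stereo accuracy falls off with range.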


Time-of-Flight (ToF) sensors emit pulses of infrared or laser light and measure how long the pulses take to return from the scene. This gives a depth map in real time at high frame rates. ToF sensors are well suited for fast-moving applications and environments where the robot needs to perceive large areas quickly. They maintain reliable performance in variable lighting conditions, including bright factory floors and low-light environments.
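The ToF principle is even simpler: the pulse travels to the surface and back, so depth is half the round-trip distance at the speed of light. A minimal sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_depth(round_trip_s: float) -> float:
    """Depth from a time-of-flight pulse: light covers the distance twice."""
    return C * round_trip_s / 2.0

# A pulse returning after ~6.67 nanoseconds corresponds to roughly 1 m.
d = tof_depth(6.67e-9)
```

The nanosecond-scale timing involved is why ToF sensors measure phase shifts or use dedicated timing hardware rather than a naive stopwatch, but the depth calculation itself is this direct.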


Each technology involves tradeoffs. Structured light delivers the highest accuracy and point cloud density but is slower and more expensive. Stereo vision is affordable and versatile but less accurate on featureless or highly reflective surfaces. ToF is fast and robust across lighting conditions but typically lower in resolution than structured light at comparable price points.


What 3D Sensors Enable in Practice


The applications where 3D sensors make the biggest difference are those where fixed programming breaks down because the real world does not stay still.

Bin picking is the canonical example. Parts arrive in a bin in random orientations, often stacked or touching. Without a 3D sensor, the robot cannot locate a pickable surface. With one, it maps the bin in real time, identifies a stable grasp point, plans a collision-free path, and picks reliably even as the bin empties and the remaining parts shift.
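The "identifies a stable grasp point" step can be reduced to a toy example: given a depth map of the bin, the surface closest to the sensor is the top of the pile. This is a deliberately simplified sketch; a production system would also score graspability and plan a collision-free path:

```python
import numpy as np

def next_grasp(depth_map: np.ndarray):
    """Return (row, col, depth) of the highest exposed surface in the bin.

    depth_map: 2D array of distances from the sensor, in meters.
    The smallest depth is the surface closest to the sensor.
    """
    row, col = np.unravel_index(np.argmin(depth_map), depth_map.shape)
    return row, col, depth_map[row, col]

bin_scan = np.full((4, 4), 0.50)   # empty bin floor 50 cm from the sensor
bin_scan[1, 2] = 0.38              # one part sticking up 12 cm
candidate = next_grasp(bin_scan)
```

Re-running this on a fresh scan after every pick is what lets the system keep working as the bin empties and the remaining parts shift.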


Machine tending requires the robot to locate parts of varying sizes and shapes, pick them accurately, and load them into machines at the correct position and angle. A 3D sensor handles the variation between parts without requiring a human to orient each one first.


Palletizing and depalletizing use 3D sensors to handle mixed pallet patterns, angled cases, and deformed bags that would stop a fixed-program system. The sensor maps the pallet surface in real time and the robot adjusts its picks accordingly.


Assembly and alignment rely on 3D sensors to verify part position before and during placement, correcting for small positional errors that compound into defects without real-time feedback.


Quality inspection uses 3D data to measure surface flatness, detect dents or protrusions, verify dimensions, and flag deviations from spec at line speed without removing parts from the production flow.


Which Cobots Work Best with 3D Sensors


Every arm in the Blue Sky Robotics lineup supports 3D sensor integration through open APIs, Python SDKs, and ROS compatibility. The combination that fits your application depends on the sensor type, the payload requirement, and the complexity of the task.


For entry-level vision applications with a stereo sensor, the UFactory Lite 6 ($3,500) is the most accessible starting point. It supports both the Intel RealSense and Luxonis OAK-D cameras through UFactory's vision SDK and handles straightforward pick and place and basic inspection tasks reliably.


For production cells with heavier parts or structured-light cameras, the Fairino FR5 ($6,999) and Fairino FR10 ($10,199) offer the payload and reach to run demanding vision-guided applications. Both support full ROS integration, making them compatible with the broader ecosystem of 3D sensor software and tools.


Getting Started


Use our Cobot Selector to match an arm and sensor type to your application, or the Automation Analysis Tool to model the ROI of a 3D sensor-equipped cell against your current process. When you are ready to see it in action, book a live demo. To learn more about our computer vision software, visit Blue Argus.


Browse our full UFactory lineup and Fairino cobots with current pricing.


FAQ


What is the difference between a 3D sensor and a regular camera?

A regular camera captures a flat 2D image. A 3D sensor adds depth information, producing a spatial map that tells the robot how far away objects are and what shape they have in three dimensions. That depth data is what enables reliable robotic manipulation in variable environments.


Which 3D sensor type is best for bin picking?

Structured light sensors produce the most accurate and dense point clouds, making them the standard choice for bin picking of complex or reflective parts. Stereo sensors work well for simpler bin picking applications at lower cost.


Can a 3D sensor work with any robot arm?

Most industrial 3D sensors connect via Gigabit Ethernet or USB and output standard data formats that integrate with any robot controller through an open API. UFactory and Fairino cobots both support this integration architecture natively.
