AI-Powered SCARA Robot

An R&D project to design and build a low-cost automation solution for local MSMEs, from initial concept to a functional AI vision system.


Project Overview

Problem Statement

Micro, Small, and Medium Enterprises (MSMEs) in regions like Upper Assam face significant hurdles in adopting industrial automation due to the high capital investment for commercial robotic systems. This technology gap limits productivity and competitiveness. This project addresses the pressing need for an indigenously developed, cost-effective automation platform.

Goal & Scope

The principal goal was to develop a functional proof-of-concept for a 4-DOF SCARA robot. The project's comprehensive scope included the entire pre-production lifecycle: detailed mechanical design, structural validation via CAE, rapid prototyping of all custom parts with 3D printing, and the development of a custom AI model for the vision system.

Mechanical Design & Validation

Interactive 3D Model

Explore the full CAD assembly of the SCARA robot. You can rotate, pan, and zoom to inspect every component, from the structural links to the GT2 pulley system designed for backlash-free motion.


4-DOF Kinematic Structure

The robot uses the classic 4-degree-of-freedom SCARA configuration, well suited to pick-and-place tasks: two rotary joints with parallel vertical axes provide planar motion, a prismatic joint provides vertical travel, and a final roll axis sets the gripper orientation.
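
For illustration, a minimal forward-kinematics sketch for this joint layout is shown below; the link lengths L1 and L2 are placeholders rather than the robot's actual dimensions.

```python
import math

# Illustrative link lengths in mm -- placeholders, not the project's actual dimensions.
L1 = 200.0  # shoulder-to-elbow link
L2 = 150.0  # elbow-to-end-effector link

def forward_kinematics(theta1_deg, theta2_deg, d3, theta4_deg):
    """Map the four joint variables of a SCARA arm to an end-effector pose.

    theta1, theta2 : rotary joint angles (degrees) driving planar x/y motion
    d3             : prismatic joint travel (mm) driving vertical z motion
    theta4         : roll angle (degrees) setting gripper orientation
    """
    t1 = math.radians(theta1_deg)
    t2 = math.radians(theta2_deg)
    x = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
    y = L1 * math.sin(t1) + L2 * math.sin(t1 + t2)
    z = d3                                        # vertical position comes straight from the prismatic joint
    roll = theta1_deg + theta2_deg + theta4_deg   # gripper orientation in the horizontal plane
    return x, y, z, roll

print(forward_kinematics(30.0, 45.0, 50.0, 0.0))
```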

Component Design

Critical components such as the bearings and the GT2 belt drive were selected to match the load and precision requirements. Thrust ball bearings (e.g., 40x60x13mm at Joint 1) carry the axial load from the cantilevered arm, while the GT2 belt-and-pulley stages provide reductions of up to 20:1 for ample torque and fine positioning resolution.
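
As a rough check of what such a reduction buys, the sketch below computes the joint resolution assuming a 1.8-degree stepper (200 full steps per revolution), 1/16 microstepping on the A4988 drivers, and the 20:1 belt reduction quoted above; the microstepping setting is an assumption, not a documented project value.

```python
# Back-of-the-envelope resolution check for a belt-reduced joint.
FULL_STEPS_PER_REV = 200   # 1.8-degree NEMA 17 stepper
MICROSTEPPING = 16         # assumed A4988 microstepping setting
GEAR_RATIO = 20            # GT2 belt reduction quoted above

steps_per_joint_rev = FULL_STEPS_PER_REV * MICROSTEPPING * GEAR_RATIO   # 64000
resolution_deg = 360.0 / steps_per_joint_rev                            # ~0.0056 deg per microstep
steps_per_degree = steps_per_joint_rev / 360.0                          # ~177.8

print(f"{steps_per_joint_rev} microsteps per joint revolution")
print(f"{resolution_deg:.4f} deg of joint motion per microstep")
print(f"{steps_per_degree:.1f} microsteps per degree")
```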

Prototyping & Bill of Materials

All custom structural parts were fabricated using FDM 3D printing with PLA material over a total of 35 hours. This approach allowed for rapid iteration and cost-effective prototyping. A comprehensive Bill of Materials was created to track all components.

| Component Name | Specification | Category | Quantity |
| --- | --- | --- | --- |
| NEMA 17 Stepper Motor | 1.8-degree step, 42mm | Electronic | 4 |
| Arduino UNO R3 | ATmega328P | Electronic | 1 |
| CNC Shield V3 | - | Electronic | 1 |
| A4988 Stepper Driver | With heatsink | Electronic | 4 |
| Thrust Ball Bearing | 40x60x13mm (Joint 1) | Mechanical | 2 |
| GT2 Timing Belt | 200mm, 300mm, 400mm | Mechanical | Various |
| Linear Ball Bearing | 10mm ID (LM10UU) | Mechanical | 4 |
| 3D Printed Parts | Base, Arm 1, Arm 2, etc. | Structural | 35 parts |

AI Vision System & Control

The robot’s intelligence is driven by a state-of-the-art YOLO (You Only Look Once) object detection model. This system transforms the robot from a simple manipulator into an autonomous agent that can perceive and interact with its environment.
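
Before the step-by-step workflow, the sketch below shows roughly what the training and inference calls (steps 2 and 3 below) look like with the Ultralytics YOLOv8 Python API; the dataset file, image name, and training settings are placeholders, not the project's actual values.

```python
# A minimal sketch of the vision pipeline using the Ultralytics YOLOv8 API.
# "scara_parts.yaml", "workspace.jpg", and the training settings are placeholders.
from ultralytics import YOLO

# Train on the hand-labeled dataset, starting from a small pretrained model.
model = YOLO("yolov8n.pt")
model.train(data="scara_parts.yaml", epochs=100, imgsz=640)

# Run inference on a workspace image and pull out bounding-box centers,
# which the control layer later converts into joint angles via IK.
trained = YOLO("runs/detect/train/weights/best.pt")
results = trained("workspace.jpg")

for box in results[0].boxes:
    x_center, y_center, w, h = box.xywh[0].tolist()   # pixel coordinates
    confidence = float(box.conf[0])
    class_id = int(box.cls[0])
    print(f"class {class_id} at ({x_center:.0f}, {y_center:.0f}) px, conf {confidence:.2f}")
```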

  1. Data Collection & Labeling

    A custom dataset of hundreds of images was created, capturing target objects from various angles and under different lighting conditions. Each object in every image was meticulously hand-labeled with bounding boxes to prepare the data for training.

  2. YOLO Model Training

    The labeled dataset was used to train a YOLOv8 neural network. This computationally intensive process allowed the model to learn the distinct visual features of the target objects. A validation set was used to prevent overfitting and ensure the model could generalize to new, unseen images.

  3. Validation & Deployment

    The trained model was validated for accuracy and real-time performance (FPS). The final model can accurately detect objects and output their bounding box coordinates, which are then fed into the robot's control system.

  4. Robot Kinematics (IK/FK)

    The 2D pixel coordinates from the AI model are first mapped into the robot's workspace frame, and Inverse Kinematics (IK) then converts that target position into the joint angles required for the end-effector to reach the object; Forward Kinematics (FK) provides the reverse mapping from joint angles back to end-effector pose. This is the crucial mathematical bridge between "seeing" and "acting" (a sketch of the planar IK solution follows below).
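
A minimal sketch of the planar IK step, assuming illustrative link lengths and a simple millimetre-per-pixel camera mapping; both are placeholders, not the project's calibrated values.

```python
import math

# Illustrative link lengths (mm) -- placeholders rather than the robot's real dimensions.
L1, L2 = 200.0, 150.0

def inverse_kinematics(x, y, elbow_up=True):
    """Planar IK for the two rotary SCARA joints.

    Given a target (x, y) in the robot's base frame (mm), return the joint
    angles theta1, theta2 in degrees, or None if the point is out of reach.
    """
    d2 = x * x + y * y
    cos_t2 = (d2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    if abs(cos_t2) > 1.0:
        return None                       # target outside the reachable workspace
    t2 = math.acos(cos_t2)
    if not elbow_up:
        t2 = -t2
    t1 = math.atan2(y, x) - math.atan2(L2 * math.sin(t2), L1 + L2 * math.cos(t2))
    return math.degrees(t1), math.degrees(t2)

# Example: a detection at pixel (412, 288) mapped to workspace coordinates with an
# assumed mm-per-pixel scale and origin offset (hypothetical calibration), then solved.
MM_PER_PIXEL, X_OFFSET, Y_OFFSET = 0.5, -160.0, 60.0
px, py = 412, 288
target = (px * MM_PER_PIXEL + X_OFFSET, py * MM_PER_PIXEL + Y_OFFSET)
print(inverse_kinematics(*target))
```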

Key Learnings & Tools

Technical & Professional Growth

  • Interdisciplinary Integration: Gained deep insight into the R&D process of creating a cohesive mechatronic system by integrating mechanical, electronic, and software engineering.
  • Systematic Troubleshooting: Developed a methodical approach to debugging complex systems, isolating issues between mechanical tolerances, electronic wiring, and software logic.

Software Tools Used

  • Fusion 360
  • ANSYS
  • Python
  • Arduino IDE

Future Work & Improvements

Modular Gripper System

Design a versatile, quick-change end-effector system with force-sensitive capabilities for handling delicate objects.

Multi-Object Classification

Enhance the AI model to classify and sort multiple different object types simultaneously, increasing the robot's adaptability.

Optimized Motion Paths

Implement advanced control algorithms (e.g., trajectory planning with splines) for faster, smoother, and more efficient motion.
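
As a starting point, the sketch below shows the simplest spline-style profile, a cubic time-scaling between two joint angles with zero velocity at both ends; the joint values and timing are illustrative only.

```python
# A minimal sketch of cubic time-scaling between two joint angles -- the simplest
# building block of the spline-based trajectory planning proposed above.
def cubic_trajectory(theta_start, theta_end, duration, steps=10):
    """Return (time, angle) samples along a cubic ease-in/ease-out profile."""
    samples = []
    for i in range(steps + 1):
        t = duration * i / steps
        s = 3 * (t / duration) ** 2 - 2 * (t / duration) ** 3   # smooth 0 -> 1 scaling
        samples.append((t, theta_start + (theta_end - theta_start) * s))
    return samples

for t, angle in cubic_trajectory(0.0, 90.0, 2.0, steps=5):
    print(f"t = {t:.1f} s, theta = {angle:.1f} deg")
```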
