Step‑by‑Step Guide: Build Your Own Lerobot SO‑ARM100 Robotic Arm from Scratch

This article walks you through the entire process of assembling a low-cost Lerobot SO-ARM100 6-DOF robotic arm: configuring its Feetech servos, calibrating the joints, adding dual cameras for teleoperation, collecting a demonstration dataset, and training an imitation-learning (ACT) policy locally or on cloud GPUs, with detailed troubleshooting tips and code examples.

ShiZhen AI

Background

During the Chinese New Year break the author rebuilt the open-source Lerobot SO-ARM100, an affordable 6-DOF manipulator that can be 3D-printed and driven by learned policies.

Project Overview

The SO-ARM100 is a pair of 6-DOF arms: a leader arm you move by hand and a follower arm that mirrors it. Lerobot provides the software framework for teleoperation, dataset recording, and policy training.

Materials

3D‑printed parts (≈¥1500 if printed yourself)

12× Feetech STS3215 servos (≈¥1200 total)

Motor control boards (2, ≈¥48)

Power adapters (2 A for the leader arm, 10 A for the follower arm)

USB‑C cables, expansion dock, screwdriver, file

Two high‑resolution USB cameras

1. Set Up the Lerobot Environment

1.1 Install Miniconda

wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-aarch64.sh
sh Miniconda3-latest-Linux-aarch64.sh
source ~/.bashrc

1.2 Create and Activate Conda Environment

conda create -y -n lerobot python=3.10 && conda activate lerobot

1.3 Clone the Lerobot Repository

git clone https://github.com/huggingface/lerobot.git ~/lerobot

1.4 Install Feetech Servo Drivers

cd ~/lerobot && pip install -e ".[feetech]"

1.5 Additional Linux Dependencies

conda install -y -c conda-forge ffmpeg
pip uninstall -y opencv-python
conda install -y -c conda-forge "opencv>=4.10.0"

2. Configure Servo Drivers

2.1 Find Motor Bus Port

python lerobot/scripts/find_motors_bus_port.py

The command prints a port like /dev/tty.usbmodem58760431091.
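The helper script identifies the port by comparing the set of serial devices before and after you unplug the board. A stdlib-only sketch of the same idea (device-name patterns are examples for Linux/macOS):

```python
import glob

def list_serial_ports() -> set[str]:
    """Candidate serial devices under common Linux/macOS names."""
    return set(glob.glob("/dev/ttyACM*") + glob.glob("/dev/tty.usbmodem*"))

def removed_ports(before: set[str], after: set[str]) -> set[str]:
    """Ports that vanished after unplugging -> the control board's port."""
    return before - after

# before = list_serial_ports()
# input("Unplug the motor control board, then press Enter...")
# print(removed_ports(before, list_serial_ports()))
```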

2.2 Edit Configuration

Open lerobot/common/robot_devices/robots/configs.py and modify FeetechMotorsBusConfig with the detected ports for the leader and follower arms.
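For reference, the relevant entry looks roughly like the following (the port value is the example from section 2.1 and the joint names are illustrative; check your checkout of configs.py for the exact schema):

```python
follower_arms: dict[str, MotorsBusConfig] = field(
    default_factory=lambda: {
        "main": FeetechMotorsBusConfig(
            port="/dev/tty.usbmodem58760431091",  # replace with your detected port
            motors={
                # joint name: [servo ID, model]
                "shoulder_pan": [1, "sts3215"],
                "shoulder_lift": [2, "sts3215"],
                "elbow_flex": [3, "sts3215"],
                "wrist_flex": [4, "sts3215"],
                "wrist_roll": [5, "sts3215"],
                "gripper": [6, "sts3215"],
            },
        ),
    }
)
```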

2.3 Set Motor IDs

python lerobot/scripts/configure_motor.py \
  --port /dev/tty.usbmodem58760431091 \
  --brand feetech \
  --model sts3215 \
  --baudrate 1000000 \
  --ID 1

Repeat for IDs 2-6 on the leader arm, then configure the follower arm's six servos the same way using its own port.
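Since each servo is assigned its ID individually (typically with only that motor connected to the bus), a small driver loop saves retyping. This is a hypothetical convenience wrapper around the script above, not part of Lerobot:

```python
import subprocess

LEADER_PORT = "/dev/tty.usbmodem58760431091"  # example port from section 2.1

def configure_cmd(port: str, motor_id: int) -> list[str]:
    """Build the configure_motor.py invocation for one servo."""
    return [
        "python", "lerobot/scripts/configure_motor.py",
        "--port", port, "--brand", "feetech",
        "--model", "sts3215", "--baudrate", "1000000",
        "--ID", str(motor_id),
    ]

# for motor_id in range(1, 7):
#     input(f"Connect only servo {motor_id}, then press Enter...")
#     subprocess.run(configure_cmd(LEADER_PORT, motor_id), check=True)
```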

3. Assemble the Arm

3.1 Assemble the Follower Arm

Install servos from bottom to top; the lowest servo is ID 1, the highest is ID 6.

3.2 Assemble the Leader Arm

Follow the same order, but keep the gear on the sixth joint detached until later. Do not rotate any gear during assembly; doing so forced the author into major rework.

3.3 Connect Cables and Secure

Route the data cable from servo 1 to the control board, interconnect the remaining cables, and clamp the assembly to the desk.

4. Calibrate the Robot

Run the calibration script to align the leader and follower joints. The script gives voice prompts; calibrate the follower arm first, then the leader arm. Calibration succeeds only if each joint's angle error stays below 10 %.

python lerobot/scripts/control_robot.py \
  --robot.type=so100 \
  --robot.cameras='{}' \
  --control.type=calibrate \
  --control.arms='["main_follower"]'
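The 10 % criterion can be read as a relative error per joint. A minimal sketch of that check (the exact formula Lerobot uses may differ):

```python
def joint_within_tolerance(expected_deg: float, measured_deg: float,
                           tol: float = 0.10) -> bool:
    """True if the measured joint angle is within `tol` (10 %) of the expected angle."""
    return abs(measured_deg - expected_deg) <= tol * abs(expected_deg)

# A joint commanded to 90 deg that reads 85 deg passes; one that reads 75 deg fails.
```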

4.1 Adjust PID Parameters to Reduce Shakiness

The default P coefficient (32) causes visible vibration. Lower it to 12, and set the I and D coefficients to 0, in

lerobot/common/robot_devices/robots/manipulator.py

under the set_so100_robot_preset method.

# Lower P_Coefficient to reduce shakiness (default is 32)
self.follower_arms[name].write("P_Coefficient", 12)
# Disable the I and D terms
self.follower_arms[name].write("I_Coefficient", 0)
self.follower_arms[name].write("D_Coefficient", 0)

5. Teleoperate with Cameras

After attaching two USB cameras, verify their indices with the control script.

python lerobot/scripts/control_robot.py \
  --robot.type=so100 \
  --control.type=teleoperate

Update the So100RobotConfig in lerobot/common/robot_devices/robots/configs.py to set the correct camera_index for each camera (e.g., 0 for the laptop camera, 1 for the phone camera) and the desired resolution (640×480 at 30 fps).

cameras: dict[str, CameraConfig] = field(
    default_factory=lambda: {
        "laptop": OpenCVCameraConfig(camera_index=0, fps=30, width=640, height=480),
        "phone": OpenCVCameraConfig(camera_index=1, fps=30, width=640, height=480),
    }
)

6. Record a Dataset

Adjust the camera positions (one top‑down, one side view) and ensure a clean background. Then run the recording command to collect a few episodes of a simple grasp‑and‑place task.

python lerobot/scripts/control_robot.py \
  --robot.type=so100 \
  --control.type=record \
  --control.fps=30 \
  --control.single_task="Grasp a lego block and put it in the bin." \
  --control.repo_id=mytest/so100_test \
  --control.tags='["so100","tutorial"]' \
  --control.warmup_time_s=5 \
  --control.episode_time_s=20 \
  --control.reset_time_s=20 \
  --control.num_episodes=5 \
  --control.push_to_hub=false
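With the flags above, the wall-clock budget of a session is easy to estimate (assuming warmup runs once and every episode is followed by a reset phase; check the script's actual flow):

```python
def session_seconds(num_episodes: int, episode_time_s: float,
                    reset_time_s: float, warmup_time_s: float) -> float:
    """Lower-bound wall-clock duration of a recording session."""
    return warmup_time_s + num_episodes * (episode_time_s + reset_time_s)

# 5 episodes of 20 s with 20 s resets and a 5 s warmup:
print(session_seconds(5, 20, 20, 5))  # -> 205.0
```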

7. Train Locally

If a GPU is available, start training the ACT policy, an imitation-learning approach, for 100 000 steps (≈2 h on an RTX 4090). The trained model is saved under outputs/train/act_so100.

nohup python lerobot/scripts/train.py \
  --dataset.repo_id=mytest/so100_test \
  --policy.type=act \
  --output_dir=outputs/train/act_so100 \
  --job_name=act_so100 \
  --device=cuda \
  --wandb.enable=false \
  --dataset.local_files_only=true &

8. Cloud Training (Optional)

Rent a cloud GPU (e.g., an RTX 4090 on AutoDL or 9GPU) and repeat the environment setup, dataset upload (via FTP or OSS), and training steps exactly as on the local machine.

9. Evaluate the Trained Model

After training, run the evaluation script. The robot should automatically grasp the target object and place it in the bin.

python lerobot/scripts/control_robot.py \
  --robot.type=so100 \
  --control.type=record \
  --control.fps=30 \
  --control.single_task="Grasp a lego block and put it in the bin." \
  --control.repo_id=stevenbobo/eval_act_so100_test \
  --control.tags='["tutorial"]' \
  --control.warmup_time_s=5 \
  --control.episode_time_s=30 \
  --control.reset_time_s=30 \
  --control.num_episodes=10 \
  --control.push_to_hub=true \
  --control.local_files_only=true \
  --policy.path=outputs/train/act_so100/checkpoints/last/pretrained_model
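A simple way to summarize the ten evaluation episodes is a success rate. This helper is illustrative, not something Lerobot requires:

```python
def success_rate(outcomes: list[bool]) -> float:
    """Fraction of evaluation episodes where the object ended up in the bin."""
    if not outcomes:
        return 0.0
    return sum(outcomes) / len(outcomes)

# e.g. 7 successes out of 10 episodes -> 0.7
```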

Conclusion

Building a Lerobot SO-ARM100 from scratch involves hardware assembly, servo configuration, PID tuning, camera integration, dataset collection, and policy training. With AI assistance the author resolved many pitfalls, ending up with a functional teleoperated arm that performs simple grasping tasks and is ready for more complex experiments such as cloth folding.

[Images: robot assembly, overall workflow diagram, gear issue warning, loose servo gear, power plug issue, training progress, camera placement]
Tags: Python, Reinforcement learning, Lerobot, robotic arm, SO-ARM100, teleoperation
Written by

ShiZhen AI

Tech blogger with over 10 years of experience at leading tech firms; AI efficiency and delivery expert focusing on AI productivity. Covers tech gadgets, AI-driven efficiency, and leisure (AI leisure community). 🛰 szzdzhp001
