
Deploy and Explore StreamPipes: A Self‑Service Industrial IoT Toolbox

This guide introduces StreamPipes, an end‑to‑end industrial IoT toolbox for non‑technical users: it outlines key features, shows how to connect data sources, build pipelines, and visualize data, and provides step‑by‑step Docker Compose installation, configuration, and development instructions.

Java Architecture Diary

StreamPipes is a self‑service (industrial) IoT toolbox that enables non‑technical users to connect, analyze, and explore IoT data streams.

1. About Apache StreamPipes

StreamPipes is an end‑to‑end toolbox for industrial IoT with a rich graphical user interface for non‑technical users, offering the following capabilities:

Quickly connect to more than 20 industrial protocols and systems, such as OPC‑UA, PLCs, MQTT, REST, Pulsar, and Kafka.

Create data harmonization and analytics pipelines from more than 100 algorithms and data sinks, and forward data to third‑party systems.

Explore historical data with a data explorer that offers many widgets for time‑series visualization.

Display live data from sources and pipelines on real‑time dashboards, e.g., for shop‑floor monitoring.

StreamPipes is highly extensible and includes a Java SDK for creating new pipeline elements and adapters; Python support is in early development. Pipeline elements are independent microservices that can run anywhere—centrally on a server or near the edge. To apply your own machine‑learning models on real‑time data, simply write a custom processor and deploy it as a reusable pipeline element.
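Because pipeline elements run as independent microservices, a custom processor can be deployed alongside the core services simply by adding another Compose service. A minimal sketch of such an addition (the image name, service name, and environment variable are illustrative assumptions, not part of the official stack; check the StreamPipes extensions documentation for the exact configuration your version expects):

```yaml
# Hypothetical extension service appended to the StreamPipes docker-compose file.
# Image name, service name, and environment values are placeholders.
services:
  my-custom-processor:
    image: example/my-streampipes-processor:latest   # your own extension image
    depends_on:
      - backend                                      # start after the StreamPipes backend
    environment:
      - SP_HOST=my-custom-processor                  # hostname the service registers under
    restart: unless-stopped
```

Once the container is up, the new pipeline element appears in the pipeline editor like any built‑in processor.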

Additional production‑deployment features include:

Assign pipelines, data streams, and dashboards to assets for better resource organization.

Monitor metrics of pipelines and adapters.

Built‑in user and access‑permission management.

Import and export resources.

2. Usage

Connect to an OPC‑UA server and collect data with a quick three‑step configuration.

Create a pipeline that uses the Trend Detection processor and a Notification sink to detect a continuous decline.

Use the data browser for visual data analysis.

3. Installation (Docker Compose Example)

3.1 Prerequisites

Docker >= 17.06.0

Docker‑Compose >= 1.17.0 (Compose file format: 3.4)

Google Chrome (recommended), Mozilla Firefox, Microsoft Edge
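Before installing, it can be useful to confirm the prerequisites above are met. A small sketch that compares the locally installed Docker and Compose versions against the documented minimums (the `version_ge` helper is an illustrative utility, not part of StreamPipes):

```shell
#!/usr/bin/env sh
# Check installed versions against the minimums: Docker >= 17.06.0,
# docker-compose >= 1.17.0.

# Succeeds if $1 >= $2 when compared as dotted version numbers.
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

docker_v=$(docker version --format '{{.Server.Version}}' 2>/dev/null || true)
compose_v=$(docker-compose version --short 2>/dev/null || true)

version_ge "$docker_v" 17.06.0 || echo "Docker '$docker_v' is too old (need >= 17.06.0)"
version_ge "$compose_v" 1.17.0 || echo "docker-compose '$compose_v' is too old (need >= 1.17.0)"
```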

3.2 Usage

Three docker‑compose options are provided:

default : Standard installation using Kafka as the internal message broker (recommended).

nats : Standard installation using NATS as the message broker (suggested for new installations).

full : Includes experimental Flink integration.

The nats version will become the default in a future release; the current Kafka‑based installation does not yet support automatic migration to the NATS version.

Starting the default option locally is as simple as:

Note : Startup may take some time because the docker-compose up command pulls all Docker images from Docker Hub.
<code>docker-compose up -d<br># then open http://localhost</code>

After all containers start successfully, open http://localhost in a browser to complete the installation. Then switch to the pipeline editor for an interactive tour or consult the documentation for more details.
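Since the first startup can take a while, polling the UI from a script beats refreshing the browser. A small sketch (the URL and retry count are assumptions; adjust to your setup, and note this requires `curl`):

```shell
#!/usr/bin/env sh
# Poll a URL until it responds, or give up after a fixed number of attempts.
wait_for_ui() {
  url=$1
  tries=$2
  i=0
  while [ "$i" -lt "$tries" ]; do
    # -f: treat HTTP errors as failures; -sS: quiet except real errors.
    if curl -fsS -o /dev/null "$url"; then
      echo "UI is up: $url"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "UI did not come up after $tries attempts" >&2
  return 1
}

# Example: wait_for_ui http://localhost 60
```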

Stopping the default option:

<code>docker-compose down<br># To remove associated data volumes, run:<br># docker-compose down -v</code>

Starting the nats option:

<code>docker-compose -f docker-compose.nats.yml up -d<br># go to http://localhost after all services are started</code>

Stopping the nats option:

<code>docker-compose -f docker-compose.nats.yml down<br># To remove associated data volumes, run:<br># docker-compose -f docker-compose.nats.yml down -v</code>

Starting the full option is similar; just point to the docker-compose.full.yml file:

<code>docker-compose -f docker-compose.full.yml up -d<br># go to http://localhost after all services are started</code>

Stopping the full option:

<code>docker-compose -f docker-compose.full.yml down<br># To remove associated data volumes, run:<br># docker-compose -f docker-compose.full.yml down -v</code>

3.3 Service Update

To pull the latest available Docker images, run:

<code>docker-compose pull<br># or, for the full option:<br># docker-compose -f docker-compose.full.yml pull</code>

3.4 Version Update

To upgrade to another StreamPipes version, edit the .env file and modify the SP_VERSION variable:

<code>SP_VERSION=&lt;VERSION&gt;</code>
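The edit can also be scripted, which is handy when managing several deployments. A minimal sketch (the helper name and the target version are examples, not StreamPipes tooling):

```shell
#!/usr/bin/env sh
# Rewrite the SP_VERSION line of an env file in place, keeping a .bak backup.
set_sp_version() {
  # $1: path to the .env file, $2: new version string
  sed -i.bak "s/^SP_VERSION=.*/SP_VERSION=$2/" "$1"
}

# Example: set_sp_version .env 0.92.0
```

After changing the version, re-run `docker-compose pull` and `docker-compose up -d` to apply it.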

4. Build and Extend

4.1 Prerequisites

Java 17 JDK

Maven (recommended version 3.8)

NodeJS (recommended v12+) + NPM (recommended v6+)

Docker + Docker‑Compose

4.2 Build

To build the core project:

<code>mvn clean package</code>

To build the UI, switch to the ui folder and run:

<code>npm install<br>npm run build</code>
Tags: microservices, data pipelines, Docker Compose, industrial IoT, installation guide, StreamPipes