Stream processing

Turbocharge your business with our real-time data stream processing service for analytics, machine learning, IoT, and more.

proven & trusted data processing technologies

We have built ODF on a state-of-the-art tech stack that enables you to build an end-to-end streaming pipeline at scale. The architecture behind ODF consists of three modules, each capable of handling hundreds of thousands of events per second.



connect any data source

The collector module ingests and stores various inputs. These can be sent from anything connected to the Internet: websites, apps, backend services, sensors, smart devices, and more. The collector exposes a powerful REST API.
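The collector's actual endpoint and event schema aren't shown here, so as a minimal sketch, assuming a JSON-over-HTTP endpoint (the URL and envelope fields below are illustrative, not the real ODF API), ingestion from a client could look like:

```python
import json
import urllib.request

# Illustrative endpoint; substitute your collector's real ingestion URL.
COLLECTOR_URL = "https://collector.example.com/v1/events"

def build_event(source, payload):
    """Wrap a reading in a minimal JSON envelope for ingestion."""
    return {"source": source, "data": payload}

def send_event(url, event):
    """POST the event as JSON and return the HTTP status code."""
    req = urllib.request.Request(
        url,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

event = build_event("air-sensor-42", {"pm25": 17.3, "unit": "ug/m3"})
# send_event(COLLECTOR_URL, event)  # uncomment with a real endpoint
```

The same pattern applies whether the sender is a website, a backend service, or an IoT device: anything that can issue an HTTP POST can feed the collector.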


analyze & transform in real-time

The processor module takes the data passed by the collector and runs it through user-defined operations. It features a built-in autoscaler that keeps your mission-critical workloads processed with minimal latency.
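ODF's interface for user-defined operations isn't specified here, so the following is only a conceptual sketch of the idea: operations are plain functions chained over the stream, with records dropped when an operation rejects them (the `sanitize` rule and field names are invented for illustration).

```python
def sanitize(record):
    """A user-defined operation: drop implausible readings, normalize precision."""
    value = record.get("pm25")
    if value is None or not (0 <= value <= 1000):
        return None  # discard malformed or out-of-range readings
    return {**record, "pm25": round(value, 1)}

def process(stream, *operations):
    """Run each record through the chain of operations, skipping dropped ones."""
    for record in stream:
        for op in operations:
            record = op(record)
            if record is None:
                break  # an operation rejected this record
        else:
            yield record

raw = [{"pm25": 17.34}, {"pm25": -5.0}, {"pm25": 42.0}]
clean = list(process(raw, sanitize))  # the negative reading is dropped
```

Because each operation is an independent function, the same pipeline shape accommodates filtering, enrichment, and aggregation steps alike.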


send or store the output

The emitter module is responsible for sending the processed data to a supported database (Redis or Elasticsearch). You can also store it securely in Oktawave Cloud Storage.
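The emitter's internals aren't documented here, but as a hedged sketch of the hand-off with Redis as the sink (the key scheme and record fields below are assumptions, not ODF's actual layout), a processed record could be mapped to a key/value pair like this:

```python
import json

def to_redis_entry(record, prefix="odf"):
    """Map a processed record to a Redis key/value pair.

    The key scheme (prefix:source:timestamp) is illustrative only.
    """
    key = "{}:{}:{}".format(prefix, record["source"], int(record["ts"]))
    return key, json.dumps(record, sort_keys=True)

record = {"source": "air-sensor-42", "ts": 1700000000, "pm25": 17.3}
key, value = to_redis_entry(record)
# With the community redis-py client, the write itself would be e.g.:
# redis.Redis(host="localhost").set(key, value)
```

A timestamped key scheme like this keeps per-source readings queryable by range, and the JSON value round-trips cleanly into Elasticsearch or object storage as well.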

use cases

build data-fuelled applications

smart devices

Distributed IoT devices stream logs, and the collector transfers them to the cloud for further analysis.

air pollution

Air quality sensors stream their measurements. The processor sanitizes the data and passes it down to the emitter.

stock quotes

Raw transaction data is streamed to the collector. The processor applies its calculations and the emitter outputs current stock quotes.

road traffic

Mobile applications send GPS data to the collector. The output is real-time reporting on current road conditions.

build your own

Ready to tap into the power of data streams? Contact us for a free consultation. Our experts can help you build a PoC or a production-ready application.

Contact our experts for more details

+48 22 10 10 555