.. |makinage-logo| image:: https://github.com/maki-nage/makinage/raw/master/asset/makinage_logo.png

|makinage-logo|

Stream Processing Made Easy
.. image:: https://badge.fury.io/py/makinage.svg
    :target: https://badge.fury.io/py/makinage

.. image:: https://github.com/maki-nage/makinage/workflows/Python%20package/badge.svg
    :target: https://github.com/maki-nage/makinage/actions?query=workflow%3A%22Python+package%22
    :alt: Github WorkFlows

.. image:: https://github.com/maki-nage/makinage/raw/master/asset/docs_download.svg
    :target: https://www.makinage.org/doc/makinage-book/latest/index.html
    :alt: Documentation
Maki Nage is a Python stream processing library and framework. It provides expressive and extensible APIs that speed up the development of stream applications. It can process both stream and batch data. More than that, it lets you develop an application with batch data and then deploy it as a Kafka micro-service.
`Read the doc <https://www.makinage.org/doc/makinage-book/latest/index.html>`_
to learn more.
.. image:: https://github.com/maki-nage/makinage/raw/master/asset/graph.png
    :width: 50%
Maki Nage is based on `ReactiveX <https://github.com/ReactiveX/RxPY>`_.

Maki Nage is available on PyPI:
.. code:: console

    pip install makinage
As an example, here is an operator that computes a rolling mean:

.. code:: Python

    import rx
    import rxsci as rs

    def rolling_mean():
        return rx.pipe(
            rs.data.roll(window=3, stride=3, pipeline=rx.pipe(
                rs.math.mean(reduce=True),
            )),
        )
You can test your code with in-memory Python data or from a CSV file.
.. code:: Python

    data = [1, 2, 3, 4, 5, 6, 7]
    rx.from_(data).pipe(
        rs.state.with_memory_store(rx.pipe(
            rolling_mean(),
        )),
    ).subscribe(
        on_next=print
    )
.. code:: console

    2.0
    5.0
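The windowing semantics above can be sketched in plain Python (no rx/rxsci dependency; same window and stride as the example): each window of three items is reduced to its mean, and the trailing incomplete window is dropped.

```python
def rolling_mean_plain(data, window=3, stride=3):
    """Plain-Python sketch of roll(window, stride) + mean(reduce=True)."""
    means = []
    for start in range(0, len(data) - window + 1, stride):
        chunk = data[start:start + window]
        means.append(sum(chunk) / len(chunk))
    return means

print(rolling_mean_plain([1, 2, 3, 4, 5, 6, 7]))  # [2.0, 5.0]
```

This is only an illustration of the semantics; the streaming version above processes items incrementally instead of indexing a list.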
To deploy the code, package it as a function:
.. code:: Python

    def my_app(config, data):
        roll_mean = rx.from_(data).pipe(
            rs.state.with_memory_store(rx.pipe(
                rolling_mean(),
            )),
        )
        return roll_mean,
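The factory above follows the operator contract shown in this README: a callable receives the operator's configuration and its source stream(s), and returns a tuple with one observable per declared sink. A minimal plain-Python stand-in (hypothetical names, lists instead of observables) illustrates the call/return shape only:

```python
def make_operator(transform):
    """Build a (config, data) -> (sink,) factory around `transform`.

    Lists stand in for rx observables here; this sketches the shape
    of the contract, not the real streaming behavior.
    """
    def factory(config, data):
        return (transform(data),)  # one entry per declared sink
    return factory

double = make_operator(lambda xs: [2 * x for x in xs])
(sink,) = double({}, [1, 2, 3])
print(sink)  # [2, 4, 6]
```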
Create a configuration file:
.. code:: yaml

    application:
        name: my_app
    kafka:
        endpoint: "localhost"
    topics:
        - name: data
        - name: features
    operators:
        compute_features:
            factory: my_app:my_app
            sources:
                - data
            sinks:
                - features
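Assuming standard YAML semantics, this configuration parses into a Python mapping like the following (a sketch; how Maki Nage consumes each field may differ). The ``factory`` entry is a ``module:callable`` reference:

```python
# Assumed parse result of the YAML configuration above.
config = {
    "application": {"name": "my_app"},
    "kafka": {"endpoint": "localhost"},
    "topics": [{"name": "data"}, {"name": "features"}],
    "operators": {
        "compute_features": {
            "factory": "my_app:my_app",
            "sources": ["data"],
            "sinks": ["features"],
        },
    },
}

# A factory string of the form "<module>:<callable>" splits into its parts.
module_name, callable_name = (
    config["operators"]["compute_features"]["factory"].split(":")
)
print(module_name, callable_name)  # my_app my_app
```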
And start it!
.. code:: console

    makinage --config myconfig.yaml
Maki Nage includes a model serving tool. With it, serving a machine learning model in streaming mode requires only a configuration file:
.. code:: yaml

    application:
        name: my_model_serving
    kafka:
        endpoint: "localhost"
    topics:
        - name: data
          encoder: makinage.encoding.json
        - name: model
          encoder: makinage.encoding.none
          start_from: last
        - name: predict
          encoder: makinage.encoding.json
    operators:
        serve:
            factory: makinage.serve:serve
            sources:
                - model
                - data
            sinks:
                - predict
    config:
        serve: {}
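The ``encoder`` entries select how topic payloads are serialized. As a rough sketch of what a JSON encoder does (the actual ``makinage.encoding.json`` API may differ), values are converted between Python objects and UTF-8 JSON bytes:

```python
import json

def encode(obj):
    # Serialize a Python object to bytes for a Kafka topic payload.
    return json.dumps(obj).encode("utf-8")

def decode(payload):
    # Parse topic payload bytes back into a Python object.
    return json.loads(payload.decode("utf-8"))

message = encode({"prediction": 0.42})
assert decode(message) == {"prediction": 0.42}
```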
Serving the model is then done the same way as any Maki Nage application:
.. code:: console

    makinage --config config.serve.yaml
Pre- and post-processing steps are possible if the input features or the predictions must be modified before or after the inference:
.. image:: https://github.com/maki-nage/makinage/raw/master/asset/serve.png
`Read the book <https://www.makinage.org/doc/makinage-book/latest/serving.html#>`_
to learn more.
* `Stream Processing Made Easy <https://towardsdatascience.com/stream-processing-made-easy-5f4892736623>`_
* `Real-Time Histogram Plots on Unbounded Data <https://www.kdnuggets.com/2021/09/real-time-histogram-plots-unbounded-data.html>`_

Maki Nage is published under the MIT License.