-
# PLATFORM: OMNIBUS
EA's Contact and Customer Data Platform
## IDEAS
Get the Contacts Sorted - Everything we do revolves around our ability to communicate to the right people, the right message at …
-
We've had a bunch of calls for scikit-learn style data pipelines. It's time.
These pipelines have to be runnable in an MRTask across the training data when building a model, and also serializabl…
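As a rough sketch of the requested interface (the class and method names below are hypothetical, not an existing H2O API), a scikit-learn-style pipeline is just an ordered list of fit/transform steps that can be applied chunk-by-chunk over the training data and pickled for later scoring:
```
# Minimal sketch of a scikit-learn-style, serializable pipeline.
# Hypothetical step classes; not an existing H2O API.
import pickle

class Standardize:
    def fit(self, frame):
        cols = list(zip(*frame))                      # column-wise view of the rows
        self.means = [sum(c) / len(c) for c in cols]
        self.stds = [max((sum((x - m) ** 2 for x in c) / len(c)) ** 0.5, 1e-12)
                     for c, m in zip(cols, self.means)]
        return self

    def transform(self, frame):
        return [[(x - m) / s for x, m, s in zip(row, self.means, self.stds)]
                for row in frame]

class DataPipeline:
    def __init__(self, steps):
        self.steps = steps

    def fit_transform(self, frame):
        for step in self.steps:
            frame = step.fit(frame).transform(frame)
        return frame

    def transform(self, frame):
        # Could be applied per chunk of rows (e.g. inside a map task) at scoring time.
        for step in self.steps:
            frame = step.transform(frame)
        return frame

pipe = DataPipeline([Standardize()])
train = pipe.fit_transform([[1.0, 10.0], [3.0, 30.0]])
blob = pickle.dumps(pipe)                 # serializable alongside the model
restored = pickle.loads(blob)
print(restored.transform([[2.0, 20.0]]))
```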
-
_This issue was transferred from a private repository_
Our processing pipeline allows customers to create a logging scope in a custom behavior. Within this scope, all log entries can include custom p…
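Whatever framework the issue targets, the underlying pattern (a scope whose custom properties are stamped onto every log entry emitted inside it) can be sketched in Python; the helper and property names here are purely illustrative:
```
# Sketch of a logging scope that attaches custom properties to every record
# emitted inside it; names are illustrative, not the platform's actual API.
import logging
from contextlib import contextmanager

_scope_props = {}

class ScopeFilter(logging.Filter):
    def filter(self, record):
        record.scope_props = dict(_scope_props)   # copy the active scope onto the record
        return True

@contextmanager
def logging_scope(**props):
    _scope_props.update(props)
    try:
        yield
    finally:
        for key in props:
            _scope_props.pop(key, None)

logger = logging.getLogger("pipeline")
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(levelname)s %(message)s %(scope_props)s"))
handler.addFilter(ScopeFilter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

with logging_scope(request_id="abc-123", customer="acme"):
    logger.info("processing message")   # this record carries the scope's properties
```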
-
## Hypothesis Statement
- Funnel Entry Date: 14/6/22
- Epic Name: Development of complex workflows for analysing MR images
- Epic Owner: @tclose
## Description
For users of AIS/NIF who ac…
-
**Related command**
```
az datafactory pipeline list --factory-name XDLKMCNINFDTFACINST001 -g XDLKMCNINFRESGP026 --debug
```
**Error stack is as follows:**
DEBUG: cli.azure.cli.core.util: Tracebac…
-
Testing on
```
pachctl 2.0.0-beta.1
pachd 2.0.0-beta.1
```
I ran a set of tests designed to exercise PFS and grpc's max transaction size on 2.0 and immediately ran …
-
Consider this Python-based prediction server code:
```
# One-time setup
model = Pipeline()
model.load_model('model.zip')

# Numerous calls to predict:
model.predict(data)
###############…
```
-
**Is your feature request related to a problem? Please describe.**
One common concern when working with _Bot Pipelines_ is _NLU_ data loss, especially when you are using the _Misunderstood_ module …
-
It would be nice to be able to export the pipeline as a JSON file and use it later on, for example if you have to create pipelines for subsets of your training data based on some filter criteria.
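As a sketch of what such an export could look like (the structure and field names below are hypothetical, not an existing schema), the pipeline definition and its filter criteria could be dumped to and reloaded from JSON:
```
# Hypothetical sketch of exporting/importing a pipeline definition as JSON;
# field names are illustrative only.
import json

pipeline_spec = {
    "name": "churn-model-eu",
    "filter": {"column": "region", "op": "==", "value": "EU"},
    "steps": [
        {"type": "impute", "strategy": "mean"},
        {"type": "standardize"},
        {"type": "gbm", "params": {"ntrees": 50, "max_depth": 5}},
    ],
}

# Export the definition
with open("pipeline_eu.json", "w") as f:
    json.dump(pipeline_spec, f, indent=2)

# Later: reload and reuse the same definition for another data subset
with open("pipeline_eu.json") as f:
    spec = json.load(f)
spec["filter"]["value"] = "APAC"        # same steps, different filter criteria
print(spec["name"], "->", spec["filter"])
```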
## …
-
Two customers would like the ability to easily create pipelines using DataBrew, with an easy-to-use GUI for building data preparations that can publish a dataset back into the data.all catalog.
…