compose / transporter

Sync data between persistence engines, like ETL only not stodgy
https://github.com/compose/transporter/issues/523
BSD 3-Clause "New" or "Revised" License

Transporter doesn't copy records from mongodb to elasticsearch #435

Closed Simone-cogno closed 6 years ago

Simone-cogno commented 6 years ago

Bug report

Records are not transferred from MongoDB to Elasticsearch.

Relevant pipeline.js:

var source = mongodb({
  "uri": "mongodb://localhost:27017/prod"
  // "timeout": "30s",
  // "tail": false,
  // "ssl": false,
  // "cacerts": ["/path/to/cert.pem"],
  // "wc": 1,
  // "fsync": false,
  // "bulk": false,
  // "collection_filters": "{}",
  // "read_preference": "Primary"
})

var sink = elasticsearch({
  "uri": "http://localhost:9200/esp-prod"
  // "timeout": "10s", // defaults to 30s
  // "aws_access_key": "ABCDEF", // used for signing requests to AWS Elasticsearch service
  // "aws_access_secret": "ABCDEF" // used for signing requests to AWS Elasticsearch service
  // "parent_id": "elastic_parent" // defaults to "elastic_parent" parent identifier for Elasticsearch
})

t.Source("source", source, "esp-prod").Save("sink", sink, "esp-prod")

log:

INFO[0000] boot map[sink:elasticsearch source:mongodb]   ts=1510134354597344905
INFO[0000] adaptor Listening...                          name=sink path="source/sink" type=elasticsearch
INFO[0000] starting with metadata map[]                  name=source path=source type=mongodb
INFO[0000] adaptor Starting...                           name=source path=source type=mongodb
INFO[0000] starting Read func                            db=prod
INFO[0000] collection count                              db=prod num_collections=13
INFO[0000] skipping iteration...                         collection=Device db=prod
INFO[0000] skipping iteration...                         collection=Home db=prod
INFO[0000] skipping iteration...                         collection=HomeSystem db=prod
INFO[0000] skipping iteration...                         collection=Measurement db=prod
INFO[0000] skipping iteration...                         collection=Newsletter db=prod
INFO[0000] skipping iteration...                         collection=PresencePreset db=prod
INFO[0000] skipping iteration...                         collection=ProgramEvent db=prod
INFO[0000] skipping iteration...                         collection=Room db=prod
INFO[0000] skipping iteration...                         collection=RoomPresencePreset db=prod
INFO[0000] skipping iteration...                         collection="_Role" db=prod
INFO[0000] skipping iteration...                         collection="_SCHEMA" db=prod
INFO[0000] skipping iteration...                         collection="_Session" db=prod
INFO[0000] skipping iteration...                         collection="_User" db=prod
INFO[0000] done iterating collections                    db=prod
INFO[0000] Read completed                                db=prod
INFO[0000] adaptor Start finished...                     name=source path=source type=mongodb
INFO[0000] adaptor Stopping...                           name=source path=source type=mongodb
INFO[0000] adaptor Stopped                               name=source path=source type=mongodb
INFO[0000] Connections to localhost:27017 closing (1 live sockets). 
INFO[0000] Socket 0xc42020a1c0 to localhost:27017: closing: Closed explicitly (abend=false) 
INFO[0000] adaptor Stopping...                           name=sink path="source/sink" type=elasticsearch
INFO[0000] received stop, message buffer is empty, closing... 
INFO[0000] adaptor Listen closed...                      name=sink path="source/sink" type=elasticsearch
INFO[0000] adaptor Stopped                               name=sink path="source/sink" type=elasticsearch
INFO[0000] closing BulkProcessor                         version=5 writer=elasticsearch
INFO[0000] metrics source records: 0                     path=source ts=1510134354603874918
INFO[0000] metrics source/sink records: 0                path="source/sink" ts=1510134354603875860
INFO[0000] exit map[source:mongodb sink:elasticsearch]   ts=1510134354603876136

System info:

Reproducible Steps:

  1. install MongoDB + Elasticsearch + transporter
  2. create a pipeline.js file with the content above
  3. run the command: transporter run pipeline.js

What did you expect to happen?

I expected the index to be filled with records from the MongoDB database.

What actually happened?

No records are transferred to the Elasticsearch index.

webpioneer commented 6 years ago

Hi @Simone-cogno ,

What was the solution to the "skipping iteration ..." messages? Thanks
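
A likely cause, inferred from the log above rather than confirmed in this thread: every collection in the prod database is logged as "skipping iteration...", which the MongoDB adaptor appears to emit when a collection name does not match the namespace passed to Source/Save. The namespace used in the reported pipeline, "esp-prod", matches none of the listed collections (Device, Home, _User, ...). The pipeline below is a minimal sketch under that assumption; it uses the catch-all namespace regex "/.*/" so every collection is read, and the regex should be narrowed if only some collections are wanted.

var source = mongodb({
  "uri": "mongodb://localhost:27017/prod"
})

var sink = elasticsearch({
  "uri": "http://localhost:9200/esp-prod"
})

// The third argument is a namespace regex matched against MongoDB collection names.
// Assumption: the original value "esp-prod" matched no collection, hence the
// "skipping iteration..." lines; "/.*/" lets every collection through.
t.Source("source", source, "/.*/").Save("sink", sink, "/.*/")

If that assumption holds, a run with this pipeline should report nonzero record counts in the closing "metrics" lines instead of 0.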