Hi Raffaele,
Currently there is no configuration option or other easy way to set the import batch size per job. This will become possible in the next patch release or later.
If you cannot wait, here is some advice: decorating "oro_importexport.async.pre_cli_import" or "oro_importexport.async.pre_http_import" will not help. The main goal is instead to prevent the BatchFileManager from splitting the import file into default-sized batches. You need to find a way to pass the batch size to the options of Oro\Bundle\ImportExportBundle\File\BatchFileManager from Oro\Bundle\ImportExportBundle\Handler\ImportHandler (or AbstractImportHandler in v3.1), which in its turn can take it, for example, from Oro\Bundle\BatchBundle\Step\ItemStep, because ItemStep can be configured in batch_jobs.yml; a rough sketch of this idea follows the configuration example below.
The import batch size can be configured in Resources/config/batch_jobs.yml for each step of a job by specifying batch_size under the parameters section of the step, e.g.:
```yaml
connector:
    name: oro_importexport
    jobs:
        category_import_from_csv:
            title: "Category Import from CSV"
            type: import
            steps:
                import:
                    title: import
                    class: Oro\Bundle\BatchBundle\Step\ItemStep
                    services:
                        reader: oro_importexport.reader.csv
                        processor: oro_importexport.processor.import_delegate
                        writer: oro_catalog.importexport.writer.category
                    parameters:
                        batch_size: 10000
```
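To make the workaround above a bit more concrete, here is a minimal, untested sketch of the glue code it implies. It is not a drop-in implementation: the bundle namespace, the ProductBatchSizeConfigurator class, and the setBatchSize() call are hypothetical names used only for illustration, and how BatchFileManager actually accepts a batch size (constructor argument, setter, or configuration options) depends on your Oro version, so check the class before copying anything.

```php
<?php
// Hypothetical sketch only: shows WHERE a product-specific batch size could be
// applied, not the exact Oro API to do it.

namespace Acme\Bundle\ImportBundle\Import;

use Oro\Bundle\ImportExportBundle\File\BatchFileManager;

class ProductBatchSizeConfigurator
{
    /** @var BatchFileManager */
    private $batchFileManager;

    /** @var int Batch size read from batch_jobs.yml (the ItemStep "batch_size" parameter). */
    private $productBatchSize;

    public function __construct(BatchFileManager $batchFileManager, int $productBatchSize)
    {
        $this->batchFileManager = $batchFileManager;
        $this->productBatchSize = $productBatchSize;
    }

    /**
     * Intended to be called from a custom import handler (ImportHandler, or
     * AbstractImportHandler in v3.1) right before the uploaded file is split
     * into batches, and only for Product imports.
     */
    public function configureFor(string $entityClass): void
    {
        // Adjust the FQCN if your product entity differs.
        if ($entityClass !== 'Oro\Bundle\ProductBundle\Entity\Product') {
            return;
        }

        // HYPOTHETICAL call: replace with the real way your Oro version lets you
        // change the batch size on BatchFileManager before splitFile() runs.
        $this->batchFileManager->setBatchSize($this->productBatchSize);
    }
}
```

The point of the sketch is only the flow: the per-step batch_size from batch_jobs.yml ends up in the import handler, which pushes it into BatchFileManager before the file is split, so the global oro_importexport.import.size_of_batch parameter stays untouched.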
How is it possible to set the batch size for the products import without overriding the global parameter "oro_importexport.import.size_of_batch"?
I thought about decorating the oro_importexport.async.pre_cli_import and oro_importexport.async.pre_http_import services and setting the batch size when the entity is Product (or leaving it as injected from the decorated service otherwise), but this property is protected and has no setter method.
Is there another way to do that?
Thanks a lot :)