postmanlabs / openapi-to-postman

Plugin for converting OpenAPI 3.0 specs to the Postman Collection (v2) format
Apache License 2.0

`Error Error: _props.forEach is not a function in... ` and `Error faking a schema. Not faking this schema. Schema: {` #708

Closed: mikemadeja closed this issue 1 year ago

mikemadeja commented 1 year ago

I've downloaded the OpenAPI specification from Microsoft's site.

I'm running node.js v18.16.0 on Windows along with npm version 9.5.1.

https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/latest/jobs - https://learn.microsoft.com/en-us/azure/databricks/_extras/api-refs/jobs-2.1-azure.yaml

When I run the following command: `npx openapi2postmanv2 -s .\jobs-2.1-azure.yaml -o collection2.json -p -O folderStrategy=Tags,includeAuthInfoInExample=false`

I get the following errors:

`Error Error: _props.forEach is not a function in /properties/metadata/properties/cluster_spec/properties/new_cluster`

`Error faking a schema. Not faking this schema. Schema: { type: 'object', properties: { notebook_output: { properties: [Object], type: 'object' }, sql_output: { properties: [Object], type: 'object' }, dbt_output: { properties: [Object], type: 'object' }, logs: { type: 'string', example: 'Hello World!',`

A collection file is produced, but it's missing some of the data from the OpenAPI YAML.

Here's the full output.

PS C:\Users\MichaelMadeja\OneDrive\Desktop> npx openapi2postmanv2 -s .\jobs-2.1-azure.yaml -o collection2.json -p -O folderStrategy=Tags,includeAuthInfoInExample=false Input file: C:\Users\MichaelMadeja\OneDrive\Desktop\jobs-2.1-azure.yaml Error faking a schema. Not faking this schema. Schema: { type: 'object', properties: { name: { type: 'string', example: 'A multitask job', default: 'Untitled', description: 'An optional name for the job.' }, tags: { type: 'object', example: [Object], default: '{}', description: 'A map of tags associated with the job. These are forwarded to the cluster as cluster tags for jobs clusters, and are subject to the same limitations as cluster tags. A maximum of 25 tags can be added to the job.' }, tasks: { type: 'array', maxItems: 2, description: 'A list of task specifications to be executed by this job.', example: [Array], minItems: 2, items: [Object] }, job_clusters: { type: 'array', maxItems: 2, description: 'A list of job cluster specifications that can be shared and reused by tasks of this job. Libraries cannot be declared in a shared job cluster. You must declare dependent libraries in task settings.', example: [Array], minItems: 2, items: [Object] }, email_notifications: { type: 'object', properties: [Object] }, webhook_notifications: { type: 'object', properties: [Object] }, timeout_seconds: { type: 'integer', example: 86400, description: 'An optional timeout applied to each run of this job. The default behavior is to have no timeout.', default: '<integer>' }, schedule: { required: [Array], type: 'object', properties: [Object] }, max_concurrent_runs: { type: 'integer', example: 10, description: 'An optional maximum allowed number of concurrent runs of the job.\n' + '\n' + 'Set this value if you want to be able to execute multiple runs of the same job concurrently. 
This is useful for example if you trigger your job on a frequent schedule and want to allow consecutive runs to overlap with each other, or if you want to trigger multiple runs which differ by their input parameters.\n' + '\n' + 'This setting affects only new runs. For example, suppose the job’s concurrency is 4 and there are 4 concurrent active runs. Then setting the concurrency to 3 won’t kill any of the active runs. However, from then on, new runs are skipped unless there are fewer than 3 active runs.\n' + '\n' + 'This value cannot exceed 1000\\. Setting this value to 0 causes all new runs to be skipped. The default behavior is to allow only 1 concurrent run.', default: '<integer>' }, git_source: { required: [Array] }, format: { type: 'string', enum: [Array], example: 'MULTI_TASK', description: 'Used to tell what is the format of the job. This field is ignored in Create/Update/Reset calls. When using the Jobs API 2.1 this value is always set to"MULTI_TASK".' }, access_control_list: { type: 'array', description: 'List of permissions to set on the job.', maxItems: 2, items: [Object] } } } Error Error: _props.forEach is not a function in /properties/tasks/items/0/properties/new_cluster at run (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24938:21) at jsf (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24981:14) at fakeSchema (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:785:21) at resolveRequestBodyData (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1165:20) at resolveRawModeRequestBodyForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1347:18) at resolveRequestBodyForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1411:12) at resolvePostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1820:21) 
at C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\index.js:97:49 at arrayEach (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:530:11) at Function.forEach (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:9410:14) Error faking a schema. Not faking this schema. Schema: { type: 'object', properties: { jobs: { type: 'array', description: 'The list of jobs.', items: [Object], maxItems: 2 }, has_more: { type: 'boolean', example: false, default: '<boolean>' } } } Error Error: _props.forEach is not a function in /properties/jobs/items/0/properties/settings/properties/tasks/items/0/properties/new_cluster at run (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24938:21) at jsf (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24981:14) at fakeSchema (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:785:21) at resolveRequestBodyData (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1165:20) at resolveResponseBody (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1610:16) at C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1762:64 at C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:4967:15 at baseForOwn (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:3032:24) at Function.forOwn (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:13082:24) at resolveResponseForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1756:7) Error faking a schema. Not faking this schema. Schema: { type: 'object', properties: { job_id: { type: 'integer', description: 'The canonical identifier for this job.', example: 11223344, format: 'int64', default: '<long>' }, creator_user_name: { type: 'string', example: 'user.name@databricks.com', description: 'The creator user name. 
This field won’t be included in the response if the user has been deleted.', default: '<string>' }, run_as_user_name: { type: 'string', example: 'user.name@databricks.com', description: 'The user name that the job runs as.run_as_user_nameis based on the current job settings, and is set to the creator of the job if job access control is disabled, or theis_ownerpermission if job access control is enabled.', default: '<string>' }, settings: { properties: [Object], type: 'object' }, created_time: { type: 'integer', example: 1601370337343, description: 'The time at which this job was created in epoch milliseconds (milliseconds since 1/1/1970 UTC).', format: 'int64', default: '<long>' } } } Error Error: _props.forEach is not a function in /properties/settings/properties/tasks/items/0/properties/new_cluster at run (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24938:21) at jsf (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24981:14) at fakeSchema (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:785:21) at resolveRequestBodyData (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1165:20) at resolveResponseBody (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1610:16) at C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1762:64 at C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:4967:15 at baseForOwn (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:3032:24) at Function.forOwn (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:13082:24) at resolveResponseForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1756:7) Error faking a schema. Not faking this schema. Schema: { type: 'object', required: [ 'job_id' ], properties: { job_id: { type: 'integer', example: 11223344, description: 'The canonical identifier of the job to reset. 
This field is required.', format: 'int64', default: '<long>' }, new_settings: { properties: [Object], type: 'object' } } } Error Error: _props.forEach is not a function in /properties/new_settings/properties/tasks/items/0/properties/new_cluster at run (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24938:21) at jsf (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24981:14) at fakeSchema (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:785:21) at resolveRequestBodyData (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1165:20) at resolveRawModeRequestBodyForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1347:18) at resolveRequestBodyForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1411:12) at resolvePostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1820:21) at C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\index.js:97:49 at arrayEach (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:530:11) at Function.forEach (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:9410:14) Error faking a schema. Not faking this schema. Schema: { type: 'object', required: [ 'job_id' ], properties: { job_id: { type: 'integer', example: 11223344, description: 'The canonical identifier of the job to update. This field is required.', format: 'int64', default: '<long>' }, new_settings: { properties: [Object], type: 'object' }, fields_to_remove: { type: 'array', description: 'Remove top-level fields in the job settings. Removing nested fields is not supported. 
This field is optional.', example: [Array], items: [Object], maxItems: 2 } } } Error Error: _props.forEach is not a function in /properties/new_settings/properties/tasks/items/0/properties/new_cluster at run (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24938:21) at jsf (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24981:14) at fakeSchema (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:785:21) at resolveRequestBodyData (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1165:20) at resolveRawModeRequestBodyForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1347:18) at resolveRequestBodyForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1411:12) at resolvePostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1820:21) at C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\index.js:97:49 at arrayEach (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:530:11) at Function.forEach (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:9410:14) Error faking a schema. Not faking this schema. Schema: { type: 'object', properties: { tasks: { type: 'array', maxItems: 2, example: [Array], minItems: 2, items: [Object] }, run_name: { type: 'string', example: 'A multitask job run', description: 'An optional name for the run. The default value isUntitled.', default: '<string>' }, webhook_notifications: { type: 'object', properties: [Object] }, git_source: { required: [Array] }, timeout_seconds: { type: 'integer', example: 86400, description: 'An optional timeout applied to each run of this job. 
The default behavior is to have no timeout.', default: '<integer>' }, idempotency_token: { type: 'string', example: '8f018174-4792-40d5-bcbc-3e6a527352c8', description: 'An optional token that can be used to guarantee the idempotency of job run requests. If a run with the provided token already exists, the request does not create a new run but returns the ID of the existing run instead. If a run with the provided token is deleted, an error is returned.\n' + '\n' + 'If you specify the idempotency token, upon failure you can retry until the request succeeds. Databricks guarantees that exactly one run is launched with that idempotency token.\n' + '\n' + 'This token must have at most 64 characters.\n' + '\n' + 'For more information, see [How to ensure idempotency for jobs](https://docs.microsoft.com/azure/databricks/kb/jobs/jobs-idempotency).', default: '<string>' }, access_control_list: { type: 'array', description: 'List of permissions to set on the job.', maxItems: 2, items: [Object] } } } Error Error: _props.forEach is not a function in /properties/tasks/items/0/properties/new_cluster at run (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24938:21) at jsf (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24981:14) at fakeSchema (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:785:21) at resolveRequestBodyData (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1165:20) at resolveRawModeRequestBodyForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1347:18) at resolveRequestBodyForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1411:12) at resolvePostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1820:21) at C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\index.js:97:49 at arrayEach 
(C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:530:11) at Function.forEach (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:9410:14) Error faking a schema. Not faking this schema. Schema: { type: 'object', properties: { runs: { type: 'array', description: 'A list of runs, from most recently started to least.', items: [Object], maxItems: 2 }, has_more: { type: 'boolean', description: 'If true, additional runs matching the provided filter are available for listing.', default: '<boolean>' } } } Error Error: _props.forEach is not a function in /properties/runs/items/0/properties/tasks/items/0/properties/new_cluster at run (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24938:21) at jsf (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24981:14) at fakeSchema (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:785:21) at resolveRequestBodyData (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1165:20) at resolveResponseBody (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1610:16) at C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1762:64 at C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:4967:15 at baseForOwn (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:3032:24) at Function.forOwn (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:13082:24) at resolveResponseForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1756:7) Error faking a schema. Not faking this schema. Schema: { type: 'object', properties: { job_id: { type: 'integer', example: 11223344, description: 'The canonical identifier of the job that contains this run.', format: 'int64', default: '<long>' }, run_id: { type: 'integer', example: 455644833, description: 'The canonical identifier of the run. 
This ID is unique across all runs of all jobs.', format: 'int64', default: '<long>' }, number_in_job: { type: 'integer', example: 455644833, deprecated: true, description: 'A unique identifier for this job run. This is set to the same value asrun_id.', format: 'int64', default: '<long>' }, creator_user_name: { type: 'string', example: 'user.name@databricks.com', description: 'The creator user name. This field won’t be included in the response if the user has already been deleted.', default: '<string>' }, original_attempt_run_id: { type: 'integer', example: 455644833, description: 'If this run is a retry of a prior run attempt, this field contains the run_id of the original attempt; otherwise, it is the same as the run_id.', format: 'int64', default: '<long>' }, state: { description: 'The result and lifecycle state of the run.', type: 'object', properties: [Object] }, schedule: { required: [Array], type: 'object', properties: [Object] }, tasks: { description: 'The list of tasks performed by the run. Each task has its ownrun_idwhich you can use to callJobsGetOutputto retrieve the run resutls.', type: 'array', maxItems: 2, example: [Array], minItems: 2, items: [Object] }, job_clusters: { type: 'array', maxItems: 2, description: 'A list of job cluster specifications that can be shared and reused by tasks of this job. Libraries cannot be declared in a shared job cluster. You must declare dependent libraries in task settings.', example: [Array], minItems: 2, items: [Object] }, cluster_spec: { type: 'object', properties: [Object] }, cluster_instance: { type: 'object', properties: [Object] }, git_source: { required: [Array] }, overriding_parameters: { type: 'object', properties: [Object] }, start_time: { type: 'integer', example: 1625060460483, description: 'The time at which this run was started in epoch milliseconds (milliseconds since 1/1/1970 UTC). 
This may not be the time when the job task starts executing, for example, if the job is scheduled to run on a new cluster, this is the time the cluster creation call is issued.', format: 'int64', default: '<long>' }, setup_duration: { type: 'integer', example: 0, description: 'The time in milliseconds it took to set up the cluster. For runs that run on new clusters this is the cluster creation time, for runs that run on existing clusters this time should be very short. The duration of a task run is the sum of thesetup_duration,execution_duration, and thecleanup_duration. Thesetup_durationfield is set to 0 for multitask job runs. The total duration of a multitask job run is the value of therun_durationfield.', format: 'int64', default: '<long>' }, execution_duration: { type: 'integer', example: 0, description: 'The time in milliseconds it took to execute the commands in the JAR or notebook until they completed, failed, timed out, were cancelled, or encountered an unexpected error. The duration of a task run is the sum of thesetup_duration,execution_duration, and thecleanup_duration. Theexecution_durationfield is set to 0 for multitask job runs. The total duration of a multitask job run is the value of therun_durationfield.', format: 'int64', default: '<long>' }, cleanup_duration: { type: 'integer', example: 0, description: 'The time in milliseconds it took to terminate the cluster and clean up any associated artifacts. The duration of a task run is the sum of thesetup_duration,execution_duration, and thecleanup_duration. Thecleanup_durationfield is set to 0 for multitask job runs. The total duration of a multitask job run is the value of therun_durationfield.', format: 'int64', default: '<long>' }, end_time: { type: 'integer', example: 1625060863413, description: 'The time at which this run ended in epoch milliseconds (milliseconds since 1/1/1970 UTC). 
This field is set to 0 if the job is still running.', format: 'int64', default: '<long>' }, run_duration: { type: 'integer', example: 3879812, description: 'The time in milliseconds it took the job run and all of its repairs to finish. This field is only set for multitask job runs and not task runs. The duration of a task run is the sum of thesetup_duration,execution_duration, and thecleanup_duration.', default: '<integer>' }, trigger: { type: 'string', enum: [Array], description: '*PERIODIC: Schedules that periodically trigger runs, such as a cron scheduler.\n' + '*ONE_TIME: One time triggers that fire a single run. This occurs you triggered a single run on demand through the UI or the API.\n' + '*RETRY: Indicates a run that is triggered as a retry of a previously failed run. This occurs when you request to re-run the job in case of failures.' }, run_name: { type: 'string', example: 'A multitask job run', default: 'Untitled', description: 'An optional name for the run. The maximum allowed length is 4096 bytes in UTF-8 encoding.' }, run_page_url: { type: 'string', description: 'The URL to the detail page of the run.', example: 'https://my-workspace.cloud.databricks.com/#job/11223344/run/123', default: '<string>' }, run_type: { type: 'string', example: 'JOB_RUN', enum: [Array], description: 'The type of the run.\n' + '*JOB_RUN\\- Normal job run. A run created with [Run now](https://docs.microsoft.com/azure/databricks/dev-tools/api/latest/jobs#operation/JobsRunNow). \n' + '*WORKFLOW_RUN\\- Workflow run. A run created with [dbutils.notebook.run](https://docs.microsoft.com/azure/databricks/dev-tools/databricks-utils#dbutils-workflow).\n' + '*SUBMIT_RUN\\- Submit run. A run created with [Run Submit](https://docs.microsoft.com/azure/databricks/dev-tools/api/latest/jobs#operation/JobsRunsSubmit).' }, attempt_number: { type: 'integer', example: 0, description: 'The sequence number of this run attempt for a triggered job run. 
The initial attempt of a run has an attempt_number of 0\\. If the initial run attempt fails, and the job has a retry policy (max_retries\\> 0), subsequent runs are created with anoriginal_attempt_run_idof the original attempt’s ID and an incrementingattempt_number. Runs are retried only until they succeed, and the maximumattempt_numberis the same as themax_retriesvalue for the job.', default: '<integer>' }, repair_history: { description: 'The repair history of the run.', type: 'array', maxItems: 2, items: [Object] } } } Error Error: _props.forEach is not a function in /properties/tasks/items/0/properties/new_cluster at run (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24938:21) at jsf (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24981:14) at fakeSchema (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:785:21) at resolveRequestBodyData (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1165:20) at resolveResponseBody (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1610:16) at C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1762:64 at C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:4967:15 at baseForOwn (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:3032:24) at Function.forOwn (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:13082:24) at resolveResponseForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1756:7) Error faking a schema. Not faking this schema. 
Schema: { type: 'object', properties: { notebook_output: { properties: [Object], type: 'object' }, sql_output: { properties: [Object], type: 'object' }, dbt_output: { properties: [Object], type: 'object' }, logs: { type: 'string', example: 'Hello World!', description: "The output from tasks that write to standard streams (stdout/stderr) such as [SparkJarTask](https://docs.microsoft.com/azure/databricks/dev-tools/api/latest/jobs#/components/schemas/SparkJarTask), [SparkPythonTask](https://docs.microsoft.com/azure/databricks/dev-tools/api/latest/jobs#/components/schemas/SparkPythonTask, [PythonWheelTask](https://docs.microsoft.com/azure/databricks/dev-tools/api/latest/jobs#/components/schemas/PythonWheelTask. It's not supported for the [NotebookTask](https://docs.microsoft.com/azure/databricks/dev-tools/api/latest/jobs#/components/schemas/NotebookTask, [PipelineTask](https://docs.microsoft.com/azure/databricks/dev-tools/api/latest/jobs#/components/schemas/PipelineTask, or [SparkSubmitTask](https://docs.microsoft.com/azure/databricks/dev-tools/api/latest/jobs#/components/schemas/SparkSubmitTask. Azure Databricks restricts this API to return the last 5 MB of these logs.", default: '<string>' }, logs_truncated: { type: 'boolean', example: true, description: 'Whether the logs are truncated.', default: '<boolean>' }, error: { type: 'string', example: 'ZeroDivisionError: integer division or modulo by zero', description: 'An error message indicating why a task failed or why output is not available. 
The message is unstructured, and its exact format is subject to change.', default: '<string>' }, error_trace: { type: 'string', example: '---------------------------------------------------------------------------\n' + 'Exception Traceback (most recent call last)\n' + ' 1 numerator = 42\n' + ' 2 denominator = 0\n' + '----> 3 return numerator / denominator\n' + '\n' + 'ZeroDivisionError: integer division or modulo by zero', description: 'If there was an error executing the run, this field contains any available stack traces.', default: '<string>' }, metadata: { properties: [Object], type: 'object' } } } Error Error: _props.forEach is not a function in /properties/metadata/properties/tasks/items/0/properties/new_cluster at run (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24938:21) at jsf (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\assets\json-schema-faker.js:24981:14) at fakeSchema (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:785:21) at resolveRequestBodyData (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1165:20) at resolveResponseBody (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1610:16) at C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1762:64 at C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:4967:15 at baseForOwn (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:3032:24) at Function.forOwn (C:\Users\MichaelMadeja\node_modules\lodash\lodash.js:13082:24) at resolveResponseForPostmanRequest (C:\Users\MichaelMadeja\node_modules\openapi-to-postmanv2\libV2\schemaUtils.js:1756:7) Writing to file: true C:\Users\MichaelMadeja\OneDrive\Desktop\collection2.json { result: true, output: [ { type: 'collection', data: [Object] } ], analytics: {} } Conversion successful, collection written to file PS C:\Users\MichaelMadeja\OneDrive\Desktop>

VShingala commented 1 year ago

@mikemadeja Thanks for reporting the issue! We've identified the cause and will be working on a fix. I'll post an update here once it's resolved.

VShingala commented 1 year ago

@mikemadeja It seems that this definition contains an incorrect structure for the `required` field in one of the properties.

See line no. 1591 in the definition: according to the JSON Schema specification, the `required` value there should be an array, not a string. You can update it as follows and the conversion should work fine.

```yaml
NewCluster:
      required:
      - spark_version
      properties:
...
```
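For a spec with many schemas, the same mistake can be fixed everywhere with a small script rather than by hand. Below is a minimal Node.js sketch (not part of openapi-to-postmanv2) that walks an already-parsed spec object and converts any string-valued `required` into a single-element array; the `NewCluster` shape in the example is an assumption mirroring the broken structure described above:

```javascript
// Sketch: recursively normalize `required` fields that are strings into
// arrays, which is the structure JSON Schema expects. Assumes the YAML spec
// has already been parsed into a plain JavaScript object.
function normalizeRequired(node) {
  if (Array.isArray(node)) {
    node.forEach(normalizeRequired);
  } else if (node && typeof node === 'object') {
    if (typeof node.required === 'string') {
      // e.g. required: 'spark_version'  ->  required: ['spark_version']
      node.required = [node.required];
    }
    Object.values(node).forEach(normalizeRequired);
  }
  return node;
}

// Hypothetical example mirroring the broken NewCluster schema:
const spec = {
  NewCluster: {
    required: 'spark_version',
    properties: { spark_version: { type: 'string' } }
  }
};
normalizeRequired(spec);
console.log(spec.NewCluster.required); // [ 'spark_version' ]
```

Note that OpenAPI also uses a boolean `required` on parameter objects, which is valid; this sketch only rewrites string values, so booleans are left untouched.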
mikemadeja commented 1 year ago

Thank you @VShingala!