zero323 closed this issue 4 years ago.
Can you run it with `jedi.set_debug_function()` and send me the output?
The problem is essentially that `parent_module_value` is a context, but it should be a value.
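For reference, here is a minimal sketch of how a trace like the one below can be produced. The source line, the jedi version (>= 0.16 for `complete()`), and the use of `jedi.Interpreter` are assumptions reconstructed from the trace, not the reporter's actual code:

```python
import jedi
from pyspark.sql import SparkSession  # assumption: pyspark is installed

# Enable jedi's built-in debug printer; after this call every dbg/warning/speed
# line (like the trace below) is written to stdout.
jedi.set_debug_function()

# Hypothetical source line, reconstructed from the node positions in the trace.
source = "SparkSession.builder.getOrCreate().createDataFrame([])."

# jedi.Interpreter is assumed here because the trace shows Mixed* contexts,
# which jedi only produces when completing against live interpreter objects.
interpreter = jedi.Interpreter(source, [{"SparkSession": SparkSession}])
for completion in interpreter.complete():
    print(completion.name)
```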
Here you are. Please let me know if you need anything else.
speed: init 0.09431767463684082
speed: parsed 0.09518074989318848
dbg: Start: complete
dbg: infer_node <Name: SparkSession@1,0>@(1, 0) in MixedModuleContext(<ModuleValue: @1-1 is_stub=False>)
dbg: context.goto <Name: SparkSession@1,0> in (MixedModuleContext(<ModuleValue: @1-1 is_stub=False>)): [<MixedName: (<CompiledValueName: string_name=>).SparkSession>]
dbg: context.names_to_types: [<MixedName: (<CompiledValueName: string_name=>).SparkSession>] -> S{<MixedObject: <class 'pyspark.sql.session.SparkSession'>>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: builder@1,13>]) in S{<MixedObject: <class 'pyspark.sql.session.SparkSession'>>}
dbg: context.goto <Name: builder@1,13> in (<MixedObject: <class 'pyspark.sql.session.SparkSession'>>): [<MixedName: (<ValueName: string_name=SparkSession start_pos=(63, 6)>).builder>]
dbg: global search_module 'builtins': <CompiledObject: <module 'builtins' (built-in)>>
dbg: execute: <ClassValue: <Class: Builder@80-187>> <ValuesArguments: []>
dbg: execute result: S{<TreeInstance of <ClassValue: <Class: Builder@80-187>>(<ValuesArguments: []>)>} in <ClassValue: <Class: Builder@80-187>>
dbg: context.names_to_types: [<MixedName: (<ValueName: string_name=SparkSession start_pos=(63, 6)>).builder>] -> S{<MixedObject: <pyspark.sql.session.SparkSession.Builder object a..>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: getOrCreate@1,21>]) in S{<MixedObject: <pyspark.sql.session.SparkSession.Builder object a..>}
dbg: context.goto <Name: getOrCreate@1,21> in (<MixedObject: <pyspark.sql.session.SparkSession.Builder object a..>): [<MixedName: (<ValueName: string_name=Builder start_pos=(80, 10)>).getOrCreate>]
dbg: context.names_to_types: [<MixedName: (<ValueName: string_name=Builder start_pos=(80, 10)>).getOrCreate>] -> S{<MixedObject: <bound method SparkSession.Builder.getOrCreate of ..>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, <Operator: )>]) in S{<MixedObject: <bound method SparkSession.Builder.getOrCreate of ..>}
dbg: execute: <MixedObject: <bound method SparkSession.Builder.getOrCreate of ..> <TreeArguments: None>
dbg: infer_node <Name: session@186,23>@(186, 23) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: infer_node PythonNode(or_test, [PythonNode(comparison, [<Name: session@169,19>, <Keyword: is>, <Keyword: None>]), <Keyword: or>, PythonNode(comparison, [PythonNode(atom_expr, [<Name: session@169,38>, PythonNode(trailer, [<Operator: .>, <Name: _sc@169,46>]), PythonNode(trailer, [<Operator: .>, <Name: _jsc@169,50>])]), <Keyword: is>, <Keyword: None>])])@(169, 19) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: infer_node PythonNode(comparison, [<Name: session@169,19>, <Keyword: is>, <Keyword: None>])@(169, 19) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: infer_node <Name: session@169,19>@(169, 19) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: context.goto <Name: session@169,19> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)): [<TreeNameDefinition: string_name=session start_pos=(168, 16)>]
dbg: infer_expr_stmt <ExprStmt: session = SparkSession._instantiatedSession@168,16> (<Name: session@168,16>)
dbg: infer_node PythonNode(atom_expr, [<Name: SparkSession@168,26>, PythonNode(trailer, [<Operator: .>, <Name: _instantiatedSession@168,39>])])@(168, 26) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: infer_node <Name: SparkSession@168,26>@(168, 26) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: context.goto <Name: SparkSession@168,26> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)): [<MixedName: (<CompiledValueName: string_name=session>).SparkSession>]
dbg: context.names_to_types: [<MixedName: (<CompiledValueName: string_name=session>).SparkSession>] -> S{<MixedObject: <class 'pyspark.sql.session.SparkSession'>>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: _instantiatedSession@168,39>]) in S{<MixedObject: <class 'pyspark.sql.session.SparkSession'>>}
dbg: context.goto <Name: _instantiatedSession@168,39> in (<MixedObject: <class 'pyspark.sql.session.SparkSession'>>): [<MixedName: (<ValueName: string_name=SparkSession start_pos=(63, 6)>)._instantiatedSession>]
dbg: Start: convert values
dbg: End: convert values
dbg: context.goto 'NoneType' in (<StubModuleValue: builtins@4-1651 is_stub=True>): []
dbg: context.names_to_types: [] -> S{}
dbg: context.names_to_types: [<MixedName: (<ValueName: string_name=SparkSession start_pos=(63, 6)>)._instantiatedSession>] -> S{<CompiledObject: None>}
dbg: infer_expr_stmt result S{<CompiledObject: None>}
dbg: context.names_to_types: [<TreeNameDefinition: string_name=session start_pos=(168, 16)>] -> S{<CompiledObject: None>}
dbg: infer_node <Keyword: None>@(169, 30) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: infer_or_test types S{<CompiledObject: True>}
dbg: infer_or_test types S{<CompiledObject: True>}
dbg: context.goto <Name: session@186,23> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)): [<TreeNameDefinition: string_name=session start_pos=(181, 20)>]
dbg: infer_expr_stmt <ExprStmt: session = SparkSession(sc)@181,20> (<Name: session@181,20>)
dbg: infer_node PythonNode(atom_expr, [<Name: SparkSession@181,30>, PythonNode(trailer, [<Operator: (>, <Name: sc@181,43>, <Operator: )>])])@(181, 30) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: infer_node <Name: SparkSession@181,30>@(181, 30) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: context.goto <Name: SparkSession@181,30> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)): [<MixedName: (<CompiledValueName: string_name=session>).SparkSession>]
dbg: context.names_to_types: [<MixedName: (<CompiledValueName: string_name=session>).SparkSession>] -> S{<MixedObject: <class 'pyspark.sql.session.SparkSession'>>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, <Name: sc@181,43>, <Operator: )>]) in S{<MixedObject: <class 'pyspark.sql.session.SparkSession'>>}
dbg: execute: <MixedObject: <class 'pyspark.sql.session.SparkSession'>> <TreeArguments: <Name: sc@181,43>>
dbg: execute result: S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>} in <MixedObject: <class 'pyspark.sql.session.SparkSession'>>
dbg: infer_expr_stmt result S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: context.names_to_types: [<TreeNameDefinition: string_name=session start_pos=(181, 20)>] -> S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: Return reachable: <ReturnStmt: return session@186,16>
dbg: execute result: S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>} in <MixedObject: <bound method SparkSession.Builder.getOrCreate of ..>
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: createDataFrame@1,35>]) in S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: infer_node <Name: object@63,19>@(63, 19) in MixedModuleContext(<MixedObject: <module 'pyspark.sql.session' from '/home/zero323/..>)
dbg: context.goto <Name: object@63,19> in (MixedModuleContext(<MixedObject: <module 'pyspark.sql.session' from '/home/zero323/..>)): [<StubName: string_name=object start_pos=(40, 6)>]
dbg: context.names_to_types: [<StubName: string_name=object start_pos=(40, 6)>] -> S{<ClassValue: <Class: object@40-70>>}
dbg: context.goto '__init__' in (<ClassValue: <Class: SparkSession@63-843>>): [<ClassName: string_name=__init__ start_pos=(194, 8)>]
dbg: decorator: <Decorator: @ignore_unicode_prefix@193,4> S{<MethodValue: <Function: __init__@194-234>>}
dbg: infer_node <Name: ignore_unicode_prefix@193,5>@(193, 5) in ClassContext(<ClassValue: <Class: SparkSession@63-843>>)
dbg: context.goto <Name: ignore_unicode_prefix@193,5> in (ClassContext(<ClassValue: <Class: SparkSession@63-843>>)): [<MixedName: (<CompiledValueName: string_name=session>).ignore_unicode_prefix>]
dbg: context.names_to_types: [<MixedName: (<CompiledValueName: string_name=session>).ignore_unicode_prefix>] -> S{<MixedObject: <function ignore_unicode_prefix at 0x7f615f55eb00>>}
dbg: execute: <MixedObject: <function ignore_unicode_prefix at 0x7f615f55eb00>> <ValuesArguments: [S{<MethodValue: <Function: __init__@194-234>>}]>
dbg: infer_node <Name: f@162,11>@(162, 11) in FunctionExecutionContext(<FunctionValue: <Function: ignore_unicode_prefix@152-163>>)
dbg: context.goto <Name: f@162,11> in (FunctionExecutionContext(<FunctionValue: <Function: ignore_unicode_prefix@152-163>>)): [<ParamName: string_name=f start_pos=(152, 26)>]
dbg: Found param types for docstring: S{}
dbg: context.names_to_types: [<ParamName: string_name=f start_pos=(152, 26)>] -> S{<MethodValue: <Function: __init__@194-234>>}
dbg: Return reachable: <ReturnStmt: return f@162,4>
dbg: execute result: S{<MethodValue: <Function: __init__@194-234>>} in <MixedObject: <function ignore_unicode_prefix at 0x7f615f55eb00>>
dbg: decorator end S{<MethodValue: <Function: __init__@194-234>>}
dbg: context.names_to_types: [<ClassName: string_name=__init__ start_pos=(194, 8)>] -> S{<MethodValue: <Function: __init__@194-234>>}
dbg: infer_node <Name: sc@181,43>@(181, 43) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: context.goto <Name: sc@181,43> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)): [<TreeNameDefinition: string_name=sc start_pos=(173, 20)>]
dbg: infer_expr_stmt <ExprStmt: sc = SparkContext.getOrCreate(sparkConf)@173,20> (<Name: sc@173,20>)
dbg: infer_node PythonNode(atom_expr, [<Name: SparkContext@173,25>, PythonNode(trailer, [<Operator: .>, <Name: getOrCreate@173,38>]), PythonNode(trailer, [<Operator: (>, <Name: sparkConf@173,50>, <Operator: )>])])@(173, 25) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: infer_node <Name: SparkContext@173,25>@(173, 25) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)
dbg: context.goto <Name: SparkContext@173,25> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: getOrCreate@143-187>>>)): [<TreeNameDefinition: string_name=SparkContext start_pos=(166, 44)>]
speed: import (<Name: pyspark@166,21>, <Name: context@166,29>) MixedModuleContext(<MixedObject: <module 'pyspark.sql.session' from '/home/zero323/..>) 0.1471543312072754
dbg: global search_module 'pyspark': <ModuleValue: pyspark@18-123 is_stub=False>
dbg: search_module 'pyspark.context' in paths ['/path/to/jedi/venv/lib/python3.7/site-packages/pyspark']: <ModuleValue: context@18-1111 is_stub=False>
dbg: context.goto <Name: SparkContext@166,44> in (<ModuleValue: context@18-1111 is_stub=False>): [<TreeNameDefinition: string_name=SparkContext start_pos=(60, 6)>]
dbg: context.names_to_types: [<TreeNameDefinition: string_name=SparkContext start_pos=(60, 6)>] -> S{<ClassValue: <Class: SparkContext@60-1093>>}
dbg: after import: S{<ClassValue: <Class: SparkContext@60-1093>>}
dbg: context.names_to_types: [<TreeNameDefinition: string_name=SparkContext start_pos=(166, 44)>] -> S{<ClassValue: <Class: SparkContext@60-1093>>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: getOrCreate@173,38>]) in S{<ClassValue: <Class: SparkContext@60-1093>>}
dbg: infer_node <Name: object@60,19>@(60, 19) in ModuleContext(<ModuleValue: context@18-1111 is_stub=False>)
dbg: context.goto <Name: object@60,19> in (ModuleContext(<ModuleValue: context@18-1111 is_stub=False>)): [<StubName: string_name=object start_pos=(40, 6)>]
dbg: context.names_to_types: [<StubName: string_name=object start_pos=(40, 6)>] -> S{<ClassValue: <Class: object@40-70>>}
dbg: context.goto <Name: getOrCreate@173,38> in (<ClassValue: <Class: SparkContext@60-1093>>): [<ClassName: string_name=getOrCreate start_pos=(359, 8)>]
dbg: decorator: <Decorator: @classmethod@358,4> S{<MethodValue: <Function: getOrCreate@359-369>>}
dbg: infer_node <Name: classmethod@358,5>@(358, 5) in ClassContext(<ClassValue: <Class: SparkContext@60-1093>>)
dbg: context.goto <Name: classmethod@358,5> in (ClassContext(<ClassValue: <Class: SparkContext@60-1093>>)): [<StubName: string_name=classmethod start_pos=(80, 6)>]
dbg: context.names_to_types: [<StubName: string_name=classmethod start_pos=(80, 6)>] -> S{<ClassValue: <Class: classmethod@80-88>>}
dbg: builtin start <ClassValue: <Class: classmethod@80-88>>
dbg: builtin end: S{ClassMethodObject(<TreeInstance of <ClassValue: <Class: classmethod@80-88>>(<ValuesArguments: [S{<MethodValue: <Function: getOrCreate@359-369>>}]>)>)}
dbg: decorator end S{ClassMethodObject(<TreeInstance of <ClassValue: <Class: classmethod@80-88>>(<ValuesArguments: [S{<MethodValue: <Function: getOrCreate@359-369>>}]>)>)}
dbg: infer_node <Name: object@80,18>@(80, 18) in StubModuleContext(<StubModuleValue: builtins@4-1651 is_stub=True>)
dbg: context.goto <Name: object@80,18> in (StubModuleContext(<StubModuleValue: builtins@4-1651 is_stub=True>)): [<TreeNameDefinition: string_name=object start_pos=(40, 6)>]
dbg: context.names_to_types: [<TreeNameDefinition: string_name=object start_pos=(40, 6)>] -> S{<ClassValue: <Class: object@40-70>>}
dbg: context.goto '__init__' in (<ClassValue: <Class: classmethod@80-88>>): [<ClassName: string_name=__init__ start_pos=(85, 8)>]
dbg: context.names_to_types: [<ClassName: string_name=__init__ start_pos=(85, 8)>] -> S{<MethodValue: <Function: __init__@85-86>>}
dbg: global search_module 'types': <ModuleValue: types@1-296 is_stub=False>
dbg: context.goto 'FunctionType' in (<StubModuleValue: types@6-289 is_stub=True>): [<StubName: string_name=FunctionType start_pos=(25, 6)>]
dbg: context.names_to_types: [<StubName: string_name=FunctionType start_pos=(25, 6)>] -> S{<ClassValue: <Class: FunctionType@25-38>>}
dbg: infer_node PythonNode(atom_expr, [<Name: Callable@85,26>, PythonNode(trailer, [<Operator: [>, PythonNode(subscriptlist, [<Operator: ...>, <Operator: ,>, <Name: Any@85,40>]), <Operator: ]>])])@(85, 26) in ClassContext(<ClassValue: <Class: classmethod@80-88>>)
dbg: infer_node <Name: Callable@85,26>@(85, 26) in ClassContext(<ClassValue: <Class: classmethod@80-88>>)
dbg: context.goto <Name: Callable@85,26> in (ClassContext(<ClassValue: <Class: classmethod@80-88>>)): [<TreeNameDefinition: string_name=Callable start_pos=(6, 80)>]
speed: import (<Name: typing@4,5>,) StubModuleContext(<StubModuleValue: builtins@4-1651 is_stub=True>) 0.1875472068786621
dbg: global search_module 'typing': <ModuleValue: typing@1-1650 is_stub=False>
dbg: context.goto <Name: Callable@6,80> in (<TypingModuleWrapper: typing@3-646 is_stub=True>): [TypingModuleName(<StubName: string_name=Callable start_pos=(22, 0)>)]
dbg: context.names_to_types: [TypingModuleName(<StubName: string_name=Callable start_pos=(22, 0)>)] -> S{ProxyTypingClassValue(Callable)}
dbg: after import: S{ProxyTypingClassValue(Callable)}
dbg: context.names_to_types: [<TreeNameDefinition: string_name=Callable start_pos=(6, 80)>] -> S{ProxyTypingClassValue(Callable)}
dbg: infer_node <Operator: ...>@(85, 35) in ClassContext(<ClassValue: <Class: classmethod@80-88>>)
dbg: infer_node <Name: ellipsis@1462,10>@(1462, 10) in StubModuleContext(<StubModuleValue: builtins@4-1651 is_stub=True>)
dbg: context.goto <Name: ellipsis@1462,10> in (StubModuleContext(<StubModuleValue: builtins@4-1651 is_stub=True>)): [<TreeNameDefinition: string_name=ellipsis start_pos=(1461, 6)>]
dbg: context.names_to_types: [<TreeNameDefinition: string_name=ellipsis start_pos=(1461, 6)>] -> S{<ClassValue: <Class: ellipsis@1461-1462>>}
dbg: execute: <ClassValue: <Class: ellipsis@1461-1462>> <ValuesArguments: []>
dbg: execute result: S{<TreeInstance of <ClassValue: <Class: ellipsis@1461-1462>>(<ValuesArguments: []>)>} in <ClassValue: <Class: ellipsis@1461-1462>>
dbg: infer_node <Name: Any@85,40>@(85, 40) in ClassContext(<ClassValue: <Class: classmethod@80-88>>)
dbg: context.goto <Name: Any@85,40> in (ClassContext(<ClassValue: <Class: classmethod@80-88>>)): [<TreeNameDefinition: string_name=Any start_pos=(6, 69)>]
speed: import (<Name: typing@4,5>,) StubModuleContext(<StubModuleValue: builtins@4-1651 is_stub=True>) 0.22384262084960938
dbg: context.goto <Name: Any@6,69> in (<TypingModuleWrapper: typing@3-646 is_stub=True>): [TypingModuleName(<StubName: string_name=Any start_pos=(12, 0)>)]
dbg: context.names_to_types: [TypingModuleName(<StubName: string_name=Any start_pos=(12, 0)>)] -> S{Any(Any)}
dbg: after import: S{Any(Any)}
dbg: context.names_to_types: [<TreeNameDefinition: string_name=Any start_pos=(6, 69)>] -> S{Any(Any)}
dbg: Start: Resolve lazy value wrapper
dbg: End: Resolve lazy value wrapper
dbg: py__getitem__ result: S{TypingClassValueWithIndex(Callable<LazyG>[S{<TreeInstance of <ClassValue: <Class: ellipsis@1461-1462>>(<ValuesArguments: []>)>}, S{Any(Any)}])}
dbg: Start: Resolve lazy value wrapper
dbg: End: Resolve lazy value wrapper
dbg: context.goto 'object' in (<StubModuleValue: builtins@4-1651 is_stub=True>): [<StubName: string_name=object start_pos=(40, 6)>]
dbg: context.names_to_types: [<StubName: string_name=object start_pos=(40, 6)>] -> S{<ClassValue: <Class: object@40-70>>}
dbg: param compare False: S{<ClassValue: <Class: FunctionType@25-38>>} <=> S{TypingClassValueWithIndex(Callable<LazyG>[S{<TreeInstance of <ClassValue: <Class: ellipsis@1461-1462>>(<ValuesArguments: []>)>}, S{Any(Any)}])}
dbg: Overloading no match: '__init__(self, f: Callable[..., Any]) -> None'@85 (<InstanceArguments: <ValuesArguments: [S{<MethodValue: <Function: getOrCreate@359-369>>}]>>)
dbg: context.goto '__get__' in (<TreeInstance of <ClassValue: <Class: classmethod@80-88>>(<ValuesArguments: [S{<MethodValue: <Function: getOrCreate@359-369>>}]>)>): [LazyInstanceClassName(<ClassName: string_name=__get__ start_pos=(87, 8)>)]
dbg: context.names_to_types: [LazyInstanceClassName(<ClassName: string_name=__get__ start_pos=(87, 8)>)] -> S{<BoundMethod: <MethodValue: <Function: __get__@87-88>>>}
dbg: context.names_to_types: [<ClassName: string_name=getOrCreate start_pos=(359, 8)>] -> S{ClassMethodGet(<BoundMethod: <MethodValue: <Function: __get__@87-88>>>)}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, <Name: sparkConf@173,50>, <Operator: )>]) in S{ClassMethodGet(<BoundMethod: <MethodValue: <Function: __get__@87-88>>>)}
dbg: execute: ClassMethodGet(<BoundMethod: <MethodValue: <Function: __get__@87-88>>>) <TreeArguments: <Name: sparkConf@173,50>>
dbg: execute: <MethodValue: <Function: getOrCreate@359-369>> <ClassMethodArguments: <TreeArguments: <Name: sparkConf@173,50>>>
dbg: infer_node PythonNode(atom_expr, [<Name: SparkContext@368,19>, PythonNode(trailer, [<Operator: .>, <Name: _active_spark_context@368,32>])])@(368, 19) in FunctionExecutionContext(<MethodValue: <Function: getOrCreate@359-369>>)
dbg: infer_node <Name: SparkContext@368,19>@(368, 19) in FunctionExecutionContext(<MethodValue: <Function: getOrCreate@359-369>>)
dbg: context.goto <Name: SparkContext@368,19> in (FunctionExecutionContext(<MethodValue: <Function: getOrCreate@359-369>>)): [<TreeNameDefinition: string_name=SparkContext start_pos=(60, 6)>]
dbg: context.names_to_types: [<TreeNameDefinition: string_name=SparkContext start_pos=(60, 6)>] -> S{<ClassValue: <Class: SparkContext@60-1093>>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: _active_spark_context@368,32>]) in S{<ClassValue: <Class: SparkContext@60-1093>>}
dbg: context.goto <Name: _active_spark_context@368,32> in (<ClassValue: <Class: SparkContext@60-1093>>): [<ClassName: string_name=_active_spark_context start_pos=(75, 4)>]
dbg: infer_expr_stmt <ExprStmt: _active_spark_context = None@75,4> (<Name: _active_spark_context@75,4>)
dbg: infer_node <Keyword: None>@(75, 28) in ClassContext(<ClassValue: <Class: SparkContext@60-1093>>)
dbg: infer_expr_stmt result S{<CompiledObject: None>}
warning: No __get__ defined on <CompiledObject: None>
dbg: context.names_to_types: [<ClassName: string_name=_active_spark_context start_pos=(75, 4)>] -> S{<CompiledObject: None>}
dbg: Return reachable: <ReturnStmt: return SparkContext._active_spark_context@368,12>
dbg: execute result: S{<CompiledObject: None>} in <MethodValue: <Function: getOrCreate@359-369>>
dbg: execute result: S{<CompiledObject: None>} in ClassMethodGet(<BoundMethod: <MethodValue: <Function: __get__@87-88>>>)
dbg: infer_expr_stmt result S{<CompiledObject: None>}
dbg: context.names_to_types: [<TreeNameDefinition: string_name=sc start_pos=(173, 20)>] -> S{<CompiledObject: None>}
dbg: Overloading match: '__init__(self, sparkContext, jsparkSession=None)'@194 (<InstanceArguments: <TreeArguments: <Name: sc@181,43>>>)
dbg: context.goto <Name: createDataFrame@1,35> in (<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>): [LazyInstanceClassName(<ClassName: string_name=createDataFrame start_pos=(587, 8)>)]
dbg: decorator: <Decorator: @ignore_unicode_prefix@586,4> S{<MethodValue: <Function: createDataFrame@587-754>>}
dbg: infer_node <Name: ignore_unicode_prefix@586,5>@(586, 5) in ClassContext(<ClassValue: <Class: SparkSession@63-843>>)
dbg: context.goto <Name: ignore_unicode_prefix@586,5> in (ClassContext(<ClassValue: <Class: SparkSession@63-843>>)): [<MixedName: (<CompiledValueName: string_name=session>).ignore_unicode_prefix>]
dbg: context.names_to_types: [<MixedName: (<CompiledValueName: string_name=session>).ignore_unicode_prefix>] -> S{<MixedObject: <function ignore_unicode_prefix at 0x7f615f55eb00>>}
dbg: execute: <MixedObject: <function ignore_unicode_prefix at 0x7f615f55eb00>> <ValuesArguments: [S{<MethodValue: <Function: createDataFrame@587-754>>}]>
dbg: infer_node <Name: f@162,11>@(162, 11) in FunctionExecutionContext(<FunctionValue: <Function: ignore_unicode_prefix@152-163>>)
dbg: context.goto <Name: f@162,11> in (FunctionExecutionContext(<FunctionValue: <Function: ignore_unicode_prefix@152-163>>)): [<ParamName: string_name=f start_pos=(152, 26)>]
dbg: context.names_to_types: [<ParamName: string_name=f start_pos=(152, 26)>] -> S{<MethodValue: <Function: createDataFrame@587-754>>}
dbg: Return reachable: <ReturnStmt: return f@162,4>
dbg: execute result: S{<MethodValue: <Function: createDataFrame@587-754>>} in <MixedObject: <function ignore_unicode_prefix at 0x7f615f55eb00>>
dbg: decorator end S{<MethodValue: <Function: createDataFrame@587-754>>}
dbg: decorator: <Decorator: @since(2.0)@585,4> S{<MethodValue: <Function: createDataFrame@587-754>>}
dbg: infer_node <Name: since@585,5>@(585, 5) in ClassContext(<ClassValue: <Class: SparkSession@63-843>>)
dbg: context.goto <Name: since@585,5> in (ClassContext(<ClassValue: <Class: SparkSession@63-843>>)): [<MixedName: (<CompiledValueName: string_name=session>).since>]
dbg: context.names_to_types: [<MixedName: (<CompiledValueName: string_name=session>).since>] -> S{<MixedObject: <function since at 0x7f615ff923b0>>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, <Number: 2.0>, <Operator: )>]) in S{<MixedObject: <function since at 0x7f615ff923b0>>}
dbg: execute: <MixedObject: <function since at 0x7f615ff923b0>> <TreeArguments: <Number: 2.0>>
dbg: infer_node <Name: deco@77,11>@(77, 11) in FunctionExecutionContext(<FunctionValue: <Function: since@65-78>>)
dbg: context.goto <Name: deco@77,11> in (FunctionExecutionContext(<FunctionValue: <Function: since@65-78>>)): [<TreeNameDefinition: string_name=deco start_pos=(72, 8)>]
dbg: context.names_to_types: [<TreeNameDefinition: string_name=deco start_pos=(72, 8)>] -> S{<FunctionValue: <Function: deco@72-77>>}
dbg: Return reachable: <ReturnStmt: return deco@77,4>
dbg: execute result: S{<FunctionValue: <Function: deco@72-77>>} in <MixedObject: <function since at 0x7f615ff923b0>>
dbg: execute: <FunctionValue: <Function: deco@72-77>> <ValuesArguments: [S{<MethodValue: <Function: createDataFrame@587-754>>}]>
dbg: infer_node <Name: f@76,15>@(76, 15) in FunctionExecutionContext(<FunctionValue: <Function: deco@72-77>>)
dbg: context.goto <Name: f@76,15> in (FunctionExecutionContext(<FunctionValue: <Function: deco@72-77>>)): [<ParamName: string_name=f start_pos=(72, 13)>]
dbg: Found param types for docstring: S{}
dbg: context.names_to_types: [<ParamName: string_name=f start_pos=(72, 13)>] -> S{<MethodValue: <Function: createDataFrame@587-754>>}
dbg: Return reachable: <ReturnStmt: return f@76,8>
dbg: execute result: S{<MethodValue: <Function: createDataFrame@587-754>>} in <FunctionValue: <Function: deco@72-77>>
dbg: decorator end S{<MethodValue: <Function: createDataFrame@587-754>>}
dbg: context.names_to_types: [LazyInstanceClassName(<ClassName: string_name=createDataFrame start_pos=(587, 8)>)] -> S{<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, PythonNode(atom, [<Operator: [>, <Operator: ]>]), <Operator: )>]) in S{<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>}
dbg: execute: <BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>> <TreeArguments: PythonNode(atom, [<Operator: [>, <Operator: ]>])>
dbg: infer_node PythonNode(atom_expr, [<Name: self@702,27>, PythonNode(trailer, [<Operator: .>, <Name: _create_from_pandas_with_arrow@702,32>]), PythonNode(trailer, [<Operator: (>, PythonNode(arglist, [<Name: data@702,63>, <Operator: ,>, <Name: schema@702,69>, <Operator: ,>, <Name: timezone@702,77>]), <Operator: )>])])@(702, 27) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)
dbg: infer_node <Name: self@702,27>@(702, 27) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)
dbg: context.goto <Name: self@702,27> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)): [<ParamName: string_name=self start_pos=(587, 24)>]
dbg: Found param types for docstring: S{}
dbg: context.names_to_types: [<ParamName: string_name=self start_pos=(587, 24)>] -> S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: _create_from_pandas_with_arrow@702,32>]) in S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: context.goto <Name: _create_from_pandas_with_arrow@702,32> in (<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>): [LazyInstanceClassName(<ClassName: string_name=_create_from_pandas_with_arrow start_pos=(498, 8)>)]
dbg: context.names_to_types: [LazyInstanceClassName(<ClassName: string_name=_create_from_pandas_with_arrow start_pos=(498, 8)>)] -> S{<BoundMethod: <MethodValue: <Function: _create_from_pandas_with_arrow@498-557>>>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, PythonNode(arglist, [<Name: data@702,63>, <Operator: ,>, <Name: schema@702,69>, <Operator: ,>, <Name: timezone@702,77>]), <Operator: )>]) in S{<BoundMethod: <MethodValue: <Function: _create_from_pandas_with_arrow@498-557>>>}
dbg: execute: <BoundMethod: <MethodValue: <Function: _create_from_pandas_with_arrow@498-557>>> <TreeArguments: PythonNode(arglist, [<Name: data@702,63>, <Operator: ,>, <Name: schema@702,69>, <Operator: ,>, <Name: timezone@702,77>])>
dbg: infer_node <Name: df@556,15>@(556, 15) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: _create_from_pandas_with_arrow@498-557>>>)
dbg: context.goto <Name: df@556,15> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: _create_from_pandas_with_arrow@498-557>>>)): [<TreeNameDefinition: string_name=df start_pos=(554, 8)>]
dbg: infer_expr_stmt <ExprStmt: df = DataFrame(jdf, self._wrapped)@554,8> (<Name: df@554,8>)
dbg: infer_node PythonNode(atom_expr, [<Name: DataFrame@554,13>, PythonNode(trailer, [<Operator: (>, PythonNode(arglist, [<Name: jdf@554,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@554,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@554,33>])])]), <Operator: )>])])@(554, 13) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: _create_from_pandas_with_arrow@498-557>>>)
dbg: infer_node <Name: DataFrame@554,13>@(554, 13) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: _create_from_pandas_with_arrow@498-557>>>)
dbg: context.goto <Name: DataFrame@554,13> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: _create_from_pandas_with_arrow@498-557>>>)): [<MixedName: (<CompiledValueName: string_name=session>).DataFrame>]
dbg: context.names_to_types: [<MixedName: (<CompiledValueName: string_name=session>).DataFrame>] -> S{<MixedObject: <class 'pyspark.sql.dataframe.DataFrame'>>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, PythonNode(arglist, [<Name: jdf@554,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@554,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@554,33>])])]), <Operator: )>]) in S{<MixedObject: <class 'pyspark.sql.dataframe.DataFrame'>>}
dbg: execute: <MixedObject: <class 'pyspark.sql.dataframe.DataFrame'>> <TreeArguments: PythonNode(arglist, [<Name: jdf@554,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@554,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@554,33>])])])>
dbg: execute result: S{<TreeInstance of <ClassValue: <Class: DataFrame@49-2204>>(<TreeArguments: PythonNode(arglist, [<Name: jdf@554,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@554,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@554,33>])])])>)>} in <MixedObject: <class 'pyspark.sql.dataframe.DataFrame'>>
dbg: infer_expr_stmt result S{<TreeInstance of <ClassValue: <Class: DataFrame@49-2204>>(<TreeArguments: PythonNode(arglist, [<Name: jdf@554,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@554,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@554,33>])])])>)>}
dbg: context.names_to_types: [<TreeNameDefinition: string_name=df start_pos=(554, 8)>] -> S{<TreeInstance of <ClassValue: <Class: DataFrame@49-2204>>(<TreeArguments: PythonNode(arglist, [<Name: jdf@554,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@554,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@554,33>])])])>)>}
dbg: Return reachable: <ReturnStmt: return df@556,8>
dbg: execute result: S{<TreeInstance of <ClassValue: <Class: DataFrame@49-2204>>(<TreeArguments: PythonNode(arglist, [<Name: jdf@554,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@554,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@554,33>])])])>)>} in <BoundMethod: <MethodValue: <Function: _create_from_pandas_with_arrow@498-557>>>
dbg: infer_node <Name: df@753,15>@(753, 15) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)
dbg: context.goto <Name: df@753,15> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)): [<TreeNameDefinition: string_name=df start_pos=(751, 8)>]
dbg: infer_expr_stmt <ExprStmt: df = DataFrame(jdf, self._wrapped)@751,8> (<Name: df@751,8>)
dbg: infer_node PythonNode(atom_expr, [<Name: DataFrame@751,13>, PythonNode(trailer, [<Operator: (>, PythonNode(arglist, [<Name: jdf@751,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@751,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@751,33>])])]), <Operator: )>])])@(751, 13) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)
dbg: infer_node <Name: DataFrame@751,13>@(751, 13) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)
dbg: context.goto <Name: DataFrame@751,13> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)): [<MixedName: (<CompiledValueName: string_name=session>).DataFrame>]
dbg: context.names_to_types: [<MixedName: (<CompiledValueName: string_name=session>).DataFrame>] -> S{<MixedObject: <class 'pyspark.sql.dataframe.DataFrame'>>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, PythonNode(arglist, [<Name: jdf@751,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@751,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@751,33>])])]), <Operator: )>]) in S{<MixedObject: <class 'pyspark.sql.dataframe.DataFrame'>>}
dbg: execute: <MixedObject: <class 'pyspark.sql.dataframe.DataFrame'>> <TreeArguments: PythonNode(arglist, [<Name: jdf@751,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@751,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@751,33>])])])>
dbg: execute result: S{<TreeInstance of <ClassValue: <Class: DataFrame@49-2204>>(<TreeArguments: PythonNode(arglist, [<Name: jdf@751,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@751,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@751,33>])])])>)>} in <MixedObject: <class 'pyspark.sql.dataframe.DataFrame'>>
dbg: infer_expr_stmt result S{<TreeInstance of <ClassValue: <Class: DataFrame@49-2204>>(<TreeArguments: PythonNode(arglist, [<Name: jdf@751,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@751,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@751,33>])])])>)>}
dbg: context.names_to_types: [<TreeNameDefinition: string_name=df start_pos=(751, 8)>] -> S{<TreeInstance of <ClassValue: <Class: DataFrame@49-2204>>(<TreeArguments: PythonNode(arglist, [<Name: jdf@751,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@751,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@751,33>])])])>)>}
dbg: Return reachable: <ReturnStmt: return df@753,8>
dbg: execute result: S{<TreeInstance of <ClassValue: <Class: DataFrame@49-2204>>(<TreeArguments: PythonNode(arglist, [<Name: jdf@751,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@751,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@751,33>])])])>)>, <TreeInstance of <ClassValue: <Class: DataFrame@49-2204>>(<TreeArguments: PythonNode(arglist, [<Name: jdf@554,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@554,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@554,33>])])])>)>} in <BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>
dbg: trailer completion values: S{<TreeInstance of <ClassValue: <Class: DataFrame@49-2204>>(<TreeArguments: PythonNode(arglist, [<Name: jdf@751,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@751,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@751,33>])])])>)>, <TreeInstance of <ClassValue: <Class: DataFrame@49-2204>>(<TreeArguments: PythonNode(arglist, [<Name: jdf@554,23>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@554,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@554,33>])])])>)>}
dbg: infer_node <Name: object@49,16>@(49, 16) in MixedModuleContext(<MixedObject: <module 'pyspark.sql.dataframe' from '/home/zero32..>)
dbg: context.goto <Name: object@49,16> in (MixedModuleContext(<MixedObject: <module 'pyspark.sql.dataframe' from '/home/zero32..>)): [<StubName: string_name=object start_pos=(40, 6)>]
dbg: context.names_to_types: [<StubName: string_name=object start_pos=(40, 6)>] -> S{<ClassValue: <Class: object@40-70>>}
dbg: context.goto '__init__' in (<ClassValue: <Class: DataFrame@49-2204>>): [<ClassName: string_name=__init__ start_pos=(76, 8)>]
dbg: context.names_to_types: [<ClassName: string_name=__init__ start_pos=(76, 8)>] -> S{<MethodValue: <Function: __init__@76-86>>}
dbg: infer_node <Name: jdf@751,23>@(751, 23) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)
dbg: context.goto <Name: jdf@751,23> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)): [<TreeNameDefinition: string_name=jdf start_pos=(750, 8)>]
dbg: infer_expr_stmt <ExprStmt: jdf = self._jsparkSession.applySchemaToPythonRDD(jrdd.rdd(), schema.json())@750,8> (<Name: jdf@750,8>)
dbg: infer_node PythonNode(atom_expr, [<Name: self@750,14>, PythonNode(trailer, [<Operator: .>, <Name: _jsparkSession@750,19>]), PythonNode(trailer, [<Operator: .>, <Name: applySchemaToPythonRDD@750,34>]), PythonNode(trailer, [<Operator: (>, PythonNode(arglist, [PythonNode(atom_expr, [<Name: jrdd@750,57>, PythonNode(trailer, [<Operator: .>, <Name: rdd@750,62>]), PythonNode(trailer, [<Operator: (>, <Operator: )>])]), <Operator: ,>, PythonNode(atom_expr, [<Name: schema@750,69>, PythonNode(trailer, [<Operator: .>, <Name: json@750,76>]), PythonNode(trailer, [<Operator: (>, <Operator: )>])])]), <Operator: )>])])@(750, 14) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)
dbg: infer_node <Name: self@750,14>@(750, 14) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)
dbg: context.goto <Name: self@750,14> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)): [<ParamName: string_name=self start_pos=(587, 24)>]
dbg: context.names_to_types: [<ParamName: string_name=self start_pos=(587, 24)>] -> S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: _jsparkSession@750,19>]) in S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: context.goto <Name: self@222,8> in (AnonymousMethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<InstanceExecutedParamName: string_name=self start_pos=(194, 17)>]
dbg: context.goto <Name: _jsparkSession@750,19> in (<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>): [<SelfName: string_name=_jsparkSession start_pos=(222, 13)>]
dbg: infer_expr_stmt <ExprStmt: self._jsparkSession = jsparkSession@222,8> (<Name: _jsparkSession@222,13>)
dbg: infer_node <Name: jsparkSession@222,30>@(222, 30) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: infer_node PythonNode(and_test, [PythonNode(atom_expr, [<Name: self@216,15>, PythonNode(trailer, [<Operator: .>, <Name: _jvm@216,20>]), PythonNode(trailer, [<Operator: .>, <Name: SparkSession@216,25>]), PythonNode(trailer, [<Operator: .>, <Name: getDefaultSession@216,38>]), PythonNode(trailer, [<Operator: (>, <Operator: )>]), PythonNode(trailer, [<Operator: .>, <Name: isDefined@216,58>]), PythonNode(trailer, [<Operator: (>, <Operator: )>])]), <Keyword: and>, PythonNode(not_test, [<Keyword: not>, PythonNode(atom_expr, [<Name: self@217,28>, PythonNode(trailer, [<Operator: .>, <Name: _jvm@217,33>]), PythonNode(trailer, [<Operator: .>, <Name: SparkSession@217,38>]), PythonNode(trailer, [<Operator: .>, <Name: getDefaultSession@217,51>]), PythonNode(trailer, [<Operator: (>, <Operator: )>]), PythonNode(trailer, [<Operator: .>, <Name: get@217,71>]), PythonNode(trailer, [<Operator: (>, <Operator: )>]), PythonNode(trailer, [<Operator: .>, <Name: sparkContext@218,25>]), PythonNode(trailer, [<Operator: (>, <Operator: )>]), PythonNode(trailer, [<Operator: .>, <Name: isStopped@218,40>]), PythonNode(trailer, [<Operator: (>, <Operator: )>])])])])@(216, 15) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: infer_node PythonNode(atom_expr, [<Name: self@216,15>, PythonNode(trailer, [<Operator: .>, <Name: _jvm@216,20>]), PythonNode(trailer, [<Operator: .>, <Name: SparkSession@216,25>]), PythonNode(trailer, [<Operator: .>, <Name: getDefaultSession@216,38>]), PythonNode(trailer, [<Operator: (>, <Operator: )>]), PythonNode(trailer, [<Operator: .>, <Name: isDefined@216,58>]), PythonNode(trailer, [<Operator: (>, <Operator: )>])])@(216, 15) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: infer_node <Name: self@216,15>@(216, 15) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: context.goto <Name: self@216,15> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<ParamName: string_name=self start_pos=(194, 17)>]
dbg: Found param types for docstring: S{}
dbg: context.names_to_types: [<ParamName: string_name=self start_pos=(194, 17)>] -> S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: _jvm@216,20>]) in S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: context.goto <Name: self@214,8> in (AnonymousMethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<InstanceExecutedParamName: string_name=self start_pos=(194, 17)>]
dbg: context.goto <Name: _jvm@216,20> in (<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>): [<SelfName: string_name=_jvm start_pos=(214, 13)>]
dbg: infer_expr_stmt <ExprStmt: self._jvm = self._sc._jvm@214,8> (<Name: _jvm@214,13>)
dbg: infer_node PythonNode(atom_expr, [<Name: self@214,20>, PythonNode(trailer, [<Operator: .>, <Name: _sc@214,25>]), PythonNode(trailer, [<Operator: .>, <Name: _jvm@214,29>])])@(214, 20) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: infer_node <Name: self@214,20>@(214, 20) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: context.goto <Name: self@214,20> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<ParamName: string_name=self start_pos=(194, 17)>]
dbg: Found param types for docstring: S{}
dbg: context.names_to_types: [<ParamName: string_name=self start_pos=(194, 17)>] -> S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: _sc@214,25>]) in S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: context.goto <Name: self@212,8> in (AnonymousMethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<InstanceExecutedParamName: string_name=self start_pos=(194, 17)>]
dbg: context.goto <Name: _sc@214,25> in (<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>): [<SelfName: string_name=_sc start_pos=(212, 13)>]
dbg: infer_expr_stmt <ExprStmt: self._sc = sparkContext@212,8> (<Name: _sc@212,13>)
dbg: infer_node <Name: sparkContext@212,19>@(212, 19) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: context.goto <Name: sparkContext@212,19> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<ParamName: string_name=sparkContext start_pos=(194, 23)>]
dbg: Found param types for docstring: S{}
dbg: context.names_to_types: [<ParamName: string_name=sparkContext start_pos=(194, 23)>] -> S{<CompiledObject: None>}
dbg: infer_expr_stmt result S{<CompiledObject: None>}
dbg: context.names_to_types: [<SelfName: string_name=_sc start_pos=(212, 13)>] -> S{<CompiledObject: None>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: _jvm@214,29>]) in S{<CompiledObject: None>}
dbg: context.goto <Name: _jvm@214,29> in (<CompiledObject: None>): []
warning: /path/to/jedi/venv/lib/python3.7/site-packages/pyspark/sql/session.py:214:29: E1 AttributeError: <CompiledObject: None> has no attribute <Name: _jvm@214,29>.
dbg: context.names_to_types: [] -> S{}
dbg: infer_expr_stmt result S{}
dbg: execute: <BoundMethod: <MethodValue: <Function: __getattribute__@61-62>>> <ValuesArguments: [S{<CompiledValue: <CompiledObject: '_jvm'>>}]>
dbg: infer_node <Name: Any@61,45>@(61, 45) in ClassContext(<ClassValue: <Class: object@40-70>>)
dbg: context.goto <Name: Any@61,45> in (ClassContext(<ClassValue: <Class: object@40-70>>)): [<TreeNameDefinition: string_name=Any start_pos=(6, 69)>]
dbg: context.names_to_types: [<TreeNameDefinition: string_name=Any start_pos=(6, 69)>] -> S{Any(Any)}
warning: Used Any - returned no results
dbg: execute result: S{} in <BoundMethod: <MethodValue: <Function: __getattribute__@61-62>>>
dbg: context.names_to_types: [<SelfName: string_name=_jvm start_pos=(214, 13)>] -> S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: SparkSession@216,25>]) in S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: getDefaultSession@216,38>]) in S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, <Operator: )>]) in S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: isDefined@216,58>]) in S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, <Operator: )>]) in S{}
dbg: infer_or_test types S{}
dbg: context.goto <Name: jsparkSession@222,30> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<TreeNameDefinition: string_name=jsparkSession start_pos=(221, 16)>, <TreeNameDefinition: string_name=jsparkSession start_pos=(219, 16)>, <ParamName: string_name=jsparkSession start_pos=(194, 37)>]
dbg: infer_expr_stmt <ExprStmt: jsparkSession = self._jvm.SparkSession(self._jsc.sc())@221,16> (<Name: jsparkSession@221,16>)
dbg: infer_node PythonNode(atom_expr, [<Name: self@221,32>, PythonNode(trailer, [<Operator: .>, <Name: _jvm@221,37>]), PythonNode(trailer, [<Operator: .>, <Name: SparkSession@221,42>]), PythonNode(trailer, [<Operator: (>, PythonNode(atom_expr, [<Name: self@221,55>, PythonNode(trailer, [<Operator: .>, <Name: _jsc@221,60>]), PythonNode(trailer, [<Operator: .>, <Name: sc@221,65>]), PythonNode(trailer, [<Operator: (>, <Operator: )>])]), <Operator: )>])])@(221, 32) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: infer_node <Name: self@221,32>@(221, 32) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: context.goto <Name: self@221,32> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<ParamName: string_name=self start_pos=(194, 17)>]
dbg: context.names_to_types: [<ParamName: string_name=self start_pos=(194, 17)>] -> S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: _jvm@221,37>]) in S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: context.goto <Name: self@214,8> in (AnonymousMethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<InstanceExecutedParamName: string_name=self start_pos=(194, 17)>]
dbg: context.goto <Name: _jvm@221,37> in (<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>): [<SelfName: string_name=_jvm start_pos=(214, 13)>]
dbg: infer_expr_stmt <ExprStmt: self._jvm = self._sc._jvm@214,8> (<Name: _jvm@214,13>)
dbg: infer_expr_stmt result S{}
dbg: execute: <BoundMethod: <MethodValue: <Function: __getattribute__@61-62>>> <ValuesArguments: [S{<CompiledValue: <CompiledObject: '_jvm'>>}]>
warning: Used Any - returned no results
dbg: execute result: S{} in <BoundMethod: <MethodValue: <Function: __getattribute__@61-62>>>
dbg: context.names_to_types: [<SelfName: string_name=_jvm start_pos=(214, 13)>] -> S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: SparkSession@221,42>]) in S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, PythonNode(atom_expr, [<Name: self@221,55>, PythonNode(trailer, [<Operator: .>, <Name: _jsc@221,60>]), PythonNode(trailer, [<Operator: .>, <Name: sc@221,65>]), PythonNode(trailer, [<Operator: (>, <Operator: )>])]), <Operator: )>]) in S{}
dbg: infer_expr_stmt result S{}
dbg: infer_expr_stmt <ExprStmt: jsparkSession = self._jvm.SparkSession.getDefaultSession().get()@219,16> (<Name: jsparkSession@219,16>)
dbg: infer_node PythonNode(atom_expr, [<Name: self@219,32>, PythonNode(trailer, [<Operator: .>, <Name: _jvm@219,37>]), PythonNode(trailer, [<Operator: .>, <Name: SparkSession@219,42>]), PythonNode(trailer, [<Operator: .>, <Name: getDefaultSession@219,55>]), PythonNode(trailer, [<Operator: (>, <Operator: )>]), PythonNode(trailer, [<Operator: .>, <Name: get@219,75>]), PythonNode(trailer, [<Operator: (>, <Operator: )>])])@(219, 32) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: infer_node <Name: self@219,32>@(219, 32) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: context.goto <Name: self@219,32> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<ParamName: string_name=self start_pos=(194, 17)>]
dbg: context.names_to_types: [<ParamName: string_name=self start_pos=(194, 17)>] -> S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: _jvm@219,37>]) in S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: context.goto <Name: self@214,8> in (AnonymousMethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<InstanceExecutedParamName: string_name=self start_pos=(194, 17)>]
dbg: context.goto <Name: _jvm@219,37> in (<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>): [<SelfName: string_name=_jvm start_pos=(214, 13)>]
dbg: infer_expr_stmt <ExprStmt: self._jvm = self._sc._jvm@214,8> (<Name: _jvm@214,13>)
dbg: infer_expr_stmt result S{}
dbg: execute: <BoundMethod: <MethodValue: <Function: __getattribute__@61-62>>> <ValuesArguments: [S{<CompiledValue: <CompiledObject: '_jvm'>>}]>
warning: Used Any - returned no results
dbg: execute result: S{} in <BoundMethod: <MethodValue: <Function: __getattribute__@61-62>>>
dbg: context.names_to_types: [<SelfName: string_name=_jvm start_pos=(214, 13)>] -> S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: SparkSession@219,42>]) in S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: getDefaultSession@219,55>]) in S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, <Operator: )>]) in S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: get@219,75>]) in S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, <Operator: )>]) in S{}
dbg: infer_expr_stmt result S{}
dbg: Found param types for docstring: S{}
dbg: infer_node <Keyword: None>@(194, 51) in ClassContext(<ClassValue: <Class: SparkSession@63-843>>)
dbg: context.names_to_types: [<TreeNameDefinition: string_name=jsparkSession start_pos=(221, 16)>, <TreeNameDefinition: string_name=jsparkSession start_pos=(219, 16)>, <ParamName: string_name=jsparkSession start_pos=(194, 37)>] -> S{<CompiledObject: None>}
dbg: infer_expr_stmt result S{<CompiledObject: None>}
dbg: context.names_to_types: [<SelfName: string_name=_jsparkSession start_pos=(222, 13)>] -> S{<CompiledObject: None>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: applySchemaToPythonRDD@750,34>]) in S{<CompiledObject: None>}
dbg: context.goto <Name: applySchemaToPythonRDD@750,34> in (<CompiledObject: None>): []
warning: /path/to/jedi/venv/lib/python3.7/site-packages/pyspark/sql/session.py:750:34: E1 AttributeError: <CompiledObject: None> has no attribute <Name: applySchemaToPythonRDD@750,34>.
dbg: context.names_to_types: [] -> S{}
dbg: infer_trailer: PythonNode(trailer, [<Operator: (>, PythonNode(arglist, [PythonNode(atom_expr, [<Name: jrdd@750,57>, PythonNode(trailer, [<Operator: .>, <Name: rdd@750,62>]), PythonNode(trailer, [<Operator: (>, <Operator: )>])]), <Operator: ,>, PythonNode(atom_expr, [<Name: schema@750,69>, PythonNode(trailer, [<Operator: .>, <Name: json@750,76>]), PythonNode(trailer, [<Operator: (>, <Operator: )>])])]), <Operator: )>]) in S{}
dbg: infer_expr_stmt result S{}
dbg: context.names_to_types: [<TreeNameDefinition: string_name=jdf start_pos=(750, 8)>] -> S{}
dbg: infer_node PythonNode(atom_expr, [<Name: self@751,28>, PythonNode(trailer, [<Operator: .>, <Name: _wrapped@751,33>])])@(751, 28) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)
dbg: infer_node <Name: self@751,28>@(751, 28) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)
dbg: context.goto <Name: self@751,28> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: createDataFrame@587-754>>>)): [<ParamName: string_name=self start_pos=(587, 24)>]
dbg: context.names_to_types: [<ParamName: string_name=self start_pos=(587, 24)>] -> S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: infer_trailer: PythonNode(trailer, [<Operator: .>, <Name: _wrapped@751,33>]) in S{<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>}
dbg: context.goto <Name: self@224,8> in (AnonymousMethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<InstanceExecutedParamName: string_name=self start_pos=(194, 17)>]
dbg: context.goto <Name: _wrapped@751,33> in (<TreeInstance of <ClassValue: <Class: SparkSession@63-843>>(<TreeArguments: <Name: sc@181,43>>)>): [<SelfName: string_name=_wrapped start_pos=(224, 13)>]
dbg: infer_expr_stmt <ExprStmt: self._wrapped = SQLContext(self._sc, self, self._jwrapped)@224,8> (<Name: _wrapped@224,13>)
dbg: infer_node PythonNode(atom_expr, [<Name: SQLContext@224,24>, PythonNode(trailer, [<Operator: (>, PythonNode(arglist, [PythonNode(atom_expr, [<Name: self@224,35>, PythonNode(trailer, [<Operator: .>, <Name: _sc@224,40>])]), <Operator: ,>, <Name: self@224,45>, <Operator: ,>, PythonNode(atom_expr, [<Name: self@224,51>, PythonNode(trailer, [<Operator: .>, <Name: _jwrapped@224,56>])])]), <Operator: )>])])@(224, 24) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: infer_node <Name: SQLContext@224,24>@(224, 24) in MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)
dbg: context.goto <Name: SQLContext@224,24> in (MethodExecutionContext(<BoundMethod: <MethodValue: <Function: __init__@194-234>>>)): [<TreeNameDefinition: string_name=SQLContext start_pos=(211, 40)>]
speed: import (<Name: pyspark@211,13>, <Name: sql@211,21>, <Name: context@211,25>) MixedModuleContext(<MixedObject: <module 'pyspark.sql.session' from '/home/zero323/..>) 0.3581387996673584
dbg: End: complete