scrapinghub / portia

Visual scraping for Scrapy
BSD 3-Clause "New" or "Revised" License

[_GenericHTTPChannelProtocol,0,127.0.0.1] Unhandled Error #809

Open eromoe opened 7 years ago

eromoe commented 7 years ago

Hi, I was trying Portia (the Docker version) and got this error after setting up samples.

        2017-07-07 06:47:32.637842 [_GenericHTTPChannelProtocol,0,127.0.0.1] Unhandled Error
        Traceback (most recent call last):                                                                       
          File "/usr/local/lib/python2.7/dist-packages/autobahn/twisted/websocket.py", line 160, in _onMessageEnd
            self.onMessageEnd()                                                                                  
          File "/usr/local/lib/python2.7/dist-packages/autobahn/websocket/protocol.py", line 635, in onMessageEnd
            self._onMessage(payload, self.message_is_binary)                                                     
          File "/usr/local/lib/python2.7/dist-packages/autobahn/twisted/websocket.py", line 163, in _onMessage   
            self.onMessage(payload, isBinary)                                                                    
          File "/app/slyd/slyd/splash/ferry.py", line 317, in onMessage                                          
            wrap_callback, None, self._on_message, storage=storage, data=data)                                   
        --- <exception caught here> ---                                                                          
          File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 150, in maybeDeferred    
            result = f(*args, **kw)                                                                              
          File "/app/slyd/slyd/splash/ferry.py", line 75, in wrap_callback                                       
            result = callback(**parsed)                                                                          
          File "/app/slyd/slyd/splash/ferry.py", line 333, in _on_message                                        
            result = getattr(commands, command, lambda: None)()                                                  
          File "/app/slyd/slyd/splash/commands.py", line 75, in extract_items                                    
            c = ItemChecker(self, project, spider, sample)                                                       
          File "/app/slyd/slyd/splash/commands.py", line 293, in __init__                                        
            project=project)                                                                                     
          File "/app/slyd/slyd/splash/ferry.py", line 513, in open_spider                                        
            extractors, self.settings)                                                                           
          File "/app/slybot/slybot/spider.py", line 58, in __init__                                              
            settings, spec, item_schemas, all_extractors)                                                        
          File "/app/slybot/slybot/spider.py", line 226, in _configure_plugins                                   
            self.logger)                                                                                         
          File "/app/slybot/slybot/plugins/scrapely_annotations/annotations.py", line 89, in setup_bot           
            self.extractors.append(SlybotIBLExtractor(list(group)))                                              
          File "/app/slybot/slybot/plugins/scrapely_annotations/extraction/extractors.py", line 61, in __init__  
            for p, v in zip(parsed_templates, template_versions)                                                 
          File "/app/slybot/slybot/plugins/scrapely_annotations/extraction/extractors.py", line 70, in build_extraction_tree
            basic_extractors = ContainerExtractor.apply(template, basic_extractors)                              
          File "/app/slybot/slybot/plugins/scrapely_annotations/extraction/container_extractors.py", line 65, in apply
            extraction_tree = cls._build_extraction_tree(containers)                                             
          File "/app/slybot/slybot/plugins/scrapely_annotations/extraction/container_extractors.py", line 144, in _build_extraction_tree
            parent = containers[parent_id]                                                                       
        exceptions.KeyError: u'7c2b-4a7e-b94f#parent'                                                            
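For context on the final frame: the failing step looks up each annotation's parent container id in a dict of known containers, so a sample whose annotation still references a parent container that no longer exists in the sample data will raise exactly this `KeyError`. Below is a minimal sketch of that failure mode with hypothetical names and data (`build_extraction_tree`, the `parent_id` field, and the sample dicts are illustrative, not Portia's actual data model):

```python
# Sketch of the failure mode: a child annotation references a parent
# container id that is absent from the containers mapping.
# Hypothetical data shapes, not Portia's real sample format.

def build_extraction_tree(containers, children):
    """Attach each child annotation to its parent container.

    Raises KeyError when a child references an unknown parent id,
    mirroring the `parent = containers[parent_id]` lookup in the
    traceback above.
    """
    tree = {cid: [] for cid in containers}
    for child in children:
        parent_id = child["parent_id"]
        tree[parent_id].append(child["id"])  # KeyError if parent is gone
    return tree

containers = {"root#parent": "div"}
children = [{"id": "field-1", "parent_id": "7c2b-4a7e-b94f#parent"}]

try:
    build_extraction_tree(containers, children)
except KeyError as exc:
    print("missing parent container:", exc)
```

If this diagnosis applies, deleting and re-creating the affected sample (so its annotations are regenerated with valid parent references) is one plausible workaround to try.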
umarmurtaza commented 6 years ago

Did you find a solution to it?