S-C-O-U-T / Pyadomd

A pythonic approach to query SSAS data models.
https://pyadomd.readthedocs.io/en/latest/index.html
Apache License 2.0

Unhandled exception when using Pyadomd as generator #17

Open ghost opened 2 years ago

ghost commented 2 years ago

I was trying to improve the runtime of my application by yielding chunks from pyadomd, transforming the data and writing it immediately. Unfortunately, I ran into an unhandled exception:

File "C:\PythonEnv\CustomerProfiles2\lib\site-packages\pyadomd\pyadomd.py", line 94, in fetchmany
    l.append(next(self.fetchone()))
  File "C:\PythonEnv\CustomerProfiles2\lib\site-packages\pyadomd\pyadomd.py", line 81, in fetchone
    while(self._reader.Read()):
Microsoft.AnalysisServices.AdomdClient.AdomdUnknownResponseException: The server sent an unrecognizable response.
   at Microsoft.AnalysisServices.AdomdClient.XmlaClient.ReadEndElementS(XmlReader reader, String name, String ns)
   at Microsoft.AnalysisServices.AdomdClient.XmlaDataReader.InternalRead()

Logic to reproduce:

import sys
sys.path.append('\\Program Files\\Microsoft.NET\\ADOMD.NET\\150')  # ADOMD.NET client libraries
from pyadomd import Pyadomd

def get_data(source, query, chunk_size=10000):
    # Yield the result set in chunks so it can be transformed and written immediately
    with Pyadomd(source) as conn:
        with conn.cursor().execute(query) as cur:
            while True:
                rows = cur.fetchmany(chunk_size)
                if not rows:
                    break
                yield rows

def write_data(chunk):
    some_write_method(chunk)  # placeholder for the actual write logic

def run(source, query):
    for chunk in get_data(source, query):
        write_data(chunk)

All rows are written successfully (unless you hit a locking conflict while the Tabular model is being refreshed), but the last iteration throws the exception at the end.

Having spent some time on it, I found that the issue comes from lines 93-94 of pyadomd.py, where fetchmany tries to retrieve a next item that does not exist. For example, if there are 6 rows to extract and chunk_size is 1, 2 or 3, no exception occurs; but with a chunk_size of 4 or 5 the exception is raised, because the first chunk completes fine and the second chunk comes back incomplete.
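
For illustration, the failure mode can be reproduced without ADOMD at all. The sketch below is only a stand-in built from the two traceback frames and the behaviour described above; every name in it is hypothetical, and the real AdomdUnknownResponseException is imitated with a plain RuntimeError. The point it models is that fetchmany advances a fresh fetchone() generator once per row, so the shared reader keeps being polled after it has already reported the end of the result set.

class FakeReader:
    # Stand-in for the ADOMD data reader: Read() returns True once per row,
    # returns False once at the end of the data, and raises on any further
    # call, imitating the AdomdUnknownResponseException above.
    def __init__(self, n_rows):
        self._left = n_rows
        self._ended = False

    def Read(self):
        if self._ended:
            raise RuntimeError("The server sent an unrecognizable response.")
        if self._left == 0:
            self._ended = True
            return False
        self._left -= 1
        return True

class FakeCursor:
    # Hypothetical reconstruction of the two traceback frames: fetchone() is
    # a generator over the shared reader, and fetchmany() advances a fresh
    # generator once per row.
    def __init__(self, reader):
        self._reader = reader

    def fetchone(self):
        while self._reader.Read():
            yield ("row",)

    def fetchmany(self, size):
        rows = []
        for _ in range(size):
            try:
                rows.append(next(self.fetchone()))
            except StopIteration:
                break
        return rows

cur = FakeCursor(FakeReader(6))
print(len(cur.fetchmany(4)))   # 4
print(len(cur.fetchmany(4)))   # 2 -- Read() returns False once here, which is fine
cur.fetchmany(4)               # raises: the reader is polled again after the end

A possible workaround until fetchmany is changed: drain a single fetchone() generator and batch it with itertools.islice, so the reader is never polled again once it is exhausted. This is only a sketch against the API used in the reproduction above (Pyadomd, conn.cursor().execute(), fetchone() yielding rows, as the traceback suggests); I have not verified it against a live SSAS model.

from itertools import islice
from pyadomd import Pyadomd  # assumes the ADOMD.NET path was added to sys.path as above

def get_data(source, query, chunk_size=10000):
    # Batch a single fetchone() generator instead of calling fetchmany(),
    # so Read() is never called again after it has signalled the end.
    with Pyadomd(source) as conn:
        with conn.cursor().execute(query) as cur:
            rows = cur.fetchone()  # one generator for the whole result set
            while True:
                chunk = list(islice(rows, chunk_size))
                if not chunk:
                    break
                yield chunk

run() can consume this version exactly as in the reproduction above.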