ChristianTremblay / pyhaystack

Pyhaystack is a module that allows Python programs to connect to a Haystack server (project-haystack.org). Connections can be established with a Niagara platform running nhaystack, with SkySpark, and with WideSky. For this to work with the Anaconda IPython Notebook on Windows, be sure to run "python setup.py install" from the Anaconda Command Prompt; otherwise the module will be installed for the system Python but won't be available in the Anaconda IPython Notebook environment. You will need hszinc 1.3+ for this to work.
Apache License 2.0

Breaking change from SkySpark 3.0.27 to 3.0.28 #108

Open jdechalendar opened 3 years ago

jdechalendar commented 3 years ago

The following snippet

import hszinc
import pyhaystack

# SKYSPARK_URL, SKYSPARK_USERID, SKYSPARK_PWD and PROJECT are placeholders
# for the server's connection details.
session = pyhaystack.connect(
    implementation="skyspark",
    uri=SKYSPARK_URL,
    username=SKYSPARK_USERID,
    password=SKYSPARK_PWD,
    project=PROJECT,
    grid_format=hszinc.MODE_JSON,
)

print(session.find_entity("site"))
print(session.find_entity("site").result)

produces the following output

  <FindEntityOperation failed>
Traceback (most recent call last):
  File "test.py", line 31, in <module>
    print(session.find_entity("site").result)
  File "/home/jdechale/.venv/py38/lib64/python3.8/site-packages/pyhaystack/util/state.py", line 100, in result
    self._result.reraise()
  File "/home/jdechale/.venv/py38/lib64/python3.8/site-packages/pyhaystack/util/asyncexc.py", line 29, in reraise
    reraise(*self._exc_info)
  File "/home/jdechale/.venv/py38/lib64/python3.8/site-packages/six.py", line 703, in reraise
    raise value
  File "/home/jdechale/.venv/py38/lib64/python3.8/site-packages/pyhaystack/client/ops/entity.py", line 66, in _on_read
    entity_id = entity_ref.name
AttributeError: 'dict' object has no attribute 'name'

with a SkySpark server running 3.0.28; the same code was working against 3.0.27.

If I remove the grid_format argument (which I am guessing defaults to zinc for the skyspark implementation), then the issue disappears. For our application, parsing json was much faster than zinc, so we would prefer to keep that.

I'd be happy to do a bit more digging to try to resolve this, but it would be great to get a few pointers. It looks like entity_ref here is not expected to be a dict.

Does someone have thoughts on what is happening here?

sjlongland commented 3 years ago

id should only ever be a Ref, which, last time I checked, cannot be represented by a JSON object. It'd be worth turning on full logging and inspecting what the server actually spat out.
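
A quick way to do that from the client side (a sketch using only the standard library; the exact logger output depends on the pyhaystack and requests versions in use):

import http.client
import logging

# Client-side debugging: DEBUG-level logging surfaces pyhaystack/urllib3
# activity, and http.client's debuglevel dumps the raw HTTP request and
# response headers so the negotiated content type is visible.
logging.basicConfig(level=logging.DEBUG)
http.client.HTTPConnection.debuglevel = 1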

jdechalendar commented 3 years ago

Thanks for the quick reply. I was actually just looking into more detailed logs.

It looks like before the update, the server returned a Ref for id. After the update, the server returns a dictionary with a field _kind that has value 'ref': {'_kind': 'ref', 'val': 'IDENTIFIER', 'dis': 'HUMAN READABLE NAME'}
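
For comparison, here is the same Ref in both encodings (a sketch based on the project-haystack docs; hszinc currently expects only the first form):

# Legacy "JSON v3" encoding: a Ref is a string prefixed with "r:".
ref_v3 = "r:IDENTIFIER HUMAN READABLE NAME"

# Haystack 4 "Hayson" encoding: a Ref is a JSON object tagged with _kind.
ref_hayson = {"_kind": "ref", "val": "IDENTIFIER", "dis": "HUMAN READABLE NAME"}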

sjlongland commented 3 years ago

Right, so their application/json is now the previously discussed application/vnd.haystack.v1+json which is described here.

Clearly they don't use Semantic Versioning, because a jump from 3.0.27 to 3.0.28 would not imply such a drastic change.

jdechalendar commented 3 years ago

I wonder whether this is intended behavior on the Skyspark side, actually. From the blog post you referenced, I understand that you should only be getting the "Hayson" format if you specify application/vnd.haystack.v1+json in the headers. I am explicitly asking for application/json, in which case I should be getting JSON-encoded Zinc data according to the blog post.

If I try asking for application/vnd.haystack.v1+json in the headers I get Unsupported Accept type: application/vnd.haystack.v1+json

ChristianTremblay commented 3 years ago

That's not nice…

jdechalendar commented 3 years ago

Is there a parser available for hayson, or would this need to be written from scratch?

ChristianTremblay commented 3 years ago

They have a little tendency to create new things instead of reusing existing stuff....
I can only hope that this will be close enough to JSON so we don't have to rebuild a new parser... and that specs won't change too often.

Once bitten, twice shy…

sjlongland commented 3 years ago

Standards (Credit: XKCD)

A parser (and dumper) for Hayson does not yet exist, see https://github.com/widesky/hszinc/issues/35.

sjlongland commented 3 years ago

Personally, I think we need to do two things:

  1. Support "Hayson"… that's mostly a hszinc bug, but lightly touches pyhaystack too.
  2. Complain to SkySpark about using application/json to represent Hayson. There should be a setting on the server somewhere to control this… and going forward, SkySpark should use the agreed application/vnd.haystack.v1+json to represent Hayson; leaving application/json for the legacy Haystack JSON encoding. Doing otherwise is an egregious bug IMO.

jdechalendar commented 2 years ago

Coming back to this a year later. I had missed the description of this change in the Skyspark 3.0.28 build notes (see below). I was able to get the previous format by making a manual request using application/vnd.haystack+json;version=3, but I am having trouble changing the corresponding header for the http client in your SkysparkScramHaystackSession because you are disabling the use of requests.Session in favor of just using the requests module directly here.

Do you have an idea of a workaround for passing headers?

Hayson

Haystack 4 includes a new JSON encoding named Hayson which was developed by WG 792. Based on community feedback this build makes the Haystack 4 encoding the default for SkySpark.

The ioReadJson(), ioReadJsonGrid(), and ioWriteJson() funcs will now all default to Haystack 4 JSON. They each accept an options Dict that allows you to specify that v3 JSON should be used instead:

readAll(site).ioWriteJson(`io/sites.json`, {v3})

You can force the io JSON funcs to default to v3 by setting the jsonVersion tag to "3" on the io ext rec in your project.

If you are using the Haystack REST API to invoke Ops, then the application/json MIME type will result in Haystack 4 JSON. You can request Haystack 3 JSON by using the MIME type application/vnd.haystack+json;version=3. You can change the host default by applying the jsonVersion: 3 tag to your api SysMod config.
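
For anyone hitting the same issue, the manual request mentioned above looks roughly like this (a sketch only: the URL, project name and auth token below are placeholders, and obtaining the token through SkySpark's SCRAM handshake is not shown):

import requests

SKYSPARK_URL = "https://skyspark.example.com"   # placeholder
PROJECT = "demo"                                # placeholder
AUTH_TOKEN = "..."                              # placeholder, obtained via SCRAM

resp = requests.get(
    f"{SKYSPARK_URL}/api/{PROJECT}/read",
    params={"filter": "site"},
    headers={
        # Ask explicitly for the legacy JSON v3 encoding instead of Hayson.
        "Accept": "application/vnd.haystack+json;version=3",
        "Authorization": f"BEARER authToken={AUTH_TOKEN}",
    },
)
resp.raise_for_status()
print(resp.json())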

sjlongland commented 2 years ago

Not easily… See pyhaystack was designed to the previous contract, which SkySpark decided to break. That is on them. I'd work on it, but I have limited time to do this.

jdechalendar commented 2 years ago

Not easily… See pyhaystack was designed to the previous contract, which SkySpark decided to break. That is on them. I'd work on it, but I have limited time to do this.

Are you sure they broke the contract here? From the Haystack docs, Hayson is now the standard that should be returned by application/json. Json v3 is still supported for backwards compatibility, but is no longer the default.

Passing a different Accept header seems like a pretty easy way to fix this issue. Do you remember why requests.Session was disabled for pyhaystack's Skyspark client? Would re-enabling it open a whole other can of worms?
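
For context, this is the kind of thing a shared requests.Session makes trivial, and roughly what I mean by passing a different Accept header (a sketch, not pyhaystack's current API):

import requests

# Headers set once on the session are sent with every subsequent request.
s = requests.Session()
s.headers.update({"Accept": "application/vnd.haystack+json;version=3"})
# e.g. s.get(f"{SKYSPARK_URL}/api/{PROJECT}/about") would now carry that header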

If you do not have the bandwidth to support this, I understand. But this will mean moving away from pyhaystack for me, so I did want to check.

Thanks for your time

sjlongland commented 2 years ago

Are you sure they broke the contract here? From the Haystack docs, Hayson is now the standard that should be returned by application/json. Json v3 is still supported for backwards compatibility, but is no longer the default.

We (I) wrote to the standard that existed at the time. That was: application/json generated the format that they now call "JSON v3". They chose to change this, thereby no longer respecting what was the agreed-upon standard → breach of contract.

If they wanted to use a different content-type for the new Hayson standard, that'd be fine, it'd be backward-compatible with existing clients, and we could update those clients to also support the new standard… no problems. This is not what Skyspark did.

Now, I'm not saying pyhaystack can't be fixed, and I'm sure a pull request would be reviewed and merged, I am saying that I cannot personally work on it. I'd have to convince management at my workplace (who maintain and sell a competing product) to permit me to spend some time to fix this issue. Unilateral decisions on this are not possible as I am under a non-compete clause. Any work otherwise by myself would have to wait until such clauses lapsed which would take years assuming I quit my job now.

The pyhaystack HTTP interface is intended to be abstracted, so in addition to using requests, other HTTP clients can be used as well (e.g. aiohttp, tornado) if suitable code is written to support them.

It should be possible to allow switching of the Accept header for Skyspark, but I also worry what such a change will do to backward compatibility, as not everyone will be running the latest version of Skyspark.
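
One way to keep older servers working might be an explicit opt-in at connect time, roughly like this (the accept_header keyword below is hypothetical, not an existing pyhaystack option):

import hszinc
import pyhaystack

# Hypothetical sketch only: an opt-in override that defaults to today's
# behaviour, so sessions against SkySpark 3.0.27 and earlier are unaffected.
session = pyhaystack.connect(
    implementation="skyspark",
    uri=SKYSPARK_URL,
    username=SKYSPARK_USERID,
    password=SKYSPARK_PWD,
    project=PROJECT,
    grid_format=hszinc.MODE_JSON,
    accept_header="application/vnd.haystack+json;version=3",  # hypothetical kwarg
)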

jdechalendar commented 2 years ago

Understood. Does your comment about management at your workplace also mean that general maintenance of the pyhaystack library will be discontinued?

Thanks for all the work you have done on this so far.

ChristianTremblay commented 2 years ago

It is not planned to be deprecated. It is just that we built pyhaystack… not pyskyspark. @sjlongland uses it on WideSky, a competing product, and I use it with Niagara 4's nhaystack.

Most of the work required has to be done in hszinc, the parser used by pyhaystack. If I change the header in pyhaystack alone, hszinc will complain about not knowing which grid format to use. Both libraries need to be updated in a way that won't break the other implementations (WideSky and Niagara).

I agree with Stuart that application/json should have been kept for the standard JSON encoding. This change creates a situation where you ask for something common and standard and you get something else. Clearly they made this move forgetting the people who use Haystack outside of SkySpark. Haystack was supposed to be an open standard... politically, it's highly tied to one major player.

Now, as Stuart said, pyhaystack is not closed source. We would love to get PRs from people using SkySpark. I don't see why we should end up with multiple different tools to deal with Haystack on different platforms, which is the original reason Stuart joined me in the effort of building this tool (thanks to him, as I wouldn't have gone very far without him). The other question is Haystack 4... but let's keep that for another time.