Open okling opened 4 years ago
Hi Angel, could you please try the new version of obspyDMT (2.2.7)? Indeed, there were some inconsistencies between obspyDMT and the latest version of ObsPy. Thank you for reporting this.
Here is the output:
=================================
Waveforms will be retrieved from:
BGR,ETH,EMSC,GEONET,GFZ,ICGC,INGV,IPGP,IRIS,ISC,KNMI,KOERI,LMU,NCEDC,NIEP,NOA,ODC,ORFEUS,RESIF,RASPISHAKE,SCEDC,TEXNET,USGS,USP
=================================
Event(s) are based on: NEIC_USGS
#Divisions: 0
Assembling 1 xml files...
1,
cleaning up the temporary folder.
0.394233689 sec to retrieve the event info from NEIC.
#N LAT LON DEP DATETIME MAG AUTH EV_ID FLY
--------------------------------------------------------------------------------
1 51.423 -178.026 34 2018-08-15T21:56:56 6.5 None 20180815_215656.a NAN
--------------------------------------------------------------------------------
Number of events/intervals: 1
Time for retrieving and saving the event info: 0:00:00
#Events after filtering: 1
=============
metadata mode
=============
initializing files and directories...DONE
check the availability: BGR
Hi Kasra, thanks for the quick response! It is fixed in the master branch, but not in the routing_client branch, which has the --eida_token_path option. The master branch gives this message: obspyDMT: error: no such option: --eida_token_path
Could you add this option to the master branch, or alternatively resolve the compatibility issue in the routing_client branch?
Thanks again!
Hi Kasra, I have found another problem, with writing the QuakeML catalog, even in the master branch. Instead of a QuakeML catalog catalog.ml in the EVENTS-INFO folder, a file called catalog.ml.pkl is written. When I try to open (unpickle) it, I get the following message:
>>> import obspy
>>> import pickle
>>> ev_load = open("catalog.ml.pkl","r")
>>> ev_pkl = pickle.load(ev_load)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/aling/anaconda3/envs/obspy/lib/python3.7/codecs.py", line 322, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0x80 in position 0: invalid start byte
If I read it in binary mode instead:
>>> ev_load = open("catalog.ml.pkl","rb")
>>> ev_pkl = pickle.load(ev_load)
/home/aling/anaconda3/envs/obspy/lib/python3.7/site-packages/obspy/core/util/attribdict.py:97: UserWarning: Setting attribute "preferred_mag" which is not a default attribute ("event_descriptions", "comments", "picks", "amplitudes", "focal_mechanisms", "origins", "magnitudes", "station_magnitudes", "resource_id", "event_type", "event_type_certainty", "creation_info", "preferred_origin_id", "preferred_magnitude_id", "preferred_focal_mechanism_id").
warnings.warn(msg)
/home/aling/anaconda3/envs/obspy/lib/python3.7/site-packages/obspy/core/util/attribdict.py:97: UserWarning: Setting attribute "preferred_mag_type" which is not a default attribute ("event_descriptions", "comments", "picks", "amplitudes", "focal_mechanisms", "origins", "magnitudes", "station_magnitudes", "resource_id", "event_type", "event_type_certainty", "creation_info", "preferred_origin_id", "preferred_magnitude_id", "preferred_focal_mechanism_id").
warnings.warn(msg)
/home/aling/anaconda3/envs/obspy/lib/python3.7/site-packages/obspy/core/util/attribdict.py:97: UserWarning: Setting attribute "preferred_author" which is not a default attribute ("event_descriptions", "comments", "picks", "amplitudes", "focal_mechanisms", "origins", "magnitudes", "station_magnitudes", "resource_id", "event_type", "event_type_certainty", "creation_info", "preferred_origin_id", "preferred_magnitude_id", "preferred_focal_mechanism_id").
warnings.warn(msg)
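For what it's worth, the UnicodeDecodeError above is not specific to obspyDMT: pickle files are a binary stream (protocol 2+ starts with byte 0x80), so opening them in text mode makes Python try to decode them as UTF-8 and fail on that first byte. A minimal stdlib-only sketch of the failure and the fix, using a hypothetical demo.pkl rather than the actual catalog file:

```python
import os
import pickle
import tempfile

# Write a small pickle file (stands in for catalog.ml.pkl).
path = os.path.join(tempfile.mkdtemp(), "demo.pkl")
with open(path, "wb") as f:
    pickle.dump({"mag": 6.5}, f)

# Text mode ("r") decodes the bytes as UTF-8 while reading, which fails
# on the pickle header byte 0x80 -- the same error as in the traceback.
try:
    pickle.load(open(path, "r"))
except UnicodeDecodeError as err:
    print("text mode fails:", err.reason)

# Binary mode ("rb") hands pickle the raw bytes, so loading succeeds.
with open(path, "rb") as f:
    print(pickle.load(f))
```

So the "rb" version is the right way to read the file; the remaining UserWarnings are a separate (harmless-looking) attribute issue.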
Thanks for your help!
Hi Kasra,
I had an issue writing the event catalog (QuakeML) after updating ObsPy.
I was trying to download waveforms using the following command:
But I received this error message, and no QuakeML file was written in the EVENTS-INFO folder:
Here are my current packages:
The data set cannot be updated (e.g. adding more stations/channels). Any ideas on this? Thanks for your help!
Cheers, Angel