colav / impactu

Colav Impactu Issues and Documentation
BSD 3-Clause "New" or "Revised" License

Process the data for SDS #112

Open omazapa opened 3 weeks ago

omazapa commented 3 weeks ago
omazapa commented 3 days ago

https://github.com/colav/sds-backend/blob/main/sds/plugins/InstitutionsApp.py#L618

In `aff` it does not have that structure.

https://github.com/colav/sds-backend/blob/main/sds/plugins/AuthorsApp.py#L530

https://github.com/colav/sds-backend/blob/main/sds/plugins/SubjectsApp.py#L894

We don't have that data; take it from the old records and cross-reference with COD_RH, since that must come from CvLAC.
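A minimal sketch of the cross-referencing step described above, assuming both sources are lists of dicts keyed by `COD_RH`; the field names other than `COD_RH` are hypothetical, not the real schema:

```python
def merge_on_cod_rh(old_records, cvlac_records):
    """Attach CvLAC data to old records that share the same COD_RH key."""
    # Index CvLAC records by COD_RH for O(1) lookups.
    by_cod_rh = {r["COD_RH"]: r for r in cvlac_records if "COD_RH" in r}
    merged = []
    for rec in old_records:
        cod_rh = rec.get("COD_RH")
        if cod_rh in by_cod_rh:
            # The old record wins on conflicting keys; CvLAC fills the gaps.
            merged.append({**by_cod_rh[cod_rh], **rec})
    return merged

# Hypothetical example records:
old = [{"COD_RH": "0000123456", "full_name": "Jane Doe"}]
cvlac = [{"COD_RH": "0000123456", "city": "Medellín"}]
print(merge_on_cod_rh(old, cvlac))
# → [{'COD_RH': '0000123456', 'city': 'Medellín', 'full_name': 'Jane Doe'}]
```

Records with no matching `COD_RH` are dropped here; keeping them unmerged instead would be a one-line change depending on what the loaders expect.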

Search by topic, e.g. "cirugía" (surgery).

https://github.com/colav/sds-backend/blob/main/sds/plugins/DocumentsApp.py#L77
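The topic search above would need case- and accent-insensitive matching so a query like "cirugia" still finds "Cirugía". A minimal sketch of that matching, assuming works carry a `subjects` list with `name` fields (an assumption about the schema, not the real one):

```python
import unicodedata


def normalize(text):
    """Lowercase and strip accents so 'Cirugía' normalizes to 'cirugia'."""
    nfkd = unicodedata.normalize("NFKD", text.lower())
    return "".join(c for c in nfkd if not unicodedata.combining(c))


def matches_subject(work, query):
    """True if any subject name of the work contains the normalized query."""
    q = normalize(query)
    return any(q in normalize(s.get("name", ""))
               for s in work.get("subjects", []))


print(matches_subject({"subjects": [{"name": "Cirugía general"}]}, "cirugia"))
# → True
```

In production this filtering would more likely live in the Mongo query itself (e.g. a text index with a diacritic-insensitive collation) rather than in Python, but the normalization logic is the same.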

http://172.22.0.4/app/groups?id=665e0b60aa7c2077adf683d3

```
sds_backend | File "/usr/local/lib/python3.8/dist-packages/sds/plugins/GroupsApp.py", line 693, in app_groups
sds_backend |   coauthors=self.get_coauthors(idx,start_year,end_year)
sds_backend | File "/usr/local/lib/python3.8/dist-packages/sds/plugins/GroupsApp.py", line 313, in get_coauthors
sds_backend |   for reg in self.colav_db["works"].aggregate(pipeline,allowDiskUse=True):
sds_backend | File "/usr/local/lib/python3.8/dist-packages/pymongo/collection.py", line 2392, in aggregate
sds_backend |   return self._aggregate(pipeline,
sds_backend | File "/usr/local/lib/python3.8/dist-packages/pymongo/collection.py", line 2293, in _aggregate
sds_backend |   result = sock_info.command(
sds_backend | File "/usr/local/lib/python3.8/dist-packages/pymongo/pool.py", line 570, in command
sds_backend |   return command(self.sock, dbname, spec, slave_ok,
sds_backend | File "/usr/local/lib/python3.8/dist-packages/pymongo/network.py", line 148, in command
sds_backend |   helpers._check_command_response(
sds_backend | File "/usr/local/lib/python3.8/dist-packages/pymongo/helpers.py", line 155, in _check_command_response
sds_backend |   raise OperationFailure(msg % errmsg, code, response)
sds_backend | pymongo.errors.OperationFailure: Total size of documents in affiliations matching pipeline { $match: { $and: [ { _id: { $eq: ObjectId('665e0a9caa7c2077adf62c7f') } }, {} ] } } exceeds maximum document size
```

This is not a data problem: the backend raises an error because a document in the aggregate exceeds the maximum document size.
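That `OperationFailure` typically means some pipeline stage tries to materialize a single BSON document larger than the 16 MB limit (`allowDiskUse` does not lift that per-document cap). One common mitigation is to project away heavy fields and `$unwind` arrays before grouping, so no intermediate document grows unbounded. A sketch of what a restructured coauthor pipeline could look like; the field names (`authors.id`, `authors.affiliations.id`, `year_published`, `full_name`) are assumptions about the `works` schema, not the real one:

```python
def coauthor_pipeline(affiliation_id, start_year=None, end_year=None):
    """Build an aggregation pipeline that counts coauthors per author,
    keeping intermediate documents small."""
    match = {"authors.affiliations.id": affiliation_id}
    if start_year is not None:
        match["year_published"] = {"$gte": start_year}
    if end_year is not None:
        match.setdefault("year_published", {})["$lte"] = end_year
    return [
        {"$match": match},
        # Keep only the light fields before unwinding, so no stage has to
        # carry full work documents (abstracts, references, etc.).
        {"$project": {"authors.id": 1, "authors.full_name": 1}},
        # One document per (work, author) pair instead of one huge document.
        {"$unwind": "$authors"},
        {"$group": {"_id": "$authors.id",
                    "full_name": {"$first": "$authors.full_name"},
                    "count": {"$sum": 1}}},
        {"$sort": {"count": -1}},
    ]


# Usage (hypothetical id): self.colav_db["works"].aggregate(
#     coauthor_pipeline("665e0b60aa7c2077adf683d3", 2010, 2024),
#     allowDiskUse=True)
```

Whether this matches what `get_coauthors` actually aggregates would need to be checked against `GroupsApp.py` line 313; the point is only that projecting early and unwinding keeps each pipeline document well under the BSON limit.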