Open-MBEE / exec-cameo-mdk

Cameo plugin for MMS sync and DocGen
https://www.openmbee.org/
Apache License 2.0

ViewPoint content not rendering in VE 4.0.6 #285

Open SteveHespelt opened 2 months ago

SteveHespelt commented 2 months ago

Describe the bug
Not sure if this is clear or concise, but here goes. I'll pose the question first, then the observations that led to it.

Using MDK 5.1.3, will Teamwork Cloud projects that have previously been the source for MMS 3.4.2 / MDK 4 imports have issues such as viewpoint-exposed content from used models not being pushed to MMS 4, and therefore not rendering in VE 4? I'm posting this issue here because the error messages below appear while using MDK to push the models into a new MMS 4.0.20 setup; VE is just where the issue manifests.

The msosa.log contains:

  [INFO] MMS Request [POST] https://art002-test.serc.stevens.edu:8444/projects/PROJECT-8dbde512-5131-4949-a179-86f22f26cd6b/refs/master/elements?overwrite=true
gov.nasa.jpl.mbee.mdk.http.ServerException: 500 Server Error
    at gov.nasa.jpl.mbee.mdk.mms.MMSUtils.sendMMSRequest(MMSUtils.java:305)
    at gov.nasa.jpl.mbee.mdk.mms.MMSUtils.sendMMSRequest(MMSUtils.java:318)
    at gov.nasa.jpl.mbee.mdk.mms.actions.CommitClientElementAction.lambda$request$0(CommitClientElementAction.java:146)
    at gov.nasa.jpl.mbee.mdk.util.TaskRunner.lambda$runWithProgressStatus$1(TaskRunner.java:61)

FYI: the MSoSA GUI Log window does not show any messages regarding the 500 exception above.

The MMS-4.0.20 log contains the following:

2024-08-07T20:14:28,500Z [https-jsse-nio-8080-exec-9] ERROR o.o.m.c.services.CameoNodeService - Error in commitChanges:
org.openmbee.mms.core.exceptions.InternalErrorException: failure in bulk execution:
[1881]: index [project-8dbde512-5131-4949-a179-86f22f26cd6b_node], type [_doc], id [494a5daa-dcfd-44cf-9964-2b9ccb86086a], message [ElasticsearchException[Elasticsearch exception [type=mapper_parsing_exception, reason=failed to parse field [packagedElementIds] of type [keyword] in document with id '494a5daa-dcfd-44cf-9964-2b9ccb86086a'. Preview of field's value: '{ownedCommentIds=[], mdExtensionsIds=[], owningLowerId=null, appliedStereotypeInstanceId=_18_5_3_8db028d_1526068215613_864996_10382_asi, templateParameterId=null, type=LiteralString, ownerId=_18_5_3_8db028d_1526068093335_779711_9159, owningPropertyId=null, clientDependencyIds=[], owningTemplateParameterId=null, owningUpperId=null, syncElementId=null, owningSlotId=null, owningPackageId=_18_5_3_8db028d_1526068093335_779711_9159, id=_18_5_3_8db028d_1526068215613_864996_10382, supplierDependencyIds=[], value=2020-12-01T13:07:31-05:00, _appliedStereotypeIds=[], nameExpression=null, visibility=null, documentation=, owningParameterId=null, owningInstanceSpecId=null, name=, typeId=null, owningConstraintId=null}']]; nested: ElasticsearchException[Elasticsearch exception [type=illegal_state_exception, reason=Can't get text on a START_OBJECT at 1:1037]];]
[4706]: index [project-8dbde512-5131-4949-a179-86f22f26cd6b_node], type [_doc], id [387ab644-7f76-41fc-9941-b117f6cab8e5], message [ElasticsearchException[Elasticsearch exception [type=mapper_parsing_exception, reason=failed to parse field [packagedElementIds] of type [keyword] in document with id '387ab644-7f76-41fc-9941-b117f6cab8e5'. Preview of field's value: '{ownedCommentIds=[], mdExtensionsIds=[], owningLowerId=null, appliedStereotypeInstanceId=_18_5_3_8db028d_1526068215613_115840_10380_asi, templateParameterId=null, type=LiteralString, ownerId=_18_5_3_8db028d_1526068093334_976932_9158, owningPropertyId=null, clientDependencyIds=[], owningTemplateParameterId=null, owningUpperId=null, syncElementId=null, owningSlotId=null, owningPackageId=_18_5_3_8db028d_1526068093334_976932_9158, id=_18_5_3_8db028d_1526068215613_115840_10380, supplierDependencyIds=[], value=2017-11-08T13:06:13-05:00, _appliedStereotypeIds=[], nameExpression=null, visibility=null, documentation=, owningParameterId=null, owningInstanceSpecId=null, name=, typeId=null, owningConstraintId=null}']]; nested: ElasticsearchException[Elasticsearch exception [type=illegal_state_exception, reason=Can't get text on a START_OBJECT at 1:1039]];]
        at org.openmbee.mms.elastic.utils.BulkProcessor.bulkBatchRequests(BulkProcessor.java:53)
        at org.openmbee.mms.elastic.utils.BulkProcessor.clear(BulkProcessor.java:38)

And the Elasticsearch container's log contains:

{"type": "server", "timestamp": "2024-08-07T20:14:27,309Z", "level": "DEBUG", "component": "o.e.a.b.TransportShardBulkAction", "cluster.name": "docker-cluster", "node.name": "
9a3fe941f53c", "message": "[project-8dbde512-5131-4949-a179-86f22f26cd6b_node][0] failed to execute bulk item (index) index {[project-8dbde512-5131-4949-a179-86f22f26cd6b_node
][_doc][494a5daa-dcfd-44cf-9964-2b9ccb86086a], source[_na_]}", "cluster.uuid": "MGm1dleFSQyxsWOejEIlbw", "node.id": "cztNWQy0SHGW7DukkdgPzg" ,
"stacktrace": ["org.elasticsearch.index.mapper.MapperParsingException: failed to parse field [packagedElementIds] of type [keyword] in document with id '494a5daa-dcfd-44cf-996
4-2b9ccb86086a'. Preview of field's value: '{ownedCommentIds=[], mdExtensionsIds=[], owningLowerId=null, appliedStereotypeInstanceId=_18_5_3_8db028d_1526068215613_864996_10382
_asi, templateParameterId=null, type=LiteralString, ownerId=_18_5_3_8db028d_1526068093335_779711_9159, owningPropertyId=null, clientDependencyIds=[], owningTemplateParameterId=null, owningUpperId=null, syncElementId=null, owningSlotId=null, owningPackageId=_18_5_3_8db028d_1526068093335_779711_9159, id=_18_5_3_8db028d_1526068215613_864996_10382, supplierDependencyIds=[], value=2020-12-01T13:07:31-05:00, _appliedStereotypeIds=[], nameExpression=null, visibility=null, documentation=, owningParameterId=null, owningInstanceSpecId=null, name=, typeId=null, owningConstraintId=null}'",

I'm guessing that the root cause is the Elasticsearch MapperParsingException: failed to parse field [packagedElementIds] in the log output above, and that perhaps the JSON content isn't quite what the parser can handle because of the project's earlier use with MMS 3.4.2?
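To illustrate my reading of that exception (just a sketch reconstructed from the "Preview of field's value" text above, not the actual payload): the index maps packagedElementIds as keyword, so it should receive a flat list of element ID strings, but the document that was pushed appears to carry a whole inlined element object in that position, which is what Elasticsearch rejects.

# What a keyword-mapped packagedElementIds field can index: a list of plain ID strings
expected = {
    "packagedElementIds": ["_18_5_3_8db028d_1526068215613_864996_10382"]
}

# Roughly what the failing document seems to contain, per the log preview:
# an inlined LiteralString object where a plain string is expected
actual = {
    "packagedElementIds": [{
        "id": "_18_5_3_8db028d_1526068215613_864996_10382",
        "type": "LiteralString",
        "ownerId": "_18_5_3_8db028d_1526068093335_779711_9159",
        "value": "2020-12-01T13:07:31-05:00"
    }]
}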

I know about the MMS-3 to MMS-4 migration draft, but I'm not migrating the MMS-3 data in this scenario, just using the same TWC project. But since the project was used as the source for MMS 3.4.2, perhaps the only way to use it is to go through the migration sequence? Thanks, -steve

To Reproduce
Steps to reproduce the behavior:

  1. Use a TWC 19sp3 project that has already been pushed to an MMS 3.4.2 environment.
  2. Follow the steps in https://docs.openmbee.org/projects/mdk/en/latest/initialization.html#enable-mbee-integration
  3. Log in to VE 4.0.6 and open the ProjectA-Document project.
  4. Expand the view structure to drill down to where ProjectA model content should appear via the viewpoint definitions.

Expected behavior
The ProjectA-Document model, which uses the ProjectA model, results in the content from ProjectA being rendered by VE.

Screenshots
See the observations above from the MSoSA, MMS, and Elasticsearch logs.

Environment:
MDK 5.1.3 (MSoSA), Teamwork Cloud 19sp3 project
MMS 4.0.20
VE 4.0.6


dlamoris commented 2 months ago

This has to do with the JSON serialization process from MDK to MMS: elements that are ValueSpecifications are inlined in the place where they are referenced (usually the defaultValue of a property or the value of a slot). This is accounted for in the MMS Elasticsearch mapping, but we've sometimes seen an orphan ValueSpecification housed in a package with nothing pointing to it (it's also invisible in the Cameo client by default). From the logs you can find the ID of the offending element where it says "Preview of field's value", e.g. _18_5_3_8db028d_1526068215613_864996_10382; remove those elements and the model push should work.
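If it helps to double-check before deleting, here is a minimal Jython sketch (run from the same macro engine as the script below; the ID is just the example from the log above, and I'm assuming Project.getElementByID is available in your MSoSA version) that looks the element up and logs what it is and who owns it, so you can confirm it really is an orphan:

from com.nomagic.magicdraw.core import Application

# Example ID copied from the "Preview of field's value" text in the MMS/Elasticsearch log
offendingId = '_18_5_3_8db028d_1526068215613_864996_10382'

project = Application.getInstance().getProject()
log = Application.getInstance().getGUILog()
element = project.getElementByID(offendingId)
if element is None:
    log.log('[INFO] no element found for id ' + offendingId)
else:
    log.log('[INFO] ' + element.getHumanType() + ' owned by ' + element.getOwner().getHumanName())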

There is a Log JSON option under Options -> Environment -> MDK; if it's set to true, MDK will log every request to and reply from the MMS server, and the error will show up in the response.

dlamoris commented 2 months ago

This is a simple Jython macro that will remove any offending value specs in the model (assuming they are orphans and that actual model values are not owned directly by packages); you can run it to 'clean' the model first.


from com.nomagic.magicdraw.core import Application
from com.nomagic.magicdraw.openapi.uml import SessionManager
from com.nomagic.uml2.ext.magicdraw.classes.mdkernel import ValueSpecification
from com.nomagic.uml2.ext.magicdraw.classes.mdkernel import Package

deleteCount = 0
bads = []
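# Recursively visit packages and collect any ValueSpecification owned directly by a package
# (these orphan value specs are what break the MMS/Elasticsearch push)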
def replaceElementsRecursively(element):
    global deleteCount
    global bads
    for ownedElement in element.getOwnedElement():
        if isinstance(ownedElement, Package):
            replaceElementsRecursively(ownedElement)
        if isinstance(ownedElement, ValueSpecification):
            Application.getInstance().getGUILog().log(ownedElement.getID())
            bads.append(ownedElement)
            deleteCount += 1

project = Application.getInstance().getProject()
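# Cancel any session left open, then start a fresh session for the deletions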
if (SessionManager.getInstance().isSessionCreated(project)):
    SessionManager.getInstance().cancelSession(project)
SessionManager.getInstance().createSession(project, 'Fixing bad instance ')
replaceElementsRecursively(project.getModel())
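# Dispose the collected orphan value specifications, then commit the session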
for e in bads:
    e.dispose()
SessionManager.getInstance().closeSession(project)
Application.getInstance().getGUILog().log('[INFO] deleted ' + str(deleteCount) + ' element(s).')
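A note on running it (assuming the standard macro setup; menu names may differ slightly by MSoSA/MagicDraw version): paste the script into a new Jython macro under Tools > Macros with the project open, run it, check the GUI Log for the IDs it removed, then retry the MDK push to MMS.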