MaibornWolff / codecharta

CodeCharta visualizes multiple code metrics using 3D tree maps.
https://maibornwolff.github.io/codecharta/
BSD 3-Clause "New" or "Revised" License

Wrong calculation for initial file size #3111

Status: Open. shaman-apprentice opened this issue 1 year ago

shaman-apprentice commented 1 year ago

Bug

Expected Behavior

Given a gzipped file in the URL, when loading the page, then the calculated file size is correct.

Given a cc.json file in the URL, when loading the page, then the calculated file size is correct.

Actual Behavior

The calculated file size is always 13 for gzipped files and 15 for non-gzipped files.

Steps to Reproduce the Problem

  1. Copy app/codeCharta/assets to dist/webpack, which is the directory served statically in dev mode.
  2. Run the app in dev mode (npm run dev).
  3. Add a debugger statement to the file size calculation.
  4. Open the app in a browser, e.g. http://localhost:3000/?file=assets%2Fsample3.cc.json, http://localhost:3000/?file=assets%2Fsample2.cc.json&file=assets%2Fsample3.cc.json or http://localhost:3000?file=assets%2Foutput.cc.json.gz

Technical analysis

The calculated file size for gzipped files comes from response.body.toString().length. In that case response.body is a Blob, so response.body.toString() is always the string "[object Blob]", whose length is 13.

The calculated file size for non-gzipped files also comes from response.body.toString().length. In that case response.body is a plain object, so response.body.toString() is always the string "[object Object]", whose length is 15.
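For illustration, a minimal sketch of the difference between the current behaviour and one possible fix (getFileSize is a hypothetical helper, not the actual CodeCharta code, and whether the gzipped or the unzipped size is wanted is exactly the open question below):

// Current behaviour: toString() on a Blob or plain object only yields its type tag.
const blobBody = new Blob(['{"apiVersion": "1.3", "nodes": []}'])
console.log(blobBody.toString().length)   // 13, i.e. "[object Blob]".length

const objectBody = { apiVersion: "1.3", nodes: [] }
console.log(objectBody.toString().length) // 15, i.e. "[object Object]".length

// Hypothetical fix: measure the payload itself instead of the type tag.
function getFileSize(body: Blob | object): number {
    return body instanceof Blob
        ? body.size                             // byte size of the (still gzipped) blob
        : new Blob([JSON.stringify(body)]).size // byte size of the serialized cc.json
}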

Note that urlExtractor.spec.ts previously mocked the HTTP GET request incorrectly, as shown below, which is why some tests saw a different file size. In reality the initial response is always a Blob or an object. Also note that this parsing is only triggered for the initial map loading; manually uploading a map does not go through the urlExtractor.

// Previous (incorrect) mock: returns the file content as a JSON string,
// which never happens for the initial load in the browser.
$http.get = jest.fn().mockImplementation(async () => {
    return { data: '{"checksum": "fake-md5", "data": {"apiVersion": 1.3, "nodes": []}}', status: 200 }
})
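Mocks that are closer to what the browser actually delivers might look roughly like this (a sketch only; the field name data follows the mock above, and Blob availability in the test environment is assumed):

// Gzipped file in the URL: the body arrives as a Blob.
$http.get = jest.fn().mockImplementation(async () => {
    return { data: new Blob(['{"checksum": "fake-md5", "data": {"apiVersion": 1.3, "nodes": []}}']), status: 200 }
})

// Plain cc.json file in the URL: the body arrives as an already parsed object.
$http.get = jest.fn().mockImplementation(async () => {
    return { data: { checksum: "fake-md5", data: { apiVersion: 1.3, nodes: [] } }, status: 200 }
})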

Open question

Do we need the calculation at all, when it currently yields a "random small hardcoded number" and no one has noticed?

What is actually the correct calculation? Should it be the length of the (unzipped) text content of the file? It seems to be used somehow for the map resolution scaling (see getMapResolutionScaleFactor within codeMapHelper.ts).
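If the unzipped text length turns out to be the desired value, it could be derived roughly as follows (a sketch; pako is an assumed decompression dependency here, not confirmed from the codebase):

// Sketch: measure the length of the decompressed cc.json text of a gzipped blob.
import { ungzip } from "pako"

async function getUnzippedTextLength(body: Blob): Promise<number> {
    const buffer = await body.arrayBuffer()
    const text = ungzip(new Uint8Array(buffer), { to: "string" })
    return text.length
}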

ce-bo commented 1 year ago

We need the correct file size to handle the visualization of big maps and to avoid crashes of CC due to performance issues. Hence, it should be fixed.

In more detail: we try to scale down the resolution of the rendered city map according to the map size, to prevent rendered maps with a resolution of, for instance, 50,000 x 50,000 pixels.
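The idea is roughly the following (a sketch; the function name and thresholds are illustrative, not the actual implementation of getMapResolutionScaleFactor in codeMapHelper.ts):

// Sketch: pick a scale factor based on the input size so that very large maps
// are rendered at a reduced resolution instead of overloading the renderer.
function getMapResolutionScaleFactorSketch(fileSizeInBytes: number): number {
    const MEDIUM_MAP_BYTES = 2 * 1024 * 1024 // illustrative threshold
    const BIG_MAP_BYTES = 10 * 1024 * 1024   // illustrative threshold

    if (fileSizeInBytes > BIG_MAP_BYTES) return 0.25
    if (fileSizeInBytes > MEDIUM_MAP_BYTES) return 0.5
    return 1
}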

BridgeAR commented 1 year ago

@ce-bo we probably do not need the file size for that, as it does not represent the actual size of the map (long metric names and lots of metrics with long values would not be an issue, while fewer metrics with short names might be an issue if the map is big yet the file size is identical). The implementation of @RomanenkoVladimir takes an absolute limit for the calculated size and makes sure the map does not become any bigger. That should suffice in this case. Is the file size maybe also used for the hash value? We could in fact add that to reduce computation time (there is no need to calculate a hash when the file sizes are not identical).
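The suggested pre-check could look roughly like this (a sketch; the md5 helper and the call site are assumptions):

// Sketch: skip the expensive md5 comparison when the file sizes already differ.
declare function md5(content: string): string // assumed hashing helper

function isSameFile(sizeA: number, contentA: string, sizeB: number, contentB: string): boolean {
    if (sizeA !== sizeB) return false      // different sizes imply different content
    return md5(contentA) === md5(contentB) // only hash when the sizes match
}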

ce-bo commented 1 year ago

I was not aware that this will change with the implementation of @RomanenkoVladimir, nice! So we do not need it anymore. Your suggestion to check the file size before calculating the map hash is fine, though. @RomanenkoVladimir, did you already remove the file size calculation in #3077?

RomanenkoVladimir commented 1 year ago

I am not sure we can remove that; my issue only addresses TreeMap, but what about StreetView maps?

ce-bo commented 1 year ago

AFAIK, we do not use the scaling for StreetMaps.

RomanenkoVladimir commented 1 year ago

But we call the map resolution scaling in the StreetView algorithm, and flatly removing it leads to crashes in Firefox.

RomanenkoVladimir commented 1 year ago

I think we need to investigate this once #3077 is merged

BridgeAR commented 1 year ago

Seems like this is unblocked now? :tada:

ce-bo commented 1 year ago

Seems like this is unblocked now? 🎉

Indeed. I would like to check if we can just remove it, or fix it and use it as a precondition for calculating the md5 hashes (map checksum). I increased the priority accordingly.