Open maxjf1 opened 10 months ago
Since I can't wait for a fix, my workaround was to divide the JSON into smaller chunks and call velocity.render multiple times to render it. Basically, I render a fragment of the JSON in a script tag and store it in a global variable. Each subsequent fragment is appended to that variable until there is no more JSON left to render.
In the gtmScript file, I simply parse the stored JSON.
Example:
// gtmhooks.js file
function htmlHead(pdict) {
    if (gtmHelpers.isEnabled) {
        // ...
        // Break the JSON into slices of 10000 characters
        for (var i = 0; i < Math.ceil(ga4datalayerJSON.length / 10000); i++) {
            velocity.render(
                "$velocity.remoteInclude('GTM-HtmlHeadEvent', 'ga4datalayer', $ga4datalayer)",
                {
                    velocity: velocity,
                    // substr(start, length) tolerates a short final slice
                    ga4datalayer: ga4datalayerJSON.substr(i * 10000, 10000)
                }
            );
        }
        // ...
    }
}
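The chunking logic above can be sketched standalone (plain Node.js, not SFCC-specific; the payload-building code is a hypothetical stand-in for the real data layer): slicing a JSON string into fixed-size fragments and concatenating them reproduces the original string exactly, so JSON.parse works on the joined result.

```javascript
// Minimal sketch of the chunk-and-reassemble workaround.
var CHUNK_SIZE = 10000;

function splitIntoChunks(json, chunkSize) {
    var chunks = [];
    for (var i = 0; i < Math.ceil(json.length / chunkSize); i++) {
        // substr(start, length) tolerates a short final chunk
        chunks.push(json.substr(i * chunkSize, chunkSize));
    }
    return chunks;
}

// Build a large JSON payload roughly shaped like a product-listing data layer
// (illustrative data only, not the plugin's actual structure)
var items = [];
for (var n = 0; n < 500; n++) {
    items.push({ item_id: 'SKU-' + n, item_name: 'Product ' + n, price: n * 1.5 });
}
var ga4datalayerJSON = JSON.stringify({ event: 'view_item_list', items: items });

var fragments = splitIntoChunks(ga4datalayerJSON, CHUNK_SIZE);
// This join is what window.ga4DataLayerEvent ends up holding in the browser
var joined = fragments.join('');
console.log(fragments.length > 1 && joined === ga4datalayerJSON); // true
```

Because each fragment is an arbitrary slice, individual fragments are not valid JSON on their own; only the fully joined string can be parsed.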
The ISML file rendered by the above velocity.render:
<iscomment> Render a fragment of the JSON event </iscomment>
<script>
window.ga4DataLayerEvent = window.ga4DataLayerEvent || '';
window.ga4DataLayerEvent += `<isprint value="${pdict.ga4datalayer}" encoding="off"/>`;
</script>
Then, in the gtmScript.isml file:
<script>
//...
// Join all fragments of the JSON event and decode it
var ga4DataLayerEvent = window.ga4DataLayerEvent ? JSON.parse(window.ga4DataLayerEvent) : false;
</script>
This results in the following rendered HTML:
Not sure if this is the best solution, but I can make a PR for it. I was wondering whether there is a solution where I don't need to pass the whole JSON to velocity as an argument, and could instead pass only the data needed to generate it. Another possible solution is to compress the JSON and decompress it when rendering. In my case it goes from 17000 characters to 2000 after compressing, but this might be CPU intensive since it would need to be done on every page.
When passing arguments to velocity.render, the datalayer and ga4datalayer arguments may become too big and break velocity rendering. This can happen on product listing pages when you increase the amount of products shown (in my case, anything above 45 products breaks it).
I couldn't find it documented, but there seems to be a limit on the size of the arguments passed to wainclude / velocity.render. I estimate the limit is somewhere between 15,000 and 16,000 characters for the URL argument. I'm contacting Salesforce support to check.
This results in the following error in Log Center (I replaced the JSON data chunks with [...tooMuchData]):

Steps to reproduce:
On your SFCC project with GTM activated, increase the amount of products shown on a search page or category page (try 100 products). This will generate the error above and break GTM rendering.
Function causing the issue: https://github.com/redvanworkshop/sfcc-plugin-gtm/blob/b58bfc48998a079dfe7be33e839b8fa1650b19bd/cartridges/plugin_gtm/cartridge/scripts/hooks/gtm/gtmhooks.js#L11-L27