Summary
Unsafe links can potentially be generated by LLMs in the Mattermost AI plugin. For example, a prompt injection could occur that tells the AI to create a malicious link containing private data in the query parameter:
http://badserver.com/all/my?private=data
This PR adds a post prop and rendering options to mitigate the risk of links generated by prompt injections.
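For illustration, here is a minimal sketch of how a plugin could tag an LLM-generated post with such a prop so clients can apply the safer link rendering. The prop key `unsafe_links` and the helper name are hypothetical; the actual key and rendering behavior are defined in the linked PRs.

```go
package main

import (
	"github.com/mattermost/mattermost/server/public/model"
	"github.com/mattermost/mattermost/server/public/plugin"
)

type Plugin struct {
	plugin.MattermostPlugin
}

// postLLMReply posts an LLM-generated message and marks it with a prop
// that tells clients to avoid rendering its links as normal clickable links.
func (p *Plugin) postLLMReply(channelID, botUserID, generatedText string) error {
	post := &model.Post{
		ChannelId: channelID,
		UserId:    botUserID,
		Message:   generatedText,
	}

	// Hypothetical prop key; clients check it before rendering links.
	post.AddProp("unsafe_links", "true")

	if _, appErr := p.API.CreatePost(post); appErr != nil {
		return appErr
	}
	return nil
}
```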
Related PRs
Server: https://github.com/mattermost/mattermost/pull/26098
Webapp: https://github.com/mattermost/mattermost/pull/26129
Mobile: https://github.com/mattermost/mattermost-mobile/pull/7815