Open YANG-DB opened 1 year ago
I am not sure whether Spark SQL supports prepared statements or not; I need to double-confirm. There may be some concerns around prepared statement support from a security perspective. https://www.securityjourney.com/post/how-to-prevent-sql-injection-vulnerabilities-how-prepared-statements-work#:~:text=A%20prepared%20statement%20is%20a,safely%2C%20preventing%20SQL%20Injection%20vulnerabilities.
Not sure, is this something the client can achieve?
// Query built with plain string interpolation; nothing escapes or validates the values
val column = "time"
val day = "2023-05-12"
// If $column or $day came from user input, this string is open to SQL injection
s"(select * from table where $column > '$day')"
I was initially pleased to switch to SQL for simpler queries.
However, since prepared statements aren’t supported, I can’t be sure whether the risk of SQL injection is fully mitigated.
Looks like I’ll have to switch back to Query DSL after all… 😔
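For what it's worth, if the interpolated string is ultimately handed to spark.sql, recent Spark releases (3.5 and later) appear to accept named query parameters, which keeps values out of the SQL text; it does not help with identifiers such as the column name. A minimal sketch, assuming that overload of spark.sql and placeholder table/column names:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("parameterized-sql-sketch")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// Throwaway data so the query below has something to run against
Seq(("2023-05-13", 1), ("2023-05-10", 2)).toDF("time", "id").createOrReplaceTempView("events")

// The :day marker is bound by Spark itself; the value never becomes part of the SQL text,
// so it cannot alter the structure of the query. Identifiers (e.g. the column name) still
// cannot be bound this way, same as with classic prepared statements.
val day = "2023-05-12"
spark.sql("SELECT * FROM events WHERE time > :day", Map("day" -> day)).show()

That said, this only covers queries that go through spark.sql directly; a subquery string passed as an option (as the parentheses in the snippet above suggest) would still be assembled by the client.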
Is your feature request related to a problem? Create a 'Prepared Statement Like' API that will allow users to submit a DDL / DML query as a search template (similar to the existing DSL search template API).
What solution would you like? This solution will allow the creation of DML / DDL SQL queries as parameterized templates that users can call (similar to the prepared statement support in RDBMSs).
The following API would be used (similar to the existing DSL search template API).
This command defines a search template that generates a view from a parameterized query; the {{client_ip}} placeholder in the query is replaced by the value 10.40.40.10.
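Something along these lines; the _plugins/_sql/template endpoint, the index, and the field names below are placeholders for illustration only, since this API is exactly what is being proposed here:

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

// Hypothetical request: define and test a SQL template inline.
// The {{client_ip}} placeholder in the parameterized DDL is filled in from "params".
val defineBody =
  """{
    |  "source": "CREATE VIEW suspicious_traffic AS SELECT * FROM http_logs WHERE client_ip = '{{client_ip}}'",
    |  "params": { "client_ip": "10.40.40.10" }
    |}""".stripMargin

val client = HttpClient.newHttpClient()
val request = HttpRequest.newBuilder(URI.create("http://localhost:9200/_plugins/_sql/template"))
  .header("Content-Type", "application/json")
  .POST(HttpRequest.BodyPublishers.ofString(defineBody))
  .build()
println(client.send(request, HttpResponse.BodyHandlers.ofString()).body())

Because the values arrive as params and are rendered server-side rather than being concatenated by the caller, they can be escaped or type-checked in one place, which is where the prepared-statement-like protection against injection would come from.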
After the search template works the way you want it to, you can save the source of that template as a script, making it reusable for different input parameters.
When saving the search template as a script, you need to specify the lang parameter as mustache:
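(For SQL this part is hypothetical as well; the _scripts-style call and the template id below are placeholders borrowed from the DSL search template API.)

// Hypothetical: store the SQL template as a reusable script, analogous to POST _scripts/<template_id>
// for DSL search templates; "lang": "mustache" tells the cluster how to render the placeholders.
val saveTemplateBody =
  """{
    |  "script": {
    |    "lang": "mustache",
    |    "source": "CREATE VIEW suspicious_traffic AS SELECT * FROM http_logs WHERE client_ip = '{{client_ip}}'"
    |  }
    |}""".stripMargin
// Sent the same way as the request above, e.g. POST http://localhost:9200/_scripts/suspicious_traffic_view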
Now you can reuse the template by referring to its id parameter, passing different input values each time.
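Again as a sketch (the endpoint and the template id are placeholders), calling the stored template with a different client_ip would look something like:

// Hypothetical: execute the stored template by id, supplying a different value this time.
// Mirrors the { "id": ..., "params": ... } body of the existing DSL _search/template API.
val runTemplateBody =
  """{
    |  "id": "suspicious_traffic_view",
    |  "params": { "client_ip": "10.20.30.40" }
    |}""".stripMargin
// e.g. POST http://localhost:9200/_plugins/_sql/template/_execute (illustrative endpoint only)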
What alternatives have you considered? N/A
Do you have any additional context?