Closed: vasumur closed this issue 1 year ago
Here is what I am doing now. Is there a utility function to convert the decoded string into a blob (binary?) before inserting?
Appreciate the help.
query = """
INSERT INTO table_x
  (create_date, create_user, update_date, update_user, work_effort_id, attachment_seq_number, file_name, file_desc, file_path, attachment_lob)
VALUES (:1, :2, :3, :4, :5, :6, :7, :8, :9, :10)
"""
dt = Module.Util.Date.naivedatetime_local_now() |> NaiveDateTime.truncate(:second)
ret =
SsiV2.InsiteRepo.query(query, [
dt,
"user",
dt,
"user",
work_effort_id,
idx,
attachment[:file_name],
attachment[:file_desc],
attachment[:file_path],
attachment_lob |> Base.decode64!()
])
You can insert a large binary, even 100 MB. Two things to keep in mind:
- Describe the parameter as binary (80#issuecomment-622949869)
- The LOB MUST be last in the parameter list (80#issuecomment-624300201)
{:ok, bin} = File.read("Telegram.exe")
attachment_lob = %Ecto.Query.Tagged{value: bin, type: :binary}
ret =
SsiV2.InsiteRepo.query(query, [
dt,
"user",
dt,
"user",
work_effort_id,
idx,
attachment[:file_name],
attachment[:file_desc],
attachment[:file_path],
attachment_lob
])
Thank you @vstavskyi. Appreciate the quick turnaround. I was missing the `Ecto.Query.Tagged` wrapper around the binary. In my case the data comes base64 encoded, and I am pushing it into a BLOB column.
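For reference, the two steps discussed above (decode the base64 payload, then tag it as binary) can be combined into a small helper. This is a sketch; the module and function names are illustrative, not part of the package:

```elixir
# Hypothetical helper: decode a base64 payload from the client and tag
# it as :binary so the driver describes the parameter as a LOB.
defmodule AttachmentLob do
  def from_base64(encoded) do
    with {:ok, bin} <- Base.decode64(encoded) do
      {:ok, %Ecto.Query.Tagged{value: bin, type: :binary}}
    end
  end
end
```

The tagged value would then be passed as the last element of the parameter list, per the constraint noted above.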
Hi @vstavskyi,
I see details on #80, but not sure if the later version of the package provided the way to insert/update blobs.
I was able to insert new blobs, which I get as base64 encoded data from the client, using Oracle functions, but there is a limit of 32k bytes. Wondering if there is a solution available in this package that works for bigger blobs.
The 32k insert was working because I was executing the insert using SQL functions to convert the encoded string into RAW:
utl_encode.base64_decode(utl_raw.cast_to_raw(_encodeddata))
Thanks
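For context, the ≤32k approach above would look roughly like this from Elixir (a sketch only; the table and column names are illustrative, and `utl_raw.cast_to_raw` is limited to 32767 bytes, which is where the size cap comes from):

```elixir
# Sketch of the 32k-limited approach: let Oracle decode the base64
# string server-side instead of decoding it in Elixir.
query = """
INSERT INTO table_x (attachment_lob)
VALUES (utl_encode.base64_decode(utl_raw.cast_to_raw(:1)))
"""

SsiV2.InsiteRepo.query(query, [encoded_data])
```

The tagged-binary approach shown earlier avoids this limit entirely, since the decoded bytes are bound directly as a LOB parameter rather than passed through the RAW conversion functions.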