kukhariev / ngx-uploadx

Angular Resumable Upload Module
https://github.com/kukhariev/ngx-uploadx
MIT License
43 stars 23 forks

chunked file on web api #454

Closed akshaybogar1984 closed 1 month ago

akshaybogar1984 commented 3 months ago

Hi, I am getting this issue: I am unable to post the document to the web API. Please help me. I can see the payload, but where do I get the chunked file?

payload: {"name":"10_MB.MP4","mimeType":"video/mp4","size":10485760,"lastModified":1720538165694}

endpoint: https://localhost:7177/api/WeatherForecast/Upload1?uploadType=uploadx

[screenshot: web api issue]

kukhariev commented 3 months ago

The backend must respond to the POST request by specifying the chunk upload URL in the Location header. Then ngx-uploadx will send the file chunks to that URL.

https://github.com/kukhariev/node-uploadx/blob/master/proto.md#requests-overview https://developers.google.com/drive/api/guides/manage-uploads?#resumable
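To illustrate the second step of that handshake, here is a small TypeScript helper (hypothetical, not part of ngx-uploadx) that computes the inclusive Content-Range header each chunk PUT would carry after the Location URL has been received:

```typescript
// Sketch of the chunking math behind the protocol linked above:
// 1. POST metadata -> server answers 201 Created with a Location header.
// 2. PUT each chunk to that Location with a Content-Range header.
// chunkRanges is a hypothetical helper; ngx-uploadx does this internally.
function chunkRanges(size: number, chunkSize: number) {
  const chunks: { start: number; end: number; contentRange: string }[] = [];
  for (let start = 0; start < size; start += chunkSize) {
    const end = Math.min(start + chunkSize, size);
    // Content-Range uses an inclusive end offset and the total file size.
    chunks.push({ start, end, contentRange: `bytes ${start}-${end - 1}/${size}` });
  }
  return chunks;
}
```

For a 10-byte file with a 4-byte chunk size this yields three PUTs with `bytes 0-3/10`, `bytes 4-7/10`, and `bytes 8-9/10`.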

akshaybogar1984 commented 3 months ago

My requirement is to upload a file (up to 20 GB) in chunks to blob storage. It can be direct to blob storage or via a web API using C#. Could you please share the C# code for the web server?

akshaybogar1984 commented 3 months ago

could you help me to work this on my machine ?

kukhariev commented 3 months ago

You can add uploaderClass: Tus to the options:

import { Component } from '@angular/core';
import { RouterOutlet } from '@angular/router';
import { Tus, UploadState, UploadxDirective, UploadxOptions } from 'ngx-uploadx';

@Component({
  selector: 'app-upload-component',
  standalone: true,
  imports: [RouterOutlet, UploadxDirective],
  template: `
    <input type="file" [uploadx]="options" (state)="onUpload($event)" />
  `
})
export class AppUploadComponent {
  options: UploadxOptions = {
    endpoint: '[URL]',
    uploaderClass: Tus
  };
  onUpload(state: UploadState) {
    console.log(state);
  }
}

and use https://github.com/tusdotnet/tusdotnet

akshaybogar1984 commented 3 months ago

Thanks. But the front end is Angular and the backend is a C# web API. So I would like to know, when Angular pushes chunks to the server, how the web API would receive the chunks in order to upload them to blob storage.

akshaybogar1984 commented 3 months ago


Could you please let us know where to set the Location header in Angular to call the service?

kukhariev commented 3 months ago

Sorry, I have almost no knowledge of either C# or blob storage, and I don't have access to Azure :(

While I'm on vacation, I'm going to try to work on blob storage support.

Is there currently a working server or any description of how it works?

For example:

[HttpPost]
public async Task<IActionResult> Upload()
{
  // validation
  // generate uid from JSON payload metadata
  // construct the UploadChunk URL from the uid
  // store metadata
  // create a new blob
  // send the UploadChunk URL in the Location header
  // return 201 Created
}

[HttpPut("{uid}")]
public async Task<IActionResult> UploadChunk(string uid)
{
  // validation
  // append chunk to Blob Storage
  // update metadata
  // send the Range header and status 308, or 200 if it's the last chunk of the file
}
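
For illustration only, the status/header decision in UploadChunk above can be written as a pure function. This is a TypeScript sketch (chunkResponse is a hypothetical name; the thread's actual server is C#):

```typescript
// Decide the UploadChunk response from the bytes stored so far.
// Mirrors the pseudocode above: 308 + Range while incomplete, 200 when done.
interface ChunkResponse {
  status: 200 | 308;
  headers: Record<string, string>;
}

function chunkResponse(receivedBytes: number, totalBytes: number): ChunkResponse {
  if (receivedBytes >= totalBytes) {
    return { status: 200, headers: {} };
  }
  return {
    status: 308,
    headers: {
      // Inclusive byte range of what the server has persisted so far.
      Range: `bytes=0-${receivedBytes - 1}`
    }
  };
}
```

So after receiving 4 of 10 bytes the server would answer 308 with `Range: bytes=0-3`, and 200 once all 10 bytes have arrived.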
akshaybogar1984 commented 3 months ago

I have a server (web API, C#) but am unable to upload using ngx-uploadx.

kukhariev commented 3 months ago

@akshaybogar1984 , here's a working example with direct upload from the browser to blob storage. URL generation and blob creation in getFileUrl can be moved to the server.

import { Uploader } from 'ngx-uploadx';

const env = {
  sasToken:
    'sv=2023-08-03&ss=bfqt&srt=sco&se=2030-12-12T08%3A11%3A41Z&sp=rwdlacup&sig=%2F9bU8%2FkdnDe2bVMRSVZWGJc5QlR58nvCEeCEfOXzjA0%3D',
  containerURL: 'http://127.0.0.1:10000/devstoreaccount1/container1'
};

/**
 *  Azure Blob Storage support
 * @example
 *   options: UploadxOptions = {
 *     allowedTypes: 'image/*,video/*',
 *     chunksize: 4 * 1024 * 1024,
 *     endpoint: `[containerURL]`,
 *     uploaderClass: BlobUploader
 *   };
 */
export class BlobUploader extends Uploader {
  override async getFileUrl(): Promise<string> {
    const headers = {
      'x-ms-version': '2022-11-02',
      'x-ms-date': getISODate(),
      'x-ms-blob-type': 'AppendBlob'
    };
    const url = `${env.containerURL}/${this.file.name}?${env.sasToken}`;
    await this.request({ method: 'PUT', url, headers });
    return url;
  }

  override async sendFileContent(): Promise<number | undefined> {
    const { body, start, end } = this.getChunk();
    const url = `${this.url}&comp=appendblock`;
    const headers = {
      'x-ms-version': '2022-11-02',
      'x-ms-date': getISODate(),
      'x-ms-blob-condition-appendpos': start,
      'x-ms-blob-condition-maxsize': this.size
    };
    await this.request({ method: 'PUT', body, headers, url });
    return this.responseStatus > 201 ? start : end;
  }

  override abort(): void {} // Azurite does not support blob upload interrupts ?!

  override async getOffset(): Promise<number | undefined> {
    const headers = {
      'x-ms-version': '2022-11-02',
      'x-ms-date': getISODate()
    };
    await this.request({ method: 'HEAD', headers, url: this.url });
    if (this.responseStatus === 200) {
      return Number(this.responseHeaders['content-length']) || 0;
    }
    this.url = '';
    return this.offset || 0;
  }
}
function getISODate() {
  return new Date().toISOString();
}
ARosentiehl24 commented 2 months ago

Hi @akshaybogar1984, were you able to get it working with a Web API in C#? I am facing the same problem as you. My stack is the same, Angular and .NET, and I have hybrid storage, so I will not only upload files to Blob Storage but can also use local storage. That's why @kukhariev's approach won't work for me.

ARosentiehl24 commented 2 months ago

I've found something. I'm doing this:

[HttpPost]
public async Task<IActionResult> Upload()
{
  // validation
  // generate uid from JSON payload metadata
  // construct the UploadChunk URL from the uid
  // store metadata
  // create a new blob
  // send the UploadChunk URL in the Location header
  // return 201 Created
}

[HttpPut("{uid}")]
public async Task<IActionResult> UploadChunk(string uid)
{
  // validation
  // append chunk to Blob Storage
  // update metadata
  // send the Range header and status 308, or 200 if it's the last chunk of the file
}

And it looks something like this:

[HttpPost]
[AllowAnonymous]
public async Task<IActionResult> Upload([FromBody] UploadMetadata metadata)
{
    if (metadata == null)
    {
        return BadRequest("Invalid metadata.");
    }

    var uid = Guid.NewGuid().ToString();
    var uploadChunkUrl = Url.Action(nameof(UploadChunk), new { uid });

    await metadataService.StoreMetadataAsync(uid, metadata);

    Response.Headers["location"] = "https://localhost:7081" + uploadChunkUrl;

    return StatusCode((int)HttpStatusCode.Created);
}

[HttpPut("{uid}")]
[AllowAnonymous]
public async Task<IActionResult> UploadChunk(string uid)
{
    if (string.IsNullOrEmpty(uid) || Request.Form.Files.Count == 0)
    {
        return BadRequest("Invalid request.");
    }

    var file = Request.Form.Files[0];

    var metadata = await metadataService.GetMetadataAsync(uid);
    if (metadata == null)
    {
        return NotFound("Metadata not found.");
    }

    metadata.CurrentSize += file.Length;
    await metadataService.UpdateMetadataAsync(uid, metadata);

    if (metadata.CurrentSize >= metadata.TotalSize)
    {
        return Ok("Upload complete.");
    }

    Response.Headers["Range"] = $"bytes=0-{metadata.CurrentSize - 1}";
    return StatusCode(308, "Upload in progress.");
}

But for some reason I'm always getting this issue:

[screenshots of the failing requests omitted]

which is weird because I am sending the URL through the header.

kukhariev commented 2 months ago

@ARosentiehl24, these response headers are missing Access-Control-Expose-Headers: Location, Range:

Response.Headers["Location".ToLower()] = "https://localhost:7081" + uploadChunkUrl;
Response.Headers["Access-Control-Expose-Headers".ToLower()] = "Location,Range";

return StatusCode((int)HttpStatusCode.Created);

Also, var file = Request.Form.Files[0]; is incorrect; you should read from the Request.Body stream.

ARosentiehl24 commented 2 months ago

That's right, now it's working as expected. I'll leave here how it can be implemented; this is just an example.

[code screenshots omitted]

Just a little question: is Response.Headers["Range"] = $"bytes=0-{metadata.CurrentSize - 1}"; implemented correctly?

kukhariev commented 2 months ago

Looks good, but it's missing Response.Headers["Access-Control-Expose-Headers".ToLower()] = "Range";

ARosentiehl24 commented 2 months ago

Sure, thanks! Works like a charm.

akshaybogar1984 commented 2 months ago

Thanks @ARosentiehl24, let me give it a try today.

akshaybogar1984 commented 2 months ago

Can we upload a 10 GB file to blob storage?

akshaybogar1984 commented 2 months ago

@ARosentiehl24 Please share how your server uploads to blob storage. Thanks.

ARosentiehl24 commented 2 months ago

Hello @akshaybogar1984 , sure, my approach is the following.

I'm using the BlockBlobClient class:

[code screenshot: staging blocks with BlockBlobClient]

You need to store the block IDs somewhere during the upload process; in my case I'm using SQL Server:

[code screenshot: persisting block IDs and sizes]

I'm storing the size because that's how I know when the upload is complete, by comparing the values.

And at the end of the upload process, you can do something like this:

[code screenshot: committing the block list]

You must send the list of block IDs to make the commit, and with that you can upload heavy files in blocks.

Hope it helps.

Greetings.
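
As a language-neutral sketch of that bookkeeping (makeBlockId and buildBlockListXml are hypothetical helper names; in C# the staging and commit are typically done with BlockBlobClient.StageBlockAsync and CommitBlockListAsync):

```typescript
// Azure requires block IDs to be base64-encoded and, within one blob,
// equal in length before encoding; padding the offset keeps them uniform.
function makeBlockId(uploadId: string, offset: number): string {
  const raw = uploadId + String(offset).padStart(15, '0');
  return Buffer.from(raw, 'utf8').toString('base64');
}

// The commit step sends the staged IDs, in order, as a <BlockList> body
// (the raw Put Block List request is shown later in this thread).
function buildBlockListXml(blockIds: string[]): string {
  const latest = blockIds.map(id => `<Latest>${id}</Latest>`).join('');
  return `<?xml version="1.0" encoding="utf-8"?><BlockList>${latest}</BlockList>`;
}
```

Whatever store holds the IDs (SQL Server here), the commit must replay them in upload order, since the block list order defines the final blob content.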

akshaybogar1984 commented 2 months ago

@ARosentiehl24 Thanks a lot. What is the maximum size you have tried to upload? In my case it is 10 GB to 20 GB.

akshaybogar1984 commented 2 months ago

@kukhariev can we use the block ID and block list approach, so that I can upload 10 GB faster?

PUT https://myaccount.blob.core.windows.net/mycontainer/myblob?comp=block&blockid=AAAAAA%3D%3D HTTP/1.1  

Request Headers:  
x-ms-version: 2018-03-28  
x-ms-date: Sat, 31 Mar 2018 14:37:35 GMT    
Authorization: SharedKey myaccount:J4ma1VuFnlJ7yfk/Gu1GxzbfdJloYmBPWlfhZ/xn7GI=  
Content-Length: 0
x-ms-copy-source: https://myaccount.blob.core.windows.net/mycontainer/myblob
x-ms-source-range: bytes=0-499
=======================================================

PUT https://myaccount.blob.core.windows.net/mycontainer/myblob?comp=blocklist HTTP/1.1  

Request Headers:  
x-ms-date: Wed, 31 Aug 2011 00:17:43 GMT  
x-ms-version: 2011-08-18  
Content-Type: text/plain; charset=UTF-8  
Authorization: SharedKey myaccount:DJ5QZSVONZ64vAhnN/wxcU+Pt5HQSLAiLITlAU76Lx8=  
Content-Length: 133  

Request Body:  
<?xml version="1.0" encoding="utf-8"?>  
<BlockList>  
  <Latest>AAAAAA==</Latest>  
  <Latest>AQAAAA==</Latest>  
  <Latest>AZAAAA==</Latest>  
</BlockList>
kukhariev commented 2 months ago

@akshaybogar1984, are you asking about the modified example https://github.com/kukhariev/ngx-uploadx/issues/454#issuecomment-2236124015 using the blocklist approach?

kukhariev commented 2 months ago

Here's a working example; it keeps the list of blocks in memory:

import { Uploader } from 'ngx-uploadx';

/**
 *  Azure Blob Storage support
 * @example
 *   options: UploadxOptions = {
 *     allowedTypes: 'image/*,video/*',
 *     chunksize: 100 * 1024 * 1024,
 *     endpoint: `[signedURL]`,
 *     uploaderClass: BlobUploader
 *   };
 */
export class BlobUploader extends Uploader {
  blockList: string[] = [];
  override async getFileUrl(): Promise<string> {
    const oUrl = new URL(this.endpoint);
    oUrl.pathname = [oUrl.pathname, this.file.name].join('/');
    const url = oUrl.toString();
    return url;
  }

  override async sendFileContent(): Promise<number | undefined> {
    const { body, start, end } = this.getChunk();
    const bid = this.uploadId + String(start).padStart(15, '0');
    const blockId = btoa(bid);
    const blockUrl = this.url + `&comp=block&blockid=${encodeURIComponent(blockId)}`;
    await this.request({ method: 'PUT', headers: commonHeaders(), url: blockUrl, body });
    this.blockList.push(blockId);
    if (end === this.size) {
      await this.finish();
    }
    return this.responseStatus > 201 ? start : end;
  }

  async finish() {
    const blocks = this.blockList.map(blockId => '<Latest>' + blockId + '</Latest>').join('');
    const body = `<?xml version="1.0" encoding="utf-8"?><BlockList>${blocks}</BlockList>`;
    const url = this.url + `&comp=blocklist`;
    const headers = { ...commonHeaders(), 'Content-Type': 'text/xml; charset=UTF-8' };
    await this.request({ method: 'PUT', headers, url, body });
  }

  override abort(): void {} // FIXME: Azurite does not support blob upload interrupts?!

  override async getOffset(): Promise<number | undefined> {
    const url = this.url + `&comp=blocklist&blocklisttype=all`;
    const headers = commonHeaders();
    try {
      await this.request({ headers, url });
      console.log(this.response);
      // TODO: parse blocklist
    } catch {}
    return this.offset || 0;
  }
}

function commonHeaders(apiVersion = '2022-11-02') {
  return {
    'x-ms-version': apiVersion,
    'x-ms-date': new Date().toISOString()
  };
}
akshaybogar1984 commented 2 months ago

@kukhariev what is the maximum chunksize? And can we have a concurrent-transfer concept (concurrency: 2, // maximum number of parallel transfer workers)?

chunksize: 100 * 1024 * 1024

Also, from this chunk-T6SYERLG.js?v=748897e8:69:

   GET https://***.blob.core.windows.net/**/MDR%20679949-2024-06-21%2009-05.xml?sv=****&comp=blocklist&blocklisttype=all

exception:

BlobNotFound: The specified blob does not exist. RequestId:29724631-701e-0042-20a2-edaa7c000000 Time:2024-08-13T17:00:09.1040664Z

override async getOffset(): Promise<number | undefined> {
  const url = this.url + `&comp=blocklist&blocklisttype=all`;
  const headers = commonHeaders();
  try {
    await this.request({ headers, url });
    console.log(this.response);
    // TODO: parse blocklist
  } catch {}
  return this.offset || 0;
}

kukhariev commented 2 months ago

The dynamic chunk size is unlimited by default; use the maxChunkSize option to limit it. concurrency is the maximum number of simultaneously uploaded files.

The BlobNotFound error is fine. It is part of the upload-resume logic.

import { Uploader } from 'ngx-uploadx';

/**
 *  Azure Blob Storage support
 * @example
 *   options: UploadxOptions = {
 *     allowedTypes: 'image/*,video/*',
 *     maxChunkSize: 512 * 1024 * 1024,
 *     endpoint: `[signedURL]`,
 *     uploaderClass: BlobUploader
 *   };
 */
export class BlobUploader extends Uploader {
  blockList: string[] = [];
  override async getFileUrl(): Promise<string> {
    const oUrl = new URL(this.endpoint);
    oUrl.pathname = [oUrl.pathname, this.file.name].join('/');
    const url = oUrl.toString();
    return url;
  }

  override async sendFileContent(): Promise<number | undefined> {
    const { body, start, end } = this.getChunk();
    const blockId = btoa(this.uploadId + String(start).padStart(15, '0'));
    const url = this.url + `&comp=block&blockid=${encodeURIComponent(blockId)}`;
    const headers = commonHeaders();
    await this.request({ method: 'PUT', headers, url, body });
    this.blockList.push(blockId);
    if (end === this.size) {
      await this.finish();
    }
    return this.responseStatus > 201 ? start : end;
  }

  async finish() {
    const blocks = this.blockList.map(blockId => '<Latest>' + blockId + '</Latest>').join('');
    const body = `<?xml version="1.0" encoding="utf-8"?><BlockList>${blocks}</BlockList>`;
    const url = this.url + `&comp=blocklist`;
    const headers = { ...commonHeaders(), 'Content-Type': 'text/xml; charset=UTF-8' };
    await this.request({ method: 'PUT', headers, url, body });
    return this.size;
  }

  override abort(): void {} // FIXME: Azurite does not support blob upload interrupts?!

  override async getOffset(): Promise<number | undefined> {
    const url = this.url + `&comp=blocklist&blocklisttype=all`;
    const headers = commonHeaders();
    try {
      await this.request({ headers, url });
      const parser = new DOMParser();
      const xmlDoc = parser.parseFromString(this.response, 'text/xml');
      const blocks = xmlDoc
        .getElementsByTagName('UncommittedBlocks')[0]
        .getElementsByTagName('Block');
      const sizes = Array.from(blocks).map(
        el => +(el.getElementsByTagName('Size')[0]?.textContent ?? '0')
      );
      return sizes.reduce((acc, v) => acc + v, 0);
    } catch {}
    return this.offset || 0;
  }
}

function commonHeaders(apiVersion = '2022-11-02') {
  return {
    'x-ms-version': apiVersion,
    'x-ms-date': new Date().toISOString()
  };
}
akshaybogar1984 commented 2 months ago

@kukhariev I have tried files between 10 MB and 500 MB; the request gets stuck and doesn't process at all.

kukhariev commented 2 months ago

@ARosentiehl24 you can clone https://github.com/kukhariev/ngx-uploadx/tree/blob-exp, edit the signed URL of the container https://github.com/kukhariev/ngx-uploadx/blob/5d995ef2c96d65a292955ba435da94e3273b89df/src/app/service-code-way/service-code-way.component.ts#L7-L8, run npm run serve:dev,

and navigate to http://localhost:4200/service-code-way

kukhariev commented 1 month ago

Closing due to inactivity. Feel free to re-open if your issue isn't resolved.