I'm using Blazor SfUploader Server-Side and I'm having trouble with uploading large files when deployed on Azure.
Locally I don't have any problems, so it is most likely a setting or configuration on Azure, but I can't find which one.
When uploading I get this error:
syncfusion-blazor.min.js:1 POST https://myWebApp.azurewebsites.net/upload/chunk net::ERR_CONNECTION_RESET
Because I've enabled chunking and set a high RetryCount, the file is eventually uploaded, but it takes a long time.
This is my Blazor code:
<SfUploader ID="UploadFiles" AllowedExtensions=".xml" MaxFileSize="429496720" SequentialUpload="true" DropArea="#DropArea">
    <UploaderAsyncSettings SaveUrl="upload/chunk" ChunkSize="25000000" RetryCount="10" RetryAfterDelay="3000"></UploaderAsyncSettings>
</SfUploader>
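For reference, the only Azure-specific setting I've come across so far is the IIS request size limit: as I understand it, Azure App Service on Windows fronts the app with IIS, whose default maxAllowedContentLength is 30,000,000 bytes (~28.6 MB), so a 25 MB chunk plus multipart overhead could be close to that limit. This is the web.config override I've been considering (hypothetical, I haven't confirmed this is the right knob):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- Default is 30,000,000 bytes (~28.6 MB); raised to match MaxFileSize. -->
        <requestLimits maxAllowedContentLength="429496720" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```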
The controller is the one from https://blazor.syncfusion.com/documentation/file-upload/chunk-upload/ and https://www.syncfusion.com/forums/150303/upload-large-file-to-azure-storage-using-chunk-upload.
Here's my controller code:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

namespace FileUpload.Controllers
{
    [DisableRequestSizeLimit]
    public class UploadController : Controller
    {
        [HttpPost("upload/chunk")]
        public async Task SaveChunked(IList<IFormFile> chunkFile, IList<IFormFile> UploadFiles)
        {
            try
            {
                if ((chunkFile == null || !chunkFile.Any()) && UploadFiles != null && UploadFiles.Any())
                {
                    // File is smaller than the chunk size:
                    // upload it to blob storage directly.
                    return;
                }
                if (chunkFile == null) throw new ArgumentException("No data uploaded");

                foreach (var file in chunkFile)
                {
                    var httpPostedChunkFile = HttpContext.Request.Form.Files["chunkFile"];
                    var chunkIndex = HttpContext.Request.Form["chunk-index"];
                    var totalChunk = HttpContext.Request.Form["total-chunk"];
                    using (var fileStream = file.OpenReadStream())
                    {
                        if (Convert.ToInt32(chunkIndex) <= Convert.ToInt32(totalChunk))
                        {
                            // Buffer this chunk and append it to the bytes
                            // accumulated in session state so far.
                            var memoryStream = new MemoryStream();
                            fileStream.CopyTo(memoryStream);
                            var byteArr = memoryStream.ToArray();
                            var content = HttpContext.Session.Get("streamFile") != null
                                ? HttpContext.Session.Get("streamFile").Concat(byteArr).ToArray()
                                : byteArr;
                            HttpContext.Session.Set("streamFile", content);
                        }
                        if (Convert.ToInt32(chunkIndex) == Convert.ToInt32(totalChunk) - 1)
                        {
                            // Last chunk: write the accumulated bytes to disk.
                            var fileArray = HttpContext.Session.Get("streamFile");
                            using (var fileStreams = new FileStream(httpPostedChunkFile.FileName, FileMode.Create))
                            {
                                fileStreams.Write(fileArray, 0, fileArray.Length);
                                fileStreams.Seek(0, SeekOrigin.Begin);
                                HttpContext.Session.Remove("streamFile");
                                // Save file: await blockBlob.UploadFromStreamAsync(fileStreams);
                            }
                        }
                    }
                }
            }
            catch (Exception e)
            {
                Console.WriteLine(e.Message);
            }
        }
    }
}
But with large files, the controller isn't even triggered.
Smaller files are fine. Multiple small files are fine as well.
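Since the controller action is never hit for large requests, I suspect the request is rejected somewhere before it reaches MVC. In case it is relevant, these are the server-side limits I have tried raising in `Startup.ConfigureServices` (a sketch, the values are my guesses to match MaxFileSize, and I am not sure they affect the Azure front end at all):

```csharp
using Microsoft.AspNetCore.Http.Features;
using Microsoft.AspNetCore.Server.Kestrel.Core;

public void ConfigureServices(IServiceCollection services)
{
    // Raise Kestrel's per-request body limit (default is ~30 MB).
    services.Configure<KestrelServerOptions>(options =>
    {
        options.Limits.MaxRequestBodySize = 429496720; // match MaxFileSize
    });

    // Raise the multipart form limit used when binding IFormFile.
    services.Configure<FormOptions>(options =>
    {
        options.MultipartBodyLengthLimit = 429496720;
    });

    services.AddControllers();
    services.AddDistributedMemoryCache();
    services.AddSession(); // the controller accumulates chunks in session state
}
```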
Please assist.