
SfUploader - Working with Azure Storage Blobs

Hi there,

it is a very basic need in many business cases to upload documents to an Azure Storage Blob container in a target folder.

Please provide a tiny, working example of how to achieve that with your SfUploader Blazor component:

This is the desired functionality:
  • optional: if the target container/folder does not exist on Azure: create it, and in addition add an "outdated" subfolder
  • optional: if the files to be uploaded already exist on Azure: move the old versions as copies to the "outdated" folder
  • upload the files into the target folder
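
Roughly, I imagine the "outdated" step working something like this sketch (all names here are placeholders, and blob "folders" are just name prefixes):

```csharp
// Rough sketch only -- container/folder/file names are placeholders.
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

public static class OutdatedSketch
{
    public static async Task ArchiveOldVersionAsync(CloudBlobContainer container, string folder, string fileName)
    {
        await container.CreateIfNotExistsAsync();

        CloudBlockBlob target = container.GetBlockBlobReference($"{folder}/{fileName}");
        if (await target.ExistsAsync())
        {
            // Copy the old version into the "outdated" subfolder, then remove it.
            // (In production, wait for the copy to complete before deleting the source.)
            CloudBlockBlob outdated = container.GetBlockBlobReference($"{folder}/outdated/{fileName}");
            await outdated.StartCopyAsync(target);
            await target.DeleteIfExistsAsync();
        }
        // ...then the new file can be uploaded to $"{folder}/{fileName}"
    }
}
```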
Thank you for your help to get this started.

Cheers,
Volker

28 Replies

PM Ponmani Murugaiyan Syncfusion Team December 8, 2020 09:32 AM

Hi Volker, 

Greetings from Syncfusion support. 

We have prepared a sample as per your requirement, “SfUploader working with Azure Storage Blobs”. Please find the sample below for reference. 

namespace CoreHostedFileUpload.Server.Controllers  
{  
    [Route("api/[controller]")]  
    [ApiController]  
    public class SampleDataController : ControllerBase  
    {  
        public static byte[] content = new byte[] { };   
        [HttpPost("[action]")]  
        public async Task Save(IList<IFormFile> chunkFile, IList<IFormFile> UploadFiles)  
        {  
            try  
  
            {  
                const string accountName = "****"; // Provide the account name  
                const string key = "****"; // Provide the account key  
  
                var storageCredentials = new StorageCredentials(accountName, key);  
  
                var cloudStorageAccount = new CloudStorageAccount(storageCredentials, true);  
  
                var cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();  
  
                var isChunkFile = false;  
                
                // ... chunk-file handling omitted here ...  
 
                if(UploadFiles != null && !isChunkFile)  
                {  
                    foreach (var file in UploadFiles)  
                    {  
  
                        var container = cloudBlobClient.GetContainerReference("filo");  
                        await container.CreateIfNotExistsAsync();  
                        await container.SetPermissionsAsync(new BlobContainerPermissions()  
                        {  
                            PublicAccess = BlobContainerPublicAccessType.Blob  
                        });  
  
                        var httpPostedFile = HttpContext.Request.Form.Files["UploadFiles"];  
  
                        var blob = container.GetBlockBlobReference(httpPostedFile.FileName);  
                        using (var stream = file.OpenReadStream())  
                        {  
                            await blob.UploadFromStreamAsync(stream);  
                        }  
  
                    }  
                }  
  
            }  
  
            catch (Exception e)  
  
            {  
                content = new byte[] { };  
                Response.Clear();  
                Response.StatusCode = 204;  
                Response.HttpContext.Features.Get<IHttpResponseFeature>().ReasonPhrase = "File failed to upload: " + e.Message;  
  
            }  
        }  
    }  
}  



Kindly check the above sample against your requirement. Please get back to us if you need further assistance. 

Regards, 
Ponmani M 



VO Volker December 9, 2020 06:35 AM

Hi Ponmani,

thank you, but I need a "Blazor Server" version, not a WebAssembly one; please transform it.


Basically, I need just something straight forward like this:


@using System.IO
 
@{
    var path = "/uploads/testfolder/";
 
    <SfUploader AutoUpload="true">
        <UploaderEvents ValueChange="@(e => OnChangeUploadFiles(e, path))"></UploaderEvents>
    </SfUploader>
}
 
@code {
 
    private void OnChangeUploadFiles(Syncfusion.Blazor.Inputs.UploadChangeEventArgs args, string path)
    {
        foreach (var file in args.Files)
        {
            string qualifiedfilename = @"wwwroot" + path + file.FileInfo.Name.ToLower();
 
            FileStream filestream = new FileStream(qualifiedfilename, FileMode.Create, FileAccess.Write);
            file.Stream.WriteTo(filestream);
            filestream.Close();
            file.Stream.Close();
        }
    }
}
Cheers,
Volker


PM Ponmani Murugaiyan Syncfusion Team December 20, 2020 06:46 PM

Hi Volker, 

Sorry for the inconvenience caused. 

We are currently preparing a sample for your requirement. We will update you within 2 business days. 

Regards, 
Ponmani M 



VO Volker February 12, 2021 11:51 PM

Hi Ponmani,

any news on that topic?
Still waiting...

Cheers,
Volker


PM Ponmani Murugaiyan Syncfusion Team February 15, 2021 04:47 AM

Hi Volker, 

Thanks for the patience. 

We have prepared a sample as per your requirement using server-side Blazor. Please find the sample below for reference. 

 
Kindly check the above sample. Please get back to us if you need further assistance. 

Regards, 
Ponmani M 



VO Volker February 19, 2021 10:56 PM

Hi Ponmani,

thank you!

I hope you can help me, because I need some modifications to your solution that, in principle, every business case will need too:

1) Please replace the outdated WindowsAzure.Storage with the recommended Microsoft.Azure.Storage.Blob and update your solution.
Simply swapping the NuGet packages doesn't work.



2) We must be able to let the SfUploader know where to upload the files dynamically within the given Azure Blob Container (e.g. let's assume each customer has its own target folder). So we need to tell the API an individual folder name (on the fly from the frontend, not hardcoded in the API controller) when uploading; please show us how to do that within the SfUploader component.

3) Afterwards, please return an "OK" from the API endpoint after finishing Save, so that we can trigger some automated cleanup routines after an upload on Index.razor has successfully finished. I will use this OK to refresh the blob-container listing on the client side, so that the newly uploaded items are displayed immediately:



4) In addition please provide an example of your solution that doesn't use an API-endpoint but gets executed directly in a component, like this:
https://blazor.syncfusion.com/documentation/file-upload/getting-started/#without-server-side-api-endpoint 

This would make things (points 2 and 3) less complicated for storing single files on Azure Blob Storage:



Cheers,
Volker


SP Sureshkumar P Syncfusion Team February 24, 2021 02:58 AM

Hi Volker, 
 
Thanks for your update. 
 
Query 1: Please replace the outdated WindowsAzure.Storage by the recommended Microsoft.Azure.Storage.Blob and update your solution. Simply swapping the Nuget packages doesn't work. 
 
Answer:  
              The Azure SDK is migrating to Azure.* as documented (https://docs.microsoft.com/en-us/dotnet/api/overview/azure/storage). Not all functionality of the SDK has been migrated yet, so you need to look at what you're using to see if you can switch to the new library. 
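 
As a rough sketch (not part of the attached sample), the equivalent upload in the newer Azure.Storage.Blobs (v12) package could look like this; the connection string and container name are placeholders: 

```csharp
// Sketch only -- connection string and container name are placeholders.
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class BlobUploadSketch
{
    public static async Task UploadAsync(Stream fileStream, string blobName)
    {
        var service = new BlobServiceClient("<your-connection-string>");
        BlobContainerClient container = service.GetBlobContainerClient("filo");
        await container.CreateIfNotExistsAsync();

        // Upload the stream, overwriting any existing blob with the same name.
        BlobClient blob = container.GetBlobClient(blobName);
        await blob.UploadAsync(fileStream, overwrite: true);
    }
}
```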
 
 
Query 2: We must be able to let the SfUploader know where to upload the files dynamically within the given Azure Blob Container (e.g. lets asume each customer has it's own target folder). So we need to tell the API an individual folder name (on the fly from the frontend, not hardcoded in the API controller) when uploading, please show us how to do that within SfUploader component. 
 
Answer: 
              We do not have a property to pass the container name to the server, but you can pass a custom parameter in the FileSelected event. 
 
Please find the code example here: 
 
<SfUploader ID="UploadFiles" AutoUpload="true"> 
    <UploaderAsyncSettings SaveUrl="api/SampleData/Save"></UploaderAsyncSettings> 
    <UploaderEvents FileSelected="@Selected"></UploaderEvents> 
</SfUploader> 
 
@code{ 
 
    void Selected(SelectedEventArgs args) 
    { 
        args.CurrentRequest = new List<object> { new { container = "containerName" } }; 
    } 
 
} 
 
Please find the screen shot here: 
 
 
Query 3: Afterwards please return an "OK" from the API-endpoint after finishing Save, so that we can trigger some automated cleanup-routines after an upload on Index.razor was succesfully finished. I will use this OK to refresh the blob-container listing on client-side, so that the newly uploaded items are displayed immediately: 
 
Answer: 
              We already have a Success event in our Uploader component. In this event, the status text is returned for each file upload.  
 
Please find the screen shot here: 
 
 
Query 4: In addition please provide an example of your solution that doesn't use an API-endpoint but gets executed directly in a component, like this: 
 
Answer:  
              We have modified the code example to work without an API endpoint. Please refer to the code below. 
<SfUploader ID="UploadFiles" AutoUpload="true"> 
    <UploaderEvents ValueChange="changeEvent"></UploaderEvents> 
</SfUploader> 
 
@code { 
    public async Task changeEvent(UploadChangeEventArgs args) 
    { 
        foreach (var file in args.Files) 
        { 
            const string accountName = "***"; // Provide your accountName 
            const string key = "***"; // Provide your account key 
            var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, key), true); 
            var blobClient = storageAccount.CreateCloudBlobClient(); 
            var container = blobClient.GetContainerReference("filo"); // Provide your container name 
            await container.CreateIfNotExistsAsync(); 
            await container.SetPermissionsAsync(new BlobContainerPermissions() 
            { 
                PublicAccess = BlobContainerPublicAccessType.Blob 
            }); 
            var blob = container.GetBlockBlobReference(file.FileInfo.Name); 
            await blob.UploadFromStreamAsync(file.Stream); 
        } 
    } 
} 
 
 
 
 
We have modified the previously attached sample. Please find the sample here: https://www.syncfusion.com/downloads/support/forum/160405/ze/AzureBlobStorage-4167287031773856045  
 
Regards, 
Sureshkumar P 



VO Volker February 25, 2021 10:37 PM

Hi Sureshkumar,

thank you for all your help.

Everything works perfectly now!

Cheers from Graz/Austria,
Volker


PM Ponmani Murugaiyan Syncfusion Team February 25, 2021 10:51 PM

Hi Volker, 

Thanks for the update. 

We are glad to know that the provided solution helps you in achieving your requirement. Please get back us if you need further assistance. 

Regards, 
Ponmani M 



MP Megha Patel May 26, 2021 02:20 AM

I don't need the file saved into a folder; I want to upload directly to Azure.

Can you provide me a solution?


PM Ponmani Murugaiyan Syncfusion Team May 27, 2021 06:08 AM

Hi Megha, 

Thanks for the update. 

We are quite unclear about the reported query. We request you to elaborate on your requirement and share details, like where you would like to save the uploaded files in Azure; the requested details will help us check and provide you the solution at the earliest. 

Regards, 
Ponmani M 



MP Megha Patel May 29, 2021 01:27 AM

[HttpPost("[action]")]
        public async Task Save(IList<IFormFile> chunkFile, IList<IFormFile> UploadFiles)
        {
            try

            {
                const string accountName = "****"; // Provide the account name
                const string key = "****"; // Provide the account key

                var storageCredentials = new StorageCredentials(accountName, key);

                var cloudStorageAccount = new CloudStorageAccount(storageCredentials, true);

                var cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();

                var isChunkFile = false;
              

                foreach (var file in chunkFile)

                {

                    isChunkFile = true;

                    var httpPostedChunkFile = HttpContext.Request.Form.Files["chunkFile"];

                    var chunkIndex = HttpContext.Request.Form["chunk-index"];

                    var totalChunk = HttpContext.Request.Form["total-chunk"];

                    using (var fileStream = file.OpenReadStream())

                    {

                        if (Convert.ToInt32(chunkIndex) <= Convert.ToInt32(totalChunk))

                        {

                            var streamReader = new MemoryStream();

                            fileStream.CopyTo(streamReader);

                            var byteArr = streamReader.ToArray();

                            if(content.Length > 0)
                            {
                                content =  content.Concat(byteArr).ToArray();
                            }
                            else
                            {
                                content = byteArr;
                            }

                        }

                        if (Convert.ToInt32(chunkIndex) == Convert.ToInt32(totalChunk) - 1)

                        {
                            

                            var container = cloudBlobClient.GetContainerReference("filo");

                            CloudBlockBlob blockBlob = container.GetBlockBlobReference(httpPostedChunkFile.FileName);

                          
                            using (FileStream fileStreams = new FileStream(httpPostedChunkFile.FileName, FileMode.Create))
                            {
                                for (int i = 0; i < content.Length; i++)
                                {

                                    fileStreams.WriteByte(content[i]);

                                }

                                fileStreams.Seek(0, SeekOrigin.Begin);

                                content = new byte[] { };

                                await blockBlob.UploadFromStreamAsync(fileStreams);

                            }

                        }

                    }

                }

                if(UploadFiles != null && !isChunkFile)
                {
                    foreach (var file in UploadFiles)
                    {

                        var container = cloudBlobClient.GetContainerReference("filo");
                        await container.CreateIfNotExistsAsync();
                        await container.SetPermissionsAsync(new BlobContainerPermissions()
                        {
                            PublicAccess = BlobContainerPublicAccessType.Blob
                        });

                        var httpPostedFile = HttpContext.Request.Form.Files["UploadFiles"];

                        var blob = container.GetBlockBlobReference(httpPostedFile.FileName);
                        using (var stream = file.OpenReadStream())
                        {
                            await blob.UploadFromStreamAsync(stream);
                        }

                    }
                }

            }

            catch (Exception e)

            {
                content = new byte[] { };
                Response.Clear();
                Response.StatusCode = 204;
                Response.HttpContext.Features.Get<IHttpResponseFeature>().ReasonPhrase = "File failed to upload: " + e.Message;

            }
        }
    }
}

In this code, the file is first uploaded into the project folder and then uploaded to Azure. I want to upload directly to Azure, without needing to save it into a folder.


PM Ponmani Murugaiyan Syncfusion Team May 31, 2021 08:48 AM

Hi Megha, 

Thanks for the update. 

Solution 1: If you use the Uploader without AsyncSettings, you can directly save the file to Azure without saving it into a project folder.  

<SfUploader ID="UploadFiles" AutoUpload="true">  
    <UploaderEvents ValueChange="changeEvent"></UploaderEvents>  
</SfUploader>  
  
@code {  
    public async Task changeEvent(UploadChangeEventArgs args)  
    {  
        foreach (var file in args.Files)  
        {  
            const string accountName = "***"; // Provide your accountName  
            const string key = "***"; // Provide your account key  
            var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, key), true);  
            var blobClient = storageAccount.CreateCloudBlobClient();  
            var container = blobClient.GetContainerReference("filo"); // Provide your container name  
            await container.CreateIfNotExistsAsync();  
            await container.SetPermissionsAsync(new BlobContainerPermissions()  
            {  
                PublicAccess = BlobContainerPublicAccessType.Blob  
            });  
            var blob = container.GetBlockBlobReference(file.FileInfo.Name);  
            await blob.UploadFromStreamAsync(file.Stream);  
        }  
    }  
}  


Solution 2: If you render the Uploader with AsyncSettings, you can directly upload the file to Azure programmatically in the save handler. 

public async Task Save(IList<IFormFile> UploadFiles) 
        { 
            try 
            { 
                foreach (var file in UploadFiles) 
                { 
 
                    const string accountName = "***"; // Provide your accountName 
                    const string key = "***"; // Provide your account key 
 
                    var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, key), true); 
 
                    var blobClient = storageAccount.CreateCloudBlobClient(); 
                    var container = blobClient.GetContainerReference("filo"); // Provide your container name 
                    await container.CreateIfNotExistsAsync(); 
                    await container.SetPermissionsAsync(new BlobContainerPermissions() 
                    { 
                        PublicAccess = BlobContainerPublicAccessType.Blob 
                    }); 
 
                    var blob = container.GetBlockBlobReference(file.FileName); 
                    using (var stream = file.OpenReadStream()) 
                    { 
                        await blob.UploadFromStreamAsync(stream); 
                    } 
 
                } 
            } 


Regards, 
Ponmani M   



MP Megha Patel June 1, 2021 01:46 AM

But  I want to upload the file in chunks and also calculate the hash for that file.


JM Jeyanth Muthu Pratheeban Sankara Subramanian Syncfusion Team June 2, 2021 04:57 AM

Hi Megha, 

Thanks for your update. 

We checked the reported query. In order to upload the file in chunks, we suggest you set ChunkSize in the AsyncSettings. Also, Azure will not merge the chunks into a single file; therefore, we need to merge the chunks manually and then upload to Azure. We have made a sample for your convenience. Please find the sample in the below link.

 
<SfUploader AutoUpload="true" Multiple="false" ID="UploadFiles" MaxFileSize="6000000000"> 
    <UploaderAsyncSettings SaveUrl="api/SampleData/Save" ChunkSize="10000000"></UploaderAsyncSettings> 
</SfUploader> 


Please refer to the below Stack Overflow solution to calculate the hash for the file.

Link 1    : https://stackoverflow.com/a/55492604/9133493 
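
For reference, a minimal sketch of hashing the merged chunk content with `System.Security.Cryptography` (this is independent of the linked answer, and the helper name is made up):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

public static class FileHasher
{
    // Returns the SHA-256 hash of a stream as a lowercase hex string.
    // Remember to reset stream.Position before uploading the same stream afterwards.
    public static string ComputeSha256(Stream stream)
    {
        using (var sha256 = SHA256.Create())
        {
            byte[] hash = sha256.ComputeHash(stream);
            return BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();
        }
    }
}
```

The resulting hex string could, for example, be stored as blob metadata alongside the upload.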



Kindly integrate the provided solution with your application and get back to us if you need any further assistance on this. 

Regards, 
Jeyanth. 



MP Megha Patel June 3, 2021 10:35 PM

But I want to upload files directly to Azure. The code given by you first uploads into the project folder and then uploads to Azure.


PM Ponmani Murugaiyan Syncfusion Team June 7, 2021 08:36 AM

Hi Megha, 

Thanks for the update. 

We would like to let you know that the files are saved in the project folder only if we have provided a local path (save location) in the save handler, as in the below code snippet. 

foreach (var file in chunkFile) 
{ 
  var filename = ContentDispositionHeaderValue 
                          .Parse(file.ContentDisposition) 
                          .FileName 
                          .Trim('"'); 
  filename = hostingEnv.ContentRootPath + $@"\{filename}"; 

But, in our previously provided sample, we didn't provide any local path to save the file in the project folder. We have directly uploaded the file to Azure by creating a folder "filo" in the blob storage. 

 
var container = cloudBlobClient.GetContainerReference("filo"); 
 
CloudBlockBlob blockBlob = container.GetBlockBlobReference(httpPostedChunkFile.FileName); 

Regards, 
Ponmani M 



MP Megha Patel June 9, 2021 11:43 PM

Can you please check again?

using (FileStream fileStreams = new FileStream(chunkFile.FileName, FileMode.Create))
                            {
                                for (int i = 0; i < content.Length; i++)
                                {
                                    fileStreams.WriteByte(content[i]);
                                }
                                fileStreams.Seek(0, SeekOrigin.Begin);
                                await blockBlob.UploadFromStreamAsync(fileStreams);
                            



PM Ponmani Murugaiyan Syncfusion Team June 11, 2021 06:09 AM

Hi Megha, 

Thanks for the update. 

Based on your provided code snippet, the below line will store the file in Azure as per your requirement. If, in your application, you are still getting the file stored in the local path, please share a video demonstration with a sample, which will help us check and provide you the solution at the earliest.  

await blockBlob.UploadFromStreamAsync(fileStreams); 

Regards, 
Ponmani M 



VO Volker replied to Sureshkumar P September 21, 2021 08:44 AM

Hi Sureshkumar ,

another problem has arisen with your solution; please have a look at your answer from February 24, 2021 11:58 AM.

Concerning question 3 :


The problem with this solution is that it always claims SUCCESS (args.StatusText "file uploaded successfully"), even if we throw a manual exception in the controller to indicate something went wrong in the business logic (e.g. trying to overwrite an existing file, or the maximum allowed files per folder being reached):


How can we catch the individual exception message of the controller (backend) in the Blazor component (SfUploader, frontend)?

Cheers,
Volker



DR Deepak Ramakrishnan Syncfusion Team September 22, 2021 09:28 AM

Hi Volker, 
 
Thanks for the update. 
 
We are currently working on your requirement. We will update you on the further possibilities on or before 24th September 2021. We appreciate your patience until then. 
 
Thanks, 
Deepak R. 
 



DR Deepak Ramakrishnan Syncfusion Team September 27, 2021 10:18 AM

Hi Volker, 
 
Thanks for patience. 
 
We have modified the sample as per your requirement. Also, we suggest you use the catch part as in the below highlighted code (set the status code to 400). 
 
catch (Exception e) 
            { 
                Response.Clear(); 
                Response.StatusCode = 400; 
                Response.HttpContext.Features.Get<IHttpResponseFeature>().ReasonPhrase = "File failed to upload: " + e.Message; 
            } 
 
 
 
Thanks, 
Deepak R. 



VO Volker October 4, 2021 11:00 PM

Hi Deepak,

thank you.

But how can we add some custom details, visible in the Blazor component (e.g. a short message or a custom error number), about WHAT caused the error in the controller, not just THAT an error has happened?


e.g. only 3 documents allowed per folder (checked in the business logic of our Data Access Layer before uploading, not checked in the Blazor component, so we have to give the component a hint about what went wrong)


How can we talk from the controller to the component (unidirectional)?

Cheers,
Volker



DR Deepak Ramakrishnan Syncfusion Team October 5, 2021 05:50 AM

Hi Volker, 
 
Thanks for your update. 
 
We are currently trying to send customized response text from the server to the client. We will update you on the possibilities in two business days (7th October 2021). 
 
Thanks, 
Deepak R. 



DR Deepak Ramakrishnan Syncfusion Team October 11, 2021 10:16 AM

Hi Volker, 
 
Thanks for the patience. 
 
We can send headers along with the response, and they can be received at the client end. In the below sample, we have added a header with the required message in the catch part (it is needed when the maximum number of files is reached at the server path). Kindly refer to the below sample and code snippet for your reference. 
[Controller] (Save method) 
catch (Exception e) 
            { 
                Response.Clear(); 
                Response.StatusCode = 400; 
                Response.HttpContext.Features.Get<IHttpResponseFeature>().ReasonPhrase = "File failed to upload"; 
                Response.HttpContext.Features.Get<IHttpResponseFeature>().ReasonPhrase = e.Message; 
                Response.Headers.Add("ID", "Maximum Uploaded files reached"); 
            } 
 
 
 
 
 [Index.razor] 
<SfUploader ID="UploadFiles" AutoUpload="true"> 
    <UploaderAsyncSettings SaveUrl="api/SampleData/Save"></UploaderAsyncSettings> 
    <UploaderEvents FileSelected="@Selected" Success="@FileSuccess" OnFailure="@FailureHandler"></UploaderEvents> 
</SfUploader> 
 
<p>key value is: @key</p> 
<p>pair value is: @value</p> 
 
@code{ 
    public string key { get; set; } = ""; 
    public string value { get; set; } = ""; 
 
 
 
    void Selected(SelectedEventArgs args) 
    { 
        args.CurrentRequest = new List<object> { new { container = "containerName" } }; 
    } 
 
    public void FileSuccess(SuccessEventArgs args) 
    { 
 
    } 
 
    public void FailureHandler(FailureEventArgs args) 
    { 
        var customHeader = new string[] { }; 
        customHeader = args.Response.Headers.Split(new Char[] { '\n' }); // To split the response header values 
        for (var i = 0; i < customHeader.Length; i++) 
        { 
            if (customHeader[i].Split(new Char[] { ':' })[0] == "id") 
            { 
                key = customHeader[i].Split(new Char[] { ':' })[0]; // To get the key pair of provided custom data in header 
                value = customHeader[i].Split(new Char[] { ':' })[1].Trim(); // To get the value for the key pair of provided custom data in header 
            } 
        } 
    } 
 
} 
 
 
 
On the client end, we have used the OnFailure event to assign the values to the respective header variables, and the event will be triggered once the file fails to upload on the server. 
 
 
Thanks, 
Deepak R. 



PE Peter October 14, 2021 05:55 AM

Hello Syncfusion people,


I'm trying to upload big files to my blob storage for a while and still I don't see any working solution from Syncfusion.


Yes, the chunk upload works, but the crazy thing is that that method uploads the file to the root of the server, then copies it to the blob storage, and leaves the uploaded file behind in the root of the server.


I guess this is what nobody wants.


Please advise.


Best,

Pete




DR Deepak Ramakrishnan Syncfusion Team October 15, 2021 07:38 AM

Hi Volker, 
 
Greetings from Syncfusion support. 
 
We are currently checking the feasibility to implement the requirement in our end . We will update the details in two business days (19th,October 2021). We appreciate your patience until then. 
 
Thanks, 
Deepak R. 
 



DR Deepak Ramakrishnan Syncfusion Team October 19, 2021 07:12 AM

Hi Volker, 
Thanks for your patience. 
Query 1: Yes, the chunk upload works, but the crazy thing is that that method uploads the file to the root of the server, then copies it to the blob storage, and leaves the uploaded file behind in the root of the server. 
Based upon our case study, the mentioned behavior is the suggested way to perform cloud uploading actions. It is also the recommended way suggested by the Microsoft team. Kindly refer to the below link for your reference. 
 
But we can delete the files on the server once the file has been uploaded to the Azure cloud. Please find the below highlighted code and sample for your reference. 
public async Task Save(IList<IFormFile> UploadFiles) 
        { 
 
            var ContainerName = Response.HttpContext.Request.Headers["container"].ToString(); 
            try 
            { 
                if (false) // placeholder: replace with your own business-logic check (e.g. max files per folder reached) 
                { 
                    throw new InvalidOperationException("Max allowed files error"); 
                } 
                else 
                { 
                    foreach (var file in UploadFiles) 
                    { 
 
                        const string accountName = "***"; // Provide your accountName 
                        const string key = "***"; // Provide your account key 
 
                        var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, key), true); 
 
                        var blobClient = storageAccount.CreateCloudBlobClient(); 
                        var container = blobClient.GetContainerReference("filo"); // Provide your container name 
                        await container.CreateIfNotExistsAsync(); 
                        await container.SetPermissionsAsync(new BlobContainerPermissions() 
                        { 
                            PublicAccess = BlobContainerPublicAccessType.Blob 
                        }); 
 
                        var blob = container.GetBlockBlobReference(file.FileName); 
                        using (var stream = file.OpenReadStream()) 
                        { 
                            await blob.UploadFromStreamAsync(stream); 
                        } 
 
                        var filename = hostingEnv.ContentRootPath + $@"\{file.FileName}"; 
                        if (System.IO.File.Exists(filename)) 
                        { 
                            System.IO.File.Delete(filename); 
                        } 
 
 
                    } 
                } 
            } 
The above highlighted code part will delete the file on the server once it has been uploaded to the cloud. Please find the below link for a sample demonstration. 
 
Thanks, 
Deepak R. 

