// Define the root directory for the File Manager.
// ListingObjectsAsync populates the `response` listing for the bucket;
// the first key returned becomes the File Manager root.
public void GetBucketList() {
    ListingObjectsAsync("", "", false).Wait();
    RootName = response.S3Objects.First().Key;
}
Thank you for your prompt response to my question.
Based on your response, I was able to reach a working solution with a bit of further experimentation, as follows:
NOTE: Some screen captures are included to make this a little clearer than my attempt at explaining this.
First Try (failed):
1) Using AWS S3 Control Panel, deleted all folders in the S3 Bucket
2) Using AWS S3 Control Panel, added one folder at the root level.
3) Then viewed this using the Angular File Manager with the .NET Core S3 file provider and got a view showing the folder added at the root level. However, the view also showed a sub-folder below the root folder with no name. I tried to rename that folder to give it a name but received an error; I then tried to delete it, but received an error there as well, so could not delete it either.
4) Tried adding a new folder and this worked, so I now had the newly added folder with a name, as well as the unnamed folder under the root.
5) Went back to the AWS S3 Control Panel and found that the root-level folder (actually a key in S3) shows a folder within itself with the same name, which is weird and doesn't make sense.
6) So basically everything works here, but I can't get rid of the strange folder with no name.
Second Try (failed):
1) Using AWS S3 Control Panel, again deleted all folders in the S3 Bucket.
2) Then tried adding the root folder from the Angular File Manager with the .NET Core S3 file provider, but received an error when attempting this, so it appears that I cannot add the root folder using the Angular File Manager.
Third Try (success):
1) Added Root Folder using AWS S3 Control Panel
2) Added one sub-folder under the root folder using the AWS S3 Control Panel. So now have the root plus a child folder (actually keys in S3).
3) Back in the Angular File Manager with the .NET Core S3 file provider, I now get a correct view: a root folder with one sub-folder.
4) Found that adding/deleting folders now works correctly, and file upload also worked on the first try.
So it looks like to get this working initially, I needed to:
1) Add one and only one folder (per your earlier suggestion) at the root level, using an AWS tool such as the S3 Control Panel.
2) Add at least one sub-folder below the root, also using an AWS tool (the S3 Control Panel in my case).
Then everything proceeds as expected, so I'm going to proceed from here with this.
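The layout the steps above arrive at can be expressed as a quick sanity check over the bucket's key list. This is a sketch in Python for illustration only (the helper is hypothetical, not part of the Syncfusion provider): exactly one folder marker at the top level, with at least one folder marker beneath it.

```python
def validate_bucket_layout(keys):
    """Check the layout the File Manager appears to need:
    exactly one folder marker ("Name/") at the top level,
    and at least one marker for a sub-folder beneath it."""
    # Top-level folder markers are keys ending in "/" with a single "/".
    roots = [k for k in keys if k.endswith("/") and k.count("/") == 1]
    if len(roots) != 1:
        return False
    root = roots[0]
    # At least one sub-folder marker must live under the root.
    subfolders = [k for k in keys
                  if k.endswith("/") and k != root and k.startswith(root)]
    return len(subfolders) >= 1

# The "Third Try" layout (root plus one sub-folder) passes the check:
print(validate_bucket_layout(["Files/", "Files/Documents/"]))  # True
# A bucket with only a root marker, as in the failed attempts, does not:
print(validate_bucket_layout(["Files/"]))                      # False
```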
As mentioned earlier, this is a great Control!! Should be very useful.
I have been running into similar problems while setting up my S3 bucket for the first time. It seems like some essential detail is missing from the documentation. My discoveries are below, which I would have loved to read in the docs.
I appreciate that this component and the dotnet service are so well developed, and I'm really excited to start using it in my applications. But the nit-picky headaches with S3 setup have been frustrating and unnecessary; some modest documentation would go a long way. Thank you!
1) Structure the bucket with exactly one folder at the top level. (Or if you have multiple folders at the bucket root, the first alphanumeric item is chosen.)
2) The folder must be created with the S3 Console "Create Folder" command, which explicitly creates a 0b object by the name of the directory. This hung me up for hours. I was trying to start with existing bucket content, and it's not enough to have objects with Keys that start with "Files/"; I had to explicitly add an extra, empty object with Key "Files/".
3) The most recent documentation I could find related to setting up the bucket for the S3 File Provider appears to be a blog post (not ideal), and the post is missing the crucial details above. It also specifies to make the bucket public, which I assume is not what most users want. A public bucket shouldn't be necessary, because the S3 File Provider has Secret Key credentials to access the bucket.
4) All subfolders must also be defined with a 0b object in S3 where the Key name is the full path of the folder. Without these markers, browsing the directory structure with the S3FileProvider will be buggy. (This could be fixed with a feature enhancement. Is this the sort of thing I should file a feature request for?)
5) The S3FileManager service, as currently coded, relies on folder objects (described in #2 & #4 above) in order to parse the directory structure. If you're doing file manipulation with CLI tools such as `aws s3 cp` or `aws s3 sync` to recursively traverse directories to copy files, you will lose the directory-marker objects. More effort is required to make things work with FileManager if you're doing manual S3 file manipulations. For example, even a backup `aws s3 sync ` and immediate restore `aws s3 sync ` will not leave the bucket in a functional state.
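The backup-and-restore failure in point 5 can be illustrated without touching AWS at all. The sketch below is a simplified model of the round trip, not the actual `aws s3 sync` implementation: syncing down to a local filesystem can only materialize keys that name files, so zero-byte "Folder/" marker objects are dropped, and syncing back up re-creates only the file keys.

```python
def sync_down(keys):
    """Model of downloading a bucket to a local directory:
    a key ending in "/" has no file content to write locally,
    so only keys that name actual files survive."""
    return [k for k in keys if not k.endswith("/")]

def sync_up(local_files):
    """Model of uploading the local tree back: one object per file,
    with no way to recover the dropped folder markers."""
    return list(local_files)

bucket = ["Files/", "Files/Docs/", "Files/Docs/report.pdf"]
restored = sync_up(sync_down(bucket))
print(restored)  # only the file key remains; both folder markers are gone
```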
Hi Matthew,
Please find the solution for each of your queries below.
1. Query: Structure the bucket with exactly one folder at the top level. (Or, if you have multiple folders at the bucket root, the first alphanumeric item is chosen.)

Solution: This relates to a configuration detail of our Amazon S3 file service provider. In its current implementation, the provider takes the first available folder in the bucket as the root. However, you can set a specific bucket folder as the FileManager root folder by modifying the GetBucketList method in the AmazonS3FileProvider.cs file.

Refer to the following code:
GitHub location: https://github.com/SyncfusionExamples/amazon-s3-aspcore-file-provider/blob/bc78bb411198cd1c2bfdd6b2af88d149005adb93/Models/AmazonS3FileProvider.cs#L42
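The default behaviour and the suggested override can be sketched as follows. This is Python for illustration only; the actual change goes in the C# GetBucketList method at the GitHub location above, and `pick_root` is a hypothetical helper, not provider code.

```python
def pick_root(keys, fixed_root=None):
    """Model of the provider's root selection: by default the first key
    in the bucket listing wins; passing fixed_root pins the FileManager
    to a specific bucket folder instead."""
    if fixed_root is not None:
        return fixed_root
    return keys[0]

print(pick_root(["Archive/", "Files/"]))            # first listed folder wins
print(pick_root(["Archive/", "Files/"], "Files/"))  # pinned to a chosen root
```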
2. Query: The folder must be created with the S3 Console "Create Folder" command, which explicitly creates a 0b object by the name of the directory. This hung me up for hours. I was trying to start with existing bucket content, and it's not enough to have objects with Keys that start with "Files/"; I had to explicitly add an extra, empty object with Key "Files/".

Solution: Creating a folder in an Amazon S3 bucket creates a zero-byte object. This is the standard behaviour of folder creation in S3, much as creating a folder in Windows Explorer produces an empty directory. Also, in the current implementation, the FileManager component takes a single folder as its root folder, but you do not need to explicitly add an extra empty object with Key "Files/": you can set your own folder as the FileManager root using the code details mentioned in point 1.
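Why the provider depends on those zero-byte markers can be seen by mimicking how an S3-style delimiter listing works. The function below is a simplified model of ListObjects, not the AWS SDK: sub-folders containing files still surface as common prefixes of the file keys, but an empty folder with no marker object is invisible to the listing.

```python
def list_folder(keys, prefix):
    """Simplified model of an S3 delimiter listing under `prefix`:
    returns (files, subfolders) one level deep."""
    files, folders = [], set()
    for key in keys:
        if not key.startswith(prefix) or key == prefix:
            continue
        rest = key[len(prefix):]
        if "/" in rest:
            # Anything with a deeper "/" contributes a common prefix.
            folders.add(prefix + rest.split("/", 1)[0] + "/")
        else:
            files.append(key)
    return files, sorted(folders)

# With a marker object, the empty folder "Files/Empty/" is visible:
print(list_folder(["Files/", "Files/Empty/", "Files/a.txt"], "Files/"))
# Without the marker, nothing in the listing reveals the folder exists:
print(list_folder(["Files/", "Files/a.txt"], "Files/"))
```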
3. Query: The most recent documentation I could find related to setting up the bucket for the S3 File Provider appears to be a blog post (not ideal), and the post is missing the crucial details above. It also specifies to make the bucket public, which I assume is not what most users want. A public bucket shouldn't be necessary, because the S3 File Provider has Secret Key credentials to access the bucket.

Solution: From the shared details, you are following our Syncfusion blog post to create the Amazon bucket and are facing an issue with the steps mentioned there. In that post we set the Amazon bucket as public on our side, but it is not necessary for you to make your bucket public; we did so only for the purposes of that example. We have also taken this correction as an improvement on our side and will refresh the blog post as soon as possible.
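Since the provider authenticates with access-key credentials, the IAM user behind those keys only needs S3 permissions on the one bucket rather than a public bucket. A minimal sketch of such a policy follows; the bucket name is a placeholder, and the exact set of actions is an assumption that depends on which FileManager operations you enable.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::YOUR-BUCKET"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::YOUR-BUCKET/*"
    }
  ]
}
```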
4. Query: All subfolders must also be defined with a 0b object in S3 where the Key name is the full path of the folder. Without these markers, browsing the directory structure with the S3FileProvider will be buggy. (This could be fixed with a feature enhancement. Is this the sort of thing I should file a feature request for?)

Solution: We tried browsing the directory structure with the S3FileProvider using the full folder path as the key, but unfortunately we are unable to replicate the reported issue on our side. Please share the exact steps to replicate the issue, or a video demonstrating it; this will help us investigate the reported issue on our side.
5. Query: The S3FileManager service, as currently coded, relies on folder objects (described in #2 & #4 above) in order to parse the directory structure. If you're doing file manipulation with CLI tools such as `aws s3 cp` or `aws s3 sync` to recursively traverse directories to copy files, you will lose the directory-marker objects. More effort is required to make things work with FileManager if you're doing manual S3 file manipulations. For example, even a backup `aws s3 sync ` and immediate restore `aws s3 sync ` will not leave the bucket in a functional state.

Solution: From these details, you are facing an issue with copying folders inside the Amazon S3 bucket using the mentioned CLI commands. Our FileManager component is used to browse, manage, and organize the files and folders in a file system through a web application. All basic file operations, such as creating a new folder, uploading and copying files, and deleting and renaming existing files and folders, are available in the FileManager component. So, we suggest performing these operations through the FileManager itself. We have shared documentation which explains the FileManager file operations for your reference.
Documentation: https://ej2.syncfusion.com/aspnetcore/documentation/file-manager/file-operations
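If the marker objects have already been lost (for example after the CLI round trip described in query 5), they can be reconstructed from the file keys themselves. The helper below is a hypothetical repair sketch, not part of the provider: it computes the "Folder/" marker keys implied by the file keys but absent from the bucket, which could then be re-created as zero-byte objects.

```python
def missing_folder_markers(keys):
    """Return the folder-marker keys implied by the file keys
    but not present in the bucket, sorted, so they can be
    re-created (e.g. as zero-byte objects)."""
    present = set(keys)
    needed = set()
    for key in keys:
        parts = key.split("/")[:-1]  # every ancestor folder of this key
        for i in range(1, len(parts) + 1):
            needed.add("/".join(parts[:i]) + "/")
    return sorted(needed - present)

# Only "Files/Docs/" is missing here; the "Files/" marker still exists:
print(missing_folder_markers(["Files/Docs/report.pdf", "Files/"]))
```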