- Make sure to ignore files that contain sensitive info like keys in .gitignore
- Click "Storage Accounts"
- Configure the Basics tab (leave the other tabs at their defaults)
- Create the resource, go to it after it's made, and click "Containers" in the left panel
- Click "+ Container," make a name for it, then click "Create"
- Once it's made, click the new container
- Click "Upload" and upload anything
- Go back to "Storage Accounts" and click the resource made before
- On the left panel, click "Security + networking," then "Access keys"
- Unhide the key value, then copy it
- Store the key value in a .env file under a variable name (e.g. AZURE_STORAGE_ACCESS_KEY='your_key_value')
- NOTE: If you do this in Google Colab, first name the file something other than .env, because a file named .env will disappear. Paste the key into the file, then rename it to .env
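The resulting .env file would look something like this (the key value shown is a hypothetical placeholder, not a real key):

```
# .env — keep this file out of version control (add it to .gitignore)
AZURE_STORAGE_ACCESS_KEY='your_copied_key_value'
```

Code can then read the variable from the environment (e.g. via `os.environ` after loading the file with python-dotenv's `load_dotenv()`).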
- Create python code to upload files to storage
- This repo's code is found in azure_storage.py (NOTE: This .py file was converted from Google Colab)
- Make sure to substitute in your key name (like AZURE_STORAGE_ACCESS_KEY), container name, and name of the file you want to upload, as needed in the code
- NOTE: I initially put the code in a .py file rather than Google Colab, but installing the azure-storage-blob package required installing Rust to build one of its dependencies. From what I researched, it seems that Rust can only be installed globally and not in a venv. Thus, I chose to move to Colab rather than doing a global installation.
- Go to "Cloud Storage - Enterprise-ready object storage." These configurations were set under the following headers:
- Create the bucket, then click "Upload" to upload a file/folder
- Go to "IAM - IAM & Admin"
- Hover over the left bar and click "Service Accounts"
- Click "Create Service Account"
- Change role to "Editor" and leave rest alone
- Create the account, then click "Manage keys"
- Create new key as JSON
- Rename file as desired, then put it into a repository
- Ignore the file in .gitignore
- Create and activate a venv, then install google-cloud-storage and pillow with pip
- Create python code to upload files to storage
- This repo's code is found in gcp_storage.py
- The comments note the things to replace with the key, bucket name, etc.
- Storage Accounts
- In the "Security + networking" tab...
- Networking tab
- can configure public network access
- Access keys tab
- can rotate keys
- Shared access signature tab
- can give storage account access to clients with differing permissions
- Encryption
- Microsoft Defender for Cloud
- In the "Data management" tab...
- Data protection tab
- has options for recovering data when modified or deleted
- Lifecycle management tab
- Can create rules to move data to certain access tiers or have it expire at the end of its lifecycle
- Containers
- change access level
- access control (IAM)
- IAM
- create deny policies
- remove or grant access in IAM tab or in a specific service account
- delete existing keys from service accounts
- Buckets
- Permissions tab
- prevent public access
- switch to fine-grained object access
- Protection
- change soft delete policy
- turn object versioning off or on
- set a bucket or object retention policy
- default event-based hold option
- Lifecycle
- add or delete lifecycle rules