How to Create an Azure Storage account

Introduction

Most organizations have diverse requirements for their cloud-hosted data. For example, storing data in a specific region, or needing separate billing for different data categories. Azure storage accounts let you formalize these types of policies and apply them to your Azure data.

Suppose you work at a chocolate manufacturer that produces baking ingredients such as cocoa powder and chocolate chips. You market your products to grocery stores, which then sell them to consumers.

Your formulations and manufacturing processes are trade secrets. The spreadsheets, documents, and instructional videos that capture this information are critical to your business and require geographically redundant storage. This data is primarily accessed from your main factory, so you would like to store it in a nearby datacenter. The expense for this storage needs to be billed to the manufacturing department.

You also have a sales group that creates cookie recipes and baking videos to promote your products to consumers. Your priority for this data is low cost, rather than redundancy or location. This storage must be billed to the sales team.

By creating multiple Azure storage accounts, with each one having the appropriate settings for the data it holds, you can handle these types of business requirements.

Decide how many storage accounts you need

Organizations often have multiple storage accounts to enable them to implement different sets of requirements. In the chocolate-manufacturer example, there’s one storage account for private business data and one storage account for consumer-facing files. In this unit, you learn the policy factors that each type of storage account controls, which helps you decide how many accounts you need.

What is Azure Storage?

Azure provides many ways to store your data, including multiple database options like Azure SQL Database, Azure Cosmos DB, and Azure Table Storage. Azure offers multiple ways to store and send messages, such as Azure Queues and Event Hubs. You can even store loose files using services like Azure Files and Azure Blobs.

Azure groups four of these data services together under the name Azure Storage. The four services are Azure Blobs, Azure Files, Azure Queues, and Azure Tables. The following illustration shows the elements of Azure Storage.

These four data services are all primitive, cloud-based storage services, and are often used together in the same application.

What is a storage account?

A storage account is a container that groups a set of Azure Storage services together. Only data services from Azure Storage can be included in a storage account (Azure Blobs, Azure Files, Azure Queues, and Azure Tables). The following illustration shows a storage account containing several data services.

Combining data services into a single storage account enables you to manage them as a group. The settings you specify when you create the account, or any changes that you make after creation, apply to all services in the storage account. Deleting a storage account deletes all of the data stored inside it.

A storage account is an Azure resource and is part of a resource group. The following illustration shows an Azure subscription containing multiple resource groups, where each group contains one or more storage accounts.

Other Azure data services, such as Azure SQL and Azure Cosmos DB, are managed as independent Azure resources and can’t be included in a storage account. The following illustration shows a typical arrangement: Blobs, Files, Queues, and Tables are contained within storage accounts, while other services aren’t.

Storage account settings

A storage account defines a policy that applies to all the storage services in the account. For example, you could specify that all the contained services are stored in the West US datacenter, accessible only over HTTPS, and billed to the sales department’s subscription. The sketch after the settings list below shows how these choices map to creation parameters.

A storage account defines the following settings:

  • Subscription: The Azure subscription that’s billed for the services in the account.
  • Location: The datacenter that stores the services in the account.
  • Performance: Determines the data services you can have in your storage account and the type of hardware disks used to store the data.
    • Standard allows you to have any data service (Blob, File, Queue, Table) and uses magnetic disk drives.
    • Premium provides more services for storing data. For example, storing unstructured object data as block blobs or append blobs, and specialized file storage used to store and create premium file shares. These storage accounts use solid-state drives (SSD) for storage.
  • Replication: Determines the strategy used to make copies of your data to protect against hardware failure or natural disaster. At a minimum, Azure automatically maintains three copies of your data within the datacenter associated with the storage account. The minimum replication is called locally redundant storage (LRS), and guards against hardware failure but doesn’t protect you from an event that incapacitates the entire datacenter. You can upgrade to one of the other options such as geo-redundant storage (GRS) to get replication at different datacenters across the world.
  • Access tier: Controls how quickly you’re able to access the blobs in a storage account. The Hot access tier is optimized for storing data that’s accessed or modified frequently, and gives quicker access than Cool but at increased storage cost. The Cool access tier is optimized for storing data that’s infrequently accessed or modified, and has a lower storage cost. The access tier setting applies only to blobs and serves as the default value for new blobs.
  • Secure transfer required: A security feature that determines the supported protocols for access. Enabled requires HTTPS, while disabled allows HTTP.
  • Virtual networks: A security feature that allows inbound access requests only from the virtual network(s) you specify.
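
To make the mapping concrete, here is a minimal sketch of how these settings might appear as parameters when you create an account programmatically. The parameter names follow the conventions of the Azure SDK for Python (azure-mgmt-storage), and the specific values (region, SKU, tier) are illustrative choices, not requirements from this module; exact property names can vary between SDK versions.

```python
# Sketch: how storage account settings map to creation parameters
# (a Python dict in the shape accepted by azure-mgmt-storage; values are illustrative).

storage_account_parameters = {
    # Location: the datacenter (region) that stores the services in the account.
    "location": "westus",

    # Performance and Replication are combined into a single SKU name:
    # Standard or Premium performance, plus LRS/GRS/... replication.
    "sku": {"name": "Standard_GRS"},

    # Account kind (discussed later in this module); StorageV2 is the usual choice.
    "kind": "StorageV2",

    # Access tier: the default tier applied to new blobs in the account.
    "access_tier": "Hot",

    # Secure transfer required: reject plain HTTP requests.
    "enable_https_traffic_only": True,

    # Virtual networks: inbound restrictions are configured separately
    # (via a network rule set), and the billing subscription is determined by
    # the subscription you target with the management client, not by a field here.
}
```

In a real deployment, a dictionary like this would be passed to a management client, the Azure CLI, or an ARM template; a fuller creation sketch appears in a later unit.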

How many storage accounts do you need?

A storage account represents a collection of settings like location, replication strategy, and subscription owner. You need one storage account for each group of settings that you want to apply to your data. The following illustration shows two storage accounts that differ in one setting; that one difference is enough to require separate storage accounts.

Typically, your data diversity, cost sensitivity, and tolerance for management overhead determine the number of storage accounts you need.

Data diversity

Organizations often generate data that differs in where it’s consumed, how sensitive it is, which group pays the bills, etc. Diversity along any of these vectors can lead to multiple storage accounts. Let’s consider two examples:

  1. Do you have data that is specific to a country/region? If so, you might want to store the data in a datacenter in that country/region for performance or compliance reasons. You need one storage account for each geographical region.
  2. Do you have some data that is proprietary and some for public consumption? If so, you could enable virtual networks for the proprietary data and not for the public data. Separating proprietary data and public data requires separate storage accounts.

In general, increased diversity means an increased number of storage accounts.

Cost sensitivity

A storage account by itself has no financial cost; however, the settings you choose for the account do influence the cost of services in the account. Geo-redundant storage costs more than locally redundant storage. Premium performance and the Hot access tier increase the cost of blobs.

You can use multiple storage accounts to reduce costs. For example, you could partition your data into critical and noncritical categories. You could place your critical data into a storage account with geo-redundant storage and put your noncritical data in a different storage account with locally redundant storage.
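
As a small illustrative sketch (the account names and regions here are hypothetical, not part of the scenario), the critical/noncritical partition can be expressed as two parameter sets that differ only in replication, and therefore in cost:

```python
# Hypothetical partitioning into two storage accounts that differ only in replication.
account_settings = {
    "contosocriticaldata": {                       # trade secrets, formulations
        "location": "westus",
        "kind": "StorageV2",
        "sku": {"name": "Standard_GRS"},           # geo-redundant: higher cost
    },
    "contosononcriticaldata": {                    # marketing videos, recipes
        "location": "westus",
        "kind": "StorageV2",
        "sku": {"name": "Standard_LRS"},           # locally redundant: lower cost
    },
}
```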

Tolerance for management overhead

Each storage account requires some time and attention from an administrator to create and maintain. It also increases complexity for anyone who adds data to your cloud storage. Everyone in an administrator role needs to understand the purpose of each storage account so they add new data to the correct account.

Storage accounts are powerful tools to help you obtain the performance and security you need while minimizing costs. A typical strategy is to start with an analysis of your data. Create partitions that share characteristics like location, billing, and replication strategy. Then, create one storage account for each partition.

Choose your account settings

The storage account settings we’ve already covered apply to the data services in the account. Here, we discuss the three settings that apply to the account itself, rather than to the data stored in the account:

  • Name
  • Deployment model
  • Account kind

These settings affect how you manage your account and the cost of the services within it.

Name

Each storage account has a name. The name must be globally unique within Azure, contain only lowercase letters and digits, and be between 3 and 24 characters long.
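
As a quick illustration of these format rules, here is a small, self-contained Python check. The helper function name is hypothetical; Azure itself enforces the rules, and global uniqueness can only be confirmed by Azure (for example, via the management API's name-availability check).

```python
import re

# Storage account names: 3-24 characters, lowercase letters and digits only.
_NAME_PATTERN = re.compile(r"^[a-z0-9]{3,24}$")

def is_valid_storage_account_name(name: str) -> bool:
    """Check the local format rules; uniqueness must still be verified by Azure."""
    return bool(_NAME_PATTERN.match(name))

print(is_valid_storage_account_name("mystorageaccount123"))  # True
print(is_valid_storage_account_name("My-Storage"))           # False: uppercase and hyphen
```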

Deployment model

A deployment model is the system Azure uses to organize your resources. The model defines the API that you use to create, configure, and manage those resources. Azure provides two deployment models:

  • Resource Manager: the current model that uses the Azure Resource Manager API
  • Classic: a legacy offering that uses the classic deployment model

Most Azure resources only work with Resource Manager, which makes it easy to decide which model to choose. However, storage accounts, virtual machines, and virtual networks support both, so you must choose one or the other when you create your storage account.

The key feature difference between the two models is their support for grouping. The Resource Manager model adds the concept of a resource group, which isn’t available in the classic model. A resource group lets you deploy and manage a collection of resources as a single unit.

Microsoft recommends that you use the Resource Manager deployment model for all new resources.

Account kind

A storage account kind is a set of policies that determine which data services you can include in the account and the pricing of those services. There are four kinds of storage accounts:

  • Standard – StorageV2 (general purpose v2): the current offering that supports all storage types and all of the latest features
  • Premium – Page blobs: Premium storage account type for page blobs only
  • Premium – Block blobs: Premium storage account type for block blobs and append blobs
  • Premium – File shares: Premium storage account type for file shares only

Microsoft recommends that you use the Standard – StorageV2 (general purpose v2) option for new storage accounts.

The core advice is to choose the Resource Manager deployment model and the Standard – StorageV2 (general purpose v2) account kind for all your storage accounts. For new resources, there are few reasons to consider the other choices.
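
For reference, here is a sketch of how these portal labels correspond to the kind and SKU values used when creating an account programmatically. Treat the exact pairings as assumptions to verify against current Azure documentation rather than a definitive list.

```python
# Approximate mapping of portal account-kind labels to management API values.
# These pairings are assumptions to verify; SKUs other than the ones shown exist.
ACCOUNT_KIND_OPTIONS = {
    "Standard - StorageV2 (general purpose v2)": {"kind": "StorageV2",        "sku": "Standard_LRS"},
    "Premium - Page blobs":                      {"kind": "StorageV2",        "sku": "Premium_LRS"},
    "Premium - Block blobs":                     {"kind": "BlockBlobStorage", "sku": "Premium_LRS"},
    "Premium - File shares":                     {"kind": "FileStorage",      "sku": "Premium_LRS"},
}

# The recommended default for new accounts:
recommended = ACCOUNT_KIND_OPTIONS["Standard - StorageV2 (general purpose v2)"]
```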

Choose an account creation tool

There are several tools that can create a storage account. Your choice is typically based on whether you want a GUI and whether you need automation.

Available tools

The available tools are:

  • Azure portal
  • Azure CLI (Command-line interface)
  • Azure PowerShell
  • Management client libraries

The portal provides a GUI with explanations for each setting, which makes it easy to use and helpful for learning about the options.

The other tools in this list all support automation. The Azure CLI and Azure PowerShell let you write scripts, while the management libraries allow you to incorporate the creation into a client app.

How to choose a tool

Storage accounts are typically based on an analysis of your data, so they tend to be relatively stable. As a result, storage-account creation is usually a one-time operation done at the start of a project. For one-time activities, the portal is the most common choice.

In the rare cases where you need automation, the decision is between a programmatic API or a scripting solution. Scripts are typically faster to create and less work to maintain because there’s no need for an IDE, NuGet packages, or build steps. If you have an existing client application, the management libraries might be an attractive choice; otherwise, scripts are a better option.
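
To give a feel for the management-library path, the following is a minimal sketch using the Azure SDK for Python (the azure-identity and azure-mgmt-storage packages, installed separately with pip). The subscription ID environment variable, resource group, and account name are hypothetical placeholders; treat this as a sketch under those assumptions rather than a complete client application.

```python
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

# Hypothetical placeholders: supply your own subscription, group, and name.
subscription_id = os.environ["AZURE_SUBSCRIPTION_ID"]
resource_group = "learn-storage-rg"
account_name = "contosostorage12345"   # must be globally unique

client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

# Creation is a long-running operation; begin_create returns a poller.
poller = client.storage_accounts.begin_create(
    resource_group,
    account_name,
    {
        "location": "westus",
        "kind": "StorageV2",
        "sku": {"name": "Standard_LRS"},
        "enable_https_traffic_only": True,
    },
)

account = poller.result()   # block until provisioning completes
print(account.name, account.provisioning_state)
```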

Exercise – Create a storage account using the Azure portal

In this unit, you use the Azure portal to create a storage account for a fictitious southern California surf report web app. The surf report site lets users upload photos and videos of local beach conditions. Viewers of the site use the content to help them choose the beach with the best surfing conditions.

Your list of design and feature goals is:

  • Video content must load quickly.
  • The site must handle unexpected spikes in upload volume.
  • Outdated content must be removed as surf conditions change so the site always shows current conditions.

You decide to buffer uploaded content in an Azure Queue for processing and then transfer it to an Azure Blob for persistent storage. You need a storage account that can hold both queues and blobs while delivering low-latency access to your content.
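
As a rough sketch of that flow, the data-plane SDKs for queues and blobs could be used as shown below. The connection string, queue name, container name, and file name are hypothetical, the queue and container are assumed to already exist, and a real site would also handle errors, retries, and large uploads.

```python
from azure.storage.blob import BlobServiceClient
from azure.storage.queue import QueueClient

# Hypothetical connection string; a real app reads this from configuration.
conn_str = "<storage-account-connection-string>"

# 1. The web tier buffers a notification about an uploaded file in a queue.
queue = QueueClient.from_connection_string(conn_str, queue_name="uploads")
queue.send_message("beach-cam-041.jpg")

# 2. A worker later drains the queue and copies content into blob storage,
#    assuming the uploaded file was staged locally by the web tier.
blobs = BlobServiceClient.from_connection_string(conn_str)
container = blobs.get_container_client("surf-content")

for message in queue.receive_messages():
    file_name = message.content
    with open(file_name, "rb") as data:
        container.upload_blob(name=file_name, data=data)
    queue.delete_message(message)   # remove the message once the transfer succeeds
```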

Create a storage account using Azure portal

  1. Sign in to the Azure portal using the same account you used to activate the sandbox.
  2. On the resource menu, or from the Home page, select Storage accounts. The Storage accounts pane appears.
  3. On the command bar, select Create. The Create a storage account pane appears.
  4. On the Basics tab, enter the following values for each setting.
    Project details
      • Subscription: Concierge Subscription
      • Resource group: Select learn-6495dde3-2a29-4660-b468-303148f68818 from the dropdown list.
    Instance details
      • Storage account name: Enter a unique name. This name is used to generate the public URL to access the data in the account. The name must be unique across all existing storage account names in Azure. Names must have 3 to 24 characters and can contain only lowercase letters and numbers.
      • Region: Select a location near to you from the dropdown list.
      • Performance: Standard. This option decides the type of disk storage used to hold the data in the storage account. Standard uses traditional hard disks, and Premium uses solid-state drives (SSD) for faster access.
      • Redundancy: Select Locally redundant storage (LRS) from the dropdown list. In our case, the images and videos quickly become out-of-date and are removed from the site. As a result, there’s little value in paying extra for geo-redundant storage (GRS). If a catastrophic event results in data loss, you can restart the site with fresh content from your users.
  5. Select Next : Advanced. On the Advanced tab, enter the following values for each setting.
    Security
      • Require secure transfer for REST API operations: Check. This setting controls whether HTTP can be used for the REST APIs that access data in the storage account. Enabling it forces all clients to use HTTPS; using HTTPS over the network is considered a best practice.
      • Allow enabling anonymous access on individual containers: Check. Blob containers don’t permit anonymous access to their content by default. This setting allows authorized users to selectively enable anonymous access on specific containers.
      • Enable storage account key access: Check. We want to allow clients to access data via SAS.
      • Default to Microsoft Entra authorization in the Azure portal: Uncheck. Clients are public, not part of an Active Directory.
      • Minimum TLS version: Select Version 1.2 from the dropdown list. TLS 1.2 is a secure version of TLS, and Azure Storage uses it on public HTTPS endpoints. TLS 1.1 and 1.0 are supported for backwards compatibility.
    Hierarchical Namespace
      • Enable hierarchical namespace: Uncheck. The Data Lake hierarchical namespace is for big-data applications that aren’t relevant to this module.
    Access protocols
      • Accept the defaults. Blob and Data Lake Storage Gen2 endpoints are provisioned by default.
    Blob storage
      • Allow cross-tenant replication: Uncheck. Active Directory isn’t being used for this exercise.
      • Access tier: Hot. This setting is used only for Blob storage. The Hot access tier is ideal for frequently accessed data; the Cool access tier is better for infrequently accessed data. This setting only sets the default value; when you create a blob, you can set a different value for the data. In our case, we want the videos to load quickly, so we use the high-performance option for our blobs.
    Azure Files
      • Enable large file shares: Uncheck. Large file shares support up to 100 TiB; however, this type of storage account can’t be converted to a geo-redundant storage offering, and upgrades are permanent. See the warning after this table.

     Warning

    If Enable large file shares is selected, it enforces additional restrictions, and Azure Files service connections without encryption will fail, including scenarios using SMB 2.1 or 3.0 on Linux. Because Azure Storage doesn’t support SSL for custom domain names, this option can’t be used with a custom domain name.

  6. Select Next : Networking. On the Networking tab, enter the following values for each setting.
    Network connectivity
      • Network access: Enable public access from all networks. We want to allow public Internet access. Our content is public facing, and we need to allow access from public clients.
    Network routing
      • Routing preference: Microsoft network routing. We want to make use of the Microsoft global network, which is optimized for low-latency path selection.
  7. Select Next : Data protection. On the Data protection tab, enter the following values for each setting.
    Recovery
      • Enable point-in-time restore for containers: Uncheck. Not necessary for this implementation.
      • Enable soft delete for blobs: Uncheck. Soft delete lets you recover blob data in cases where blobs or blob snapshots are deleted accidentally or overwritten.
      • Enable soft delete for containers: Uncheck. Soft delete lets you recover containers that are deleted accidentally.
      • Enable soft delete for file shares: Uncheck. File share soft delete lets you recover accidentally deleted file share data more easily.
    Tracking
      • Enable versioning for blobs: Uncheck. Not necessary for this implementation.
      • Enable blob change feed: Uncheck. Not necessary for this implementation.
    Access control
      • Enable version-level immutability support: Uncheck. Not necessary for this implementation.
  8. Select Next : Encryption. Accept the defaults.
  9. Select Next : Tags. Here, you can associate key/value pairs with the account to help categorize your Azure resources.
  10. Select Next : Review to validate your options and to ensure all the required fields are selected. If there are issues, this tab identifies them so you can correct them.
  11. When validation passes successfully, select Create to deploy the storage account.
  12. When deployment is complete, which may take up to two minutes, select Go to resource to view Essential details about your new storage account.

You created a storage account with settings driven by your business requirements. For example, you might have selected a West US datacenter because your customers were primarily located in southern California. The typical flow for creating a storage account is: first analyze your data and goals, and then configure the storage account options to match.
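
If you want to confirm programmatically that the deployed account matches the choices you made, a short sketch with the Python management SDK might look like the following. It uses the sandbox resource group name from the exercise, a placeholder account name, and a subscription ID taken from an environment variable; the azure-identity and azure-mgmt-storage packages are assumed to be installed.

```python
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(
    DefaultAzureCredential(), os.environ["AZURE_SUBSCRIPTION_ID"]
)

# Placeholder account name: use the name you chose in the exercise.
account = client.storage_accounts.get_properties(
    "learn-6495dde3-2a29-4660-b468-303148f68818", "yourstorageaccount"
)

print("SKU:        ", account.sku.name)                    # e.g. Standard_LRS
print("Kind:       ", account.kind)                        # e.g. StorageV2
print("Access tier:", account.access_tier)                 # e.g. Hot
print("HTTPS only: ", account.enable_https_traffic_only)   # True
print("Min TLS:    ", account.minimum_tls_version)         # e.g. TLS1_2
```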

Summary

Storage accounts let you create a group of data management rules and apply them all at once to the data stored in the account: blobs, files, tables, and queues.

If you tried to achieve the same thing without storage accounts, the end product would be tedious and error-prone. For example, what are the chances that you could successfully apply the same rules to thousands of blobs?

Instead, you capture the rules in the settings for a storage account, and those rules are automatically applied to every data service in the account.

Clean up

The sandbox automatically cleans up your resources when you’re finished with this module.

When you’re working in your own subscription, it’s a good idea at the end of a project to identify whether you still need the resources you created. Resources that you leave running can cost you money. You can delete resources individually or delete the resource group to delete the entire set of resources.

 Important

When you’re working in your own subscription, to avoid unwanted usage charges, you must remove any resources that you create.

Use the following steps in the Azure portal to delete the resource group and all associated resources. A scripted equivalent is sketched after the steps.

  1. In the resource menu, select Resource groups.
  2. Select the resource group you created.
  3. In the command bar, select Delete resource group.
  4. In the confirmation pane, you’re prompted to type the resource group name; you can copy the name from the title of the Resource group pane and paste it in to avoid typos.
  5. When the name you enter matches, Delete becomes available.
  6. Select Delete. It may take several minutes to delete your resource group. Check Notifications in the Global Controls in the upper right corner of the Azure portal to ensure your operation completed.
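
When you prefer not to use the portal, the same cleanup can be scripted. Here is a minimal sketch with the Python management SDK (azure-identity and azure-mgmt-resource); the resource group name is a placeholder, and because deleting a resource group removes everything in it, double-check the name before running anything like this.

```python
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

client = ResourceManagementClient(
    DefaultAzureCredential(), os.environ["AZURE_SUBSCRIPTION_ID"]
)

# Placeholder name: deleting a resource group removes ALL resources inside it.
poller = client.resource_groups.begin_delete("my-project-rg")
poller.wait()   # block until the deletion completes (can take several minutes)
print("Resource group deleted.")
```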
