Data Discovery and Classification in Azure SQL Database is part of the "Advanced Data Security" feature. It allows organisations to discover, classify, label and protect sensitive data such as PII (Personally Identifiable Information). It also helps organisations meet audit and regulatory compliance requirements.
The classification taxonomy for sensitive columns is kept in one central place for organisations to manage.
a) Log on to the Azure Portal. Refer to steps a) and b) for the creation of the resource group and Azure SQL Database.
b) Navigate to the "sampledb" under the "AAD-SQL" resource group and click on "Advanced data security"
c) Enable "Advanced Data Security on server"
d) Once enabled, you can see the columns recommended for classification along with the sensitivity label provided.
e) The database is scanned for sensitive columns, which are listed along with their recommended sensitivity labels. You can accept the automatic recommendations or manually add classifications. Custom sensitivity labels can also be created, and columns can be reassigned to the new labels as required.
f) Save the recommendations.
g) The "Overview" tab provides the summary of classification information of the database including the list of classified columns along with the schema, informative types and sensitive labels. Once classified you can monitor the access to sensitive data through "SQL Auditing".
Saturday, November 30, 2019
Friday, November 29, 2019
Data Platform Tips 13 - Auditing on Azure SQL Database
Azure SQL provides both server-level and database-level auditing. Auditing tracks database events and writes them to Blob storage, a Log Analytics workspace or Event Hubs.
a) Log on to the Azure Portal. Refer to steps a) and b) for the creation of the resource group and Azure SQL Database.
b) Navigate to the resource group "AAD-SQL" and to the "aad-azuresql" SQL Server and click on "Auditing".
c) Turn "Auditing" On at server level and configure the blob storage to capture the audit events.
d) Navigate to the "sampledb" database on the Azure SQL Server and click on "Auditing" and turn on Database level Auditing along with configuring Blob storage to store the Database level audit events.
e) Once configured you can navigate between Server level Auditing events and Database level Auditing events as shown below.
Note from Microsoft Docs
You should avoid enabling both server blob auditing and database blob auditing together, unless:
- You want to use a different storage account or retention period for a specific database.
- You want to audit event types or categories for a specific database that differ from the rest of the databases on the server. For example, you might have table inserts that need to be audited only for a specific database.
More information on Azure SQL Database Auditing
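Beyond the portal viewer, the audit records written to Blob storage can also be queried with T-SQL using the sys.fn_get_audit_file function. The command below is a minimal sketch run through sqlcmd; the server, database, credentials and storage path (a container named "sqldbauditlogs" on a storage account called "auditstorage") are placeholders, so adjust them to match your own auditing configuration.
sqlcmd -S aad-azuresql.database.windows.net -d sampledb -U <admin-user> -P <password> -Q "SELECT TOP 10 event_time, server_principal_name, database_name, statement FROM sys.fn_get_audit_file('https://auditstorage.blob.core.windows.net/sqldbauditlogs/', DEFAULT, DEFAULT) ORDER BY event_time DESC;"
This returns the ten most recent audited statements along with who executed them and against which database.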
Growing your community through User Groups
This article is about sharing my experiences with starting a user group in Wellington and growing it.
The "Wellington
Data Management and Analytics" (https://www.meetup.com/Wellington-Data-Management-and-Analytics-Meetup/)
user group was started back in August
2017 with just 1 member. Also, I
am a vegetarian and non-alcoholic but still decided to organise the user group
and other events in Wellington.
I had the meetup site created and patiently waited for a month before scheduling the first user group session. I was surprised that, without much promotion or marketing, more than 20 people attended the session out of more than 30 registrations. Now, in 2019, the user group is close to reaching 1,000 members.
Scheduling regular sessions at the user group
I have seen some interesting user groups get started but fail to schedule regular sessions, and the community loses interest in the group. I was clear on one thing: run at least one session each month at our user group. This really helped to build the community, as we meet regularly each month and share our knowledge and experiences.
Encouraging more women to attend the user groups
Twice a year I make a point of running Women in Tech sessions to encourage more women to get involved in the tech community. It is always great to see more women turning up for these sessions. I am working on a few more ideas for growing this initiative further.
Collaboration with other user groups
I build relationships with other user group leaders within the community to increase the visibility of the user groups. I met Xiaodi Yan, who runs the Chinese IT Association New Zealand (CITANZ) user group in Wellington. We met last year and co-organised the Global AI Bootcamp, the Global Azure Bootcamp and a few other events.
Xiaodi uses his community experience to help CITA grow rapidly. These days the CITA meetup has more than 500 members. As a core member of CITA, Xiaodi helps organise its monthly meetups. Over the last year, CITA has delivered more than 10 meetups covering a diverse range of topics, including Azure DevOps, blockchain, cloud computing, .NET Core, containers, Agile, job hunting and many more, with more than 400 attendees overall.
Now more Chinese IT professionals attend our events, and thanks go to Xiaodi for promoting the events and supporting us.
Also, this year most of the Microsoft user groups in Wellington joined together as a single group and are hosting a Christmas party on December 5th, 2019. This is a great effort initiated by Ondrej Aubrecht (PowerShell user group leader), and we are supporting the initiative. More than 100 people across the following user groups are attending the Christmas party this year.
- Wellington PowerShell
- Wellington Data Management and Analytics
- Wellington Azure AI
- Wellington Azure
- Wellington Excel and Power BI
- Wellington Dynamics
- Wellington D365
Hosting more events and talks to keep the community engaged, with support from the MVP community
After starting the Wellington Data Management and Analytics user group, with the support and mentoring of MVPs I decided to bring the SQLSaturday event back to Wellington, which further increased the number of people attending the user group and helped grow the community. Along with that, whenever I learn of MVPs visiting Wellington, I try to host a user group event so that the community can learn from the experts.
Support from co-organisers and volunteers
User groups can be started by one person, but to run one for the long term you need support. I am lucky to have Raju RH as my co-organiser, who understands the core ideologies of the user group and has supported me in running it since day one. Thanks also to all my volunteers and the other user group leaders who assist with the user group as well as the other events that we host in Wellington.
Here is a great video put together by one of my volunteers (Lavanya) showcasing the highlights of our SQLSaturday events in Wellington.
It is great to be an MVP, not just focusing on the technical front but also showcasing how to build a great community to learn and share knowledge.
Thursday, November 28, 2019
Data Platform Tips 12 - Azure Active Directory Authentication with Azure SQL Database
a) Log on to the Azure Portal
b) Create a new Resource group named "AAD-SQL"
c) Create a new Azure SQL Database Server named "aad-azuresql"
d) Configure Azure Active Directory user for accessing the Azure SQL Server.
e) Select "Set Admin" and select the AAD user who needs to be given access to the Azure SQL Server.
Now you can create a database under this Azure SQL server and access it with the Azure Active Directory admin user configured above using Password Authentication, as shown below. Note: make sure to configure the firewall settings on your Azure SQL server.
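As a quick check from the command line, you can connect using the sqlcmd utility with the -G switch, which tells it to authenticate against Azure Active Directory. This is a minimal sketch; the server "aad-azuresql", the database "sampledb" and the user principal are assumed, so replace them with your own AAD admin account (a recent version of sqlcmd, 13.1 or later, is required).
sqlcmd -S aad-azuresql.database.windows.net -d sampledb -G -U aaduser@yourtenant.onmicrosoft.com -P <password> -Q "SELECT SUSER_SNAME();"
A successful connection returns the AAD user name, confirming that Azure AD Password authentication is working.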
Wednesday, November 27, 2019
Data Platform Tips 11 - Azure SQL Database vs. Managed Instances on Azure
Azure SQL Database is a relational database hosted on Azure and provided as a fully managed Platform as a Service (PaaS) offering. It is a pay-as-you-go service that allows users to scale up or out based on their needs, and it has some capabilities that are not available in SQL Server, such as built-in high availability, intelligence, and management. It provides three deployment options:
- Single Database - single database with defined set of resources
- Elastic Pool - multiple databases within an elastic pool sharing resources
- Database Server - manages single databases and elastic pools
| SQL Database | SQL Managed Instance |
| --- | --- |
| Supports most on-premises capabilities | Supports all on-premises capabilities |
| 99.995% availability | 99.99% availability |
| Supports built-in backups, patching and recovery | Supports built-in backups, patching and recovery |
| Supports the latest stable Database Engine version | Supports the latest stable Database Engine version |
| Migration involves validating that all the features used by the on-premises version are supported | Easy migration from on-premises |
| Supports databases up to 100 TB | Supports databases up to 8 TB |
| On-premises applications can access data in Azure SQL Database | Connectivity to your on-premises environment using Azure ExpressRoute or a VPN Gateway |
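As an illustration of the Single Database deployment option described above, here is a minimal Azure CLI sketch for provisioning a logical server and a single database. The resource group, server and database names are the placeholders used in this series and the credentials are assumptions, so substitute your own (server names must be globally unique); S0 is just a small Standard tier suitable for testing.
az group create --name AAD-SQL --location australiaeast
az sql server create --name aad-azuresql --resource-group AAD-SQL --location australiaeast --admin-user sqladmin --admin-password <password>
az sql db create --resource-group AAD-SQL --server aad-azuresql --name sampledb --service-objective S0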
More information on Azure SQL Database options
Tuesday, November 26, 2019
Data Platform Tips 10 - What are Elastic Pools and when you need one?
Elastic Pools on Azure are a simple and cost-effective way to manage and scale multiple databases with varying usage demands at a predictable cost. Multiple databases in an Elastic Pool sit on a single Azure SQL Database server and share a defined set of resources for a set price.
Elastic Pools are really handy for SaaS developers who have to provision a separate database for each of their clients, each with varying demands. The Elastic Pool manages resources based on demand across the different databases within the pool.
If any additional resources are required, they can be added to the pool without downtime. However, when databases need to be moved in or out of the pool, there is a brief interruption at the end of the operation while connections are dropped.
There is no per-database charge for Elastic Pools. You are billed for each hour a pool exists at its highest eDTUs or vCores, regardless of usage or whether the pool was active for less than an hour.
When do you really need an Elastic Pool?
If you have multiple databases but don't really know their individual demands, it is better to place them in an Elastic Pool with a known set of resources and a predictable cost. The Elastic Pool will take care of allocating the required resources to the databases as demand changes.
Creating an Elastic Pool on Azure
a) Log on to the Azure Portal
b) Search for Elastic Pools on the Marketplace.
c) Provision an Elastic Pool and SQL Server on Azure.
d) Create two databases on the SQL server and assign them to the Elastic Pool. The pool's resources are now shared between the two databases, meeting their demands efficiently for a known cost.
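The same setup can be scripted. Below is a minimal Azure CLI sketch that creates an Elastic Pool and places two databases in it; the resource group, server, pool and database names are placeholders, and the sizing flags may differ slightly between CLI versions.
az sql elastic-pool create --resource-group my-rg --server my-sql-server --name my-elastic-pool --edition Standard --capacity 100 --db-max-capacity 50
az sql db create --resource-group my-rg --server my-sql-server --name customerdb1 --elastic-pool my-elastic-pool
az sql db create --resource-group my-rg --server my-sql-server --name customerdb2 --elastic-pool my-elastic-pool
Both databases now draw from the pool's 100 eDTUs, with any single database capped at 50 eDTUs.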
More information on Elastic Pools
Elastic Pools and Elastic Jobs (Video)
Monday, November 25, 2019
Data Platform Tips 9 - Import a BACPAC file to an Azure SQL Database
Let us take the BACPAC file created as part of the previous tip and import it into a new database.
Import using Azure Portal
a) Log on to the Azure Portal
b) Navigate to the resource group "export-database"
c) Click on the SQL Server named "export-database-db-server" and click on "Import database" link.
d) Select the BACPAC file from the blob storage
e) Select the "Pricing Tier" as "Standard" and 20 DTUs.
f) Click OK and the database import will start.
g) After completion, the "import-database-db" database will appear under the "SQL Databases" link.
Import using SQLPackage Utility
SQLPackage is a command-line utility that is used to export, import, extract, publish and run T-SQL scripts against databases.
Download the SQLPackage Utility
Check syntax here
Note: SqlPackage.exe will be located under "C:\Program Files\Microsoft SQL Server\150\DAC\bin" after installation.
To use Azure Active Directory (Universal) authentication, run the following command to import the BACPAC file.
sqlpackage.exe /a:Import /sf:export-database-db.bacpac /tdn:import-database-db /tsn:export-database-db-server.database.windows.net /ua:True /tid:"xxxxx.onmicrosoft.com"
To use SQL authentication, run the following command to import the BACPAC file.
sqlpackage.exe /a:import /tcs:"Data Source=export-database-db-server.database.windows.net;Initial Catalog=import-database-db;User Id=xxxxxx;Password=xxxxxx" /sf:export-database-db.bacpac /p:DatabaseEdition=Standard /p:DatabaseServiceObjective=S2
Sunday, November 24, 2019
Data Platform Tips 8 - Export a BACPAC file of a database
Export using Azure Portal
a) Log on to the Azure Portal
b) Navigate to the Azure SQL Database that needs to be exported and also make sure that you have an Azure Storage account provisioned to hold the BACPAC files.
c) Select the "Export" button.
d) Provide the details: the export BACPAC file name, storage container and credentials, then click OK.
e) You may receive an error with the following message.
Database export error
Failed to export the database: export-database-db.
ErrorCode: 400
ErrorMessage: There was an error that occurred during this operation : '
f) Set the Firewall rules to allow the IP address of the machine that tries to initiate the export as shown below.
g) Once done repeat step b) and initiate the export again.
h) The export request will be submitted and you can also check the progress as shown below.
i) Once the request is complete, the bacpac file will be successfully created in the storage container.
Export using SQLPackage Utility
SQLPackage is a command-line utility that is used to export, import, extract, publish and run T-SQL scripts against databases.
Download the SQLPackage Utility
Check syntax here
Note: SqlPackage.exe will be located under "C:\Program Files\Microsoft SQL Server\150\DAC\bin" after installation.
To use Azure Active Directory (Universal) authentication, run the following command to create the BACPAC file.
SqlPackage.exe /a:Export /tf:export-database-db.bacpac /scs:"Data Source=export-database-db-server.database.windows.net;Initial Catalog=export-database-db;" /ua:True /tid:"xxxxxx.onmicrosoft.com"
To use SQL authentication, run the following command to create the BACPAC file.
SqlPackage.exe /a:Export /tf:export-database-db.bacpac /scs:"Data Source=export-database-db-server.database.windows.net;Initial Catalog=export-database-db;user id=xxxxx;password=xxxxxx" /ua:False
Saturday, November 23, 2019
Data Platform Tips 7 - Move Azure SQL Database from one region to another
There may be reasons why organisations need to move an existing Azure SQL Database from one region to another. Let us see how to do that using the Azure Portal.
a) Log on to the Azure Portal
b) Create a new resource group named "move-azure-sql-db"
c) Create a new database named "move-azure-sql-db" and a server named "move-azure-sql-db-server" in the "AustraliaEast" region.
d) The "Move" option allows you to move the database to another resource group or another subscription. What we want, however, is to move the database to a new region, and to do that we need to use the failover option.
e) Create a failover group named "move-azure-sql-failover-group" and add the database to a new database server named "move-azure-sql-db-server1" in a different region, "Southeast Asia".
f) Now initiate the failover by clicking the "Failover" button so that the database is moved to the other region.
g) Once the failover has completed, clean up the resources in the source region, such as the database and the database server.
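For repeatability, the same move can be scripted with the Azure CLI. The sketch below assumes the source server and database from step c) already exist; the admin credentials are placeholders, and it is worth checking the commands against the current CLI reference before relying on them.
az sql server create --name move-azure-sql-db-server1 --resource-group move-azure-sql-db --location southeastasia --admin-user sqladmin --admin-password <password>
az sql failover-group create --name move-azure-sql-failover-group --resource-group move-azure-sql-db --server move-azure-sql-db-server --partner-server move-azure-sql-db-server1 --add-db move-azure-sql-db
az sql failover-group set-primary --name move-azure-sql-failover-group --resource-group move-azure-sql-db --server move-azure-sql-db-server1
The set-primary command performs the failover so that "move-azure-sql-db-server1" in Southeast Asia becomes the primary, after which the original server and database can be removed as in step g).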