
Wednesday, November 06, 2019

SQL Server 2019 is now GA

SQL Server 2019 is now generally available, announced on Day 1 of Microsoft Ignite 2019. SQL Server 2019 is a unified platform for enterprises to meet their business needs.

In addition, SQL Server 2019 supports Big Data Clusters, Data Virtualization, Data Marts, and an Enterprise Data Lake.


More information - https://cloudblogs.microsoft.com/sqlserver/2019/11/04/sql-server-2019-is-now-generally-available/

Download SQL Server 2019 - https://www.microsoft.com/en-us/evalcenter/evaluate-sql-server-2019

What's new - https://docs.microsoft.com/en-us/sql/sql-server/what-s-new-in-sql-server-ver15?view=sql-server-ver15

Download - Introducing SQL Server 2019 e-book

Wednesday, December 26, 2018

Azure Resource Explorer in Azure Data Studio

Using the Azure Resource Explorer in Azure Data Studio you can now manage Azure SQL Server, Azure SQL Database, and Azure SQL Managed Instance resources.
1. Open Azure Data Studio; an Azure icon appears in the left menu bar. Click the icon to open the Azure Resource Explorer.
2. Sign in to Azure.
3. Click the “Add an Account” button. You will be shown a Device Code and a URL; click the “Copy and Open” button, which opens the URL, then enter the Device Code.
4. Pick the Azure account to sign in with.
5. Once signed in, a confirmation window is shown.
6. Close the browser window and go back to Azure Data Studio.
7. Expand the connection to see the list of subscriptions and the SQL services under them.
8. Finally, you can also add a firewall rule to allow access to the Azure SQL server.

Azure Data Studio

Azure Data Studio is a cross-platform database tool for data professionals using the Microsoft family of on-premises and cloud data platforms on Windows, macOS, and Linux. It was previously called “SQL Operations Studio”. It has been engineered with the data platform user in mind: Azure Data Studio focuses on commonly used functionality, with additional experiences made available as extensions.

Microsoft informs that “The vision of the product is to create a unified experience across heterogenous data sources regardless of their form or location: structured or unstructured, on-premises or cloud. Azure Data Studio currently offers built-in support for SQL Server on-premises and on the cloud and Azure SQL Database, along with preview support for Azure SQL Managed Instance, Azure SQL Data Warehouse and SQL Server 2019 Big Data Clusters.”

Use Azure Data Studio if you:
  • Need to run on macOS or Linux
  • Are connecting to a SQL Server 2019 big data cluster
  • Spend most of your time editing or executing queries
  • Need the ability to quickly chart and visualize result sets
  • Can execute most administrative tasks via the integrated terminal using sqlcmd or PowerShell
  • Have minimal need for wizard experiences
  • Do not need to do deep administrative configuration
Use SQL Server Management Studio if you:
  • Spend most of your time on database administration tasks
  • Are doing deep administrative configuration
  • Are doing security management, including user management, vulnerability assessment, and configuration of security features
  • Make use of the Reports for SQL Server Query Store
  • Need to make use of performance tuning advisors and dashboards
  • Are doing import/export of DACPACs
  • Need access to Registered Servers and want to control SQL Server services on Windows
Getting Started with Azure Data Studio

Download Azure Data Studio

1. Open Azure Data Studio.
2. To connect to SQL Server, select
  • Connection Type: Microsoft SQL Server
  • Server: [Your SQL Server instance name]
  • Authentication Type: [Windows Authentication/SQL Login]
  • User name: [username, if SQL Login]
  • Password: [password, if SQL Login]
  • Database: [the database you would like to connect to]
  • Server Group: [group servers under a common heading]
3. Set any Advanced Properties if required.
4. Click Connect; once connected, the server's dashboard opens.

Monday, March 27, 2017

SQL Server vNext – Get Data experience in SSAS Tabular Models

The next version of SQL Server, SQL Server vNext, introduces the same “Get Data” experience we see in Power BI Desktop and Excel 2016. This is mainly applicable to tabular models when the compatibility level is set to “SQL Server vNext (1400)”. It is great to see Microsoft trying to unify the UI across different toolsets.


https://blogs.msdn.microsoft.com/analysisservices/2016/12/16/introducing-a-modern-get-data-experience-for-sql-server-vnext-on-windows-ctp-1-1-for-analysis-services/

Saturday, April 18, 2015

How to avoid broken views in SQL Server?

SQL Views will break if you change the schema of the underlying table.
  • Create a table as shown below
CREATE TABLE [dbo].[Employee](
 [EmployeeID] [int] IDENTITY(1,1) NOT NULL,
 [EmployeeFirstName] [varchar](50) NOT NULL,
 [EmployeeLastName] [varchar](50) NOT NULL,
 [EmployeeDOB] [date] NOT NULL,
 [CreatedDateTime] [datetime] NOT NULL,
 [ModifiedDateTime] [datetime] NOT NULL,
 CONSTRAINT [PK_Employee] PRIMARY KEY CLUSTERED
(
 [EmployeeID] ASC
)
)
GO
  • Populate the table with some records

  • Create a View as below
CREATE VIEW [dbo].[EmployeeView]
AS
SELECT        EmployeeID, EmployeeFirstName, EmployeeLastName, EmployeeDOB, CreatedDateTime, ModifiedDateTime
FROM            dbo.Employee

GO
  • Do a select from the view.
SELECT * FROM [dbo].[EmployeeView]
  • Everything looks fine now.
  • Add an EmployeeMiddleName column to the Employee table (between EmployeeFirstName and EmployeeLastName) and do the select from the view again.
  • Can you see the difference? The EmployeeMiddleName column doesn’t appear, but the EmployeeMiddleName data appears under the EmployeeLastName column, EmployeeDOB has the EmployeeLastName data, and so on.
What happened here? SQL Server doesn’t bind the schema to the view, and therefore when the table schema changes the view definition is not refreshed. To mitigate this issue:
a) Don’t use *; instead, list the column names explicitly in the view.

b) Use schema binding (WITH SCHEMABINDING) when creating the view. This keeps the schemas of the table and the view intact: the underlying table schema cannot be changed until the view definition is modified or the view is dropped; otherwise the database engine will throw an error. Schema binding improves stability and performance but also increases the developers’ work.
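As a sketch of option (b), the view above could be recreated with schema binding (note the two-part table name, which SCHEMABINDING requires):

```sql
CREATE VIEW [dbo].[EmployeeView]
WITH SCHEMABINDING
AS
SELECT EmployeeID, EmployeeFirstName, EmployeeLastName,
       EmployeeDOB, CreatedDateTime, ModifiedDateTime
FROM dbo.Employee;
GO
```

With this in place, an ALTER TABLE that touches the referenced columns fails with an error instead of silently breaking the view.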

Technorati Tags: ,

Friday, March 06, 2015

SQL Server Always On Availability Groups and MultiSubnetFailover and OLEDB connections

Having problems with failover not working with SQL Server Always On Availability Groups?

Check whether you are using OLE DB connections: the OLE DB provider in SQL Server Native Client doesn't support the MultiSubnetFailover keyword and ignores it even if specified. Use ADO.NET or ODBC instead.

Sunday, July 27, 2014

Import Data from a large .sql file in SQL SERVER

If you try to import a large .SQL file using SQL Server Management Studio, you may get an "Out of memory" error. To import the file successfully you can use the SQLCMD utility.

The sqlcmd utility lets you enter Transact-SQL statements, system procedures, and script files at the command prompt, in Query Editor in SQLCMD mode, in a Windows script file or in an operating system (Cmd.exe) job step of a SQL Server Agent job. This utility uses ODBC to execute Transact-SQL batches.

SQLCMD -S [ServerName\InstanceName] -i C:\Temp\[ScriptName].sql

sqlcmd
-a packet_size -A (dedicated administrator connection)
-b (terminate batch job if there is an error)
-c batch_terminator
-C (trust the server certificate)
-d db_name
-e (echo input)
-E (use trusted connection)
-f codepage | i:codepage[,o:codepage] | o:codepage[,i:codepage]
-h rows_per_header
-H workstation_name
-i input_file
-I (enable quoted identifiers)
-k[1 | 2] (remove or replace control characters)
-K application_intent
-l login_timeout
-L[c] (list servers, optional clean output)
-m error_level
-M multisubnet_failover
-N (encrypt connection)
-o output_file
-p[1] (print statistics, optional colon format)
-P password
-q "cmdline query"
-Q "cmdline query" (and exit)
-r[0 | 1] (msgs to stderr)
-R (use client regional settings)
-s col_separator
-S [protocol:]server[\instance_name][,port]
-t query_timeout
-u (unicode output file)
-U login_id
-v var = "value"
-V error_severity_level
-w column_width
-W (remove trailing spaces)
-x (disable variable substitution)
-X[1] (disable commands, startup script, environment variables and optional exit)
-y variable_length_type_display_width
-Y fixed_length_type_display_width
-z new_password
-Z new_password (and exit)
-? (usage)

Tuesday, April 22, 2014

@@TRANCOUNT - SQL SERVER

Returns the number of BEGIN TRANSACTION statements that have occurred on the current connection.

The BEGIN TRANSACTION statement increments @@TRANCOUNT by 1. ROLLBACK TRANSACTION resets @@TRANCOUNT to 0, except for ROLLBACK TRANSACTION savepoint_name, which does not affect @@TRANCOUNT. COMMIT TRANSACTION or COMMIT WORK decrements @@TRANCOUNT by 1.

PRINT @@TRANCOUNT  -- 0 (assuming no transaction is already open)
--  The BEGIN TRAN statement will increment the
--  transaction count by 1.
BEGIN TRAN
    PRINT @@TRANCOUNT  -- 1
    BEGIN TRAN
        PRINT @@TRANCOUNT  -- 2
--  The COMMIT statement will decrement the transaction count by 1.
    COMMIT
    PRINT @@TRANCOUNT  -- 1
COMMIT
PRINT @@TRANCOUNT  -- 0

Wednesday, February 12, 2014

Cross Database Ownership Chaining in SQL SERVER

Consider a scenario where a view in a database named Database1 accesses a table from Database1 as well as a table from Database2. In this scenario, to run the view the user must be a user in Database2 as well as in Database1. If these databases are tightly linked, cross database ownership chaining can be used: the user can then access the table in Database2 through the view in Database1 without having explicit permissions to query the table in Database2 directly.

Steps to enable Cross Database Ownership Chaining

Set both databases in the chain (Database1 and Database2 in the examples below) as trustworthy.

ALTER DATABASE Database1 SET TRUSTWORTHY ON;
ALTER DATABASE Database2 SET TRUSTWORTHY ON;

Make sure that the tables involved in both databases belong to the same owner.

Enable Cross Database Ownership Chaining. This can be done on whole instance level or specific databases.

For instance level

Exec sp_configure 'cross db ownership chaining', 1;
RECONFIGURE;

For database level

ALTER DATABASE Database1 SET DB_CHAINING ON;
ALTER DATABASE Database2 SET DB_CHAINING ON;
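As a minimal sketch of the scenario (hypothetical object names; Database2 holds a dbo.Orders table, and the dbo schemas in both databases have the same owner):

```sql
-- In Database2: a table the user has no direct permissions on
USE Database2;
GO
CREATE TABLE dbo.Orders (OrderID int PRIMARY KEY, Amount money);
GO

-- In Database1: a view referencing the table in Database2
USE Database1;
GO
CREATE VIEW dbo.OrdersView
AS
SELECT OrderID, Amount FROM Database2.dbo.Orders;
GO

-- A user granted SELECT on the view, but with no permissions in
-- Database2, can now query through the ownership chain:
SELECT OrderID, Amount FROM dbo.OrdersView;
```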

Tuesday, February 04, 2014

24 Hours of Pass - Business Analytics Edition

This 24 Hours of PASS: Business Analytics event will take place on February 5, beginning at 16:00 GMT and will feature 12 webcasts followed by 12 hours of on demand replay.

Sessions

Advanced Analysis Techniques

Advanced Analytics in Excel 2013
Excel is “the” analytical tool in Microsoft’s suite for advanced analysts. Of course, you know that...
Dejan Sarka
An Introduction to Predictive Modeling
This session will show you how to solve a real-world data mining problem by demonstrating the...
Carlos Bossy
Introduction to Querying in DAX
The DAX (Data Analysis Expressions) language can be used to query a Tabular and PowerPivot data...
Marco Russo
Predictive Analytics for Absolute Beginners
In this one-hour preview presentation, we cover the main motivation for learning something about...
Mark Tabladillo, Artus Krohn-Grimberghe

Analytics and Visualization

Adding Valuable Techniques to your Data Science Toolbox
Data Science requires us to see and understand complex patterns in data, so we use techniques like...
Mark Whitehorn
End-to-End Power BI
The aim of this session is to excite you with the potential of Microsoft's new self-service suite...
Peter Myers
Power Query: Beyond the Basics
You already know that you can accomplish a lot within the Power Query user interface. If you are a...
Chris Webb
Small Big Data - Phase 0 - Data Hygiene 
In this talk, we will talk about the starting point for every Big Data project - getting your...
Lynn Langit

Big Data

Has Big Data Killed the EDW?
Big Data technologies have given us the ability to process massive amounts of data at a low cost,...
Stacia Misner
Implementation and Design on the Parallel Data Warehouse
If you need to design a data warehouse that scales for today and into the future, then you need to...
Jason Strate

Information Delivery

DataViz You Thought You Could NOT Do with SSRS
Despite SQL Server Reporting Services (SSRS) being a very flexible and adaptable tool when it comes...
Jason Thomas

Strategy and Architecture

Panel: Myths, Misunderstandings, and Successes in Data Analytics
Big Data, Business Analytics, Data Analytics, NoSQL, Relational . . . do we even agree on what we...
Karen Lopez, Joseph D'Antoni, Lynn Langit, Stacia Misner

Wednesday, September 18, 2013

Microsoft TechEd 2013 - NZ

Attended TechEd this year (2013). It's been full of interesting sessions. It was so easy to pick my sessions as my prime focus was around Cloud, Database and Architecture sessions.

Missed the Keynote session as I didn't reach Auckland until 8pm on Tuesday (10/09/2013). Caught up with my HP friend for a Hot Chocolate and went to bed.

I was ready in the morning and caught up with Glassboy for breakfast and collected my Registration pack and was ready for Scott Guthrie's session on Cloud (of course Windows Azure)

It was a full-on session that ran in two parts.

I was playing with Windows Azure recently and I am impressed especially when Scott mentioned that they do numerous releases to production (for features) every day.

Then another important session I really enjoyed was "Largest SQL and Azure Projects in the World".

Glad to hear that SQL Server can handle Terabytes and Terabytes of data.....

There were other good sessions like "Benefits Dependency Network to bring business and IT changes together", "Panel discussion failure of DW projects", "Performance Tuning Analysis Services Cube" etc...

Food was ok but can cater better for Vegetarians and Vegans.

HUB was so crowded and not really keen on the stands this time.

Met my Swedish friend at the stands......

Next year TechEd.....Keen but need to see how the rest of year goes......

Saturday, May 11, 2013

SQL 2012 New Features - Column Store Indexes

Stores data in columns instead of rows.

Enables accessing data in an easy and fast manner.

Improves the processing time of data warehouse (DW) queries by hundreds to thousands of times.

To improve query performance, build a columnstore index on the fact tables in a DW.

CREATE COLUMNSTORE INDEX [Index_Name] ON [Table_Name] ([Column_Name])
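For example (hypothetical fact table and column names; note that in SQL Server 2012 only nonclustered columnstore indexes are available, and they make the table read-only until the index is dropped or disabled):

```sql
CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_ColumnStore
ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount, Quantity);
```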

Monday, February 20, 2012

Single row result set is specified but no rows were returned

If you execute a SQL query in SSIS with the result set set to 'single row' and your query doesn't return any records, you will get this error.

Solution

IF NOT EXISTS ([Your Query])
    SELECT ''  -- return a dummy row so the result set is never empty
ELSE
    [Your Query]
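As a sketch with a hypothetical dbo.Orders table, where the SSIS Execute SQL Task expects a single-row result set:

```sql
IF NOT EXISTS (SELECT TotalAmount FROM dbo.Orders WHERE OrderID = 42)
    SELECT '' AS TotalAmount  -- dummy row keeps the single-row result set populated
ELSE
    SELECT TotalAmount FROM dbo.Orders WHERE OrderID = 42;
```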

Saturday, February 18, 2012

How to configure SQL SERVER to accept remote connections?

STEP 1: Enabling TCP/IP

First we must tell SQL Server to listen on TCP/IP, to do this
perform the following steps:

1. Launch the SQL Server Configuration Manager from the "Microsoft SQL
Server 2008 R2 > Configuration Tools" Program menu
2. Click on the "Protocols for your SQL instance"
3. Right click on "TCP/IP" in the list of Protocols and choose, "Enable"

STEP 2: SQL Browser Service

Next, we have to determine if we want the SQL Browser service to be running or not. The benefit of having this service run is that users connecting remotely do not have to specify the port in the connection string.

STEP 3: Firewall..?

At this point you should be able to connect remotely. If you still can't, chances are you have a firewall configured on the computer where SQL Server is running.

Make sure the port number you used for the TCP/IP connection is added to the firewall's exceptions list. The default port for SQL Server is 1433, unless you changed it to listen on another port.

If you chose to use the SQL Browser service, you must also add the sqlbrowser service executable to the exceptions list, as it listens on UDP port 1434.
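If you are unsure which TCP port the instance is actually listening on, one way to check (assuming you can connect locally, on SQL Server 2008 or later) is:

```sql
SELECT DISTINCT local_tcp_port
FROM sys.dm_exec_connections
WHERE local_tcp_port IS NOT NULL;
```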

Could not locate file 'FIelD' for database db in sys.database_files. The file either does not exist, or was dropped.

If you get the above error while truncating the transaction logs, then either you haven't selected the proper database to run the truncation script against, or you haven't provided the correct logical log file name for the database.

Truncating Transaction Logs in SQL SERVER

Truncating transaction logs!!! Normally you need to find out why your transaction logs are growing. The major reason is usually that your transaction logs are not getting backed up.

The first thing is to check your database recovery model: if it is 'FULL' and you deal with many transactions within the day, then start backing up your transaction logs; otherwise, change your recovery model to 'SIMPLE' (not suitable for databases dealing with large transactions).

Before you change the recovery model, you need to truncate your logs.

USE [DatabaseName]
GO

ALTER DATABASE [DatabaseName] SET RECOVERY SIMPLE
GO

DBCC ShrinkFile('[Database log file Logical name]',100)
GO

Note: For Data Warehouse Databases, it is recommended to use 'SIMPLE' recovery model
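If you are unsure of the log file's logical name needed by DBCC SHRINKFILE, it can be looked up first (sizes below are in MB, converted from 8 KB pages):

```sql
USE [DatabaseName]
GO
SELECT name AS LogicalName, type_desc, size * 8 / 1024 AS SizeMB
FROM sys.database_files;
GO
```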

Tuesday, February 22, 2011

ODBC error 208 (42S02) Invalid object name

Issue

You may get this error even though the table is available in the database.

Solution

Using the fully qualified table name, "database.owner.table", will fix this issue.
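For example, with hypothetical names:

```sql
-- May fail with error 208 over ODBC:
SELECT * FROM Employee;

-- Fully qualified name:
SELECT * FROM MyDatabase.dbo.Employee;
```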

Friday, February 11, 2011

SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified

Check the following and fix the issue accordingly.

1) Make sure your server name is correct, e.g., no typo in the name.
2) Make sure your instance name is correct and there is actually such an instance on your target machine. [Update: some applications convert \\ to \. If you are not sure about your application, please try both Server\Instance and Server\\Instance in your connection string.]
3) Make sure the server machine is reachable, e.g., DNS resolves correctly and you are able to ping the server (not always true).
4) Make sure the SQL Browser service is running on the server.
5) If a firewall is enabled on the server, you need to put sqlbrowser.exe and/or UDP port 1434 into the exceptions list.
6) Also check whether TCP/IP is enabled and the port is added to the firewall exceptions (if it is not using the default port 1433).

You can download PortQry from http://support.microsoft.com/kb/832919 and run "portqry.exe -n yourservername -p UDP -e 1434". If this command returns information and it contains your target instance, then you can rule out possibilities 4) and 5) above, meaning you do have a SQL Browser running and your firewall does not block SQL Browser UDP packets. In this case, you can check for other issues, e.g., a wrong connection string.

Sunday, July 18, 2010

DateTime MinValue in SQL SERVER and C#

I was trying to store C# DateTime.MinValue in SQL SERVER and got the following exception

SqlDateTime overflow. Must be between 1/1/1753 12:00:00 AM and 12/31/9999 11:59:59 PM.

In SQL Server, the minimum date that can be stored in a datetime field (1753-01-01) is not equal to the MinValue of the .NET DateTime data type (0001-01-01).

Therefore, if your DateTime value is DateTime.MinValue, don't save it to a SQL Server datetime column.
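The range limit can be seen directly in T-SQL (the datetime2 type, available from SQL Server 2008, is one alternative that does accept the .NET minimum):

```sql
SELECT CAST('1753-01-01' AS datetime)  AS MinDateTime;   -- succeeds
-- SELECT CAST('0001-01-01' AS datetime);                -- fails: out-of-range value
SELECT CAST('0001-01-01' AS datetime2) AS MinDateTime2;  -- succeeds
```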