Thursday, November 06, 2014

Validation error. File System Task: Variable "variablename" is used as source or destination and is empty

The Variable "variablename" hasn't been assigned a default value. Assign a default value and the error will disappear.

Note: This is not an issue in SSIS 2008, but it does occur in SSIS 2012, SSIS 2014 and later.

Wednesday, October 15, 2014

How to debug Power View errors?

If you want to troubleshoot errors you get with Power View, you can enable tracing in PowerPivot in Excel 2013. You can enable tracing as shown in the screenshot below.

Once tracing is enabled and the PowerPivot model is active, open the PowerPivot window and click Settings again from the PowerPivot menu; you will then see the trace file path, as shown below.

Note: The trace file is similar to a SQL Trace file, can be opened in the same way, and provides the ability to debug the DAX queries.

Thursday, October 09, 2014

SignalR and WebSockets

SignalR is an amazing framework that delivers a real-time and bidirectional messaging platform.

ASP.NET SignalR is a library for ASP.NET developers that simplifies the process of adding real-time web functionality to applications. Real-time web functionality is the ability to have server code push content to connected clients instantly as it becomes available, rather than having the server wait for a client to request new data.

SignalR uses the new WebSocket transport, which is supported by browsers that support HTML5. An ASP.NET SignalR connection starts as HTTP and is then promoted to a WebSocket connection if one is available. WebSocket is the ideal transport for SignalR, since it makes the most efficient use of server memory, has the lowest latency, and has the most underlying features (such as full-duplex communication between client and server).

WebSocket requires the server to be using Windows Server 2012 or Windows 8, and .NET Framework 4.5. If these requirements are not met, SignalR will attempt to use other transports to make its connections.
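
As a rough sketch (not from the original post), a minimal SignalR 2.x hub and OWIN startup class might look like the following; the ChatHub class, its Send method and the broadcastMessage client callback are illustrative names only.

using Microsoft.AspNet.SignalR;
using Owin;

// Hub whose methods can be invoked by connected clients; the server can in
// turn push messages to all connected clients through the Clients property.
public class ChatHub : Hub
{
    public void Send(string name, string message)
    {
        // Broadcast to every connected client; SignalR picks the best
        // available transport (WebSocket, server-sent events, long polling).
        Clients.All.broadcastMessage(name, message);
    }
}

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Maps SignalR hubs to the /signalr endpoint
        app.MapSignalR();
    }
}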

Current attempts to provide real-time web applications largely revolve around polling and other server-side push technologies, the most notable of which is Comet.

Comet is a web application model in which a long-held HTTP request allows a web server to push data to a browser, without the browser explicitly requesting it. Comet-based push is generally implemented in JavaScript and uses connection strategies such as polling, long-polling or streaming.

With polling, the browser sends HTTP requests at regular intervals and immediately receives a response. This technique was the first attempt for the browser to deliver real-time information. This is good only if the message intervals are well known.

With long-polling, the browser sends a request to the server and the server keeps the request open for a set period. If a notification is received within that period, a response containing the message is sent to the client. If a notification is not received within the set time period, the server sends a response to terminate the open request.

It is important to understand, however, that when you have a high message volume, long-polling does not provide any substantial performance improvements over traditional polling. In fact, it could be worse, because the long-polling might spin out of control into an unthrottled, continuous loop of immediate polls.

With streaming, the browser sends a complete request, but the server sends and maintains an open response that is continuously updated and kept open indefinitely (or for a set period of time). The response is then updated whenever a message is ready to be sent, but the server never signals to complete the response, thus keeping the connection open to deliver future messages.

However, since streaming is still encapsulated in HTTP, intervening firewalls and proxy servers may choose to buffer the response, increasing the latency of the message delivery.

HTML5 Web Sockets to the Rescue!

HTML5 Web Sockets provides a true standard that you can use to build scalable, real-time web applications.
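
As an illustration (not part of the original post), the same WebSocket protocol can also be consumed from .NET 4.5 with System.Net.WebSockets.ClientWebSocket; the endpoint URL below is a placeholder.

using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

class WebSocketDemo
{
    static void Main()
    {
        RunAsync().Wait();
    }

    static async Task RunAsync()
    {
        using (var socket = new ClientWebSocket())
        {
            // ws://example.com/echo is a placeholder endpoint
            await socket.ConnectAsync(new Uri("ws://example.com/echo"), CancellationToken.None);

            // Send a text frame over the full-duplex connection
            byte[] message = Encoding.UTF8.GetBytes("ping");
            await socket.SendAsync(new ArraySegment<byte>(message),
                WebSocketMessageType.Text, true, CancellationToken.None);

            // Receive the reply without any HTTP polling overhead
            var buffer = new byte[1024];
            WebSocketReceiveResult result =
                await socket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
            Console.WriteLine(Encoding.UTF8.GetString(buffer, 0, result.Count));
        }
    }
}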

Sunday, September 28, 2014

ASP.NET MVC vs ASP.NET Web API

ASP.NET MVC excels at accepting form data and generating HTML. It returns views as well as data. Requests are mapped to action names. MVC features exist in the System.Web.Mvc assembly.

ASP.NET Web API excels at accepting and generating structured data such as JSON and XML. It returns only data, not views. It is designed for building RESTful services and supports content negotiation (the response format can be JSON, XML, ATOM, etc.). Requests are mapped to actions based on HTTP verbs. With Web API, IIS no longer needs to be used to host HTTP services; services can be hosted in a custom Windows service, a console application, or any other type of host you need. It also provides automatic support for OData. Model binding, filters, routing and other MVC-like features exist in Web API, but they are implemented separately in the new System.Web.Http assembly. Its lightweight architecture is a good fit for devices such as smartphones.
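
To make the contrast concrete, here is a rough sketch (not from the original post) of a minimal Web API controller; the Product type and ProductsController are illustrative names. GET api/products and GET api/products/5 map to the two actions purely by HTTP verb and route, and the response format (JSON or XML) is negotiated from the client's Accept header.

using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Web.Http;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Requests are mapped to these actions based on the HTTP verb, not an action name
public class ProductsController : ApiController
{
    private static readonly List<Product> products = new List<Product>
    {
        new Product { Id = 1, Name = "Widget" },
        new Product { Id = 2, Name = "Gadget" }
    };

    // GET api/products - returns data, serialized via content negotiation
    public IEnumerable<Product> Get()
    {
        return products;
    }

    // GET api/products/5
    public Product Get(int id)
    {
        var product = products.FirstOrDefault(p => p.Id == id);
        if (product == null)
            throw new HttpResponseException(HttpStatusCode.NotFound);
        return product;
    }
}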

Sunday, August 31, 2014

SSRS Preview in VS 2013: There was no endpoint listening at net.pipe://localhost/PreviewProcessingService4440/ReportProcessing that could accept the message.

When you preview an SSRS report, a console window opens; don't close that window. If you have already closed it, close and reopen the solution and try the preview again.

Saturday, August 30, 2014

SPARSE Columns and ColumnSets

I had a scenario where I had to load a CSV file with around 2,000 columns, so I started looking at the wide table feature introduced in SQL Server 2008. A wide table can have 30,000 columns, as opposed to a non-wide table, which can have 1,024 columns.

Having 30,000, or even 1,024, columns in a table is a lot, and a design review should be done for these types of scenarios. In my case, another system generates this file and the business wants the ability to query it easily via a single table, more or less just for analysis purposes.

Sparse columns are designed to allow a relational database to store and process relatively unstructured data, where any individual entity may have a modest selection from a very wide set of attributes. This sort of data can be found frequently in scientific, pharmaceutical, and medical applications. EAV (Entity/Attribute/Value) storage techniques were engineered for such data; however, EAV data storage has proved to be somewhat inefficient in such areas as query speed and storage cost.

Sparse columns work well where a high proportion of the data in a column is NULL, because negligible storage space is used for the NULL representation. A table that is wide in theory can be made narrow in practice by implementing sparse columns, and the addition of filtered indexes provides effective searches on the sparse data. Sparse columns support the use of check constraints, as long as the constraints allow NULL values (a sparse column requirement).

Column sets provide a solution to the difficulties of efficient imports and exports of sparse data. The untyped XML representation that is generated for each row interacts directly with each sparse field. The XML field accepts inserts, and its data can be extracted and modified easily. This feature effectively supports certain types of serialized object data from various applications.

CREATE TABLE SPARSETABLE
(
 ID INT NOT NULL IDENTITY(1,1)
 ,FIRSTNAME VARCHAR(100) NOT NULL
 ,MIDDLENAME VARCHAR(100) SPARSE NULL
 ,LASTNAME VARCHAR(100) NOT NULL
 ,SPARSEDATASET XML COLUMN_SET FOR ALL_SPARSE_COLUMNS
)

INSERT SPARSETABLE(FIRSTNAME, MIDDLENAME, LASTNAME)
VALUES ('John','Daniel','Smith');
GO

INSERT SPARSETABLE(FIRSTNAME, LASTNAME)
VALUES ('Peter','Wilson');
GO


SELECT [ID]
      ,[FIRSTNAME]
      ,[LASTNAME]
      ,[SPARSEDATASET]
  FROM [SPARSETABLE]

Capture Metadata for Data Warehouses using SQL Server Extended Properties

SQL Server extended properties allow developers to attach metadata, such as a description or source, to a table or to a column within a table. This allows the information to be documented, stored with SQL Server, and queried.

--Add Extended Property for a table
EXEC sp_addextendedproperty 
 @name=N'Description', 
 @value=N'Dimension with the list of all products.' , 
 @level0type=N'SCHEMA',
 @level0name=N'dbo', 
 @level1type=N'TABLE',
 @level1name=N'DIM_PRODUCTS'

--Update Extended Property for the table
EXEC sp_updateextendedproperty 
 @name=N'Description', 
 @value=N'Dimension with the list of all available products.' , 
 @level0type=N'SCHEMA',
 @level0name=N'dbo', 
 @level1type=N'TABLE',
 @level1name=N'DIM_PRODUCTS'
GO

--Add Extended Property for a column
EXEC sp_addextendedproperty 
 @name=N'Description', 
 @value=N'Name of the Product.' , 
 @level0type=N'SCHEMA',
 @level0name=N'dbo', 
 @level1type=N'TABLE',
 @level1name=N'DIM_PRODUCTS', 
 @level2type=N'COLUMN',
 @level2name=N'PRODUCT_NAME'

--Update Extended Property for a column
EXEC sp_updateextendedproperty 
 @name=N'Description', 
 @value=N'Name of the Product' , 
 @level0type=N'SCHEMA',
 @level0name=N'dbo', 
 @level1type=N'TABLE',
 @level1name=N'DIM_PRODUCTS', 
 @level2type=N'COLUMN',
 @level2name=N'PRODUCT_NAME'
GO

--Query Extended Properties
SELECT objname, name, value
FROM fn_listextendedproperty(NULL, 'schema', 'dbo', 'table', default, NULL, NULL)

SELECT objtype, objname, name, value
FROM fn_listextendedproperty (NULL, 'schema', 'dbo', 'table', 'DIM_PRODUCTS', 'column', default);

Saturday, August 16, 2014

Package Configurations missing in SSIS 2012

Can't find the Package Configurations option in SSIS 2012? This is because Package Configurations in SSIS 2012 are available only for the Package Deployment Model, not for the Project Deployment Model.

Make sure to convert the project to the Package Deployment Model by right-clicking the project and clicking Convert to Package Deployment Model.

Project consistency check failed. The following inconsistencies were detected: Test BIML [PackageName].dtsx has a different ProtectionLevel than the project.

If you get this error when trying to create a package using BIML in SSIS 2012 with the Project Deployment Model, make sure that the project ProtectionLevel matches the package ProtectionLevel.

Monday, August 04, 2014

SSIS Best Practices 3 - Handling and Logging Data Load Errors

Normally you don't want an ETL load to fail because of data errors. For instance, if you have a file with 1000 rows and 10 of them have issues, you don't want the load to fail; instead the package must successfully load the 990 good rows, and the 10 error rows need to be redirected to a separate table so the issues can be identified and fixed. If the source is a flat file, it is worthwhile to generate and include a row number, which will help in identifying the row in the flat file.

The error table should define every column as varchar(max), with additional columns such as ErrorCode, ErrorColumn and ErrorDescription.

Error rows are handled in the Data Flow through the use of the Error Row Configuration properties. These properties tell the Data Flow Components what to do when a row fails an operation, such as data conversion or truncation.

The Redirect Row option specifies that when a row raises an error for any column marked as redirected, the entire row is sent out the red error path output, which normally connects to the error table destination. If the red error path output is not used, the row is simply dropped from the Data Flow.

// Script Component on the error path: translate the numeric ErrorCode
// into a readable description before writing the row to the error table.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    Row.ErrDesc = ComponentMetaData.GetErrorDescription(Row.ErrorCode);
}

Sunday, August 03, 2014

SSIS Best Practices 2 - Single Package vs Multiple Packages

When developing an ETL solution, the first question that comes to mind is whether to build a single package doing the end-to-end flow or multiple packages doing the same.

As an architect, I have been challenged with this question many times by my development team, especially with the argument that the single-package model means fewer packages to maintain, whereas multiple packages create an overhead in the number of packages.

One Package = One Unit of Work

It is best practice to have one package do only one thing. Doing so provides the following benefits.

  • Less time spent waiting on design-time validation. SQL Server Data Tools has a rich interface that provides, among other things, a near real-time evaluation of potential metadata problems in the SSDT designer. If, for example, a table that is accessed by the SSIS package is changed, the developer will be presented with a warning (or an error, if applicable) in SSDT indicating that metadata accessed by the package has changed. This constant metadata validation is beneficial in that it can help to identify potential problems before they are pushed out for testing. There's also a performance cost associated with this. The length of time required for validation increases as the size of the package increases, so naturally keeping the packages as small as practical will cut down on the amount of time you're drumming your fingers on your desk waiting for validation to complete.
  • Easier testing and deployment. A single package that loads, say, 10 dimensions has a lot of moving parts. When developing each of the dimensions within the package, there is no easy way to test just one element of the package (apart from manually running it in the SSDT designer, which isn't a completely realistic test for a package that will eventually be deployed to the server). The only realistic test for such a package would be to test the entire package as a server-based execution, which may be overkill if you're only interested in one or two changed properties. Further, it's not uncommon for organizations with formal software testing and promotion procedures to require that the entire thing be retested, not just the new or changed elements. By breaking up operations into smaller units, testing and deployment are usually less of a burden because you are only operating on one component at a time.
  • Distributed development. If you work in an environment where you are the only person developing SSIS packages, this is less of a concern. However, if your shop has multiple ETL developers, those do-everything packages are quite inconvenient. Although it's much easier in SQL Server 2012 than in previous releases to compare differences between versions of the same package file, it's still a mostly manual process. By segmenting the workload into multiple packages, it's much easier to farm out development tasks to multiple people without having to reconcile multiple versions of the same package.
  • Reusability. It's not uncommon for the same logic to be used more than once in the same ETL execution, or for that same logic to be shared among multiple ETL processes. By encapsulating these logical units of work into their own packages, it's much easier to share that logic and avoid duplicate development.

We also follow these design patterns:

  • Each dimension has a separate package.
  • Each fact table has a separate package.
  • Staging operations (if used) each have their own package.
  • Operations such as dropping or disabling indexes and truncating tables can be grouped together and typically reside in the same package.

Saturday, August 02, 2014

SSIS Best Practices 1 - Naming Conventions

Always name tasks and components using the following acronyms at the beginning of the task or component name. Following these naming conventions facilitates ongoing maintenance and makes global changes to packages easier, since package updates can be performed programmatically.

Task                                      Prefix
For Loop Container                        FLC
Foreach Loop Container                    FELC
Sequence Container                        SEQC
ActiveX Script                            AXS
Analysis Services Execute DDL             ASE
Analysis Services Processing              ASP
Bulk Insert                               BLK
Data Flow                                 DFT
Data Mining Query                         DMQ
Execute DTS 2000 Package                  EDPT
Execute Package                           EPT
Execute Process                           EPR
Execute SQL                               SQL
File System                               FSYS
FTP                                       FTP
Message Queue                             MSMQ
Script                                    SCR
Send Mail                                 SMT
Transfer Database                         TDB
Transfer Error Messages                   TEM
Transfer Jobs                             TJT
Transfer Logins                           TLT
Transfer Master Stored Procedures         TSP
Transfer SQL Server Objects               TSO
Web Service                               WST
WMI Data Reader                           WMID
WMI Event Watcher                         WMIE
XML                                       XML
Expression                                EXPR


Component                                 Prefix
DataReader Source                         DR_SRC
Excel Source                              EX_SRC
Flat File Source                          FF_SRC
OLE DB Source                             OLE_SRC
Raw File Source                           RF_SRC
XML Source                                XML_SRC
Aggregate                                 AGG
Audit                                     AUD
Character Map                             CHM
Conditional Split                         CSPL
Copy Column                               CPYC
Data Conversion                           DCNV
Data Mining Query                         DMQ
Derived Column                            DER
Export Column                             EXPC
Fuzzy Grouping                            FZG
Fuzzy Lookup                              FZL
Import Column                             IMPC
Lookup                                    LKP
Merge                                     MRG
Merge Join                                MRGJ
Multicast                                 MLT
OLE DB Command                            CMD
Percentage Sampling                       PSMP
Pivot                                     PVT
Row Count                                 CNT
Row Sampling                              RSMP
Script Component                          SCR
Slowly Changing Dimension                 SCD
Sort                                      SRT
Term Extraction                           TEX
Term Lookup                               TEL
Union All                                 ALL
Unpivot                                   UPVT
Data Mining Model Training                DMMT_DST
Data Reader Destination                   DR_DST
Dimension Processing                      DP_DST
Excel Destination                         EX_DST
Flat File Destination                     FF_DST
OLE DB Destination                        OLE_DST
Partition Processing                      PP_DST
Raw File Destination                      RF_DST
Recordset Destination                     RS_DST
SQL Server Destination                    SS_DST
SQL Server Mobile Destination             SSM_DST

Note: Thanks to Jamie Thomson for the naming conventions.

Entity Framework 6 Features

Async Query and Save

Entity Framework 6 provides the ability to execute a query and command asynchronously using DbContext.
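
A minimal sketch of an asynchronous query and save, reusing the PersonEntities context and Person entity used elsewhere in this post:

using System.Data.Entity;   // brings in the async query extension methods
using System.Threading.Tasks;

public class PersonService
{
    public async Task RenamePersonAsync()
    {
        using (var context = new PersonEntities())
        {
            // The query executes asynchronously without blocking the calling thread
            var person = await context.People
                .FirstOrDefaultAsync(p => p.FirstName == "John");

            if (person != null)
            {
                person.LastName = "Smith";
                // The UPDATE is also sent to the database asynchronously
                await context.SaveChangesAsync();
            }
        }
    }
}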

Code based Configuration

Entity Framework configuration settings that are usually placed in the app.config file can now be set programmatically using the System.Data.Entity.DbConfiguration class.
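
A rough sketch of a code-based configuration class; the execution strategy shown is just one example of a setting that could be configured this way:

using System.Data.Entity;
using System.Data.Entity.SqlServer;

// EF discovers a DbConfiguration subclass in the same assembly as the DbContext
public class MyDbConfiguration : DbConfiguration
{
    public MyDbConfiguration()
    {
        // Example setting: use the SQL Azure retry strategy for SqlClient
        SetExecutionStrategy("System.Data.SqlClient",
            () => new SqlAzureExecutionStrategy());
    }
}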

Db Command Logging

EF6 provides the ability to log all database activity using context.Database.Log.

using (var context = new PersonEntities())
{
    context.Database.Log = Console.Write;
    var person = context.People.FirstOrDefault(p => p.FirstName == "John");
    if (person != null) person.LastName = "Smith";
    context.SaveChanges();
}

Command Interception

Starting with EF 6.1, interceptors can be registered in the config file. Interceptors allow you to run additional logic when EF performs certain operations, such as executing database queries or opening connections. For example, to log database operations to a file, appending to the log file if it already exists:

<interceptors>
  <interceptor type="System.Data.Entity.Infrastructure.Interception.DatabaseLogger, EntityFramework">
    <parameters>
      <parameter value="C:\Temp\LogOutput.txt"/>
      <parameter value="true" type="System.Boolean"/>
    </parameters>
  </interceptor>
</interceptors>

Transactions

EF 6 introduces Database.BeginTransaction and Database.UseTransaction to provide more control over transactions. In previous versions of Entity Framework, whenever you executed SaveChanges() to perform create, update or delete operations, the work was wrapped in a transaction. That transaction lasts only until the operation is executed, and a new transaction is started once another operation is initiated.

Database.BeginTransaction() – an easier way for the user to start and complete a transaction themselves within an existing DbContext, allowing several operations to be combined within the same transaction and hence either all committed or all rolled back as one. It also allows the user to more easily specify the isolation level for the transaction.

Database.UseTransaction() – allows the DbContext to use a transaction that was started outside of Entity Framework.

using (System.Data.Entity.DbContextTransaction dbTran = context.Database.BeginTransaction())
{
    try
    {
        Person person = new Person() { FirstName = "John" };
        context.Persons.Add(person);
        context.Database.ExecuteSqlCommand(
            @"UPDATE Person SET FirstName = 'James'" +
             " WHERE PersonID = 1"
            );
        context.Persons.Remove(person);
        // Saves all of the above operations within one transaction
        context.SaveChanges();
        // Commit the transaction
        dbTran.Commit();
    }
    catch (Exception)
    {
        // Roll back the transaction if an exception occurs
        dbTran.Rollback();
    }
}
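
And a rough sketch of Database.UseTransaction, assuming PersonEntities exposes the DbContext(DbConnection, bool contextOwnsConnection) constructor and that connectionString is defined elsewhere:

using (var conn = new System.Data.SqlClient.SqlConnection(connectionString))
{
    conn.Open();
    using (var sqlTran = conn.BeginTransaction())
    {
        // Plain ADO.NET work on the externally started transaction
        using (var cmd = conn.CreateCommand())
        {
            cmd.Transaction = sqlTran;
            cmd.CommandText = "UPDATE Person SET FirstName = 'James' WHERE PersonID = 1";
            cmd.ExecuteNonQuery();
        }

        // Let the DbContext enlist in the same transaction
        // (assumes a constructor overload that accepts an existing connection)
        using (var context = new PersonEntities(conn, contextOwnsConnection: false))
        {
            context.Database.UseTransaction(sqlTran);
            context.Persons.Add(new Person() { FirstName = "John" });
            context.SaveChanges();
        }

        sqlTran.Commit();
    }
}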

DbSet.AddRange and DbSet.RemoveRange

DbSet in EF6 introduces two new methods, AddRange and RemoveRange. DbSet.AddRange adds a collection of entities to the DbContext, and DbSet.RemoveRange removes a collection of entities from the DbSet.

IList<Person> persons = new List<Person>();
persons.Add(new Person() { FirstName = "Person 1" });
persons.Add(new Person() { FirstName = "Person 2" });
persons.Add(new Person() { FirstName = "Person 3" });

using (var context = new PersonEntities())
{
    // Adds the whole collection to the context in a single call
    context.People.AddRange(persons);
    context.SaveChanges();
}

using (var context = new PersonEntities())
{
    IList<Person> personList = context.People.Where(p => p.FirstName == "Person 1").ToList();
    // Removes the whole collection from the set in a single call
    context.People.RemoveRange(personList);
    context.SaveChanges();
}
Custom Conventions

EF6 provides the ability to define your own custom conventions, which become the default behavior for your models. There are two types of conventions: Configuration Conventions and Model Conventions.

Configuration Convention

Configuration Conventions are a way to configure a set of objects without overriding the configuration provided by the Fluent API. You can define such a convention inside the OnModelCreating method, or in your own class that inherits from the Convention class.

Suppose you want all DateTime properties across all tables in the model to map to the datetime2 data type. This can easily be achieved with a Configuration Convention, either by overriding the OnModelCreating method or in a separate class that inherits from Convention.

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Properties<DateTime>()
                .Configure(c => c.HasColumnType("datetime2"));
}

Using the Convention class

class CustomDateTimeConvention : Convention
{
    public CustomDateTimeConvention()
    {
        Properties<DateTime>()
            .Configure(c => c.HasColumnType("datetime2"));
    }
}

The convention class is then registered from the OnModelCreating method of the DbContext using modelBuilder.Conventions.Add<CustomDateTimeConvention>().

Model Convention

Model Conventions are based on the underlying model metadata. There are two types of model conventions: Conceptual (C-Space) and Store (S-Space). This distinction indicates where in the pipeline a convention is executed. A C-Space convention is applied to the model that the application builds, whereas an S-Space convention is applied to the version of the model that represents the database.

Creating a model convention requires creating a class that implements either IConceptualModelConvention or IStoreModelConvention. Both interfaces take a generic type parameter (a type derived from MetadataItem) that is used to filter the kind of metadata the convention applies to.

public class ModelBasedConvention : IStoreModelConvention<EdmProperty>
{
    public void Apply(EdmProperty property, System.Data.Entity.Infrastructure.DbModel model)
    {
        if (property.PrimitiveType.PrimitiveTypeKind == PrimitiveTypeKind.Decimal
            && property.Scale == null)
        {
            property.Scale = 4;
        }
    }
}

Stored Procedure Mapping

Entity Framework 6 Code First provides the ability to create and use stored procedures for add, update and delete operations. To create the stored procedures for the Person entity automatically:

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<Person>()
        .MapToStoredProcedures();
}

This will create three stored procedures: Person_Insert, Person_Update and Person_Delete. Person_Insert and Person_Update have parameters that correspond to the property names, and Person_Delete has a parameter for the primary key property PersonID.

Limitations:

  • Only Fluent API can be used to map stored procedures. You cannot use Data Annotation attributes in EF 6 for stored procedure mapping.
  • You cannot use a mixture of stored procedures and queries for add, update and delete operations on the same entity. You must use either stored procedures or SQL queries for all add, update and delete operations on the entity.

Index Attribute

Entity Framework 6 provides the Index attribute to create an index on a particular column in the database.

class Person
{
    public Person()
    {
    }

    public int PersonID { get; set; }
    public string FirstName { get; set; }

    [Index]
    public DateTime DateOfBirth { get; set; }
}

By default, the index name will be IX_{property name}, but this can be changed using:

    [Index( "INDEX_DOB", IsClustered=true, IsUnique=true )]
    public datetime DateOfBirth { get; set; }

Thursday, July 31, 2014

Business Intelligence Markup Language (BIML)

BIML (Business Intelligence Markup Language) is a domain-specific XML dialect for defining Business Intelligence and data warehouse solutions. It works with SSIS and SSAS. BIML can be used to build templates from which subsequent packages can be generated automatically in a consistent fashion.

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Packages>
    <Package Name="Template Package" ConstraintMode="Linear">
      <Tasks>
        <!-- Tasks for the template package go here -->
      </Tasks>
    </Package>
  </Packages>
</Biml>

Use BIDS Helper to create the BIML files and subsequently generate the packages from the BIML file.

Sunday, July 27, 2014

Import Data from a large .sql file in SQL SERVER

If you try to import a large .SQL file using SQL Server Management Studio, you may get an "Out of memory" error. To import the file successfully you can use the SQLCMD utility.

The sqlcmd utility lets you enter Transact-SQL statements, system procedures, and script files at the command prompt, in Query Editor in SQLCMD mode, in a Windows script file or in an operating system (Cmd.exe) job step of a SQL Server Agent job. This utility uses ODBC to execute Transact-SQL batches.

SQLCMD -S [server name] -i C:\Temp\[script file name].sql

sqlcmd
-a packet_size -A (dedicated administrator connection)
-b (terminate batch job if there is an error)
-c batch_terminator
-C (trust the server certificate)
-d db_name
-e (echo input)
-E (use trusted connection)
-f codepage | i:codepage[,o:codepage] | o:codepage[,i:codepage]
-h rows_per_header
-H workstation_name
-i input_file
-I (enable quoted identifiers)
-k[1 | 2] (remove or replace control characters)
-K application_intent
-l login_timeout
-L[c] (list servers, optional clean output)
-m error_level
-M multisubnet_failover
-N (encrypt connection)
-o output_file
-p[1] (print statistics, optional colon format)
-P password
-q "cmdline query"
-Q "cmdline query" (and exit)
-r[0 | 1] (msgs to stderr)
-R (use client regional settings)
-s col_separator
-S [protocol:]server[\instance_name][,port]
-t query_timeout
-u (unicode output file)
-U login_id
-v var = "value"
-V error_severity_level
-w column_width
-W (remove trailing spaces)
-x (disable variable substitution)
-X[1] (disable commands, startup script, environment variables and optional exit)
-y variable_length_type_display_width
-Y fixed_length_type_display_width
-z new_password
-Z new_password (and exit)
-? (usage)

Tuesday, April 22, 2014

@@TRANCOUNT - SQL SERVER

Returns the number of BEGIN TRANSACTION statements that have occurred on the current connection.

The BEGIN TRANSACTION statement increments @@TRANCOUNT by 1. ROLLBACK TRANSACTION decrements @@TRANCOUNT to 0, except for ROLLBACK TRANSACTION savepoint_name, which does not affect @@TRANCOUNT. COMMIT TRANSACTION or COMMIT WORK decrement @@TRANCOUNT by 1.

PRINT @@TRANCOUNT
--  The BEGIN TRAN statement will increment the
--  transaction count by 1.
BEGIN TRAN
    PRINT @@TRANCOUNT
    BEGIN TRAN
        PRINT @@TRANCOUNT
--  The COMMIT statement will decrement the transaction count by 1.
    COMMIT
    PRINT @@TRANCOUNT
COMMIT
PRINT @@TRANCOUNT

Saturday, April 19, 2014

Cannot register the hard disk xxxxx because a hard disk xxxxxx with UUID xxxxxxx already exists in Oracle VirtualBox

Failed to open the hard disk xxxxx.

Cannot register the hard disk xxxxx because a hard disk xxxxx with UUID xxxxx already exists.

Solution

  • Open the Command Prompt
  • Change the directory to where VirtualBox is installed (Default: C:\Program Files\Oracle\VirtualBox)
  • Type the following command: VBOXMANAGE.EXE internalcommands sethduuid [PathOfNewVHD]

How to Setup a VM to install TFS 2013?

Install in the following order:

  • Install Windows Server 2012 Standard or Data Center Edition
  • Promote the server as Domain Controller (optional)
  • Install SQL SERVER 2014 Developer Edition with the features Database Engine Services, Full-text and semantic extraction for search, Analysis Services, Reporting Services - Native, Management Tools - Basic, Management Tools - Complete
  • Install TFS 2013 Update 2

TF255146: Team Foundation Server requires SQL Server 2012 SP1 (11.00.3000) or greater. The SQL Server instance XXXXXXXX you supplied is version 12.0.2000.8.

This issue is raised when trying to install Team Foundation Server (TFS) 2013 against a SQL Server 2014 database. Make sure to install TFS 2013 Update 2, as it supports SQL Server 2014, whereas previous versions support only up to SQL Server 2012.

Monday, April 14, 2014

Continue loop in SSIS even after an Error

Scenario

I have a set of messages to process using a For Loop in SSIS, but when there is an error in one message, the rest of the messages are not processed. Because the child package failed, the parent package also fails. How can the loop continue with the remaining messages even when one message has errors?

Solution

Set the MaximumErrorCount property of the For Loop container to 0.

  • FailPackageOnFailure: When set to True, the failure of the task or container results in the failure of the package in which it is defined.
  • FailParentOnFailure: When set to True, the failure of the task or container results in the failure of its container.
  • MaximumErrorCount: Specifies the maximum number of errors that can occur before the item fails.

Sunday, April 13, 2014

Get Date and TimeStamp from an FTP Server

    // serverUri points to the file on the FTP server, e.g. ftp://ftp.example.com/somefile.txt
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(serverUri);
    request.Method = WebRequestMethods.Ftp.GetDateTimestamp;
    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    Console.WriteLine("{0} {1}", serverUri, response.LastModified);

Monday, March 24, 2014

TF31003. Either you have not entered the necessary credentials or your user account does not have permission to connect to the Team Foundation server.

Scenario

Trying to connect to TFS 2010 using VS 2008 Team Explorer, where TFS is on a different domain than the currently logged-in domain.

Solution

Make sure the server URL is the full URL including the team project collection name. If you don't include the project collection name in the URL, you will get the above error.
http://[server name]:[port name]/tfs/[project collection name]

Also make sure to include the domain beside the user name when you are asked for credentials as the TFS server is on a different domain.
User name: domain name\username

Monday, February 17, 2014

VS 2013 - New Features - One ASP.NET

There is only one web project type in Visual Studio 2013, unlike previous versions of Visual Studio.

From Standard Web Forms to MVC all the project templates related to building websites and web applications are available in the same place.

System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.IO.FileNotFoundException: Could not load file or assembly 'xxxxxx.dll, Version=1.0.0.0, Culture=neutral, PublicKeyToken=7e7edcc6a6871fc7' or one of its dependencies. The system cannot find the file specified

  • Check whether the DLL is deployed successfully in GAC (Global Assembly Cache)
  • If not, make sure that the DLLs are available in the following locations
    • D:\Program Files\Microsoft SQL Server\100\DTS\Binn (depending on the version of SSIS)
    • D:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn (depending on the version of SSIS)
  • Also make sure that the DLL is compiled for the right platform (32-bit or 64-bit or both)

Sunday, February 16, 2014

SSIS FTP Step gives an error "The Password Was Not Allowed" and doesn't connect to the FTP Server

  • Right-click your FTP connection manager, select Properties, and type the FTP password.
  • Save the package with the ProtectionLevel property set to EncryptSensitiveWithPassword, provide a password, then save and execute the package.

SQL Azure Reporting Services

Windows Azure SQL Reporting is a cloud-based reporting service for the Windows Azure Platform built on SQL Server Reporting Services technologies.

SQL Reporting service is available to current subscribers, but should not be used for new software development projects. The service will be discontinued on October 31, 2014.

An alternative to SQL Reporting is to use one or more instances of SQL Server Reporting Services (SSRS) running on Windows Azure Virtual Machines (VM).

A known issue of SQL Reporting is that report execution is slower than SSRS running on premises. In contrast, performance of SSRS on a VM is similar to the performance you might expect from an on-premises SSRS instance.

Transitioning from SQL Reporting to SSRS on a SQL Server VM provides the following advantages:

  • Extensible report server for customized reporting
  • Scheduled report execution and delivery
  • Integration with hybrid solutions
  • Faster Performance

Wednesday, February 12, 2014

SQL SERVER CLR Integration

SQLCLR allows the integration of .NET Framework CLR components in SQL Server. The .NET assemblies reside in, and run from, the database rather than the file system, and SQL Server needs to verify the code only once, at deployment time. This boosts performance and reliability.

SQLCLR allows you to write stored procedures, functions, triggers, user-defined types and aggregates using a variety of .NET Framework languages such as C# and VB.NET.

When to use SQLCLR?

Don't use SQLCLR for querying data from SQL Server; T-SQL is better for that. Instead, use SQLCLR when a task can't be performed with native SQL functions, such as GZip compression/decompression, PGP encryption/decryption, or complex string manipulation.
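
For illustration (not from the original post), a SQLCLR scalar function for GZip compression might look like the sketch below; the class and function names are examples only.

using System.Data.SqlTypes;
using System.IO;
using System.IO.Compression;
using Microsoft.SqlServer.Server;

public class CompressionFunctions
{
    // Exposed to T-SQL as a scalar function after CREATE ASSEMBLY / CREATE FUNCTION
    [SqlFunction(IsDeterministic = true, IsPrecise = true)]
    public static SqlBytes GzipCompress(SqlBytes input)
    {
        if (input.IsNull)
            return SqlBytes.Null;

        byte[] data = input.Value;   // copy of the input BLOB
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
            {
                gzip.Write(data, 0, data.Length);
            }
            return new SqlBytes(output.ToArray());
        }
    }
}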

How to enable SQLCLR?

exec sp_configure 'clr enabled', 1;
GO
RECONFIGURE
GO

Cross Database Ownership Chaining in SQL SERVER

Consider a scenario where a view in a database named "Database A" accesses a table from "Database A" as well as a table from "Database B". To run the view, the user must normally be a user in Database B as well as Database A. If these databases are tightly linked, cross-database ownership chaining can be used so that the user can access the table in Database B through the view in Database A without having explicit permissions to query the table in Database B directly.

Steps to enable Cross Database Ownership Chaining

Set both Database A and Database B in the chain as trustworthy.

ALTER DATABASE Database1 SET TRUSTWORTHY ON;
ALTER DATABASE Database2 SET TRUSTWORTHY ON;

Make sure that the view in Database A and the table in Database B belong to the same owner.

Enable Cross Database Ownership Chaining. This can be done on whole instance level or specific databases.

For instance level

Exec sp_configure 'cross db ownership chaining', 1;
RECONFIGURE;

For database level

ALTER DATABASE Database1 SET DB_CHAINING ON;
ALTER DATABASE Database2 SET DB_CHAINING ON;

Tuesday, February 11, 2014

Are nested Windows groups supported as SQL Server 2008 logins?

Yes, nested Windows groups are fully supported in SQL Server 2008, regardless of whether the instance is running Windows Authentication or Mixed Mode Authentication.

Make sure the AD groups are created as Security Groups rather than Distribution Lists; otherwise, nested AD groups will not work.

How to start SSMS with a different windows user account than the logged in account?

To start SSMS with a different Windows user account when the credentials are on the same domain:

runas /user:domain\username ssms.exe

Use the "/netonly" switch, you can log in using remote credentials on a domain that you're not currently a member of, even if there's no trust set up

runas /netonly /user:domain\username ssms.exe

Tuesday, February 04, 2014

24 Hours of Pass - Business Analytics Edition

This 24 Hours of PASS: Business Analytics event will take place on February 5, beginning at 16:00 GMT, and will feature 12 webcasts followed by 12 hours of on-demand replay.

Sessions

Advanced Analysis Techniques

Advanced Analytics in Excel 2013
Excel is “the” analytical tool in Microsoft’s suite for advanced analysts. Of course, you know that...
Dejan Sarka
An Introduction to Predictive Modeling
This session will show you how to solve a real-world data mining problem by demonstrating the...
Carlos Bossy
Introduction to Querying in DAX
The DAX (Data Analysis Expressions) language can be used to query a Tabular and PowerPivot data...
Marco Russo
Predictive Analytics for Absolute Beginners
In this one-hour preview presentation, we cover the main motivation for learning something about...
Mark Tabladillo, Artus Krohn-Grimberghe

Analytics and Visualization

Adding Valuable Techniques to your Data Science Toolbox
Data Science requires us to see and understand complex patterns in data, so we use techniques like...
Mark Whitehorn
End-to-End Power BI
The aim of this session is to excite you with the potential of Microsoft's new self-service suite...
Peter Myers
Power Query: Beyond the Basics
You already know that you can accomplish a lot within the Power Query user interface. If you are a...
Chris Webb
Small Big Data - Phase 0 - Data Hygiene 
In this talk, we will talk about the starting point for every Big Data project - getting your...
Lynn Langit

Big Data

Has Big Data Killed the EDW?
Big Data technologies have given us the ability to process massive amounts of data at a low cost,...
Stacia Misner
Implementation and Design on the Parallel Data Warehouse
If you need to design a data warehouse that scales for today and into the future, then you need to...
Jason Strate

Information Delivery

DataViz You Thought You Could NOT Do with SSRS
Despite SQL Server Reporting Services (SSRS) being a very flexible and adaptable tool when it comes...
Jason Thomas

Strategy and Architecture

Panel: Myths, Misunderstandings, and Successes in Data Analytics
Big Data, Business Analytics, Data Analytics, NoSQL, Relational . . . do we even agree on what we...
Karen Lopez, Joseph D'Antoni, Lynn Langit, Stacia Misner

Monday, February 03, 2014

Milliseconds and SSIS DateTime Variables

Want to do delta loads for your Data Warehouse and having problems in storing Last Load time in an SSIS DateTime Variable?

The SSIS DateTime variable data type truncates the milliseconds from the DateTime value. To avoid this, use the String data type for the variable that stores the Last Load time, e.g.:

SELECT CONVERT(VARCHAR(23), LastModified, 121) LastModified FROM ........

Then you can change it to DateTime before passing it back to the database.

.......WHERE LastModified > CONVERT(DATETIME, @LastModified, 121)