The Variable "variablename" hasn't been assigned a default value. Assign a default value and the error will disappear.
Note: This is not an issue in SSIS 2008, but it is an issue in SSIS 2012, SSIS 2014, and later versions.
The Variable "variablename" hasn't been assigned a default value. Assign a default value and the error will disappear.
Note: This is not an issue in SSIS 2008 but an issue from SSIS 2012, SSIS 2014 and above.
If you would like to troubleshoot any errors that you get with Power View, you can enable tracing in PowerPivot in Excel 2013, as shown in the screenshot below.
Once tracing is enabled and the PowerPivot model is active, open the PowerPivot window and click Settings again from the PowerPivot menu to see the trace file path, as shown below.
Note: The trace file is similar to a SQL Trace file, can be opened in the same way, and provides the ability to debug the DAX.
SignalR is an amazing framework that delivers a real-time and bidirectional messaging platform.
ASP.NET SignalR is a library for ASP.NET developers that simplifies the process of adding real-time web functionality to applications. Real-time web functionality is the ability to have server code push content to connected clients instantly as it becomes available, rather than having the server wait for a client to request new data.
SignalR uses the new WebSocket transport mechanism, which is supported by browsers that support HTML5. An ASP.NET SignalR connection starts as HTTP and is then promoted to a WebSocket connection if it is available. WebSocket is the ideal transport for SignalR, since it makes the most efficient use of server memory, has the lowest latency, and has the most underlying features (such as full-duplex communication between client and server).
WebSocket requires the server to be using Windows Server 2012 or Windows 8, and .NET Framework 4.5. If these requirements are not met, SignalR will attempt to use other transports to make its connections.
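As a minimal sketch of the programming model (the hub and method names here are illustrative, not from any particular application), a server-side hub pushes a message to every connected client:

using Microsoft.AspNet.SignalR;

// Minimal ASP.NET SignalR hub: any client can call Send, and the
// server pushes the message to every connected client instantly.
public class ChatHub : Hub
{
    public void Send(string name, string message)
    {
        // broadcastMessage is the name of a client-side callback (illustrative)
        Clients.All.broadcastMessage(name, message);
    }
}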
Current attempts to provide real-time web applications largely revolve around polling and other server-side push technologies, the most notable of which is Comet.
Comet is a web application model in which a long-held HTTP request allows a web server to push data to a browser, without the browser explicitly requesting it. Comet-based push is generally implemented in JavaScript and uses connection strategies such as polling, long-polling or streaming.
With polling, the browser sends HTTP requests at regular intervals and immediately receives a response. This technique was the first attempt by browsers to deliver real-time information. It works well only when the message intervals are well known.
With long-polling, the browser sends a request to the server and the server keeps the request open for a set period. If a notification is received within that period, a response containing the message is sent to the client. If a notification is not received within the set time period, the server sends a response to terminate the open request.
It is important to understand, however, that when you have a high message volume, long-polling does not provide any substantial performance improvements over traditional polling. In fact, it could be worse, because the long-polling might spin out of control into an unthrottled, continuous loop of immediate polls.
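As a rough illustration (the /poll endpoint is hypothetical, and the client is written in C# for brevity), a long-polling client keeps re-issuing a request that the server holds open, and should back off on errors to avoid the unthrottled loop described above:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class LongPollingClient
{
    static async Task Main()
    {
        // Allow the server to hold the request open for up to two minutes
        using (var client = new HttpClient { Timeout = TimeSpan.FromMinutes(2) })
        {
            while (true)
            {
                try
                {
                    // The server responds only when a message is available
                    // or its hold period expires
                    string message = await client.GetStringAsync("http://example.com/poll");
                    Console.WriteLine(message);
                }
                catch (HttpRequestException)
                {
                    // Back off briefly so errors don't degrade into an
                    // unthrottled loop of immediate polls
                    await Task.Delay(TimeSpan.FromSeconds(5));
                }
            }
        }
    }
}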
With streaming, the browser sends a complete request, but the server sends and maintains an open response that is continuously updated and kept open indefinitely (or for a set period of time). The response is then updated whenever a message is ready to be sent, but the server never signals to complete the response, thus keeping the connection open to deliver future messages.
However, since streaming is still encapsulated in HTTP, intervening firewalls and proxy servers may choose to buffer the response, increasing the latency of the message delivery.
HTML5 Web Sockets to the Rescue!
HTML5 Web Sockets provides a true standard that you can use to build scalable, real-time web applications.
ASP.NET MVC excels at accepting form data and generating HTML. It returns both views and data. Requests are mapped to action names. MVC features exist in the System.Web.Mvc assembly.
ASP.NET Web API excels at accepting and generating structured data such as JSON and XML. It returns only data, not views. It is used to build RESTful services and supports content-type negotiation (the response data format can be JSON, XML, ATOM, etc.). Requests are mapped to actions based on HTTP verbs. With the Web API, IIS no longer needs to be used to host HTTP services; services can be hosted in a custom Windows service, a console application, or any other type of host you need. It provides automatic support for OData. The model binding, filters, routing, and other MVC-like features that exist in Web API are different from MVC's and live in the new System.Web.Http assembly. It has a lightweight architecture and is good for devices like smartphones.
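As a rough sketch (the Product model and its data are hypothetical), a minimal Web API controller whose Get* actions are matched to requests by HTTP verb and route:

using System.Collections.Generic;
using System.Web.Http;

// Hypothetical model for illustration
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// GET /api/products maps to GetAllProducts by convention
public class ProductsController : ApiController
{
    private static readonly List<Product> products = new List<Product>
    {
        new Product { Id = 1, Name = "Widget" }
    };

    public IEnumerable<Product> GetAllProducts()
    {
        return products;
    }

    // GET /api/products/1 maps here; the framework negotiates JSON or XML
    public IHttpActionResult GetProduct(int id)
    {
        var product = products.Find(p => p.Id == id);
        if (product == null)
            return NotFound();
        return Ok(product);
    }
}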
I had a scenario where I had to load a csv file which had around 2000 columns. I started looking at the wide table feature which came as part of SQL Server 2008. A wide table can have 30,000 columns, as opposed to a non-wide table, which can have 1,024 columns.
Having 30,000 or even 1,024 columns in a table is a lot of columns, and a design review needs to be done for these types of scenarios. In my scenario, another system generates this file, and the business wants the ability to query it easily via a single table, more or less just for analysis purposes.
Sparse columns are designed to allow a relational database to store and process relatively unstructured data, where any individual entity may have a modest selection from a very wide set of attributes. This sort of data can be found frequently in scientific, pharmaceutical, and medical applications. EAV (Entity/Attribute/Value) storage techniques were engineered for such data; however, EAV data storage has proved to be somewhat inefficient in such areas as query speed and storage cost.
Sparse columns work well where a high proportion of the data in a column is NULL, because negligible storage space is used for the NULL representation. A table that is wide in theory can be made narrow in practice by implementing sparse columns, and the addition of filtered indexes provides effective searches on the sparse data. Sparse columns support the use of check constraints, as long as the constraints allow NULL values (a sparse column requirement).
Column sets provide a solution to the difficulties of efficient imports and exports of sparse data. The untyped XML representation that is generated for each row interacts directly with each sparse field. The XML field accepts inserts, and its data can be extracted and modified easily. This feature effectively supports certain types of serialized object data from various applications.
CREATE TABLE SPARSETABLE
(
    ID INT NOT NULL IDENTITY(1,1)
    ,FIRSTNAME VARCHAR(100) NOT NULL
    ,MIDDLENAME VARCHAR(100) SPARSE NULL
    ,LASTNAME VARCHAR(100) NOT NULL
    ,SPARSEDATASET XML COLUMN_SET FOR ALL_SPARSE_COLUMNS
)

INSERT SPARSETABLE(FIRSTNAME, MIDDLENAME, LASTNAME)
VALUES ('John','Daniel','Smith');
GO
INSERT SPARSETABLE(FIRSTNAME, LASTNAME)
VALUES ('Peter','Wilson');
GO
SELECT [ID]
    ,[FIRSTNAME]
    ,[LASTNAME]
    ,[SPARSEDATASET]
FROM [SPARSETABLE]
SQL Server extended properties allow developers to attach metadata, such as a description or source, to a table or to a column within a table. This allows the information to be documented inside SQL Server itself and queried later.
--Add Extended Property for a table
EXEC sp_addextendedproperty @name=N'Description',
    @value=N'Dimension with the list of all products.',
    @level0type=N'SCHEMA', @level0name=N'dbo',
    @level1type=N'TABLE', @level1name=N'DIM_PRODUCTS'

--Update Extended Property for the table
EXEC sp_updateextendedproperty @name=N'Description',
    @value=N'Dimension with the list of all available products.',
    @level0type=N'SCHEMA', @level0name=N'dbo',
    @level1type=N'TABLE', @level1name=N'DIM_PRODUCTS'
GO

--Add Extended Property for a column
EXEC sp_addextendedproperty @name=N'Description',
    @value=N'Name of the Product.',
    @level0type=N'SCHEMA', @level0name=N'dbo',
    @level1type=N'TABLE', @level1name=N'DIM_PRODUCTS',
    @level2type=N'COLUMN', @level2name=N'PRODUCT_NAME'

--Update Extended Property for a column
EXEC sp_updateextendedproperty @name=N'Description',
    @value=N'Name of the Product',
    @level0type=N'SCHEMA', @level0name=N'dbo',
    @level1type=N'TABLE', @level1name=N'DIM_PRODUCTS',
    @level2type=N'COLUMN', @level2name=N'PRODUCT_NAME'
GO

--Query Extended Properties
SELECT objname, name, value
FROM fn_listextendedproperty(NULL, 'schema', 'dbo', 'table', default, NULL, NULL)

SELECT objtype, objname, name, value
FROM fn_listextendedproperty(NULL, 'schema', 'dbo', 'table', 'DIM_PRODUCTS', 'column', default);
Couldn't find the Package Configurations tab in SSIS 2012? This is because Package Configurations in SSIS 2012 are available only for the Package Deployment Model, not for the Project Deployment Model.
Make sure to convert the project to the Package Deployment Model by right-clicking the project and clicking Convert to Package Deployment Model.
Normally you don't want an ETL load to fail because of data errors. For instance, if you have a file with 1,000 rows of which 10 rows have issues, you don't want the load to fail; instead, the package must successfully load the 990 rows, and the 10 error rows must be redirected to a separate table so the issues can be identified and fixed. If it is a flat file, it is worthwhile to generate and include a row number, which will help in identifying the row in the flat file.
The error table has every column typed as varchar(max), with additional columns such as ErrorCode, ErrorColumn, and ErrorDescription.
Error rows are handled in the Data Flow through the use of the Error Row Configuration properties. These properties tell the Data Flow Components what to do when a row fails an operation, such as data conversion or truncation.
Redirect Row denotes that when a row encounters an error on any column marked as redirected, the entire row is sent out the red error path output, which normally connects to the error table destination. If the red error path output is not used, the row is dropped from the Data Flow.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Translate the numeric error code captured on the error path
    // into a human-readable description
    Row.ErrDesc = ComponentMetaData.GetErrorDescription(Row.ErrorCode);
}
When developing an ETL package, the first question that comes to mind is whether to build a single package doing the end-to-end flow or multiple packages doing the same.
Being an architect, I have been challenged with this question many times by my development team, especially the argument that the single-package model means fewer packages to maintain, whereas multiple packages create an overhead in the number of packages.
One Package = One Unit of Work
It is always best practice to have one package do only one thing. The following benefits can be achieved by doing so.
We also follow the design patterns below.
Always name tasks and components using the following acronyms at the beginning of the task and component names. Following these naming conventions facilitates ongoing maintenance as well as global changes to the packages, since package updates can be performed programmatically.
Task | Prefix |
For Loop Container | FLC |
Foreach Loop Container | FELC |
Sequence Container | SEQC |
ActiveX Script | AXS |
Analysis Services Execute DDL | ASE |
Analysis Services Processing | ASP |
Bulk Insert | BLK |
Data Flow | DFT |
Data Mining Query | DMQ |
Execute DTS 2000 Package | EDPT |
Execute Package | EPT |
Execute Process | EPR |
Execute SQL | SQL |
File System | FSYS |
FTP | FTP |
Message Queue | MSMQ |
Script | SCR |
Send Mail | SMT |
Transfer Database | TDB |
Transfer Error Messages | TEM |
Transfer Jobs | TJT |
Transfer Logins | TLT |
Transfer Master Stored Procedures | TSP |
Transfer SQL Server Objects | TSO |
Web Service | WST |
WMI Data Reader | WMID |
WMI Event Watcher | WMIE |
XML | XML |
Expression | EXPR |
Component | Prefix |
DataReader Source | DR_SRC |
Excel Source | EX_SRC |
Flat File Source | FF_SRC |
OLE DB Source | OLE_SRC |
Raw File Source | RF_SRC |
XML Source | XML_SRC |
Aggregate | AGG |
Audit | AUD |
Character Map | CHM |
Conditional Split | CSPL |
Copy Column | CPYC |
Data Conversion | DCNV |
Data Mining Query | DMQ |
Derived Column | DER |
Export Column | EXPC |
Fuzzy Grouping | FZG |
Fuzzy Lookup | FZL |
Import Column | IMPC |
Lookup | LKP |
Merge | MRG |
Merge Join | MRGJ |
Multicast | MLT |
OLE DB Command | CMD |
Percentage Sampling | PSMP |
Pivot | PVT |
Row Count | CNT |
Row Sampling | RSMP |
Script Component | SCR |
Slowly Changing Dimension | SCD |
Sort | SRT |
Term Extraction | TEX |
Term Lookup | TEL |
Union All | ALL |
Unpivot | UPVT |
Data Mining Model Training | DMMT_DST |
Data Reader Destination | DR_DST |
Dimension Processing | DP_DST |
Excel Destination | EX_DST |
Flat File Destination | FF_DST |
OLE DB Destination | OLE_DST |
Partition Processing | PP_DST |
Raw File Destination | RF_DST |
Recordset Destination | RS_DST |
SQL Server Destination | SS_DST |
SQL Server Mobile Destination | SSM_DST |
Note: Thanks to Jamie Thomson for the naming conventions.
Async Query and Save
Entity Framework 6 provides the ability to execute queries and commands asynchronously using DbContext.
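A minimal sketch of the async APIs, reusing the PersonEntities context that appears in the examples later in this post:

using System.Data.Entity;   // brings in FirstOrDefaultAsync and friends
using System.Threading.Tasks;

public static class AsyncExample
{
    public static async Task UpdateLastNameAsync()
    {
        using (var context = new PersonEntities())
        {
            // The query runs asynchronously without blocking the calling thread
            var person = await context.People
                .FirstOrDefaultAsync(p => p.FirstName == "John");

            if (person != null)
            {
                person.LastName = "Smith";
                // SaveChangesAsync issues the UPDATE asynchronously
                await context.SaveChangesAsync();
            }
        }
    }
}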
Code based Configuration
Entity Framework configurations that are usually set in the app.config file can now be set programmatically using the System.Data.Entity.DbConfiguration class.
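A minimal sketch of a code-based configuration class; the retrying execution strategy shown is just one example of a setting that would otherwise live in app.config:

using System.Data.Entity;
using System.Data.Entity.SqlServer;

// EF6 discovers a DbConfiguration subclass in the same assembly as the context
public class MyDbConfiguration : DbConfiguration
{
    public MyDbConfiguration()
    {
        // Use the SQL Server provider and a retrying execution strategy
        SetProviderServices("System.Data.SqlClient", SqlProviderServices.Instance);
        SetExecutionStrategy("System.Data.SqlClient", () => new SqlAzureExecutionStrategy());
    }
}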
Db Command Logging
EF6 provides the ability to log all the activity performed against the database using context.Database.Log.
using (var context = new PersonEntities())
{
    context.Database.Log = Console.Write;
    var person = context.People.FirstOrDefault(p => p.FirstName == "John");
    if (person != null)
        person.LastName = "Smith";
    context.SaveChanges();
}
Command Interception
Starting with EF 6.1, interceptors can be registered in the config file. Interceptors allow you to run additional logic when EF performs certain operations, such as executing database queries or opening connections. One example is appending to the log file if the log file already exists.
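As a sketch, the same can be done in code with EF 6.1's built-in DatabaseLogger; the second constructor argument asks it to append to the log file if it already exists (the path is illustrative):

using System.Data.Entity.Infrastructure.Interception;

public static class EfLoggingSetup
{
    public static void Enable()
    {
        // append: true appends to the log file if it already exists
        // instead of overwriting it
        var logger = new DatabaseLogger(@"C:\temp\ef-log.txt", append: true);
        logger.StartLogging();
    }
}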
Transactions
EF6 introduced Database.BeginTransaction and Database.UseTransaction to provide more control over transactions. In previous versions of Entity Framework, whenever you executed SaveChanges() to perform create, update, or delete operations in the database, the operation was wrapped in a transaction. That transaction lasted only until the operation was executed, and a new transaction was started for the next operation.
Database.BeginTransaction() – an easier way to start and complete a transaction yourself within an existing DbContext, allowing several operations to be combined within the same transaction and hence either all committed or all rolled back as one. It also allows you to more easily specify the isolation level for the transaction.
Database.UseTransaction() – allows the DbContext to use a transaction that was started outside of Entity Framework.
using (System.Data.Entity.DbContextTransaction dbTran = context.Database.BeginTransaction())
{
    try
    {
        Person person = new Person() { FirstName = "John" };
        context.Persons.Add(person);
        context.Database.ExecuteSqlCommand(
            @"UPDATE Person SET FirstName = 'James'" +
            " WHERE PersonID = 1");
        context.Persons.Remove(person);

        // saves all above operations within one transaction
        context.SaveChanges();

        // commit transaction
        dbTran.Commit();
    }
    catch (Exception ex)
    {
        // roll back the transaction if an exception occurs
        dbTran.Rollback();
    }
}
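And a minimal sketch of Database.UseTransaction(), assuming PersonEntities exposes the DbContext(DbConnection, bool) constructor (the connection string and entity names are illustrative):

using (var conn = new System.Data.SqlClient.SqlConnection(connectionString))
{
    conn.Open();
    using (var sqlTran = conn.BeginTransaction())
    {
        // contextOwnsConnection: false - the connection is managed by the caller
        using (var context = new PersonEntities(conn, contextOwnsConnection: false))
        {
            // Tell EF to enlist in the externally started ADO.NET transaction
            context.Database.UseTransaction(sqlTran);
            context.Persons.Add(new Person() { FirstName = "John" });
            context.SaveChanges();
        }
        // The transaction is committed (or rolled back) outside Entity Framework
        sqlTran.Commit();
    }
}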
DbSet.AddRange and DbSet.RemoveRange
DbSet in EF6 introduces the new methods AddRange and RemoveRange. DbSet.AddRange adds a collection of entities to the DbContext; DbSet.RemoveRange removes a collection of entities from the DbSet.
IList<Person> persons = new List<Person>();
persons.Add(new Person() { FirstName = "Person 1" });
persons.Add(new Person() { FirstName = "Person 2" });
persons.Add(new Person() { FirstName = "Person 3" });

using (var context = new PersonEntities())
{
    context.People.AddRange(persons);
    context.SaveChanges();
}

using (var context = new PersonEntities())
{
    IList<Person> personList = context.People.Where(p => p.FirstName == "Person 1").ToList();
    context.People.RemoveRange(personList);
    context.SaveChanges();
}
Custom Conventions
EF6 provides the ability to define your own custom conventions, which become the default behavior for your models. There are two types of conventions: Configuration Conventions and Model Conventions.
Configuration Convention
Configuration Conventions are a way to configure a set of objects without overriding the configuration provided by the Fluent API. We can define such a convention inside the OnModelCreating method, or in a class that inherits from the Convention class.
Suppose you would like all DateTime properties to have the data type datetime2 across all tables in the model. This can easily be achieved with a configuration convention, either by overriding the OnModelCreating method or by creating a separate class that inherits from Convention.
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Properties<DateTime>()
        .Configure(c => c.HasColumnType("datetime2"));
}
Using Convention class
class CustomDateTimeConvention : Convention
{
    public CustomDateTimeConvention()
    {
        Properties<DateTime>()
            .Configure(c => c.HasColumnType("datetime2"));
    }
}
This convention can then be registered in the OnModelCreating method of the DbContext using modelBuilder.Conventions.Add, as sketched below.
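A minimal sketch of the registration:

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    // Register the custom convention so it applies to the whole model
    modelBuilder.Conventions.Add<CustomDateTimeConvention>();
}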
Model Convention
Model Conventions are based on the underlying model metadata. There are two types of model conventions: Conceptual (C-Space) and Store (S-Space). This distinction indicates where in the pipeline a convention is executed. A C-Space convention is applied to the model that the application builds, whereas an S-Space convention is applied to the version of the model that represents the database.
Creating a model convention requires creating a class that implements either IConceptualModelConvention or IStoreModelConvention. Both interfaces accept a generic type parameter derived from MetadataItem, which is used to filter the data types that the convention applies to.
public class ModelBasedConvention : IStoreModelConvention<EdmProperty>
{
    public void Apply(EdmProperty property, System.Data.Entity.Infrastructure.DbModel model)
    {
        // Give any decimal property without an explicit scale a default scale of 4
        if (property.PrimitiveType.PrimitiveTypeKind == PrimitiveTypeKind.Decimal
            && property.Scale == null)
        {
            property.Scale = 4;
        }
    }
}
Stored Procedure Mapping
Entity Framework 6 Code First provides the ability to create and use stored procedures for add, update, and delete operations. To create stored procedures for the Person entity automatically:
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<Person>()
        .MapToStoredProcedures();
}
This will create three procedures: Person_Insert, Person_Update, and Person_Delete. The Person_Insert and Person_Update stored procedures have parameter names that correspond to the property names; Person_Delete will have a parameter for the primary key property, PersonID.
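If needed, the generated procedure names can be customized; a sketch, where the procedure names shown are illustrative:

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<Person>()
        .MapToStoredProcedures(s =>
            s.Insert(i => i.HasName("InsertPerson"))
             .Update(u => u.HasName("UpdatePerson"))
             .Delete(d => d.HasName("DeletePerson")));
}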
Limitations:
Index Attribute
Entity Framework 6 provides the Index attribute to create an index on a particular column in the database.
class Person
{
    public Person() { }
    public int PersonID { get; set; }
    public string FirstName { get; set; }
    [Index]
    public DateTime DateOfBirth { get; set; }
}
By default, the index name will be IX_{property name}, but this can be changed:
[Index( "INDEX_DOB", IsClustered=true, IsUnique=true )] public datetime DateOfBirth { get; set; }
BIML (Business Intelligence Markup Language) is a domain-specific XML language for defining business intelligence and data warehouse solutions. It is compatible with SSIS and SSAS. BIML can be used to build templates from which subsequent packages can be generated automatically in a consistent fashion.
Template Package
Use BIDS Helper to create the BIML files and subsequently generate packages from the BIML file.
If you try to import a large .SQL file using SQL Server Management Studio, you may get an "Out of memory" error. To import the file successfully you can use the SQLCMD utility.
The sqlcmd utility lets you enter Transact-SQL statements, system procedures, and script files at the command prompt, in Query Editor in SQLCMD mode, in a Windows script file or in an operating system (Cmd.exe) job step of a SQL Server Agent job. This utility uses ODBC to execute Transact-SQL batches.
SQLCMD -S [server name] -d [database name] -i [input file]
sqlcmd
-a packet_size
-A (dedicated administrator connection)
-b (terminate batch job if there is an error)
-c batch_terminator
-C (trust the server certificate)
-d db_name
-e (echo input)
-E (use trusted connection)
-f codepage | i:codepage[,o:codepage] | o:codepage[,i:codepage]
-h rows_per_header
-H workstation_name
-i input_file
-I (enable quoted identifiers)
-k[1 | 2] (remove or replace control characters)
-K application_intent
-l login_timeout
-L[c] (list servers, optional clean output)
-m error_level
-M multisubnet_failover
-N (encrypt connection)
-o output_file
-p[1] (print statistics, optional colon format)
-P password
-q "cmdline query"
-Q "cmdline query" (and exit)
-r[0 | 1] (msgs to stderr)
-R (use client regional settings)
-s col_separator
-S [protocol:]server[\instance_name][,port]
-t query_timeout
-u (unicode output file)
-U login_id
-v var = "value"
-V error_severity_level
-w column_width
-W (remove trailing spaces)
-x (disable variable substitution)
-X[1] (disable commands, startup script, environment variables and optional exit)
-y variable_length_type_display_width
-Y fixed_length_type_display_width
-z new_password
-Z new_password (and exit)
-? (usage)
Problem
Got this error when trying to use the ExecuteSQLTask DLL programmatically in a .NET solution.
Solution
Returns the number of BEGIN TRANSACTION statements that have occurred on the current connection.
The BEGIN TRANSACTION statement increments @@TRANCOUNT by 1. ROLLBACK TRANSACTION decrements @@TRANCOUNT to 0, except for ROLLBACK TRANSACTION savepoint_name, which does not affect @@TRANCOUNT. COMMIT TRANSACTION or COMMIT WORK decrement @@TRANCOUNT by 1.
PRINT @@TRANCOUNT
-- The BEGIN TRAN statement will increment the
-- transaction count by 1.
BEGIN TRAN
PRINT @@TRANCOUNT
BEGIN TRAN
PRINT @@TRANCOUNT
-- The COMMIT statement will decrement the transaction count by 1.
COMMIT
PRINT @@TRANCOUNT
COMMIT
PRINT @@TRANCOUNT
Failed to open the hard disk
Cannot register the hard disk
Solution
Install in the following order:
Scenario
Got a set of messages to process using a For Loop in SSIS, but when there is an error in one message, the rest of the messages are not processed; because the child package failed, the parent package fails as well. How do you continue processing messages even when one message has errors?
Solution
Set the MaximumErrorCount property of the For Loop container to 0.
FailPackageOnFailure: when set to True, the failure of the task or container results in the failure of the package in which it is defined.
FailParentOnFailure: when set to True, the failure of the task or container results in the failure of its parent container.
MaximumErrorCount: specifies the maximum number of errors that can occur before the item fails.
// Query an FTP server for the last-modified timestamp of a file
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(serverUri);
request.Method = WebRequestMethods.Ftp.GetDateTimestamp;
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Console.WriteLine("{0} {1}", serverUri, response.LastModified);
Scenario
Trying to connect to TFS 2010 using VS 2008 Team Explorer, where TFS is on a different domain than the currently logged-in domain.
Solution
Make sure the server URL is the full URL, including the team project collection name. If you don't provide the project collection name in the URL, you will get the above error.
http://[server name]:[port number]/tfs/[project collection name]
Also, make sure to include the domain alongside the user name when you are asked for credentials, as the TFS server is on a different domain.
User name: domain name\username
There is only one web project type in Visual Studio, unlike previous versions of Visual Studio.
From standard Web Forms to MVC, all the project templates related to building websites and web applications are available in the same place.
Windows Azure SQL Reporting is a cloud-based reporting service for the Windows Azure Platform built on SQL Server Reporting Services technologies.
SQL Reporting service is available to current subscribers, but should not be used for new software development projects. The service will be discontinued on October 31, 2014.
An alternative to SQL Reporting is to use one or more instances of SQL Server Reporting Services (SSRS) running on Windows Azure Virtual Machines (VM).
A known issue of SQL Reporting is that report execution is slower than SSRS running on premises. In contrast, performance of SSRS on a VM is similar to the performance you might expect from an on-premises SSRS instance.
Transitioning from SQL Reporting to SSRS on a SQL Server VM provides the following advantages:
SQLCLR allows the integration of .NET Framework CLR components in SQL Server. .NET CLR assemblies reside in and run from the database rather than the file system, so SQL Server needs to verify an assembly only once, during deployment. This boosts performance and reliability.
SQLCLR allows you to write stored procedures, functions, triggers, user-defined types, and aggregates using a variety of .NET Framework languages such as C# and VB.NET.
When to use SQLCLR?
Don't use SQLCLR for querying data from SQL Server; it is better to use T-SQL for that. Instead, use SQLCLR when you can't perform a task using native SQL functions, such as GZip compression/decompression, PGP encryption/decryption, or complex string manipulation.
How to enable SQLCLR?
EXEC sp_configure 'clr enabled', 1;
GO
RECONFIGURE
GO
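As a sketch of the kind of logic that belongs in SQLCLR (the class and function names are illustrative), a CLR scalar function written in C#, compiled into an assembly, and registered in the database:

using System;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public class StringFunctions
{
    // A string manipulation that is awkward in T-SQL but trivial in .NET
    [SqlFunction]
    public static SqlString ReverseString(SqlString input)
    {
        if (input.IsNull)
            return SqlString.Null;

        char[] chars = input.Value.ToCharArray();
        Array.Reverse(chars);
        return new SqlString(new string(chars));
    }
}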
Consider a scenario where a view in a database named "Database A" accesses a table from Database A as well as a table from "Database B". In this scenario, to run the view, the user must be a user in Database B as well as Database A. If these databases are tightly linked, cross-database ownership chaining can be used, so that the user can access the table in Database B through the view in Database A without having explicit permissions to query the table in Database B.
Steps to enable Cross Database Ownership Chaining
Set both Database A and Database B in the chaining as trustworthy.
ALTER DATABASE Database1 SET TRUSTWORTHY ON;
ALTER DATABASE Database2 SET TRUSTWORTHY ON;
Make sure that both Tables inside the Database A and Database B belong to the same owner.
Enable Cross Database Ownership Chaining. This can be done on whole instance level or specific databases.
For instance level
Exec sp_configure 'cross db ownership chaining', 1;
RECONFIGURE;
For database level
ALTER DATABASE Database1 SET DB_CHAINING ON;
ALTER DATABASE Database2 SET DB_CHAINING ON;
Yes, it is completely supported in SQL Server 2008, irrespective of whether the instance runs Windows Authentication or Mixed Mode Authentication.
Make sure the AD groups are created as Security Groups rather than Distribution Lists. If they are not created as Security Groups, nested AD groups will not work.
To start SSMS with a different Windows user account when using credentials on the same domain:
runas /user:domain\username ssms.exe
With the /netonly switch, you can log in using remote credentials on a domain that you're not currently a member of, even if there's no trust set up:
runas /netonly /user:domain\username ssms.exe
Want to do delta loads for your data warehouse and having problems storing the last load time in an SSIS DateTime variable?
The SSIS DateTime data type truncates the milliseconds from the DateTime value. To avoid this, set the data type of the variable that stores the last load time to String, e.g.:
SELECT CONVERT(VARCHAR(23), LastModified, 121) LastModified FROM ........
Then you can convert it back to DATETIME before passing it to the database:
.......WHERE LastModified > CONVERT(DATETIME, @LastModified, 121)