Thursday, December 5, 2013

Task 22. SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER


This week I had another SSIS bug to kill. :)

I have 3 SSIS packages: 2 child packages and 1 master package.


The Control Flow of the first package is wrapped in a Sequence Container to provide rollback capability.

In the Sequence Container properties, the TransactionOption property is set to Required.

The second package has another Sequence Container with the same setting.
The master package, however, contains only 2 Execute Package Tasks.

In the Properties window of the master package, the same TransactionOption property is set to Supported.


So, all these packages work fine on the original server.
The problem appears when moving them to another server: the client's server.

After modifying all the connection strings for all the packages, an error appears:

[OLE DB State Source [1565]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "DDBillingPortal_TempLive.dd" failed with error code 0xC0202009.  There may be error messages posted before this with more information on why the AcquireConnection method call failed.

I searched through a lot of articles, but none of them offered a solution for this error. In the end, the problem turned out to be the TransactionOption set to Required instead of Supported in those 2 child packages.



Tuesday, August 13, 2013

Microsoft Certifications



I just had a discussion with someone regarding the value of Microsoft certifications and what exactly it takes to become MCSA certified. Last year Microsoft launched their new MCSA (Microsoft Certified Solutions Associate) certification. While an entry-level candidate may be able to successfully sit the first exam, by the time you get to the second or third you will likely need at least 2 years of experience to pass. The MCSA requires passing 3 exams: 70-461, 70-462 and 70-463, but there are more beyond those, like 70-464 and 70-465. Who knows, maybe by next year the smart folks at Microsoft will release another 5 certifications :-)


Exam 70-464:

Developing Microsoft SQL Server 2012 Databases

http://www.microsoft.com/learning/en/us/exam.aspx?id=70-464

Exam 70-465:

Designing Database Solutions for Microsoft SQL Server 2012

https://www.microsoft.com/learning/en/us/exam.aspx?ID=70-465

Tuesday, April 2, 2013

Task 21. Data Processing Techniques





Ever wondered how to process millions of rows daily?
SQL Server 2012 comes with new T-SQL functions and index enhancements that help solve high-volume data processing with techniques that are kind to the server.

The main problem is related to:
1. High volume data (size is relative)
2. Business requirements (complex business logic, duration, data availability, scalability, configurability, rerunnability)

DBAs go crazy when it comes to disk space, memory usage and CPU bottlenecks. They need to care about availability/response time, risk reduction/avoidance and governance.

You need to know all the details about the SQL Server environment: recovery model, utilisation and the hardware environment (number of processors, memory, network speed, disk configuration).
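
A quick way to gather part of this information is through the catalog views and DMVs. A minimal sketch, assuming SQL Server 2012 (the memory column is named differently in earlier versions):

-- Recovery model for each database
SELECT name, recovery_model_desc
FROM sys.databases;

-- CPU and memory visible to the instance (SQL Server 2012 column names)
SELECT cpu_count,
       hyperthread_ratio,
       physical_memory_kb / 1024 AS physical_memory_mb
FROM sys.dm_os_sys_info;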

Useful techniques:
1. Reliable pipe (batch processing using T-SQL)
2. Fast pipe (SSIS, BCP, BULK INSERT; see the sketch after this list)
3. Divide and conquer (Balanced Data Distributor with SSIS, partitioning)
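
As an illustration of the fast pipe, here is a minimal BULK INSERT sketch; the file path, staging table and options are hypothetical:

-- Load a flat file into a staging table (hypothetical file and table)
BULK INSERT dbo.DailySpeed_Staging
FROM 'C:\feeds\daily_speed.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 100000,  -- commit in batches of 100,000 rows
    TABLOCK                    -- table lock allows minimal logging under SIMPLE/BULK_LOGGED
);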

The idea of batching in T-SQL is to break the problem into smaller pieces:
1. Table lock duration is reduced
2. Fewer rows to roll back
3. Less TempDB consumption
4. Control over error handling
5. Customised visibility on progress


Batching in SQL is great when:
1. Transferring data within the same instance
2. The transfer is low priority and long-running
3. The batching script can be converted to a stored procedure - ideal for replication
4. TempDB size is limited
5. The transaction log stays generally small under the SIMPLE or BULK_LOGGED recovery model

"Break large delete operations into chunks" is an article worth reading for daily data processing operations.
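
A minimal sketch of that chunked-delete pattern, assuming a hypothetical archive table and a 12-month retention rule:

-- Delete old rows in small batches instead of one huge transaction
DECLARE @BatchSize int = 10000;
DECLARE @RowsDeleted int = 1;

WHILE @RowsDeleted > 0
BEGIN
    BEGIN TRANSACTION;

    DELETE TOP (@BatchSize)
    FROM dbo.DailySpeed_Archive                          -- hypothetical table
    WHERE CaptureDate < DATEADD(MONTH, -12, GETDATE());  -- hypothetical retention rule

    SET @RowsDeleted = @@ROWCOUNT;

    COMMIT TRANSACTION;

    PRINT CONCAT('Deleted ', @RowsDeleted, ' rows');     -- visibility on progress

    IF @RowsDeleted > 0
        WAITFOR DELAY '00:00:01';                        -- let the log and other workloads breathe
END

Each iteration commits its own small transaction, which keeps lock duration, rollback size and transaction log growth under control.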

There are some tricky scenarios that cannot be handled with batching, like getting a distinct list over a number of columns from 100 million rows, or joining 100 million rows with another 100 million rows on nvarchar(500) columns.

SQL Server 2012 introduces columnstore indexes, which group and store the data for each column and then join all the columns to complete the whole index. They enable faster performance for common data warehousing queries such as filtering, aggregating, grouping and star-join queries.

CREATE NONCLUSTERED COLUMNSTORE INDEX [ISCS_DailySpeed_ColumnStore]
ON [dbo].[DailySpeed_ColumnStore] ([LocationID], [CaptureDateID], [DownloadKbps], [UploadKbps])
WITH (DROP_EXISTING = OFF) ON [PRIMARY]
GO
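
For example, a grouping/aggregation query over the indexed columns is the kind of workload this columnstore index is meant to speed up (the query below is just an illustration against the same table):

-- Average speeds per location: a typical grouping/aggregation query
SELECT  [LocationID],
        AVG([DownloadKbps]) AS AvgDownloadKbps,
        AVG([UploadKbps])   AS AvgUploadKbps
FROM    [dbo].[DailySpeed_ColumnStore]
GROUP BY [LocationID];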

SSIS - Data Flow Task
Minimally Logged Inserts (see the sketch below):
1. The transaction log keeps track of extent allocations and metadata changes only
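
A minimal sketch of a minimally logged insert, assuming the database uses the SIMPLE or BULK_LOGGED recovery model, the target is a heap, and both table names are hypothetical:

-- INSERT ... SELECT with TABLOCK can be minimally logged into a heap
-- under SIMPLE or BULK_LOGGED recovery (SQL Server 2008 and later)
INSERT INTO dbo.DailySpeed_Staging WITH (TABLOCK)
SELECT LocationID, CaptureDateID, DownloadKbps, UploadKbps
FROM dbo.DailySpeed_Source;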