Saturday, May 23, 2015

Accessing Reporting Services Using a Fully Qualified Domain Name

If you installed SQL Server Reporting Services (SSRS) on a server in a domain, you use a domain user to start the service, and you did not perform any further configuration, then most probably you can only access Report Manager using an IP address and not the Fully Qualified Domain Name (FQDN) of the server, since no SPN is set.

If you try to use the Fully Qualified Domain Name to access Reporting Services, you will most probably be prompted for a username and password several times, ending up with an empty page.

In order to be able to access Reporting Services using the FQDN, you will need to perform the following actions:

1. Register a Service Principal Name (SPN) for the Domain User Account that Runs SSRS
Consider the following example:

sample computer name: reportsrv01
sample domain: example.com
sample domain account: example\ssrsuser

Then, on the Domain Controller, in a Command Prompt with elevated rights (Run as Administrator), execute one of the following:

example 1: If SSRS listens on port 80 (no need to specify 80 as it is the default HTTP port):
Setspn -s http/reportsrv01.example.com example\ssrsuser

example 2: If SSRS listens on any other port (e.g. 2430):
Setspn -s http/reportsrv01.example.com:2430 example\ssrsuser
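
You can then verify the registration by listing the SPNs that are set for the account (a quick check, using the same sample account):

Setspn -L example\ssrsuser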

2. Edit the RsReportServer.config File
On the Reporting Services server, in the virtual directory of SSRS, edit the "RsReportServer.config" file and locate the <AuthenticationTypes> section.

Then add <RSWindowsNTLM/> as the first entry in the <AuthenticationTypes> section.
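
The relevant part of the file would then look similar to the following (a sketch; the other entries in your file may vary):

<AuthenticationTypes>
    <RSWindowsNTLM/>
    <RSWindowsNegotiate/>
</AuthenticationTypes>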

The above step will actually enable NTLM.

For more information please visit this MSDN article.



Monday, May 4, 2015

Transferring Ownership of All Database Objects Back to DBO

Consider the following scenario: you created a Windows or SQL login in SQL Server and, in the user mapping, along with assigning a database role membership, you assigned as the default schema something different than “dbo”.

Take as an example the following screenshot:

As you can see, the SQL login “testLogin” has been assigned the database role “db_owner” on the database “SampleDB1” but along with that, “testLogin” was also set as the default schema.

If you do that, any object the login “testLogin” creates will be owned by the schema “testLogin” and not by “dbo”.

If, however, for any reason you want to remove “testLogin” from the SQL Server instance, you will not be able to do it, because the login, through its schema, will own database objects.
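
To quickly check which schemas “testLogin” owns in the database (a small helper query, not part of the original scenario), you can run:

USE SampleDB1
GO
SELECT [name] AS OwnedSchema
FROM sys.schemas
WHERE principal_id = DATABASE_PRINCIPAL_ID('testLogin')
GO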

If you still want to remove the login, don’t lose hope, there is a way! :)

To achieve this, you need to transfer the ownership of the objects back to the “dbo” (or any other) schema.

You can do that using the “ALTER SCHEMA” command. As described in BOL, this command transfers a securable between schemas.

So how can you use this command in order to transfer back to dbo all objects owned by "testLogin"?

One of the easiest approaches is to generate the set of SQL statements that you can run in order to restore ownership for all objects of the following types:
- tables
- views
- functions (scalar, table-valued)
- stored procedures
- ...and maybe any other object types

A sample logic that could do that would be:

--User Option 1: Set the database (in this example "SampleDB1")
USE SampleDB1
GO

--Variable declarations
DECLARE @currentSchema VARCHAR(100)
DECLARE @newSchema VARCHAR(100)

--User Options 2 and 3: set the current schema and the new schema 
--to which you want to transfer the selected database's objects
--in this example current schema='testLogin' and new schema='dbo'
SET @currentSchema='testLogin'
SET @newSchema='dbo'

DECLARE @schemaID INT
SET @schemaID=(SELECT [schema_id] FROM sys.schemas WHERE [name]=@currentSchema)

--The actual logic
IF OBJECT_ID('TempDB..#Statements') IS NOT NULL
DROP TABLE #Statements

CREATE TABLE #Statements(
ScriptToRun VARCHAR(600)
)

--Generate statements for all user tables (U), views (V), stored procedures (P),
--scalar functions (FN) and table-valued functions (TF)
--QUOTENAME protects object names that need delimiting
INSERT INTO #Statements
SELECT 'ALTER SCHEMA ' + QUOTENAME(@newSchema) + ' TRANSFER ' + QUOTENAME(@currentSchema) + '.' + QUOTENAME([name])
FROM sys.objects
WHERE [type] IN ('U','V','P','FN','TF') AND [schema_id]=@schemaID

--Set "Results to Text" (Ctrl + T)
SELECT * FROM #Statements

Then, if you want to proceed with the transfer of ownership, you can run the generated statements in the "ScriptToRun" column, that is, the contents of the temporary table #Statements.
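
For example, for a table named “table1” (a hypothetical name) owned by the “testLogin” schema, the generated statement would look as follows:

ALTER SCHEMA [dbo] TRANSFER [testLogin].[table1]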

Note 1: As all posts in this blog, this post is provided as-is and for educational purposes only. Do not play with your Production environments if you do not know what you are doing and of course, always take backups of your data.

Note 2: In the above logic we searched for user tables and views, stored procedures and scalar/table-valued functions. You can search for even more database object types in a similar way (more info in one of my previous posts).


Wednesday, April 1, 2015

Microsoft SQL Server MVP for the 7th Year in a Row!

Earlier today, I received that special email from Microsoft informing me that I was awarded the Microsoft Most Valuable Professional (MVP) Award:

-------------

Dear Artemakis Artemiou,

Congratulations! We are pleased to present you with the 2015 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in SQL Server technical communities during the past year.

...

The Microsoft MVP Award provides us the unique opportunity to celebrate and honor your significant contributions and say "Thank you for your technical leadership."
-------------

This is the 7th year in a row that I am awarded the Microsoft MVP Award on SQL Server and I am really excited! Receiving the MVP Award is - and always has been - unique every time, because it shows Microsoft's genuine appreciation of what we do by sharing our passion for technology and, in this case, SQL Server.

Working for so many years with SQL Server and being involved with the worldwide SQL Server community is a way of living for me. The active involvement with the community, the direct communication with Microsoft and especially with the SQL Server Product Group and our Community Program Manager, the exchange of knowledge with fellow MVPs and the broader community, the participation in conferences, user group meetings, blogging, book authoring, software development and many other activities, is a constant source of inspiration and creativity for me. All this makes me not only a better professional but also a better human being, because it is all based on the genuine passion of sharing knowledge with fellow people on a technology we all love, and that is SQL Server.

A huge thanks to all of you, my fellow SQL Server community members for just being awesome.

I would also like to thank Microsoft for re-awarding me with this prestigious award, the SQL Server Product Group for doing all this great stuff with SQL Server and related technologies and, of course, our Community Program Manager at Microsoft for keeping us up to date with all this great stuff provided by Microsoft. Additionally, I would like to thank the local Microsoft office in Cyprus for supporting our community activities at the Cyprus .NET User Group.

On top of everything, a huge thank you to my beautiful wife and daughter for their true support, love and understanding for all those hours I spent away from them working on all that "community stuff" :) Without your support, you know it would just not feel right. I love you and thank you from the depths of my heart! Once again, this award is dedicated to you!

The journey continues. My commitment to all of you is that I will continue doing my best and sharing with you the technical knowledge that I acquire on SQL Server, as well as my experience, via all the available channels!

This is going to be a really amazing year. Among other things, for the last year I have been silently working on developing something really special that has to do with an exceptional technology in SQL Server, that is, In-Memory Optimization. The development is now complete and testing/QA is in its final stages. This is going to be huge and it will be announced real soon! Stay tuned friends!


Monday, March 30, 2015

There is insufficient system memory in resource pool 'internal' to run this query.

Even though you might get the above error message on different occasions, one thing is for sure: it has to do with memory management. :)

SQL Server allows you to limit the maximum memory that can be utilized by its Database Engine. You can use this feature in order to guarantee that the OS has access to memory that cannot be claimed by SQL Server and thus ensure stability. However, you have to remember to re-evaluate this setting in the event of a memory upgrade on the machine.

You can perform the above action as follows:
Right-click on the instance name in SSMS - select "Properties" - select the "Memory" tab. You then get the below dialog (in this example I have set the maximum server memory to 3GB):

The above method is a way to ensure OS stability by preventing SQL Server from consuming all of the machine's memory and by leaving an amount of memory always available for the OS.
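
For reference, the same setting can also be applied with T-SQL. Below is a sketch of the equivalent commands (3072 MB corresponds to the 3GB of the above example):

--Enable advanced options so that "max server memory (MB)" becomes configurable
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
GO

--Cap the memory of the Database Engine at 3GB (the value is in MB)
EXEC sp_configure 'max server memory (MB)', 3072;
RECONFIGURE;
GO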

Now, in case you get the "There is insufficient system memory in resource pool 'internal' to run this query." error, you can try the following actions:


Action 1
Run the T-SQL command:
ALTER RESOURCE GOVERNOR RESET STATISTICS

As described in an MSDN Library article, the above command causes Resource Governor to reset statistics on all workload groups and resource pools, including the "internal" resource pool.

Action 2
If the above action does not solve the issue, you can try to free up more memory and, eventually, upgrade the memory capacity of the machine.



Thursday, March 5, 2015

Handling NULL Character \x00 when Exporting to File Using BCP

This article discusses the behavior of the BCP utility when extracting data from SQL Server to files and more specifically to CSV files.

Imagine that you have the below table:

As you can see, records 2 and 3 contain null values.

Then you export the table contents to CSV using BCP:

bcp "SELECT [code],[description],[price] FROM [TestDB1].[dbo].[product]" queryout "c:\tmp\testExtract.csv" -c -C 1253 -t ; -S ".\SQL2K14" -T

Output:

As you can see on the above screenshot, the output seems correct. The records' NULL values have been replaced with empty strings.

Now consider that the original table, instead of NULL values, has empty strings:

Let's try again to export the new table contents to CSV using BCP:

bcp "SELECT [code],[description],[price] FROM [TestDB1].[dbo].[product]" queryout "c:\tmp\testExtract.csv" -c -C 1253 -t ; -S ".\SQL2K14" -T

Output:

As you can see, the empty strings have been replaced by NULL characters, which correspond to the hexadecimal character \x00.

From the MSDN Library article on BCP, we read the following:

  • out copies from the database table or view to a file. If you specify an existing file, the file is overwritten. When extracting data, note that the bcp utility represents an empty string as a null and a null string as an empty string.
It seems from the above example that the queryout keyword has the same behavior regarding null and empty strings.
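
Given this behavior, a possible workaround (a sketch, not something suggested by the MSDN article) is to convert the empty strings to NULLs inside the extraction query itself, so that bcp writes them out as empty fields instead of \x00 characters:

bcp "SELECT [code], NULLIF([description],'') AS [description], [price] FROM [TestDB1].[dbo].[product]" queryout "c:\tmp\testExtract.csv" -c -C 1253 -t ; -S ".\SQL2K14" -T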

Now, if you are going to feed the CSV file to another application for parsing and processing, and the application does not explicitly handle possible occurrences of the NULL character, then most probably an error will occur.

To this end, always keep in mind the above behavior when extracting data to files using BCP.



Wednesday, February 18, 2015

Searching for Keywords in SQL Server Jobs

Hi friends all around the globe, it's been a while since my last article, but this is a very busy period with many other community activities of which you will soon enjoy the outcome! These activities have to do with heavy software development, as well as the authoring of my third eBook, titled "Developing SQL Server". More updates on everything will be provided along the way.

This article is actually a T-SQL tip that can come in quite handy. It tackles the issue of finding which SQL Server Agent jobs are affected by schema changes (e.g. a table rename).

Consider the following example where you have three tables:
-table1
-table2
-table3

You also have a SQL Server Agent job that references the above tables.

Then, for any reason, you want to rename 'table2' to 'table2New'.

What happens to the SQL Server Agent job the next time it runs? Of course it fails.

When you rename an object in SQL Server using the sp_rename stored procedure, the Database Engine warns you with the below message:
Caution: Changing any part of an object name could break scripts and stored procedures.
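
For example, the rename in this scenario would be performed as follows (assuming that 'table2' belongs to the dbo schema):

EXEC sp_rename 'dbo.table2', 'table2New';
GO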

In order not to get into problems with such actions, especially in the case of large databases, you must first research whether there are any functions, stored procedures, jobs, etc. that might be affected.

The below script will return all jobs that include one or more steps that reference a given string pattern (this can be a table name, stored procedure name, function name, etc.). In this case we search for 'table2':

----------------
declare @pattern nvarchar(max)
set @pattern='table2'

select j.[job_id],j.[name],j.[enabled],j.[description],s.command,j.[date_created],j.[date_modified]
from msdb.dbo.sysjobs j
inner join msdb.dbo.[sysjobsteps] s on j.job_id=s.job_id 
where s.command like '%'+@pattern+'%'
----------------

A subsequent article will contain a master procedure that can be used for searching for a given pattern in all the programmability objects in SQL Server.



Wednesday, December 31, 2014

Selected Tweets of Year 2014!

So here I am, writing the last post for 2014! Don't worry, new posts for 2015 are coming up soon! :) With only a few hours left to welcome the new year, my thoughts go back deep into 2014, trying to recollect the highlights of this year from my personal point of view. Beyond the personal highlights, I would like to add a note regarding the Community-related ones.

In 2014 I launched Artemiou Data Tools: a brand new community initiative aiming to provide high-quality, simple and still straight-to-the-point tools. All tools have, and will always have, a Community Edition free for all! New websites, new content, new articles, new software and much more is what I recollect from 2014. Another major highlight was my participation in the MS Conference in Cyprus on March 21, 2014 as an invited speaker, where I talked about #SQLServer 2014 and the In-Memory OLTP Engine - that session was amazing!

Throughout all these years, the interaction with you, the community, grows even stronger. This interaction really excites me and gives me the strength to move on and come up with more initiatives! The discussions after my sessions in conferences or via your comments on my articles, the email exchanges with your questions and topics for discussion, and all the other methods of communication are a unique experience for me, and I am proud to be part of this great SQL Server community.

The year 2015 will be even more powerful! I am planning to release the In-Memory OLTP Simulator (stay tuned on this one - it is something really amazing), a new eBook, a website, other new tools, articles and much more!

Wishing you a Happy New Year 2015 always in good health and a constant quest for knowledge!

Below are my selected Tweets for 2014!

Blog Posts/Articles and Announcements


  • The mobile version of my official website is now live! Check it out via a mobile device at: http://www.aartemiou.com  #MVPBuzz #Community
  • Join me at MS Conference on March 21 where I will talk about #SQLServer 2014 and the In-Memory OLTP Engine - http://www.imhbusiness.com/microsoftevent2014 #MVPBuzz
  • Artemiou Data Tools just got its own Twitter page! Feel free to follow and support my initiative! @ArtDataTools #MVPBuzz #Software #fb
  • My new website is now live! Its purpose is to be used as a hub for SQL Server knowledge and data tools. http://www.aartemiou.com  #MVPBuzz #fb


News
  • [News] I am launching soon two sub-page systems within my official website: (i) Artemiou SQL Books,  (ii) Artemiou Data Tools #Community #fb
  • Congratulations to the winners of CY Windows Hackathon for the amazing apps and congrats to all for the great ideas and effort! Keep it up!
  • Today it's my #twitterversary! 5 years of tweeting!
  • The data export tool just became a multi-exporter! New release date: Thursday, June 26!
  • Tomorrow, @ArtDataTools releases a brand new free data tool that allows the user to easily export data from #SQLServer to Excel! #MVPBuzz
  • Are you ready for tomorrow's public release of #SQLServer 2014? ;) Are you ready for the In-Memory OLTP Engine? :) #MVPBuzz
  • Written 10K lines of C# / SQL code so far...the In-Memory OLTP Simulator for #SQLServer started taking form! More news soon! #MVPBuzz #fb
  • Working on my session for #Microsoft Conference 2014 in Cyprus...with 4 hours of sleep and my 5th cup of coffee :) #MVPBuzz #SQLServer
  • [News] My personal website enriched with my SQL Books and Data Tools launches next week on Feb 5! #MVPBuzz #SQLServer #Community #fb
  • Artemiou Data Tools: My collection of S/W tools that I develop and have to do with data processing and SQL Server interaction #MVPBuzz #fb
  • Artemiou SQL Books: My current and upcoming books on different SQL Server topics #Community #fb #MVPBuzz

Miscellaneous
  • UI design takes so much time! Can't I just write the code? :)
  • Sometimes, exception handling in software development takes much, but much longer than the actual development! :) #MVPBuzz
  • Setting up a Win Srv 2K12 R2 VM in Hyper-V Manager on my Windows 8.1 box while developing! Extremely fast, extremely smooth! #MVPBuzz
  • Just used Telerik's free tool JustDecompile. Cool tool! #Community
  • Doing some serious development in Visual Studio 2013 running on Windows 8.1 x64 with Update 1. This is smooth! :) #MVPBuzz
  • Into deep development on In-Memory OLTP Simulator. Developing a plug-in engine... #MVPBuzz #SQLServer #fb
  • #Microsoft Conference was a big success today! Thank you for attending my session on #SQLServer In-Memory OLTP Engine! #MVPBuzz
  • Everything is ready for my session at the MS Conference 2014! My In-Mem OLTP Simulator for #SQLServer is ready for some demos! #MVPBuzz
  • Just backed up and restored a SQL 2014 database from Windows Azure storage to my local SQL Server installation! Pretty cool! :) #MVPBuzz



Friday, December 12, 2014

How to Import and Export Unstructured Data in SQL Server - FileTables: The Movie!

A few weeks ago I was contacted by Webucator, an online training provider that requested my permission to include one of my articles in a free video-training series called "SQL Server Solutions from the Web". This is one of my articles on FileTables, titled "How to Import and Export Unstructured Data in SQL Server – FileTables". As I really liked this initiative by Webucator, I gave them my permission to turn my article into a video and include it in the series, and here's the result, which I find very cool!



I really liked the video which, by the way, is a high-quality, clean video with straight-to-the-point explanations, useful annotations and, above all, content that highlights the most important points of my blog article.
If you want to see FileTables in action then this is a good opportunity!

The free SQL Server Solutions from the Web course is listed on Webucator's SQL Server training page where you can find the full list of available SQL Server-related training classes provided by Webucator.

Last but not least, a big thank you to Webucator for being an active supporter of the technical community. Keep it up guys!


Monday, December 1, 2014

CY Hackathon Nov 29-30 2014: A Short Review

On the weekend of Nov 29-30, the Windows Hackathon took place in Cyprus. The contestants had almost two days to develop a universal application for Windows Store and Windows Phone. They were given an open data API that they had to use. Beyond that, everything was up to their imagination, which, as their presentations proved, has no limits!

I had the joy of participating in the three-judge committee, where we had to judge the apps on four aspects:
  • Functionality
  • Idea
  • User Interface
  • Universal nature (whether the app could run on both Windows Phone and Windows 8.1 PCs)
The event was a huge success. With more than 50 participants, many teams were formed and developed some really great apps.

The three winning teams presented three very interesting apps that have great potential ahead. In addition, all the participants’ ideas were very good and proved that the level of software development in Cyprus is very high.

It is really exciting to see that so many people stayed for almost two days without sleep just to follow their passion, to put their ideas in motion and develop cool applications.

My advice to all of them, and at the end of the day to all of us - including me - is to follow their passion, chase their ideas, make a plan and develop that great app that will change the world!

Once again, congratulations to all participants, as well as to the Computer Science Department for hosting the event, and a huge bravo to Microsoft Cyprus for organizing and sponsoring such an exciting event.


Thursday, October 30, 2014

Listing all Tables of a Linked Server's Database

There are many scenarios in the DBMS world where you might need to use a linked server between two different SQL Server instances in order to execute direct queries between them. However, the majority of T-SQL auto-complete tools have difficulty listing the objects (i.e. tables) of a linked server's database.

A way to see these objects is to browse via SSMS's Object Explorer. However, what if you want to have a quick view of all the available tables in the linked server's database while writing your T-SQL script?

The answer is easy! You can do this by using the Information Schema Views. Below you will find a stored procedure that I created which can help you list all the tables of a linked server's database.

Here's the code:

------------------------------------------------------------------------------------------
-- Sample Code for Blog Post: Listing all Tables of a Linked Server's Database
-- Blog URL: http://aartemiou.blogspot.com
-- Author: Artemakis Artemiou, SQL Server MVP
-- Email Address: a_artemiou@hotmail.com
-- Official Website: http://www.aartemiou.com
--
-- Disclaimer: This is a sample code. Do not use it in Production Systems before properly
-- testing it. You bear sole responsibility for usage of any content of this T-SQL code. 
-- You are responsible for any loss of data, loss of peace or any damages due to usage of 
-- this code. Always take backup of your data.
------------------------------------------------------------------------------------------
USE [ENTER_DATABASE_NAME];
GO

CREATE PROCEDURE [dbo].[DBTableInfo] (@LinkedServerName varchar(50), @DBName varchar(50))
AS

--variables declaration
DECLARE @LinkedServerNameFiltered VARCHAR(50)
DECLARE @LinkedServerNameFound int
DECLARE @DBsFound int
DECLARE @DBNameFiltered varchar(50)
DECLARE @Command nvarchar(500)

--check if specified linked server exists
SET @LinkedServerNameFiltered=QUOTENAME(@LinkedServerName);
SET @LinkedServerNameFound=0;
SET @LinkedServerNameFound=(SELECT COUNT(*) FROM sys.servers s WHERE s.name=@LinkedServerName);

--report findings
IF @LinkedServerNameFound=0
BEGIN
PRINT 'Error: Linked server ' + @LinkedServerNameFiltered+ ' not found.';
RETURN;
END

--check if specified database exists
DECLARE @QUERY nvarchar(250);
SET @DBNameFiltered = QUOTENAME(@DBName,'''');
SET @DBsFound=0;
SET @QUERY='SELECT @DBs=COUNT(*) FROM ' + @LinkedServerNameFiltered +'.[master].sys.sysdatabases s WHERE s.name='+@DBNameFiltered;
EXEC sp_executesql @QUERY, N'@DBs int OUTPUT', @DBs=@DBsFound OUTPUT;

--report findings
IF @DBsFound=0
BEGIN
PRINT 'Error: Database ' + @DBNameFiltered + ' not found.';
RETURN;
END

--construct dynamic T-SQL statement
SET @DBNameFiltered = QUOTENAME(@DBName);
SET @LinkedServerNameFiltered=QUOTENAME(@LinkedServerName);

SET @Command= 'SELECT TABLE_SCHEMA as TableSchema, TABLE_NAME as TableName,(''SELECT TOP 10 * FROM '+ @LinkedServerNameFiltered +'.'+ @DBNameFiltered +'.'' + QUOTENAME(TABLE_SCHEMA) + ''.'' + QUOTENAME(TABLE_NAME)) as SampleQuery
FROM ' + @LinkedServerNameFiltered+'.' + @DBNameFiltered+'.INFORMATION_SCHEMA.TABLES i WHERE i.TABLE_TYPE LIKE ''%TABLE%'' ORDER BY 1';

--execute the command
EXEC sp_executesql @Command;

------------------------------------------------------------------------------------------
------------------------------------------------------------------------------------------

You can call the stored procedure as follows:

USE [ENTER_DATABASE_NAME];
GO

EXEC [dbo].[DBTableInfo] 'LINKED_SERVER_NAME', 'LINKED_SERVER_DB'
GO

The stored procedure displays three columns for each record: (i) Table Schema, (ii) Table Name, (iii) A sample query that can be executed and returns the top 10 rows for the specific table.
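
For instance, for a hypothetical table [dbo].[Customers] in a database [SalesDB] on a linked server [LINKEDSRV1], the "SampleQuery" column would contain:

SELECT TOP 10 * FROM [LINKEDSRV1].[SalesDB].[dbo].[Customers]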

Sample output of the stored procedure:

Figure 1: Sample Output of the DBTableInfo Stored Procedure.

You can also download the stored procedure using the following link.


Sunday, August 31, 2014

In-Memory Optimization in SQL Server: A Simple Experiment

In one of my previous articles I briefly introduced In-Memory Optimization in SQL Server 2014. In this article we will conduct a simple experiment utilizing this new technology and discuss the benefits. The experiment will be organized in the following steps:
  1. Create the In-Memory processing enabled database
  2. Create the tables (disk-based, memory-enabled)
  3. Populate the tables with sample data
  4. Prepare the processing code for three scenarios (execution modes):
    • Disk-based processing
    • In-Memory optimization
    • In-Memory optimization with natively compiled stored procedure
  5. Run all three execution modes and measure execution times
  6. Discuss findings
Enough words, let’s proceed with some T-SQL code!


1. Create the In-Memory processing enabled database

--Creates a memory-enabled database by adding a memory-optimized filegroup with a specified directory.
--Note: This assumes that directory c:\tmp exists (the subdirectory is created by SQL Server).
CREATE DATABASE InMemDB;
GO
ALTER DATABASE InMemDB
ADD FILEGROUP [inMemOLTP_fg] CONTAINS MEMORY_OPTIMIZED_DATA;
ALTER DATABASE InMemDB
ADD FILE ( name = [inMemOLTP_fg_dir],
   filename= 'c:\tmp\InMemOLTPDir')
   TO FILEGROUP [inMemOLTP_fg];
GO


2. Create the tables (disk-based, memory-enabled)
--Create sample tables representing a scenario involving 200000 Products
USE InMemDB;
GO

--Create traditional table (disk-based)
CREATE TABLE tblDiskBased(
id INT PRIMARY KEY,
code VARCHAR(50),
descr VARCHAR(250),
price float
);
GO

--Create a durable (data will be persisted) memory-optimized table
--Alternative choice: SCHEMA_ONLY = Data will be lost if the server turns off unexpectedly
CREATE TABLE tblMemOptimized(
id INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT=400000),
code VARCHAR(50),
descr VARCHAR(250),
price float
)WITH (MEMORY_OPTIMIZED=ON, DURABILITY = SCHEMA_AND_DATA);
GO


--Create a durable (data will be persisted) memory-optimized table
--Alternative choice: SCHEMA_ONLY = Data will be lost if the server turns off unexpectedly
--Note: This table will be used by the natively-compiled stored procedure
CREATE TABLE tblMemOptimized2(
id INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT=400000),
code VARCHAR(50),
descr VARCHAR(250),
price float
)WITH (MEMORY_OPTIMIZED=ON, DURABILITY = SCHEMA_AND_DATA);
GO

So far: we have created a memory-enabled database and data structures. Now it's time to populate the tables with sample data.

3. Populate the tables with sample data
--Populate the tables with sample data (200K records)
--tblMemOptimized
DECLARE @i INT
SET @i=0
WHILE (@i<200000)
BEGIN
INSERT INTO tblMemOptimized
SELECT CAST(@i as varchar(6)) as id, ('Code '+ (CAST (@i AS varchar(6)))) as code, ('Description '+ (CAST (@i AS varchar(6)))) as descr, (10 + (500-10)*RAND()) as price
SET @i=@i+1
END
GO

--Populate the tables with sample data (200K records)
--tblDiskBased
--tblMemOptimized2
INSERT INTO tblDiskBased
SELECT * FROM tblMemOptimized;
GO

INSERT INTO tblMemOptimized2
SELECT * FROM tblMemOptimized;
GO

All three tables now have the same data (200K rows each). You can easily check it out!
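
For example, here's a quick sanity check:

SELECT COUNT(*) FROM tblDiskBased;     --expected: 200000
SELECT COUNT(*) FROM tblMemOptimized;  --expected: 200000
SELECT COUNT(*) FROM tblMemOptimized2; --expected: 200000
GO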

4. Prepare the processing code for three scenarios (execution modes):
We will wrap each code block into a stored procedure for easier execution.
The scenario is to decrease each product's price by 5%.

a. Disk-based processing
--Disk-based processing
CREATE PROCEDURE updateDiskBased as
BEGIN
UPDATE dbo.tblDiskBased
SET price=price-(price*0.05)
END
GO

b. In-Memory optimization
--Memory-Optimized
CREATE PROCEDURE updateMemoryOptimized as
BEGIN
UPDATE dbo.tblMemOptimized
SET price=price-(price*0.05)
END
GO

c. In-Memory optimization with natively compiled stored procedure
--Natively-Compiled Stored Procedure
CREATE PROCEDURE updateMemoryOptimizedNatively
with native_compilation, schemabinding, execute as owner
as 
begin atomic with (transaction isolation level = snapshot,language = N'English')
UPDATE dbo.tblMemOptimized2
SET price=price-(price*0.05)
END
GO

So far: we have created a memory-enabled database and data structures. We have populated the tables with sample data (all three tables have identical data for experiment consistency purposes) and we have prepared the stored procedures that target each one of the three tables and perform the processing, that is, the update of the 'price' column by decreasing it by 5%.

Now it's time to run each stored procedure and measure the execution times.

But before executing the stored procedures let's check a sample (TOP 5) from each table:

Figure 1: Top 5 Records from each table before update.

5. Run all three execution modes and measure execution times
Ok, now it's time to execute the three stored procedures, thus updating all the tables, and measure the execution times. Here they are:
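
One way to capture these timings (a sketch; the exact measurement harness is not shown in this post) is to enable time statistics before executing the three stored procedures:

SET STATISTICS TIME ON;
GO
EXEC updateDiskBased;
EXEC updateMemoryOptimized;
EXEC updateMemoryOptimizedNatively;
GO
SET STATISTICS TIME OFF;
GO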

Figure 2: Execution times for three modes.

Additionally, let's check once more some records just to confirm that the output is the same in all three tables:

Figure 3: Top 5 Records from each table after update.

6. Discussion
First of all, from the above figure we can see that the output of the three update operations is exactly the same, which validates the correctness of the three different execution modes.

As you can see from the screenshot with the execution times (Figure 2), the slowest execution was the one that involved the disk-based table (elapsed time: 1924 ms), which also had the highest CPU time (328 ms).

The fastest execution was the one of the memory-optimized table combined with the natively compiled stored procedure, where the elapsed time was 329 ms and the CPU time 93 ms!

The memory-optimized execution was faster than the disk-based one, with 404 ms elapsed time and 250 ms CPU time, but it was not as fast as the memory-optimized scenario combined with the natively compiled stored procedure.

In-Memory optimization in SQL Server is a great new technology which can significantly improve performance in many cases. Especially for certain workloads, such as staging processes in Data Warehouse systems, high data-insert rates (e.g. smart metering) and low-latency execution scenarios (e.g. investment trading), In-Memory optimization can boost performance. Of course, before applying it to Production systems, a proper analysis is required, along with testing.

During the last six months I have been working on developing a simulator for In-Memory optimization in SQL Server. I expect to make it publicly available by the end of September 2014. So make sure that you check my official website often for updates!

