Wednesday, July 8, 2015

In-Memory OLTP Simulator v1.0 Now Live!

After two years of heavy development, my tool "In-Memory OLTP Simulator" is finally out!

In-Memory OLTP Simulator enables IT professionals and developers to easily test the powerful In-Memory OLTP Engine of SQL Server 2014 (or later) with different simulations against different workloads. In the Ultimate Edition of the tool, among other things, the user can load data from different sources (e.g. data from production databases using linked servers) into the Simulator's database, simulate real-life scenarios with the use of custom scenarios, and get in-depth analytics.

Video Introduction:
For a short introduction, please watch the video below:

  • Community Edition: This is the free version of the tool where basic functionality is enabled.
  • Ultimate Edition: This edition contains all the basic functionality plus the ability to create/modify and import/export custom scenarios, as well as access to more analytics. The Ultimate Edition of In-Memory OLTP Simulator can be very useful for organizations or individuals that want to create custom scenarios and simulate real production data and processes when using In-Memory Optimization in SQL Server. The Ultimate Edition requires a license to be purchased.

Friday, July 3, 2015

Introducing In-Memory OLTP Simulator

One of the key features shipped with SQL Server 2014 was the long-awaited In-Memory OLTP Engine, code-named “Hekaton”. In-Memory OLTP introduced new data structures for optimizing the performance of OLTP workloads. These data structures are called “memory-optimized” tables and, along with natively compiled stored procedures, they can offer a huge performance boost when it comes to processing large volumes of data, especially in data environments with high levels of concurrency. In certain cases, the execution time can be up to 50 times faster. As its name implies, the In-Memory OLTP Engine is optimized for OLTP workloads. Examples of such workloads include:

  • Staging tables (e.g. in a DW)
  • High data insert rate (e.g. smart metering)
  • Low latency execution (e.g. investment trading)

Since the early releases (previews, CTPs, RCs) of SQL Server 2014, I have been studying and testing the In-Memory OLTP Engine and, of course, the inevitable happened: I became a fan of it!

Throughout the hundreds of experiments I ran against the In-Memory OLTP Engine, the idea of developing a dedicated simulation tool came up, and this is what I am presenting today.

After almost two years of development, I'm pleased to present "In-Memory OLTP Simulator".

What Does In-Memory OLTP Simulator Do?
In-Memory OLTP Simulator enables IT professionals and developers to easily test the powerful In-Memory OLTP Engine of SQL Server 2014 (or later) with different simulations against different workloads. In the Ultimate Edition of the tool, among other things, the user can load data from other sources (e.g. using linked servers) into the Simulator's database and, combined with the logic of a custom scenario, simulate real-life scenarios and get in-depth analytics on performance when utilizing In-Memory Optimization and natively compiled stored procedures in SQL Server. In addition to using In-Memory OLTP Simulator for benchmarking and performance analysis, the user can also use it for resource sizing, as it provides statistics on resource usage (e.g. CPU and memory) when running simulations.

Figure 1: In-Memory OLTP Simulator Start Page.

How Does it Work?
The Community Edition (free) of In-Memory OLTP Simulator supports six standard scenarios that are executed against three modes (where available):
  • Disk-Based
  • Memory-Optimized
  • Memory-Optimized with Natively-Compiled Stored Procedure
In Disk-Based execution mode, all tables are stored on disk. This is the "traditional" execution mode found in all Relational Database Management Systems (RDBMSs). The Disk-Based mode is considered the baseline for all simulations.

In Memory-Optimized execution mode, all tables are loaded and processed in memory. Under this mode, memory-optimized tables can also be combined with disk-based tables if this better suits the user's needs.

In Memory-Optimized with Natively-Compiled Stored Procedure execution mode, the full power of the SQL Server In-Memory OLTP Engine is utilized. Along with the memory-optimized data structures, a new special type of SQL Server stored procedure is used. Natively compiled stored procedures are compiled when they are created (in contrast to disk-based stored procedures, which are compiled at their first execution), thus achieving significantly faster execution.
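As a rough illustration of what these two features look like in T-SQL, here is a minimal sketch (the table and procedure names are hypothetical, and the database is assumed to already have a MEMORY_OPTIMIZED_DATA filegroup):

```sql
-- Minimal sketch: a memory-optimized table (SQL Server 2014 syntax)
CREATE TABLE dbo.SimData (
    Id INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    Payload NVARCHAR(100) NOT NULL
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
GO

-- A natively compiled stored procedure inserting into it;
-- it is compiled to machine code at creation time
CREATE PROCEDURE dbo.usp_InsertSimData
    @Id INT, @Payload NVARCHAR(100)
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    INSERT INTO dbo.SimData (Id, Payload) VALUES (@Id, @Payload);
END;
GO
```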
In-Memory OLTP Simulator undertakes the execution of the scenarios in each mode, monitors the resource usage, logs all statistics and provides rich analytics. Additionally, in the Ultimate Edition of the tool, the user is able to create/modify and import/export custom scenarios as well as get access to multiple processing options and more analytics.

Figure 2: In-Memory OLTP Simulator Simulation Page.

Available Editions
There are currently two editions of In-Memory OLTP Simulator available:
  • Community Edition: This is the free version of the tool where basic functionality is enabled.
  • Ultimate Edition: This edition contains all the basic functionality plus the ability to create/modify and import/export custom scenarios, as well as access to more analytics. The Ultimate Edition of In-Memory OLTP Simulator can be very useful for organizations or individuals that want to create custom scenarios and simulate real production data and processes when using In-Memory Optimization in SQL Server.

Figure 3: In-Memory OLTP Simulator Simulation Page After Running a Simulation Multiple Times.

Release Date
In-Memory OLTP Simulator will be released on July 8, 2015!



Thursday, May 28, 2015

Compatibility Levels Supported by Different SQL Server Versions

Software is evolving and certainly SQL Server is not an exception to the rule. Every few years we need to upgrade our databases in order to run on a newer version of SQL Server and take advantage of significant new features that will enhance the operations of our organization.

However, upgrading a database to a newer compatibility level of any DBMS involves evaluating not just the database but also how an entire application supported by that database will behave.

For many people, the easiest approach is to move the database to the new version of SQL Server but make use of the backwards-compatibility support of the Database Engine. However, this is not a recommended approach, because this way you cannot utilize all the features provided by the Database Engine of the new version of SQL Server.

A recommended high-level practice for moving a database to a newer version of SQL Server is:
  • Analyze your database with SQL Server Upgrade Advisor (for each SQL Server version there is a corresponding version of the Upgrade Advisor) in order to find any incompatibilities (e.g. the usage of deprecated features).
  • Resolve any compatibility issues that might be reported by the tool.
  • Move the database to the desired version of SQL Server on a Test Environment.
  • Test your database and the supported application to check that everything works well (this step should also involve any business users the application might have).
  • Resolve any issues that might be raised.
  • Only if you are sure that everything works well after the testing process and any issues have been resolved then you can proceed to the actual migration.
Of course, it goes without saying that you always need to take backups of your databases, not only during their operational cycle but also always before you are going to make changes to their configuration.
Now, you might encounter cases where you deal with a "legacy" application and it might be nearly impossible to upgrade its supporting database, but at the same time you want to move it to a newer version of SQL Server. In such cases you can make use of the backwards compatibility that the Database Engine of SQL Server provides. However, each version of SQL Server supports backwards compatibility only down to a specific compatibility level. For example, if you have a SQL Server 2000 database, you cannot migrate it to a SQL Server 2012 or 2014 instance, because there is no backwards-compatibility support for such an old version of SQL Server. You can, however, migrate it to a SQL Server 2008 R2 instance.
Below you can find the supported compatibility modes for SQL Server versions 2008/R2, 2012 and 2014:

SQL Server 2008
80: SQL Server 2000
90: SQL Server 2005
100: SQL Server 2008

SQL Server 2008 R2
80: SQL Server 2000
90: SQL Server 2005
100: SQL Server 2008 (and 2008 R2 - it is the same compatibility level)

SQL Server 2012
90: SQL Server 2005
100: SQL Server 2008 and SQL Server 2008 R2
110: SQL Server 2012

SQL Server 2014
100: SQL Server 2008 and SQL Server 2008 R2
110: SQL Server 2012
120: SQL Server 2014
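For reference, you can check and change a database's compatibility level with T-SQL. A sketch, assuming a hypothetical database named SampleDB1 on a SQL Server 2014 instance:

```sql
-- Check the current compatibility level
SELECT [name], [compatibility_level]
FROM sys.databases
WHERE [name] = 'SampleDB1';

-- Change it (120 = SQL Server 2014)
ALTER DATABASE SampleDB1 SET COMPATIBILITY_LEVEL = 120;
```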


Saturday, May 23, 2015

Accessing Reporting Services Using a Fully Qualified Domain Name

If you installed SQL Server Reporting Services (SSRS) on a server in a domain, use a domain user to start the service, and did not perform any further configuration, then you most probably can only access the Report Manager using an IP address and not the Fully Qualified Domain Name (FQDN) of the server (if an SPN is not set).

If you try to use the Fully Qualified Domain Name to access Reporting Services, then you will most probably be prompted for a username and password several times, ending with an empty page.

In order to be able to access Reporting Services using FQDN you will need to perform the following actions:

1. Register a Service Principal Name (SPN) for the Domain User Account that Runs SSRS
Consider the following example:

sample computer name: reportsrv01
sample domain: example.com
sample domain account: example\ssrsuser

Then, on the Domain Controller server, in a Command Prompt with elevated rights (Run as Administrator):

example 1: If SSRS is on port 80 (no need to specify 80 as it is the default HTTP port):
Setspn -s http/reportsrv01.example.com example\ssrsuser

example 2: If SSRS is on any other port (i.e. 2430):
Setspn -s http/reportsrv01.example.com:2430 example\ssrsuser

2. Edit the RsReportServer.config File
On the Reporting Services server, in the virtual directory of SSRS, edit the "RsReportServer.config" file and locate the <AuthenticationTypes> section.

Then add <RSWindowsNTLM/> as the first entry in the <AuthenticationTypes> section.

This step actually enables NTLM authentication.

For more information please visit this MSDN article.


Monday, May 4, 2015

Transferring Ownership of All Database Objects Back to DBO

Consider the following scenario: you created a Windows or SQL login in SQL Server and, in the user mapping, along with assigning a database role membership, you assigned as the default schema something other than “dbo”.

Take as an example the following screenshot:

As you can see, the SQL login “testLogin” has been assigned the database role “db_owner” on the database “SampleDB1” but along with that, “testLogin” was also set as the default schema.

If you do that, any object the login “testLogin” creates will have “testLogin”, and not “dbo”, as its owner schema.

If, however, for any reason you want to remove “testLogin” from the SQL Server instance, you will not be able to do it, because the login, through its schema, will own database objects.

If you still want to remove the login don’t lose hope, there is a way! :)

To achieve this, you need to transfer the ownership of the objects back to the “dbo” (or any other) schema.

You can do that using the “ALTER SCHEMA” command. As described in BOL, this command transfers a securable between schemas.
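For a single object, the command looks like this (a sketch with a hypothetical table named SampleTable currently owned by the testLogin schema):

```sql
-- Transfer ownership of one table from the testLogin schema back to dbo
ALTER SCHEMA dbo TRANSFER testLogin.SampleTable;
```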

So how can you use this command in order to transfer back to dbo all objects owned by "testLogin"?

One of the easiest approaches is to generate the set of SQL statements that you can run in order to restore ownership for all objects of the following types:
- tables
- views
- functions (scalar, table-valued)
- stored procedures
- ...and maybe any other object types

A sample logic that could do that would be:

--User Option 1: Set the database (in this example "SampleDB1")
USE SampleDB1

--Variable declarations
DECLARE @currentSchema VARCHAR(100)
DECLARE @newSchema VARCHAR(100)
DECLARE @schemaID INT

--User Options 2 and 3: set the current schema and the new schema 
--to which you want to transfer the selected database's objects
--in this example current schema='testLogin' and new schema='dbo'
SET @currentSchema='testLogin'
SET @newSchema='dbo'

SET @schemaID=(SELECT [schema_id] FROM sys.schemas WHERE [name]=@currentSchema)

--The actual logic
IF OBJECT_ID('TempDB..#Statements') IS NOT NULL
DROP TABLE #Statements

CREATE TABLE #Statements(
ScriptToRun VARCHAR(200)
)

--Generate statements for all user tables
INSERT INTO #Statements      
SELECT 'ALTER SCHEMA ' + @newSchema + ' TRANSFER ' + @currentSchema + '.' + [name]
FROM sys.objects
WHERE type='U' AND [schema_id]=@schemaID

--Generate statements for all user views
INSERT INTO #Statements      
SELECT 'ALTER SCHEMA ' + @newSchema + ' TRANSFER ' + @currentSchema + '.' + [name]
FROM sys.objects
WHERE type='V' AND [schema_id]=@schemaID

--Generate statements for all stored procedures
INSERT INTO #Statements      
SELECT 'ALTER SCHEMA ' + @newSchema + ' TRANSFER ' + @currentSchema + '.' + [name]
FROM sys.objects
WHERE type='P' AND [schema_id]=@schemaID

--Generate statements for all scalar functions
INSERT INTO #Statements      
SELECT 'ALTER SCHEMA ' + @newSchema + ' TRANSFER ' + @currentSchema + '.' + [name]
FROM sys.objects
WHERE type='FN' AND [schema_id]=@schemaID

--Generate statements for all table-valued-functions
INSERT INTO #Statements      
SELECT 'ALTER SCHEMA ' + @newSchema + ' TRANSFER ' + @currentSchema + '.' + [name]
FROM sys.objects
WHERE type='TF' AND [schema_id]=@schemaID

--Set "Results to Text" (Ctrl + T)
SELECT * FROM #Statements

Then, if you want to proceed with the transfer of ownership you can use the generated "ScriptToRun", that is the contents of the temporary table #Statements.
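If you prefer to execute the generated statements directly instead of copying them from the results pane, a hedged sketch is to concatenate the contents of #Statements and run them with sp_executesql:

```sql
-- Sketch: concatenate and execute all generated ALTER SCHEMA statements
DECLARE @sql NVARCHAR(MAX) = N'';

SELECT @sql = @sql + ScriptToRun + N'; '
FROM #Statements;

EXEC sp_executesql @sql;
```

As always, review the generated statements before executing them against any real database.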

Note 1: As all posts in this blog, this post is provided as-is and for educational purposes only. Do not play with your Production environments if you do not know what you are doing and of course, always take backups of your data.

Note 2: In the above logic we searched for user tables and views, stored procedures and scalar/table-valued functions. You can search for even more database object types in a similar way (more info in one of my previous posts).


Wednesday, April 1, 2015

Microsoft SQL Server MVP for the 7th Year in a Row!

Earlier today, I received that special email from Microsoft informing me that I have been awarded the Microsoft Most Valuable Professional (MVP) Award:


Dear Artemakis Artemiou,

Congratulations! We are pleased to present you with the 2015 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in SQL Server technical communities during the past year.


The Microsoft MVP Award provides us the unique opportunity to celebrate and honor your significant contributions and say "Thank you for your technical leadership."

This is the 7th year in a row that I have been awarded the Microsoft MVP Award for SQL Server and I am really excited! Every MVP Award I have received was, and is, unique, because it shows Microsoft's genuine appreciation of what we do by sharing our passion for technology and, in this case, SQL Server.

Working so many years with SQL Server and being involved with the worldwide SQL Server community is a way of living for me. The active involvement with the community, the direct communication with Microsoft, especially with the SQL Server Product Group and our Community Program Manager, the exchange of knowledge with fellow MVPs and the broader community, the participation in conferences, user group meetings, blogging, book authoring, software development and many other activities, is a constant source of inspiration and creativity for me. All this makes me not only a better professional but also a better human being, because it is all based on the genuine passion for sharing knowledge with fellow people on a technology we all love, and that is SQL Server.

A huge thanks to all of you, my fellow SQL Server community members for just being awesome.

I would also like to thank Microsoft for re-awarding me with this prestigious award, the SQL Server Product Group for doing all this great stuff with SQL Server and related technologies and of course our Community Program Manager at Microsoft for keeping us up to date with all this great stuff provided by Microsoft. Additionally I would also like to thank the local Microsoft Office at Cyprus for supporting our community activities at the Cyprus .NET User Group.

On top of everything, a huge thank you to my beautiful wife and daughter for their true support, love and understanding for all those hours I spent away from them working on all this "community stuff" :) Without your support, you know it would just not feel right. I love you and thank you from the depths of my heart! Once again, this award is dedicated to you!

The journey continues. My commitment to all of you is that I will continue doing my best and share with you the technical knowledge that I acquire on SQL Server as well as my experience via all the available channels!

This is going to be a really amazing year. Among other things, for the last year I have been silently working on developing something really special that has to do with an exceptional SQL Server technology: In-Memory Optimization. The development is now complete and testing/QA is in its final stages. This is going to be huge and it will be announced really soon! Stay tuned, friends!


Monday, March 30, 2015

There is insufficient system memory in resource pool 'internal' to run this query.

Even though you might get the above error message on different occasions, one thing is for sure: it has to do with memory management. :)

SQL Server allows you to limit the maximum memory that can be utilized by its Database Engine. You can use this feature to guarantee that the OS has access to memory that SQL Server cannot claim and thus ensure stability. However, you have to remember to re-evaluate this setting in the event of a memory upgrade on the machine.

You can perform the above action as follows:
Right-click on the instance name in SSMS, select "Properties", and then select the "Memory" tab. You then get the below dialog (in this example I have set the maximum server memory to 3GB):

The above method is a way to ensure OS stability by not allowing SQL Server to consume all memory in any case and by leaving an amount of memory available for the OS.
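The same setting can also be applied with T-SQL via sp_configure. A sketch reproducing the 3 GB example above:

```sql
-- Cap SQL Server's maximum server memory at 3 GB (3072 MB)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 3072;
RECONFIGURE;
```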

Now, in the case you get the "There is insufficient system memory in resource pool 'internal' to run this query." error you can try the following actions:

Action 1
Run the following T-SQL command:

ALTER RESOURCE GOVERNOR RESET STATISTICS

As described in the MSDN Library, the above command causes Resource Governor to reset statistics on all workload groups and resource pools, including the "internal" resource pool.

Action 2
If the above action does not solve the issue, you can try to free up more memory or, if needed, upgrade the memory capacity of the machine.


Thursday, March 5, 2015

Handling NULL Character \x00 when Exporting to File Using BCP

This article discusses the behavior of the BCP utility when extracting data from SQL Server to files and more specifically to CSV files.

Imagine that you have the below table:

As you can see, records 2 and 3 contain null values.

Then you export the table contents to CSV using BCP:

bcp "SELECT [code],[description],[price] FROM [TestDB1].[dbo].[product]" queryout "c:\tmp\testExtract.csv" -c -C 1253 -t ; -S ".\SQL2K14" -T


As you can see on the above screenshot, the output seems correct. The records' NULL values have been replaced with empty strings.

Now consider that the original table, instead of NULL values has empty strings:

Let's try again to export the new table contents to CSV using BCP:

bcp "SELECT [code],[description],[price] FROM [TestDB1].[dbo].[product]" queryout "c:\tmp\testExtract.csv" -c -C 1253 -t ; -S ".\SQL2K14" -T


As you can see, the empty strings have been replaced by NULLs which correspond to the hexadecimal character \x00.

From the MSDN Library article on BCP, we read the following:

  • out copies from the database table or view to a file. If you specify an existing file, the file is overwritten. When extracting data, note that the bcp utility represents an empty string as a null and a null string as an empty string.
It seems from the above example that the queryout keyword has the same behavior regarding null and empty strings.

Now, if you are going to feed the CSV file to another application for parsing and processing, and the application does not explicitly handle possible occurrences of the NULL character, then most probably an error will occur.

To this end, always keep the above behavior in mind when extracting data to files using BCP.
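If the consuming application cannot tolerate the \x00 character, one possible workaround (a sketch, not the only option) is to convert empty strings to NULLs in the extraction query itself, since bcp writes NULLs out as empty strings:

```sql
-- Sketch: NULLIF turns empty strings into NULLs,
-- which bcp -c then represents as empty strings in the output file
SELECT [code],
       NULLIF([description], '') AS [description],
       [price]
FROM [TestDB1].[dbo].[product];
```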


Wednesday, February 18, 2015

Searching for Keywords in SQL Server Jobs

Hi friends all around the globe, it's been a while since my last article, but this is a very busy period with many other community activities whose outcome you will soon enjoy! These activities have to do with heavy software development as well as the authoring of my third eBook, entitled "Developing SQL Server". More updates on everything will be provided along the way.

This article is actually a T-SQL tip that can become quite handy. It tackles the issue of finding which SQL Server Agent jobs are affected due to schema changes (i.e. table rename, etc.).

Consider the following example where you have three tables:

You also have a SQL Server Agent job that references the above tables.

Then, for any reason, you want to rename 'table2' to 'table2New'.

What happens to the SQL Server Agent job the next time it runs? Of course, it fails.

When you rename an object in SQL Server using the sp_rename stored procedure, the Database Engine warns you with the below message:
Caution: Changing any part of an object name could break scripts and stored procedures.

In order not to run into problems with such actions, especially in the case of large databases, you must first check whether there are any functions, stored procedures, jobs, etc. that might be affected.

The below script will return all jobs that include one or more steps that reference a given string pattern (this can be a table name, stored procedure name, function name, etc.). In this case we search for 'table2':

declare @pattern nvarchar(max)
set @pattern='table2'

select j.[job_id],j.[name],j.[enabled],j.[description],s.command,j.[date_created],j.[date_modified]
from msdb.dbo.sysjobs j
inner join msdb.dbo.[sysjobsteps] s on j.job_id=s.job_id 
where s.command like '%'+@pattern+'%'
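A similar, hedged sketch extends the same idea to programmability objects (stored procedures, functions, views) by searching their definitions in sys.sql_modules:

```sql
declare @pattern nvarchar(max)
set @pattern='table2'

select OBJECT_SCHEMA_NAME(m.[object_id]) as [schema_name],
       OBJECT_NAME(m.[object_id]) as [object_name],
       o.type_desc
from sys.sql_modules m
inner join sys.objects o on o.[object_id]=m.[object_id]
where m.[definition] like '%'+@pattern+'%'
```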

A subsequent article will contain a master procedure that can be used for searching for a given pattern in all the programmability objects in SQL Server.


Wednesday, December 31, 2014

Selected Tweets of Year 2014!

So here I am, writing the last post for 2014! Don't worry, new posts for 2015 are coming up soon! :) With only a few hours left to welcome the new year, my thoughts go back deep into 2014, trying to recollect the highlights of the year from my personal perspective. Beyond the personal highlights, I would like to add a note regarding the Community-related ones.

In 2014 I launched Artemiou Data Tools: a brand new community initiative aiming to provide high-quality, simple, straight-to-the-point tools. All tools have, and will always have, a Community Edition free for all! New websites, new content, new articles, new software and much more is what I recollect from 2014. Another major highlight was my participation in the MS Conference in Cyprus on March 21, 2014 as an invited speaker, where I talked about #SQLServer 2014 and the In-Memory OLTP Engine - that session was amazing!

Throughout all these years, the interaction with you, the community, has grown even stronger. This interaction really excites me and gives me the strength to move on and come up with more initiatives! The discussions after my sessions at conferences, your comments on my articles, the email exchanges with your questions and topics for discussion, and all the other forms of communication are a unique experience for me, and I am proud to be part of this great SQL Server community.

The year 2015 will be even more powerful! I am planning to release the In-Memory OLTP Simulator (stay tuned on this one - it is something really amazing), a new eBook, a new website, other new tools, articles and much more!

Wishing you a Happy New Year 2015 always in good health and a constant quest for knowledge!

Below are my selected Tweets for 2014!

Blog Posts/Articles and Announcements

  • The mobile version of my official website is now live! Check it out via a mobile device at:  #MVPBuzz #Community
  • Join me at MS Conference on March 21 where I will talk about #SQLServer 2014 and the In-Memory OLTP Engine - #MVPBuzz
  • Artemiou Data Tools just got its own Twitter page! Feel free to follow and support my initiative! @ArtDataTools #MVPBuzz #Software #fb
  • My new website is now live! Its purpose is to be used as a hub for SQL Server knowledge and data tools.  #MVPBuzz #fb

  • [News] I am launching soon two sub-page systems within my official website: (i) Artemiou SQL Books,  (ii) Artemiou Data Tools #Community #fb
  • Congratulations to the winners of CY Windows Hackathon for the amazing apps and congrats to all for the great ideas and effort! Keep it up!
  • Today it's my #twitterversary! 5 years of tweeting!
  • The data export tool just became a multi-exporter! New release date: Thursday, June 26!
  • Tomorrow, @ArtDataTools releases a brand new free data tool that allows the user to easily export data from #SQLServer to Excel! #MVPBuzz
  • Are you ready for tomorrow's public release of #SQLServer 2014? ;) Are you ready for the In-Memory OLTP Engine? :) #MVPBuzz
  • Written 10K lines of C# / SQL code so far...the In-Memory OLTP Simulator for #SQLServer started taking form! More news soon! #MVPBuzz #fb
  • Working on my session for #Microsoft Conference 2014 in Cyprus...with 4 hours of sleep and my 5th cup of coffee :) #MVPBuzz #SQLServer
  • [News] My personal website enriched with my SQL Books and Data Tools launches next week on Feb 5! #MVPBuzz #SQLServer #Community #fb
  • Artemiou Data Tools: My collection of S/W tools that I develop and have to do with data processing and SQL Server interaction #MVPBuzz #fb
  • Artemiou SQL Books: My current and upcoming books on different SQL Server topics #Community #fb #MVPBuzz

  • UI design takes so much time! Can't I just write the code? :)
  • Sometimes, exception handling in software development takes much, but much longer than the actual development! :) #MVPBuzz
  • Setting up a Win Srv 2K12 R2 VM in Hyper-V Manager on my Windows 8.1 box while developing! Extremely fast, extremely smooth! #MVPBuzz
  • Just used Telerik's free tool JustDecompile. Cool tool! #Community
  • Doing some serious development in Visual Studio 2013 running on Windows 8.1 x64 with Update 1. This is smooth! :) #MVPBuzz
  • Into deep development on In-Memory OLTP Simulator. Developing a plug-in engine... #MVPBuzz #SQLServer #fb
  • #Microsoft Conference was a big success today! Thank you for attending my session on #SQLServer In-Memory OLTP Engine! #MVPBuzz
  • Everything is ready for my session at the MS Conference 2014! My In-Mem OLTP Simulator for #SQLServer is ready for some demos! #MVPBuzz
  • Just backed up and restored a SQL 2014 database from Windows Azure storage to my local SQL Server installation! Pretty cool! :) #MVPBuzz


Friday, December 12, 2014

How to Import and Export Unstructured Data in SQL Server - FileTables: The Movie!

A few weeks ago I was contacted by Webucator, an online training provider, requesting my permission to include one of my articles in a free video-training series called "SQL Server Solutions from the Web". The article is one of my posts on FileTables, entitled “How to Import and Export Unstructured Data in SQL Server – FileTables”. As I really liked this initiative by Webucator, I gave them my permission to turn the article into a video and include it in the series, and here's the result, which I find very cool!

I really liked the video, which, by the way, is a high-quality, clean production with straight-to-the-point explanations, useful annotations and, above all, content that highlights the most important points of my blog article.
If you want to see FileTables in action, then this is a good opportunity!

The free SQL Server Solutions from the Web course is listed on Webucator's SQL Server training page where you can find the full list of available SQL Server-related training classes provided by Webucator.

Last but not least, a big thank you to Webucator for being an active supporter of the technical community. Keep it up guys!
