Tuesday, April 1, 2014

Congratulations 2014 Microsoft MVP!

It's that time of the year again: April 1st, the day you wait to see whether you have received the magic email from Microsoft saying that you have been awarded as a Most Valuable Professional. While waiting, you keep asking yourself: "Was I active enough, and did I contribute as much as possible to the community in the last year?" Of course, the answer is not easy, as sharing knowledge with the community and helping other people is never enough. However, the important thing is to feel good about yourself for having really helped other people without asking for anything in return; to feel good because you shared knowledge with a fellow human being, irrespective of race, religion and beliefs.

Technology, as well as its challenges, is common to all of us. Technology is a global language. It is a real privilege to interact with all of you, my fellow community members around the globe, and to exchange knowledge on SQL Server. Thank you for being awesome!

I would also like to thank Microsoft for re-awarding me the MVP award for SQL Server, as well as the SQL Server Product Team for building, and continuing to develop, such an amazing data platform. I would also like to thank the local Microsoft office in Cyprus for supporting my community activities, and my MVP Lead for her continuous support.

Additionally, I would like to thank you, my fellow SQL Server community members, for being just great! I feel really proud to be a member of such an awesome community, with so many great people sharing a common passion: the passion for technology!

On top of everything, a huge thank you to my beautiful wife and daughter for their true support, love and understanding during all those hours I spent away from them working on all that "community stuff" :) Without your support, you know it would just not feel right. I love you and thank you from the bottom of my heart! This award is dedicated to you!

My commitment to all of you is that I will continue doing my best to share the technical knowledge and experience I acquire on SQL Server through all the available channels!

-----------------

Dear Artemakis Artemiou,

Congratulations! We are pleased to present you with the 2014 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in SQL Server technical communities during the past year.

...

The Microsoft MVP Award provides us the unique opportunity to celebrate and honor your significant contributions and say "Thank you for your technical leadership."
-----------------

Monday, March 31, 2014

The In-Memory OLTP Engine in SQL Server 2014: An Overview

A few days ago, on March 21st, I participated as an invited speaker at Microsoft Conference 2014 that took place in Nicosia, Cyprus.

My session was on SQL Server 2014 and specifically about what’s new in this release of SQL Server.

My session's agenda covered the following areas of new features in SQL Server 2014:
  • Interaction with Windows Azure
  • Security Enhancements
  • Performance Improvements & Enhancements
  • T-SQL Enhancements
  • In-Memory OLTP Engine
Someone might wonder why I am highlighting the "In-Memory OLTP Engine" item :)

The reason is that this feature is the long-awaited in-memory processing engine that is built into the SQL Server Engine; when you use it properly, you can get extremely high speedups in data processing compared to "traditional" processing.

The In-Memory OLTP Engine introduces new data structures described as "memory-optimized" (i.e. memory-optimized tables and table variables). Memory-optimized tables store their data in memory, keeping multiple versions of each row. This technique is characterized as "non-blocking multi-version optimistic concurrency control" and eliminates both locks and latches, thus achieving breakthrough performance. The main features of memory-optimized tables are:
  • Rows in the table are read from and written to memory
  • The entire table resides in memory
  • Non-blocking multi-version optimistic concurrency control
  • Durable & non-durable
  • A second copy maintained on disk for durability (if enabled)
  • Data in memory-optimized tables is only read from disk during database recovery
  • Interoperable with disk-based tables
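
To give an idea of the syntax, here is a minimal sketch of creating a durable memory-optimized table in SQL Server 2014 (the database name, file paths, table and column names below are hypothetical and not part of my demo):

--A memory-optimized table lives in a database that has a MEMORY_OPTIMIZED_DATA filegroup
CREATE DATABASE InMemDemo
ON PRIMARY (NAME = InMemDemo_data, FILENAME = 'C:\Blog\SQLData\InMemDemo_data.mdf'),
FILEGROUP InMemDemo_mod CONTAINS MEMORY_OPTIMIZED_DATA
    (NAME = InMemDemo_mod, FILENAME = 'C:\Blog\SQLData\InMemDemo_mod')
LOG ON (NAME = InMemDemo_log, FILENAME = 'C:\Blog\SQLData\InMemDemo_log.ldf');
GO

USE InMemDemo;
GO

--DURABILITY = SCHEMA_AND_DATA keeps a second copy on disk for recovery;
--SCHEMA_ONLY would make the table non-durable (its data is lost on restart)
CREATE TABLE dbo.SessionState
(
    SessionId INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    UserName NVARCHAR(100) NOT NULL,
    LastTouch DATETIME2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
GO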

Another feature of the In-Memory OLTP Engine is "natively compiled stored procedures". A natively compiled stored procedure is a SQL Server object that can access only memory-optimized data structures such as memory-optimized tables, memory-optimized table variables, etc. The main features of a natively compiled stored procedure are:
  • It is compiled to native code (DLL) when it is created
  • Aggressive optimizations take time at compile time
  • Can only interact with memory-optimized tables
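
As an indicative sketch (reusing the hypothetical dbo.SessionState table from the example above), a natively compiled stored procedure looks roughly like this:

--Natively compiled procedures require SCHEMABINDING, an EXECUTE AS clause and an ATOMIC block
CREATE PROCEDURE dbo.usp_SaveSession
    @SessionId INT,
    @UserName NVARCHAR(100)
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    --The whole body runs as one atomic block and touches memory-optimized objects only
    INSERT INTO dbo.SessionState (SessionId, UserName, LastTouch)
    VALUES (@SessionId, @UserName, SYSUTCDATETIME());
END;
GO

Calling it is business as usual, e.g. EXEC dbo.usp_SaveSession 1, N'SomeUser';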

The call to a natively compiled stored procedure is actually a call to a DLL entry point, which is what gives it its very fast execution times!

When you properly combine memory-optimized tables with natively compiled stored procedures for heavy workloads, like DW ETL processes, high-rate data inserts and processes that demand low-latency execution, you can expect some serious speedups.

Feel free to view/download my presentation below:



This was the introduction to a series of posts, screencasts and webcasts on the In-Memory OLTP Engine of SQL Server 2014.

Last but not least: SQL Server 2014 will be generally available tomorrow, April 1st, 2014, so feel free to evaluate it and see the true power of the In-Memory OLTP Engine!


Friday, February 28, 2014

How to Import and Export Unstructured Data in SQL Server - Part 2

A few weeks ago I posted an article on how to import and export unstructured data in SQL Server 2005 by using the "image" data type.

In this article I will show how you can manipulate unstructured data in SQL Server 2008 or later by using the FILESTREAM feature. FILESTREAM allows storing unstructured data (i.e. music, video, documents, etc.) onto the NTFS file system and manipulating it via the Database Engine.

Step 1: Enabling FILESTREAM
Before using the FILESTREAM feature you have to enable it. To this end, open SQL Server Configuration Manager and, under SQL Server Services, right-click the SQL Server instance and select Properties. Then, on the "FILESTREAM" tab, check "Enable FILESTREAM for Transact-SQL access" and "Enable FILESTREAM for file I/O access":

Figure 1: Enabling FILESTREAM

The last step for enabling FILESTREAM is to open a new query window in SSMS, execute the following T-SQL statement, and then restart the SQL Server instance:

EXEC sp_configure filestream_access_level, 2;
GO
RECONFIGURE
GO

Note: the available filestream access levels are:

  • 0: Disables FILESTREAM support for this instance.
  • 1: Enables FILESTREAM for Transact-SQL access.
  • 2: Enables FILESTREAM for Transact-SQL and Win32 streaming access.
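
If you want to verify the configured level afterwards, you can query sys.configurations (a quick, indicative check; 'filestream access level' is the option's name in the catalog):

--Returns the configured and currently running FILESTREAM access level
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = 'filestream access level';
GO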


Step 2: Creating a FILESTREAM-Enabled Database 
To create a FILESTREAM-enabled database, you just need to include a FILESTREAM filegroup. For example:

CREATE DATABASE FileStreamDB 
ON
PRIMARY ( NAME = FileStreamDBData,
    FILENAME = 'C:\Blog\SQLData\filestreamDB_data.mdf'),
FILEGROUP FileStreamGroup_1 CONTAINS FILESTREAM( NAME = FileStreamDBFS,
    FILENAME = 'C:\Blog\SQLData\filestream1')
LOG ON  ( NAME = FileStreamDBLogs,
    FILENAME = 'C:\Blog\SQLData\filestreamDB_log.ldf');
GO

Step 3: Creating a Table for Storing FileStream Data
The only difference between a "normal" table and a table that can store FILESTREAM data is the use of the FILESTREAM attribute on a varbinary(max) column, along with a uniqueidentifier ROWGUIDCOL column, in the table's definition script:

USE [FileStreamDB];
GO

CREATE TABLE dbo.Files
(
    [Id] [uniqueidentifier] ROWGUIDCOL NOT NULL UNIQUE,
    [FileName] VARCHAR(100),
    [ActualFile] VARBINARY(MAX) FILESTREAM NULL
);
GO

Step 4: Storing FileStream Data
For this example, consider the following unstructured data file:
Figure 2: Image file to be stored in FILESTREAM-enabled database.

Now, let's store the file in the FILESTREAM-enabled database and table we created earlier:
USE [FileStreamDB];
GO

INSERT INTO dbo.Files
VALUES (NEWID(),
        'SampleImage.png',
        (SELECT * FROM OPENROWSET(BULK 'C:\Testing\2\SampleImage.png', SINGLE_BLOB) AS x)
       );
GO

Here's the contents of the table:
Figure 3: The contents of the FILESTREAM-enabled table after inserting unstructured data (image file).

As you can see, the file is visible on the file system level too:
Figure 4: Binary file stored using FILESTREAM - Accessible on the file system level.

Now let's try to open the file using MS Paint: 

Figure 5: Accessing the data stored in the FILESTREAM-enabled database from the file system level (Windows).

As you can see, the image file is stored in the SQL Server database table but besides T-SQL access, you can also access it from Windows! 

What we just did with the above example gives a small glimpse of the real power of FILESTREAM: leveraging the performance and rich APIs of the Windows file system while at the same time maintaining consistency between structured and unstructured data in SQL Server.

FILESTREAM actually works as a bridge between structured and unstructured data, combining transactional consistency with file system access, and it can be extremely useful when you have many binary objects, like images and videos, that you want to store in SQL Server while still being able to access them at the speed of the file system.
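
For example, the T-SQL entry point to that file-system access is the PathName() method of the FILESTREAM column, combined with GET_FILESTREAM_TRANSACTION_CONTEXT(), whose return value a client application passes to the OpenSqlFilestream Win32 API. Below is a minimal sketch against the dbo.Files table created above (note that GET_FILESTREAM_TRANSACTION_CONTEXT() returns a value only inside an active transaction):

USE [FileStreamDB];
GO

BEGIN TRANSACTION;

--PathName() returns the logical path of each BLOB;
--the transaction context token is used for Win32 streaming access
SELECT [FileName],
       ActualFile.PathName() AS LogicalPath,
       GET_FILESTREAM_TRANSACTION_CONTEXT() AS TransactionContext
FROM dbo.Files;

COMMIT TRANSACTION;
GO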


Saturday, February 8, 2014

A New Project: Artemiou Data Tools

A few months ago I started a new project that has to do with software development. The concept behind this project is to develop simple yet useful software tools that have to do with data processing. All my tools will be distributed under the collection name “Artemiou Data Tools”.

The majority of the tools will interact with SQL Server directly or indirectly. However, there will also be tools for other data-related tasks (e.g. splitting files, etc.).

The purpose is to help fellow community members solve simple tasks that nevertheless consume valuable time when not performed via a software tool. As I plan to release one tool every month, make sure you visit my website often!

The project went live a few days ago and, so far, "Artemiou Data Tools" includes three tools, with a new one coming out next week.

File Splitter v1.0
File Splitter does exactly what the name suggests: it splits files based on either file size or number of records.

File Splitter v1.0 - Main Screen (visit homepage).


Dynamic T-SQL Generator v1.2
Dynamic T-SQL Generator makes it easy for database developers to generate dynamic T-SQL scripts from a given static T-SQL script and vice versa. It also provides a set of additional functions, like parsing T-SQL and converting T-SQL scripts to uppercase or lowercase.

Dynamic T-SQL Generator v1.2 - Main Screen (visit homepage)

Snippets Generator v2.2
Snippets Generator allows the user to easily create/modify T-SQL snippets for SQL Server. You can create a snippet from scratch or by using snippet templates.

Snippets Generator v2.2 - Main Screen (visit homepage)

Feel free to visit "Artemiou Data Tools", download the tools and provide your valuable feedback!


Thursday, January 30, 2014

How to Import and Export Unstructured Data in SQL Server - Part 1

Importing and exporting unstructured data from a database is a common practice, especially in large applications. SQL Server 2008 introduced the FILESTREAM feature that allows storing unstructured data (i.e. music, video, documents, etc.) onto the NTFS file system and manipulating it via the Database Engine. SQL Server 2012 introduced FileTables which is an enhancement to FILESTREAM.

With this post I am starting a series of articles that will deal with the topic of storing, manipulating and extracting unstructured data from SQL Server.

In this article we are going to see how it is possible to import and export binary objects in SQL Server 2005. Instead of saying more I would just like to show you by example how you can do this.

For this example, consider that we have the following two objects:

  • SampleImage.png
  • SampleTextFile.txt

Figure 1: The two sample files for the BLOBs example.

Let's take a closer look at the files just for checking out their content:
Figure 2: Content of SampleTextFile.txt

Figure 3: Content of SampleImage.png

Now let's store these two files in a SQL Server table.

USE master;
GO

--Create test database
CREATE DATABASE [BinaryFilesDB];
GO

--Use test database
USE [BinaryFilesDB];
GO

--Create table that will be hosting the files
CREATE TABLE [dbo].[tblFiles](
    [fileID] [bigint] IDENTITY(1,1),
    [fileName] [nvarchar](255) NULL,
    [binFile] [image] NOT NULL);
GO

--
--Import SampleTextFile.txt
--
INSERT INTO dbo.tblFiles
           ([fileName],[binFile])
VALUES     ('SampleTextFile.txt',(SELECT * FROM   OPENROWSET(BULK 'c:\testing\SampleTextFile.txt',SINGLE_BLOB) AS x))
GO
--
--Import SampleImage.png
--
INSERT INTO dbo.tblFiles
           ([fileName],[binFile])
VALUES     ('SampleImage.png',(SELECT * FROM   OPENROWSET(BULK 'c:\testing\SampleImage.png',SINGLE_BLOB) AS x))
GO

Let's check the contents of "tblFiles" table:
Figure 4: Contents of the table after importing the binary files.

Let's rename the files, just to perform a minor modification:

--Rename files
UPDATE tblFiles
SET fileName='SampleTextFileModified.txt'
WHERE filename='SampleTextFile.txt';
GO

UPDATE tblFiles
SET fileName='SampleImageModified.png'
WHERE filename='SampleImage.png';
GO

Let's check the table contents again:
Figure 5: Renamed contents of table "tblFiles"

Now let's produce the export statements:

SELECT
'bcp "select binFile from BinaryFilesDB.dbo.tblFiles where fileid=' + cast (fileID as varchar(50)) + '" queryout "c:\testing\'+[filename]
+'" -f bcp.fmt -S .\SQL2K14CTP2 -T' as RunTheseOnCommandPrompt
FROM BinaryFilesDB.dbo.tblFiles;
GO

Some notes on the above T-SQL statement:

- SQL2K14CTP2 is the named instance of my SQL Server
- Also note that I am using Trusted (-T) connection for my generated bcp commands
- If you want to use username/password instead of a Trusted connection replace "-T" with -U [username] -P [password]
- You can download the bcp.fmt format file from here.
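
For reference, and purely as an indicative sketch (not necessarily identical to the downloadable bcp.fmt), a non-XML format file for exporting a single binary column typically declares the field as SQLBINARY with a prefix length of 0 and an empty terminator, so that only the raw bytes are written out. The first line is the format version (e.g. 12.0 for the SQL Server 2014 tools), the second line is the number of fields, followed by one line per field:

12.0
1
1       SQLBINARY       0       0       ""      1       binFile         ""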

Figure 6: Generated statements for exporting the binary files.

Now let's run the two bcp commands on the command prompt:
Figure 7: Executing the bcp commands for exporting the binary files.

Also, let's check the contents of the "c:\testing" directory:
Figure 8: Binary files successfully exported from SQL Server.

As you can see, the two renamed files were successfully exported and have the exact same size as the original ones!
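
If you want to double-check the sizes stored in the table against the exported files, a simple query like the following can help (DATALENGTH returns the number of bytes stored in the binFile column):

--Check the imported files and their sizes in bytes
SELECT fileID,
       [fileName],
       DATALENGTH(binFile) AS FileSizeInBytes
FROM BinaryFilesDB.dbo.tblFiles;
GO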

I hope you enjoyed the article! Until next time!

Tuesday, December 31, 2013

Selected Tweets of Year 2013!

Another year has come to an end. Faithful to the "tradition" of posting, on the last day of each year, my selected tweets for the year that ends, here I am again with the "Selected Tweets of Year 2013!".

2013 was remarkable, especially when it comes to community. Interacting with you, the SQL Server community, through all available channels is a unique experience, as it is through this interaction that we all become better, not only in technology but as human beings too.

2013 was a special year for me as my wife gave birth to a beautiful baby girl! Even though I now get less sleep, I have an additional source of inspiration! :) That's also why I published my first two eBooks this year!

In October 2013 I published "Tuning SQL Server" and in December "Administering SQL Server", both part of my series "The SQL Server and .NET eBook Series".

Another year has come to an end. Despite the difficulties of our times, we stay strong. We keep on trying for the best. We keep on following our passion, thus maintaining a strong technical community. The one thing I can say for sure, after all these years of community activity, is that it makes you a better person.

2014 can be an even better year because, you know, it is us, the people, who can make this happen. As a good friend of mine once said: "Change the World, or go home!" (cheers Karl :)

It is a privilege to communicate with you, via all possible channels like blogs, forums, offline and online events, conferences and of course, Twitter! Here are the tweets!


Blog Posts/Articles and Announcements
  • CDNUG's event on May 22 will be also streamed live! Join us and learn more about Windows/SQL #Azure! http://bit.ly/13xGGfF  #mvpbuzz #fb

News
  • 3 days left for the release of my 2nd eBook "Administering SQL Server". Putting the finishing touches! http://sqlbooks.aartemiou.com  #MVPBuzz #fb
  • Building a new website. Hint: It will be #SQLServer-related but different from my other websites :) More info before Christmas! #MVPBuzz #fb
  • Only one week left for the release of my second eBook titled "Administering SQL Server". Check often: http://sqlbooks.aartemiou.com  #MVPBuzz #fb
  • Editing my second eBook titled "Administering SQL Server".To be published on December 2,2013. http://sqlbooks.aartemiou.com  #fb #SQLServer #MVPBuzz
  • Just finished the migration of 62 databases from #SQLServer 2005 to 2008 R2 on a Win 2012 cluster! Let's put the final touches! #MVPBuzz
  • My second eBook on #SQLServer titled "Administering SQL Server" will be published in one month. Check often http://sqlbooks.aartemiou.com  #MVPBuzz
  • Just finished my new website that will be hosting my #SQLServer eBooks and tutorials! Check it out: http://sqlbooks.aartemiou.com  #MVPBuzz #fb
  • My first ebook on #SQLServer tuning has entered its editing phase. More info in two days! #MVPBuzz #fb #Community
  • New functionality added: Open/modify existing T-SQL snippet files, Create SurroundsWith snippets. #SnippetsGenerator #MVPBuzz #SQLServer #fb
  • I have added interesting new features in this new release of Snippets Generator. #MVPBuzz #SQLServer
  • [News] More info about my three eBooks (i.e. release dates, etc.) on September 10, 2013 #MVPBuzz #fb #Community #SqlServer
  • [News] The third eBook will be about #SQLServer Development #MVPBuzz #fb #Community
  • [News] The second eBook will be about #SQLServer Tuning #MVPBuzz #fb #Community
  • [News] The first eBook will be about #SQLServer Administration and Maintenance #MVPBuzz #fb #Community
  • [News] I am currently working on writing not one but three eBooks on #SQLServer! #Free to the #community of course! #MVPBuzz #fb
  • A new release of my open source tool "Snippets Generator for SQL Server 2012" is on the way. More info in September! http://snippetsgen.codeplex.com/ 
  • Working on something amazing! Yes, it has to do with in-memory OLTP! :) More news within a few days... #mvpbuzz #fb #sqlserver
  • Downloading #SQLServer 2014 CTP1! It's time to talk about In-Memory OLTP (Hekaton)! :) #MVPBuzz #fb
  • Tomorrow at 6pm (local Cyprus time) watch live the CDNUG event on Windows/SQL Azure! Live Meeting URL: https://www.livemeeting.com/cc/mvp/join?id=WN4DCQ&role=attend … #mvpbuzz #fb
  • Putting some final touches for my session at tomorrow's #CDNUG event! I am really enjoying all these SQL Azure demos! :) #mvpbuzz #community
  • On May 22 2013 I will be presenting a session on #Windows #Azure SQL Database. For more info: http://bit.ly/13xGGfF  #mvpbuzz #fb #cdnug
  • On May 22 I will be delivering a session on #Windows #Azure #SQL #Database (formerly SQL Azure). More info soon! #fb #mvpbuzz
  • Working on an article on data processing. More info soon! #mvpbuzz #fb #sqlserver
  • Even though we usually avoid using cursors, sometimes are indeed the best option, if not the only one! #mvpbuzz #sqlserver #fb
  • I am a #SQLServer MVP for the 5th year! Thank you #Microsoft! Thank you #Community!!! #in #mvpbuzz
  • A great event! RT @CDNUG:Review of our event that took place on Feb 28 2013 - OWASP Top 10 No-Nos http://bit.ly/XWNuh7  #Community #MVPBuzz
  • [New Release] Snippets Generator v2.0 is Out! - http://bit.ly/106VahH  #mvpbuzz #fb #community #free #sqlserver #SSMS #tools
  • You like #SSMS 2012? You like T-SQL coding? A new release of a #free special tool is scheduled for tomorrow! Stay tuned! #mvpbuzz #SQLServer
  • Always! RT @CDNUG: By, and for, the #DotNet and #SQLServer #Community in #Cyprus! Follow us for a deep dive into #knowledge!
  • I have just been renewed in the Friends of Red Gate program for 2013! Thank you @redgate! #MVPBuzz #SQLServer #fb @RedGateFriends

          My warmest wishes for a Happy New Year 2014! A year full of health, love, and knowledge sharing!



          Monday, December 9, 2013

          The SQL Server and .NET Blog eBook Series: Administering SQL Server

A few days ago I published my second free eBook of "The SQL Server and .NET Blog" eBook Series. The title of the book is "Administering SQL Server" and it is aimed at database administrators and architects who monitor and administer SQL Server instances on a regular basis, and whose primary goal is to maintain the health of those instances and, thus, of the hosted databases. The book discusses, among other things, basic administration tasks, security and compliance, integration, and troubleshooting.

          The book is organized in 5 chapters. Chapter 1 discusses basic SQL Server maintenance tasks such as disk usage monitoring, history cleanup, backup-related topics, and more. Chapter 2 discusses security and compliance topics such as Policy-Based Management and encryption. Chapter 3 talks about different integration topics such as linked servers, use of proxies on SQL Server Agent job steps and Unicode support. Chapter 4 discusses special SQL Server topics like the Windows Internal Database (SSEE) and dynamic T-SQL generation. Chapter 5 discusses error handling and ways to overcome errors you might encounter due to external factors like permission issues, invalid user input, etc.

          Below you can find an excerpt of the book containing the table of contents:


          You can download the book at: http://sqlbooks.aartemiou.com/books/administration/

          For a list of all my books please visit: http://sqlbooks.aartemiou.com/books/

          Thursday, November 21, 2013

          Where are Programmability Objects Stored in SQL Server?

SQL Server programmability objects, such as stored procedures, functions, assemblies, etc., are widely used, especially in larger database designs where you want a well-structured database with code reuse and good performance.

          Even though many of us use these programmability features, we do not often think of where these objects are stored.

However, it is very useful to know where the above objects are stored, as we can then access them via T-SQL and retrieve useful information (e.g. in the case of an upgrade, where we want to check the T-SQL definitions of those objects for deprecated features) or even apply corrections.

Taking the above into consideration, there are two catalog views that we need to study before looking at the T-SQL statements that can be used for accessing the programmability objects.

The first catalog view is "sys.objects". It provides information about the user-defined, schema-scoped objects created within a database, such as object name, id, type, creation/modification date, etc. From Books Online we can see that the following object types are available in the SQL Server Database Engine:
          • AF = Aggregate function (CLR)
          • C = CHECK constraint
          • D = DEFAULT (constraint or stand-alone)
          • F = FOREIGN KEY constraint
          • FN = SQL scalar function
          • FS = Assembly (CLR) scalar-function
          • FT = Assembly (CLR) table-valued function
          • IF = SQL inline table-valued function
          • IT = Internal table
          • P = SQL Stored Procedure
          • PC = Assembly (CLR) stored-procedure
          • PG = Plan guide
          • PK = PRIMARY KEY constraint
          • R = Rule (old-style, stand-alone)
          • RF = Replication-filter-procedure
          • S = System base table
          • SN = Synonym
          • SO = Sequence object
          • SQ = Service queue
          • TA = Assembly (CLR) DML trigger
          • TF = SQL table-valued-function
          • TR = SQL DML trigger
          • TT = Table type
          • U = Table (user-defined)
          • UQ = UNIQUE constraint
          • V = View
          • X = Extended stored procedure

          The screenshot below displays some of the objects of the “AdventureWorks2012” sample database:

          Screenshot 1: Partial set of records retrieved from the sys.objects catalog view in the "AdventureWorks2012" sample database.
The second catalog view is "sys.sql_modules". It provides information about database objects that are SQL-language-defined modules. The most important column in this catalog view is "definition", which is of the data type nvarchar(max) and contains the SQL text that defines the module. A NULL value in this column means that the module is encrypted.

          The screenshot below displays some of the sql_modules of the “AdventureWorks2012” database:

          Screenshot 2: Partial set of records retrieved from the sys.sql_modules catalog view in the "AdventureWorks2012" sample database.


Now let's run some queries by joining the above two catalog views. For example, let's find the T-SQL definition of all stored procedures in the AdventureWorks2012 database:

          USE AdventureWorks2012;
          GO

SELECT o.name AS SPName, m.[definition], o.create_date AS DateCreated, o.type_desc AS TypeDescription
FROM sys.objects AS o
INNER JOIN sys.sql_modules AS m ON o.[object_id] = m.[object_id]
WHERE o.[type] = 'P';
          GO

          And this is what we get:

Screenshot 3: All stored procedures of the AdventureWorks2012 sample database.

          Now let’s find all the SQL scalar functions for the same database:

          USE AdventureWorks2012;
          GO

SELECT o.name AS SPName, m.[definition], o.create_date AS DateCreated, o.type_desc AS TypeDescription
FROM sys.objects AS o
INNER JOIN sys.sql_modules AS m ON o.[object_id] = m.[object_id]
WHERE o.[type] = 'FN';
          GO

          Let’s see the result:

Screenshot 4: All scalar functions of the AdventureWorks2012 sample database.

You can find even more information on different database objects by joining the above catalog views with others, such as sys.parameters (for finding the parameters of stored procedures), sys.types (for information about system- and user-defined data types), etc. Given that all this information is available, the only limit is your imagination!
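
As an indicative example along these lines (a quick sketch), the following query lists every stored procedure together with its parameters and their data types:

USE AdventureWorks2012;
GO

--List stored procedures along with their parameters and data types
SELECT o.name AS SPName,
       p.name AS ParameterName,
       t.name AS DataType,
       p.max_length AS MaxLengthInBytes,
       p.is_output AS IsOutputParameter
FROM sys.objects AS o
INNER JOIN sys.parameters AS p ON p.[object_id] = o.[object_id]
INNER JOIN sys.types AS t ON t.user_type_id = p.user_type_id
WHERE o.[type] = 'P'
ORDER BY o.name, p.parameter_id;
GO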

          Tuesday, October 1, 2013

          The SQL Server and .NET Blog eBook Series: Tuning SQL Server

          My first eBook on SQL Server has been released and of course, it's free for all! The title of the book is "Tuning SQL Server" and it is part of "The SQL Server and .NET Blog eBook Series!".

          You can find the download link at: http://sqlbooks.aartemiou.com/books/tuning/

          As mentioned in a previous post, the books of this series will be based on articles already posted on my blog, updated and enriched with additional information.

          "Tuning SQL Server" is for database administrators and architects who monitor and tune SQL Server instances in order to keep them operating to the maximum possible performance and stability. The book suggests several techniques that can be used for ensuring a performant SQL Server instance.

Feel free to download the book, and also let other fellow community members know about it so that they can download it too!

          Make sure that you check my books' website often for updates and new releases: http://sqlbooks.aartemiou.com/

          Additionally, you can subscribe to the SQLBooks newsletter in order to stay up to date: http://sqlbooks.aartemiou.com/newsletter/

Last but not least, I kindly ask for your valuable feedback! You can leave your feedback by visiting the following link: http://sqlbooks.aartemiou.com/feedback/

          Thanks!

          Cheers,
          Artemakis


          Tuesday, September 10, 2013

          My Upcoming eBooks on SQL Server

          I love technology.

          I love exploring.

I love writing about it and sharing knowledge.

I started working with SQL Server and .NET (C#) more than ten years ago. Since then it has been quite a journey! Each release came - and still comes - with exciting new features enabling us to do more and more! Every time, I wait for the new build so that I can start testing it, exploring it and learning it as soon as it becomes available. The possibilities are endless! The limit is your creativity!

The massive interaction with the SQL Server community started about seven years ago. Blogging, organizing user group events, speaking in user group meetings, conferences and other events, open-source projects related to SQL Server, guest articles, discussions on message boards/forums, and much more!

The love for technical writing and knowledge sharing urged me to add another activity to my interaction with the community: book authoring. In order to be able to write, you first need to acquire and comprehend the specific technical knowledge. You need to explore, to experiment, to test. You need to test the limits of each new technology or feature in order to be able to fully understand its nature and capabilities.

SQL Server is a powerful data platform that includes several data management and analysis technologies that allow you to do just about anything. Since 2002 I have been exploring SQL Server in many areas, deep diving into various topics having to do mainly with administration, development/data access and performance tuning. These are the three areas of SQL Server I like the most and on which I base my interaction with SQL Server and my acquisition of knowledge. To this end, I have decided to publish three eBooks on the three above-mentioned areas of SQL Server.

The content of the eBooks will be mainly based on articles that I have already written on my blog. It will actually be a collection of selected articles I have already published, edited and enriched with additional fresh content.

          You can find below the scheduled publishing dates along with the eBook topics:

          October 1, 2013
          Topic: Tuning SQL Server

          December 2, 2013
          Topic: Administering SQL Server

          February 3, 2014
          Topic: Developing SQL Server

          The eBooks will be published on different online locations like: my blog, my website as well as in other channels (TBA) and of course they will be totally free.


          Monday, September 9, 2013

          Monitoring Locking in SQL Server

SQL Server 2005 and later provide a dynamic management view (DMV) with detailed information regarding the active locks within the SQL Server instance, that is, locks that have already been granted or are waiting to be granted. The DMV is called sys.dm_tran_locks.

          The following T-SQL query uses sys.dm_tran_locks in order to return useful information:

          SELECT
           resource_type, 
           resource_database_id,
           (select [name] from master.sys.databases where database_id =resource_database_id) as [DBName],
           (case resource_type when 'OBJECT' then  object_name(resource_associated_entity_id,resource_database_id) else NULL end) as ObjectName,
           resource_associated_entity_id,
           request_status, request_mode,request_session_id,
           resource_description
          FROM sys.dm_tran_locks

A sample output of the above query would look like the one below:

          An alternative way of monitoring the active locks is to use the “Activity Monitor” module in SQL Server Management Studio (SSMS) 2005 or later.
By joining the records returned by the sys.dm_tran_locks and sys.dm_os_waiting_tasks DMVs, you can get blocking information:

SELECT
 t1.resource_type,
 t1.resource_database_id,
 (SELECT [name] FROM master.sys.databases WHERE database_id = t1.resource_database_id) AS [DBName],
 t1.resource_associated_entity_id,
 (CASE t1.resource_type WHEN 'OBJECT' THEN OBJECT_NAME(t1.resource_associated_entity_id, t1.resource_database_id) ELSE NULL END) AS ObjectName,
 t1.request_mode,
 t1.request_session_id,
 t2.blocking_session_id
FROM sys.dm_tran_locks AS t1
INNER JOIN sys.dm_os_waiting_tasks AS t2
 ON t1.lock_owner_address = t2.resource_address

The above two queries provide you with information on active locks and blocking cases, and can help you identify incidents that might need manual intervention (e.g. in cases of bad database design combined with heavy concurrent access to the database).
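
Once you have a blocking_session_id from the query above, you can also take a quick look at the most recent statement that the blocking session submitted. An indicative query (replace 53 with the actual blocking session id):

SELECT c.session_id, t.[text] AS MostRecentBatch
FROM sys.dm_exec_connections AS c
CROSS APPLY sys.dm_exec_sql_text(c.most_recent_sql_handle) AS t
WHERE c.session_id = 53; --replace 53 with the blocking session id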