Thursday, April 21, 2016

Entity Framework: Code First

A few days ago we had our second user group meeting of 2016 at the Cyprus .NET User Group, and it was all about data access.

I was the speaker at the event and I presented Entity Framework and more specifically the "Code First" development approach.

Entity Framework is an Object/Relational Mapping (ORM) framework that lets you treat data as domain-specific objects. It supports a variety of DBMSs.

The purpose of Entity Framework is to simplify the data-access code required when developing database applications.

In order to work with Entity Framework you need Visual Studio and the Entity Framework NuGet package itself. Right after you start a new project in Visual Studio, you can install Entity Framework either via the NuGet Package Manager GUI or via the Package Manager Console with the command: Install-Package EntityFramework.

Figure 1: NuGet Package Manager.


Figure 2: Package Manager Console - Syntax for Installing Entity Framework.

Code First was introduced in Entity Framework 4.1 and follows the Domain-Driven Design (DDD) paradigm. It is one of the three available development approaches in Entity Framework (the others being Database First and Model First). The main idea is that you write your Domain Classes first, and the database is then created from them in the background.

In Entity Framework you can query data with any of the below methods:
  • LINQ to Entities
    • Method syntax
    • Query syntax
  • Entity SQL 
  • Native SQL


Figure 3: Component Stack when Using Entity Framework. Source: MSDN.

For more information, you can visit the Entity Framework documentation on MSDN.

Read more on this article...

Wednesday, April 13, 2016

SQL Server 2016: Dynamic Data Masking

Dynamic Data Masking is another data protection feature in SQL Server 2016. It limits sensitive data exposure by masking it to non-privileged users. The masking rules are applied in the query results, and you designate how much of the sensitive data to reveal, and to whom. The Dynamic Data Masking feature provides four types of masking methods:
  • Default: Full masking according to the data types of the designated fields.
  • Email: Masking method which exposes the first letter of an email address and the constant suffix ".com", in the form of an email address (e.g. aXXX@XXXX.com).
  • Custom String: Masking method which exposes the first and last letters of the value and adds a custom padding string in the middle (prefix, [padding], suffix).
  • Random: A random masking function for use on any numeric type to mask the original value with a random value within a specified range.
A simple example of Dynamic Data Masking is to display only the last four digits of a phone number to the support personnel of an IT department, or to hide email addresses from unauthorized users.

Now let's proceed with a demo in order to see some examples of using Dynamic Data Masking.

/*
Demo: Dynamic Data Masking
*/

--Create a sample database
IF DB_ID('SampleDB_DataMasking') IS NULL
BEGIN
CREATE DATABASE SampleDB_DataMasking;
END
GO

--Use sample database
USE SampleDB_DataMasking;
GO

--Create sample table
CREATE TABLE tblEmployeesMasking
    (
      empName VARCHAR(250) ,
      empPhoneNo VARCHAR(20) ,
      empEmailAddress VARCHAR(100)
    );
GO

--Populate sample table with data
INSERT  INTO dbo.tblEmployeesMasking
VALUES  ( 'John Clark', '545-5478-1234', 'john.clark@example.com' ),
        ( 'Kostas Andreou', '878-8888-5678', 'kostas.andreou@example.com' ),
        ( 'Wei Xing', '673-5123-9101', 'wei.xing@example.com' );

--Check records (unmasked)
SELECT * FROM dbo.tblEmployeesMasking;


Output

--Now let's mask the email address
ALTER TABLE dbo.tblEmployeesMasking
ALTER COLUMN empEmailAddress VARCHAR(100) MASKED WITH (FUNCTION = 'email()');

Let's check the records of the table again, this time using a non-sysadmin user.

--Create user (non-sysadmin) with GRANT SELECT permission on the sample table
CREATE USER TestUser WITHOUT LOGIN;
GRANT SELECT ON dbo.tblEmployeesMasking TO TestUser;
GO

--Retrieve the table contents by running the query with TestUser
EXECUTE AS USER = 'TestUser';
SELECT * FROM dbo.tblEmployeesMasking;
GO


Output

As you can see in the above screenshot, the values of the 'empEmailAddress' column are presented masked.
OK, so now let's switch the execution context back to our original login's context in order to proceed with masking the 'empPhoneNo' column:

REVERT;
GO

--Let's mask the phone number with the default function
ALTER TABLE dbo.tblEmployeesMasking
ALTER COLUMN empPhoneNo VARCHAR(20) MASKED WITH (FUNCTION = 'default()');

Let's check the records of the table using a non-sysadmin user.

--Retrieve the table contents by running the query with TestUser
EXECUTE AS USER = 'TestUser';
SELECT * FROM dbo.tblEmployeesMasking;
GO

Output:

Now you can see that the 'empPhoneNo' column is also masked.
The next test is to remove the 'empPhoneNo' column's mask and set a custom masking function instead:

REVERT;
GO

--Remove masking
ALTER TABLE dbo.tblEmployeesMasking
ALTER COLUMN empPhoneNo DROP MASKED;
GO

--Let's mask the phone number with a custom string (partial) function
ALTER TABLE dbo.tblEmployeesMasking
ALTER COLUMN empPhoneNo VARCHAR(20) MASKED WITH (FUNCTION = 'partial(3,"-XXXX-XX",2)');


Let's check the records of the table using a non-sysadmin user.

--Retrieve the table contents by running the query with TestUser
EXECUTE AS USER = 'TestUser';
SELECT * FROM dbo.tblEmployeesMasking;
GO

Output:

As you can see from the above screenshot, now the empPhoneNo column is masked based on our custom masking function.
Lastly, if you want to allow 'TestUser' to see the unmasked data, you can grant the user the UNMASK permission:

REVERT;
GO

-- Granting the UNMASK permission
GRANT UNMASK TO TestUser;
GO

Let's check the records of the table within the context of 'TestUser':

--Retrieve the table contents by running the query with TestUser
EXECUTE AS USER = 'TestUser';
SELECT * FROM dbo.tblEmployeesMasking;
GO

Output:

Because TestUser was granted the UNMASK permission, the query now returns the unmasked data.

Similarly, if we REVOKE the UNMASK permission, TestUser will again see only the masked data.
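
A minimal sketch of that step, reusing the same pattern as above:

REVERT;
GO

--Revoke the UNMASK permission from TestUser
REVOKE UNMASK FROM TestUser;
GO

--Retrieve the table contents by running the query with TestUser
EXECUTE AS USER = 'TestUser';
SELECT * FROM dbo.tblEmployeesMasking;
REVERT;
GO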

Dynamic Data Masking is a significant feature in SQL Server 2016 and Azure SQL Database. As you can see from the above examples, it is very easy to use, and it can really help you when it comes to security and compliance regarding your organization's most valuable asset: its data.


References: 
MSDN Library Article: What's New in SQL Server 2016 Release Candidate (RC2)
MSDN Library Article: Dynamic Data Masking
Microsoft Azure Document: Get started with SQL Database Dynamic Data Masking (Azure Portal)

See also...
SQL Server 2016 Top Features
Read more on this article...

Friday, April 1, 2016

I'm a Microsoft Data Platform MVP for the 8th Year!

Just got the magic email from Microsoft saying:

Dear Artemakis Artemiou,

Congratulations! We are pleased to present you with the 2016 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in Data Platform technical communities during the past year.

...

The Microsoft MVP Award provides us the unique opportunity to celebrate and honor your significant contributions and say "Thank you for your technical leadership."
-------------------------------------------
This is my 8th year as a Microsoft Data Platform MVP and I feel very honored to receive this prestigious award for another year.
Interacting with the Data Platform technical communities all over the world is something magical. The endless technical discussions, the quest for knowledge and solutions to complex problems, the excitement for new technologies, events, books, blogs, social networks..., everything about it is special.

I have been working with SQL Server for more than 15 years now and hey, I'm just loving it! Working with and constantly researching the most powerful Data Platform in the world for so many years can only offer you deep knowledge of a large set of technologies that all together converge and form this amazing Data Platform known to all of us as SQL Server and Azure SQL Database.

A huge thank you to all of you, my fellow SQL Server community members for having these great technical discussions through all online and offline channels. It is these discussions that make us better in what we do.

I would also like to thank Microsoft for supporting the technical communities all over the globe so actively and continuously. Also, a special thank you to the local Microsoft Office at Cyprus for supporting our community activities at the Cyprus .NET User Group.

I would also like to thank the SQL Server and other involved teams at Microsoft for implementing this amazing Data Platform vision and for having all those great discussions with us.

Furthermore, I would like to thank the CEE MVP Community Team for all its valuable support throughout each MVP Award year.

On top of everything, a huge thank you to my beautiful wife and daughter for their true support, love and understanding during all those endless hours I spent away from them working on all that "community stuff" :) Without your support, you know it would just not feel right. I love you and thank you from the depths of my heart! Once again, this award is dedicated to you!

The journey continues. So many technologies, all part of this amazing Data Platform: In-Memory OLTP, SQL Server on Linux, Advanced Analytics with SQL Server R Services, and the list goes on.

Once again, my commitment to all of you is that I will continue sharing with you my knowledge on SQL Server and related technologies via my blog, online and offline events, my books, social networks, my technical papers on different journals, and any other available channel!


Read more on this article...

Wednesday, March 30, 2016

SQL Server 2016: The STRING_SPLIT Function

A long-awaited string function added in SQL Server 2016 is STRING_SPLIT. As the name implies, this function splits the given character expression using a separator specified by the user.
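
Note that STRING_SPLIT requires the database compatibility level to be at least 130; a quick sketch for checking and setting it (the database name 'SampleDB' below is just a placeholder):

--STRING_SPLIT requires database compatibility level 130 (SQL Server 2016)
SELECT name, compatibility_level FROM sys.databases;

ALTER DATABASE SampleDB SET COMPATIBILITY_LEVEL = 130;
GO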

Let's see some examples of using the STRING_SPLIT function.


--
--Example #1
--
DECLARE @string AS VARCHAR(250);
SET @string = '1-2-3-4-5-6-7-8-9-10';

SELECT  value
FROM    STRING_SPLIT(@string, '-');

--Output:


--
--Example #2
--
CREATE TABLE #test
    (
      id INT ,
      productsPurchased VARCHAR(250)
    );

INSERT  INTO #test
VALUES  ( 1, 'product1, product2, product3' );

SELECT  t.id ,
        LTRIM(t2.value) AS product
FROM    #test t
        CROSS APPLY STRING_SPLIT(t.productsPurchased, ',') t2;

--Output:


--
--Example #3
--
CREATE TABLE #CSVHeaders ( headers VARCHAR(MAX) );

INSERT  INTO #CSVHeaders
VALUES  ( 'header1,header2,header3,header4,header5,header6,header7,header8,header9,header10,header11,header12,header13,header14,header15' );

SELECT  LTRIM(value) AS HeaderName
FROM    #CSVHeaders
        CROSS APPLY STRING_SPLIT(headers, ','); 

--Output:


As you can see, STRING_SPLIT is a very handy new string function in SQL Server 2016. It allows you to easily parse delimited text into rows using a single-character separator.

For more info, please visit MSDN Books Online.

Read more on this article...

Saturday, March 5, 2016

In-Memory OLTP Simulator: The Experiment

A few days ago I published a 38-page technical paper titled "In-Memory OLTP Simulator: The Experiment".

In-Memory OLTP Simulator is a software tool that I developed, which allows the user to easily simulate virtually any workload against the powerful In-Memory OLTP Engine of Microsoft SQL Server.

This paper presents a series of simulations for a specific scenario against different workload sizes. The paper discusses the findings of these simulations by describing how the entire process was designed using In-Memory OLTP Simulator, as well as by observing and analyzing the performance trends after executing the different versions of the scenario against SQL Server’s In-Memory OLTP Engine.

The highlight of the paper is the fact that for a workload of 5M records, I achieved breakthrough performance with a speedup of 26x over disk-based tables!

You can download the paper from In-Memory OLTP Simulator's official website.


In-Memory OLTP Simulator: Statistics for the 5M Record Processing.

Read more on this article...

Wednesday, February 24, 2016

SQL Server 2016: Row Level Security

Row-Level Security (RLS) is one of the top features in SQL Server 2016. With RLS you can control access to rows in a table based on the characteristics of the user executing a query.

The access restriction logic is located in the database tier and access restrictions are always applied, thus they cannot be skipped.

Below I will showcase RLS using a simple scenario. This example features a fictitious school's database, and more specifically a table containing the school's student records. Each student is assigned an advisor. By using Row-Level Security, it is possible for each advisor to see only the records of their own students. In the same scenario, only the school principal can view all student records.

--
-- Row Level Security Example
--

USE master;
GO

--Create sample database
CREATE DATABASE RLSDemoDB;
GO

--Use the sample database
USE RLSDemoDB;
GO

--Create four database users 
CREATE USER Advisor1 WITHOUT LOGIN;
CREATE USER Advisor2 WITHOUT LOGIN;
CREATE USER Advisor3 WITHOUT LOGIN;
CREATE USER Principal WITHOUT LOGIN;

--Create the students table
CREATE TABLE tblStudents
(
      code VARCHAR(10) NOT NULL PRIMARY KEY,
      name VARCHAR(100) NOT NULL, 
      advisorCode sysname,
      classID INT,
      GPA FLOAT
);

--Insert sample data
INSERT  tblStudents
VALUES  ( 'ST001', 'Student1','Advisor1', 1,3.2),
        ( 'ST002', 'Student2','Advisor1', 3,3.3),
        ( 'ST003', 'Student3','Advisor1', 5,2.8),
        ( 'ST004', 'Student4','Advisor2', 2,3.9),
        ( 'ST005', 'Student5','Advisor2', 4,3.1),
        ( 'ST006', 'Student6','Advisor2', 5,1.8),
        ( 'ST007', 'Student7','Advisor3', 6,2.2),
        ( 'ST008', 'Student8','Advisor3', 4,3.7),
        ( 'ST009', 'Student9','Advisor3', 3,3.6),
        ( 'ST010', 'Student10','Advisor4',2,3.5)
GO

Let's check the table contents before enabling the security policy:


Now let's grant SELECT access to all users, that is, the three advisors and the principal:

--Grant SELECT access to users
GRANT SELECT ON tblStudents TO Advisor1;
GRANT SELECT ON tblStudents TO Advisor2;
GRANT SELECT ON tblStudents TO Advisor3;
GRANT SELECT ON tblStudents TO Principal;
GO

Now it's time to create the security policy. First, we have to create a schema to hold the predicate function:

--Create security schema
CREATE SCHEMA Security;
GO

Then, we need to create the function that will be used for checking the executing user:

--Create function that checks the executing user
CREATE FUNCTION Security.fn_securitypredicate(@advisorCode AS sysname)
    RETURNS TABLE
WITH SCHEMABINDING
AS
    RETURN SELECT 1 AS fn_securitypredicate_result 
    WHERE @advisorCode = USER_NAME() OR USER_NAME() = 'Principal';
GO

Finally we create and enable the security policy which uses the schema and the function created above:

--Create security policy
CREATE SECURITY POLICY StudentFilter
 ADD FILTER PREDICATE Security.fn_securitypredicate(advisorCode) 
 ON dbo.tblStudents
 WITH (STATE = ON);
GO

Now let's retrieve the records from the tblStudents table, each time within the context of a different user (i.e. Advisor1, Advisor2, Advisor3 and Principal).
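
A minimal sketch of the query, switching the execution context with EXECUTE AS and REVERT (repeat for each of the four users):

--Run the query within the context of a specific user (here: Advisor1)
EXECUTE AS USER = 'Advisor1';
SELECT * FROM dbo.tblStudents;
REVERT;
GO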


If we execute the above query as the user 'Advisor1' we can only see the student records that have 'Advisor1' as their advisor:

If we execute the above query as the user 'Advisor2' we can only see the student records that have 'Advisor2' as their advisor:

If we execute the above query as the user 'Advisor3' we can only see the student records that have 'Advisor3' as their advisor:

If we execute the above query as the user 'Principal' we can see all student records:

If we disable the policy, we return to the original state, where all the records are retrieved from the tblStudents table within the context of any user that has been granted SELECT permission on the table:
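
A sketch of how the policy can be disabled (setting STATE back to ON re-enables the filtering):

--Disable the security policy (row filtering stops being applied)
ALTER SECURITY POLICY StudentFilter
WITH (STATE = OFF);
GO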

From the above example you can see that it is quite easy to control access to your data at the most granular level in the database, that is, the row level, by using Row-Level Security in SQL Server 2016.
Read more on this article...

Wednesday, February 3, 2016

The transaction log for database 'dbname' is full due to 'XTP_CHECKPOINT'

When using In-Memory OLTP in SQL Server 2014, under certain conditions you might get the below error message:

The transaction log for database '[database_name]' is full due to 'XTP_CHECKPOINT'

This issue was fixed in Cumulative Update 3 for SQL Server 2014 SP1.

More information about the fix can be found here.

Read more on this article...

Thursday, December 31, 2015

Selected Tweets of Year 2015!

Another year has passed...as always, full of everything! Events, articles, direct communication with you, my fellow SQL Server community members, via different online and offline channels, software and book releases and many other activities!

Among my blog and journal articles, user group/conference events, tweets, and software and book releases, I feel that two of my activities stood out in 2015. The first highlight was the release of my software tool 'In-Memory OLTP Simulator', and the second was the release of my third book, titled 'Developing SQL Server', which is part of 'The SQL Server and .NET Blog eBook Series'.

In-Memory OLTP Simulator allows you to easily test SQL Server's In-Memory OLTP Engine against your own workload, with the use of custom scenarios which you can easily create using the tool. I worked for two years on this and I am really proud of it!

My book 'Developing SQL Server' contains dozens of articles I wrote on different topics that have to do with SQL Server development. This book completed my three-eBook series 'The SQL Server and .NET Blog eBook Series' but hey, don't worry, there is much more to come! :)

All the above provided me with great interaction with you, my fellow SQL Server community members, for one more year. Thank you for making the SQL Server community an amazing place to be!

I'm really looking forward to 2016, having already scheduled a variety of amazing community activities!

Wishing to all of you a Happy New Year 2016!

As at every year's end, below you can find my selected tweets for the year 2015.

In-Memory OLTP Simulator



Blog Posts/Articles/Books and Announcements



News

  • Another successful #CDNUG event took place yesterday! Next stop: The In-Memory OLTP Experiment Technical Whitepaper! #SQLServer #MVPBuzz
  • Preparing my session for CDNUG's event on Nov 24. Title: #SQLServer 2016 Top Features! #sqlfamily #community #mvpbuzz
  • A technical whitepaper for In-Memory OLTP Simulator is in the works. Make sure you check often: https://www.inmemoltpsim.com/  #SQLServer #MVPBuzz
  • Stay up to date with all my publications on #SQLServer - Register to the monthly newsletter! - http://goo.gl/forms/jlINJR9vCb #MVPBuzz #Community
  • If you are interested in #SQLServer In-Memory Optimization then you should try "In-Memory OLTP Simulator". https://www.inmemoltpsim.com/download/  #Hekaton
  • Get the Community Edition of In-Memory OLTP Simulator today at: https://www.inmemoltpsim.com/download  #Hekaton #SQLServer #InMemoryOLTP
  • Up to 20 Million records were processed via a custom scenario. The results are amazing! More news soon! #Hekaton #SQLServer
  • A tech whitepaper on In-Memory OLTP Simulator describing the 1st large-scale In-Memory OLTP Simulation conducted is in the works. #SQLServer
  • All scenarios are designed and executed using the Ultimate Edition of In-Memory OLTP Simulator: https://www.inmemoltpsim.com/features/  #SQLServer
  • Up to now: 1M records:2.25x speedup, 5M records:7.52x speedup, 10M records:10.17x speedup.Now, who can see the trend? :) #SQLServer #MVPBuzz
  • Started running the In-Memory OLTP Experiment! It is expected to be completed by tomorrow or Wednesday the latest! #MVPBuzz #SQLServer
  • Designing the 100M scenario in In-Memory OLTP Simulator (https://www.inmemoltpsim.com/) - More info tomorrow! #MVPBuzz
  • Generating 100M records to simulate a complex update operation using the In-Memory OLTP Engine in #SQLServer #MVPBuzz
  • I have just been re-awarded as a Microsoft SQL Server MVP for the 7th Year in a Row! http://aartemiou.blogspot.com/2015/04/microsoft-sql-server-mvp-for-7th-year.html #MVPBuzz #SQLServer
  • Simulating heavy aggregations against 10M+ records using In-Memory OLTP...This is amazing! More news real soon! #MVPBuzz #SQLServer
  • Recommended 2-day free online .NET conference - dotnetConf - http://www.dotnetconf.net/ 
  • The more I use @Snagit the more I get pleasantly surprised by its intelligent engine! Keep it up guys! #MVPBuzz


Miscellaneous

  • Have just rebuilt the #SQLServer system databases on an instance in order to change their collation! Cool process but be careful! 😊 #MVPBuzz
  • Migration completed! Back to "relaxing" mode :) #MVPBuzz
  • Relaxing...before starting the migration of 60 DBs from #SQLServer 2008R2 to SQL Server 2014! 😊 #MVPBuzz
  • There is always a bug around when you write and maintain tens of thousands of lines of code! If you are lucky you will find it :) #Community
  • Great article on @lifehackorg - How To Work Faster And Smarter - http://www.lifehack.org/articles/productivity/how-work-faster-and-smarter.html?mid=20150205&ref=mail&uid=363574&feq=daily - That's a must, especially in large scale projects!
  • ISACA CY / itSMF CY Event on Cloud Security & Service Management on Jan 29, 2015. For more info & agenda: http://www.itsmf.org.cy/  #Community


Read more on this article...

Saturday, December 26, 2015

SQL Server 2016: TempDB Enhancements

The TempDB system database plays a major role when it comes to performance in SQL Server. TempDB is available to all users connected to a SQL Server instance, and it is used to hold local or global temporary tables, temporary stored procedures, table variables and cursors.

Another example of when TempDB is used is when you make use of the 'SORT_IN_TEMPDB' option while creating or rebuilding an index. This option tells SQL Server's Database Engine to store in TempDB the intermediate sort results that are used to build the index.

For all the above reasons, and more, properly configuring the TempDB database plays a significant role in SQL Server's performance.

Until SQL Server 2014, you would usually install the SQL Server instance and then tune the TempDB database after installation.
SQL Server 2016 now provides you with the capability of configuring the TempDB database during the installation of a SQL Server instance, along with some other enhancements. These are:
  • All files grow at the same time (no need for trace flags 1117 and 1118)
  • By default, Setup adds as many TempDB data files as the CPU count (max = 8)
  • Through Setup you can specify:
    • The number of TempDB data files, their initial size, autogrowth and directory placement (you can set multiple volumes for TempDB data files)
    • The initial size of the log file, its autogrowth and directory placement

The above help the DBA configure the TempDB system database properly, faster and more simply, during the setup of SQL Server, thus ensuring a performant TempDB configuration from the start.
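
After installation, you can quickly verify the resulting TempDB layout; a minimal sketch:

--Check TempDB's files, sizes (8-KB pages converted to MB) and autogrowth settings
SELECT  name ,
        physical_name ,
        size * 8 / 1024 AS size_MB ,
        growth ,
        is_percent_growth
FROM    tempdb.sys.database_files;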

Below you can find a screenshot taken during the installation of SQL Server 2016 (CTP 3.0) that illustrates the TempDB configuration dialog:


References:
MSDN Library Article: tempdb Database
MSDN Library Article: SORT_IN_TEMPDB Option For Indexes

See also...
SQL Server 2016 Top Features
Read more on this article...

Wednesday, December 16, 2015

SQL Server 2016: Temporal Tables

Temporal tables are a new type of user table in SQL Server 2016. The concept was introduced in the ISO/ANSI SQL:2011 standard.

Temporal tables keep a full history of data changes (based on UTC time) by using a pair of current and history tables. These tables have the same structure.

The period of validity for every row is managed by the system (i.e. the Database Engine) by using two explicitly defined columns of the DATETIME2 data type.

Below you can find an example of a user table's definition that has system versioning enabled:

--Create table along with specifying its historical table
CREATE TABLE dbo.Students2
(
  studentID INT NOT NULL IDENTITY(1,1) CONSTRAINT PK_Students PRIMARY KEY,
  studentName VARCHAR(150) NOT NULL,
  studentAddress VARCHAR(150) NOT NULL,
  regDate DATE NOT NULL,
  SysStartTime DATETIME2 GENERATED ALWAYS AS ROW START HIDDEN NOT NULL,
  SysEndTime DATETIME2 GENERATED ALWAYS AS ROW END HIDDEN NOT NULL,
  PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.Students2History));
GO

The above DDL will create two tables: (i) Students2, and (ii) Students2History.

In the "dbo.Students2" table, the user needs to populate only the first four columns (studentID, studentName, studentAddress and regDate). Note: in this case, "studentID" is populated automatically as well because it is an identity column. The rest of the columns are maintained automatically by SQL Server's Database Engine, and the "dbo.Students2History" table is populated entirely automatically by the Database Engine.

In the below screenshot we can see both tables:

Figure 1: Pair of Current-Historical Table

Let's check both tables for any contents:

Figure 2: Table Contents.

Now, let's insert some data in the "dbo.Students2" table:

--Insert initial data
INSERT INTO dbo.Students2
        ( studentName ,
          studentAddress ,
          regDate 
        )
VALUES  ( 'John Adams', 
          'Address 1', 
          GETDATE() 
        );
GO

Let's check both tables again for any contents:

Figure 3: Table Contents After Insert.

As you can see, the "dbo.Students2" table was populated with data.

The "dbo.Students2History" table was not populated with any data because the history table is populated only when existing rows are updated or deleted.

That being said, let's try to update the row:

--Now let's update a record
UPDATE dbo.Students2
SET studentAddress='Address 1 Update'
WHERE studentID=1;
GO

Let's check the table contents again:

Figure 4: Table Contents After Update Operation.

Now you can see that the "Students2History" table contains the original values of the row with studentID=1, as they were just before the update operation took place.

Let's update the same record again:

--Update the same record
UPDATE dbo.Students2
SET studentAddress='Address 1 Update 2'
WHERE studentID=1;
GO

Let's check the table contents:

Figure 5: Table Contents After Second Update.

As you can see, the "Students2History" table now also contains the row's values as they were just before the second update operation took place. So, in total, the history table contains one version of the row with studentID=1 for each of the two updates. The validity period of each row version can be determined from its "SysStartTime" and "SysEndTime" values.

This article was a simple example of using temporal tables in SQL Server 2016. I consider temporal tables one of the top new features in SQL Server 2016, as they offer significant functionality.

Some benefits of using temporal tables:
  • Auditing all data changes (i.e. for data forensics)
  • Data reconstruction/recovery (see the query sketch below)
  • Trends monitoring over time
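
For example, a minimal sketch that reconstructs all versions of a row by querying the temporal table with the FOR SYSTEM_TIME clause:

--Return all row versions (current and historical) for studentID=1
--The HIDDEN period columns must be listed explicitly to appear in the output
SELECT  * ,
        SysStartTime ,
        SysEndTime
FROM    dbo.Students2 FOR SYSTEM_TIME ALL
WHERE   studentID = 1
ORDER BY SysStartTime;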


Read more on this article...

Monday, December 14, 2015

SQL Server 2016: Built-In JSON Support

JSON stands for JavaScript Object Notation, and it is widely used for data representation in the data services world. With SQL Server 2016's built-in JSON support, you can format query results as JSON by adding the FOR JSON clause to a SELECT statement. Also, you can convert JSON data to rows and columns by calling the OPENJSON rowset provider function.

Below we are going to see some examples of utilizing SQL Server's JSON support.

For the purposes of these demos, I have created a sample table called "Students" and populated it with random data. Let's take a look at the first 10 rows of this table:

Figure 1: Sample Table "Students".

In order to extract data from a table and represent it in JSON format, you can use either the "FOR JSON AUTO" clause or the "FOR JSON PATH" clause.

The "FOR JSON AUTO" clause formats the JSON output automatically, based on the structure of the SELECT statement. You can additionally specify options such as the root element.

Example:
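
A minimal sketch (the "Students" column names used here are assumptions):

--FOR JSON AUTO: the output is shaped automatically from the SELECT statement
SELECT TOP (2)
        studentID ,
        studentName
FROM    dbo.Students
FOR JSON AUTO, ROOT('Students');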


Figure 2: Using the "FOR JSON AUTO" Clause.


The "FOR JSON PATH" clause gives you full control over the format of the JSON output; for example, you can create nested objects by using dot-separated paths in column aliases.

Example:
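
A minimal sketch (again assuming the same hypothetical column names):

--FOR JSON PATH: dot-separated aliases give full control over the output's nesting
SELECT  studentID AS 'student.id' ,
        studentName AS 'student.name'
FROM    dbo.Students
FOR JSON PATH;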

Figure 3: Using the "FOR JSON PATH" Clause.


Now, if you want to transform JSON data into tabular form, you can use the OPENJSON rowset function.

Example:
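
A minimal, self-contained sketch using an inline JSON document:

--OPENJSON: convert a JSON array into rows and columns
DECLARE @json NVARCHAR(MAX) =
    N'[{"id":1,"name":"Student1"},{"id":2,"name":"Student2"}]';

SELECT  id ,
        name
FROM    OPENJSON(@json)
WITH    ( id INT '$.id' ,
          name VARCHAR(100) '$.name' );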

Figure 4: Using the "OPENJSON" Rowset Function.



References: 
MSDN Library Article: What's New in SQL Server 2016, November Update
MSDN Library Article: Format Query Results as JSON with FOR JSON (SQL Server)
MSDN Library Article: Convert JSON Data to Rows and Columns with OPENJSON (SQL Server)

See also...
SQL Server 2016 Top Features

Read more on this article...