Thursday, August 27, 2015

In-Memory OLTP Simulator Frequently Asked Questions (FAQ)

  • What is In-Memory OLTP Simulator?
    In-Memory OLTP Simulator is a third-party tool that allows the user to easily create and run scenarios with different workloads against the In-Memory OLTP Engine of SQL Server®.
  • What is a Simulation Scenario?
    A simulation scenario is a set of table and stored procedure definitions that generate the data structures, workload and processes that run during a simulation in In-Memory OLTP Simulator.
  • Which editions of SQL Server® can be used?
    In-Memory OLTP Simulator connects to and generates its database only on SQL Server editions that support In-Memory OLTP. Currently, this is supported in the Evaluation, Developer and Enterprise editions of SQL Server 2014 and SQL Server 2016.
  • What are the differences between the Community and Ultimate Editions?
    The Community Edition, which is free, allows running only the six standard scenarios available in the tool. The Ultimate Edition, in addition to the basic functionality, allows creating and using custom simulation scenarios, generating and using multiple simulation databases, as well as accessing the Executive Report functionality.
  • Can I modify standard scenarios?
    Standard scenarios cannot be modified in either the Community or the Ultimate Edition. However, as additional functionality for all users of the tool, the user can change the number of records for all standard scenarios, up to a maximum of 1 million records.
  • Can I change the number of records for standard scenarios?
    Yes, you can change the number of records for standard scenarios within the range of 100K – 1M records. You can perform this action from Configure - Scenario Manager.
  • What are the system requirements for using In-Memory OLTP Simulator?
    You can find all system requirements at the download page of the tool at: http://www.inmemoltpsim.com/download/
  • Can I generate and connect to multiple Simulator databases?
    Yes, you can do this in the Ultimate Edition.
  • In case a simulation is taking too long can I interrupt it?
    You can forcibly stop the simulation using the functionality provided under the “Emergency” menu. For example, you can choose “Stop Simulation” as well as “Drop Scenario” in order to free up any resources occupied by the scenario’s execution.
  • What does “Initialize Scenario” do?
    It creates all the data structures defined in the selected simulation scenario’s definition and prepares the data to be processed (i.e. sample data generation, or data from other sources).
  • What is “Disk-Based” mode?
    In Disk-Based execution mode, all tables are stored on disk. This is the “traditional” execution mode found in all Relational Database Management Systems (RDBMSs). The Disk-Based mode is considered the baseline for all simulations.
  • What is “Memory-Optimized” mode?
    In Memory-Optimized execution mode, all tables are loaded and processed in-memory. Under this mode, memory-optimized tables can also be combined with disk-based tables if this better suits the user’s needs.
  • What is “Memory-Optimized” with “Natively-Compiled Stored Procedure” mode?
    In Memory-Optimized with Natively-Compiled Stored Procedure execution mode, the full power of the SQL Server® In-Memory OLTP Engine is utilized. Along with the memory-optimized data structures, a new, special type of SQL Server® stored procedure is used. Natively compiled stored procedures are compiled when they are created (in contrast to disk-based stored procedures, which are compiled at their first execution), thus achieving significantly faster execution. A minimal sketch follows below.
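
For illustration only, below is a minimal, hedged T-SQL sketch of the two object types this mode combines. The table and procedure names are hypothetical, and the database is assumed to already have a MEMORY_OPTIMIZED_DATA filegroup:

--Hypothetical memory-optimized table
CREATE TABLE dbo.SampleOrders (
    OrderID INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    Amount MONEY NOT NULL
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
GO

--Hypothetical natively compiled stored procedure: compiled once, at creation time
CREATE PROCEDURE dbo.InsertSampleOrder @OrderID INT, @Amount MONEY
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    INSERT INTO dbo.SampleOrders (OrderID, Amount) VALUES (@OrderID, @Amount);
END;
GO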

Read more on this article...

Wednesday, August 5, 2015

In-Memory Optimization in SQL Server: Will my Workload Execute Faster?

In-Memory OLTP in SQL Server 2014 or later is a powerful engine integrated into the SQL Server Engine, optimized for Online Transaction Processing (OLTP).

If I were to describe this technology in two simple sentences, I would do so with the below:
  • It eliminates both locks and latches with an optimistic multi-version concurrency control mechanism.
  • It introduces memory-optimized tables and natively-compiled stored procedures (a small setup sketch follows this list).
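
As a bit of hands-on context for the above, before any memory-optimized table can be created the database needs a filegroup for memory-optimized data. A minimal, hedged sketch (the database name and container path are hypothetical):

--Add a filegroup and container for memory-optimized data (hypothetical names/paths)
ALTER DATABASE SampleDB ADD FILEGROUP SampleDB_mod CONTAINS MEMORY_OPTIMIZED_DATA;
GO
ALTER DATABASE SampleDB ADD FILE (NAME='SampleDB_mod_container', FILENAME='C:\Data\SampleDB_mod') TO FILEGROUP SampleDB_mod;
GO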
However, organizations always ask one key question (besides the cost) when it comes to investing in a new technology. In this case, the question would most probably look like: "Will we benefit in terms of significant performance and time gains if we invest in this technology?"
The answer, of course, is "It Depends" :)
But In-Memory Optimization achieves significant speedups, so why say "it depends"?

The reason for the classic "it depends" answer is the workload types each organization’s DBMS systems usually process. You see, there is a list of best-suited workload types where In-Memory Optimization can provide significant performance gains. Some of these are:
  • High Data Insert Rate 
    • Examples: Smart Metering, System Telemetry
  • Read Performance and Scale
    • Example: Social Network Browsing
  • Compute Heavy Data Processing 
    • Examples: Manufacturing supply chains or retailers
  • Low latency Execution 
    • Examples: Online Gaming Platforms, Capital Markets
  • Session State Management
    • Example: Managing sessions (i.e. user requests, etc.) for a heavily-visited website
This of course does not mean that workload types not in the above list cannot benefit from In-Memory Optimization, but for them a more thorough investigation/study will be required.

There are many ways to investigate whether using In-Memory Optimization in SQL Server could give you the performance gain you are looking for.
One of them is simulation. I'm a big fan of simulation (and I'm sure the rest of the world is too) because with simulation you can process actual production data/workloads in the safety of an isolated, non-production environment equipped with a promising new technology or a new processing method. The results of the simulation can indicate (maybe not with 100% certainty, but very close to it) whether processing part of your workload, or all of it, using the In-Memory OLTP Engine in SQL Server can give you the performance gain you are looking for.

For all the above reasons I created "In-Memory OLTP Simulator". In-Memory OLTP Simulator enables the user to easily test the powerful In-Memory OLTP Engine of SQL Server 2014 (or later) against different workloads via a simple and intelligent GUI. In-Memory OLTP Simulator features three simulation modes which are:
  • Disk-Based
  • Memory-Optimized
  • Memory-Optimized with Natively-Compiled Stored Procedure
The available editions of In-Memory OLTP Simulator are:
  • Community Edition (free)
  • Ultimate Edition
The main difference between the two editions is that in the Community Edition you can execute only six standard scenarios and change only their number of records, whereas in the Ultimate Edition, in addition to the basic functionality, you can, among other things, create, execute and manage custom scenarios as well as get access to an executive report. For a full feature comparison please visit: https://www.inmemoltpsim.com/features

Below you can find the presentation:

Read more on this article...

Wednesday, July 8, 2015

In-Memory OLTP Simulator v1.0 Now Live!

After two years of heavy development, my tool "In-Memory OLTP Simulator" is finally out!

In-Memory OLTP Simulator enables the IT Professional and Developer to easily test the powerful In-Memory OLTP Engine of SQL Server® 2014 (or later) with different simulations and against different workloads. In the Ultimate Edition of the tool, among other things, the user can load data from different sources (i.e. data from Production databases using Linked Servers) into the Simulator’s database, simulate real-life scenarios with the use of custom scenarios, and get in-depth analytics.

Video Introduction:
For a short introduction, please watch the video below:

Editions:
  • Community Edition: This is the free version of the tool where basic functionality is enabled.
  • Ultimate Edition: This edition contains all the basic functionality plus the ability to create/modify and import/export custom scenarios, as well as access to more analytics. The Ultimate Edition of In-Memory OLTP Simulator can be very useful for organizations or individuals that want to create custom scenarios and simulate real Production data and processes using In-Memory Optimization in SQL Server. The Ultimate Edition requires a license to be purchased.
Read more on this article...

Friday, July 3, 2015

Introducing In-Memory OLTP Simulator

One of the key features shipped with SQL Server 2014 was the long-awaited In-Memory OLTP, code-named project “Hekaton”. In-Memory OLTP introduced new data structures for optimizing the performance of OLTP workloads. These data structures are called “memory-optimized” tables and, along with natively compiled stored procedures, they can offer a huge performance boost when it comes to processing large volumes of data, especially in data environments with high levels of concurrency. In certain cases, execution can be up to 50 times faster. As its name implies, the In-Memory OLTP Engine is optimized for OLTP workloads. Examples of such workloads include:

  • Staging tables (i.e. DW)
  • High data insert rate (i.e. smart metering)
  • Low latency execution (i.e. investment trading)

Since the early releases (previews, CTPs, RCs) of SQL Server 2014, I have been studying and testing the In-Memory OLTP Engine and, of course, the inevitable happened: I became a fan of it!

Throughout the hundreds of experiments I ran against the In-Memory OLTP Engine, the idea of developing a special simulation tool came up, and this is what I am presenting today.

After almost two years of development, today I’m presenting "In-Memory OLTP Simulator".

What Does In-Memory OLTP Simulator Do?
In-Memory OLTP Simulator enables the IT Professional and Developer to easily test the powerful In-Memory OLTP Engine of SQL Server 2014 (or later) with different simulations and against different workloads. In the Ultimate Edition of the tool, among other things, the user can load data from other sources (i.e. using Linked Servers) into the Simulator’s database and, along with the logic of a custom scenario, simulate real-life scenarios and get in-depth analytics on performance when utilizing In-Memory Optimization and Natively-Compiled Stored Procedures in SQL Server. In addition to using In-Memory OLTP Simulator for benchmarking and performance analysis, the user can also use it for resource sizing, as it provides statistics on resource usage (i.e. CPU and Memory) when running simulations.

Figure 1: In-Memory OLTP Simulator Start Page.

How Does it Work?
The Community Edition (free) of In-Memory OLTP Simulator supports six standard scenarios that are executed against three modes (where available):
  • Disk-Based
  • Memory-Optimized
  • Memory-Optimized with Natively-Compiled Stored Procedure
In Disk-Based execution mode, all tables are stored on disk. This is the "traditional" execution mode found in all Relational Database Management Systems (RDBMSs). The Disk-Based mode is considered the baseline for all simulations.

In Memory-Optimized execution mode, all tables are loaded and processed in-memory. Under this mode, memory-optimized tables can also be combined with disk-based tables if this better suits the user’s needs.

In Memory-Optimized with Natively-Compiled Stored Procedure execution mode, the full power of the SQL Server In-Memory OLTP Engine is utilized. Along with the memory-optimized data structures, a new, special type of SQL Server stored procedure is used. Natively compiled stored procedures are compiled when they are created (in contrast to disk-based stored procedures, which are compiled at their first execution), thus achieving significantly faster execution.
In-Memory OLTP Simulator undertakes the execution of the scenarios in each mode, monitors the resource usage, logs all statistics and provides rich analytics. Additionally, in the Ultimate Edition of the tool, the user is able to create/modify and import/export custom scenarios, as well as get access to multiple processing options and more analytics.

Figure 2: In-Memory OLTP Simulator Simulation Page.

Available Editions
There are currently two editions of In-Memory OLTP Simulator available:
  • Community Edition: This is the free version of the tool where basic functionality is enabled.
  • Ultimate Edition: This edition contains all the basic functionality plus the ability to create/modify and import/export custom scenarios, as well as access to more analytics. The Ultimate Edition of In-Memory OLTP Simulator can be very useful for organizations or individuals that want to create custom scenarios and simulate real Production data and processes using In-Memory Optimization in SQL Server.

Figure 3: In-Memory OLTP Simulator Simulation Page After Running a Simulation Multiple Times.

Release Date
In-Memory OLTP Simulator will be released on July 8, 2015!


Read more on this article...

Thursday, May 28, 2015

Compatibility Levels Supported by Different SQL Server Versions

Software is evolving and certainly SQL Server is not an exception to the rule. Every few years we need to upgrade our databases in order to run on a newer version of SQL Server and take advantage of significant new features that will enhance the operations of our organization.

However, upgrading a database to a newer compatibility level of any DBMS involves evaluating not just the database but also how an entire application supported by that database will behave.

The easiest option for many people is to move the database to the new version of SQL Server but keep relying on the backwards compatibility support of the Database Engine. This is, however, not a recommended approach, because this way you cannot utilize all the features provided by the Database Engine of that new version of SQL Server.

A recommended high-level practice for moving a database to a newer version of SQL Server is:
  • Analyze your database with SQL Server Upgrade Advisor (for each SQL Server version there is the corresponding version of the Upgrade Advisor) in order to find any incompatibilities (i.e. the usage of deprecated features, etc.).
  • Resolve any compatibility issues that might be reported by the tool.
  • Move the database to the desired version of SQL Server on a Test Environment.
  • Test your database and the supported application to check that everything works well (this step should also involve any business users the application might have).
  • Resolve any issues that might be raised.
  • Only if you are sure that everything works well after the testing process, and any issues have been resolved, should you proceed to the actual migration.
Of course, it goes without saying that you always need to take backups of your databases, not only during their operational cycle but also before you make any changes to their configuration.
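
For instance, a minimal, hedged example of such a safety backup (the database name and path are hypothetical):

--Full backup before any configuration/migration change (hypothetical names/paths)
BACKUP DATABASE SampleDB1 TO DISK = N'C:\Backups\SampleDB1_PreMigration.bak' WITH INIT, CHECKSUM;
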
Now, you might encounter certain cases where you deal with a "legacy" application and it might be nearly impossible to upgrade its supporting database to a newer compatibility level, but at the same time you want to move it to a newer version of SQL Server. In such cases, you can make use of the backwards compatibility the Database Engine of SQL Server provides. However, each version of SQL Server supports only down to a specific compatibility level. For example, if you have a SQL Server 2000 database, you cannot migrate it to a SQL Server 2012 or 2014 instance, because there is no backwards compatibility support for such an old version of SQL Server. You can, however, migrate it to a SQL Server 2008 R2 instance.
Below you can find the supported compatibility modes for SQL Server versions 2008/R2, 2012 and 2014:

SQL Server 2008
80: SQL Server 2000
90: SQL Server 2005
100: SQL Server 2008

SQL Server 2008 R2
80: SQL Server 2000
90: SQL Server 2005
100: SQL Server 2008 (and 2008 R2 - it is the same compatibility level)

SQL Server 2012
90: SQL Server 2005
100: SQL Server 2008 and SQL Server 2008 R2
110: SQL Server 2012

SQL Server 2014
100: SQL Server 2008 and SQL Server 2008 R2
110: SQL Server 2012
120: SQL Server 2014
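
For reference, a hedged T-SQL sketch for checking and changing a database's compatibility level (the database name is hypothetical):

--Check the current compatibility level
SELECT [name], [compatibility_level] FROM sys.databases WHERE [name]='SampleDB1'

--Change it, for example, to the SQL Server 2012 compatibility level (110)
ALTER DATABASE SampleDB1 SET COMPATIBILITY_LEVEL = 110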


Read more on this article...

Saturday, May 23, 2015

Accessing Reporting Services Using a Fully Qualified Domain Name

If you installed SQL Server Reporting Services (SSRS) on a server in a domain, use a domain user to start the service, and did not perform any further configuration, then you most probably can only access the Report Manager using an IP address and not the Fully Qualified Domain Name (FQDN) of the server (if an SPN is not set).

If you try to use the Fully Qualified Domain Name to access Reporting Services, you will most probably be prompted for a username/password several times, ending up with an empty page.

In order to be able to access Reporting Services using FQDN you will need to perform the following actions:

1. Register a Service Principal Name (SPN) for the Domain User Account that Runs SSRS
Consider the following example:

sample computer name: reportsrv01
sample domain: example.com
sample domain account: example\ssrsuser

Then, on the Domain Controller server, in a Command Prompt with elevated rights (Run as Administrator), you can run:

example 1: If SSRS is on port 80 (no need to specify 80 as it is the default HTTP port):
Setspn -s http/reportsrv01.example.com example\ssrsuser

example 2: If SSRS is on any other port (i.e. 2430):
Setspn -s http/reportsrv01.example.com:2430 example\ssrsuser

2. Edit the RsReportServer.config File
On the Reporting Services server, in the virtual directory of SSRS, edit the "RsReportServer.config" file and locate the <AuthenticationTypes> section.

Then add <RSWindowsNTLM/> as the first entry in the <AuthenticationTypes> section.
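
After the change, the relevant part of the file should look similar to the below sketch (the surrounding entries are an assumption of a typical default configuration and may differ on your server):

<Authentication>
    <AuthenticationTypes>
        <RSWindowsNTLM/>
        <RSWindowsNegotiate/>
    </AuthenticationTypes>
    ...
</Authentication>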

The above step will actually enable NTLM.

For more information please visit this MSDN article.


Read more on this article...

Monday, May 4, 2015

Transferring Ownership of All Database Objects Back to DBO

Consider the following scenario: you created a Windows or SQL login in SQL Server and, in the user mapping, along with assigning a database role membership, you assigned as default schema something different than “dbo”.

Take as an example the following screenshot:

As you can see, the SQL login “testLogin” has been assigned the database role “db_owner” on the database “SampleDB1” but, along with that, “testLogin” was also set as the default schema.

If you do that, any object the login “testLogin” creates will have “testLogin”, and not “dbo”, as its owner schema.

If, however, for any reason you want to remove “testLogin” from the SQL Server instance, you will not be able to do it, because the login, through its schema, will own database objects.

If you still want to remove the login don’t lose hope, there is a way! :)

To achieve the above, you need to transfer the ownership of the objects back to the “dbo” (or any other) schema.

You can do that using the “ALTER SCHEMA” command. As described in BOL, this command transfers a securable between schemas; a single-statement example follows.
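
For instance, a minimal, hedged example (the table name "SampleTable" is hypothetical):

--Move one table owned by schema "testLogin" back to "dbo"
ALTER SCHEMA dbo TRANSFER testLogin.SampleTable;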

So how can you use this command in order to transfer back to dbo all objects owned by "testLogin"?

One of the easiest approaches is to generate the set of SQL statements that you can run in order to restore ownership for all objects of the following types:
- tables
- views
- functions (scalar, table-valued)
- stored procedures
- ...and maybe any other object types

A sample logic that could do that would be:

--User Option 1: Set the database (in this example "SampleDB1")
USE SampleDB1
GO

--Variable declarations
DECLARE @currentSchema VARCHAR(100)
DECLARE @newSchema VARCHAR(100)

--User Options 2 and 3: set the current schema and the new schema 
--to which you want to transfer the selected database's objects
--in this example current schema='testLogin' and new schema='dbo'
SET @currentSchema='testLogin'
SET @newSchema='dbo'

DECLARE @schemaID INT
SET @schemaID=(SELECT [schema_id] FROM sys.schemas WHERE [name]=@currentSchema)

--The actual logic
IF OBJECT_ID('TempDB..#Statements') IS NOT NULL
DROP TABLE #Statements

CREATE TABLE #Statements(
ScriptToRun VARCHAR(200)
)

--Generate statements for all user tables
INSERT INTO #Statements      
SELECT 'ALTER SCHEMA ' + @newSchema + ' TRANSFER ' + @currentSchema + '.' + [name]
FROM sys.objects
WHERE type='U' AND [schema_id]=@schemaID

--Generate statements for all user views
INSERT INTO #Statements      
SELECT 'ALTER SCHEMA ' + @newSchema + ' TRANSFER ' + @currentSchema + '.' + [name]
FROM sys.objects
WHERE type='V' AND [schema_id]=@schemaID

--Generate statements for all stored procedures
INSERT INTO #Statements      
SELECT 'ALTER SCHEMA ' + @newSchema + ' TRANSFER ' + @currentSchema + '.' + [name]
FROM sys.objects
WHERE type='P' AND [schema_id]=@schemaID

--Generate statements for all scalar functions
INSERT INTO #Statements      
SELECT 'ALTER SCHEMA ' + @newSchema + ' TRANSFER ' + @currentSchema + '.' + [name]
FROM sys.objects
WHERE type='FN' AND [schema_id]=@schemaID

--Generate statements for all table-valued-functions
INSERT INTO #Statements      
SELECT 'ALTER SCHEMA ' + @newSchema + ' TRANSFER ' + @currentSchema + '.' + [name]
FROM sys.objects
WHERE type='TF' AND [schema_id]=@schemaID

--Set "Results to Text" (Ctrl + T)
SELECT * FROM #Statements

Then, if you want to proceed with the transfer of ownership, you can execute the generated statements in the "ScriptToRun" column, that is, the contents of the temporary table #Statements.

Note 1: As all posts in this blog, this post is provided as-is and for educational purposes only. Do not play with your Production environments if you do not know what you are doing and of course, always take backups of your data.

Note 2: In the above logic we searched for user tables and views, stored procedures and scalar/table-valued functions. You can search for even more database object types in a similar way (more info in one of my previous posts).

Read more on this article...

Wednesday, April 1, 2015

Microsoft SQL Server MVP for the 7th Year in a Row!

Earlier today, I received that special email from Microsoft informing me that I was awarded the Microsoft Most Valuable Professional (MVP) Award:

-------------

Dear Artemakis Artemiou,

Congratulations! We are pleased to present you with the 2015 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in SQL Server technical communities during the past year.

...

The Microsoft MVP Award provides us the unique opportunity to celebrate and honor your significant contributions and say "Thank you for your technical leadership."
-------------

This is the 7th year in a row that I have been awarded the Microsoft MVP Award for SQL Server, and I am really excited! Every time I have received the MVP Award was - and is - unique, because it shows Microsoft's genuine appreciation of what we do by sharing our passion for technology and, in this case, SQL Server.

Working for so many years with SQL Server and being involved with the worldwide SQL Server community is a way of life for me. The active involvement with the community, the direct communication with Microsoft, especially with the SQL Server Product Group and our Community Program Manager, the exchange of knowledge with fellow MVPs and the broader community, and the participation in conferences, user group meetings, blogging, book authoring, software development and many other activities, are a constant source of inspiration and creativity for me. All this makes me not only a better professional but also a better human being, because it is all based on the genuine passion of sharing knowledge with fellow people on a technology we all love, and that is SQL Server.

A huge thanks to all of you, my fellow SQL Server community members for just being awesome.

I would also like to thank Microsoft for re-awarding me this prestigious award, the SQL Server Product Group for doing all this great stuff with SQL Server and related technologies, and of course our Community Program Manager at Microsoft for keeping us up to date with everything great that Microsoft provides. Additionally, I would like to thank the local Microsoft office in Cyprus for supporting our community activities at the Cyprus .NET User Group.

On top of everything, a huge thank you to my beautiful wife and daughter for their true support, love and understanding for all those hours I spent away from them working on all this "community stuff" :) Without your support, you know it would just not feel right. I love you and thank you from the depths of my heart! Once again, this award is dedicated to you!

The journey continues. My commitment to all of you is that I will continue doing my best and share with you the technical knowledge that I acquire on SQL Server as well as my experience via all the available channels!

This is going to be a really amazing year. Among other things, for the last year I have been silently working on something really special that has to do with an exceptional technology in SQL Server, that is, In-Memory Optimization. The development is now completed and testing/QA is in its final stages. This is going to be huge and it will be announced really soon! Stay tuned, friends!

Read more on this article...

Monday, March 30, 2015

There is insufficient system memory in resource pool 'internal' to run this query.

Even though you might get the above error message on different occasions, one thing is for sure: it has to do with memory management. :)

SQL Server allows you to limit the maximum memory that can be utilized by its Database Engine. You can use this feature to guarantee the OS access to memory that cannot be consumed by SQL Server and thus ensure stability. However, you have to remember to re-evaluate this setting in the event of a memory upgrade on the machine.

You can perform the above action as follows:
Right-click on the instance name in SSMS, select "Properties", and then select the "Memory" tab. You then get the below dialog (in this example I have set the maximum server memory to 3GB):

The above method is a way to ensure OS stability by never allowing SQL Server to consume all available memory, leaving an amount of memory available for the OS.
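
If you prefer T-SQL over the GUI, here is a hedged sketch of the equivalent action using the documented sp_configure option (the 3072 MB value mirrors the 3GB example above):

--Enable advanced options, then cap max server memory at 3 GB
EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure 'max server memory (MB)', 3072;
RECONFIGURE;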

Now, in the case you get the "There is insufficient system memory in resource pool 'internal' to run this query." error you can try the following actions:


Action 1
Run the T-SQL command:
ALTER RESOURCE GOVERNOR RESET STATISTICS

As described in an MSDN Library article, the above command causes Resource Governor to reset statistics on all workload groups and resource pools, including the "internal" resource pool.

Action 2:
If the above action does not solve the issue, you can try to free up more memory and, if necessary, upgrade the memory capacity of the machine.


Read more on this article...

Thursday, March 5, 2015

Handling NULL Character \x00 when Exporting to File Using BCP

This article discusses the behavior of the BCP utility when extracting data from SQL Server to files and more specifically to CSV files.

Imagine that you have the below table:

As you can see, records 2 and 3 contain null values.

Then you export the table contents to CSV using BCP:

bcp "SELECT [code],[description],[price] FROM [TestDB1].[dbo].[product]" queryout "c:\tmp\testExtract.csv" -c -C 1253 -t ; -S ".\SQL2K14" -T

Output:

As you can see on the above screenshot, the output seems correct. The records' NULL values have been replaced with empty strings.

Now consider that the original table, instead of NULL values, has empty strings:


Let's try again to export the new table contents to CSV using BCP:

bcp "SELECT [code],[description],[price] FROM [TestDB1].[dbo].[product]" queryout "c:\tmp\testExtract.csv" -c -C 1253 -t ; -S ".\SQL2K14" -T

Output:

As you can see, the empty strings have been replaced by NULLs which correspond to the hexadecimal character \x00.

From the MSDN Library article on BCP, we read the following:

  • out copies from the database table or view to a file. If you specify an existing file, the file is overwritten. When extracting data, note that the bcp utility represents an empty string as a null and a null string as an empty string.
It seems from the above example that the queryout keyword has the same behavior regarding null and empty strings.

Now, if you are going to feed the CSV file to another application for parsing and processing, and that application does not explicitly handle possible occurrences of the NULL character, then most probably an error will occur.

To this end, always keep in mind the above behavior when extracting data to files using BCP; a possible workaround is sketched below.
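
A possible workaround (my own hedged suggestion, assuming the consuming application expects empty fields rather than \x00 characters) is to turn empty strings into NULLs inside the export query itself with NULLIF, so that BCP writes them out as empty fields:

bcp "SELECT [code], NULLIF([description],''), [price] FROM [TestDB1].[dbo].[product]" queryout "c:\tmp\testExtract.csv" -c -C 1253 -t ; -S ".\SQL2K14" -T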


Read more on this article...

Wednesday, February 18, 2015

Searching for Keywords in SQL Server Jobs

Hi friends all around the globe, it's been a while since my last article, but this is a very busy period with many community activities, of which you will soon enjoy the outcome! These activities have to do with heavy software development as well as the authoring of my third eBook, titled "Developing SQL Server". More updates on everything will be provided along the way.

This article is actually a T-SQL tip that can come in quite handy. It tackles the issue of finding which SQL Server Agent jobs are affected by schema changes (i.e. a table rename, etc.).

Consider the following example where you have three tables:
-table1
-table2
-table3

You also have a SQL Server Agent job that references the above tables.

Then, for some reason, you want to rename 'table2' to 'table2New'.

What happens to the SQL Server Agent job the next time it runs? Of course it fails.

When you rename an object in SQL Server using the sp_rename stored procedure, the Database Engine warns you with the below message:
Caution: Changing any part of an object name could break scripts and stored procedures.
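
For example, the rename itself (using the hypothetical names above) would be:

EXEC sp_rename 'table2', 'table2New';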

In order not to get into trouble with such actions, especially in the case of large databases, you must first research whether there are any functions, stored procedures, jobs, etc. that might be affected.

The below script will return all jobs that include one or more steps referencing a given string pattern (this can be a table name, stored procedure name, function name, etc.). In this case, we search for 'table2':

----------------
declare @pattern nvarchar(max)
set @pattern='table2'

select j.[job_id],j.[name],j.[enabled],j.[description],s.command,j.[date_created],j.[date_modified]
from msdb.dbo.sysjobs j
inner join msdb.dbo.[sysjobsteps] s on j.job_id=s.job_id 
where s.command like '%'+@pattern+'%'
----------------
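
As a hedged companion check (my own addition, not part of the job search above), you can also search the definitions of programmability objects in a specific database via sys.sql_modules:

----------------
declare @pattern nvarchar(max)
set @pattern='table2'

--Find objects (views, procedures, functions, triggers) whose definition contains the pattern
select object_schema_name(m.[object_id]) as [schema_name], object_name(m.[object_id]) as [object_name], o.type_desc
from sys.sql_modules m
inner join sys.objects o on o.[object_id]=m.[object_id]
where m.[definition] like '%'+@pattern+'%'
----------------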

A subsequent article will contain a master procedure that can be used to search for a given pattern in all the programmability objects in SQL Server.


Read more on this article...