Building Microsoft Enterprise Applications from the Command Line

Background

I am reviewing an application that makes heavy use of Microsoft’s Enterprise Library.

Unfortunately, my installed copy of Microsoft’s Visual Studio is not working and everything has to be done through the command line.

Here is how I stumbled through.

Compilation

When we invoked csc, we encountered the errors pasted below.

Error


Daily.cs(11,17): error CS0234: The type or
namespace name 'Practices' does not exist in the namespace 'Microsoft' (are you missing an assembly reference?)

Daily.cs(18,17): error CS0246: The type or namespace name 'Database' could not be found (are you missing a using
directive or an assembly reference?)

Source Code

Here is what lines 11 and 18 look like.


using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Text.RegularExpressions;
using System.Data;
using System.Data.Common;
using System.Configuration;


using Microsoft.Practices.EnterpriseLibrary.Data;
using System.Transactions;

namespace reporting
{
    public class reportManager : ReportManager<DailyPaymentTransactionData>
    {
        private Database db;


Explanation

The cited lines are:

  1. Line 11: using Microsoft.Practices.EnterpriseLibrary.Data;
  2. Line 18: private Database db;

 

Download Enterprise Library

Download Microsoft’s Enterprise Library from https://msdn.microsoft.com/en-us/library/ff632023.aspx.

Install Enterprise Library

Here we installed Microsoft Enterprise Library onto our machine.

CustomSetup

I urge you to make a note of the target installation folder, as that is where the Enterprise Library assemblies will be placed.

Code Revision

Compile Code

Here is the original & revised compiler invocation statements.

Original


C:\Windows\Microsoft.NET\Framework64\v4.0.30319\csc.exe /define:DEBUG /optimize /out:reporting.exe *.cs 

Revised


C:\Windows\Microsoft.NET\Framework64\v4.0.30319\csc.exe /define:DEBUG /optimize /out:reporting.exe *.cs /r:"Microsoft.Practices.EnterpriseLibrary.Data.dll" /r:"Microsoft.Practices.EnterpriseLibrary.Common.dll"

Explanation

  1. Added a reference (via /r) to Microsoft.Practices.EnterpriseLibrary.Data.dll
  2. Added a reference (via /r) to Microsoft.Practices.EnterpriseLibrary.Common.dll (see the note on /lib below)
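
Note that /r with a bare file name only resolves if the DLLs sit next to the sources or on the compiler’s search path. If they are left in the Enterprise Library install folder, csc’s /lib switch can point reference resolution there. A minimal sketch, assuming a default-style install path (substitute the target folder noted during setup):


C:\Windows\Microsoft.NET\Framework64\v4.0.30319\csc.exe /define:DEBUG /optimize /out:reporting.exe *.cs /lib:"C:\Program Files (x86)\Microsoft Enterprise Library 5.0\Bin" /r:"Microsoft.Practices.EnterpriseLibrary.Data.dll" /r:"Microsoft.Practices.EnterpriseLibrary.Common.dll"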

Execution

Error

Missing Dlls

 

 

Missing Dll: Microsoft.Practices.EnterpriseLibrary.Data
Error: Unhandled Exception: System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.Practices.EnterpriseLibrary.Data, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.

Missing Dll: Microsoft.Practices.EnterpriseLibrary.Common
Error: Unhandled Exception: System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.Practices.EnterpriseLibrary.Common, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.

Missing Dll: Microsoft.Practices.ServiceLocation
Error: Unhandled Exception: System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.Practices.ServiceLocation, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.
   at Microsoft.Practices.EnterpriseLibrary.Data.DatabaseFactory.InnerCreateDatabase(String name)

Missing Dll: Microsoft.Practices.Unity
Error: Unhandled Exception: System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.Practices.Unity, Version=2.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.
   at Microsoft.Practices.EnterpriseLibrary.Common.Configuration.EnterpriseLibraryContainer.CreateDefaultContainer(IConfigurationSource configurationSource)
   at Microsoft.Practices.EnterpriseLibrary.Common.Configuration.EnterpriseLibraryContainer.SetCurrentContainerIfNotSet()
   at Microsoft.Practices.EnterpriseLibrary.Data.DatabaseFactory.InnerCreateDatabase(String name)

Missing Dll: Microsoft.Practices.Unity.Interception
Error: Unhandled Exception: System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.Practices.Unity.Interception, Version=2.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.
   at Microsoft.Practices.EnterpriseLibrary.Common.Configuration.ContainerModel.Unity.UnityContainerConfigurator..ctor(IUnityContainer container)
   at Microsoft.Practices.EnterpriseLibrary.Common.Configuration.EnterpriseLibraryContainer.CreateDefaultContainer(IConfigurationSource configurationSource)
   at Microsoft.Practices.EnterpriseLibrary.Common.Configuration.EnterpriseLibraryContainer.SetCurrentContainerIfNotSet()
   at Microsoft.Practices.EnterpriseLibrary.Data.DatabaseFactory.InnerCreateDatabase(String name)
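
At runtime the CLR probes the application’s base directory for these assemblies, so one straightforward fix is to copy the Enterprise Library binaries (including the Unity and ServiceLocation dependencies) next to reporting.exe. A minimal sketch, assuming a default-style install path (use the target folder noted during setup):


xcopy "C:\Program Files (x86)\Microsoft Enterprise Library 5.0\Bin\Microsoft.Practices.*.dll" .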

 

Missing Application Configuration File

Here is one of the errors we saw:


Unhandled Exception: Microsoft.Practices.ServiceLocation.ActivationException: Activation error occured while trying to get instance of type Database, key "DEDB"

 ---> Microsoft.Practices.Unity.ResolutionFailedException: Resolution of the dependency failed, type = "Microsoft.Practices.EnterpriseLibrary.Data.Database", name = "DEDB".

Exception occurred while: while resolving.
Exception is: InvalidOperationException - The type Database cannot be constructed. You must configure the container to supply this value.
-----------------------------------------------
At the time of the exception, the container was:

  Resolving Microsoft.Practices.EnterpriseLibrary.Data.Database,DEDB
 ---> System.InvalidOperationException: The type Database cannot be constructed.
 You must configure the container to supply this value.
   at Microsoft.Practices.ObjectBuilder2.DynamicMethodConstructorStrategy.GuardTypeIsNonPrimitive(IBuilderContext context, SelectedConstructor selectedConstructor)
   at Microsoft.Practices.ObjectBuilder2.DynamicMethodConstructorStrategy.PreBuildUp(IBuilderContext context)
   at Microsoft.Practices.ObjectBuilder2.StrategyChain.ExecuteBuildUp(IBuilderContext context)
   at Microsoft.Practices.ObjectBuilder2.DynamicMethodBuildPlanCreatorPolicy.CreatePlan(IBuilderContext context, NamedTypeBuildKey buildKey)


Explanation

  1. In our case, the application configuration file (reporting.exe.config) was missing altogether
  2. Yes, we had an App.config file in the project
  3. When building in the IDE, msbuild uses the App.config file to produce the bin\Debug\<app>.exe.config file

Sample Configuration File


<?xml version="1.0" encoding="utf-8" ?>
<configuration>

   <connectionStrings>
    
	<add name="DEDB" connectionString="Data Source=HRDB;Initial Catalog=HRDBDev;Persist Security Info=True;User ID=reporter;Password=123TRS" providerName="System.Data.SqlClient" />
    
   </connectionStrings>  
  
</configuration>
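
For context, here is a minimal sketch of how the data-access code presumably obtains the “DEDB” database by name; the DatabaseFactory frames in the stack traces above suggest this pattern, and it only works when the connection string above is present in reporting.exe.config. The class name DatabaseSmokeTest is purely illustrative.


using Microsoft.Practices.EnterpriseLibrary.Data;

namespace reporting
{
    public class DatabaseSmokeTest
    {
        public static void Main()
        {
            // Looks up the <connectionStrings> entry named "DEDB" in
            // reporting.exe.config; an ActivationException is raised when the
            // configuration file (or the entry) is missing.
            Database db = DatabaseFactory.CreateDatabase("DEDB");

            // Open and close a connection purely to prove that the
            // connection string resolves.
            using (var connection = db.CreateConnection())
            {
                connection.Open();
            }
        }
    }
}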


MSBUILD

Rather than using csc we should have used msbuild.

Pasted below are the screenshots from invoking msbuild, along with a brief description of what is being accomplished.

Build

Enterprise Library Files

Here the Enterprise Library files are copied from our install folder to the destination folder (bin\Debug):

msbuild-EnterpriseLibrary-CopyingFile

App.config File

Here the App.config file is copied to bin\Debug\[application].exe.config:

appConfigFile

EXE and PDB

Here the executable and the PDB (a program database file that holds debugging and project state information and allows incremental linking of a Debug build) are copied to the bin\Debug folder:

ExeAndPDB

 

BIN FOLDER

Here is what our bin\Debug folder looks like:

bin-debug-folder

Summary

We really should have issued msbuild, as it would have made use of the existing MSBuild project file.

Issuing msbuild would have copied the Enterprise Library DLLs into the output folder and mirrored App.config as the [app].exe.config file; a sample invocation follows.
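
Here is a minimal sketch of such an invocation; the project file name reporting.csproj is a hypothetical placeholder, so substitute the actual project (or solution) file:


C:\Windows\Microsoft.NET\Framework64\v4.0.30319\msbuild.exe reporting.csproj /p:Configuration=Debug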

Sermons & Discussions – 2016/Mid Feb

 

Prelude

Last weekend I watched a touching and affirming video.

The subject area is a bit outside my normal trajectory and it came from walking in on my cousin, Azziz, as he meditated.

I also watched a sermon forwarded to me by Paul, whom I met on my daily commute.  He pointed me to Paris Reidhead’s redirecting indictment, “Ten Shekels and a Shirt”.

 

Videos

  1. Ali Banat
  2. Paris Reidhead
  3. Perry Stone
  4. John Piper
  5. Stuart Olyott

Discussion

Ali Banat

  • Upon going to the cemetery for his brother’s burial, he came back with the thought that after you are gone, there is nothing left behind, just your deeds
  • Those deeds are the only thing that will keep you company in your grave and be with you till your final destination
  • He who loves to meet Allah, Allah loves to meet him.  And he who hates to meet Allah, Allah hates to meet him

 

Paris Reidhead

Ten Shekels and a Shirt

  1. Convince the bad man that he needs God
  2. The one and only reason for repentance is that God deserves the worship, adulation, and admiration of one’s heart
  3. If one repents just to get out of Hell, then one is just a Levite trying to serve God because He does us good
  4. It is not what you are going to get out of God; it is what He is going to get out of you
  5. That is what the Chinese Christian said when he got back to China.  Asked, “What impressed you most about America?”, he said, “The great things Americans can accomplish without God.”
  6. This became the call of Moravian missions, and this is our only reason for being: that the Lamb that was slain may receive the reward of His suffering!

SQL Server – Log Shipping – Error – “Msg 14262, Level 16, State 1, Procedure sp_verify_job_identifiers, Line 41”

Background

We are applying a Log Shipping script, but are getting the error pasted below.

 

Error

Textual


Msg 14262, Level 16, State 1, Procedure sp_verify_job_identifiers, Line 41
The specified @job_id ('46B4CCA0-8420-4FBF-AE1E-087F4A7F712E') does not exist.

Image
TheSpecifiedJobDoesNotExist

List MSDB Jobs


	-- exec msdb.dbo.sp_help_job
	declare @database sysname
	declare @databaseJob sysname

	set @database = 'NetTraffic'
	set @databaseJob = 'LogShipping_' + @database + '_db_log'

	select 
			  [database] = @database
			, job_id
			, [name]
			, [description]
			, date_created
	from msdb.dbo.sysjobs 
	where name = @databaseJob



ListDBJobs-LogShipping-DBLog.20160201

Code

Original Code

Here is the original code.




-- Execute the following statements at the Primary to configure Log Shipping 
-- for the database [DBPRIMARY].[NetTraffic_db],
-- The script needs to be run at the Primary in the context of the [msdb] database.  
------------------------------------------------------------------------------------- 
-- Adding the Log Shipping configuration 

-- ****** Begin: Script to be run at Primary: [DBPRIMARY] ******


DECLARE @LS_BackupJobId	AS uniqueidentifier 
DECLARE @LS_PrimaryId	AS uniqueidentifier 
DECLARE @SP_Add_RetCode	As int 


EXEC @SP_Add_RetCode = master.dbo.sp_add_log_shipping_primary_database 
		@database = N'NetTraffic_db' 
		,@backup_directory = N'E:\SQLBackups\MSSQL\NetTraffic_db' 
		,@backup_share = N'\\DBPRIMARY\SQLBackups\MSSQL\NetTraffic_db' 
		,@backup_job_name = N'LogShipping_NetTraffic_db_log' 
		,@backup_retention_period = 2160
		,@backup_threshold = 60 
		,@threshold_alert_enabled = 1
		,@history_retention_period = 5760 
		,@backup_job_id = @LS_BackupJobId OUTPUT 
		,@primary_id = @LS_PrimaryId OUTPUT 
		,@overwrite = 1 


IF (@@ERROR = 0 AND @SP_Add_RetCode = 0) 
BEGIN 

DECLARE @LS_BackUpScheduleUID	As uniqueidentifier 
DECLARE @LS_BackUpScheduleID	AS int 


EXEC msdb.dbo.sp_add_schedule 
		@schedule_name =N'LogShipping_NetTraffic_db_log' 
		,@enabled = 1 
		,@freq_type = 4 
		,@freq_interval = 1 
		,@freq_subday_type = 4 
		,@freq_subday_interval = 15 
		,@freq_recurrence_factor = 0 
		,@active_start_date = 20100616 
		,@active_end_date = 99991231 
		,@active_start_time = 20000 
		,@active_end_time = 230000 
		,@schedule_uid = @LS_BackUpScheduleUID OUTPUT 
		,@schedule_id = @LS_BackUpScheduleID OUTPUT 

EXEC msdb.dbo.sp_attach_schedule 
		@job_id = @LS_BackupJobId 
		,@schedule_id = @LS_BackUpScheduleID  

EXEC msdb.dbo.sp_update_job 
		@job_id = @LS_BackupJobId 
		,@enabled = 1 


END 


EXEC msdb.dbo.sp_update_job 
		@job_id = N'46b4cca0-8420-4fbf-ae1e-087f4a7f712e' 
		,@enabled = 1 

EXEC master.dbo.sp_add_log_shipping_alert_job 

EXEC master.dbo.sp_add_log_shipping_primary_secondary 
		@primary_database = N'NetTraffic_db' 
		,@secondary_server = N'DBSECONDARY' 
		,@secondary_database = N'NetTraffic_db' 
		,@overwrite = 1 

-- ****** End: Script to be run at Primary: [DBPRIMARY]  ******


Revision

And, here is the revised code.



begin tran


	-- Execute the following statements at the Primary to configure Log Shipping 
	-- for the database [DBPRIMARY].[NetTraffic_db],
	-- The script needs to be run at the Primary in the context of the [msdb] database.  
	------------------------------------------------------------------------------------- 
	-- Adding the Log Shipping configuration 

	-- ****** Begin: Script to be run at Primary: [DBPRIMARY] ******


	DECLARE @LS_BackupJobId	AS uniqueidentifier 
	DECLARE @LS_PrimaryId	AS uniqueidentifier 
	DECLARE @SP_Add_RetCode	As int 


	EXEC @SP_Add_RetCode = master.dbo.sp_add_log_shipping_primary_database 
			@database = N'NetTraffic_db' 
			,@backup_directory = N'E:\SQLBackups\MSSQL\NetTraffic_db' 
			,@backup_share = N'\\DBPRIMARY\SQLBackups\MSSQL\NetTraffic_db' 
			,@backup_job_name = N'LogShipping_NetTraffic_db_log' 
			,@backup_retention_period = 2160
			,@backup_threshold = 60 
			,@threshold_alert_enabled = 1
			,@history_retention_period = 5760 
			,@backup_job_id = @LS_BackupJobId OUTPUT 
			,@primary_id = @LS_PrimaryId OUTPUT 
			,@overwrite = 1 


	IF (@@ERROR = 0 AND @SP_Add_RetCode = 0) 
	BEGIN 

	DECLARE @LS_BackUpScheduleUID	As uniqueidentifier 
	DECLARE @LS_BackUpScheduleID	AS int 


	EXEC msdb.dbo.sp_add_schedule 
			@schedule_name =N'LogShipping_NetTraffic_db_log' 
			,@enabled = 1 
			,@freq_type = 4 
			,@freq_interval = 1 
			,@freq_subday_type = 4 
			,@freq_subday_interval = 15 
			,@freq_recurrence_factor = 0 
			,@active_start_date = 20100616 
			,@active_end_date = 99991231 
			,@active_start_time = 20000 
			,@active_end_time = 230000 
			,@schedule_uid = @LS_BackUpScheduleUID OUTPUT 
			,@schedule_id = @LS_BackUpScheduleID OUTPUT 

	EXEC msdb.dbo.sp_attach_schedule 
			@job_id = @LS_BackupJobId 
			,@schedule_id = @LS_BackUpScheduleID  

	EXEC msdb.dbo.sp_update_job 
			@job_id = @LS_BackupJobId 
			,@enabled = 1 


	END 


        /*
            Replace hard-coded Job ID with value returned from master.dbo.sp_add_log_shipping_primary_database 
        */
	EXEC msdb.dbo.sp_update_job 
			--@job_id = N'46b4cca0-8420-4fbf-ae1e-087f4a7f712e' 
			 @job_id = @LS_BackupJobId
			,@enabled = 1 

	EXEC master.dbo.sp_add_log_shipping_alert_job 

	EXEC master.dbo.sp_add_log_shipping_primary_secondary 
			@primary_database = N'NetTraffic_db' 
			,@secondary_server = N'DBSECONDARY' 
			,@secondary_database = N'NetTraffic_db' 
			,@overwrite = 1 

	-- ****** End: Script to be run at Primary: [DBPRIMARY]  ******


rollback tran


Quick Explanation

  1. A quick explanation is that the hard-coded Job ID (passed to msdb.dbo.sp_update_job) does not match the actual Job ID of the backup job that was created
  2. To correct this, we use the Job ID returned via the @backup_job_id OUTPUT parameter of the master.dbo.sp_add_log_shipping_primary_database stored procedure (see the verification sketch after this list)
  3. As a safeguard, we enclosed everything in a transaction
  4. Once you have confirmed that no errors are returned, replace rollback tran with commit tran
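
After committing, a quick sanity check along the lines of the earlier “List MSDB Jobs” query confirms that the backup job exists and is enabled; the job name below is the one used in the script:


	select
		  job_id
		, [name]
		, [enabled]
		, date_created
	from msdb.dbo.sysjobs
	where [name] = N'LogShipping_NetTraffic_db_log';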

YouTube – Upload – “Stuck at Processing 95%”

Prelude

I am trying to upload a video segment to YouTube, but the upload is stuck at 95%.

 

Error Message

Image

Processing95Percent

Textual

Processing 95%

 

Download Video Segment

Using ClipConverter.cc

I went to http://www.clipconverter.cc/, entered the YouTube video URL, and entered the begin and end times.

Here is the on-screen progress:

Enter Video Segment

DownloadYouTubeVideoSegment

 

Confirm Video Segment

DownloadSegment

 

Conversion Progress

ConversionProgress

 

Conversion Successful

ConversionSuccessful-AVI

Save As

SaveAs

 

Upload Video Segment

Use YouTube

Nothing fancy here; we simply used YouTube’s own video upload utility to upload the downloaded AVI file.

No Error Message…Yet stuck at 95%

StuckAt95Percent

By the way, the URL for YouTube’s upload page is https://www.youtube.com/upload.

 

Workaround

A good workaround is to return to http://www.clipconverter.cc/ and enter the URL and the video’s start and end times again.

But use a different download format.  In our case, we chose MP4 as the conversion format.

Using ClipConverter.cc

Enter Video Specification

EnterVideoURL

 

Confirm Specification

ConfirmSpecification

 

Processing

Upload-Processing

Complete

UploadComplete

Summary

So http://www.clipconverter.cc/ works well for capturing YouTube video segments, but one might have to try a different format.
We were trying AVI; we needed to use MP4 instead.

Monsieur M. Going Home

Prelude

Thursday was a tough day at the Office.

Here is why…

Hi everyone. Monsieur M’s health declined rapidly over the weekend. By Tuesday he was almost completely unable to stand or walk. He wasn’t in much pain, and he wasn’t very sad, he was just tired and weak. On Wednesday we went for a nice ride in the car—one of his absolute favorite things to do—and he ate a 3×2 and a cheeseburger from In-N-Out, fries and a strawberry milkshake, and two chocolate cupcakes. Then a mobile vet came to my house and put Monsieur M. to sleep. He died in my arms, peaceful and content, snoring like a baby.

Thank you all for being such good friends to Monsieur M. He loved you all, and he loved being the office dog here. He had a very full and happy life.

I’m sorry to bring you such terrible news today. Please feel absolutely free to talk to me about this whenever you want and in whatever way you want—I’m completely comfortable talking about it, although I will probably cry.

And, for the record, as soon as you all are ready, I’m ready to see other dogs in the office. I’m sure Monsieur R will be happy to have more time in the office, and I can’t wait to meet Ms. Sierra and Ms. Catherine dogs.

Love,

Monsieur N.

 

Listening

Smokey Robinson – Easy to Love

SQL Server – Query Plan – High Compile Times Queries

Background

Nothing original here. We are looking for the more complex SQL queries in our system.

And we found this nice gem from Jonathan Kehayias.

Code



/*
	Identifying High Compile Time Statements from the Plan Cache
	-- Find high compile resource plans in the plan cache
*/


SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
Declare @NumberofRecords int;

set @NumberofRecords = 1E3;

; WITH XMLNAMESPACES 
(DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
, cteQP
as
(

	SELECT 
			 QueryHash
				= c.value('xs:hexBinary(substring((@QueryHash)[1],3))', 'varbinary(max)') 

			, QueryPlanHash
				= c.value('xs:hexBinary(substring((@QueryPlanHash)[1],3))', 'varbinary(max)')

			, [statementID]
				= c.value('(@StatementId)[1]', 'int')

			, CompileTime_ms
				= c.value('(QueryPlan/@CompileTime)[1]', 'int') 

			, CompileCPU_ms
				= c.value('(QueryPlan/@CompileCPU)[1]', 'int') 

			, CompileMemory_KB
				= c.value('(QueryPlan/@CompileMemory)[1]', 'int') 

			, queryPlan
				= qp.query_plan

			, planHandle
				= cp.plan_handle

			, [Operation]
				=
				(
					cast(n.c.query('.') as xml)
				)

			, qp.[dbid]

			, qp.[objectid]

			, cp.cacheobjtype

			, cp.objtype

			, [referencedCounts]
				= cp.refcounts

			, [usedCounts]
				= cp.usecounts

			, [existSchemaSys]
				= qp.query_plan.exist('//ColumnReference[@Schema!="[sys]"]')

			, [existSchemaEmpty]
				= qp.query_plan.exist('//ColumnReference[@Schema!=""]')

			, [IsMissingIndex]
		        = 
                 CAST
                    (
					     qp.[query_plan].exist('//MissingIndexes') 
                            AS TINYINT
                    )

	FROM sys.dm_exec_cached_plans AS cp

	CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp

	CROSS APPLY qp.query_plan.nodes('ShowPlanXML/BatchSequence/Batch/Statements/StmtSimple') AS n(c)


)


, cteQPAggr
as
(

	SELECT 
			   QueryHash
				= max(QueryHash)

			, QueryPlanHash
				= max(QueryPlanHash)

			, [statementID]

			, CompileTime_ms
				= sum(CompileTime_ms)

			, CompileCPU_ms
				= sum(CompileCPU_ms)

			, CompileMemory_KB
				= sum(CompileMemory_KB)

			, queryPlan
				= max(cast(queryPlan as nvarchar(max)))

			, planHandle
				= planHandle

			, [Operation]
				= max(cast([Operation] as nvarchar(max)))

			, [dbid]
				= isNull([dbid], -1)

			, [objectid]
				= isNull([objectid], -1)

			, [cacheobjtype]

			, [objtype]

			, [referencedCounts]
				= sum([referencedCounts])

			, [usedCounts]
				= sum([usedCounts])

			, [existSchemaSys]
				= max(cast([existSchemaSys] as tinyint))

			, [existSchemaEmpty]
				= max(cast([existSchemaEmpty] as tinyint))

			, [IsMissingIndex]
		        = avg(cast(IsMissingIndex as tinyint))

	FROM cteQP

	group by

			  planHandle
			, [statementID]
			, isNull([dbid], -1)
			, isNull([objectid], -1)
			, [cacheobjtype]
			, objtype

)

, cteQueryStat
as
(
	SELECT

			  tblQS.plan_handle

			, tblQS.[sql_handle]

			, [StatementID]
				= ROW_NUMBER() OVER (
										PARTITION BY tblQS.plan_handle
										 ORDER BY tblQS.statement_start_offset
									)

			, tblQS.[execution_count]

			, tblQS.[total_worker_time]

			, tblQS.max_elapsed_time

			, tblQS.max_worker_time

			, tblQS.[total_elapsed_time]

			, tblQS.[statement_start_offset]

			, tblQS.[statement_end_offset]
	
	FROM sys.dm_exec_query_stats tblQS

)
SELECT 
		TOP ( @NumberofRecords)

		  [database] = db_name(qp.[dbid])

		, ObjectName
			 = 
				object_schema_name(qp.[objectid], qp.[dbid])
				+ '.'
				+ object_name(qp.[objectid], qp.[dbid])

		, [cacheobjtype]

		, ObjectType
			= qp.objtype

		, [statementID]
			= qp.[statementID]

		, [planHandle]	
			= qp.planhandle

		, [queryPlan]	
			= cast(qp.queryPlan as xml)

		, [Operation]
			= cast(qp.Operation as xml)

		, qp.[referencedCounts]

		, qp.[usedCounts]

		, executionCounts
			= qs.execution_count

		, qp.CompileTime_ms

		, qp.CompileCPU_ms

		, qp.CompileMemory_KB

		, duration_ms
			= qs.total_elapsed_time/1000

		, cputime_ms
			= qs.total_worker_time/1000

		, avg_duration_ms
			= (qs.total_elapsed_time/qs.execution_count)/1000

		, avg_cputime_ms
			= (qs.total_worker_time/qs.execution_count)/1000 

		, max_duration_ms
			= qs.max_elapsed_time/1000

		, max_cputime_ms
			= qs.max_worker_time/1000 


		, StmtText
			= 
				case objtype

					when 'Adhoc' then sqlTextPH.[text]

					else
							SUBSTRING
							(
									st.[text]
								, (qs.statement_start_offset / 2) + 1
										, (CASE qs.statement_end_offset
												WHEN -1 THEN DATALENGTH(st.text)
												ELSE qs.statement_end_offset
											END - qs.statement_start_offset) / 2 + 1
							) 

				end

		, [IsMissingIndex]

FROM cteQPAggr AS qp

JOIN cteQueryStat AS qs
	--ON qp.QueryHash = qs.query_hash
	ON qp.planhandle = qs.plan_handle
	and qp.statementID = qs.StatementID	

OUTER APPLY sys.dm_exec_sql_text(qs.[sql_handle]) AS st

OUTER APPLY sys.dm_exec_sql_text(qp.planHandle) sqlTextPH

WHERE qp.[existSchemaSys] = 1

ORDER BY 

	qp.CompileTime_ms DESC

OPTION(RECOMPILE, MAXDOP 1)


;

Revision

The changes were very minor.

Our target version is SQL Server 2005, and unfortunately query and plan fingerprints (query_hash / query_plan_hash) were not introduced until SQL Server 2008.

And so we took Jonathan Kehayias’s nice lead on how to properly align sys.dm_exec_cached_plans, sys.dm_exec_query_plan, and sys.dm_exec_query_stats.

Sample Queries

Here are some of the queries identified

SliceBuildingRuleParms


SELECT
  parm,
  value,
  ID
FROM SliceBuildingRuleParms
WHERE sliceType = 'IC'
AND ruleID = 3
AND inout = 'O'
AND ID IN (SELECT
  ID
FROM SliceBuildingRuleParms
WHERE sliceType = 'IC'
AND ruleID = 3
AND parm = 'copy_part_type'
AND value = 'PARAGRAPH'
AND ID IN (SELECT
  ID
FROM SliceBuildingRuleParms
WHERE sliceType = 'IC'
AND ruleID = 3
AND parm = 'copy_part_width'
AND value = 100
AND ID IN (SELECT
  ID
FROM SliceBuildingRuleParms
WHERE sliceType = 'IC'
AND ruleID = 3
AND parm = 'section_name_lines'
AND value = 1
AND ID IN (SELECT
  ID
FROM SliceBuildingRuleParms
WHERE sliceType = 'IC'
AND ruleID = 3
AND parm = 'copy_part_height'
AND value = 'FULL'
AND ID IN (SELECT
  ID
FROM SliceBuildingRuleParms
WHERE sliceType = 'IC'
AND ruleID = 3
AND parm = 'copy_part_length_MIN'
AND CONVERT(int, value) <= 719 AND ID IN (SELECT ID FROM SliceBuildingRuleParms WHERE sliceType = 'IC' AND ruleID = 3 AND parm = 'copy_part_length_MAX' AND CONVERT(int, value) >= 719))))))

 

Dedicated

Again, nothing original here.

This post rests fully and solely on Jonathan Kehayias’s work.

Transact – Avoid WorkTable in DisJointed Joins

Background

I am looking through a Stored Procedure that duplicates existing source records.

It is a self-join, and unfortunately the source records do not have a logical relationship to the destination records; the relationship is defined by the user and is malleable.

Query

Original Query

Here is the original query

Code


	update [dbo].Student_BTWT 
	set 
	       paidhours=(
                            select paidhours 
                            from Student_BTWT 
                            where studentID=@tmpstudentID
                         )

	where studentID=@studentID

Query Plan

QueryPlan


Revised Query – Inner Join

Code


update tblBTWT

set     paidhours= tblBTWTParent.paidHours

from  [dbo].Student_BTWT tblBTWT

inner join [dbo].Student_BTWT tblBTWTParent

     on tblBTWTParent.studentID = @tmpstudentID

where tblBTWT.studentID = @studentID

Query Plan

QueryPlanInnerJoin

Revised Query – Cross Apply / Windowing Function

Code


update tblBTWT

set    paidhours= tblBTWTParent.paidHours

from  [dbo].Student_BTWT tblBTWT

cross apply
	(
	    select 
	       tblBTWTParent.*
	     , rn = ROW_NUMBER() OVER(ORDER BY [refID] DESC)

	    from   [dbo].Student_BTWT tblBTWTParent

	    where tblBTWTParent.studentID = @tmpstudentID

	) tblBTWTParent

where   tblBTWT.studentID = @studentID
and     tblBTWTParent.rn=1


Query Plan

CrossApplyWindowingFunction

Statistics I/O

Picture

StatisticsIO

 

Tabulated

 

 

Query            Table          Scan Count   Logical Reads
Original Query   Actual Table   2            9
                 Worktable      1            5
Inner Join       Actual Table   2            9
                 Worktable      -            -
Cross Apply      Actual Table   2            9
                 Worktable      -            -
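
For reference, here is a minimal sketch of how such numbers can be gathered, using the inner-join revision as the example. The student ID values and the int data type are hypothetical placeholders, and the transaction is rolled back so the measurement does not change any data:


	set statistics io on;

	declare @studentID    int;   -- hypothetical values, for illustration only
	declare @tmpstudentID int;

	set @studentID    = 1;
	set @tmpstudentID = 2;

	begin tran;

		update tblBTWT
		set    paidhours = tblBTWTParent.paidHours
		from   [dbo].Student_BTWT tblBTWT
		inner join [dbo].Student_BTWT tblBTWTParent
		       on  tblBTWTParent.studentID = @tmpstudentID
		where  tblBTWT.studentID = @studentID;

	-- roll back so the measurement leaves the table untouched
	rollback tran;

	set statistics io off;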

Summary

The original query uses a worktable to store values from the sub-select, as well as a Table Spool (unfortunately, an “Eager Spool”).

Our corrected options need neither worktables nor spools.

But the presence of the “No Join Predicate” warning notifies us that we do not have real joins.

 

References

  1. Using CROSS APPLY to optimize joins on BETWEEN conditions
    http://sqlblog.com/blogs/alexander_kuznetsov/archive/2009/07/07/using-cross-apply-to-optimize-joins-on-between-conditions.aspx

  2. When should I use Cross Apply over Inner Join?
    http://stackoverflow.com/questions/1139160/when-should-i-use-cross-apply-over-inner-join