Technical: Quest Software – Benchmark Factory for Databases – Use Case Scenario – Quick Evaluation of MS SQL Server v2014 – In-Memory Database – Simple Table and Programmable Objects

 

Background

We wanted a quick familiarization with MS SQL Server v2014 In-Memory Database tables and programmable objects.

 

Tools

There are a few tools on the market.  Our candidates include:

  • Microsoft – RML Utilities – ostress.exe
  • Quest Software – Benchmark factory for Databases
  • Quest Software – Benchmark factory for Databases Freeware

As this is for home evaluation, I was happy to find out that Quest has made a freeware version available.

In this post, we will focus our attention on this tool.

 

Download

We downloaded “Benchmark Factory for Databases Freeware” from http://www.toadworld.com/m/freeware/555.aspx .  Please keep in mind that you need a Quest account.  As it is easy and free to get one, please register if you do not have an existing account.

Depending on when you attempt the download, the exact download link might have changed.  If so, I suggest that you visit http://www.toadworld.com/products/benchmark-factory/default.aspx , the more general product web site.

 

Environment Setup – Microsoft SQL Server

On a MS SQL Server 2014 instance, let us set up our SQL database objects.

 

Add File Group – In Memory

Check sys.filegroups and see if there are any entries with type = 'FX'.  If not, create one.


set nocount on
set noexec off
go

use [DBLabInMemory]
go

/*
  Check sys.filegroups for type=FX -- HasMemoryOptimizedObjects
*/
if exists
	(
		select top 100 percent
				   tblFileGroup.name
				,  tblFileGroup.[type] 
				, tblFileGroup.type_desc
		from   sys.filegroups tblFileGroup
		where  tblFileGroup.[type] = 'FX'
		order  by tblFileGroup.[type]
	)
begin
	print 'Database - FileGroup - fileGroupInMemoryFileGroup exists!'
	set noexec on
end
go


ALTER DATABASE [DBLabInMemory]
	ADD FILEGROUP fileGroupInMemoryFileGroup
		CONTAINS MEMORY_OPTIMIZED_DATA 
		;

go

set noexec off
go



 

 

Add In-Memory File to File Group (MEMORY_OPTIMIZED_DATA)

Add an in-memory file (container) to the file group:


set nocount on
set noexec off

use [DBLabInMemory]
go


if not exists
 (
 select *
 from master.sys.sysdatabases
 where name = 'DBLabInMemory'
 )
begin
    print 'Database - DBLabInMemory does not exist!'
    raiserror( 'Database - DBLabInMemory does not exist!', 16, 1)
    set noexec on
end
go


if exists
 (
     select *
     from sys.master_files
     where database_id = db_id() --db_id('DBLabInMemory')
     and (
                (name = 'datafile_InMemory_0001')
             or (physical_name = 
 'E:\Microsoft\SQLServer\DatafilesInMemory\DBLabInMemory\datafile___InMemory__0001.ndf'
                )
          )
   )
begin

 print 'Database - DBLabInMemory -- file already exists!'
 raiserror( 'Database - DBLabInMemory -- file already exists!', 16, 1)
 set noexec on
end
go

ALTER DATABASE [DBLabInMemory]
 ADD FILE 
 (
    name='datafile_InMemory_0001'
 , filename='E:\Microsoft\SQLServer\DatafilesInMemory\DBLabInMemory\datafile___InMemory__0001.ndf'
 ) 
 TO FILEGROUP [fileGroupInMemoryFileGroup]


go

set noexec off
go
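To confirm that the container was added, here is a quick sanity check; this is a sketch, relying on the fact that memory-optimized data containers surface in sys.database_files with a type_desc of FILESTREAM.

```sql
use [DBLabInMemory]
go

-- Memory-optimized data containers are reported as FILESTREAM files
select  name
      , physical_name
      , type_desc
from    sys.database_files
where   type_desc = 'FILESTREAM'
go
```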


 

Create Traditional Table – dbo.customerTraditional



USE [DBLabInMemory]
GO


CREATE TABLE [dbo].[CustomerTraditional]
(
	[ID] [int] NULL,
	[Firstname] [varchar](50) NULL,
	[Lastname] [varchar](50) NULL,
	[EmailAddress] [varchar](50) NULL,
	[Country] [varchar](50) NULL,
	[IPAddress] [varchar](20) NULL,
	[Company] [varchar](50) NULL,
	[City] [varchar](50) NULL,
	[PhoneWork] [varchar](50) NULL,
	[StreetAddress] [varchar](50) NULL,
	[Username] [varchar](50) NULL,
	[UniqueID] [varchar](40) NOT NULL,
	 CONSTRAINT [PK_UniqueID] PRIMARY KEY CLUSTERED 
	(
		[UniqueID] ASC
	)WITH (
                  PAD_INDEX = OFF
                , STATISTICS_NORECOMPUTE = OFF
                , IGNORE_DUP_KEY = OFF
                , ALLOW_ROW_LOCKS = ON
                , ALLOW_PAGE_LOCKS = ON
              )
	      ON [PRIMARY]

) ON [PRIMARY]

GO


 

 

Create In-Memory Table – dbo.customerInMemory

Create the In-Memory table (dbo.customerInMemory):


USE [DBLabInMemory]
GO


CREATE TABLE [dbo].[CustomerInMemory]
(
 [ID] [int] NULL,
 [Firstname] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [Lastname] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [EmailAddress] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [Country] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [IPAddress] [varchar](20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [Company] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [City] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [PhoneWork] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [StreetAddress] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [Username] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [UniqueID] [varchar](40) COLLATE Latin1_General_100_BIN2 NOT NULL,

    PRIMARY KEY NONCLUSTERED HASH 
      (
        [UniqueID]
      ) WITH ( BUCKET_COUNT = 256)

)
WITH  
   ( 
       MEMORY_OPTIMIZED = ON
    , DURABILITY = SCHEMA_AND_DATA 
   )

GO

SET ANSI_PADDING OFF
GO
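Before moving on, we can verify that the table really is memory-optimized.  A minimal check, assuming both tables were created as above (is_memory_optimized and durability_desc are exposed in sys.tables from SQL Server 2014 onward):

```sql
use [DBLabInMemory]
go

-- Confirm which tables are memory-optimized and their durability setting
select  tbl.name
      , tbl.is_memory_optimized
      , tbl.durability_desc
from    sys.tables tbl
where   tbl.name in ('CustomerTraditional', 'CustomerInMemory')
go
```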
 

 

 

Create Stored Procedure dbo.usp_InsertCustomerTraditional

SP – dbo.usp_InsertCustomerTraditional


USE [DBLabInMemory]
GO

if object_id('[dbo].[usp_InsertCustomerTraditional]') is null
begin
 exec ('create procedure [dbo].[usp_InsertCustomerTraditional] as begin print ''undefined'' end ')
end
GO


ALTER PROCEDURE [dbo].[usp_InsertCustomerTraditional]
(
 @ID int 
 ,@Firstname varchar(50)
 ,@Lastname varchar(50)
 ,@EmailAddress varchar(50)
 ,@Country varchar(50)
 ,@IPAddress varchar(20) = null
 ,@Company varchar(50) = null
 ,@City varchar(50) = null
 ,@PhoneWork varchar(50) = null
 ,@StreetAddress varchar(50) = null
 ,@Username varchar(50) = null
 ,@UniqueID varchar(50) = null
)
with execute as owner
as 
begin

 insert into [dbo].[CustomerTraditional]
 (
 [ID]
 ,[Firstname]
 ,[Lastname]
 ,[EmailAddress]
 ,[Country]
 ,[IPAddress]
 ,[Company]
 ,[City]
 ,[PhoneWork]
 ,[StreetAddress]
 ,[Username]
 ,[UniqueID]
 )
 values
 (
 @ID
 ,@Firstname
 ,@Lastname
 ,@EmailAddress
 ,@Country
 ,@IPAddress
 ,@Company
 ,@City
 ,@PhoneWork
 ,@StreetAddress
 ,@Username
 ,NEWID() --@UniqueID
 )

end

GO 
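Before wiring the procedure into Benchmark Factory, a quick smoke test may be worthwhile.  The sample values below are illustrative only, and the rollback keeps the table clean:

```sql
use [DBLabInMemory]
go

begin transaction

    -- Sample argument values are illustrative only
    exec dbo.usp_InsertCustomerTraditional
          @ID           = 1
        , @Firstname    = 'John'
        , @Lastname     = 'Doe'
        , @EmailAddress = 'john.doe@example.com'
        , @Country      = 'USA'

    select count(*) as [numberOfRows]
    from   dbo.CustomerTraditional

rollback transaction
go
```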

 

 

Create Stored Procedure dbo.usp_InsertCustomerInMemory

SP – dbo.usp_InsertCustomerInMemory


USE [DBLabInMemory]
GO

if object_id('[dbo].[usp_InsertCustomerInMemory]') is not null
begin
  DROP PROCEDURE [dbo].[usp_InsertCustomerInMemory]
end
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO


create procedure [dbo].[usp_InsertCustomerInMemory]
(
  @ID int 
 ,@Firstname varchar(50)
 ,@Lastname varchar(50)
 ,@EmailAddress varchar(50)
 ,@Country varchar(50)
 ,@IPAddress varchar(20) = null
 ,@Company varchar(50) = null
 ,@City varchar(50) = null
 ,@PhoneWork varchar(50) = null
 ,@StreetAddress varchar(50) = null
 ,@Username varchar(50) = null
 ,@UniqueID varchar(50) = null
)
with native_compilation, schemabinding, execute as owner
as 
begin atomic with
(transaction isolation level = snapshot,
language = N'English')

 
 insert into [dbo].[CustomerInMemory]
 (
  [ID]
 ,[Firstname]
 ,[Lastname]
 ,[EmailAddress]
 ,[Country]
 ,[IPAddress]
 ,[Company]
 ,[City]
 ,[PhoneWork]
 ,[StreetAddress]
 ,[Username]
 ,[UniqueID]
 )
 values
 (
 @ID
 ,@Firstname
 ,@Lastname
 ,@EmailAddress
 ,@Country
 ,@IPAddress
 ,@Company
 ,@City
 ,@PhoneWork
 ,@StreetAddress
 ,@Username
 ,NEWID() --@UniqueID
 )

end

GO
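We can also confirm that the procedure was compiled natively.  A sketch, relying on the uses_native_compilation column exposed in sys.sql_modules from SQL Server 2014 onward:

```sql
use [DBLabInMemory]
go

select  object_name(sqlModule.object_id) as [procedureName]
      , sqlModule.uses_native_compilation
from    sys.sql_modules sqlModule
where   sqlModule.object_id = object_id('[dbo].[usp_InsertCustomerInMemory]')
go
```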



 

 

 

Installation

Installation is straightforward, and the product is supported even on my old MS Windows 2003 x86 box.

 

Usage

Launch product

 

Create Scenario

 

Welcome

Here is the welcome screen

Welcome

 

Select Load Scenario

As we are targeting a “Microsoft SQL Server” database, choose “Microsoft SQL Server” in the Database drop-down.

 

SelectLoadScenario__MicrosoftSQLServer

 

We have a basic choice of Test Types that we can run.  Our choices are:

  • Industry Standard benchmarks
  • Test Database scalability
  • Test SQL for scalability
  • Capture and Replay SQL Server workload

Here is Quest’s explanation for each Scenario:

  • Industry Standard benchmarks – Creates load scenarios from industry-standard benchmarks (AS3AP, Scalable Hardware, TPC-C, TPC-B, TPC-D, TPC-E, TPC-H). This includes the steps to create/load all the required benchmark objects to execute the standard benchmark workloads. These synthetic workloads can be used when no real-world data is available to be imported for database load testing.
  • Test Database scalability – Creates load scenarios to test database scalability using OLTP, data warehousing, and mixed workloads. This utilizes benchmark objects and transactions from the TPC-C and TPC-H benchmarks.
  • Test SQL for scalability – A SQL Scalability test compares the performance of SQL statement variations under a load. Each transaction executes individually for each user load for a specified number of executions or length of time.
  • Capture and Replay SQL Server workload – Creates a workload which replays a captured MS SQL Server workload from a trace file obtained from a shared directory.

 

 

After a quick think, here is our understanding of the various items:

  • “Industry Standard benchmarks” / “Test Database Scalability” – We would choose one of these options if we wanted to run a standard test against our server
    • In our case, we want a very simple test
    • Also, in its initial iteration, Microsoft’s In-Memory database is a hybrid product and does not have the option of implicitly defining all objects as in-memory
  • Test SQL for Scalability – supports our need for small, well-defined tables, programmable objects, and SQL queries
  • Capture and Replay SQL Server workload – Choose this option if you have captured SQL Server trace files using SQL Server Profiler or a server-side trace

 

Please consider clicking the “Show Advanced” check-box if you would like to see and review additional options.

Again, we will choose “Test SQL for Scalability”.

Define Iteration Length

The next screen depends on which Scenario you chose earlier.  As we chose “Test SQL for Scalability”, we are now in the “Define Iteration Length” window.

DefineIterationLength_v2

 

We chose the “Number of Executions per iteration” option and changed the default of 5 to 1000 executions.

This basically means that we will cycle through our workload 1000 times.

We will skip the step of adding new User Scenario / adding SQL Statement at this time.

Measurement Intervals

The “Measurement Intervals” screen basically defines our virtual workload; in essence, the number of virtual users that we will have at each stage, keeping in mind that we ramp up gradually.

Before:

MeasurementIntervals_Before

 

After:

MeasurementIntervals

 

Select new job to

As always, choose a good, descriptive name for your job.  We chose “MicrosoftSQLServer-InMemory-SimpleTableAndProgrammableObjects”.

 

 

SelectNewJobTo_v2

 

Congratulations

Congratulations

 

 

Define Profile

Empty Job View

We are taken to an empty job view screen.  And, we will now define our Job.

JobsView

 

 

Define Profile

Profiles are basically targeted Database Connections.

Let us go define a new one and target our MS SQL Server Instance.

We initiate that process by clicking on the “Add Profile” main menu option.

 

Welcome

welcome

 

Driver Selection

We choose “Microsoft SQL Server (ODBC)”

DriverSelection

 

Connection Information

Select the Data Source Name, then enter the SQL Server username and password.

 

ConnectionInformation

 

Profile Name

Please enter a name for your Profile.  In our case, we chose MSSQLInMemory.

ProfileName

 

Congratulations

You can go back and review your selections; once you’re good, please click on the Finish button.

Congratulations

 

Define Scenario

Let us return to defining our Scenario.

 

Specify Profile

In the “Job Setup” \ “General” tab, choose your target profile.

selectProfile

 

Load Setup \ Transactions

In the “Load Setup” \ “Transactions” tab, at this stage of our work, you will likely see an empty panel, free of any defined transactions.

Empty Transactions Panel

LoadSetup-emptyPanel

 

Add Transactions

Right-click on the empty panel and, from the drop-down menu, choose the “New Transactions” option.

AddTransaction

 

SQL Statement
INSERT INTO [dbo].[CustomerTraditional]
(
[ID],[Firstname],[Lastname],[EmailAddress]
,[Country],[IPAddress],[Company]
,[City],[PhoneWork],[StreetAddress]
,[Username],[UniqueID]
)
select
$BFRand(1000000)
, '$BFLeft($BFFirstName(),20)'
, '$BFLeft($BFLastName(),20)'
, '$BFLeft($BFEMail(), 50)'
, '$BFCountry()'
, null
, '$BFLeft($BFCompany(),50)'
, '$BFLeft($BFCity(), 50)'
, '$BFLeft($BFPhone(), 50)'
, '$BFLeft($BFAddress(), 50)'
, '$BFLeft($BFFirstName(),20)'
, NEWID()

 

I suggest that you use “Execute Statement” to execute and validate the SQL.
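For orientation, here is roughly what one generated statement might look like after Benchmark Factory substitutes the $BF macros; all the values below are made up purely for illustration.

```sql
-- Illustrative only; actual generated values vary per execution
INSERT INTO [dbo].[CustomerTraditional]
(
[ID],[Firstname],[Lastname],[EmailAddress]
,[Country],[IPAddress],[Company]
,[City],[PhoneWork],[StreetAddress]
,[Username],[UniqueID]
)
select
742318
, 'Mary'
, 'Smith'
, 'mary.smith@example.com'
, 'United States'
, null
, 'Acme Corp'
, 'Springfield'
, '555-0147'
, '12 Main Street'
, 'Mary'
, NEWID()
```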

 

Real World Latencies

Here we specify the latency between each transaction.

RealWorldLatencies

 

Congratulations

Click OK to continue

Congratulations

Add Transactions – Stored Procedure – Invocation of traditional Stored Procedure

Right-click on the newly added transaction and, from the drop-down menu, choose the “New Transactions” option.

Here we invoke our stored procedure (dbo.usp_InsertCustomerTraditional):

SQL Statement
exec dbo.usp_InsertCustomerTraditional
$BFRand(1000000)
, '$BFLeft($BFFirstName(),20)'
, '$BFLeft($BFLastName(),20)'
, '$BFLeft($BFEMail(), 50)'
, '$BFCountry()'
, null
, '$BFLeft($BFCompany(),50)'
, '$BFLeft($BFCity(), 50)'
, '$BFLeft($BFPhone(), 50)'
, '$BFLeft($BFAddress(), 50)'
, '$BFLeft($BFFirstName(),20)'
, null -- uniqueID
Add Transactions – SQL Statement – Insert into In-Memory Table

Right-click on the newly added transaction and, from the drop-down menu, choose the “New Transactions” option.

 

SQL Statement
INSERT INTO [dbo].[CustomerInMemory]
(
[ID],[Firstname],[Lastname],[EmailAddress]
,[Country],[IPAddress],[Company]
,[City],[PhoneWork],[StreetAddress]
,[Username],[UniqueID]
)
select
$BFRand(1000000)
, '$BFLeft($BFFirstName(),20)'
, '$BFLeft($BFLastName(),20)'
, '$BFLeft($BFEMail(), 50)'
, '$BFCountry()'
, null
, '$BFLeft($BFCompany(),50)'
, '$BFLeft($BFCity(), 50)'
, '$BFLeft($BFPhone(), 50)'
, '$BFLeft($BFAddress(), 50)'
, '$BFLeft($BFFirstName(),20)'
, NEWID()
Add Transactions – Stored Procedure – Invocation of In-Memory Stored Procedure

Right-click on the newly added transaction and, from the drop-down menu, choose the “New Transactions” option.

Here we invoke our stored procedure (dbo.usp_InsertCustomerInMemory):

SQL Statement
exec dbo.usp_InsertCustomerInMemory
$BFRand(1000000)
, '$BFLeft($BFFirstName(),20)'
, '$BFLeft($BFLastName(),20)'
, '$BFLeft($BFEMail(), 50)'
, '$BFCountry()'
, null
, '$BFLeft($BFCompany(),50)'
, '$BFLeft($BFCity(), 50)'
, '$BFLeft($BFPhone(), 50)'
, '$BFLeft($BFAddress(), 50)'
, '$BFLeft($BFFirstName(),20)'
, null -- uniqueID

 

 

Run Job

Once we have defined our transactions and attached our job to a profile (target database), we can run the job.

To do so, we can use the menu (Jobs \ Run) or the top menu icon:

runjob

 

Review Job

We can review our running job via a few lenses:

  • Internally
  • SQL Server Profiler

Benchmark

Within the Jobs view, access the “Run Job” \ “Summary” tab.

As we are starting out, we see that we are at our initial userload of 1.

Userload of 1

reviewRunningJob-Userload-1

Userload of 10

reviewRunningJob-Userload-10

As we have increased our userload to 10, we have new metrics:

  • Transactions per second (TPS) increased from 0.97 to 9.83
  • Our average time is 0.007 seconds

Userload of 20

reviewRunningJob-Userload-20

 

As we have increased our userload to 20, we have new metrics:

  • Transactions per second (TPS) increased to 17.24
  • Our average time is now 0.123 seconds

 

Database Profiler

SQL Server Profiler

We will use SQL Server’s native profiling tool, SQL Server Profiler, to get an internal view of how our statements are being executed.

Here is what a traditional SQL Insert statement looks like.

SQLStatment-InsertIntoCustomerTraditional

 

Hopefully you will notice a few things:

  • Our SQL statements are prepared
  • Because they are prepared, the SQL statements will be a bit faster than unprepared statements

 

See Results

Depending on the complexity of your transactions, the number of transactions you would like to run, and the workload sizes, your job might take a while to run.

To see current and past results, please access the “Job View” \ “See Results” panel.

To view a report, please choose the report and click on the viewReport icon.

You can also select the report, right-click on your selection, and choose “Show Test Results”.

 

Results Summary

Our summary shows the best performing transaction is the “SQLStatementSP-InMemory”.

Transaction Time (baseline)

 

Results Summary – Transaction Time

I thought we should dig deeper, and so I double-clicked on the graph above.

Result Summary Graph - Trasaction Time

 

The graph above demonstrates that In-Memory transactions have appreciably less transaction time than traditional transactions.

 

Results – Transactions per second

Transactions per second

 

As expected, as we increased the userload, we processed more transactions.
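As a rough sanity check on the load itself, we can compare how many rows each transaction type actually wrote.  A sketch, assuming the two tables created earlier:

```sql
use [DBLabInMemory]
go

-- Compare row counts between the traditional and in-memory tables
select 'CustomerTraditional' as [tableName], count(*) as [numberOfRows]
from   dbo.CustomerTraditional
union all
select 'CustomerInMemory', count(*)
from   dbo.CustomerInMemory
go
```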

 

Results – Deadlocks

Thankfully, we are experiencing zero deadlocks.

Relative Summary Graph - Deadlocks

 

 

Report – Export To Word

Once you have a report up, you might want to export the report to Word.

To do so, please click on the “W” icon at the top of a report.

exportReportToWord

Unfortunately for us, we received an error message:

Textual:

Please make sure the Microsoft Word is installed correctly.  And, the active X control is enabled.

Image:

PleaseMakeSureTheMicrosoftWordIsInstalledCorrectly - Add the ActiveX control is enabled

 

I downloaded Microsoft Word Viewer.  Upon trying to install it, I ran into more problems.

Textual:

Please wait while Setup finishes determining your disk space requirements.

Image:

pleaseWait

 

To fix, try:

wordview_en-us.exe /passive /norestart

This means that you will not be prompted for a destination folder, etc.

The install worked.

I returned to the report viewer, but still no go!  It seems I need the full version of the product.

 

Report – Export To PDF

I downloaded Foxit Reader, installed the application, and was able to return to Quest Benchmark Factory and print the report to the Foxit printer.

SelectPrinter

 

I reviewed the generated PDF file.  But, as it is static, I was unable to dig deeper into the charts.

 

Source Control

GitHub

A few of the files have been pushed to GitHub.

The URL:
https://github.com/DanielAdeniji/QuestBenchmarkFactoryMSSQLInMemorySimple

 

Summary

I am really impressed with Quest Benchmark Factory.

It has a very rich toolset for generating test data.

It has a firm foundation for packaging workloads and applying them to a database.

And, it prepares very polished and applicable reports.

