SQL Tip: Identifying potential columnstore index candidate tables in your databases for major query performance improvements

Columnstore indexes appeared in SQL Server 2012, and by SQL Server 2016 and beyond they had progressed much further, with fewer restrictions on their usage than before. This storage mechanism for indexes and data provides significant performance gains, especially (though not only) in data-warehouse workloads.

There are certain criteria defined to help database professionals decide whether a clustered columnstore index (CCI) is an option worth trying on their tables for performance gains.

Recently the SQL Server Tiger team at Microsoft wrote a nice little script incorporating these selection criteria; it simply lists the tables that should be considered in whichever database the script is run.

The script makes sure that a table meets the criteria of size, allowed data types, the type of DML being executed, and the workload on the table.

Obviously, any real scenario can be different or far more complex than this, but the script is a great starting point for someone like me who is starting to think of ways to identify potential candidates for this significantly improved tool.

I will post separately about the basics of columnstore indexes and the kinds of performance improvements they bring.

Without further ado, here is the script, with ample comments thankfully put in by its creators.


-------------------------------------------------------
-- The queries below need to be executed per database.
-- Also, please make sure that your workload has run for
-- couple of days or its full cycle including ETL etc
-- to capture the relevant operational stats
-------------------------------------------------------
-- picking the tables that qualify CCI
-- Key logic is
-- (a) Table does not have CCI
-- (b) At least one partition has > 1 million rows and does not have unsupported types for CCI
-- (c) Range queries account for > 50% of all operations
-- (d) DML Update/Delete operations < 10% of all operations
select table_id, table_name
from (select quotename(object_schema_name(dmv_ops_stats.object_id)) + N'.' + quotename(object_name (dmv_ops_stats.object_id)) as table_name,
dmv_ops_stats.object_id as table_id,
SUM (leaf_delete_count + leaf_ghost_count + leaf_update_count) as total_DelUpd_count,
SUM (leaf_delete_count + leaf_update_count + leaf_insert_count + leaf_ghost_count) as total_DML_count,
SUM (range_scan_count + singleton_lookup_count) as total_query_count,
SUM (range_scan_count) as range_scan_count
from sys.dm_db_index_operational_stats (db_id(),
null,
null, null) as dmv_ops_stats
where (index_id = 0 or index_id = 1)
AND dmv_ops_stats.object_id in (select distinct object_id
from sys.partitions p
where data_compression <= 2 and (index_id = 0 or index_id = 1)
AND rows >= 1048576
AND object_id in (select distinct object_id
from sys.partitions p, sysobjects o
where o.type = 'u' and p.object_id = o.id))
AND dmv_ops_stats.object_id not in ( select distinct object_id
from sys.columns
where user_type_id IN (34, 35, 241)
OR ((user_type_id = 165 OR user_type_id = 167) and max_length = -1))
group by dmv_ops_stats.object_id
) summary_table
where ((total_DelUpd_count * 100.0/(total_DML_count + 1)) < 10.0)
AND ((range_scan_count * 100.0/(total_query_count + 1)) > 50.0)
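Once a candidate is identified, the conversion itself is a single statement. A minimal sketch (the table name is hypothetical; any existing clustered rowstore index must be dropped or replaced in the process):

```sql
-- Replace dbo.FactSales with your candidate table
CREATE CLUSTERED COLUMNSTORE INDEX CCI_FactSales
    ON dbo.FactSales;
```

On SQL Server 2014 and later, WITH (DROP_EXISTING = ON) can be used to convert an existing clustered rowstore index in place.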

For further details see https://blogs.msdn.microsoft.com/sql_server_team/columnstore-index-how-do-i-find-tables-that-can-benefit-from-clustered-columnstore-index/

Posted in Column Store Indexes, SQL Server, sql server performance, Uncategorized

SQL Monitoring Solution Development – SQL Performance Dashboard Reports – Part 2 Customisation for centralised remote monitoring of SQL instances

In Part 1 of this series on SQL Server Performance Monitoring Dashboard I provided an intro.

Now what we would like to do is run these reports remotely, without logging into a server or ever connecting to it via SQL Server Management Studio: more of a point-and-click experience with SSRS reports.

This is an important project and, if done correctly, it can save your shop a good amount of money otherwise spent on off-the-shelf monitoring tools, some of which cost hundreds of pounds per instance.

In future parts we will proceed to further monitoring, baselining and other aspects. I will shortly create a post tracking all these steps in a single place.

If at any point you find anything confusing, feel free to contact me.

For this, what we need to do, roughly, is:

  1. Download the reports
  2. Change connection parameters to be dynamic via parameters
  3. Upload them to a central SSRS server.
  4. Create supporting objects and code in the servers you want to monitor once via deployment script.
  5. Browse the report and see data via the central SSRS server and input the SQL instance names by providing it as the parameter value.

And…

  6. Create a central dashboard with a list of servers and pass instance names in links to the main dashboard, for a smooth point-and-click drill-down experience with the reports. I will cover this in Part 3.

Let’s go over these steps one by one:

  • Download the reports: use the links in Part 1.
  • Change the connection string to be dynamic via report parameters: refer to Fig A at the bottom.
  • Upload the reports to a central SSRS server.
  • Create the supporting objects and code, once, on each server you want to monitor via the deployment script.
  • Browse a report on the central SSRS server and supply the SQL instance name as the parameter value.



Fig A: Showing use of parameters to form a dynamic connection string
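For reference, the expression behind a dynamic connection string like the one in Fig A has roughly this shape (the parameter name ServerName is an assumption; substitute whatever you named yours):

```
="Data Source=" & Parameters!ServerName.Value & ";Initial Catalog=msdb"
```

The expression goes in the embedded data source of each report, so that every execution connects to whichever instance the user typed or picked as the parameter value.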


Posted in Monitoring, SQL Server, SQL Server Administration, sql server performance

How to get GDPR compliant with Microsoft

When the EU’s GDPR (General Data Protection Regulation) is introduced next year, the bar for data privacy protection in the UK will be raised.

For thousands of businesses GDPR highlights daunting issues of compliance. Organisations that fail to comply with the regulation will face massive fines: up to £17 million or four percent of global revenue, whichever is higher.

The May 25, 2018, deadline for GDPR compliance is drawing closer but many businesses, including some Microsoft Partners and customers, are unsure where to begin. Businesses cannot afford to waste time in GDPR preparation. Likewise, Microsoft Partners cannot waste the opportunity to help their customers get compliant.

As part of Inspire 2017, Microsoft’s annual global partner conference, Microsoft has released a set of assets that will help businesses and Microsoft partners achieve GDPR compliance. Let’s take a look at the new resources (and some older ones which deserve a mention).

Read the rest at:

https://blogs.technet.microsoft.com/uktechnet/2017/08/01/how-to-get-gdpr-compliant-with-microsoft/


Posted in gdpr, security, SQL Server, SQL Server Administration, Uncategorized

SQL Tip: Memory limit too low. Instance failing to start

Quick note.

Someone has mistakenly set the SQL Server startup parameters so that the memory cap is too low, and the instance is failing to start.

The startup log shows page/memory allocation errors of various types.

Add the ‘-f’ startup parameter (minimal configuration), start the instance, raise the memory cap, remove ‘-f’, and restart the service.
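A minimal sketch of the fix once the instance is up with -f and you can connect (the 8192 MB value is only an example; pick what suits the box):

```sql
-- Run after starting the instance with the -f (minimal configuration) startup parameter
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Raise the memory cap to a sensible value (example: 8 GB)
EXEC sp_configure 'max server memory (MB)', 8192;
RECONFIGURE;

-- Then remove -f and restart the service normally
```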

Posted in SQL Server, SQL Server Administration, SQL Tips, Uncategorized

SQL Server 2016 – Always Encrypted and the GDPR

Matthew McGiffen DBA

The European General Data Protection Regulation (GDPR) is coming, bringing new rules about the protection of Personally Identifiable Information (PII).

For those of us in Europe – or wishing to sell software products and solutions within Europe – this is a big deal. It’s going to mean an increased focus on data protection, and the potential for large penalties in the case of data breaches. Those penalties are also increased if we can’t show that we have taken sufficient measures to protect that data.

This affects us where we host applications and data for our clients, but also when we sell software to clients to use on-premise. They are going to want to be sure that the data in those applications is as secure as possible.

Security infrastructure and data governance are going to be of critical importance to everyone, but for software vendors it seems likely that Encryption is…


Posted in Uncategorized

SQL Central Management Server – Definition, Benefits and setup

Whether you manage one SQL Server instance or hundreds, you need to have a way to keep track of them all.

Just like best-practices when it comes to security, it would be nice to create logical groups of instances and manage them based on their group membership. Perhaps also inspect, or perform actions against groups of instances:

  • Run queries, including management-type activities like creating or running jobs.
  • Check if the settings match a pre-determined configuration, and if not, fix the settings

Both of the above can be achieved via Registered Servers and Central Management Servers; here is a comparison of the two:

Registered Servers vs Central Management Server:

  • Storage method: Registered Servers use an XML file in the Windows roaming profile; a Central Management Server stores the list in the msdb database.
  • List visibility: current user only, versus everyone who is authorized (view nothing, view all, edit all).
  • Security of the instance list: based on the Windows user context, versus managed by SQL Server permissions and msdb database roles.
  • Types of instances: Database Engine, SSAS, SSIS and SSRS, versus Database Engine only.
  • Connection string authentication: Windows Authentication and SQL Server Authentication, versus Windows Authentication only.

Source: http://voluntarydba.com/post/2012/11/21/Registered-Servers-and-Central-Management-Server-Setup.aspx

To make a DBA’s life easier, central management servers store a list of instances of SQL Server that is organized into one or more central management server groups. Actions taken by using a central management server group act on all servers in that group. This includes connecting to servers by using Object Explorer and executing Transact-SQL statements and Policy-Based Management policies on multiple servers at the same time.

Versions of SQL Server that are earlier than SQL Server 2008 cannot be designated as a central management server.

Setting up
The following procedures describe how to:

  1. Create a central management server.
  2. Add one or more server groups to the central management server and add one or more registered servers to the server groups.

Create a central management server

  1. In SQL Server Management Studio, on the View menu, click Registered Servers.
  2. In Registered Servers, expand Database Engine, right-click Central Management Servers, and then click Register Central Management Server.
  3. In the New Server Registration dialog box, select the instance of SQL Server that you want to become the central management server from the drop-down list of servers. You must use Windows Authentication for the central management server.
  4. In Registered Server, enter a server name and optional description.
  5. From the Connection Properties tab, review or modify the network and connection properties. For more information, see Connect to Server (Connection Properties Page) Database Engine.
  6. Click Test to test the connection.
  7. Click Save. The instance of SQL Server will appear under the Central Management Servers folder.

Create a new server group and add servers to the group

  1. From Registered Servers, expand Central Management Servers. Right-click the instance of SQL Server added in the procedure above and select New Server Group.
  2. In New Server Group Properties, enter a group name and optional description.
  3. From Registered Servers, right-click the server group and click New Server Registration.
  4. From New Server Registration, select an instance of SQL Server. For more information, see Create a New Registered Server (SQL Server Management Studio). Add more servers as appropriate.

To execute queries against several configuration targets at the same time, right-click the server group (or the central management server itself) in Registered Servers and click New Query; anything you run in that window executes against every instance in the group.
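As a quick sanity check of a multi-server query, any statement works; SSMS adds an extra column to the results identifying which instance each row came from. For example:

```sql
-- Run in a query window opened against a central management server group
SELECT @@SERVERNAME AS instance_name,
       @@VERSION    AS version_info;
```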

Posted in SQL Server, SQL Server Administration, Uncategorized

SQL Tip – Comma delimited list from a table column

If you want to extract a query result column into a variable as a comma-separated-values list, use the following example and modify accordingly.


declare @Dblist varchar(max)
set @Dblist = ''

-- concatenate every qualifying name, prefixing a comma
-- for every item after the first
select @Dblist =
    case when @Dblist = ''
         then name
         else @Dblist + coalesce(',' + name, '')
    end
from sys.databases dbs
where dbs.is_read_only = 0

In the example above we collect the names of the databases on the server that are not in read-only mode into a comma-separated string, so it can be fed to another stored procedure to do index maintenance.
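On SQL Server 2017 and later the same list can be built more directly with STRING_AGG:

```sql
declare @Dblist varchar(max);

select @Dblist = STRING_AGG(name, ',')
from sys.databases
where is_read_only = 0;
```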

Posted in sql, SQL Server, tsql

Microsoft SQL Server Performance Dashboard Reports – 1 – Intro & Default Install

Introduction
The SQL Server 2012 Performance Dashboard Reports are Reporting Services report files designed to be used with the Custom Reports feature of SQL Server Management Studio. The reports allow a database administrator to quickly identify whether there is a current bottleneck on their system and, if a bottleneck is present, capture additional diagnostic data that may be necessary to resolve the problem.

Common performance problems that the dashboard reports may help to resolve include:

  • CPU bottlenecks (and which queries are consuming the most CPU)
  • IO bottlenecks (and which queries are performing the most IO)
  • Index recommendations generated by the query optimizer (missing indexes)
  • Blocking
  • Latch contention
The information captured in the reports is retrieved from SQL Server’s dynamic management views. There is no additional tracing or data capture required, which means the information is always available, making this a very inexpensive means of monitoring your server.

Reporting Services is not required to be installed to use the Performance Dashboard Reports.

Download them from here: https://www.microsoft.com/en-us/download/details.aspx?id=29063

Configuring SQL Server 2012 Performance Dashboard Reports

After finishing the installation, you need to configure the SQL Server instance to use the Performance Dashboard toolset you have just installed. The default installation directory for the SQL Server 2012 Performance Dashboard Reports is ‘C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Performance Dashboard\’. Navigate to this location, find the ‘setup.sql’ file and double-click it to open it in SQL Server Management Studio. Next, connect to the SQL Server Database Engine instance where you want to install and access the reports, and hit ‘F5’ to run the script and configure the Performance Dashboard Reports.

Posted in reporting, sql, SQL Server, sql server performance

Database cannot be opened because it is version 782. This server supports version 706 and earlier. A downgrade path is not supported.

You get this error message, or one like it, when you try to attach a database from a newer version of SQL Server to an older one; internal version 782 corresponds to SQL Server 2014 and version 706 to SQL Server 2012.

To resolve:
  1. Locate the most advanced SQL Server version available in your environment on any other server/PC.
  2. Attach the .mdf file there.
  3. Use one of the following methods to create a database with the same schema on the intended target server:
     a) MS Visual Studio SQL Server tools
     b) SQL Server data export and import from SSMS

Visual Studio tools are good for schema comparisons but sometimes break down on large data volumes; the reverse tends to be true for data export and import with SSMS.

I have tried this on SQL Server 2012 onwards.

Posted in Uncategorized

PAGE_VERIFY Database Option

Short story…

The PAGE_VERIFY database option should be set to “CHECKSUM”. This will reduce your chances of undetected data corruption significantly and has (almost) no performance impact. That “almost” means a little CPU overhead, but you should not jeopardize the integrity of the stored data to save that cost. Yes/No? Read ahead if the answer is Yes, or read ahead anyway, since I have typed some more stuff.

Background

The PAGE_VERIFY database option was introduced in SQL Server 2005 with three possible values: NONE, TORN_PAGE_DETECTION and CHECKSUM. The values other than CHECKSUM exist mainly for backward compatibility and aren’t a wise choice, as data write operations do not get the benefit of the highest level of verification.
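Checking and fixing the setting is straightforward (the database name below is a placeholder):

```sql
-- See which databases are not using CHECKSUM
SELECT name, page_verify_option_desc
FROM sys.databases
WHERE page_verify_option_desc <> 'CHECKSUM';

-- Switch a database over (placeholder name)
ALTER DATABASE [YourDb] SET PAGE_VERIFY CHECKSUM;
```

Note that existing pages only get a checksum the next time they are written, so the protection phases in gradually after the change.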

This is the recommendation from Microsoft and from all the major SQL Server MVPs and greats, including yours truly (no harm in dreaming).

Posted in Uncategorized