performance
Effectively troubleshoot latency in SQL Server Transactional replication: Part 1
Are you struggling with latency issues in SQL Server transactional replication? This article provides clear, step-by-step instructions to troubleshoot and resolve these challenges. Dive into proven techniques and best practices that will help you improve your SQL Server's performance and keep data replication running smoothly. Don't let latency slow you down. Thanks to Collin Benkler, Senior Escalation Engineer at Microsoft for SQL Server, for his valuable feedback.
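Not part of the article text above, but a common first measurement when chasing replication latency is a tracer token; a minimal sketch, assuming a publication named MyPublication in a publisher database MyPublisherDB (both names are placeholders):

-- Run in the publisher database.
USE MyPublisherDB;
GO

-- Post a tracer token into the log; it flows through the log reader
-- and distribution agents like a normal replicated command.
DECLARE @tokenId int;
EXEC sys.sp_posttracertoken
    @publication = N'MyPublication',
    @tracer_token_id = @tokenId OUTPUT;

-- List tokens posted for this publication.
EXEC sys.sp_helptracertokens @publication = N'MyPublication';

-- Show publisher-to-distributor and distributor-to-subscriber
-- latency for the token we just posted.
EXEC sys.sp_helptracertokenhistory
    @publication = N'MyPublication',
    @tracer_id = @tokenId;

If the token reaches the distributor quickly but the subscriber slowly, the distribution agent is the bottleneck; the reverse points at the log reader.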
Impact of CDC on SQL Server

Hi, we are testing the performance impact of CDC on SQL Server. There are two identical databases (KST_S001, KST_002) on a SQL Server 2017 instance running in a Linux container. Both have CDC enabled for 180 tables, and a data generator does mostly updates on these tables. The data generators perform around 300k DML operations per minute on each database (600k total). The CDC jobs keep the default configuration (500 transactions and a 5-second polling interval for the capture job; I think a three-day retention period for the cleanup job). The host machine is an Azure Standard_E8-4as_v4: 4 vCPUs, 64 GB RAM, 12,800 IOPS, 128 GB SSD.

After 1 hour of running the setup with CDC enabled, the average time of 1,000 updates is 73 ms. With CDC disabled, the average time of 1,000 updates after 1 hour is 15 ms. I've attached two Grafana screenshots showing the difference in metrics when CDC is disabled versus enabled on these databases. We are trying to understand the underlying mechanism that is contributing to the impact visible in the metrics. Why is memory consumption so much higher? Why do DML operations take longer with CDC enabled?

CDC disabled: [screenshot]

CDC enabled: [screenshot]

Thanks for answers and suggestions. Peter
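Not from the thread itself, but a minimal sketch of how the capture job settings mentioned above can be inspected and adjusted, using the documented CDC job procedures with illustrative values:

USE KST_S001;
GO
-- Inspect the current capture and cleanup job configuration
-- (maxtrans, maxscans, pollinginterval, retention, and so on).
EXEC sys.sp_cdc_help_jobs;

-- Example experiment: let each scan pick up more transactions and
-- poll less often, to see how batching changes the DML overhead.
EXEC sys.sp_cdc_change_job
    @job_type = N'capture',
    @maxtrans = 1000,       -- default is 500 transactions per scan cycle
    @pollinginterval = 10;  -- default is 5 seconds

-- Restart the capture job so the new settings take effect.
EXEC sys.sp_cdc_stop_job @job_type = N'capture';
EXEC sys.sp_cdc_start_job @job_type = N'capture';

A larger @maxtrans batch trades capture latency for fewer log scans; whether that reduces the update overhead in this workload would need measuring.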
OS Hang or Out of Memory due to SQL Ser... No Wait, it's SQL Analysis Services (SSAS)

First published on MSDN on Jan 12, 2018. Recently, we have observed a number of cases where DBAs or application developers complain about out-of-memory errors, or even the machine not responding (hanging), despite the fact that there is plenty of available memory on the system.
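The article's diagnosis steps aren't reproduced here, but a hedged first check is to ask SQL Server itself how much memory it holds, to separate its footprint from that of SSAS or other processes; a small sketch against standard DMVs:

-- How much memory does this SQL Server instance hold, and has the
-- OS signaled low-memory conditions to it?
SELECT physical_memory_in_use_kb,
       memory_utilization_percentage,
       process_physical_memory_low,
       process_virtual_memory_low
FROM sys.dm_os_process_memory;

-- Top memory clerks inside SQL Server, to see where its memory goes.
SELECT TOP (10) type, SUM(pages_kb) AS pages_kb
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY pages_kb DESC;

If these numbers are modest while the machine is starved, the consumer is outside the database engine, which is exactly the SSAS scenario the article describes.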
Update Stats Sample Rate does not work

CREATE/UPDATE STATISTICS allows users to specify the sample rate. However, the sample rate may not work as you expect in some scenarios.

1. Tables with fewer than 1024 pages

I'm going to use the table Production.Product in AdventureWorks2019 to demonstrate:

use AdventureWorks2019
go
create statistics IProductID on Production.Product(ProductID) with sample 20 percent
go
dbcc show_statistics('Production.Product','IProductID')

In this script, the sample rate is set to 20%. However, DBCC SHOW_STATISTICS shows that 'Rows' equals 'Rows Sampled', which means the statistics were 100% sampled. Why? Because for a table with fewer than 1024 pages in its clustered index (for a heap, we count index id 0), SQL Server ignores the sample rate specified and always uses a 100% sample. In this case, Production.Product has only 15 pages, hence it is always 100% sampled. Please note that sample 0 is an exception: if you specify 0, SQL Server does not create a histogram.

2. Tables with more than 1024 pages

SQL Server guarantees that at least 1024 pages will be sampled. If the specified sample rate works out to fewer than 1024 pages, SQL Server replaces it with 1024 pages, which corresponds to an effective sample rate of 1024/TotalPages. If it works out to more than 1024 pages, the specified rate is used as-is.

3. What if the sample rate is not specified?

If the table has more than 1024 pages, SQL Server samples the smaller of the following two page counts:

TotalPages
(15*power(Rows,0.55)/TotalRows*TotalPages)+1024

So for a smaller table, the rate is TotalPages/TotalPages = 100%. For a big table, the rate is

((15*power(Rows,0.55)/TotalRows*TotalPages)+1024)/TotalPages = 15*power(Rows,0.55)/TotalRows + 1024/TotalPages

For a large table, the rows in the 1024 pages can be ignored. Here are examples:

1. If a table has 1,000,000 rows, then 15*power(1000000.0,0.55) = 29929 rows will be sampled, about 29929/1000000 = 3%.
2. If a table has 10,000,000 rows, then 15*power(10000000.0,0.55) = 106192 rows will be sampled, about 106192/10000000 = 1.06%.
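The arithmetic above can be checked directly; a small T-SQL sketch that just evaluates the post's formula for the two example table sizes:

-- Default-sampling estimate: rows sampled ~= 15 * Rows^0.55
-- (the extra 1024 pages are ignored here, as in the examples above).
DECLARE @rows1 float = 1000000, @rows2 float = 10000000;

SELECT CAST(ROUND(15 * POWER(@rows1, 0.55), 0) AS int) AS sampled_rows_1m,   -- 29929
       15 * POWER(@rows1, 0.55) / @rows1 * 100         AS sample_pct_1m,     -- ~3.0
       CAST(ROUND(15 * POWER(@rows2, 0.55), 0) AS int) AS sampled_rows_10m,  -- 106192
       15 * POWER(@rows2, 0.55) / @rows2 * 100         AS sample_pct_10m;    -- ~1.06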
Memory Grants: The mysterious SQL Server memory consumer with Many Names

First published on MSDN on Jan 01, 2013.

The Memory Consumer with Many Names

Have you ever wondered what memory grants are? What about QE Reservations? And Query Execution Memory? Workspace memory? How about Memory Reservations? As with most things in life, complex concepts often reduce to a simple one: all these names refer to the same memory consumer in SQL Server: memory allocated during query execution for sort and hash operations (bulk copy and index creation fall into the same category but are a lot less common).
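As a hedged companion to the article (not text from it), the sys.dm_exec_query_memory_grants DMV shows this consumer at work: every query currently holding, or waiting for, workspace memory for its sorts and hashes:

-- Queries currently holding or waiting for a memory grant.
-- grant_time IS NULL means the query is still waiting for its grant.
SELECT session_id,
       requested_memory_kb,
       granted_memory_kb,
       used_memory_kb,
       grant_time,
       wait_time_ms
FROM sys.dm_exec_query_memory_grants
ORDER BY requested_memory_kb DESC;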