
In-Memory OLTP for Faster SQL Server: Your Complete Guide


In today's data-driven world, speed and performance are crucial for staying ahead. SQL Server's In-Memory OLTP engine (code-named Hekaton) is built for exactly the high-performance, low-latency workloads that modern applications demand. Whether you're working on real-time analytics, high-throughput transaction systems, or caching solutions, In-Memory OLTP can be a game-changer.

In this blog, we'll explore everything about Microsoft SQL Server In-Memory OLTP—what it is, its use cases, benefits, limitations, and practical examples using the AdventureWorks database. This guide will also cover best practices, monitoring tips, and how to optimize hash bucket counts for memory-optimized tables.

What is SQL Server In-Memory OLTP?

SQL Server In-Memory OLTP is a memory-optimized database engine designed to dramatically improve the performance of transaction-heavy workloads. By keeping tables and indexes in memory and using natively compiled stored procedures, it minimizes latency and maximizes throughput.

Key Features

  1. Memory-Optimized Tables: Store tables entirely in memory for ultra-fast access.

  2. Natively Compiled Stored Procedures: Execute T-SQL procedures as native machine code.

  3. Optimistic Concurrency: Reduce lock contention for highly concurrent workloads.

  4. Hybrid Durability: Choose between fully durable (SCHEMA_AND_DATA, persisted to disk) and non-durable (SCHEMA_ONLY, memory-only) tables.

  5. Optimized Indexing: Supports hash indexes and nonclustered (range) indexes tailored for specific query patterns.
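As a quick illustration of features 1, 4, and 5 together, here is a minimal sketch of a memory-optimized table declaring both index types (the table and column names are illustrative, not from AdventureWorks):

CREATE TABLE dbo.SessionCache (
    SessionID INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 131072), -- hash index: equality lookups
    UserName  NVARCHAR(100) NOT NULL,
    LastSeen  DATETIME2 NOT NULL,
    INDEX IX_LastSeen NONCLUSTERED (LastSeen)                       -- range index: time-range queries
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

Switching DURABILITY to SCHEMA_ONLY would keep only the table definition across restarts, trading persistence for even lower write latency.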

When to Use In-Memory OLTP

Not every situation requires In-Memory OLTP. It's most effective for:

  • High-Throughput Applications: Payment systems, stock trading platforms.

  • Real-Time Analytics: IoT dashboards, e-commerce order analytics.

  • Session Management: Web applications handling large user sessions.

  • Caching Layers: Storing frequently accessed reference data.

  • Gaming Leaderboards: Fast-paced updates and queries.

Limitations of In-Memory OLTP

  1. RAM Dependency: Requires sufficient physical memory to store tables and indexes.

  2. T-SQL Limitations: Not every T-SQL feature is supported on memory-optimized tables (e.g., TRUNCATE TABLE, MERGE with a memory-optimized table as the target, cross-database transactions).

  3. Durability Trade-Offs: Non-durable tables lose data on server restart.

  4. Indexing Constraints: Supports only hash and nonclustered (range) indexes.

  5. Maintenance Overhead: Requires careful schema planning and memory monitoring.


Hands-On: Implementing In-Memory OLTP with AdventureWorks

Let's get practical and implement In-Memory OLTP using the AdventureWorks database. We'll create a memory-optimized table to cache product data and boost query performance.

Step 1: Set Up the AdventureWorks Database

First, download and restore the AdventureWorks database:

RESTORE DATABASE AdventureWorks
FROM DISK = 'C:\AdventureWorks.bak'
WITH MOVE 'AdventureWorks_Data' TO 'C:\AdventureWorks_Data.mdf',
     MOVE 'AdventureWorks_Log' TO 'C:\AdventureWorks_Log.ldf';
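If the logical file names in your backup differ from the ones above, you can list them first before running the restore (the backup path is an assumption; adjust it to your environment):

RESTORE FILELISTONLY
FROM DISK = 'C:\AdventureWorks.bak';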

Step 2: Add a Memory-Optimized Filegroup

Memory-optimized tables require a special filegroup:

ALTER DATABASE AdventureWorks
ADD FILEGROUP InMemoryFG CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE AdventureWorks
ADD FILE (NAME = 'InMemoryFile', FILENAME = 'C:\InMemoryFile') 
TO FILEGROUP InMemoryFG;
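To confirm the filegroup was added, you can query sys.filegroups from within the AdventureWorks database; type 'FX' denotes a memory-optimized data filegroup:

SELECT name, type, type_desc
FROM sys.filegroups
WHERE type = 'FX';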

Step 3: Create a Memory-Optimized Table

We'll create a memory-optimized table to cache product data:

CREATE TABLE dbo.MemoryOptimizedProductCache (
    ProductID INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 10000),
    Name NVARCHAR(50) NOT NULL,
    ListPrice MONEY NOT NULL
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);
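As a quick sanity check, sys.tables exposes both the memory-optimized flag and the durability setting for the new table:

SELECT name, is_memory_optimized, durability_desc
FROM sys.tables
WHERE name = 'MemoryOptimizedProductCache';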

Step 4: Populate the Table

Load data from the Production.Product table:

INSERT INTO dbo.MemoryOptimizedProductCache (ProductID, Name, ListPrice)
SELECT ProductID, Name, ListPrice
FROM Production.Product;

Step 5: Query the Memory-Optimized Table

Run a query to retrieve high-value products:

SELECT * 
FROM dbo.MemoryOptimizedProductCache
WHERE ListPrice > 500;

Step 6: Create a Natively Compiled Stored Procedure

Boost performance further with a natively compiled stored procedure:

CREATE PROCEDURE dbo.UpdateProductPrice
WITH NATIVE_COMPILATION, SCHEMABINDING
AS BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = 'English')
    UPDATE dbo.MemoryOptimizedProductCache
    SET ListPrice = ListPrice * 1.10
    WHERE ProductID = 1;
END;

Execute the procedure:

EXEC dbo.UpdateProductPrice;

Verify the changes:

SELECT * 
FROM dbo.MemoryOptimizedProductCache
WHERE ProductID = 1;

Optimizing Hash Bucket Count

The BUCKET_COUNT in a hash index determines the number of "slots" for rows. Setting it correctly reduces hash collisions and enhances query performance.

Guidelines for Setting BUCKET_COUNT

  1. Match Row Count: Set BUCKET_COUNT equal to or slightly greater than the estimated row count. Example: For 1 million rows, use:

     BUCKET_COUNT = 1000000;
    
  2. Avoid Over-Provisioning: Too high a BUCKET_COUNT wastes memory.

  3. Powers of 2: SQL Server rounds the specified BUCKET_COUNT up to the next power of two internally, so a request for 1,000,000 buckets actually allocates 1,048,576 (2^20).
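If the row count grows well beyond the original estimate, the bucket count can be changed with an index rebuild on SQL Server 2016 and later. This sketch assumes the hash index was created with an explicit name (IX_ProductID here is hypothetical; the inline PRIMARY KEY in Step 3 receives a system-generated name):

ALTER TABLE dbo.MemoryOptimizedProductCache
ALTER INDEX IX_ProductID
REBUILD WITH (BUCKET_COUNT = 2097152); -- 2^21 buckets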

Monitor Hash Chain Lengths

Use the sys.dm_db_xtp_hash_index_stats DMV to check hash index health. A long average chain length means rows are colliding in too few buckets; a high proportion of empty buckets means memory is being wasted on too many:

SELECT OBJECT_NAME(object_id) AS TableName,
       total_bucket_count,
       empty_bucket_count,
       avg_chain_length,
       max_chain_length
FROM sys.dm_db_xtp_hash_index_stats
WHERE object_id = OBJECT_ID('dbo.MemoryOptimizedProductCache');

Cleaning and Maintaining In-Memory OLTP Tables

1. Cleaning the Table

TRUNCATE TABLE is not supported on memory-optimized tables, so use DELETE instead:

  • Delete all rows:

      DELETE FROM dbo.MemoryOptimizedProductCache;
    
  • Delete selectively:

      DELETE FROM dbo.MemoryOptimizedProductCache WHERE ListPrice < 100;
    

2. Shrinking Checkpoint Files

For durable tables (SCHEMA_AND_DATA), checkpoint file pairs are merged and garbage-collected automatically. Issuing a manual checkpoint closes the current data and delta files so they become eligible for merge (typically after a subsequent log backup):

CHECKPOINT;
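You can inspect the state of the checkpoint file pairs for the current database with sys.dm_db_xtp_checkpoint_files:

SELECT file_type_desc, state_desc, COUNT(*) AS file_count
FROM sys.dm_db_xtp_checkpoint_files
GROUP BY file_type_desc, state_desc;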

Monitoring In-Memory OLTP

Use the following DMVs to monitor memory-optimized tables:

  1. Memory Usage:

     SELECT * FROM sys.dm_db_xtp_memory_consumers;
    
  2. Checkpoint Stats:

     SELECT * FROM sys.dm_db_xtp_checkpoint_stats;
    
  3. Garbage Collection:

     SELECT * FROM sys.dm_xtp_gc_stats;
    

Points to Note

  1. Plan Memory Usage: Allocate sufficient RAM and monitor usage.

  2. Index Smartly: Use hash indexes for equality searches and range indexes for range queries.

  3. Choose Durability Wisely: Use SCHEMA_ONLY for non-critical data and SCHEMA_AND_DATA for persistence.

  4. Monitor Regularly: Use DMVs to identify bottlenecks and optimize performance.


Summary

SQL Server In-Memory OLTP is a powerful tool for high-performance applications. By using memory-optimized tables and natively compiled procedures, you can achieve dramatic gains in speed and scalability for transaction-heavy workloads. However, success requires careful planning, proper configuration, and regular monitoring. Implement the techniques in this hands-on guide with the AdventureWorks database and see the difference they make to your application's performance. Have insights of your own? Share them in the comments below! For more detail, visit the official Microsoft documentation.
