SQL Server In-Memory OLTP Investments


In the previous two releases of SQL Server, Microsoft built a number of in-memory capabilities into the product to increase speed and throughput and to accelerate analytics. The first of these capabilities was an in-memory analytics add-in for Excel, also known as PowerPivot, in SQL Server 2008 R2. SQL Server 2012 included in-memory Analysis Services and in-memory columnstore indexes.

When SQL Server 2014 launched in April 2014, it brought with it a much-talked-about bonus. Alongside the standard relational engine and its traditional disk-based tables, SQL Server 2014 includes In-Memory OLTP, a high-performance, memory-optimized engine that allows users to store data in memory-optimized tables that reside permanently in memory.
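
To make this concrete, here is a minimal T-SQL sketch of what that looks like in practice. The database name, file paths, and table below are hypothetical, not from this tutorial; a memory-optimized table requires the database to have a filegroup declared with CONTAINS MEMORY_OPTIMIZED_DATA:

    -- Hypothetical database with the filegroup In-Memory OLTP requires.
    CREATE DATABASE SalesDemo
    ON PRIMARY
        (NAME = SalesDemo_data, FILENAME = 'C:\Data\SalesDemo_data.mdf'),
    FILEGROUP SalesDemo_mod CONTAINS MEMORY_OPTIMIZED_DATA
        (NAME = SalesDemo_mod, FILENAME = 'C:\Data\SalesDemo_mod')
    LOG ON
        (NAME = SalesDemo_log, FILENAME = 'C:\Data\SalesDemo_log.ldf');
    GO

    USE SalesDemo;
    GO

    -- The table's rows live entirely in memory; SCHEMA_AND_DATA keeps them durable.
    CREATE TABLE dbo.ShoppingCart
    (
        CartId      INT       NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        CustomerId  INT       NOT NULL,
        CreatedDate DATETIME2 NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
    GO

Note the hash index: memory-optimized tables in SQL Server 2014 require at least one index, and the bucket count is sized up front based on the expected number of distinct keys.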

In development, Microsoft code-named the project ‘Hekaton’, from the Greek word for ‘hundred’. The goal was to make In-Memory OLTP 100 times faster than disk-based tables. So far, it is around 30 times faster: not 100, but still quite an achievement.

This feature, along with the other in-memory capabilities, provides organizations with a holistic approach to driving real-time business with real-time insights, and gives them the opportunity to create high-speed, scalable enterprise and web-scale applications and databases.

This tutorial focuses on the new capabilities associated with In-Memory OLTP.


In-Memory OLTP overview

Microsoft first announced its work on the In-Memory OLTP feature during the 2012 PASS Summit in Seattle, and in the worldwide database community the feature is probably best known by its project name, Hekaton. This impressive new database engine feature in Microsoft SQL Server 2014 allows organizations to achieve significant performance gains for OLTP workloads while also reducing processing time. In many cases, when memory-optimized tables are combined with the new lock-free and latch-free algorithms optimized for accessing memory-resident data and with natively compiled stored procedures, performance can improve by up to 30 times.
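
As a brief illustration of the natively compiled side, the sketch below assumes the hypothetical dbo.ShoppingCart table shown earlier. SQL Server 2014 requires NATIVE_COMPILATION, SCHEMABINDING, an EXECUTE AS clause, and an ATOMIC block for such procedures:

    -- Hypothetical natively compiled procedure; it compiles to machine code
    -- and runs against memory-optimized tables without locks or latches.
    CREATE PROCEDURE dbo.usp_AddCartItem
        @CartId INT, @CustomerId INT
    WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
    AS
    BEGIN ATOMIC WITH
        (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
        INSERT INTO dbo.ShoppingCart (CartId, CustomerId, CreatedDate)
        VALUES (@CartId, @CustomerId, SYSUTCDATETIME());
    END;
    GO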

To give database administrators the opportunity to appreciate this new feature, this tutorial both introduces In-Memory OLTP and aims to dispel some flawed beliefs about it. It addresses a series of questions, including the following:

  • What are In-Memory OLTP and memory-optimized tables?
  • How does In-Memory OLTP work?
  • Are there any real-world cases to demonstrate In-Memory OLTP performance gains?
  • Can existing database applications be migrated to In-Memory OLTP?


Let’s look under the hood to see how organizations can benefit from In-Memory OLTP.

The proliferation of data captured across devices, applications, and services today has organizations working continuously to lower application latency while achieving maximum throughput for performance-critical data at a lower cost. Consider a financial organization that offers credit-card services to its customers. It must be able to validate, authorize, and complete millions of transactions per second, or lose financial opportunities both for itself and for the vendors who use its service.

Online gaming is another industry that requires maximum throughput to serve the millions of customers who gamble online. Gone are the days when people placed static bets on the outcome of a game; today, they bet in real time on events transpiring in real time. Take, for example, a football game in which your bet depends on whether the kicker will make the winning field goal in the Super Bowl. In situations like this, the database platform must be equipped to process millions of transactions concurrently at low latency, or the online gaming organization faces the possibility of financial ruin.

The SQL Server product group recognized that customer requirements in the data world were changing quickly and that it needed to provide new capabilities to decrease processing times and deliver higher throughput at lower latency. Fortunately, steady trends in the hardware industry allowed the product group to build these capabilities. First, the group realized that the cost of memory had vastly decreased over the past 20 to 25 years while memory sizes continued to increase, and that memory had reached a price and capacity point at which it was now viable to keep large amounts of data in memory. This trend is illustrated in Figure 2.1.

FIGURE 2.1: The price of RAM has drastically decreased over the past 20 years.

Second, the group recognized that CPU clock rates had plateaued and were not getting any faster, even as the number of cores per processor drastically increased, as shown in Figure 2.2. Armed with knowledge of these trends, the SQL Server team reevaluated the way SQL Server processes data from disk and designed the new In-Memory OLTP engine, which can take full advantage of the larger memory sizes now available and use processors with more cores to significantly improve the performance of OLTP applications.

FIGURE 2.2: Stagnating growth in CPU clock speeds while the number of cores increases.

On average, most OLTP databases are 1 terabyte or less. As such, the majority of today’s production OLTP databases can reap the performance benefits of In-Memory OLTP because the whole database can fit in memory. Just imagine the possibilities if the current hardware trends continue: perhaps in the next decade, servers will support petabytes of memory, making it possible to move even the largest databases and workloads into memory. It will be interesting to see what the future holds.
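
If you want to check how much memory your memory-optimized tables actually consume, one option is the sys.dm_db_xtp_table_memory_stats DMV. This is a sketch to run in a database that already uses In-Memory OLTP:

    -- Memory consumed per memory-optimized table in the current database.
    SELECT OBJECT_NAME(object_id)           AS table_name,
           memory_used_by_table_kb / 1024   AS table_mb,
           memory_used_by_indexes_kb / 1024 AS index_mb
    FROM sys.dm_db_xtp_table_memory_stats
    ORDER BY memory_used_by_table_kb DESC;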

Endnote:

In-Memory OLTP in SQL Server 2014 gives users the possibility of greatly enhanced performance. It even includes a new Analysis, Migrate, and Report (AMR) tool to evaluate the current performance of your database and tell you whether it can be improved.

Typical businesses that could benefit include those with a high data-insert rate, such as power companies recording a constant stream of information from smart meters; websites with a large number of users accessing the same data, such as online magazines; and online gaming platforms where low latency is business-critical.
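
For high insert-rate scenarios like the smart-meter stream above, one pattern worth knowing is a non-durable staging table. This is a sketch with hypothetical names; SCHEMA_ONLY means the rows are lost on restart, so it suits only transient data that can be replayed or discarded:

    -- Hypothetical staging table for a fast, transient ingest stream.
    CREATE TABLE dbo.MeterReadingStage
    (
        ReadingId    BIGINT IDENTITY NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 10000000),
        MeterId      INT            NOT NULL,
        ReadingValue DECIMAL(10, 2) NOT NULL,
        ReadingTime  DATETIME2      NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);

Because nothing is written to the transaction log for a SCHEMA_ONLY table, inserts avoid log I/O entirely, which is exactly where high-rate ingest workloads tend to bottleneck.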

For those who want, and perhaps need, those performance advantages, the journey to In-Memory OLTP is not a straightforward migration. It is a worthwhile one, however, provided the deployment issues are understood first.

