TSQL Tuesday – SQL 2022 Planning for the Plan to Fail

If you fail to plan, then you plan to fail. But sometimes your plan sucks and you fail anyway. In these situations, a plan B is nice to have up your sleeve.

Glenn Berry has invited us to talk about our favourite feature in SQL 2022 for TSQL Tuesday this month, and for me it’s all about the extra information in query plans and the prospect of getting some of my clients’ busier workloads experiencing the gains from Built-In Query Intelligence.

The query plan is the critical secret SQL sauce in delivering data back to a user as quickly and efficiently as possible. Well, nearly as quickly and efficiently. Before SQL 2022, various forms of black magic were involved in forcing compiled plans to perform consistently. The query optimizer didn’t plan to fail, but if the parameters passed weren’t what it expected it often would, sometimes horribly. In SQL 2022 we have the addition of a Plan B.

Built-in query intelligence is a real game changer for consistent query plan performance. And it all starts with a very small change that I haven’t actually heard mentioned much:

Query Store turned on by default

Query Store is incredibly powerful, and simultaneously surprisingly underused in the wild. This is mostly because by default it’s not on, but that changes in SQL 2022. The simple fact that, as a consultant, I can arrive at a new server and already have Query Store available means the information I can feed back to a client before making any change in the environment is greatly enhanced.
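If you want to see what a 2022 instance has given you out of the box, or switch Query Store on for an older database, something like this is a reasonable starting point (YourDatabase is just a placeholder for whichever database you’re looking at):

    -- Check whether Query Store is enabled and how it's configured
    USE YourDatabase;
    SELECT actual_state_desc,
           desired_state_desc,
           max_storage_size_mb,
           query_capture_mode_desc
    FROM sys.database_query_store_options;

    -- Turn it on for an existing database that doesn't have it yet
    ALTER DATABASE YourDatabase SET QUERY_STORE = ON (OPERATION_MODE = READ_WRITE);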

And then the planning got better.

The classic parameter sniffing scenario starts with a plan cached for either an atypical portion of a data set, or a data set that covers very disparate usage scenarios. A classic example would be a user data set with a mix of users from large and small towns and cities. Typical usage may reference users in Capital-City, but the first execution is run against a user from Little-Shite-On-The-Mire and we get a seek cached instead of a scan. All subsequent queries run with that plan and server performance is abysmal.
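A quick sketch of the kind of pattern that gets bitten; the table and procedure names here are made up purely for illustration:

    -- Hypothetical procedure filtering on a heavily skewed column
    CREATE PROCEDURE dbo.GetUsersByCity
        @City nvarchar(100)
    AS
    BEGIN
        SELECT UserId, UserName, City
        FROM dbo.Users
        WHERE City = @City;   -- the plan compiled for the first @City value is reused for all of them
    END;
    GO

    -- First call: a handful of rows, so a plan suited to a tiny result set (a seek) gets cached
    EXEC dbo.GetUsersByCity @City = N'Little-Shite-On-The-Mire';

    -- Every later call for the big city reuses that same plan, where a scan would have been the better choice
    EXEC dbo.GetUsersByCity @City = N'Capital-City';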

The ability to have a plan cached for both scenarios (Parameter Sensitive Plan optimization, in SQL 2022 terms) is huge, and it’s an example of my favourite type of performance enhancement in SQL Server. You don’t need to turn stuff on or off, you don’t need to change code, you don’t need to modify user behaviour, and the overhead is small to insignificant. Just upgrade to SQL 2022, set your compatibility level correctly, and stuff will run faster and more consistently. I love, when discussing SQL upgrades with clients, being able to point to enhancements in the way queries are processed, because it really does give a compelling reason to stay current for clients who don’t have huge resource to push into configuring and maintaining new features, but just want to know SQL will keep doing what it does, only better.
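The “set your compatibility level correctly” part matters because Parameter Sensitive Plan optimization only kicks in at database compatibility level 160. A minimal sketch, again with YourDatabase as a placeholder:

    -- PSP optimization needs database compatibility level 160 (SQL Server 2022)
    ALTER DATABASE YourDatabase SET COMPATIBILITY_LEVEL = 160;

    -- Confirm where the databases on the instance currently sit
    SELECT name, compatibility_level
    FROM sys.databases;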

Add in enhancements to reading from Availability Group replicas, Query Store hints, Cardinality Estimator feedback in plans, and feedback and learnings being persisted between failovers and restarts, and I’m really looking forward to getting some of the heavier workloads running against SQL 2022 and using the new toys we’ve been given.
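Query Store hints in particular let you pin behaviour onto a troublesome query without anyone touching application code. A rough sketch, reusing the hypothetical procedure from earlier and with query_id 42 standing in for whatever the lookup actually returns:

    -- Find the query you care about in Query Store...
    SELECT qsq.query_id, qsqt.query_sql_text
    FROM sys.query_store_query AS qsq
    JOIN sys.query_store_query_text AS qsqt
        ON qsq.query_text_id = qsqt.query_text_id
    WHERE qsqt.query_sql_text LIKE N'%GetUsersByCity%';

    -- ...then attach a hint to it, no code change required
    EXEC sys.sp_query_store_set_hints
         @query_id = 42,
         @query_hints = N'OPTION (RECOMPILE)';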

4 thoughts on “TSQL Tuesday – SQL 2022 Planning for the Plan to Fail”

  1. Glenn Berry

    I agree that these new features and improvements for query plans and for Query Store should be very useful with many workloads.

    Thanks for writing this post and participating in this month’s T-SQL Tuesday.

  2. Pingback: T-SQL Tuesday #154 Recap

  3. Gerard Jaryczewski

    “I love when discussing SQL upgrades with clients” – if clients are ready to compare expenses, and they see that time spent on optimization is more expensive than the price of the upgrade, does it make a difference? What is your experience?

    1. Rob (post author)

      Hi Gerard, historically not so much. In my experience though clients are increasingly linking performance improvements to cost reduction. Cloud hosting has made it easier for performance improvements to be seen as a direct way of reducing cost, not just stopping users moaning that ‘the database is slow’.

