How to Implement Row-Level Security in SQL Server

Imagine a financial firm with a single database housing information for thousands of customers. Each regional manager should only access data relevant to their territory, but instead, they see every record. This not only complicates their workflows but also poses a significant security risk. If they make a single mistake, or worse, if a malicious actor compromises one account, it could expose sensitive data.

Today I want to talk about Row-Level Security, a SQL Server feature designed to enforce granular access control directly at the database level. It’s a feature I don’t normally get to play with, because this type of problem is usually handled by the application rather than at the database tier. But when I told a client about this recently, they got excited. It was exactly what they were after, so we moved ahead with a prototype and… I had so much fun.

Row-Level Security allows DBAs to define policies that restrict access to rows of data based on a user’s identity or role. For the financial firm, this means regional managers can only view the customer records relevant to their assigned territory, while executives might have full access. Implementing RLS ensures data remains secure, streamlined, and compliant with regulations.

Setting up RLS involves creating security policies and predicates. First, define a predicate function—typically an inline table-valued function—that determines the access logic. For example, you could create a function that checks if the user’s region matches the row’s region.

-- Simplified demo predicate: access is granted when the row's Region
-- value matches the current login (SYSTEM_USER). A production predicate
-- would more likely look the user's region up in a mapping table.
CREATE FUNCTION dbo.fnFilterByPrimaryRegion (@Region AS NVARCHAR(50))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS accessGranted
WHERE @Region = SYSTEM_USER;

Next, bind this function to your table using a security policy:

CREATE SECURITY POLICY dbo.RegionSecurityPolicy  
ADD FILTER PREDICATE dbo.fnFilterByPrimaryRegion(Region)
ON dbo.SalesData
WITH (STATE = ON);

Once enabled, RLS automatically filters queries based on the predicate. Even if someone tries to bypass it with a direct SQL query, they’ll only see rows they’re authorized to view.
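
To sanity-check the policy you can impersonate a user and watch the filter kick in. Here’s a minimal sketch, assuming the dbo.SalesData table from above and a test user named to match a Region value, which is what the simplified predicate requires (under EXECUTE AS, SYSTEM_USER reports the impersonated context):

-- Hypothetical test user named after a region value.
CREATE USER NorthRegion WITHOUT LOGIN;
GRANT SELECT ON dbo.SalesData TO NorthRegion;

EXECUTE AS USER = 'NorthRegion';
SELECT * FROM dbo.SalesData;  -- returns only the rows the predicate allows
REVERT;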

Returning to the financial firm, implementing RLS transforms how regional managers interact with the database. Each manager now has a focused view of their data, reducing confusion and minimizing the risk of exposing unrelated customer records. Executives can still access a complete dataset, and compliance audits show that sensitive data is protected by role-based policies.

Auditing and Monitoring SQL Server: Keeping Tabs on Database Access

As a SQL Server DBA, maintaining visibility into who accessed your data, what they did, and when they did it can be crucial for security and compliance. Auditing and monitoring are your database’s security cameras, logging and alerting you to potential risks and suspicious activity. Audit is a feature I’ve not used much in my career, but we have had some interesting setups amongst our client base, so it’s worth a reminder of what we can do with this tool.

SQL Server’s Audit feature is a powerful tool for tracking activity. By creating a server audit object, you define where logs will be stored—such as the Windows Security log or a local file. Then, you set up audit specifications to record specific events, like SELECT, INSERT, or UPDATE actions on sensitive tables. For example, if your organization needs to monitor access to financial records, you can create an audit that tracks who reads or modifies that data. This provides a reliable trail for compliance or forensic investigations.
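
As a minimal sketch of that setup (the audit name, file path, database, and table below are all placeholders for illustration):

-- Server-level audit object: defines where the log is written.
USE master;
GO
CREATE SERVER AUDIT FinanceAudit
TO FILE (FILEPATH = 'D:\Audits\');
ALTER SERVER AUDIT FinanceAudit WITH (STATE = ON);
GO

-- Database-level specification: which actions, on what, by whom.
USE FinanceDB;
GO
CREATE DATABASE AUDIT SPECIFICATION FinanceTableAccess
FOR SERVER AUDIT FinanceAudit
ADD (SELECT, INSERT, UPDATE ON OBJECT::dbo.FinancialRecords BY public)
WITH (STATE = ON);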

Real-time monitoring can be equally critical if you need to go beyond a retrospective look at what happened and try to get in front of a threat. While auditing captures a historical record, monitoring tools like Extended Events or SQL Server Profiler help you identify and respond to issues as they happen. For instance, monitoring can alert you to unusual login patterns or excessive data exports, giving you the chance to intervene before serious damage occurs.
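
One hedged example of what that can look like: an Extended Events session that captures failed logins (error 18456) with some client context attached. The session name and file target are made up for the sketch, and you may prefer different actions or a ring buffer target:

-- Capture failed login errors as they occur.
CREATE EVENT SESSION FailedLogins ON SERVER
ADD EVENT sqlserver.error_reported (
    ACTION (sqlserver.client_hostname, sqlserver.client_app_name, sqlserver.username)
    WHERE error_number = 18456
)
ADD TARGET package0.event_file (SET filename = N'FailedLogins');
GO
ALTER EVENT SESSION FailedLogins ON SERVER STATE = START;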

The tools we’ve been given for audit purposes are actually really powerful, but as with all things, they don’t come for free. Efficient auditing and monitoring require balance. Logging every action can overwhelm you with data, so focus on high-risk activities, like failed logins, changes to critical tables, or permission alterations. Lower-risk logs, such as simple SELECT queries, can be archived for periodic review, if they are required at all. We have one client where auditing of server access is a critical requirement, and the Windows event logs it is written to are many times the size of the database itself.

Clients will frequently start with ‘everything’ as a requirement, but when you drill into what ‘everything’ actually is, and how much it will cost, it needs to be scaled back. It’s usually not just about catching bad actors; it’s about demonstrating that you are taking appropriate steps to protect your data and detect unauthorised attempts to access or manipulate it.

AI is coming for your job… but… you’ll be fine.

This month’s TSQL Tuesday is hosted by Pinal Dave, who asks the question “Has AI helped you with your SQL Server job?”. It’s a fascinating question, because the marketing on numerous products gets me very excited, but then I use them and… well… underwhelming.

First of all, the term AI is a buzzword, and as such it gets picked up by a whole bunch of people who don’t really know what it is, how it works, or what its current limitations are, and chucked into any presentation they’re giving. I’ve heard formulas in spreadsheets, threshold-based alerting and indexed searching described as AI. Quite honestly, if you say your product has AI in it you seem more likely to get your project greenlit or funded. Sometimes even the promise that you will use AI later is enough to get investors and decision makers excited, because what they hear is “AI = computer doing work = human not doing work = don’t have to pay wages anymore”. Even big companies who should know better fall into this pattern, such as Amazon, whose Just Walk Out technology, supposedly powered by leading-edge vision technology, actually turned out to be powered by hundreds of workers in India watching your shopping session on video.

So far I’ve seen the AI-oversell pattern repeat over and over, and in my own workplace I do have to say that the integration of AI has been time-consuming and painful. I was lucky enough to attend PASS Summit last year, and some of the demos of AI in the Azure Portal and in vendor products were quite impressive. I couldn’t wait to get back home and try some of them out. But the thing about demos is that they are easy to make behave exactly how you want, especially if you control all the inputs and record the demo. I was able to duplicate some, but not all, of the demonstrations I saw, and those imperfectly. Some of the features were in preview and just not available yet.

How code is generated and used will be another interesting area to watch. The old meme that “for AI to take over programmers’ jobs, managers will need to come up with accurate requirements” is very true. Writing code is quite a small part of a programmer’s job. But when it comes to my own code, I’m at a level that AI works well at: I will ask for a small, specific thing, and I will get back a specific answer that I can plug into a wider application. For the most part that means I don’t have to remember syntax in languages I might use monthly rather than daily. Then add in IntelliSense (itself a type of AI) to make sure the code includes keywords that actually exist, and I can get things working reasonably quickly.

I was also particularly keen to get my hands on Copilot for Office 365. It’s been good for drafting emails, but really only for getting from a blank page to ‘something’ and then shaping in the information I want to include. I’ve dropped the use of Copilot for analyzing my outgoing emails though. My emails tend to answer questions, and I try to do this quickly and concisely. Copilot would prefer I spend more time on cuddly language. Maybe I should, but the emails don’t sound like me when I send them.

What I really wanted to be able to do with Copilot was take a document I’d prepared for a client, such as a site review, and quickly turn it into a PowerPoint presentation. This can sometimes take up to a day manually, depending on the volume and depth of the source document. I had hoped to cut that down to an hour or two, but alas, Copilot has not been very good at picking out key pieces of information, and the formatting has also been quite disappointing. With experimentation I’ve found that presenting the source document in a specific format greatly assists the conversion process and the accuracy of the information presented. But I want the PowerPoint to be a summary of the Word document; the tool shouldn’t dictate the format of the document in order to do that.

And this leads me to the title of my post today. If AI can get these things right it can shave hours off manual work processes, and obviously that means the same work can be achieved by fewer staff. AI will cost people jobs; the question is how many people, and which jobs. There’s no doubt that AI will just keep getting smarter and will be capable of achieving more and more tasks, so the real question is how standardized your job is. If you are doing the same thing every day and your decisions are fairly simple, you can expect someone somewhere to start looking at whether AI (or simply well-written automation) can do it more cost-effectively. If you have a reasonably complex job, and particularly if your job involves complex discussions with other actual human beings, AI is not the threat you should be worried about. The thing that will take your job is another human being who has figured out how to use AI to cut out the time-consuming bits of the role and manage the human interactions well.

Microsoft did a very smart thing when they named their AI family of products Copilot. The not-so-subtle implication is that AI is something that supports you in your job, rather than something that comes and takes it off you. In the short term at least, I see that as being how things will play out. For me, AI has been things like Copilot and ChatGPT getting me off a blank page: asking a question and seeing if the answer tracks. The search functionality that provides its sources is a great addition, because I can fact-check the source before I read too much into the actual answer.

The Documentation Problem with AI

Because I’m lazy I’m going to say that this post is a response to Brent Ozar’s TSQL Tuesday invite. It wasn’t originally, and it’s a week late, but it kinda fits, so let’s roll with it and not ask too many questions?

Brent asks, “What was the last ticket you closed?” Unlike most of the respondents, I do work in a system where tickets are a real thing and they do largely drive my day. Everything gets turned into tasks, these are tracked against budgets and deadlines, and the ticketing system RULES OUR LIVES!

But the last ticket I closed wasn’t actually a ticket, it was a task in Microsoft To Do, or Planner, or whatever it’s being called this month. I have a recurring task for several pieces of our documentation that prompts me to go and review what is written and, if necessary, to change it.

Why would I do that? Well, put simply, I hate documentation, but I hate documentation slightly less than I hate ‘lack-of-documentation’. I’m primarily a technical person and I like to have a list of things or a process to follow for repeatable tasks. Ideally I like to automate them, but that’s something you do after you have a reliable process, not before. I also like to make sure that any process my colleagues have to work through has a nicely documented set of steps too, but sometimes those steps change as you change your infrastructure, or SaaS products, or software, or a bunch of other things. The problem is that documentation is seldom scoped as part of any project, and when it is, it’s rarely scoped sufficiently to do the documentation justice.

And why is hunting down outdated documentation particularly important right at this point in time? Because the world is changing, and we are unleashing AI on our systems, and it’s pulling recommendations from documentation we wrote 5 years ago and never updated or removed. I want to work in an organization that is AI-ready, and unfortunately that means a certain amount of legwork, working through all the different boring bits of source material that AI is going to use to create its responses.

There’s a huge amount of buzz about Artificial Intelligence at the moment, and rightly so. But there’s not enough discussion about the groundwork that’s needed for good results. Put simply, the businesses that have prioritised structure, process and documentation will reap the benefits of that investment as they integrate AI into their workflows. The businesses that haven’t are the ones that will find AI giving them the strangest answers and will have the lowest level of trust in what AI is doing for them. Let’s face it, there are a lot of these businesses out there, and they are the most likely to jump in feet first to the AI world looking for a miracle that they simply haven’t earned yet.

So, closed? Temporarily. I’ve completed the key bits, but this is a ticket that is never really closed. If we want to get the best value from AI we need to make sure its source data is as accurate as possible, and that’s an ongoing process. Maybe with the right prompting it’s a task AI can eventually take over itself, but until then: garbage in, garbage out. Funny how working with data just keeps coming back to the same concepts, isn’t it?

Defending Against SQL Injection Attacks: Best Practices for SQL Server

It’s 2024, and SQL injection is still a thing. It’s hard to believe that people aren’t aware of the possibility of a SQL injection attack in the current environment, but here’s a thought: it’s going to get worse. Why do I say that? Because instead of having a sensible developer with 30 years of dealing with idiots attacking their code and trying to make it do things it’s not meant to, you are going to have code going into production written by AI responding to a prompt like “Hey ChatGPT, how do I write a form that people can send me comments about their order through?” **SIGH** Let’s take a step back.

Imagine leaving your front door wide open, with a sign saying, “Welcome, come in!” That’s what a poorly secured SQL Server feels like to a malicious actor exploiting SQL injection vulnerabilities. SQL injection is a technique where attackers inject malicious SQL code into input fields, tricking the database into executing commands it shouldn’t. It’s one of the most common and devastating attack methods, and as a SQL Server DBA, it’s your job to ensure the door stays locked.

SQL injection doesn’t discriminate—it targets any system that relies on improperly sanitized user inputs. Picture a login form that asks for a username and password. If the application inserts these inputs directly into a SQL query without validation, an attacker could manipulate the input to bypass authentication, exfiltrate data, or even gain administrative control. For example, entering ' OR 1=1 -- as a password might turn the SQL statement into something always true, effectively allowing the attacker in without knowing any credentials.

As I mentioned above, we are going to get an increase in the number of inexperienced people putting code into the world. We simply can’t rely on the frontend being secure. It may be, and that would be great, but if our SQL Server is being breached because someone wrote a poor app, it’s as much the DBA’s fault for not closing the door as it is the developer’s for opening it.

Defending against SQL injection requires more than locking down your SQL Server; it’s about creating multiple layers of defense, much like securing a fortress. The first line of defense is always input validation and parameterized queries. Instead of allowing raw input to dictate how your queries are constructed, parameterized queries use placeholders for inputs, ensuring the database treats them as data rather than executable code. Think of it as giving someone a safe deposit box key that only works for one specific box, rather than granting them access to the entire vault. We can look for that as DBAs, and we can talk to our developers about the queries they are writing.
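
To make that concrete, here is a hedged sketch of the difference, with the table and column names invented for illustration. The first statement concatenates input and is injectable; the second passes the same input through sp_executesql as a parameter, so it can only ever be data:

-- UNSAFE: input is concatenated straight into the statement, so the
-- classic ' OR 1=1 -- payload rewrites the WHERE clause entirely.
DECLARE @UserName NVARCHAR(128) = N''' OR 1=1 --';
DECLARE @sql NVARCHAR(MAX) =
    N'SELECT * FROM dbo.AppUsers WHERE UserName = ''' + @UserName + N'''';
EXEC (@sql);

-- SAFER: with a parameter placeholder, the payload is just an odd string.
EXEC sp_executesql
    N'SELECT * FROM dbo.AppUsers WHERE UserName = @UserName',
    N'@UserName NVARCHAR(128)',
    @UserName = @UserName;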

Moving on from parameterized queries, we can enforce input via stored procedures, and now the code sits in the DBA realm. By encapsulating SQL code in a controlled environment, you reduce the risk of input tampering. However, don’t fall into the trap of thinking all stored procedures are inherently safe: if you dynamically construct SQL strings within a stored procedure, the risk of injection remains (see the sketch below). And you want to know when stored procedures are failing. If you write one in such a way that it cannot fail so long as the input is valid, then when it does fail, what does that tell you?
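
Here’s that trap sketched as a stored procedure (procedure and table names are hypothetical). Wrapping concatenation in a procedure doesn’t make it safe; static SQL inside the procedure does:

-- UNSAFE: dynamic SQL built by concatenation is injectable even
-- though it lives inside a stored procedure.
CREATE PROCEDURE dbo.usp_SearchOrders_Unsafe @Customer NVARCHAR(100)
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX) =
        N'SELECT * FROM dbo.Orders WHERE CustomerName = ''' + @Customer + N'''';
    EXEC (@sql);
END;
GO

-- SAFER: static SQL; the parameter is never treated as code.
CREATE PROCEDURE dbo.usp_SearchOrders @Customer NVARCHAR(100)
AS
BEGIN
    SELECT * FROM dbo.Orders WHERE CustomerName = @Customer;
END;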

Another crucial defense mechanism is the principle of least privilege. Every application user doesn’t need administrative access to the database. Instead, grant permissions based on necessity. If a user only needs to read data, don’t let them modify it. This limits the potential damage of a successful SQL injection attack. What’s more, SQL Server has a wonderful feature called roles that everybody uses like this: you need read access? I’ll give you db_datareader. You need write access? I’ll give you db_datawriter. Oh, you need both? I’ll give you db_owner. It’s crazy. SQL Server access is so much more granular than that. If you are exposing your database to a web app, what access does that app actually need? Oh, it only writes to one table? Then give it access to just that table. I know it’s more work and it’s fiddlier, but it protects your data. So do it.
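
In practice the granular version can be as simple as this sketch (login, user, and table names are placeholders):

-- Create a user for the web app and grant only what it actually uses.
CREATE USER WebAppUser FOR LOGIN WebAppLogin;

GRANT INSERT ON dbo.OrderComments TO WebAppUser;
GRANT SELECT ON dbo.Orders TO WebAppUser;
-- No UPDATE, no DELETE, no access to anything else in the database.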

Encryption also plays a role. While it won’t stop an injection attack, encrypting sensitive data—like passwords—adds another barrier. Even if attackers gain access, encrypted data is far less useful without the decryption key.

Finally, don’t underestimate the power of monitoring and auditing. Tools like SQL Server Audit can track login attempts and query execution, providing early warning signs of unusual activity. Regular penetration testing is also invaluable; it’s like hiring a locksmith to find weaknesses in your security setup before a burglar does.

SQL injection is a persistent threat, and it’s probably going to get worse rather than better, but it’s not an inevitable one. By treating your SQL Server like a fortress and layering your defenses, you can ensure that even the most determined attacker finds nothing but locked doors. After all, security isn’t just about keeping bad actors out; it’s about building trust with those who rely on you to safeguard their data. And in this case, trust is the most valuable currency of all.

TSQL Tuesday – Tis the Season to Say Thanks… to ourselves?

This month’s TSQL Tuesday is brought to us by Kay Sauter, who asks us to have a think about who needs a thank you thrown their way. As I wind down to go on leave for the year, I’m reflecting on what a crazy year 2023 has been. I wrote this a week ago, and then I got covid, which gave me even more time to mull things over, so I’m going to completely rewrite it and thank someone different!

First thanks obviously goes to family and friends who put up with me. To my wife, who rings me and reminds me that I stopped getting paid at 5 and maybe it’s time to come home. To my girls, for listening to me recite the latest episode of Darknet Diaries or prattle on about why their school grades are stupid and ours were so much better. “Daddy, I got an E” still sounds like a bad thing to me. I will never accept it is the best grade there is. How am I meant to index things in my head if they aren’t alphabetical? We don’t live in Hogwarts!

Oh… and I should mention a quick thanks to my blog readers for holding back on the flames when I take off on a tangent.

Secondly, I want to throw out a thank you to some people who really don’t get thanked enough: ourselves. I’m talking about all the people who collect, guard and use data. Every now and then I stop and think about all the amazing things that we do with data and how very lucky we are to be right on the edge of so many of the exciting changes happening in the world.

As SQL professionals we are doubly lucky because not only do we have a great product to work with, we have an amazing community to be a part of. I had the fantastic opportunity of travelling to PASS Summit in Seattle last month and was reminded what a special group of people form this community. I want to call out all those who lead sessions, not just at PASS but at SQL and data events everywhere. We deal with a lot of complex topics, and sometimes it just takes the right presentation of something and it all clicks into place. For me that usually comes with a comparison, or a story about when something was used in the wild, and those come from blogs, SQL events, or catching up with people in the industry for a beer. I want to extend that community thanks to the folks who have made products, training and code available. Imagine being a DBA without the First Responder Kit, dbatools, Ola’s maintenance scripts or sp_whoisactive! Imagine if you had to solve every problem from scratch instead of pasting an error message into Google and being led straight to someone’s blog where you can read about the why and the how. We are part of an amazing community and I want to take this chance to thank all the SQL heroes: past, present and future.

Encrypting SQL Server Connections with SSL/TLS: A Practical Guide for DBAs

As a SQL Server Database Administrator (DBA), ensuring the security of data in transit is a critical responsibility, but messing around with server certificates and encryption is not something we are always familiar with. In this article it’s time to figure it out, because SSL/TLS (Secure Sockets Layer/Transport Layer Security) encryption safeguards against threats such as man-in-the-middle attacks, ensuring data integrity and confidentiality.

Why Encrypting Connections Matters

Unencrypted connections are a liability. Data transmitted over the network can be intercepted and exploited by malicious actors, leading to data breaches and compliance violations. DBAs often assume that protecting the data starts with strong access protection, but we need to remember that data can become exposed after it’s left the fortress of the database server, and we can’t rely on the application to handle that.

Encryption adds a protective layer, ensuring that even if data is intercepted between the database server and the client application, it remains unintelligible without the decryption key. I’ve been talking about SQL encryption for years now, but it is actually the move to compliance that has started getting people to pay attention. Going beyond the security benefit, encryption is often a compliance requirement for standards like GDPR, HIPAA, and PCI DSS.

Steps to Configure SSL/TLS Encryption in SQL Server

Configuring encryption for SQL Server involves several steps. Here’s a streamlined approach to get you started:

1. Obtain and Install an SSL/TLS Certificate

  • Acquire a certificate from a trusted Certificate Authority (CA) or create a self-signed certificate for internal testing purposes.
  • Ensure the certificate includes the SQL Server’s hostname and has a private key.
  • Install the certificate on the server hosting SQL Server by importing it into the Windows certificate store under the “Personal” folder for the local computer account.

2. Configure SQL Server to Use the Certificate

  • Open the SQL Server Configuration Manager.
  • Navigate to the SQL Server Network Configuration -> Protocols for [Your Instance].
  • Right-click “Protocols for [Your Instance]” and select Properties.
  • Under the Certificate tab, choose the installed certificate from the dropdown.
  • Set Force Encryption to “Yes” under the Flags tab if you want to mandate encryption for all connections. This is not always possible, so make sure all your applications support it. Additionally, you need to consider that a connection attempt that fails for lack of encryption is not a failed login. That means it doesn’t show up in your SQL Server event log; you need another way of detecting it (hopefully not people yelling at you that things don’t work).

3. Configure Client Connections

  • Ensure client applications are configured to use encryption.
  • For SQL Server Management Studio (SSMS), enable the “Encrypt connection” option in the connection settings.
  • If using custom applications, confirm the connection strings include the Encrypt=True and TrustServerCertificate=False parameters for proper validation.

4. Verify Encryption

  • Use tools like Network Monitor or Wireshark to confirm encrypted traffic between the server and clients.
  • Alternatively, query the sys.dm_exec_connections DMV in SQL Server and check the encrypt_option column to verify active encrypted connections (see the sketch just after this list).
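
A quick check along those lines:

-- Which current connections are actually encrypted?
SELECT session_id, encrypt_option, auth_scheme, client_net_address
FROM sys.dm_exec_connections
ORDER BY session_id;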

Best Practices for SSL/TLS Encryption

  1. Use Strong Certificates: Ensure certificates use modern encryption algorithms like SHA-256. Avoid deprecated protocols like SSL 3.0 and weak cipher suites.
  2. Regularly Update Certificates: Monitor expiration dates and replace certificates before they expire to prevent service disruptions.
  3. Test in a Non-Production Environment: Validate your encryption setup in a staging environment to identify and resolve potential issues before rolling it out to production. When it does come time to roll into production, use a multi-step process. First add the certificate and update your connection strings. Then monitor the sys.dm_exec_connections DMV to ensure that all connections are being encrypted before flicking the switch and requiring encrypted connections. This way you can identify and update any connection strings that you missed earlier, and correct the issue, rather than just causing them to fail.
  4. Monitor and Audit Connections: Continuously monitor SQL Server logs and use auditing tools to ensure encrypted connections are consistently enforced.

It turns out that securing SQL Server connections with SSL/TLS encryption is a pretty straightforward task for any DBA. As with most things SQL, the technical implementation is not the tricky bit. Knowing where your traffic is coming from and which connection strings need updating is the harder part of the process.