Author Archives: Rob

About Rob

http://gumption.co.nz/about-me/

2022 is just around the corner – is SQL Server dead?

2021 has been a tough year, and one where I have heard SQL Server, delivered as an on-premises platform, referred to several times as a legacy technology. Nobody likes to hear the product they work on all day called old, so I thought I’d use this post to get my thoughts in order and have a clear response ready the next time it gets casually dropped into a conversation.

The argument for SQL Server being dead?

The argument goes something like this. NoSQL databases started becoming popular in the late 2000s, promoted on the promise of faster queries, massive scalability, and making the querying of vast pools of data a simpler exercise for developers. SQL databases were created last millennium with a focus on reducing duplication and adding structure to IT estates that were already beginning to sprawl – computing systems were in their infancy, and adoption was driven by immediate in-house needs rather than a centrally planned and implemented strategy. SQL databases solved a number of problems, but the primary one was data sprawl, or more specifically data being duplicated in multiple places, and the inefficiency and storage costs associated with that.

NoSQL databases rose to prominence at a different time in the IT world – a time when IT systems were already critical to the day-to-day jobs of most office staff and, importantly, when the cost of storing data itself had substantially decreased. This meant a shift in what data-driven applications were costing businesses. The cost was no longer so much the infrastructure and storage as the server license fees and the direct development cost of the application (i.e. the developers). Born in the age of DevOps, NoSQL developers were keen for a system that gave them the flexibility of not having to define structure upfront and then be locked into that structure for the life of the application. They wanted, in short, a database system that could be modified as often as the requirements they were fed. At the same time cloud computing was on the rise, giving developers the ability to distribute their code rapidly to multiple datacenters, housed in multiple regions, provided by multiple providers. The infrastructure now existed to make their applications more flexible, reliable and resilient, but a central database server became a huge bottleneck.

NoSQL addressed all of these problems. Typically a NoSQL database will have the following attributes:

  • Ease of use for developers
  • Flexible Schemas
  • Horizontal Scaling
  • Fast Queries due to the data model

In short, the argument goes, when you are working with NoSQL you are working with DATABASES 2.0 – the new and improved way of managing data; the evolution of databases; the purpose built solution for data management in this millennium. Ergo, NoSQL is better. SQL is Legacy. SQL Server is dead.

Technology changes

And that’s fair enough – NoSQL database systems were developed to meet a need that many application developers, and specifically web application developers, had. The initial versions copped a lot of flak about things like reliance on ‘eventual consistency’, not being ACID compliant, not supporting transactions, being insecure and losing data. Some of this was well founded; some of it appears to have been completely made up. In current NoSQL versions these concerns are all addressed and, to my mind, resolved (if they actually existed, that is – the data loss claim is hard to find any real evidence for). Where they remain it is as a design choice: transactions are optional, security is something that must be configured to meet requirements (in any product), and options like sharding come with pros and cons.

So NoSQL databases have evolved to meet the objections thrown at them. The mistake made by many NoSQL advocates is thinking that SQL databases haven’t also evolved. Here’s a handful of examples of SQL Server implementing NoSQL, or non-relational, features:

  • PolyBase – introduced in SQL Server 2016, PolyBase allows direct integration with NoSQL databases and Hadoop data lakes.
  • JSON support – from SQL Server 2016, JSON can be stored and queried within SQL Server.
  • XML support – this column type has been available since SQL Server 2005 and can serve a similar purpose to the JSON support.
  • Scalability – SQL Server has always offered a range of options here. Availability groups have evolved greatly and are now available in all paid SQL Server editions, and with Software Assurance, licensing for multiple replica servers is included as part of the primary server’s license. The only technology that offers multiple writable replicas, though, is replication – which is still hugely flexible and, it has to be said, hugely complicated.
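As a quick illustration of the JSON support mentioned above, here’s a minimal sketch using the built-in JSON functions. The document and names are invented for the example, not from any real system:

```sql
-- JSON lives in an ordinary NVARCHAR(MAX) value or column;
-- the built-in functions (SQL Server 2016+) query into it.
DECLARE @doc NVARCHAR(MAX) = N'{"name":"Rob","tags":["sql","nosql"]}';

-- ISJSON validates the document, JSON_VALUE extracts a scalar.
SELECT ISJSON(@doc)                AS IsValidJson,
       JSON_VALUE(@doc, '$.name') AS PersonName;

-- OPENJSON shreds an array into a relational rowset.
SELECT [value] AS Tag
FROM OPENJSON(@doc, '$.tags');
```

That’s schemaless data sitting happily inside a relational engine, queryable with plain T-SQL.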

On top of this, Microsoft as a company has focused most of its non-relational development effort on its cloud products, and on the integration between relational and non-relational cloud products. Non-relational cloud products from Microsoft include:

  • Azure Table Storage
  • Azure Blob Storage
  • Azure File Storage
  • Azure Cosmos DB

So even if NoSQL databases were becoming more popular, that doesn’t necessarily have much effect on Microsoft’s bottom line, as they have plenty of non-relational offerings both inside SQL Server itself and in the wider Microsoft family.

I say even if, because the evidence that non-relational databases are cutting into the relational database market share is flimsy at best. Rankings on DB-engines.com as at December 2021 show only one NoSQL database in the top 10 (MongoDB, at number 5).

Now let’s talk about Legacy

What’s interesting to me on that ranking is that MS Access is sitting in the top 10. If you want to talk about legacy systems, it’s mind-boggling that MS Access is still considered one of the top 10 databases in the world. Why is that?

Well, the thing with these charts is that they aren’t about a direct feature-by-feature comparison of different database systems, or a fancy cost-benefit analysis of one against the other. They’re about who uses what. A lot of the noise about NoSQL databases was, and still is, coming from sectors which have to be very agile and dynamic – startups and greenfields developments, to tie it down further. In these scenarios you have an open choice of technologies and can pick the one best suited to your needs. Let’s accept all the arguments and say that NoSQL databases were always the right choice for a new data project – and that everything not developed in the past 12 months is ‘legacy’.

What do you think businesses are running on? It is a hugely expensive exercise to port the backend of an application from one database engine to another, let alone change the underlying assumptions of the database engine itself. I’ve had half a dozen calls this year from people looking to use SQL Server as a backend to Access front ends, and the performance results are horrible. It’s not just a case of cramming the data from one system into another – the underlying methodologies just don’t work. And that is between two ‘relational’ databases!

I don’t have personal experience with moving SQL data to MongoDB data, but I’ve heard some rather horrifying stories from those who have done it, or have had to pick up the pieces after it was done. I’m not saying it can’t or won’t be successful, I’m just saying that I’ve heard enough examples of where it wasn’t to think that the percentage option for a business is to invest in tweaking poor performance in a relational database engine rather than redeveloping it in a non-relational one (or in a different relational one for that matter).

So, even if the correct choice for every new development is a NoSQL database (hands down, it’s not, but I’m conceding that to make my point), the majority of businesses are not going to redevelop their currently functional IT systems, which means relational databases will continue to be the dominant player in data management for a long time to come.

This is mirrored by what I see today in my customer base. Servers become more powerful and can comfortably hold larger and larger numbers of databases, but while we consolidate systems as much as possible, new ones arise, and because SQL Server is already dominant in the client’s environment, new applications are often back-ended in SQL Server. Even the move to cloud computing hasn’t greatly changed this paradigm. Businesses have to support their applications, and this will often swing the choice in favor of technologies where they already have existing skillsets and/or support contracts.

So don’t be alarmed, SQL DBAs. There is plenty of life left in your skillset. Even more importantly, there are many exciting paths open to you to expand it.

Finding Your SSRS License Key on an Azure VM

A quick note today. I was installing an Azure VM with SQL Server, and the Reporting Services service is not part of the default install. You can still install Reporting Services, but you’ll either need to copy the installer onto your new VM or run the installation media, which can be found on the C drive:

Installation Media: C:\SQLServerFull

Either way you’ll hit a wall where it asks you for the installation key. This is stored in the x64 directory of the install media, in the DefaultSetup file:

Report Services Key Location: C:\SQLServerFull\x64

Just look for the PID under the options section, paste that into the installer, and you’re done. Hope that’s helpful.
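For reference, the relevant part of that file looks something like the fragment below. The key value here is obviously a placeholder, not a real key:

```ini
[OPTIONS]
PID="XXXXX-XXXXX-XXXXX-XXXXX-XXXXX"
```

Copy the value between the quotes into the product key field of the Reporting Services installer.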

SQL Saturday Auckland

Time to polish off the JAFA jokes for another year – SQL Saturday Auckland is just 2 days away. Check out the lineup at:

https://www.sqlsaturday.com/866/

This year I’ll be taking attendees over the types of encryption that are available within SQL Server and, more importantly, why they should care. If previous experience is anything to go by I don’t expect a huge turnout, as security sessions in general are not that popular at these types of events. That’s why I’ve pitched this session at absolute beginners to the world of protecting their SQL data, because I believe it’s critical that we all stop hoping that SQL security is someone else’s problem, and accept that we all need to be part of the security chain.

If you are in Auckland on Saturday, and your firm uses SQL Server to store your critical data, I really encourage you to come along to this free training event. I see Reza has posted that we are expecting up to 380 people to attend, making this the largest free IT training event in the country. Great job once again, #SQLSATAUCKLAND team.

SPAM, SPAM, SQL and SPAM

Because I keep having to rewrite my scripts every time I need a spammy procedure that just does something, and because 10 minutes is 10 minutes – here’s my spamroller.

First create the database and table:

CREATE DATABASE [SPAMSPAMEGGSANDSPAM]
GO
USE [SPAMSPAMEGGSANDSPAM]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[SpamSpamSpam](
	[spam_id] [int] IDENTITY(1,1) NOT NULL,
	[Time_Entered] [datetime] NOT NULL,
	[TypeOfSpam] [varchar](100) NOT NULL
) ON [PRIMARY]
GO

Then create the job to populate it and mess it up every 10 seconds.

USE [msdb]
GO

/****** Object:  Job [LovelySpam]    Script Date: 1/07/2019 2:45:57 PM ******/
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
/****** Object:  JobCategory [[Uncategorized (Local)]]    Script Date: 1/07/2019 2:45:58 PM ******/
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

END

DECLARE @jobId BINARY(16)
EXEC @ReturnCode =  msdb.dbo.sp_add_job @job_name=N'LovelySpam', 
		@enabled=1, 
		@notify_level_eventlog=0, 
		@notify_level_email=0, 
		@notify_level_netsend=0, 
		@notify_level_page=0, 
		@delete_level=0, 
		@description=N'To simulate a little activity on the database this job enters Spam into a table.  The job inserts a new row, updates a random existing one and reads everything in the table every 10 seconds.  Note for future reviewers - we may need to reseed the database in about 680 years.', 
		@category_name=N'[Uncategorized (Local)]', 
		@owner_login_name=N'sa', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object:  Step [Bacon Eggs and Spam]    Script Date: 1/07/2019 2:46:00 PM ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Bacon Eggs and Spam', 
		@step_id=1, 
		@cmdexec_success_code=0, 
		@on_success_action=3, 
		@on_success_step_id=0, 
		@on_fail_action=2, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N'TSQL', 
		@command=N'declare @start as DateTime;
declare @randomnumber as int = cast(rand() * 9000 AS INT)+999
declare @randomtext as varchar(100) = cast(@randomnumber as varchar(100))+''!''
set @start = getdate()


SET @randomtext=replace(@randomtext,''1'',''spam, '')
SET @randomtext=replace(@randomtext,''0'',''sausage, '')
SET @randomtext=replace(@randomtext,''2'',''spam, '')
SET @randomtext=replace(@randomtext,''3'',''baked beans, '')
SET @randomtext=replace(@randomtext,''4'',''spam, '')
SET @randomtext=replace(@randomtext,''5'',''spam, '')
SET @randomtext=replace(@randomtext,''6'',''eggs, '')
SET @randomtext=replace(@randomtext,''7'',''spam, '')
SET @randomtext=replace(@randomtext,''8'',''bacon, '')
SET @randomtext=replace(@randomtext,''9'',''spam, '')
SET @randomtext=replace(@randomtext,'', !'','' and spam.'')

insert into dbo.SpamSpamSpam(Time_Entered,TypeOfSpam) values(@start,@randomtext)
', 
		@database_name=N'SPAMSPAMEGGSANDSPAM', 
		@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object:  Step [Spam Spam and Eggs]    Script Date: 1/07/2019 2:46:18 PM ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Spam Spam and Eggs', 
		@step_id=2, 
		@cmdexec_success_code=0, 
		@on_success_action=3, 
		@on_success_step_id=0, 
		@on_fail_action=2, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N'TSQL', 
		@command=N'Select * from dbo.SpamSpamSpam', 
		@database_name=N'SPAMSPAMEGGSANDSPAM', 
		@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object:  Step [Spam Sausage Spam Spam and Eggs with Spam]    Script Date: 1/07/2019 2:46:18 PM ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Spam Sausage Spam Spam and Eggs with Spam', 
		@step_id=3, 
		@cmdexec_success_code=0, 
		@on_success_action=1, 
		@on_success_step_id=0, 
		@on_fail_action=2, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N'TSQL', 
		@command=N'--Update a random row
--Note - this step should get slower the bigger the table gets.  By design.

declare @start as DateTime;
declare @randomrow as int = cast(rand() * 9000 AS INT)+999
declare @randomtext as varchar(100) = cast(@randomrow as varchar(100))+''!''
set @start = getdate()


SET @randomtext=replace(@randomtext,''1'',''spam, '')
SET @randomtext=replace(@randomtext,''0'',''sausage, '')
SET @randomtext=replace(@randomtext,''2'',''spam, '')
SET @randomtext=replace(@randomtext,''3'',''baked beans, '')
SET @randomtext=replace(@randomtext,''4'',''spam, '')
SET @randomtext=replace(@randomtext,''5'',''spam, '')
SET @randomtext=replace(@randomtext,''6'',''eggs, '')
SET @randomtext=replace(@randomtext,''7'',''spam, '')
SET @randomtext=replace(@randomtext,''8'',''bacon, '')
SET @randomtext=replace(@randomtext,''9'',''spam, '')
SET @randomtext=replace(@randomtext,'', !'','' and spam.'')

select top 1 @randomrow=spam_id from dbo.SpamSpamSpam order by newid()

update dbo.SpamSpamSpam set TypeOfSpam = @randomtext, Time_Entered = @start where spam_id=@randomrow', 
		@database_name=N'SPAMSPAMEGGSANDSPAM', 
		@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobschedule @job_id=@jobId, @name=N'Run it heaps', 
		@enabled=1, 
		@freq_type=4, 
		@freq_interval=1, 
		@freq_subday_type=2, 
		@freq_subday_interval=10, 
		@freq_relative_interval=0, 
		@freq_recurrence_factor=0, 
		@active_start_date=20190701, 
		@active_end_date=99991231, 
		@active_start_time=0, 
		@active_end_time=235959, 
		@schedule_uid=N'ea695562-ea40-4536-8164-ea8c2afce7c4'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:
GO


Once you’ve run both bits of code you’ll be the proud owner of a rather boring agent job that runs every 10 seconds, adds a row to a table, reads the whole table back, and updates another random row. Add loops, more jobs and bigger fields depending on your purposes for randomly filling a table.

What’s the point? Well, in this particular use case I wanted to test the effect on statistics as rows changed. I’ve used the same type of procedure for measuring compression and testing isolation levels. It’s amazing how often I find myself rewriting this type of job just because I need to test against a database that pretends to have actual connections to it.
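If you’re running the job for the same statistics experiment, a query along these lines lets you watch the modifications accumulate between automatic statistics updates. It uses the real sys.dm_db_stats_properties DMF (available from SQL Server 2008 R2 SP2 onwards); the table name is the one from the script above:

```sql
-- Show, for each statistic on the spam table, when it was last updated
-- and how many modifications have accumulated since.
USE [SPAMSPAMEGGSANDSPAM];
GO
SELECT s.name                 AS StatName,
       sp.last_updated,
       sp.rows,
       sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID(N'dbo.SpamSpamSpam');
```

Run it a few times while the agent job is ticking over and you can see exactly when an auto-update resets the counter.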

Enjoy.

SQL Saturday South Island

I’m honored to be speaking at the next SQL Saturday in Christchurch on June 8th. Check out the link below:

https://www.sqlsaturday.com/831/eventhome.aspx

After taking most of last year off from speaking at these events, I’m looking forward to getting back into it, not to mention catching up with a lot of people I haven’t seen for a while. If you are in Christchurch that weekend be sure to come along.

Hackathon to help Support The Canterbury Muslim Community Trust

I wanted to make sure everyone was aware of this great initiative to support the Canterbury Muslim Community Trust. After the events of March, I think a lot of people are left wondering what they can do to show their support for those affected and to help out. Steve Knutson and Hamish Watson have done a great job of pulling this event together.

Unfortunately I am not going to be able to make it down to Christchurch for this weekend, but I do hope that the extra link helps:

https://techweek.co.nz/whats-on/2019/hackathon-to-help-support-the-canterbury-muslim-community-trust-cmct-460/

A Personal Weight Loss Milestone

2019 has gotten off to a very busy start on the work front, but on the personal front it has been more of the same. I’ve just been putting one foot in front of the other, and I’ve been doing it a lot. As a result I’ve dropped 10% of my body weight in the last 3 months.

Back in October 2018 I had a couple of health issues. The most obvious was a back injury which cost me a number of days off work and still troubles me today when I move the wrong way. Less obvious was a brief chest pain I experienced. My mother has suffered multiple heart attacks and my grandfather died from one, so while this episode was not a heart attack, it was scary. I knew I was overweight, and working in IT I live a lifestyle which is mostly sedentary.

But while I wasn’t moving much, things were certainly very busy at work. Quite simply, I felt that I didn’t have time to fit exercise into my life. I kept thinking “You can start getting fit when things aren’t so crazy” or “You can work on your health when this project is out of the way”. The only problem is, things are always crazy, and finishing one project just means another will take its place.

So I took some advice from Arnold Schwarzenegger, given when he was asked how he fits everything in: “Sleep faster”. I got a Fitbit and resolved to get up at 5 o’clock each morning and walk for a couple of hours. I still can’t run because it jars my back, but if I can walk, I walk. After a while of walking every morning, a few of those Fitbit challenges seemed within reach and I found myself walking in the evenings as well. It’s a great way to unwind, get exercise and fresh air, and as a nice side effect I have slept very soundly since I started doing this.

Looking back at my history, the first day I logged steps was November 21st, when I was 98.8kg. Today I dropped under 90kg for the first time, weighing in at 89.6kg. That means dropping a kg roughly every 9 days, which is a bit slower than the 1 per week I was targeting, but still something I’m pretty happy about.

Why is this on a blog which is mostly technical and work related? Quite simply, I’ve come to realize that everything starts with looking after yourself. I encourage everyone to challenge their beliefs about how much free time they have to fit exercise into their schedule. My perspective has been flipped on its head, and now everything else needs to fit around my walking, not the other way around.