A while back I was testing access to SQL Server using a defined port number.
To do that, you have to access SQL Server Configuration Manager to specify the port number, or at least I think you do (If you don’t have to do it this way, please, oh please, let me know!).
So, since my laptop is running Windows 10, I open up the Start menu, type in “config” and…nothing! No SQL Server Configuration Manager! The closest I got was the configuration manager for Reporting Server.
I’m shocked by this especially because when I type it into the Start menu now, I get…
Well it turns out that, for me, the default path is now in…
I say “<nn>” because it’s a number dependent on what version of SQL Server you are running.
For example, I have a 2012, a 2014 and a 2016 version on my laptop so I have 3 versions of SQL Server Configuration Manager.
Seriously, is that it?
Nope, that ain’t it.
Opening up Windows Explorer, going all the way down to that level just to get the config manager? Ain’t nobody got time for that!
They say “imitation is the sincerest form of flattery”. So, taking ideas garnered from dbatools, and fresh from a PowerShell pre-con at SQL Saturday Boston from Mike Fal ( b | t ) (thank you, Andy Mallon ( b | t ), for pushing me out of my comfort zone, even if it was only to sign up!), I’ve created a little PowerShell script to let me choose and open a SQL Server Configuration Manager.
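The trick behind the script is that each Configuration Manager is just an .msc snap-in sitting on disk. A minimal sketch of the idea (the SysWOW64 path is what I found on my machine; your path and version numbers may vary):

```powershell
# Find every SQL Server Configuration Manager snap-in on this machine
$managers = Get-ChildItem -Path 'C:\Windows\SysWOW64' -Filter 'SQLServerManager*.msc'

# List them with an index so we can pick one
$i = 0
$managers | ForEach-Object { '{0}: {1}' -f $i++, $_.Name }

# Prompt for a choice and open the selected snap-in
$choice = Read-Host 'Which Configuration Manager do you want to open?'
Start-Process -FilePath $managers[[int]$choice].FullName
```

No more trawling through Windows Explorer for me.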
LIKE a function… now the song is stuck in your head!
… I’m not sorry…
Ever heard of “Osmosis”? You know, the…
process of gradual or unconscious assimilation of ideas, knowledge, etc.
For the longest time, that was how I thought people learned in SQL Server. You hang around a technology for long enough and the information about it slowly trickles into your brain.
I would hazard that the vast majority of people learn like this. They work with something long enough and slowly they develop, if not a mastery of the subject, then a familiarity with it.
That’s how I learned Transact-SQL anyway.
Working in a help desk, trouble-shooting stored procedures and ad hoc statements; cranking out reports left, right and center, slowly absorbing the differences between INNER, LEFT, RIGHT, and FULL joins. Realizing that there is a vast difference between excluding results with a WHERE clause and with a HAVING clause.
Ahh good times!
However, now I’m in the mindset that if you really want to learn something then study it; purposefully and deliberately.
And with all the new features being released for SQL Server 2016, you would be amazed at what I can learn about features that were released in SQL Server 2008.
So here are some little-known facts I learned about LIKE.
Safe to say, that we’ve all used LIKE, we’ve all seen LIKE, and we’re probably all going to continue to use LIKE.
But are we using it to the best of our ability?
So let’s test out this bad boy using the WideWorldImporters database, see if we can find everyone with the first name of Leyla.
Simple right? And because [Sales].[Customers] uses the full name, we have to use LIKE.
SELECT CustomerID, CustomerName
FROM Sales.Customers
WHERE CustomerName LIKE 'Leyla%';
Now a developer comes along and says “Wait a second, my sister is Leila”. So we try to cheat and add a wildcard in there.
SELECT CustomerID, CustomerName
FROM Sales.Customers
WHERE CustomerName LIKE 'le%a%';
Leonardo!? Well I suppose he does count in this situation, but there’s 2 characters between the ‘e’ and the ‘a’ and I only wanted one.
Well, you can match exactly one character with LIKE by using the underscore (‘_’) wildcard, so let’s try that.
SELECT CustomerID, CustomerName
FROM Sales.Customers
WHERE CustomerName LIKE 'Le_la%';
Yes, I cheated and inserted that extra name ‘Lejla’.
Call it Poetic Licence, but I only used it to show that we still have options if these are not the results that we want. We are only interested in ‘Leyla’ and ‘Leila’.
‘Lejla’, while a lovely name I’m sure, is not what we require right this second. So what are we to do?
Well, did you know that LIKE supports ranges as well? What’s range got to do with it? Well, what happens if we only put in a range of those 2 characters?
SELECT CustomerID, CustomerName
FROM Sales.Customers
WHERE CustomerName LIKE 'Le[iy]la%';
There we go! Only the results that we want and none of that Lejla, Leonardo business.
Now you could argue with me (I encourage it actually. How else am I to learn?) and say that you would never do it this way. That it is much easier to do something along the lines of this:
SELECT CustomerID, CustomerName
FROM Sales.Customers
WHERE CustomerName LIKE 'Leyla%'
OR CustomerName LIKE 'Leila%';
I’ll admit that the above reads a lot easier, but it doesn’t scale very well. What happens if we want to include the Leala, Lebla, Lecla,…all the way to Lenla’s? Are you going to write out 15 different clauses, one for each different character?
SELECT CustomerID, CustomerName
FROM Sales.Customers
WHERE CustomerName LIKE 'Leyla%'
OR CustomerName LIKE 'Leila%'
OR CustomerName LIKE 'Leala%'
--...and so on, one OR clause per character...
OR CustomerName LIKE 'Lenla%';
Or are you going to go back to ranges and do a clean, efficient, single range?
SELECT CustomerID, CustomerName
FROM Sales.Customers
WHERE CustomerName LIKE 'Le[a-ny]la%';
Now I’d argue that that is a lot more readable than an endless list of OR clauses tacked on to the end of a script.
There is a lot more you can do with the LIKE operator. Just because you may encounter this little guy every day does not mean that you know it.
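For example, ranges can also be negated with a caret. A quick sketch against the same table, if we wanted everything except the ‘Lejla’s:

```sql
SELECT CustomerID, CustomerName
FROM Sales.Customers
WHERE CustomerName LIKE 'Le[^j]la%'; -- any single character in that position except 'j'
```

One character in the brackets does the work of an endless list of AND NOT clauses.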
Read this and don’t bother me for permissions again…I’m talking to the user, not you.
Matt Gordon ( b | t ) has the unfortunate luck of hosting this month’s T-SQL Tuesday. I say “unfortunate” because I don’t think many of the other halves out there are going to be happy with the last-minute writing and checking that bloggers are going to do for this.
He makes up for it though by asking us to think of the positive aspects of SQL Server. What old problems can we fix with the shiny new toys available to us since SQL Server 2014+.
Have you ever had to give someone uniform access to all the tables in all the databases in your instance of SQL Server, prior to SQL Server 2014?
Now, seeing as this is a blog post designed to highlight the benefits of the new way and shine a light on the ugliness of the old way, I’m going to show you how I would do this with the GUI.
Is it GUI or ewwee?
Here we have a lovely little 2014 Developer edition of SQL Server, with a couple of random databases.
We need to create a LOGIN for this instance so that we can grant the needed permission to it. So opening the “Security” node on the instance and right-clicking the “Logins” node gives us a nice, little option called “New Login…”. Click that!
Great! Step 1 of the multi-step process complete! Now for the first database which in our case is “DMODisabled”.
A LOGIN needs a USER, right? Well let’s just open up the nodes “Databases | DMODisabled | Security” and right-click “Users”. Now let’s create a “New User…”
We have a choice now.
We could connect to every table, one-by-one, and grant SELECT permissions to our User. Or we could be a bit more clever and grant SELECT permissions on the different schemas themselves.
It’s up to you to be honest. We have to ask ourselves the question though:
What happens if a new database, table, or schema is created?
Answer: We have to do this again for them.
Also remember all this was for one single database! By my count, I still have four more databases to do after this.
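For what it’s worth, the per-schema route can at least be scripted per database instead of clicked through. A sketch, using the first database from this post and the login name I create later on:

```sql
USE DMODisabled;
GO

-- Create a database user for our login...
CREATE USER [GiveMeALLTheData] FOR LOGIN [GiveMeALLTheData];
GO

-- ...and grant SELECT on each schema, one GRANT at a time
GRANT SELECT ON SCHEMA::dbo TO [GiveMeALLTheData];
```

Still one script per database, though, and one GRANT per schema.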
I’m going to do none of those things because it’s already tedious, I’m not the fondest of the GUI, and it is making the urge to add coffee to my coffee increase.
Scripting ALL the things:
I’m not sure how you would script this out, but for me it involved cursors. Cursors that spanned all the databases and all the tables/schemas in each database. Not the best of solutions either, because if any new databases, tables, or schemas got created after it was run, and the need to connect to those was there as well, then you had to run the damn thing again!
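Something along these lines, sketched here with dynamic SQL standing in for my original cursors (the login name is the one used later in this post):

```sql
DECLARE @sql nvarchar(max) = N'';

-- Build a CREATE USER statement for every online user database
SELECT @sql += N'USE ' + QUOTENAME(d.name) + N';
CREATE USER [GiveMeALLTheData] FOR LOGIN [GiveMeALLTheData];
'
FROM sys.databases AS d
WHERE d.database_id > 4            -- skip the system databases
  AND d.state_desc = N'ONLINE';

EXECUTE sys.sp_executesql @sql;

-- A similar loop over sys.schemas inside each database would then
-- generate the per-schema GRANT SELECT statements.
```

And the moment a new database appears, the whole thing has to run again.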
Much like Bonnie Tyler, you need a hero! Well I present to you two heroes!
CONNECT ANY DATABASE, and
SELECT ALL USER SECURABLES.
Check it out:
Now I’ve dropped my User and Login so I can show you the magic of these guys.
Let’s create the Login again, T-SQL-style:
CREATE LOGIN [GiveMeALLTheData]
WITH PASSWORD = 'UpUpDownDownLeftRightLeftRightBAStart'
, CHECK_POLICY = OFF;
Now what server permission does it have?
SELECT sp.name,
    perm.permission_name,
    perm.state_desc
FROM sys.server_principals AS sp
INNER JOIN sys.server_permissions AS perm
ON perm.grantee_principal_id = sp.principal_id
WHERE sp.name = 'GiveMeALLTheData';
Where’s our table at?
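Connected as our new login, we try to query one of the tables (the table name here is a hypothetical stand-in; pick any table in DMODisabled):

```sql
SELECT TOP (10) *
FROM DMODisabled.dbo.SomeTable; -- SomeTable stands in for any table in the database
```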
Can’t access the databases huh? We’ll see about that. Enter our first hero!
GRANT CONNECT ANY DATABASE TO [GiveMeALLTheData];
Let’s re-run our SELECT statement again and…
The table tries to stop us? Too bad. Our second hero steps up to the plate.
GRANT SELECT ALL USER SECURABLES TO [GiveMeALLTheData];
You can now access any database on your instance. Go ahead, give it a try out and see, but you may ask yourself:
Wait, what about new databases, new schemas, or new tables, like you said?
Fear not! Any new databases have their permissions automatically added, new schemas have their permissions automatically added, and as for new tables: you’d better believe that they have their permissions automatically added.
Much better than a script that is needed to be run repeatedly, right?
Keep the shiny new toys coming I say! 🙂
Warning: This is barely SQL Server/PowerShell related. It is Microsoft product related if that helps?
In the beginning…
One thing that I am starting to appreciate more and more as time goes by is Documentation.
If you’ve ever worked on a system with no documentation in place, then you know of the frustration, heartbreak, and rage that this instills when you’re trying to get to know the system, or even trying to troubleshoot something on that system.
But even the act of writing documentation can have some benefits. I’d argue that you would be hard-pressed to find something that forces you to learn about a topic more than trying to write clear and concise documentation for someone else.
What’s your problem?…
In a strange side-effect of trying to become more responsible as a DBA, I’ve actually inherited a slight case of obsessiveness about certain things.
Words need to be spelled correctly, uppercase words need to be in uppercase, and with regard to this post…
I DON’T WANT A SPELL-CHECK ERROR FOR USING A PASSIVE VOICE!!!
It’s documentation. Now I can understand that, depending on your work environment, you can write in a more upbeat and active way. But for documentation, I don’t see anything wrong with this sentence:
Provision one (1) domain account, with no privileges, per service that will be run.
I’m fully expecting the paper clip guy to appear any second and start spouting “Looks like you’re writing in the Passive Voice…”
Calm yourself Word…
When this blue squiggly line (it is blue, right? I’m slightly colourblind) started to pop up everywhere on my document, and after checking things three times, I figured enough is enough and went about turning this off.
Here’s how I did it so that you can too!
Now, I’m lucky enough that work has MS Word 2016, so if you have an older version, YMMV. If you don’t have any version of Word, then your mileage will vary!
Go to “File | Options | Proofing” and scroll down to the section marked “When correcting spelling and grammar in Word”.
Click the “Settings…” button beside the “Writing Style:” dropdown box (set to “Grammar and Style”). This should open up the following window.
Uncheck the “Passive sentences” checkbox and click “OK”. Click “OK” on the “Proofing” window as well and you should get back to your main Word screen.
And here is where the magic happens. Our blue (purple?) squiggly lines have disappeared!
Wrap it up…
You could just click ignore when spell-checking, but I tried that. It ignores it once, and then the moment it spell-checks again, it picks up these “errors” again. If there is one thing worse than errors, it’s repeating errors.
Plus, isn’t that part of what we, as DBAs, strive for?
We’re not content with just the quick fix, but want to identify and correct the underlying problem.
Now the reason that this is the topic of this month’s T-SQL Tuesday is nicely spelled out by the host himself:
Now, more than ever, Microsoft has started to respond to Connect requests and get ’em fixed not just in upcoming versions of SQL Server, but even in cumulative updates for existing versions.
If you’ve kept an ear to the ground and an eye out on social media (seriously, get on Twitter if you haven’t already; with #sqlhelp and #sqlfamily, you’ll never be alone, stuck, and/or bored again), you’d realise that this is true. Microsoft has engaged more than ever with its audiences, from hiring a PowerShell developer to actively engaging in various Trello channels. So much so that a twitterbot was created by MVP Chrissy LeMaire ( blog | tweets ) to keep track of Connect items closed as fixed (Closed As Fixed).
Now, I happen to have a Connect item and I’m going to link to it here. I can do this as this blog shares a commonality with Lesley Gore’s birthday party song. (It’s mine and I can cry if I want to)
However, this T-SQL Tuesday topic has a secondary agenda of getting these Connect items up-voted if we agreed with them. So with that in mind, and seeing how I already had a Connect item open dealing with Views, I took a look at other Connect items around Views to see what else there was and up-voted what I could.
That’s not contributing enough, I hear you say? (Yes, I’ve already heard the “you’re hearing voices” joke, and yes it is slightly funny). Well that’s fine, because I’m going to highlight and link the Connects I found below for you guys to decide.
So go on! Pretend you’re the Emperor of Rome, a Julius/Julia CTE-aeser, determining the fate of these gladiatorial Connect items. Do they get the thumbs up? Or the thumbs down?
Battle of the Bugs:
First of all, let’s see what we’re running on here…
What would happen if I told you that, with regards to a view, sometimes inserting into the table could fail? Well that’s what this Connect item from Dave_Ballantyne found, along with the reason.
CREATE TABLE myTable
(
    Id integer NOT NULL,
    InView char(1) NOT NULL,
    SomeData varchar(255) NOT NULL
);
GO
CREATE VIEW vwIxView
WITH SCHEMABINDING
AS
SELECT Id,
    InView,
    SomeData,
    LEFT(SomeData, CHARINDEX('x', SomeData) - 1) AS leftfromx
FROM dbo.myTable
WHERE InView = 'Y'; -- filter reconstructed from the column name; the original item filters the view
GO
CREATE UNIQUE CLUSTERED INDEX pkvwIxView ON dbo.vwIxView (Id);
Now that we have the groundwork laid out, it’s time for our insert attempt:
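The insert itself isn’t reproduced here, but a minimal attempt along these lines triggers the failure (the values are mine, for illustration). With no ‘x’ in SomeData, CHARINDEX returns 0, LEFT gets a length of -1, and the statement dies with “Invalid length parameter passed to the LEFT or SUBSTRING function”:

```sql
-- This row would be filtered out of the view, yet the insert still fails
INSERT INTO dbo.myTable (Id, InView, SomeData)
VALUES (1, 'N', 'no letter ex in here at all');
```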
He’s even gone so far as to give an explanation as to why this happens!
This is due to the compute scalar being executed prior to the filter.
Taking a look at the estimated execution plan, we can see that he’s right!
Imagine trying to troubleshoot what was causing the inserts to fail? Horrible! I can imagine tearing my hair out at that one!
I have this down as “The Good” just for the fact that, not content to just report the bug, Dave_Ballantyne even went so far as to find the possible cause. Now it’s just up to Microsoft to get to the fixing…
CREATE TABLE test (A int);
GO
CREATE VIEW vtest WITH SCHEMABINDING
AS
SELECT t1.A, t2.A AS A2
FROM dbo.test AS t1
JOIN dbo.test AS t2 ON t1.A = t2.A;
The problem occurs when we try to add a unique clustered index on this bad boy:
CREATE UNIQUE CLUSTERED INDEX UCL_Test
ON dbo.vtest (A, A2);
Now, I originally put this down as “The Bad” because I thought that the issue could be down to trying to essentially index the same column twice, but that’s not the case…
CREATE TABLE dbo.B (col1 tinyint, col2 tinyint);
GO
CREATE VIEW dbo.SecondOne WITH SCHEMABINDING
AS
SELECT t1.col1, t2.col2
FROM dbo.B AS t1
JOIN dbo.B AS t2 ON t1.col1 = t2.col1;
GO
CREATE UNIQUE CLUSTERED INDEX UCL_test
ON dbo.SecondOne (col1, col2);
In the end, the reason that I have this in “The Bad” section is that I went to the documentation and read this part…
The SELECT statement in the view definition must not contain the following Transact-SQL elements: […] self-joins […]
Now it’s unknown whether this was there when this Connect item was created, but it’s there now (and I didn’t have enough time to re-plan this blog post)
…and The Bug-ly.
In my quest through the magical lands of Connect I stumbled across this little beauty of a bug by Anatoly V. Popov that I had to mention as “The Bug-ly”.
Yes, I know it’s the same as my one but dammit if I don’t think it’s a bug! That’s why it’s getting my “The Bug-ly” title.
This leaves me with a bit of a conundrum though…Do I close mine and comment on this one to get re-opened? Or do I try and continue to push on my one?
To be honest I don’t know which option to choose but, for anyone new to SQL Server, filing a Connect Item is an exciting but nerve-wracking time.
Exciting because you’ve worked with SQL Server enough to think you know how something should be done.
However, it’s nerve-wracking because you haven’t worked with SQL Server for long enough to know if you are just whining or have an actual item to be taken seriously.
Finding this Connect item was a small little shot of validation for me. Just a little something to say “Hey, I thought this too and I’m with you”.
It’s a great thing that Microsoft have started to respond to the Connect Items again and going through them you realise how different people are using SQL Server in ways that you didn’t even think to imagine.
So check out a few, leave your comment and up-vote/down-vote as you see fit.
Now I’ll admit that I didn’t have any plan to write a blog post before I got the challenge as I’ve been steadily getting busier with work/study/other commitments, so apologies if it’s a bit long, rambling, and not thought through fully.
Anyway, happy birthday Arun, here’s your blog post:
Understanding “Scan Counts 0, Logical Reads N”
Have you ever run SET STATISTICS IO ON; ?
I’ll confess that I do it a lot, especially when I am performance tuning. Yet, like most things in SQL Server, I don’t fully understand it…yet!
Now don’t get me wrong, the little that I understand is extremely helpful, but recently I had a case where I didn’t understand the output of STATISTICS IO , and asking my Senior DBA got me the look from him that I come to think of as ‘The You_Are_Either_Joking_Or_You_Are_Being_Stupid_Again Look’.
So to document my stupidity, here’s the question for that look.
How come the Logical Reads from STATISTICS IO are so high when it says Scan count is 0?
tl;dr – They are related but not exactly a 1:1 relationship. Plus a scan count of 0 does not mean that the object wasn’t used at all.
Let us just get this definition out of the way as it’s very short, sweet, and to the point. Logical Reads are the…
Number of pages read from the data cache.
Right, great, got’cha. Logical reads, 8kb pages, read from the data cache. If your STATISTICS IO reports logical reads 112 then you’ve read 112 pages from the cache. Simples!
This is the blighter that got me the look…well more like ‘my misunderstanding of what Scan Count means’ got me the look but it still holds my contempt at this moment in time.
My previous intuition about this guy was…
“Scan count is the number of hits the table/index received”
(THIS IS NOT CORRECT! …and why it isn’t down as a full quote).
Let’s check out the definition again and see what it has to say for itself.
Scan count is the…
Number of seeks/scans started after reaching the leaf level in any direction to retrieve all the values to construct the final dataset for the output.
That’s a very specific definition isn’t it? It’s not all of the definition though, there’s more! And it is this “more” that I want to focus on.
Testing The Definitions
First, things first, let us set up our query environment.
SET STATISTICS IO ON;
Scan count is 0 if the index used is a unique index or clustered index on a primary key and you are seeking for only one value. For example WHERE Primary_Key_Column = <value>.
Well let’s see about that!
CREATE TABLE dbo.Unique_DefinedUnique
(
    col1 INT NOT NULL PRIMARY KEY
);
GO
INSERT INTO dbo.Unique_DefinedUnique (col1)
SELECT x.n FROM (VALUES (1), (2), (3), (4), (5)) AS x(n);
GO
CREATE UNIQUE NONCLUSTERED INDEX uci_Unique_DefinedUnique_col1
ON dbo.Unique_DefinedUnique ( col1 );
GO
SELECT col1 FROM dbo.Unique_DefinedUnique WHERE col1 = 1;
Never mind, the next paragraph please!
Scan count is 1 when you are searching for one value using a non-unique clustered index which is defined on a non-primary key column. This is done to check for duplicate values for the key value that you are searching for. For example WHERE Clustered_Index_Key_Column = <value>.
I don’t believe you!
CREATE TABLE dbo.Unique_NotDefinedUnique
(
    col2 INT NOT NULL
);
GO
INSERT INTO dbo.Unique_NotDefinedUnique (col2) VALUES (1), (2), (3), (4), (5);
GO
CREATE NONCLUSTERED INDEX nci_Unique_NotDefinedUnique_col2
ON dbo.Unique_NotDefinedUnique ( col2 );
GO
SELECT col2 FROM dbo.Unique_NotDefinedUnique WHERE col2 = 1;
Scan count is N when N is the number of different seek/scan started towards the left or right side at the leaf level after locating a key value using the index key.
Hmmm, so if we have duplicate values, then this will happen?…
CREATE TABLE dbo.NotUnique
(
    col3 CHAR(1) NOT NULL
);
GO
INSERT INTO dbo.NotUnique (col3)
VALUES
    ('A'), ('A'), ('B'), ('B'), ('C'), ('C'), ('D'), ('D'), ('E'), ('E'),
    ('F'), ('F'), ('G'), ('G'), ('H'), ('H'), ('I'), ('I'), ('J'), ('J');
GO
CREATE NONCLUSTERED INDEX nci_NotUnique_col3
ON dbo.NotUnique ( col3 );
GO
--Let's try it with 2 and then 3!!!
SELECT col3 FROM dbo.NotUnique
WHERE col3 = 'A' OR col3 = 'B';

SELECT col3 FROM dbo.NotUnique
WHERE col3 = 'A' OR col3 = 'B' OR col3 = 'C';
Putting away my toys…
DROP TABLE dbo.Unique_DefinedUnique, dbo.Unique_NotDefinedUnique, dbo.NotUnique;
I was confused about the Scan count being 0 but logical reads not being 0. How can the scan count not actually scan/seek anything?
But it is scanning/seeking!
Read the Scan Count definition again…I’ll capitalize the words that I glossed over
Number of seeks/scans STARTED AFTER REACHING THE LEAF LEVEL in any direction to retrieve all the values to construct the final dataset for the output.
Scan count of 0 occurs when there is a unique index or clustered index on a primary key and you are seeking for only one value. The word of the day is…”Unique”.
So, because the unique index is looking for a unique value in a column guaranteed to be unique, it’s not so much that the query isn’t looking for a value; it’s more that once the query reaches the leaf level, it already knows that it’s on the value it needs!
Since it doesn’t look any more “after reaching the leaf level”, the scan count is allowed to be 0.
This explains why, if the value is unique but not guaranteed to be so (either the index, value, or column is not guaranteed unique) the query has to do 1 scan/seek to check that the next value isn’t what it wants.
Therefore, Scan Count will be 1…
And I’ll leave it as an exercise to figure out why Scan Count N is Scan Count N.
(Hint: it’s because Scan Count N)
I sometimes find the documentation dry and not terribly interesting but I don’t like the “You_Are_Either_Joking_Or_You_Are_Being_Stupid_Again” look more.
So read the documentation, study what you are doing, know why things do what they do…
I’m becoming more and more of a fan of PowerShell the more that I interact with it. And I’m a big fan of the work that those over at dbatools are doing (seriously, check it out and also check out their Slack channel).
So when I read an article by Steve Jones (b|t) that mentions using PowerShell, especially dbatools, it grabbed my immediate attention!
However, while reading the blog, what jumped out at me was the fact that dbatools copies the logins and the passwords. I think that’s epic and saves so much hassle, especially with migrations and environment creation.
But when you script out a login in SSMS, you get the following “warning comment” (and this comes straight from Microsoft) :
/* For security reasons the login is created disabled and with a random password. */
I understand why you don’t want to script out a login’s password for security reasons but disabling it…does that even do anything?
Something that I only recently learned is that, for logins with high privileges, disabling them is not enough; they must be removed.
Overkill, I hear you say?
Example, I retort!
I will admit that for my example to work, there needs to be help from a login that is a member of the securityadmin server role. So for this example we’re going to have…
A disabled sysadmin login,
A “compromised” securityadmin login,
An “attacking” low-privilege login.
Window 1 (High Permission Account):
-- Create a high privilege login (HPL)
CREATE LOGIN [AllThePower]
WITH PASSWORD = '1m5trong15wear!';
ALTER SERVER ROLE sysadmin
ADD MEMBER AllThePower;
-- Disable it.
ALTER LOGIN AllThePower DISABLE;
-- Create a "compromised" login
CREATE LOGIN Enabler
WITH PASSWORD = 'AlreadyHereButCompromised';
-- Make them a securityadmin so they can grant permissions
ALTER SERVER ROLE securityadmin
ADD MEMBER Enabler;
-- Create a low privilege login (LPL)
CREATE LOGIN CopyCat
WITH PASSWORD = 'NotAsStrongButDoesntMatter';
Now that we have all our actors created, we need to connect to the instance with all 3 accounts.
Simple as “Connect” -> “Database Engine” -> Change to SQL Auth. and put in the details above for who you want.
Window 2 (CopyCat):
First things first, check who we are and who can we see?
-- Who are we?
SELECT SUSER_NAME() AS LoginName,
    USER_NAME() AS UserName;
-- Who can we see?
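The check itself is just a query against sys.server_principals (by default, a low-privilege login can only see itself, sa, and a handful of built-in principals):

```sql
-- Who can we see?
SELECT name, type_desc, is_disabled
FROM sys.server_principals;
```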
Okay, so we can’t see it but we know that it’s there.
Let’s just cut to the chase and start “God-mode”
-- Can we get all the power
ALTER SERVER ROLE sysadmin
ADD MEMBER CopyCat;
Can we impersonate it?
-- Can we impersonate AllThePower
EXECUTE AS LOGIN = 'AllThePower';
SELECT SUSER_NAME() AS LoginName,
    USER_NAME() AS UserName;
Time to go to our compromised account:
Window 3 (Enabler):
Now, who are we and what can we see?
Notice that “Enabler” as part of securityadmin can see the disabled “AllThePower” login?
Great, we can see it, so let’s promote our CopyCat login!
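The promotion attempt from the Enabler connection would be:

```sql
-- Try to promote CopyCat to sysadmin
ALTER SERVER ROLE sysadmin
ADD MEMBER CopyCat;
```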
So even though we’re now a member of the securityadmin role, we still can’t promote our login!
I think you’d be safe in thinking that people would give up here, but we know from server_principals that “AllThePower” is there, even though it’s disabled!
So even though we don’t have the ability to promote our login, we do have something that we can do in our power.
-- Grant CopyCat permission to impersonate AllThePower
GRANT IMPERSONATE ON LOGIN::AllThePower TO CopyCat;
Window 2 (CopyCat):
Now can we impersonate our Disabled login?
And can we get all the power?
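Back in the CopyCat window, we put both questions to the test. The EXECUTE AS succeeds despite AllThePower being disabled, and then, wearing that sysadmin’s hat, the promotion goes through:

```sql
-- Impersonate the disabled sysadmin login
EXECUTE AS LOGIN = 'AllThePower';

-- And, as "AllThePower", promote our own login
ALTER SERVER ROLE sysadmin
ADD MEMBER CopyCat;
```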
Finally, we’ll revert our impersonation and see if we actually are sysadmin?
-- Go back
REVERT;
SELECT SUSER_NAME() AS LoginName,
    USER_NAME() AS UserName;
-- Are we superuser?
SELECT IS_SRVROLEMEMBER('sysadmin') AS AreWeSysAdmin;
And now I can do whatever I feel like, anything at all!
I’m a fan of removing high-permission accounts that are not needed, but I’ve never put into words why. Now I know why: disabling is just not enough and, for me, removing them is the option.
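So, with the experiment over, these logins get dropped rather than disabled:

```sql
-- Clean up: remove, don't just disable
DROP LOGIN [CopyCat];
DROP LOGIN [Enabler];
DROP LOGIN [AllThePower];
```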