PowerShell
I’ve talked before about ConvertTo-SqlSelect in a previous blog post, and I still use that function a lot! It would be more performant to strip the function away and use its code directly, but the function is too handy for the infrequent times I need it.
Another short piece of code that I use is more for formatting than anything else. You can populate a variable with an array of property names, and Select-Object can use this variable to return information.
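A minimal sketch of what that looks like (the property names here are just examples against Get-Process):

```powershell
# A reusable list of property names - example names for Get-Process output
$Properties = 'Name', 'Id', 'WorkingSet64'

# Select-Object happily accepts the variable in place of a literal list
$selected = Get-Process | Select-Object -Property $Properties -First 5
$selected
```

Keep the property list in one variable and every report that uses it stays consistent.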
Sitting down to write this post, I realise that I don’t have a lot of handy scripts. Either I re-write things constantly (that’s likely), or I don’t know enough yet (also likely). I should fix that.
A couple of days ago, I was running some unit tests across a piece of PowerShell code for work and a test was failing where I didn’t expect it to.
After realising that the issue was with how I thought TrimEnd works (versus how it actually works), I wondered if it was just me being a bit stupid.
So I put a poll up on Twitter, and I’m not alone! 60% of the people answering the poll had the wrong idea as well.
QUESTION TIME:
Without running it, what do you think this will return?
The vast majority of code that I have seen out in the wild passes a string as the inner portion of TrimEnd.
'Shanes_sqlserver'.TrimEnd('sqlserver')
The code works how I thought that it would, removing the “sqlserver” portion of the string at the end. Now, let’s try it again and remove the underscore as well.
'Shanes_sqlserver'.TrimEnd('_sqlserver')
See! Where has my “s” and “e” gone?!
Let’s look at the overload definitions for TrimEnd by running the code without the brackets after the method.
'Shanes_sqlserver'.TrimEnd
No overload definition takes a string; they either take a char or an array of chars. Is that what’s happening here?
# Takes an array of chars
'Shanes_sqlserver'.TrimEnd('_', 's', 'q', 'l', 'e', 'r', 'v')
# Turns a string into an array of chars
'Shanes_sqlserver'.TrimEnd('_sqlerv')
# Order doesn't matter either
'Shanes_sqlserver'.TrimEnd('vrelqs_')
A New Way of Thinking
So TrimEnd takes the characters that we provide inside the method and removes them from the end until it reaches the first non-matching character.
This explains why our first example, with TrimEnd('sqlserver'), removes everything up to the underscore.
'Shanes_sqlserver'.TrimEnd('sqlserver')
# -----^ First non-matching character (_)
However, when we include the underscore, the first non-matching character shuffles back.
'Shanes_sqlserver'.TrimEnd('_sqlserver')
# --^ First non-matching character (n)
Initial Problem
Now that we have a new understanding of how TrimEnd works, how can we remove the “_sqlserver” part of the string?
Split it in two.
'Shanes_sqlserver'.TrimEnd('sqlserver').TrimEnd('_')
# -----^ First non-matching character (_)
# ----^ First non-matching character after first TrimEnd (s)
This rewrite works for us since we have a defined character that acts as a stopping point. If no such character exists, then -replace may be our best option.
'Shanes_sqlserver' -replace '_sqlserver'
Always good to get a better understanding of PowerShell. If my tests catch more of these misunderstandings that I can learn from, then I’m OK with that!
This is a post that will not be educational, but it’s the latest encounter that I’ve had with containers, so it’s the most present in my mind. Hey, hopefully it brings a laugh to some people.
I’ve been looking into Kubernetes. I’ve not gotten very far with it, but I managed to set up a replica in Ubuntu WSL2 on my laptop.
Everything was well and good, apart from being unable to connect to the database from Azure Data Studio. But again, all good.
Fast forward a couple of days: I was trying to share my screen when my laptop slowed to a crawl, the fans got very loud, and performance just tanked.
Taking a look at the ol’ Task Manager, I saw a “vmmem” process taking a massive amount of memory. A quick Google search pointed to virtual machines as the culprit.
Here started what I can only describe as a Benny Hill sketch, where I tried to remove the pods only to have Kubernetes create them again!
Remove the pods – check for pods – the same amount created a few seconds ago! Argh!!!
Containers
Eventually, I dropped the pods and managed to get my laptop under control. Still wanting to have a SQL instance to work with, I managed to spin up a Docker container and have a developer instance of SQL 2019 up and running on my laptop.
Thankfully I know enough about containers to stop the instance when I don’t need it and only start it up again when I do.
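For anyone who wants the short version, spinning up and managing that kind of container looks something like this (the SA password is a placeholder; check Microsoft’s container registry for current image tags):

```powershell
# Pull and run a SQL Server 2019 developer-edition container
# (placeholder SA password - use your own)
docker run -e 'ACCEPT_EULA=Y' -e 'MSSQL_SA_PASSWORD=Example_Passw0rd!' `
    -p 1433:1433 --name sql2019 -d mcr.microsoft.com/mssql/server:2019-latest

# Stop it when not needed; start it again when it is
docker stop sql2019
docker start sql2019
```

No pods quietly resurrecting themselves in the background; it stays stopped when you stop it.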
It’s strange to think that the day has arrived where I fall back on my knowledge of containers as the familiar option! There’s a lesson in there somewhere: maybe build a backstop into my learning? Just enough knowledge to stop things when a situation goes wrong or goes too far.
I still intend to continue researching Kubernetes, but maybe I’ll deepen my knowledge on Containers in the meantime.
It’s been a busy month for me so there’s not a lot of outside work research that has been going on.
That being said, there has been quite a gap since I wrote a blog post, so I figured that I might as well write something.
So what do I have to write about?
SELECT Statements
There are times when I want to mess about with data in SQL Server, data that I have obtained in PowerShell. This will require some way to get the information from PowerShell into SQL Server.
I know of a few ways to do this.
dbatools
There is the dbatools module and the Write-DbaDbTableData function.
Get-Help -Name Write-DbaDbTableData -Full
If I wanted to write the properties of 50 modules from PSGallery into SQL Server, I can use the function handy enough.
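A sketch of how that might look (the instance and database names are placeholders; Find-Module does the PSGallery legwork):

```powershell
# Grab 50 modules from the PSGallery and write them to a SQL Server table;
# -AutoCreateTable tells dbatools to create the table if it doesn't exist
$modules = Find-Module -Repository PSGallery |
    Select-Object -First 50 -Property Name, Version, Author, PublishedDate

$writeParams = @{
    SqlInstance     = 'localhost'      # placeholder instance
    Database        = 'LocalTesting'   # placeholder database
    Table           = 'dbo.PSGalleryModules'
    AutoCreateTable = $true
}
$modules | Write-DbaDbTableData @writeParams
```

This needs the dbatools module and a reachable instance, so treat it as a sketch rather than copy-paste-ready.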
I can use ImportExcel and ConvertFrom-ExcelToSQLInsert, but that is dependent on the table already existing, never mind having to save the data in an Excel file first. Don’t get me wrong – I’m aware that you don’t need Excel installed on the computer where you’re running these commands from. You still need to save the files somewhere though, and the function doesn’t take data from variables.
I can use dbatools and Write-DbaDbTableData. This function is not dependent on the table already existing; it will create the table for you if you tell it to. Thank you, -AutoCreateTable – even though I recommend pre-sizing your columns if you want to go with this method.
Both of these were a bit too much for me though. I only wanted a quick and easy way to have the data available in a SELECT statement, and I don’t want to have to create the table beforehand.
ConvertTo-SQLSelect
So I wrote a primitive function to make the data available in a SELECT statement that I can run in an SSMS or Azure Data Studio window.
Everything gets inserted as a string, but that’s okay for a quick play-around. Any data conversions, I can do once I have the data in an SSMS window.
It doesn’t like single quotes
Yeah, I have no real excuse for this one. I should really fix that before I use this function again…
It can handle single quotes now
There are also no help comments for this.
There should be, even though there is only one parameter. There should also be tests! I am filled with good intentions that have yet to see fruition, though…
That being said, I’ve already had to use it a few times, which means that writing it has paid off.
So feel free to use, abuse, and/or improve it as you see fit.
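For a rough idea of the shape of such a function, here is a minimal reconstruction (my sketch, not the actual function from the blog): it builds a SELECT over a VALUES constructor and doubles up single quotes along the way.

```powershell
function ConvertTo-SqlSelectSketch {
    <# Minimal reconstruction: turns objects into a VALUES-based SELECT.
       Everything is emitted as a quoted string - good enough for a
       quick play in an SSMS or Azure Data Studio window. #>
    param (
        [Parameter(Mandatory, ValueFromPipeline)]
        [object[]] $InputObject
    )
    begin {
        $rows    = [System.Collections.Generic.List[string]]::new()
        $columns = $null
    }
    process {
        foreach ($obj in $InputObject) {
            if (-not $columns) {
                # Take the column list from the first object's properties
                $columns = ($obj.PSObject.Properties.Name |
                    ForEach-Object { "[$_]" }) -join ', '
            }
            $values = foreach ($prop in $obj.PSObject.Properties) {
                # Double up single quotes so they survive the trip
                "'{0}'" -f ("$($prop.Value)" -replace "'", "''")
            }
            $rows.Add('({0})' -f ($values -join ', '))
        }
    }
    end {
        "SELECT * FROM (VALUES`n    {0}`n) AS V ({1});" -f ($rows -join ",`n    "), $columns
    }
}
```

Pipe any objects in, paste the output into a query window, and you have your data in a SELECT statement.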
Figuring out how to group the output of your Pester tests
Words: 830
Time to read: ~ 4 minutes
I’ve been working with Pester v5 lately.
Pester v5 with PowerShell v5 at work & Pester v5 with PowerShell Core outside of work.
There are quite a few changes from Pester version 3, so it’s almost like learning a new language… except it’s based on slang. I think that I’m speaking eloquently, and then I’ve suddenly insulted someone and Pester no longer wants to play nice with me.
Initial Tests
I’ve got the following data that I’m using to test Pester v5.
BeforeDiscovery -ScriptBlock {
$Groups = @(
[PSCustomObject] @{
Server = 1
Group = 'A'
Value = '86b7b0f9-996f-4c19-ac9a-602b8fe4d6f2' -as [guid]
},
[PSCustomObject] @{
Server = 1
Group = 'B'
Value = 'e02913f7-7dae-4d33-98c9-d05db033bd08' -as [guid]
},
[PSCustomObject] @{
Server = 2
Group = 'A'
Value = '96ad0394-8e9e-4406-b17e-e7d47f29f927' -as [guid]
},
[PSCustomObject] @{
Server = 2
Group = 'B'
Value = 'f8efa8b6-e21b-4b9c-ae11-834e79768fee' -as [guid]
}
)
}
Test data
Usually, I would only use -TestCases to iterate through the data. I know that in Pester v3, I could wrap the It blocks inside a foreach () {}, and it would be okay. Hell, in most of my testing, it was faster. It doesn’t matter; I like using -TestCases, and the performance difference is negligible to me.
That is still an option with Pester v5. I can run the below code to confirm.
Describe -Name 'Attempt: 01' -Tag '01' -Fixture {
Context -Name 'Server: <_.Server>' -Fixture {
It -Name 'should have a guid for its value: <_.Value>' -TestCases $Groups {
$_.Value | Should -BeOfType [guid]
}
}
}
ForEach on the It block
If I look at the data, I can see that I’ve got two different values for Server: 1 and 2. It would be great if I could group the tests by those server values.
For me, Pester has three main blocks: Describe, Context, and It. I know that Pester v5 has a -ForEach parameter for each of these blocks. I’ve already tried using the -ForEach parameter against the It block, and it didn’t do what I wanted.
Reminder of the ForEach on the It block
I’ll try it against the Context block instead and see if it works.
Describe -Name 'Attempt: 02' -Tag '02' -Fixture {
Context -Name 'Server: <_.Server>' -Foreach $Groups {
It -Name 'should have a guid for its value: <_.Value>' -Test {
$_.Value | Should -BeOfType [guid]
}
}
}
ForEach on the Context block
That kind of works, but we’ve got the same server in two different groups. Let’s move the groups up to the Describe level.
Describe -Name 'Attempt: 03 - Server: <_.Server>' -Tag '03' -Foreach $Groups {
Context -Name 'Server: <_.Server>' -Fixture {
It -Name 'should have a guid for its value: <_.Value>' -Test {
$_.Value | Should -BeOfType [guid]
}
}
}
ForEach on the Describe block
Well, that’s not what I wanted. Instead of one Describe block, we have multiple blocks: one per group.
Grouped Data
Now I’m going to start using Group-Object, since my data by itself doesn’t seem to work.
$Groups = @(
[PSCustomObject] @{
Server = 1
Group = 'A'
Value = '86b7b0f9-996f-4c19-ac9a-602b8fe4d6f2' -as [guid]
},
[PSCustomObject] @{
Server = 1
Group = 'B'
Value = 'e02913f7-7dae-4d33-98c9-d05db033bd08' -as [guid]
},
[PSCustomObject] @{
Server = 2
Group = 'A'
Value = '96ad0394-8e9e-4406-b17e-e7d47f29f927' -as [guid]
},
[PSCustomObject] @{
Server = 2
Group = 'B'
Value = 'f8efa8b6-e21b-4b9c-ae11-834e79768fee' -as [guid]
}
)
Results of the Test Data
We can pass that data into Group-Object to group our data by a certain property. In my case, I want to group the data by the Server property.
$Groups | Group-Object -Property Server
Grouped Test Data
Taking a look at the first group, I only have the data for that single property value.
First, I’ll try putting the groups into the It blocks and see if that works.
Describe -Name 'Attempt: 05' -Tag '05' -Fixture {
BeforeDiscovery -ScriptBlock {
$GroupedGroups = $Groups | Group-Object -Property Server
}
Context -Name 'Server: <_.Name>' -Fixture {
It -Name 'should have a guid for its value: <_.Group.Value>' -ForEach $GroupedGroups {
$_.Group.Value | Should -BeOfType [guid]
}
}
}
Grouped on the It block
It doesn’t fully work. The data is grouped, but the results seem to concatenate the values. I’d like it better if they were split out into separate tests per value.
This time, I’ll group the data in the context blocks and then pass the groups into the It blocks. I’ll do this by passing the groups into the -ForEach parameter of the It block using $_.Group.
Describe -Name 'Attempt: 04' -Tag '04' -Fixture {
BeforeDiscovery -ScriptBlock {
$GroupedGroups = $Groups | Group-Object -Property Server
}
Context -Name 'Server: <_.Name>' -ForEach $GroupedGroups {
It -Name 'should have a guid for its value: <_.Value>' -TestCases $_.Group {
$_.Value | Should -BeOfType [guid]
}
}
}
Grouped on Context and passed to It block
In the previous code blocks, I used the BeforeDiscovery block in the Describe block. If you don’t want to use that, you can pass the Group-Object cmdlet to the ForEach parameter as a subexpression.
Describe -Name 'Attempt: 06 - Server: <_.Name>' -Tag '06' -ForEach ($Groups | Group-Object -Property Server) {
Context -Name 'Server: <_.Name>' -Fixture {
It -Name 'should have a guid for its value: <_.Value>' -TestCases $_.Group {
$_.Value | Should -BeOfType [guid]
}
}
}
Without using BeforeDiscovery on the Describe block
Pass or Fail
I’ve encountered this obstacle of grouping objects in tests a couple of times. I’m hoping that by writing this down, I’ll be able to commit the information to memory.
Hey, if it doesn’t, I can always grab the code and figure it out.
Welcome to T-SQL Tuesday, the brainchild of Adam Machanic ( twitter ) and ward of Steve Jones ( blog | twitter ). T-SQL Tuesday is a monthly blogging party where a topic gets assigned and all wishing to enter write about the subject. This month we have Mikey Bronowski ( blog | twitter ) asking us about the most helpful and useful tools we know of or use.
Tools of the trade are a topic that I enjoy. I have a (sadly unmaintained) list of scripts from various community members on my blog. This list is not what I’m going to talk about though. I’m going to talk about what to do with any scripts you have.
I want to talk about you as a person and as a community member. Why? Because you are a master of your craft, and a master of a craft takes care of their tools.
Store Them
If you are using scripts, community-made or self-made, then you should store them properly. By properly, I’m talking source control. Have your scripts in a centralised place where those who need them can access them, where everyone gets the same changes applied, and where you can roll back unwanted changes.
If you are using community scripts, then more likely than not, they are versioned. That way you’re able to see when you need to update to the newest version. No matter what language you’re using, you can add a version to them.
PowerShell has a ModuleVersion number, Python has __version__, and SQL has extended properties.
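For example, a PowerShell module manifest (.psd1) carries its version like so (a minimal fragment with placeholder values; a real manifest file contains just the hashtable):

```powershell
# MyTools.psd1 - minimal module manifest fragment (placeholder values)
$manifest = @{
    RootModule    = 'MyTools.psm1'
    ModuleVersion = '1.2.0'
    Author        = 'A. Craftsperson'
}
$manifest.ModuleVersion
```

Bump ModuleVersion on every change and anyone consuming the module can see at a glance whether they are behind.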
If you take care of these tools, if you store them, version them, and make them accessible to those who need them, then they will pay you back a hundredfold. You’ll no longer need to reinvent the wheel or pay the time penalty of composing them again. The tools will be easy to share and self-documenting for any new hires. Like the adage says: take care of your tools and your tools will take care of you.
If you look back over some of the posts that I wrote in October this year, you may have realised that there was a motif going on.
I used a homebrew pushup tracker as a data source for a couple of blog posts. A group of friends and I were attempting to “push out” (excuse the pun) 3,000 pushups over the month.
Spoilers: We didn’t reach the target.
Try Again
I’m okay with failure. If you learn from your failures, then I don’t even consider them failures. This scenario didn’t fall into that category, though. The only reasons I could think of for not reaching the target were:
I started nearly a week into the month, and
I tried to do too much, too fast, in as few rounds as possible per day.
So, with these lessons under my belt, I decided to try again.
Smarter
I figured that my first mistake was simple enough to fix: I’d start on the first day of the month this time.
The second mistake was something that I figured would also be simple to fix. Rather than attempting to do as many as I could in as few rounds as possible, I’d do ten sets a day and that was it. If I focused more on the process than the goal, I figured it would get me over the line eventually.
Challenge 01
If I do a set every half hour, I’d have the ten completed in 5 hours. I mean, hey, we’re in lockdown. I have 5 hours to spare.
But I didn’t.
Work, meetings, calls, focus and flow all sapped the time away from me.
So I tried again.
I’ve started getting up early in the mornings to do research and blog posts (like this one, for example), so I’d try and get them done then.
Ten sets every 5 minutes should have me completed in just under an hour; more than enough time to spare.
Challenge 02
Pushups are hard! Even when I’m not trying to rep out as many as I can, they still take a toll on the body. Soon a five-minute break is not enough, and I’m taking longer and longer rests.
Fine, if that’s the way we’re going to do this, then I’m going to go with the flow.
Scripting
Seeing as I needed a little extra rest each round, I decided to create a PowerShell script that would help calculate that rest for me.
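The actual script differs, but the idea can be sketched like this (the base rest and increment are illustrative numbers): each round gets a little extra rest on top of the five-minute base.

```powershell
# Sketch: 10 rounds, 5 minutes base rest, plus 30 extra seconds per round
$baseRest      = [timespan]::FromMinutes(5)
$extraPerRound = [timespan]::FromSeconds(30)

$schedule = foreach ($round in 1..10) {
    # Round 1 gets the base rest; each later round adds another increment
    $rest = $baseRest + [timespan]::FromSeconds($extraPerRound.TotalSeconds * ($round - 1))
    [PSCustomObject]@{
        Round = $round
        Rest  = $rest.ToString('hh\:mm\:ss')
    }
}
$schedule
```

Round 1 rests for five minutes; by round 10 the break has stretched to nine and a half.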
Recently the DBA Team Lead and I were reviewing some SQL code, and we came across some SQL that neither of us had frequently encountered before. This led to a brief watercooler moment where we shared some uncommon SQL that we had seen. Perfect blog post material, I think.
Date Literals
/* The 10 employees who have been the longest at the company */
SET NOCOUNT ON;
SELECT TOP (10) WITH TIES
HE.JobTitle,
HE.HireDate,
{d '2006-06-30'} AS start_of_company,
DATEDIFF(DAY, {d '2006-06-30'}, HE.HireDate) AS days_since_company_start
FROM HumanResources.Employee AS HE
ORDER BY days_since_company_start;
INSERT Alias
An unexpected item that we found recently was that INSERT INTO statements care about correct column names. That’s all though, nothing else seems to faze them. This means that you can add the most ridiculous aliases or part names to the column and SQL Server won’t care. As far as I can tell, it will just ignore them.
/* Prefixes get prefixed */
SET NOCOUNT ON;
IF OBJECT_ID(N'dbo.Hires', N'U') IS NOT NULL
BEGIN
DROP TABLE dbo.Hires;
END;
CREATE TABLE dbo.Hires (
hire_id int IDENTITY(1, 1) NOT NULL
CONSTRAINT [PK dbo.Hires hire_id] PRIMARY KEY CLUSTERED,
job_title varchar(50) NOT NULL,
hire_date datetime2(7) NOT NULL,
is_on_salary bit NULL
CONSTRAINT [DF dbo.Hires is_on_salary] DEFAULT (0)
);
TRUNCATE TABLE dbo.Hires;
WITH OldestHires AS (
SELECT TOP (10) WITH TIES
HE.JobTitle AS job_title,
HE.HireDate AS hire_date,
ROW_NUMBER() OVER (ORDER BY HE.HireDate) AS rn
FROM HumanResources.Employee AS HE
ORDER BY HE.HireDate
)
INSERT INTO dbo.Hires (
[0].[1].[2].[3].[4].[5].[6].[7].[8].[9].job_title,
a.b.c.d.e.f.g.h.i.j.k.l.m.n.o.p.q.r.s.t.u.v.w.x.y.z.hire_date,
[1/0].[%].[OUT_OF_BOUNDS].[ ].is_on_salary
)
SELECT OH.job_title,
OH.hire_date,
CASE
WHEN OH.rn % 3 = 0 THEN NULL
ELSE 1
END AS is_on_salary
FROM OldestHires AS OH;
SELECT *
FROM dbo.Hires;
GO
Default Option
Let’s contrive an example: say we have a table called dbo.Hires, and we’ve added a column called is_on_salary with a default constraint setting the value to 0. Unfortunately, it looks like the default hasn’t been applied to the existing rows…
/* Our dbo.Hires table */
SET NOCOUNT ON;
SELECT *
FROM dbo.Hires;
= DEFAULT
Recently, my DBA Team Lead pointed me to a piece of code where the syntax was: UPDATE T SET COLUMN = DEFAULT
Now, I had never seen this before, and I wasn’t quite sure that this method would work. I wasn’t wholly surprised, though, when a quick test proved that it does.
/* UPDATE DEFAULT */
SET NOCOUNT ON;
UPDATE dbo.Hires
SET is_on_salary = DEFAULT
WHERE is_on_salary IS NULL;
SELECT *
FROM dbo.Hires;
What about with no default?
Okay, that seems to apply the default constraint’s value to a column. What about when there is no defined constraint on the column? Will it error out then?
/* Removing our default constraint */
ALTER TABLE dbo.Hires
DROP CONSTRAINT [DF dbo.Hires is_on_salary];
SELECT 'Pre update' AS [status],
*
FROM dbo.Hires;
UPDATE dbo.Hires
SET is_on_salary = DEFAULT
WHERE is_on_salary = 0;
SELECT 'Post update' AS [status],
*
FROM dbo.Hires;
Nope! As mentioned in the docs – if there is no default, and the column can become NULL, then NULL will be inserted.
CURRENT
Finally, we have CURRENT. While the vast majority of scripts manually define the database context for commands, such as ALTER DATABASE AdventureWorks, etc., you can tell SQL Server: Hey! Use the current database context!
/* CURRENT Database Context */
SET NOCOUNT ON;
ALTER DATABASE AdventureWorks2014 SET PAGE_VERIFY NONE;
SELECT 'Pre change' AS [status], [name], page_verify_option_desc FROM [sys].[databases] WHERE [name] = N'AdventureWorks2014';
ALTER DATABASE CURRENT SET PAGE_VERIFY CHECKSUM;
SELECT 'Post change' AS [status], [name], page_verify_option_desc FROM [sys].[databases] WHERE [name] = N'AdventureWorks2014';
And so forth
There’s probably a lot more, but these are the ones that we talked about. If you have any uncommon SQL, let me know!
It’d be a lot easier, though, with a properly normalized data model which includes date, attempt number, and push-ups in that attempt. Pivot those results at the end if you want this sort of report, but SQL is designed to work best with tables in first normal form or higher.
I can’t very well give out to people for not doing the right thing first time, even if it’s more difficult, if I don’t do the right thing myself!
As Kevin mentioned, once the data was in a proper format, a format designed for SQL, the calculations were trivial.
However, outputting the results in the same way in PowerShell required a way to pivot results in PowerShell. Thanks to some heavy lifting from Joel Sallow ( Blog | Twitter ), I now know how to pivot in PowerShell!
Here’s hoping that this post will help explain it for you also.
Exploring our Data
SQL
First off, let’s check the current state of our table in SQL.
SELECT POP.pushup_date,
POP.attempt_number,
POP.pushup_count,
SUM(POP.pushup_count) OVER (PARTITION BY POP.pushup_date ORDER BY POP.pushup_date) AS total_per_date,
SUM(POP.pushup_count) OVER () AS grand_total
FROM dbo.PushupsOctoberProper AS POP;
SQL style!
Pivoting
I want to get all 8 possible attempts horizontal, like in the last post. I find this fairly easy when I have the documentation for PIVOT open in another tab.
/* Can we pivot these? */
SELECT PVT_01.pushup_date,
[1] AS attempt_1,
[2] AS attempt_2,
[3] AS attempt_3,
[4] AS attempt_4,
[5] AS attempt_5,
[6] AS attempt_6,
[7] AS attempt_7,
[8] AS attempt_8,
PVT_01.total,
PVT_01.total_so_far
FROM
(
SELECT POP.pushup_date,
POP.attempt_number,
POP.pushup_count,
SUM(POP.pushup_count) OVER (PARTITION BY POP.pushup_date ORDER BY POP.pushup_date) AS total,
SUM(POP.pushup_count) OVER () AS total_so_far
FROM dbo.PushupsOctoberProper AS POP
) AS SRC
PIVOT
(
MAX(pushup_count) FOR attempt_number IN ([1], [2], [3], [4], [5], [6], [7], [8])
) AS PVT_01
ORDER BY PVT_01.pushup_date;
Simples!
Simple, right? Once we have the data in the expected format, the above steps are all that is necessary to calculate and show the data in the way that we want.
However, it becomes a bit more complicated in PowerShell.
PowerShell
Let’s grab the data from our SQL instance and take a look at it.
<# Populate our variable from the database #>
$invQueryParams = @{
SqlInstance = $sqlInstance
Database = 'LocalTesting'
Query = 'SELECT * FROM dbo.PushupsOctoberProper;'
}
$data = Invoke-DbaQuery @invQueryParams
<# Show our data #>
$data | Format-Table -Autosize
So far, so good…
Grouping our Data
We have our data fetched, now we need to group it by the different dates. If only PowerShell had a way to group objects…what? Group-Object? oh!
Data.DataRow? *sigh* One of these days I’ll remember to use -As PSObject with my Invoke-DbaQuery.
Now that we have our data grouped by the different dates, we can loop through each date and pivot the data out horizontally.
Manual Pivot
The first way that came to mind was to manually list out all columns. I know that the maximum attempt_count that I have is 8 so let’s manually create 8 attempt columns.
In case you’re wondering what @{ Expression = 'attempt*' ; Width = 10 } does, I use it to narrow the width of the columns named like attempt since they’re integers. Since they don’t need as much space, I can narrow them down so that Format-Table won’t cut off my later columns!
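In case the screenshots ever go missing, the manual version looked roughly like this (my reconstruction, with two stand-in rows in place of the data pulled from SQL):

```powershell
# Stand-in for the $data pulled from SQL earlier (two sample rows)
$data = @(
    [PSCustomObject]@{ pushup_date = '2020-11-01'; attempt_number = 1; pushup_count = 10 }
    [PSCustomObject]@{ pushup_date = '2020-11-01'; attempt_number = 2; pushup_count = 5 }
)

# Manual pivot: one hand-written property per possible attempt number
$pivoted = $data | Group-Object -Property pushup_date | ForEach-Object {
    $g = $_.Group
    [PSCustomObject]@{
        pushup_date = $_.Name
        attempt_1   = ($g | Where-Object attempt_number -EQ 1).pushup_count
        attempt_2   = ($g | Where-Object attempt_number -EQ 2).pushup_count
        attempt_3   = ($g | Where-Object attempt_number -EQ 3).pushup_count
        attempt_4   = ($g | Where-Object attempt_number -EQ 4).pushup_count
        attempt_5   = ($g | Where-Object attempt_number -EQ 5).pushup_count
        attempt_6   = ($g | Where-Object attempt_number -EQ 6).pushup_count
        attempt_7   = ($g | Where-Object attempt_number -EQ 7).pushup_count
        attempt_8   = ($g | Where-Object attempt_number -EQ 8).pushup_count
        total       = ($g | Measure-Object -Property pushup_count -Sum).Sum
    }
}

# Narrow the attempt* columns so Format-Table doesn't cut off the rest
$pivoted | Format-Table -AutoSize -Property pushup_date,
    @{ Expression = 'attempt*'; Width = 10 }, total
```

It works, but those eight near-identical lines are exactly the bulk I complain about next.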
Dynamic Pivot
I’m not against the manual way. I just find it too bulky and repetitive. It works! Please don’t get me wrong on that account, but as I recently heard someone say: “It works; now clean it up.”
Our main problem is the attempt columns and our manually typing them out. They seem like a perfect candidate for a ForEach loop. But when we try to slot that in…
Let’s investigate that $props variable. We’re creating a hashtable where the Key is our name and the Value is the expression we want. So let’s get the values.
$props | Format-List
Expression = $num
Do you see the way that each of the Expression keys has a value containing the $num variable?
If you check $num now, you’ll see that it’s set to 8. It looks like we’ve found our problem: the $props variable isn’t keeping the value of $num from when we defined it!
Since only one date has a value for attempt 8, we should see some values there.
All filled, but all with the value for the 8th attempt!
Yeah… that’s not correct. I did 30 on the first attempt. Believe me, I remember the pain. It looks like it’s putting the value for attempt 8 into each of the attempts.
Not cool…
Closures
If only there was a way to keep the value of $num when we defined the $props variable. Well, thanks to Joel and his post ScriptBlocks and GetNewClosure(), I now know that there is!
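Paraphrasing the fix from Joel’s post: calling GetNewClosure() on each scriptblock snapshots the current value of $num, so every calculated property keeps its own attempt number. A simplified sketch, with stand-in data in place of the grouped SQL results:

```powershell
# Build one calculated property per attempt, capturing $num as we go.
# Without GetNewClosure(), every expression would see the final value
# of $num (8) when it eventually runs.
$props = foreach ($num in 1..8) {
    @{
        Name       = "attempt_$num"
        Expression = { ($_.Group | Where-Object attempt_number -EQ $num).pushup_count }.GetNewClosure()
    }
}

# Stand-in for one grouped date from Group-Object
$sample = [PSCustomObject]@{
    Name  = '2020-11-01'
    Group = @(
        [PSCustomObject]@{ attempt_number = 1; pushup_count = 30 }
        [PSCustomObject]@{ attempt_number = 2; pushup_count = 20 }
    )
}

$result = $sample | Select-Object -Property $props
$result
```

Now attempt_1 shows 30 and attempt_2 shows 20, instead of every column repeating the value for attempt 8.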
There’s nothing wrong with making mistakes; as long as you learn from them.
Thanks to Kevin for reminding me how things should be stored, and thanks to Joel for this (vast) knowledge sharing, I’ve been able to learn how to dynamically pivot in PowerShell from my mistakes.
Review your mistakes, you never know what you may learn.
Like most things in life, this piece of work came about while attempting to complete something else. It’s not a bad thing, I expect it at this stage.
Easy Like Sunday Morning
I find it easy to get the total of a row in SQL. Hell, when it is not particularly important, I’ll even go the easy route and use a calculated column in the table.
Once you have the total per row, you throw in a SUM(that total) OVER () and you have a grand total. Thank you to Kevin Wilkie ( blog | twitter ) for re-igniting my curiosity about Window Functions again.
SELECT *,
SUM(p.total_pushups_per_day) OVER () AS total_so_far
FROM dbo.PushupsOctober AS p;
GO
Total total
Easy Like Monday Morning
PowerShell is a different beast. Please don’t get me wrong; I still love the language. I don’t find it easier to get a row total and then a grand total though.
It’s possible! I’m just hoping that there is a better way. Saying all that, here is my attempt at a row total and grand total using PowerShell.
If you have a better way (you choose the conditions that satisfy “better”) please let me know.
Grabbing the Data
First, let’s grab the data from the table in our database.
Here’s where I remembered that I had a calculated column, realised that it would be cheating to use it, and decided it needed to go. Thankfully, this also enabled me to get rid of those pesky columns that get returned from Invoke-DbaQuery when you forget the parameter -As PSObject!
We now have the row total in our total_per_day property. And, with our use of -OutVariable data_3, we have the results saved into a variable called $data_3.
Grand Total
Once we have a single column that we can sum up to give us our grand total, then PowerShell makes this operation trivial.
I do have to use Format-List here because Format-Table can’t fit all the properties in, so our new property, total_so_far, wouldn’t show up.
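For completeness, the trivial part can be sketched like this (stand-in rows in place of the real table):

```powershell
# Stand-in rows with a per-day total already calculated
$data_3 = @(
    [PSCustomObject]@{ pushup_date = '2020-11-01'; total_per_day = 55 }
    [PSCustomObject]@{ pushup_date = '2020-11-02'; total_per_day = 60 }
)

# Sum the row totals once, then stamp the grand total on every row
$grandTotal = ($data_3 | Measure-Object -Property total_per_day -Sum).Sum

$data_3 | Select-Object -Property *,
    @{ Name = 'total_so_far'; Expression = { $grandTotal } } |
    Format-List
```

Measure-Object does the summing; the calculated property is just there to mimic the SUM(...) OVER () style of output.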