On PowerShell

I use PowerShell a lot and I write about using it to solve problems quite frequently. The fact that I can extend PowerShell by interfacing with the .NET Framework or by making a COM/COM+ object call means I can do just about anything I need to do in order to manage a Windows system. As a result, I consider PowerShell one of my most powerful tools.
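
For a flavor of what that looks like, here is a minimal sketch of both kinds of calls; the particular .NET types and the WScript.Shell COM object are arbitrary illustrations, nothing special about them:

    # Call a .NET Framework static method directly from PowerShell.
    [System.Net.Dns]::GetHostAddresses('localhost')

    # Instantiate and use a .NET class.
    $sb = New-Object System.Text.StringBuilder
    [void]$sb.Append('Hello from .NET')
    $sb.ToString()

    # Create a COM object (the Windows Script Host Shell object, in this case).
    $wshell = New-Object -ComObject WScript.Shell
    $wshell.ExpandEnvironmentStrings('%SystemRoot%')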

However (you knew there was going to be a however), PowerShell is one tool among many. If you are a smart IT pro, you build your toolbox with the tools that are most appropriate for you. Yes, you take into account where the industry is as well as what your current job(s)/client(s) use. Sometimes that means you choose a tool other than PowerShell. To some, though, that sounds like blasphemy. It shouldn’t. If you’re a senior IT professional, you should be amenable to finding the right tool for the job, even if it’s not the one you like the most. If you’re at an architect level, you had better be prepared to recommend the technology that is the best fit, not the one best liked (by you).

When I think in these terms, it means I no longer build Windows system administration tools with Perl. Unfortunately, even though ActiveState still ships a very functional version, Perl has faded greatly from view on the Windows side. Granted, it was never very bright there, but it had some big-name proponents and it provided a whole lot of functionality not available in VBScript/CScript/JScript. That’s why some enterprise shops turned to it. PowerShell now delivers the functionality Perl provided on Windows systems, the functionality missing from those earlier Microsoft scripting languages. So PowerShell will usually make more sense.

I said usually. I don’t automatically select PowerShell because it is the standard Microsoft recommends. What platforms am I running? What other languages am I using? For instance, if I’m in a heavy Python shop, Python can be used to manage Windows systems, and it may be more cost effective to write in Python than in PowerShell. If I have Linux and Mac OS X platforms, I’m likely not using PowerShell. It’s all about the right tool for the job. And the right tool has more considerations than what a particular company recommends.

On Automation

I’m a big fan of automation. I’ve been in IT for 27 years now. One unchanging rule during that time: there is always more to do than there is time to do it. Automation helps close that gap. And when I can automate something, I can do more than peers who can’t. That gives me a competitive advantage. So, three cheers for automation.

However, the reality is that a lot of administration is still manual. It may sound clever to say that if something isn’t automatable, it’s not something you want to be a part of, or that you’re not a player in some space because you don’t automate. But that’s not reality.

For instance, people can choose to use the cloud and not automate. One of the reasons the cloud was pitched in the first place was to reduce on-premises costs. You could move to cloud servers, shut down your costly datacenter, and save. You didn’t have to change your day-to-day activities and you would still likely save. That’s not always true, as some startups have shown by doing the math and moving back to their own servers once they reached a certain capacity. But that’s not the point. The point is you should be able to use the cloud even if you aren’t going to automate.

It may not be as efficient or as cost-effective, but it should still be doable. There may be other business drivers that prevent IT from embracing automation. In the real world, that happens. It happens a lot. There is a finite number of resources. And if the business determines that you, as a resource, would be better spent building something new rather than automating something existing, then you are building something new. That’s reality.

So when I hear about a new technology like Nano Server, I can like it without jumping on the automation bandwagon. Look, you just told me it’s compartmentalized and a lot of surface area has been removed, even compared to Windows Server Core. From a security perspective, I am doing a happy dance. I agree that automation makes it better. But just because your vision is automation, automation, automation doesn’t mean it is everyone’s. And when there are other factors to consider, their choices may be right for what they are trying to do.

My Upcoming Speaking Engagements

March 4 – Charleston PASS, Charleston, SC

What Admins Absolutely Must Know About SQL Server Security

There are so many security tips out there for SQL Server. Almost all of them are rated as a best practice. What do you listen to? What do you focus on? In this session we’ll break down what you absolutely must know about securing SQL Server. We’ll look at the things to check within SQL Server, including some of the nooks and crannies an attacker might use but which are rarely audited. You’ll leave with a checklist of what to investigate and a set of scripts to run on your own systems.

Register Here

March 12 – Webinar with MSSQLTips.com

SQL Server backup automation and best practices

Join us for this webcast to learn about best practices for backing up your SQL Server databases along with things you can automate to reduce your workload.

Having proper backups for your SQL Server databases is your last line of defense when things go wrong. Database backups are rarely used to restore a production database, but when they are needed, having a solid plan is paramount.

In this webcast we will cover:

  • The types of backups to set up for your databases
  • Proper database settings for backups
  • Protecting database backups
  • Backing up system databases
  • Automating backups with SQL Agent and other scheduling tools
  • Automating checks to ensure backups are successful
  • Setting up alerts and notifications for backup failures
  • and more

Register Here

March 12 – Midlands PASS, Columbia, SC

What Developers Absolutely Must Know About SQL Server Security

There are so many security tips out there for SQL Server. Almost all of them are rated as a best practice. What do you listen to? What do you focus on? In this session we’ll break down what you absolutely must know about building secure databases using SQL Server. We’ll look at the SQL Server securables model, how you can simplify your security model using patterns and models you are already familiar with, how roles can be used to aggregate security cleanly, and how to put in triggers and other mechanisms to try to protect your databases from attack.

Register Here

Continuous Integration/Delivery without Testing!

Anything we can do to automate our builds and deployment should be considered. After all, the point isn’t just to write code, but to deploy working code. So what if we did the automated builds and deployed them to development or QA? No errors, so I’m good, right?

Not so fast. Go back to what Martin Fowler says about testing in continuous integration: builds should be self-testing. For instance, simply deploying T-SQL code to a database without errors is not the end. That’s merely the beginning. At this point you know there aren’t any obvious syntactical errors. That doesn’t mean the T-SQL code works according to specification. It doesn’t mean that a view you didn’t drop and re-create is okay; after all, if you change the underlying objects, that view might not work anymore. Testing the build is important. And having all the tests needed to sufficiently check out the functionality of the code is essential.
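
As a minimal sketch of what a self-testing database build might add, assuming the SqlServer PowerShell module is available and using placeholder server and database names, this hypothetical post-deployment check re-binds every view so that one broken by an underlying table change fails loudly instead of lurking:

    # sp_refreshview re-binds a view's metadata and raises an error if the
    # underlying objects no longer line up (schema-bound views excepted).
    Import-Module SqlServer

    $params = @{ ServerInstance = 'MyServer'; Database = 'MyDatabase' }
    $query  = "SELECT QUOTENAME(s.name) + '.' + QUOTENAME(v.name) AS ViewName " +
              "FROM sys.views v JOIN sys.schemas s ON s.schema_id = v.schema_id;"

    foreach ($view in (Invoke-Sqlcmd @params -Query $query)) {
        try {
            Invoke-Sqlcmd @params -Query "EXEC sp_refreshview N'$($view.ViewName)';" -ErrorAction Stop
        }
        catch {
            Write-Warning "Broken view $($view.ViewName): $_"
        }
    }

A check like this is still only one test among the many a real build needs; tests that verify the code against specification matter just as much.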

Actually, more testing should be done than just checking bits of functionality like we typically do with unit or module tests. At some point during the process you should also be testing under production-like conditions. Can you have Continuous Integration (CI) or Continuous Delivery (CD) without proper testing? No, you can’t. You can have something that looks like CI or CD, and you may even call what you have by one of those names, but you don’t have CI or CD.

The fact of the matter is that you want testing; actually, you want as much automated testing as is feasible. Speeding up the process doesn’t mean end users are suddenly okay with getting buggy code. And we as IT professionals shouldn’t be okay with that, either. We can still ship and test. We just have to commit to testing. Yes, testing will add time to every build cycle. However, it’s a necessity for every build cycle if you’re doing builds right. Simply compiling the code isn’t adequate testing. It’s merely the first test of many more.

“A good DBA is a lazy DBA”

I’m borrowing from Andy Leonard (blog | twitter) who says all the time, “Good engineers are lazy.”

If you’re thinking, “Why would I want (to be) a lazy DBA?” let me explain. There’s a lot to be said for hard work. However, have you ever seen someone who is always busy but seems to get very little done? Hard work, in and of itself, isn’t the goal. The goal is to get things done. This is where laziness comes in.

If I have to repeat a task, I should look at automating it. I don’t want to have to repeat those steps each time. I want to be lazy. For instance, as an operational person, there are a lot of things I need to review within my production environment on a periodic basis to ensure my systems are running as they should. Case in point: ensuring the SQL Server drives don’t run out of free space. This is something I should monitor regularly (daily). There are different ways I could handle this:

  • I could log on to each server and check each drive.
  • I could use a tool like TreeSize to hit the drives one by one.
  • I could automate the checks so that a report lands in my inbox.

I prefer the last option. If I’m smart, I’ll even schedule it to run so it’s ready for me when I get in each morning. But why stop there? I could not only automate gathering the data, but also automate some of the analysis. Let’s say I set thresholds and the automation bubbles up anything crossing a threshold to the top of my report, meaning the most important data gets seen first. I don’t want to just be lazy, I want to be very lazy. By being this lazy and automating, I free up the one resource I can never get more of: time.
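
Here’s a rough sketch of that very lazy option, assuming CIM/WinRM access to the servers and an available SMTP relay; the server names, threshold, and addresses are all placeholders:

    # Gather free-space figures for fixed disks across the servers.
    $servers   = 'SQL01', 'SQL02', 'SQL03'
    $threshold = 10   # flag drives with less than 10% free

    $report = foreach ($server in $servers) {
        Get-CimInstance -ComputerName $server -ClassName Win32_LogicalDisk -Filter 'DriveType = 3' |
            Select-Object @{ n = 'Server'; e = { $server } }, DeviceID,
                @{ n = 'PctFree'; e = { [math]::Round(($_.FreeSpace / $_.Size) * 100, 1) } }
    }

    # Sort ascending so drives crossing the threshold bubble to the top.
    $alerts = @($report | Where-Object { $_.PctFree -lt $threshold })
    $body   = $report | Sort-Object PctFree | Format-Table -AutoSize | Out-String

    Send-MailMessage -SmtpServer 'smtp.example.com' -From 'monitor@example.com' -To 'dba@example.com' `
        -Subject "Drive space report: $($alerts.Count) drive(s) under $threshold% free" -Body $body

Schedule that with Task Scheduler or a SQL Agent job to run before you arrive, and the report is waiting with your morning coffee.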

What can you automate? Anything and everything you can successfully automate frees up time for you to spend tackling other things. The more your organization sees and understands what you get done, the more valuable you are. If you are in IT but don’t happen to be a DBA, this is still a solid approach. Let me generalize and say that being a lazy IT pro is being a good IT pro.

#TSQL2sDay – Data marts across a shaky WAN link

It sounded good in principle, especially given the requirements and the limitations:

  • We needed our various sites to be able to access the data on their customers.
  • Our line-of-business application, which would be installed on the workstations, would use this data.
  • Our sites resembled a snowflake schema with respect to WAN connectivity (this was back in the day when frame relay was king).

The solution? The monthly warehouse of data would be pushed out as data marts during off hours to key sites. We’d use DTS (this was back in the SQL Server 7/2000 days) to accomplish the push each month and everyone would be happy. What could possibly go wrong?

A lot, apparently:

  • The network provider had a negotiated maintenance window on the circuits from 12 AM until 6 AM every day.
  • The network provider frequently, and without warning, used the maintenance window.
  • DTS didn’t have the greatest restart capabilities, nor was it designed to handle outages in connectivity.
  • Some of the links to the key sites didn’t have sufficient bandwidth for a data mart push.
  • The key line-of-business application front-loaded a bunch of data, MBs of data, and the auxiliary links were even slower than the links to the main sites.

Needless to say, the solution didn’t work. In the end all the SQL Servers in the field were recalled, the data mart push was cancelled, and a remoting solution which required far less bandwidth was deployed to provide our users with the new line-of-business application. Access to customer data outside of the line-of-business application was also deployed via the same remoting solution.

Want to read more T-SQL Tuesday stories? Jason Brimhall is this month’s host.

What If Someone Tampered with the Process?

I’m a big fan of automation. Automation means I can do more. Automation means I eliminate the mundane stuff to focus on critical things. I like automation as an IT professional.

However, as a security professional, a question that is ever present in my mind is,

“What if someone tampered with the process?”

Case in point: you have an automated process to build VMs. That includes adding particular security groups to the local Administrators group for a particular type of build (you should already be doing some of this with group policy, but that is automation as well). What if an attacker were able to slip a particular account or a particular group into that automation? How long would it be before you caught it?
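
As a hypothetical spot check for that exact scenario, here is a sketch that compares the actual local Administrators membership against an approved baseline and surfaces any drift for a human to review; the domain and group names are placeholders, and Get-LocalGroupMember requires Windows PowerShell 5.1 or later:

    # The membership the build automation is supposed to have configured.
    $approved = "$env:COMPUTERNAME\Administrator",
                'CONTOSO\Server Admins',
                'CONTOSO\Workstation Support'

    # What the local Administrators group actually contains right now.
    $actual = (Get-LocalGroupMember -Group 'Administrators').Name

    # Anything on the box that isn't in the baseline is suspect.
    Compare-Object -ReferenceObject $approved -DifferenceObject $actual |
        Where-Object { $_.SideIndicator -eq '=>' } |
        ForEach-Object { Write-Warning "Unexpected Administrators member: $($_.InputObject)" }

Whether it runs as part of the build’s verification step or on a schedule, a human still reads the warnings, and that’s the point.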

This is why I’m a big believer in a human putting eyes on automation results at some point, and relatively frequently at that. In fact, I’m a big believer in multiple levels of verification. Maybe it’s my military background and things like the two-person rule. If you’ve watched a movie like Crimson Tide, you’ve seen it in action: two people have keys that must be used together. This ensures that one person, acting alone, can’t do something devastating (in a relative sense).

I know there’s a balance to be met. Too much manual effort and you undo the benefits of automation. However, too much reliance on automation and you’re eventually going to miss something.