Day 45: Creating a Central Node Data Repository to Support Automated Composite DSC Configuration

Welcome to Day 45 of the “100 Days of DevOps with PowerShell”! For background on our goals in this series, see Announcing the “100 Days of DevOps with PowerShell” Series here at SCC.

Today, I am going to talk a bit about the data and approach necessary to support a strategy for automated generation of composite DSC configuration MOF files for managed nodes in our enterprise. This will enable us to maximize reuse of our previous DSC configuration MOF development efforts in building more sophisticated configurations. Since I want to focus on what information we need and why we need it, I am going to use a SQL database as my repository (Azure SQL in this case). As your organization evolves, I would recommend you move this information into your organization’s CMDB (as described in Day 5 and Day 16), whichever platform that may be.

In this post, I’ll address the following topics and questions:

  • Why automate mof configuration in the first place?
  • Information needed to drive automated DSC configurations
  • Building a central reference DB to drive DSC
  • PowerShell examples for record retrieval, insert, and update
  • End-to-end process (high level)

With PowerShell DSC still in a state of rapid evolution, my intent is to drive the discussion around how we build a sustainable process on top of the technology to support DevOps with PowerShell DSC in the enterprise.

Why automate MOF configuration in the first place?

You may be asking yourself: why would I need to automate the build of the configuration MOF files in the first place? I think there are a number of reasons, some obvious, some less so. The primary driver will likely be frequent periodic updates to shared configuration as part of a composite configuration strategy (described by Scott in Day 41 and Day 46). Nodes for which compliance is important (where ensuring certain updates are installed matters) could drive monthly updates, or even more frequent ones under the new Windows 10 update model. If you are using PowerShell DSC to deploy content to web servers, that may also fuel the need to drive updates, though I expect that once the initial configuration is in place and initial content is deployed, we would likely use a different process to push content updates to the servers from the code repository of our choice (TFS, Git, etc.).

In any event, the DevOps process means an open, agile, highly automated approach to change, as does the technology driven by DevOps (like the Microsoft Cloud OS), so I expect frequent DSC configuration MOF updates will be necessary, and automating that process will be a welcome addition.

Information needed to drive automated DSC configurations

With any automated process comes the need to identify and provide the necessary data inputs to drive the automation. There are a few values I would consider core to the process of automating the build of the configuration MOF file for each of our managed nodes.

  • Server Name – The FQDN of the managed node.
  • DSC GUID – The DSC configuration ID you have assigned to the node (and entered into the node’s local configuration).
  • Roles – The Windows or functional roles that will help you identify which configuration elements need to be included in the node’s configuration MOF file. You can use this to hold whatever combination of descriptions is useful in your scenario; for example, you might include roles like “IIS Web” or “Corporate E-commerce Site”.
  • Applications – This could hold a list of key Microsoft, third-party, or custom apps you choose to deploy with DSC.
  • Compliance – This field would include a list of compliance requirements for the managed node, such as PCI DSS, HIPAA, SOX, etc.
  • Owner – Presumably an e-mail address of the person or group responsible for this server and/or the applications and workloads it hosts.
  • Date Added – The date the record for this server was added to the database.
  • Date Updated – The date the record for this server was last updated in this database.
  • Description – Free text to hold whatever description information is helpful to your process.

The only reason I don’t include a list of the DSC modules required by each node is that they are all already packaged on my DSC Pull Server. Scott Weisler demonstrated this technique in “Day 21: Deploying Modules via a Pull Server”.

Building a Central Reference DB to Drive DSC

As mentioned above, today I am going to be a bit more universal in my approach. I have created a SQL database named DSCSource to store information about my managed nodes. This particular database is in Azure SQL, but you could host it on any version of Microsoft SQL Server or another ODBC-compliant repository. While this may not ultimately be the best place, it is a couple of steps above Excel spreadsheets and comma-separated value files as a secure, centrally accessible location to store this type of data.

Create a table to store node details

With the database created through SQL Server Management Studio (steps to create a database here if you need them), below is the Transact-SQL (T-SQL) code to create a table named DSCTracking.
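The original listing did not survive; the following is a sketch reconstructed from the columns described above. The data types (and the composite primary key on ServerName + DSCGuid) are my assumptions, so adjust lengths and types to suit your environment:

```sql
CREATE TABLE dbo.DSCTracking (
    ServerName   nvarchar(255)     NOT NULL,  -- FQDN of the managed node
    DSCGuid      uniqueidentifier  NOT NULL,  -- DSC configuration ID assigned to the node
    Roles        nvarchar(max)     NULL,      -- Windows or functional roles
    Applications nvarchar(max)     NULL,      -- key apps deployed with DSC
    Compliance   nvarchar(max)     NULL,      -- e.g. PCI DSS, HIPAA, SOX
    Owner        nvarchar(255)     NULL,      -- e-mail of responsible person or group
    DateAdded    datetime          NOT NULL DEFAULT GETDATE(),
    DateUpdated  datetime          NULL,
    Description  nvarchar(max)     NULL,
    CONSTRAINT PK_DSCTracking PRIMARY KEY (ServerName, DSCGuid)
);
```

Note that the primary key columns use fixed-length types; max-length columns cannot participate in a primary key.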

You will notice the primary key in the table is ServerName + DSCGuid. I took this approach to address the possibility you may have servers with identical names across multiple environments.

Insert data into the table

Now, to populate the table with information about your managed servers (nodes).

Inserting data into the table could be accomplished through T-SQL, PowerShell, a simple web-based UI, or simply by importing data from a comma-separated file created with Excel. The Import and Export Data wizard that comes with SQL Server 2012 and later is an easy way to import data from an Excel or comma-separated file.


Here is the format of the data in the DSCTracking table, in the event you would like to see the format of the contents in detail:
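The original screenshot is not reproduced here; as an illustration (every value below is hypothetical), a record might look like:

```
ServerName   : web01.contoso.com
DSCGuid      : 8a7f3d2c-1b4e-4c6a-9e0d-5f2b7c8d9e1a
Roles        : IIS Web; Corporate E-commerce Site
Applications : HR Portal
Compliance   : PCI DSS
Owner        : webteam@contoso.com
DateAdded    : 2015-06-01
DateUpdated  : 2015-06-15
Description  : HR web farm front end
```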

If I wanted to do this programmatically with PowerShell, an example of entering a new record into the database (an INSERT query) would look like the snippet below:
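The original snippet was lost in transit, so here is a sketch using the .NET System.Data.SqlClient classes (available to PowerShell out of the box, with no SQL module required). The connection string, server name, GUID, and other values are hypothetical placeholders:

```powershell
# Connection string and all data values are hypothetical placeholders -- substitute your own.
$connectionString = 'Server=tcp:yourserver.database.windows.net,1433;Database=DSCSource;User ID=youruser;Password=yourpassword;Encrypt=True;'

$query = @"
INSERT INTO dbo.DSCTracking
    (ServerName, DSCGuid, Roles, Applications, Compliance, Owner, DateAdded, Description)
VALUES
    (@ServerName, @DSCGuid, @Roles, @Applications, @Compliance, @Owner, GETDATE(), @Description)
"@

$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$command    = New-Object System.Data.SqlClient.SqlCommand $query, $connection

# Parameterized values avoid quoting problems and SQL injection.
$command.Parameters.AddWithValue('@ServerName',   'web01.contoso.com')                    | Out-Null
$command.Parameters.AddWithValue('@DSCGuid',      '8a7f3d2c-1b4e-4c6a-9e0d-5f2b7c8d9e1a') | Out-Null
$command.Parameters.AddWithValue('@Roles',        'IIS Web; Corporate E-commerce Site')   | Out-Null
$command.Parameters.AddWithValue('@Applications', 'HR Portal')                            | Out-Null
$command.Parameters.AddWithValue('@Compliance',   'PCI DSS')                              | Out-Null
$command.Parameters.AddWithValue('@Owner',        'webteam@contoso.com')                  | Out-Null
$command.Parameters.AddWithValue('@Description',  'HR web farm front end')                | Out-Null

$connection.Open()
$command.ExecuteNonQuery() | Out-Null
$connection.Close()
```

If you have the SQLPS module available, Invoke-Sqlcmd would accomplish the same thing in fewer lines, but the .NET approach works anywhere PowerShell runs.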

Updating Records

And if I wanted to update an existing record (an UPDATE query), my command would look like this example:
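The original example is not shown here; a sketch along the same lines as the INSERT (again, connection string and values are hypothetical placeholders) would be:

```powershell
# Connection string and data values are hypothetical placeholders -- substitute your own.
$connectionString = 'Server=tcp:yourserver.database.windows.net,1433;Database=DSCSource;User ID=youruser;Password=yourpassword;Encrypt=True;'

$query = @"
UPDATE dbo.DSCTracking
SET Owner = @Owner, DateUpdated = GETDATE()
WHERE ServerName = @ServerName AND DSCGuid = @DSCGuid
"@

$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$command    = New-Object System.Data.SqlClient.SqlCommand $query, $connection
$command.Parameters.AddWithValue('@Owner',      'newowner@contoso.com')                 | Out-Null
$command.Parameters.AddWithValue('@ServerName', 'web01.contoso.com')                    | Out-Null
$command.Parameters.AddWithValue('@DSCGuid',    '8a7f3d2c-1b4e-4c6a-9e0d-5f2b7c8d9e1a') | Out-Null

$connection.Open()
$command.ExecuteNonQuery() | Out-Null   # returns the number of rows affected
$connection.Close()
```

Updating DateUpdated alongside the changed column keeps the "last updated" field described earlier accurate.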

Retrieving Records

Retrieving data would also be a simple PowerShell call. I would modify the T-SQL query in the code snippet based on the scope of configuration MOF updates. For example, to retrieve the record for a single server by name, I would use the following snippet:
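Here is a sketch of that retrieval, using a SqlDataAdapter to fill a DataTable (the connection string and server name are hypothetical placeholders):

```powershell
# Connection string and server name are hypothetical placeholders -- substitute your own.
$connectionString = 'Server=tcp:yourserver.database.windows.net,1433;Database=DSCSource;User ID=youruser;Password=yourpassword;Encrypt=True;'

$query = 'SELECT * FROM dbo.DSCTracking WHERE ServerName = @ServerName'

$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$command    = New-Object System.Data.SqlClient.SqlCommand $query, $connection
$command.Parameters.AddWithValue('@ServerName', 'web01.contoso.com') | Out-Null

# The adapter opens and closes the connection for us during Fill().
$adapter = New-Object System.Data.SqlClient.SqlDataAdapter $command
$results = New-Object System.Data.DataTable
$adapter.Fill($results) | Out-Null

$results | Format-Table ServerName, DSCGuid, Roles, Owner -AutoSize
```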

If I wanted to update the configuration of all the HR web farm servers, I might base my query on a combination of factors in the Roles and Applications columns, like so:
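Only the T-SQL query changes; a sketch of such a query (the role and application values are hypothetical, matching the placeholder record shown earlier) might be:

```powershell
# Role and application values are hypothetical -- substitute those used in your table.
$query = @"
SELECT ServerName, DSCGuid, Roles, Applications
FROM dbo.DSCTracking
WHERE Roles LIKE '%IIS Web%'
  AND Applications LIKE '%HR Portal%'
"@
```

Feeding this query through the same SqlDataAdapter pattern as the single-server example returns one row per matching node, ready to drive MOF generation for the whole farm.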

The bottom line here is we are now building a base of information that will enable us to drive detailed configuration MOF file development in a highly automated manner.

End-to-End Process (High Level)

Finally, let us briefly look at the end-to-end process we could implement to drive a highly automated PowerShell DSC approach in support of a DevOps strategy. The process might look something like this:

  • Determine what updates we need to make, and to which server roles we need to make them
  • Retrieve a list of servers meeting that description from the central reference database
  • Retrieve various configuration MOF snippets to build the file for a given role
  • Run script to build the new configuration MOF file
  • Copy the updated files to the DSC Pull Server, create checksum, etc.

Much of the code and many of the concepts necessary to implement this process have already been shared in previous installments of the 100 Days of DevOps series. Others, such as storage and source control of our DSC snippets in TFS, Git, or elsewhere, have yet to be explored in full detail…all part of the journey in our series.


I hope today filled in another piece of the puzzle in your PowerShell DSC journey.  Please post comments and errata in the Comments section below.

Previous Installments

To see the previous installments in this series, visit “100 Days of DevOps with PowerShell”.
