ConvertTo-CustomObject function as a wrapper for Select-Object

In yesterday’s post I introduced a custom advanced function for changing property names on objects. I also showed how to use Select-Object to do the same. Because the Select-Object cmdlet is probably more robust than my little function, I decided to try to rewrite the function to use Select-Object under the hood.

Select-Object accepts an array of hashtables, each with two key/value pairs:

  • N is the name of the new property
  • E is an expression, as a script block, that is run to get the value
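
As a standalone example, a single calculated property looks like this (the new property name "App" is chosen just for illustration):

Get-Process | Select-Object @{ N = "App"; E = { $_.ProcessName } }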

In order to use this functionality we need to figure out a way to generate these objects dynamically using our mapping table. The approach I took was to generate the code as strings and then use Invoke-Expression. Since we only need to do this once and not for every object I added a BEGIN block to my function:

BEGIN {
    # Build the calculated-property definitions as a string, one hashtable per mapping entry
    $Expression = ""
    $MappingTable.Keys | ForEach-Object {
        $Expression += (
@"
@{
N="$($MappingTable[$_])"
E={`$_.$_}
},
"@
        )
    }
    # Trim the trailing comma, then evaluate the string into real hashtable objects
    $Expression = $Expression.Substring(0, $Expression.Length - 1)
    $SelectObjectParameter = Invoke-Expression $Expression
}

So for each key in the mapping table we generate a name and an expression to run. E.g. a mapping table like @{ "name" = "AccountName"; "telephone1" = "Phone" } generates the following string:

@{
N="Phone"
E={$_.telephone1}
}, @{
N="AccountName"
E={$_.name}
}

That string is exactly what we would feed to Select-Object. To turn it into actual PowerShell objects we feed it to Invoke-Expression, which gives us an array of two hashtables, each with an N key holding a string and an E key holding a script block.
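
If you want to verify the result, a quick sanity check could look like this (assuming $Expression holds the generated string above):

$SelectObjectParameter = Invoke-Expression $Expression
$SelectObjectParameter.Count                      # 2
$SelectObjectParameter[0].N -is [string]          # True
$SelectObjectParameter[0].E -is [scriptblock]     # True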

Once stored in a variable, we can apply it to each object coming through the pipeline in the PROCESS block of our advanced function:

PROCESS {
    $SourceObject | Select-Object -Property $SelectObjectParameter
}

And that is all there is to it. The function works exactly the same as the ‘quick and dirty’ one from the last post. You can find the full function here. Import it into your session and try this example:
Get-Process | ConvertTo-CustomObject -MappingTable @{ "ProcessName" = "Name" }
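
Putting the BEGIN and PROCESS blocks together, the whole wrapper might look roughly like this (a sketch assembled from the snippets above; the linked version may differ in details):

function ConvertTo-CustomObject {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory=$True,ValueFromPipeline=$True)]
        $SourceObject,
        [Parameter(Mandatory=$True)]
        [hashtable]$MappingTable
    )
    BEGIN {
        # Build the calculated-property array once, before any pipeline objects arrive
        $Expression = ""
        $MappingTable.Keys | ForEach-Object {
            $Expression += (
@"
@{
N="$($MappingTable[$_])"
E={`$_.$_}
},
"@
            )
        }
        $Expression = $Expression.Substring(0, $Expression.Length - 1)
        $SelectObjectParameter = Invoke-Expression $Expression
    }
    PROCESS {
        $SourceObject | Select-Object -Property $SelectObjectParameter
    }
}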

Generate CSV from Dynamics 365 with Custom Headers

In my previous post we retrieved account records from Dynamics 365 (CRM) and exported them to a CSV file. In this post we will expand the example with a transformation function to get nicer property names and CSV headers.

The Export-Csv cmdlet exports all the attributes on the objects piped to it. In our last example we used Select-Object to include only the properties we wanted, but those properties still carry the names from the original object. In most cases this is not desirable. You could just replace the CSV file’s header row, but a cleaner way is to transform the object itself. One way to achieve this is with Select-Object, as mentioned here.

Using our example from before:

(Get-CrmRecords -EntityLogicalName account -Fields "name","telephone1").CrmRecords |
    Select-Object @{
        N="AccountName"
        E={$_.name}
    },
    @{
        N="Phonenumber"
        E={$_.telephone1}
    } |
    Export-Csv -Path "accounts.csv"

This is a nice trick and certainly very flexible. You can select any property on the source object and assign it to a new property. But what if you want to parameterize the transformation with a mapping table? You could probably cook up a function that provides a mapping array for Select-Object, but since I only just came up with that and it sounds a bit complicated, I propose a different approach.

Let’s create an advanced function to do the heavy lifting for us:

function ConvertTo-CustomObject {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory=$True,ValueFromPipeline=$True)]
        $SourceObject,
        [Parameter(Mandatory=$True)]
        [hashtable]$MappingTable
    )
    PROCESS {
        # Build a hashtable of new-name/value pairs for this pipeline object
        $Property = @{}
        $MappingTable.Keys | ForEach-Object {
            $Property.Add(
                $MappingTable[$_],
                $SourceObject.$_
            )
        }
        # Emit a new object with the renamed properties
        New-Object psobject -Property $Property
    }
}

The function ConvertTo-CustomObject accepts objects from the pipeline and a mapping table. It uses the mapping table to iterate across the properties we want, adding them to a property hashtable. Lastly it creates a new object from those properties and writes it to the output.

So again, using our example, we create a mapping table like so:

$FieldsMappings = @{
    "name"       = "AccountName"
    "telephone1" = "Phone"
}

And then we can replace the Select-Object part with our function:

(Get-CrmRecords -EntityLogicalName account -Fields $FieldsMappings.Keys -AllRows).CrmRecords |
    ConvertTo-CustomObject -MappingTable $FieldsMappings |
    Export-Csv -Path "account.csv"

We retrieve the attributes we want from the system (the keys of the $FieldsMappings hashtable), pipe the objects to our function and write the resulting objects to a CSV like before, but with a nicer header row. Complete code here.

While this may seem like a complex way to do it, using objects gets us some nice benefits. E.g. try replacing the Export-Csv line with Out-GridView and see what happens. If you would like to include more attributes, all you need to do is add items to the mapping table.
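
For example, the Out-GridView variant is just a matter of swapping the last command in the pipeline:

(Get-CrmRecords -EntityLogicalName account -Fields $FieldsMappings.Keys -AllRows).CrmRecords |
    ConvertTo-CustomObject -MappingTable $FieldsMappings |
    Out-GridView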

Export Data from Dynamics 365 using PowerShell

Did you know that you can manage your Dynamics 365 instance using your favorite automation tool, PowerShell? The Microsoft.Xrm.Data.PowerShell module lets you manage your solutions and such, but also has functionality for working with records. In this post I am going to show you how to use the module for data export.

The first thing you need to do is install the module on your system. Just run the command Install-Module Microsoft.Xrm.Data.PowerShell in PowerShell to get it. Next you need to connect to your instance. There are a few alternatives here, so please refer to the project’s GitHub page. With the Microsoft-hosted Dynamics 365 you can use Connect-CrmOnlineDiscovery -InteractiveMode to connect using a GUI.
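
In short, getting started might look like this (assuming an online instance and permission to install modules):

# Install the module from the PowerShell Gallery, then connect via the interactive dialog
Install-Module Microsoft.Xrm.Data.PowerShell -Scope CurrentUser
Connect-CrmOnlineDiscovery -InteractiveMode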

Working with data

Once connected you can try querying for some data. The command Get-CrmRecords -EntityLogicalName account gets you all (or the first 5000) accounts in your system. The output looks something like this:

Key Value
--- -----
CrmRecords {@{accountid=f39551c1-81ff-e611-80f3-5065f38a5a01; accountid_Property=[ac...
Count 349
PagingCookie <cookie page="1"><accountid last="{9AD008BF-8A34-E711-80FA-5065F38BC5C1}"...
NextPage False
FetchXml <fetch version="1.0" output-format="xml-platform" mapping="logical" d...

As you can see the result is an object with a few different properties. The CrmRecords property contains the actual result of the query. So if you want to output all the accounts you would run (Get-CrmRecords -EntityLogicalName account).CrmRecords. That gets you something like this:

accountid : f39551c1-81ff-e611-80f3-5065f38a5a01
accountid_Property : [accountid, f39551c1-81ff-e611-80f3-5065f38a5a01]
ReturnProperty_EntityName : account
ReturnProperty_Id : f39551c1-81ff-e611-80f3-5065f38a5a01
original : {[accountid_Property, [accountid, f39551c1-81ff-e611-80f3-506
5f38a5a01]], [accountid, f39551c1-81ff-e611-80f3-5065f38a5a01
], [ReturnProperty_EntityName, account], [ReturnProperty_Id ,
f39551c1-81ff-e611-80f3-5065f38a5a01]}
logicalname : account
accountid_Property : [accountid, b902322e-6005-e711-80f4-5065f38a5a01]
accountid : b902322e-6005-e711-80f4-5065f38a5a01
ReturnProperty_EntityName : account
ReturnProperty_Id : b902322e-6005-e711-80f4-5065f38a5a01
original : {[accountid_Property, [accountid, b902322e-6005-e711-80f4-506
5f38a5a01]], [accountid, b902322e-6005-e711-80f4-5065f38a5a01
], [ReturnProperty_EntityName, account], [ReturnProperty_Id ,
b902322e-6005-e711-80f4-5065f38a5a01]}
logicalname : account

As you can see, there doesn’t seem to be any useful data other than the Id to work with. That’s because we haven’t requested any attributes. To include the account name you need to use the -Fields parameter. E.g. Get-CrmRecords -EntityLogicalName account -Fields "name" includes the account name in the result, like so:

name : Test account
name_Property : [name, Test account]
accountid_Property : [accountid, f39551c1-81ff-e611-80f3-5065f38a5a01]
accountid : f39551c1-81ff-e611-80f3-5065f38a5a01
ReturnProperty_EntityName : account
ReturnProperty_Id : f39551c1-81ff-e611-80f3-5065f38a5a01
original : {[name_Property, [name, Test account]], [name, Test
account], [accountid_Property, [accountid, f39551c1-81ff
-e611-80f3-5065f38a5a01]], [accountid, f39551c1-81ff-e611-80f
3-5065f38a5a01]...}
logicalname : account

So now that we know how to get data from the system, let’s get it into a CSV file. You can’t quite do Get-CrmRecords | Export-Csv, but you can get close. Here’s a code snippet:

(Get-CrmRecords -EntityLogicalName account -Fields "name","telephone1").CrmRecords |
    Select-Object -Property "name","telephone1" |
    Export-Csv -Path "accounts.csv"

The code fetches all accounts from the system, selects just the properties we want and finally exports the result to a CSV file. You might also want to use the -NoTypeInformation, -Delimiter and -Encoding parameters. You can see the final code with some improvements here.
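
For instance, a more explicit export might look like this (the delimiter and encoding here are just example choices):

(Get-CrmRecords -EntityLogicalName account -Fields "name","telephone1").CrmRecords |
    Select-Object -Property "name","telephone1" |
    Export-Csv -Path "accounts.csv" -NoTypeInformation -Delimiter ";" -Encoding UTF8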

In my next post we will improve on this by supplying our own column names instead of the schema names provided by the system.

Blog moved to Github Pages

Just a quick post to verify that blog publishing still works. I moved the blog from Azure to GitHub Pages. All in all really simple: you just create a repository and publish via git. In my case, as I was already using git for publishing, all I needed to do was change the git endpoint, add a custom domain and point my DNS to GitHub.

My new shiny Markdown compatible blog

Markdown

I remember hearing about Markdown previously but hadn’t paid it much attention. Then I heard it mentioned on a podcast I listen to, and I also noticed that the Readme file on Visual Studio Team Services was done in Markdown.

Naturally I was intrigued. As a wannabe programmer and configuration nazi, the idea of using a simple markup syntax to write formatted documents sounded great. I had a brief stint using LaTeX while at school (no one else did, though) and while I did do my final year project documentation with it, I quickly lost interest.

The blog

Anyway… I had thought about starting a blog for a while. I use Feedly every day and follow lots of blogs. Since I’ve benefited greatly from the community I thought it was time to give something back.

Getting back to the main topic of how this blog came about. I searched for blogging engines that supported Markdown and the first result on Google was Hexo. Hexo is a static site generator built using NodeJS. Seemed like a great fit for a backend guy like me.

I installed NodeJS using Chocolatey and followed the instructions to get up and running. Simple enough. The original plan was to host it on an Azure Web App, so I followed another set of instructions in order to do that.

Note that in the Azure portal you should use the deployment options instead of continuous integration. That’s where you find the Local Git repo option you need. I first tried the Free plan, but apparently custom domains are not supported on that, so I switched to the Shared plan. See more details here. I’ll probably move the blog to GitHub Pages though, since that’s free. Or I could host it on my router…

Once everything was set up, all I need to do to update the blog, besides writing the posts, is run hexo deploy --generate. I keep the blog source files in Dropbox so I can get to them wherever I am. This post was started on my phone and finished on an iPad. I can’t generate and deploy on the go though. I’m sure PowerShell can help, but that is another blog post, I think. I’m thinking some kind of change monitoring and automated deployment.

Please note that I’m using this blog as another learning opportunity, so expect changes, downtime, etc. I’m also not sure how active I’m actually going to be.

Dynamics 365 Document Template from Existing Word Document

For the impatient, this is how you do it:

Extract the Dynamics 365-generated template from the zip and add the CustomXML part to your existing template.

I couldn’t find a post on how to create a new Dynamics 365 Document Template using an existing document or template. I’m sure many organisations have an existing template with all the right formatting, headers and footers they would like to use.

Knowing how .docx files are structured and how Dynamics 365 stores the metadata, it was easy enough to figure out. I thought I would share it with you.

Step by step

  1. Create a new blank template from Dynamics 365 the way you would normally do it and save the file to your computer
  2. Change the extension of the file to .zip and extract it (you can also script this; see the sketch after this list)
  3. Open the existing document you would like to use with Dynamics 365
  4. In your Word template click Developer > XML Mapping Pane
  5. In the drop down menu select (Add new part…)
  6. Navigate to your extracted template and select the XML file (item.xml) under the CustomXML directory
  7. Proceed as you normally would with a blank template
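
If you prefer to script the extraction, steps 1 and 2 can be done in PowerShell along these lines (assuming the template was saved as template.docx):

# Copy to a .zip extension so Expand-Archive will accept it, then unpack
Copy-Item template.docx template.zip
Expand-Archive template.zip -DestinationPath .\template
# The custom XML part (item.xml) is then found under the extracted CustomXML directory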

Note that this might not be supported, but it seems to work just fine.