Save your $vars

I know you have used Export-Csv one too many times, and each time it's a pain in the ass. For example, you dump a huge pile of data to a CSV, and two weeks later you import the file to restore it. This simple step can take an unnecessarily long time.

Or have you ever saved your credentials in your script? Most people I’ve met do something like this:

(Get-Credential).Password | ConvertFrom-SecureString | Out-File .\cred.txt
$ss = Get-Content .\cred.txt | ConvertTo-SecureString
$mycreds = New-Object System.Management.Automation.PSCredential ("username", $ss)
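For completeness, here is a sketch of how the rebuilt credential is typically used afterwards; the computer name is a placeholder, not something from this post:

```powershell
# Rebuild the PSCredential from the DPAPI-protected file:
$ss = Get-Content .\cred.txt | ConvertTo-SecureString
$mycreds = New-Object System.Management.Automation.PSCredential ("username", $ss)

# The credential object plugs straight into any -Credential parameter.
# "server01" is a placeholder target:
Invoke-Command -ComputerName server01 -Credential $mycreds -ScriptBlock { whoami }
```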

It’s “safe” because the secure string is encrypted with the user’s security context (DPAPI). That means the file can only be decrypted by the same user on the same computer:

Computer1 – User A: can decrypt

Computer1 – User B: cannot decrypt

It’s an OK way to save a password. I’m not saying it’s the right way to do it. But either way, for each thing you want to save, the script gets more and more complex.

Let’s say you want to save a multidimensional table whose size you don’t know in advance; it’s also unstructured, contains objects, and perhaps holds secure strings. Using Out-File / Export-Csv is only going to give you less time to live and less time to do the things you love.

The answer to all your problems is Export-Clixml.

Get-Credential | Export-Clixml -Path .\c.xml
$cred = Import-Clixml -Path .\c.xml
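A quick sketch of why this beats CSV: Export-Clixml serializes nested objects along with their type information, and Import-Clixml rehydrates them. The file name and data below are made up for illustration:

```powershell
# A messy, nested structure that a CSV export would mangle:
$state = @{
    Started = Get-Date
    Servers = @(
        [pscustomobject]@{ Name = 'web01'; Ports = 80, 443 }
        [pscustomobject]@{ Name = 'db01';  Ports = ,1433 }
    )
}

$state | Export-Clixml -Path .\state.xml
$loaded = Import-Clixml -Path .\state.xml

# Nested properties and types survive the round trip:
$loaded.Servers[0].Ports    # 80, 443
$loaded.Started.GetType()   # DateTime, not a string
```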

Here is an example of how to save an MSOL result to CLIXML and load it back, as if it never left the console.

$allusers = Get-MsolUser -All
$allusers | Export-Clixml -Path .\var.xml -ErrorAction SilentlyContinue
$allusersLoaded = Import-Clixml -Path .\var.xml
$allusersLoaded.Where({ $PSItem.IsLicensed -eq $true }).Count

Warning: don’t run this on an on-premises server. Depending on the size of your tenant, the file can grow past 100 MB and may end up filling your hard drive. :)
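If you’re worried about disk space, a quick sanity check after the export helps; this sketch reuses the file path from the example above, and the drive letter is an assumption:

```powershell
# See how big the exported file actually got:
'{0:N1} MB' -f ((Get-Item .\var.xml).Length / 1MB)

# Free space on the drive you're writing to (C: assumed here):
'{0:N1} GB free' -f ((Get-PSDrive C).Free / 1GB)
```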
