There are a number of PowerShell User Groups in the UK, but unfortunately none that are easy for me to get to given my home location and work commitments. So I am gathering interest in a UK South Coast PowerShell User Group for coders of all experience levels.
The purpose of this initial meetup is to test the viability of running a PowerShell meetup in Southampton on a regular basis. Hopefully we will get enough interest to take this forward and start running sessions with PowerShell content for everyone to learn from.
I’m not a massive fan of certifications, but I understand why people do them and the benefits that can arise from the whole process of achieving them. I did a lot of them in the past, when my career was geared more around infrastructure work than coding. However, I wanted to learn about Microsoft Azure and, since it is such a large topic to get to grips with, I decided that pursuing the 70-533: Implementing Microsoft Azure Infrastructure Solutions exam would be a good way to focus on learning an initial subset of what is available to work with in Azure.
PowerShell v6 Alpha 17 has been released and contains an interesting change with the version parameter when applied to powershell.exe. Some discussion around it can be found here and here.
When using a Linux based shell, supplying the version parameter returns the version of the shell:
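For example, with bash (purely illustrative; the exact output depends on the shell and version installed):

bash --version
# GNU bash, version 4.x.x(1)-release ... (version string varies by distribution)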
You can now do a similar thing in PowerShell Core:
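A sketch of what that looks like (at the time of the alpha releases the executable was still named powershell rather than pwsh, and the exact version string depends on the build installed):

powershell -version
# Returns a single version string, e.g. PowerShell v6.0.0-alpha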
Note that using $psversiontable still gives you fuller information:
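For illustration (the exact properties and values depend on the build and platform):

$PSVersionTable
# Name                           Value
# ----                           -----
# PSVersion                      6.0.0-alpha
# PSEdition                      Core
# GitCommitId                    v6.0.0-alpha.17
# Platform                       Unix
# ...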
This is slightly different from the pre-v6 PowerShell version on Windows where the version parameter requires an argument:
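For example, on Windows PowerShell the parameter expects a version number to run the engine under:

powershell.exe -Version 2.0
# Starts a Windows PowerShell 2.0 session (provided the corresponding runtime is installed),
# rather than printing the version of the executable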
A colleague of mine asked whether it was possible to open an Explorer window in the same folder as the current location in the PowerShell console.
A couple of different ways to do it:
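The original commands aren't reproduced here, but these are typical options:

Invoke-Item .          # Invoke-Item on a directory opens it in Explorer
ii .                   # ii is the built-in alias for Invoke-Item
explorer .             # call explorer.exe with the current location
explorer $pwd.Path     # or pass the path from the automatic $pwd variable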
Here’s a clip of it in the PowerShell ISE and the standard PowerShell console:
My experience with Azure PowerShell so far has been somewhat mixed, and the example in this post will give you a flavour of that. I wanted to create a new Storage Blob Container via PowerShell, rather than through the following process in the web portal:
I looked for cmdlets which could potentially be used:
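Assuming a simple wildcard search along these lines (the exact command I ran isn't shown here):

Get-Command *StorageContainer*
# Returns the blob container cmdlets, e.g. Get-AzureStorageContainer, New-AzureStorageContainer, Remove-AzureStorageContainer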
However, this returned nothing from the AzureRM module, only from the Azure module. (There are currently two modules you need to use when working with Azure; some more info here and here.) To say this can get confusing when you are new to the topic is an understatement; hopefully the situation will improve significantly soon.
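For reference, here is a minimal sketch of one way the container can eventually be created by combining the two modules (the resource group, account, and container names are purely illustrative and not from the original post):

# Management plane (AzureRM): retrieve a storage account key
# (older AzureRM versions expose the keys as .Key1/.Key2 rather than a list)
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName 'MyResourceGroup' -Name 'mystorageaccount')[0].Value

# Data plane (Azure.Storage): build a context and create the container
$context = New-AzureStorageContext -StorageAccountName 'mystorageaccount' -StorageAccountKey $key
New-AzureStorageContainer -Name 'mycontainer' -Context $context -Permission Off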
New-AzureRmResourceGroupDeployment generates the following error:
New-AzureRmResourceGroupDeployment `
    -Name $resourceDeploymentName `
    -ResourceGroupName $resourceGroupName `
    -TemplateFile $template `
    @additionalParameters `
    -Verbose -Force

New-AzureRmResourceGroupDeployment : A parameter cannot be found that matches parameter name 'xxxxxxxxxxx'.
At line:5 char:5
+     @additionalParameters `
+     ~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidArgument: (:) [New-AzureRmResourceGroupDeployment], ParameterBindingException
    + FullyQualifiedErrorId : NamedParameterNotFound,Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDeploymentCmdlet
This kind of error seems fairly in tune with the experience I have had so far with the AzureRM PowerShell module.
This article on the vCOTeam site details how to mount a CIFS share on the vRO Appliance so that workflows can write files directly to a Windows File Share rather than using another process to copy the file over there.
This was straightforward to implement in a lab scenario; however, within a corporate environment with more restrictions around security and networking it can be more of a challenge. Specifically, we encountered the following error response from a Windows Server seemingly configured correctly for Share and NTFS permissions on the folder to mount:
Back Story: For a while, Craig and I have had a number of requests about offering OS X and Linux support for PowervRA, particularly since (in case you weren’t aware) PowerShell is now available on those operating systems and third-party modules such as PowerCLI are heading towards supporting them. We first looked at offering this support for PowervRA when the first Alpha release of PowerShell Core shipped, but we were blocked by a couple of issues, particularly this one regarding certificate checking.
A couple of times I have been tripped up by the fact that the Depth parameter for ConvertTo-Json has a default value of 2. So for an object like the one sketched below, with multiple levels of nested sub-objects, you will have problems if you don’t specify a higher value for that parameter.
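As a purely illustrative stand-in for that kind of object (not the one from the original post):

$object = [PSCustomObject]@{
    Name  = 'Level1'
    Child = [PSCustomObject]@{
        Name  = 'Level2'
        Child = [PSCustomObject]@{
            Name  = 'Level3'
            Value = 'Three levels deep'
        }
    }
}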
If we send the original object through ConvertTo-Json with the default value for Depth, we’ll get the following, and you’ll observe that only the first two levels have been dealt with properly:
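With the stand-in object above, the difference looks like this:

$object | ConvertTo-Json
# With the default -Depth of 2, the innermost object is flattened to a string rather than serialized as JSON

$object | ConvertTo-Json -Depth 5
# Specifying a higher -Depth serializes every level as proper nested JSON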
Update 18/01/2021: See this post for details on an updated version of this module, parts of the below may now be out of date.
In part 1 of this series, we looked at how to get started with the Brickset module. In part 2 we examined how to easily download sets of instructions. Now in part 3 I’ll show you how to use the inventory features of Brickset.
When you are logged into the Brickset website you can use the inventory features to help keep track of your collection.