Get-Choice

I came across this little function today whilst doing some house cleaning. It’s a function that shows a message prompt and uses a switch statement to control what action is taken. It’s an unsupported Microsoft script, but I have no idea where it came from.

Function GetChoice {
 #Prompt message
 $Caption = "Restart the computer."
 $Message = "It will take effect after restart, do you want to restart right now?"
 $Choices = [System.Management.Automation.Host.ChoiceDescription[]] @("&Yes","&No")
 [Int]$DefaultChoice = 0
 $ChoiceRTN = $Host.UI.PromptForChoice($Caption, $Message, $Choices, $DefaultChoice)
 Switch ($ChoiceRTN){
  0 { shutdown -t 0 -r }
  1 { break }
 }
}

Anyway, clearly this is ripe for customisation. Step one: let’s make the function name conform to the PowerShell verb-noun convention by adding a hyphen:
Function Get-Choice

Now let’s look at what else might be useful…

Well, the caption, the message and even the choices could be made into parameters so you can change the ‘choices’:

param (
 $Caption = "Restart the computer.",
 $Message = "It will take effect after restart, do you want to restart right now?",
 $Choices = @("&Yes","&No","&Maybe")
)
Then change the line that instantiates the $Choices variable to:

$Choices = [System.Management.Automation.Host.ChoiceDescription[]] $Choices

Hopefully you can see the flaw in the logic now, though: the switch statement can’t be customised. Well, not easily; I could pass in a script block, but there’s little point in that.

But you could either return the choice value directly as an integer:

$Host.UI.PromptForChoice($Caption, $Message, $Choices, $DefaultChoice)

Or return the label of the chosen option:

$Choices[$Host.UI.PromptForChoice($Caption, $Message, $Choices, $DefaultChoice)].Label
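Putting it all together, here’s a minimal sketch of the revised function (the extra &Maybe choice and the default values are just illustrative):

Function Get-Choice {
 param (
  $Caption = "Restart the computer.",
  $Message = "It will take effect after restart, do you want to restart right now?",
  $Choices = @("&Yes","&No","&Maybe"),
  [Int]$DefaultChoice = 0
 )
 $Choices = [System.Management.Automation.Host.ChoiceDescription[]] $Choices
 # Return the label of the selected choice so the caller decides what action to take
 $Choices[$Host.UI.PromptForChoice($Caption, $Message, $Choices, $DefaultChoice)].Label
}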


How to add a reason to the ActiveRoles Change History

One thing I’d recommend you do is add a “reason” to the ARS change history whenever a policy applies some automated changes. When you look at the Change History in ARS for an automated update that was applied by a script policy, how will you know why the attribute was changed unless you do this?

It’s such a simple thing to do too. Most of the ARS AD cmdlets, like Set-QADUser, have a -Control parameter that can be used to add a reason for the change. To add a reason, all you need to do is add this to the command line:

-Control @{OperationReason="SeparationOfDuties_v$Global:scriptVersion"}

Note how I also included a variable in there; this allows me to see not only which script was run but also the version number of the script. Now the History will show you the reason why an update was made. Nice, don’t you think?
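For context, here’s a minimal sketch of how that switch might sit in a full cmdlet call; the attribute being updated and the $ARSConnection variable are just illustrative:

Set-QADUser -Identity $user.DN `
 -Connection $ARSConnection `
 -ObjectAttributes @{Title = 'Contractor'} `
 -Control @{OperationReason="SeparationOfDuties_v$Global:scriptVersion"}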

(Screenshot: the Change History showing the reason for the update.)

Using a Managed Unit for highlighting incorrectly set up accounts

I’ve been using a script that I got from the Quest (now Dell) support site for dynamically building a query for a Managed Unit to show inactive accounts. The thought occurred to me that I could do the same thing to highlight accounts that have not been set up properly.

Yes, I know ARS should do this when the service desk are creating the users, but what if someone manually creates an account using ADUC or a script? Then the integrity rules enforced by ARS are not always adhered to.

It’s also nice to have a quick way of seeing these anomalies.

Anyway, for whatever reason you use this idea, the point is to explain how to update a Managed Unit dynamically.

A Managed Unit is an Active Roles dynamic OU. An object can only be in one OU in AD, but it can be in multiple Managed Units in ARS. It’s a brilliant idea! You can then apply permission templates or ARS policies to control how the objects are managed and set up.

Why would I want to dynamically manage the “filter” used to build the Managed Unit? Well, two examples already alluded to above are where one of the filter variables is based on a date, e.g. accounts not used for 30 days, or accounts created in the last 30 days.

Here’s how to do it, using PowerShell 🙂

Create a Managed Unit and set the filter, e.g.
(&(sAMAccountType=805306368)(!(|(employeeID=*)(employeeNumber=*)(sAMAccountName=svc*)(title=*_*)(secretary=*)(userAccountControl:1.2.840.113556.1.4.803:=2)))(whenCreated>=19990224103016.0Z))

or

(&(employeeID=*)(sAMAccountType=805306368)(!(userAccountControl:1.2.840.113556.1.4.803:=2))(lastLogonTimestamp<=130749192000000000))

The interesting bit in both these examples is the date. These numbers represent a time. lastLogonTimestamp uses a large integer; you convert a date into this format using this formula: $objLargeInteger90 = $(Get-Date).Date.AddDays(-90).ToFileTime()

The whenCreated attribute uses a simpler, human-readable string in the format YYYYMMDDHHMMSS.0Z.
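As a quick sketch, here is how both thresholds can be generated (the 90 and 30 day windows are just examples):

# Large-integer (FileTime) threshold for lastLogonTimestamp<= comparisons
$lastLogonThreshold = (Get-Date).Date.AddDays(-90).ToFileTime()
# Generalized-time string for whenCreated>= comparisons
$whenCreatedThreshold = Get-Date ( (Get-Date).AddDays(-30) ) -UFormat '%Y%m%d%H%M%S.0Z'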

So what we need is a script to dynamically build these queries once a day and then update the managed unit filter.  Hopefully the below needs little explanation and you can easily modify it for your environment.

$Rule.Base can be used to narrow down the search – I search all of my managed domains in this example.

How did I figure out the rule type? Well, I didn’t really; what I did was examine the existing setting and then reuse it, by inspecting the value in $RuleCollection while debugging my script.

$ManagedUnitDN = "CN=New Accounts Incorrectly Tagged Created in the Last 30 Days,CN=Admin Action Required,CN=User Management,CN=Managed Units,CN=Configuration"
$ManagedUnit = [ADSI]"EDMS://$ManagedUnitDN"
$RuleCollection = $ManagedUnit.MembershipRuleCollection
$daysAgo = 30
$dateThreshold = Get-Date ( (Get-Date).AddDays(-$daysAgo) ) -UFormat '%Y%m%d%H%M%S.0Z'
# Clear out the existing rules before adding the rebuilt one
while ($RuleCollection.Count -gt 0) {
 $RuleCollection.RemoveAt(0)
}
$Rule = New-Object -ComObject "EDSIManagedUnitCondition"
$Rule.Base = "EDMS://CN=Active Directory"
$Rule.Filter = "(&(sAMAccountType=805306368)(!(|(employeeID=*)(employeeNumber=*)(sAMAccountName=svc*)(sAMAccountName=saPLON*)(sAMAccountName=saBLON*)(title=*_*)(secretary=*)(userAccountControl:1.2.840.113556.1.4.803:=2)))(whenCreated>=$dateThreshold))"
$Rule.Type = 1   # include by query
$RuleCollection.Add($Rule)
$ManagedUnit.SetInfo()

You can add multiple rules by repeating the lines of code that set the $Rule properties and then adding each rule to the collection; there’s a sketch of this after the list of types below.
I did a little research to discover the possible rule types and came up with this:

The Base defines the scope of the search in all of the rule types where one is needed. The path uses the EDMS:// prefix.

Type 1 is include by query and requires an LDAP query to define the objects to include
Type 2 is exclude by query and requires an LDAP query to define the objects to exclude
Type 3 is include explicitly and does not require a filter. The Base is the DN of the object to include
Type 4 is exclude explicitly and does not require a filter. The Base is the DN of the object to exclude
Type 5 is include group members and does not require a filter. The Base is the DN of the group to include
Type 6 is exclude group members and does not require a filter. The Base is the DN of the group to exclude
Type 7 is used to keep deprovisioned users in the group. The filter is set to (edsvaDeprovisionStatus=*)
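As a sketch of adding a second rule, here’s what a Type 5 (include group members) rule might look like; the group DN is purely illustrative and this assumes the $RuleCollection and $ManagedUnit objects from the script above:

$GroupRule = New-Object -ComObject "EDSIManagedUnitCondition"
$GroupRule.Base = "EDMS://CN=My Include Group,OU=Groups,DC=mydomain,DC=com"
$GroupRule.Type = 5   # include group members; no filter required
$RuleCollection.Add($GroupRule)
$ManagedUnit.SetInfo()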

Making a good connection

How often is it that a feature is also a right pain in the …..

I ran an audit script today and halfway through it started throwing loads of red error messages. Now red is a nice colour (Ferraris are red), but when it comes to PowerShell, the less red we see the better :-).

Anyway, I could have avoided the errors with better error handling in my code, but this script has been in use for years with no issues, so I wondered why it was suddenly complaining.

In short, it was because the script was picking up the last connection I had established with the Quest cmdlets.

It’s a “feature” that once you establish a connection, subsequent commands use the same connection. The problem with this is that not every command you use will return what you need. For example, suppose I connect to Domain A and query a user, then without establishing another connection I query for a user in Domain B. Guess what? One of two things will happen. If the same username exists in Domain A, then you have just bound to the wrong object and are possibly on your way to a career-limiting mistake. The second thing that could happen is that if no matching username exists in Domain A, your script will not be able to bind and make the changes, and you could still be on your way to a career-limiting mistake… ARRRRRGHHHH.

OK, it’s unlikely you will get into any serious trouble here, but you might, so I’ve realised that because of this feature you need to make sure you always specify the connection you want to use explicitly.

That means establishing a connection to the correct domain / ARS service, storing it in a variable, and always using the -Connection parameter on the commands that accept it. This will make your script portable between domains and also allow you to use multiple domains in the same script.
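As a minimal sketch of what I mean (the server names and credential variables are illustrative):

$connDomainA = Connect-QADService -Service 'dcA.domainA.com' -Credential $credDomainA
$connDomainB = Connect-QADService -Service 'dcB.domainB.com' -Credential $credDomainB
# Always say which connection to use so you never bind in the wrong domain
Get-QADUser -SamAccountName 'jsmith' -Connection $connDomainA
Get-QADUser -SamAccountName 'jsmith' -Connection $connDomainB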

I’ve now modified my get-AdminCredentials function and combined it with my post on using a hash table to store the domain connection objects.

Copying an ARS scheduled task

OK, so I have to admit right up front that I failed: I can’t create an ARS scheduled task, but I can copy the parameters and schedule from one task to another, which was why I was trying to do this in the first place. Back in June 2014 I’d tried to copy a task and posted a question on the Quest (now Dell Software) support site http://en.community.dell.com/techcenter/iam/f/4817/t/19589175 but it got ZERO responses :-(.

Anyway, creating a task is relatively simple; what I really wanted was a way to copy the parameter list and their values. This is reasonably simple.

Grab the task you want to clone:

$TaskObj = Get-QADObject `
 -Identity $Global:taskDN `
 -Connection $ARSConnection `
 -IncludedProperties edsaParameters,edsaXMLSchedule,edsaModule

edsaModule is the script that the scheduled task runs; the other two attributes store the parameters and the schedule, should you want to copy those too.

Once you have the existing task and have created the new task, just issue this command and the values are copied over for you:

Set-QADObject `
 -Identity $NewTaskObj.DN `
 -ObjectAttributes @{edsaParameters=$TaskObj.edsaParameters}
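If you also want to carry the schedule across, a sketch of the same call copying both attributes (assuming the $TaskObj and $ARSConnection variables from above) might look like this:

Set-QADObject `
 -Identity $NewTaskObj.DN `
 -Connection $ARSConnection `
 -ObjectAttributes @{edsaParameters=$TaskObj.edsaParameters; edsaXMLSchedule=$TaskObj.edsaXMLSchedule}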

User report with hyperlinks

I had a request last week to report on potential name clashes when we migrate some users into our Active Directory. I wrote a simple script to read in the logon name for each user to import and check if it already existed in my AD forest. I then started thinking about how I might present this information, and it occurred to me that I could use an HTML report, mainly because when I searched my archive of code I found an example that did exactly this in virtually one line, which was quite cool, although I can’t remember where I got the tip from:

# create some formatting for the HTML file

$a = "<style>"
$a = $a + "BODY{font-family: Verdana, Arial, Helvetica, sans-serif;font-size:10;font-color: #000000}"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color: black;border-collapse: collapse;}"
$a = $a + "TH{border-width: 1px;padding: 0px;border-style: solid;border-color: black;background-color: #E8E8E8}"
$a = $a + "TD{border-width: 1px;padding: 0px;border-style: solid;border-color: black}"
$a = $a + "</style>"

#Body text, pretty much just the report title:

$b = "<H2>All users found where the name includes: $strUserName</H2>"

$arrUsers | Select-Object EmployeeID,sAMAccountname,Domain,DisplayName,mail,AccountIsDisabled | Sort-Object EmployeeID | ConvertTo-HTML -Title "Lookup for $strUserName" -Head $a -Body $b | Set-Content $HTMLFile

Now you could open the HTML file and it will show you the details in a nice table.

After compiling this report I looked at it and thought I could go one better and add a hyperlink to the user object so it would display all of the attributes in the ARS web portal. This meant I couldn’t use the trick above, but it’s not much more difficult to write your own HTML report from scratch.

The way I do this is usually to use one of the web-based HTML editors and create a dummy table with all the data set as placeholders, e.g. [USERNAME], [DISPLAYNAME]. Then, once I have this in Notepad, I set up a few variables to hold the parts of HTML that I will dynamically put together as my script runs. I use here-strings so that I don’t need to escape any quotes in the HTML string.

$HTMLHeader = @"
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
</head>
<body>
<H2>All users found where the name is in the input file</H2>
<table>
<tr><th>EmployeeID</th>
<th>sAMAccountname</th>
<th>Domain</th>
<th>DisplayName</th>
<th>mail</th>
<th>AccountIsDisabled</th></tr>
"@

Notice that in the rowTemplate I have a URL as a hyperlink; I’ll explain how I get this later.

$rowTemplate = @"
<tr><td>[employeeID]</td>
<td><a href="[URL]">[userName]</a></td>
<td>[domain]</td>
<td>[displayName]</td>
<td>[mail]</td>
<td>[enabled]</td>
</tr>
"@

$HTMLEnd = "</table></body></html>"

I then create an array of user objects I want in my report

$ADUsers=@()

In my table I’m going to include a hyperlink on the username that links to the user object in the ARS web portal. The way I worked this out was to search the ARS web portal for a known user and grab the hyperlink. Then I split this out into its component parts in the same way as I did for the HTML table template.

$stubURL = 'https://ars.mydomain.com/MyAdminPortalURL/GenerateForm.aspx?TaskId=UserProperties&TargetClass=user&DN='

Then I need to create some variables that will be used to replace the elements of the user’s DN that I get from the user object; these are the URL-encoded translations of CN=, OU= and DC=:
$cn = 'CN%3d'
$ou = '%2cOU%3d'
$dc = '%2cDC%3d'

Now I start building my HTML output

$HTMLOutput = $HTMLHeader

All that remains is to loop through each user object I have and create a new rowTemplate for each user. For each user I just replace the characters in the DN with the variables defined above, fill in the placeholders in the rowTemplate, and then add that row to the HTMLOutput variable.

ForEach ( $user in $users ) {
 $ADUser = Get-QADUser -SamAccountName $user.UserName -IncludedProperties employeeID,PrimarySMTPAddress -Connection $ARSconnection
 if ( ! $ADUser ) { continue }
 ForEach ( $returnedUser in $ADUser ) {
  # Build a custom object for this user so it can be collected in $ADUsers
  $userObj = "" | Select-Object employeeID,sAMAccountname,Domain,DisplayName,mail,AccountIsDisabled,URL
  $userObj.EmployeeID = $returnedUser.EmployeeID
  $userObj.sAMAccountname = $returnedUser.sAMAccountname
  $userObj.Domain = $returnedUser.Domain.Name
  $userObj.DisplayName = $returnedUser.DisplayName
  $userObj.mail = $returnedUser.PrimarySMTPAddress
  $userObj.AccountIsDisabled = $returnedUser.AccountIsDisabled
  # URL-encode the parts of the DN so it can be appended to the portal URL
  $userDNConverted = $returnedUser.DN.Replace("CN=",$cn)
  $userDNConverted = $userDNConverted.Replace(",OU=",$ou)
  $userDNConverted = $userDNConverted.Replace(",DC=",$dc)
  $userDNConverted = $userDNConverted.Replace(" ","+")
  $userDNConverted = "$stubURL$userDNConverted"
  $userObj.URL = $userDNConverted
  # Fill in the row template placeholders and append the row to the report body
  $HTMLRow = $rowTemplate.Replace("[userName]",$returnedUser.sAMAccountname)
  $HTMLRow = $HTMLRow.Replace("[employeeID]",$returnedUser.employeeID)
  $HTMLRow = $HTMLRow.Replace("[URL]",$userDNConverted)
  $HTMLRow = $HTMLRow.Replace("[domain]",$returnedUser.Domain.Name)
  $HTMLRow = $HTMLRow.Replace("[displayName]",$returnedUser.DisplayName)
  $HTMLRow = $HTMLRow.Replace("[enabled]","$($returnedUser.AccountIsDisabled)")
  $HTMLRow = $HTMLRow.Replace("[mail]",$returnedUser.PrimarySMTPAddress)
  $HTMLOutput += $HTMLRow
  $ADUsers += $userObj
 }
}

All that’s left to do now is output the data and, if you like, open IE and display it:

$HTMLOutput += $HTMLEnd

Out-File -FilePath $HTMLFile -InputObject $HTMLOutput

Invoke-Item $HTMLFile

I use this technique when creating emails too. A formatted HTML email is so much better than a plain text email with no formatting. With HTML you can control how it looks, and it comes across as a much more professional, finished product. Create a template email with placeholders and replace the text with .Replace. Simples.
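As a minimal sketch of that idea (the template, addresses and SMTP server name are all illustrative):

$emailTemplate = @"
<html><body>
<p>Dear [displayName],</p>
<p>Your account [userName] has been created.</p>
</body></html>
"@
# Swap the placeholders for real values, then send the result as an HTML body
$emailBody = $emailTemplate.Replace('[displayName]',$userObj.DisplayName).Replace('[userName]',$userObj.sAMAccountname)
Send-MailMessage -To 'someone@mydomain.com' -From 'ars-reports@mydomain.com' -Subject 'New account created' -Body $emailBody -BodyAsHtml -SmtpServer 'smtp.mydomain.com'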