FindDuplicateEmployeeIDs.ps1


Description

@GitHub Copilot - Welcome @BanterBoy, I’m your Copilot and I’m here to help you get things done faster. I can identify issues, explain and even improve code.

You can ask generic questions, but what I’m really good at is helping you with your code. For example:

- Generate unit tests for my code
- Explain the selected code
- Propose a fix for the bugs in my code

If you want to learn more about my capabilities and limitations, check out the Copilot documentation.

I’m powered by AI, so surprises and mistakes are possible. Make sure to verify any generated code or suggestions, and share feedback so that we can learn and improve.

@BanterBoy: - Explain what the script does and why it is useful.

@GitHub Copilot - This script searches Active Directory for user accounts that share the same employeeID value and exports the duplicates to a CSV file. It accepts an optional -SearchBase parameter to limit the search to a particular OU (falling back to the domain's default naming context) and an optional -ReferenceFile parameter for the output path (defaulting to DuplicateEmployeeIDs.csv in the script's own folder). It retrieves every user that has an employeeID set, groups the accounts by that attribute, keeps only the groups containing more than one account, and writes each duplicate account's EmployeeID, SamAccountName, Name, Enabled state, last logon date and creation date to the CSV. This is useful for identity hygiene: employeeID is often used as a unique key when synchronising Active Directory with HR or provisioning systems, so duplicate values can cause accounts to be matched to the wrong person.


Script

<#
  This script will find duplicate employeeID values and export them
  to a CSV file.

  Syntax examples:
    To process all users:
      FindDuplicateEmployeeIDs.ps1

    To process all users from a particular OU structure:
      FindDuplicateEmployeeIDs.ps1 -SearchBase "OU=Users,OU=Corp,DC=mydemosthatrock,DC=com" -ReferenceFile "c:\MyOutput.csv"

    You must use quotes around the SearchBase parameter. The comma is
    the array operator in PowerShell, so an unquoted distinguished name
    would be split into separate arguments.

  Release 1.0
  Written by [email protected] 3rd April 2014

#>

#-------------------------------------------------------------
param([String]$SearchBase, [String]$ReferenceFile)

# Get the folder the script is running from
$ScriptPath = Split-Path -Parent $MyInvocation.MyCommand.Path

if ([String]::IsNullOrEmpty($ReferenceFile)) {
    $ReferenceFile = Join-Path $ScriptPath "DuplicateEmployeeIDs.csv"
}

$UsedefaultNamingContext = $False
if ([String]::IsNullOrEmpty($SearchBase)) {
    $UsedefaultNamingContext = $True
}

#-------------------------------------------------------------
# Import the Active Directory module; -ErrorAction Stop makes a load
# failure catchable instead of relying on the contents of $Error
try {
    Import-Module ActiveDirectory -ErrorAction Stop -WarningAction SilentlyContinue
}
catch {
    Write-Host "Error while loading the Active Directory PowerShell module : $_" -ForegroundColor Red
    exit
}

#-------------------------------------------------------------
$defaultNamingContext = (Get-ADRootDSE).defaultNamingContext
$Domain = Get-ADDomain
$DistinguishedName = $Domain.DistinguishedName
$DomainName = $Domain.NetBIOSName
$DNSRoot = $Domain.DNSRoot

if ($UsedefaultNamingContext -eq $True) {
    $SearchBase = $defaultNamingContext
}
else {
    # Get-ADObject throws if the object does not exist, so suppress the
    # error and fall back to the default naming context
    $TestSearchBase = Get-ADObject "$SearchBase" -ErrorAction SilentlyContinue
    if ($Null -eq $TestSearchBase) {
        $SearchBase = $defaultNamingContext
    }
}

#-------------------------------------------------------------

function Get-LastLoggedOnDate ([string] $Date) {
    # A [string] parameter coerces $null to "", so one emptiness check suffices
    if ([String]::IsNullOrEmpty($Date)) {
        $Date = "Never logged on before"
    }
    $Date
}

$LastLoggedOnDate = @{Name = 'LastLoggedOnDate'; Expression = { Get-LastLoggedOnDate $_.LastLogonDate } }

$EmployeeID = @{Name = 'EmployeeID'; Expression = { $_.EmployeeID.Trim() } }

$TotalProcessed = 0
$filter = "(employeeID=*)"
Write-Host -ForegroundColor Green "Finding all users with an employeeID attribute and exporting the duplicates to '$ReferenceFile'`n"
# Request only the extra properties the report needs rather than -Properties *
Get-ADUser -LDAPFilter $filter -SearchBase $SearchBase -Properties EmployeeID, LastLogonDate, whenCreated |
    Select-Object $EmployeeID, SamAccountName, Name, Enabled, $LastLoggedOnDate, whenCreated |
    Group-Object EmployeeID |
    Where-Object { $_.Count -gt 1 } |
    Select-Object -ExpandProperty Group |
    Export-Csv -NoTypeInformation "$ReferenceFile"

# Remove the quotes
(Get-Content "$ReferenceFile") | ForEach-Object { $_ -replace '"', "" } | Out-File "$ReferenceFile" -Force -Encoding ascii
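The heart of the script is the Group-Object / Where-Object stage: group every user by employeeID, then keep only the groups with more than one member, and flatten those groups back into rows. That logic is not PowerShell-specific; here is a minimal Python sketch of the same idea (the sample records are hypothetical, standing in for Get-ADUser output):

```python
from collections import defaultdict

def find_duplicate_employee_ids(users):
    """Group records by their (trimmed) employeeID and return only the
    records whose employeeID is shared by more than one account,
    mirroring the script's Group-Object | Where-Object Count -gt 1 step."""
    groups = defaultdict(list)
    for user in users:
        employee_id = (user.get("EmployeeID") or "").strip()
        if employee_id:  # mirrors the (employeeID=*) LDAP filter
            groups[employee_id].append(user)
    duplicates = []
    for records in groups.values():
        if len(records) > 1:
            duplicates.extend(records)  # flatten, like Select-Object -ExpandProperty Group
    return duplicates

# Hypothetical sample records
users = [
    {"SamAccountName": "jbloggs",  "EmployeeID": "1001"},
    {"SamAccountName": "jsmith",   "EmployeeID": "1002 "},
    {"SamAccountName": "jbloggs2", "EmployeeID": "1001"},
    {"SamAccountName": "nobody",   "EmployeeID": None},
]
print([u["SamAccountName"] for u in find_duplicate_employee_ids(users)])
# → ['jbloggs', 'jbloggs2']
```

Note that the employeeID values are trimmed before grouping, just as the script's `$EmployeeID` calculated property does, so "1001" and "1001 " count as the same ID.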

Back to Top


Download

Please feel free to copy parts of the script or, if you would like to download the entire script, simply click the download button. You can download the complete repository in a zip file by clicking the Download link in the menu bar on the left-hand side of the page.


Report Issues

You can report an issue or contribute to this site on GitHub. Simply click the button below and add any relevant notes. I will attempt to respond to all issues as soon as possible.



Back to Top