Office365 user error: An unknown error has occurred

A customer migrated a mailbox from Exchange Online back to Exchange on-premises, but after the migration the mailbox no longer received any email.
While checking what was causing this, I quickly found the following error in the user's settings in the Office 365 admin center:
Exchange: An unknown error has occurred. Refer to correlation id: 0000000000000000

[Screenshot: the error in the Office 365 admin center]

So I opened up PowerShell and ran the following command:

Get-MsolUser -UserPrincipalName UPN | fl -Property *

At the bottom, the ValidationStatus field said "Error".
To find out what the error actually was, I ran the following:

$errors = (Get-MsolUser -UserPrincipalName UPN).Errors
$errors | ForEach-Object { "`nService: " + $_.ErrorDetail.Name.split("/")[0]; "Error Message: " + $_.ErrorDetail.ObjectErrors.ErrorRecord.ErrorDescription }
$errors

In my case the error said:
Error Message: The value “6dd9696b-0300-41b2-b280-3ee189ed1d9f” of property “ExchangeGuid” is used by another recipient object. Please specify a unique value.

It is clear enough that we have a duplicate ExchangeGuid, so I started searching for the user with the same GUID, but at first I couldn't find any (see the sketch below for one way to search the active recipients).
I then checked the deleted users in Office 365 and, surprise, there I found the same user with the same ExchangeGuid.
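One way to hunt for the duplicate is to check every active recipient for that ExchangeGuid. A minimal sketch, assuming an Exchange Online PowerShell session and using the GUID from the error message:

# Look for any active recipient that already uses the ExchangeGuid from the error
$guid = "6dd9696b-0300-41b2-b280-3ee189ed1d9f"
Get-Recipient -ResultSize Unlimited |
    Where-Object { $_.ExchangeGuid -eq $guid } |
    Select-Object Name, RecipientType, ExchangeGuid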
To find the deleted users:

 Get-MsolUser -ReturnDeletedUsers | fl UserPrincipalName,ObjectID

I then deleted the user permanently with the following command (the ObjectId is the same as in the result of the command above):

Remove-MsolUser -ObjectId 7b876459-9faf-4c77-926f-54a5ebcce94a -RemoveFromRecycleBin -Force

This deletes the user, but not necessarily the old mailbox. Check whether there is still a soft-deleted mailbox:

Get-Mailbox -SoftDeletedMailbox

If there is still a mailbox, delete it as well with the following command:

Get-Mailbox -SoftDeletedMailbox UPN | Remove-Mailbox -PermanentlyDelete

After waiting about 30 minutes, the ValidationStatus of the user was Healthy again and email started flowing to the new on-premises mailbox.
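A quick way to confirm the fix once things have settled is to re-check the validation status (a one-liner sketch; UPN is again the user's userPrincipalName):

(Get-MsolUser -UserPrincipalName UPN).ValidationStatus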

From ADFS 3.0 on-premises to ADFS 4.0 in Azure

Some time ago we decided to move our ADFS farm from on-premises to Azure.
We originally had a Server 2012 R2 ADFS 3.0 farm with a Server 2012 R2 ADFS proxy.

Instead of simply moving the farm from on-premises to Azure, I found out that we could add a new Server 2016 machine with ADFS 4.0 to the old farm and let them run side by side. This way we could upgrade to the newest version of ADFS without any downtime.

How did I do this?

– Create two new Server 2016 machines in Azure with size Standard_A1_v2 (1 core, 2 GB memory)
– Domain join the ADFS server (ADFS1)
– The ADFS proxy (ADFS2) should not be domain joined, as it lives in the DMZ

Install the ADFS role through the Add Roles and Features Wizard.

No additional roles or features are needed, just click Install.
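If you prefer PowerShell over the wizard, the same role can also be installed with a one-liner (a sketch; ADFS-Federation is the feature name on Server 2016):

Install-WindowsFeature ADFS-Federation -IncludeManagementTools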

Before you can continue with the ADFS configuration itself you need the following:
– ADFS service communication certificate
– ADFS signing certificate
– ADFS decrypting certificate
– ADFS service account (the service account that is already being used for the ADFS 3.0 farm)
– A user with domain admin rights
– The Server 2016 image mounted on a domain controller (to upgrade the AD schema, if not already done)

Now you can import all the certificates. Don't forget to give the ADFS service account the rights to manage the private key of the service communication certificate.
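Importing the certificates can also be scripted instead of done through the MMC; a minimal sketch, assuming the certificates were exported as PFX files (the path and file name are placeholders):

# Import the service communication certificate into the local machine store (placeholder path)
$pfxPassword = Read-Host -AsSecureString -Prompt "PFX password"
Import-PfxCertificate -FilePath "C:\certs\servicecommunication.pfx" -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPassword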

How do you give the service account rights to the private key? See below:
a. Click Start, Run, type MMC.exe, and press Enter
b. Click File, Add/Remove Snap-in
c. Double-click Certificates
d. Select Computer account and click Next
e. Select Local computer and click Finish
f. Expand Certificates (Local Computer), expand Personal, and select Certificates
g. Right-click your new SSL and service communications certificate, select All Tasks, and select Manage Private Keys
h. Add Read access for your ADFS service account and click OK

You can now start the ADFS configuration wizard.
As we are adding the server to an existing farm, we'll choose "Add a federation server to a federation server farm".

Next you need to connect to ADFS with a domain admin account, either the current user or another user that you can select:

Now you will need to specify either the primary ADFS server (with its full FQDN) if you use the Windows Internal Database (WID), or the SQL server and database instance:

You now need to specify the service communication certificate:

Next you select the ADFS service account:

Now just hit Next a few times and it should configure the ADFS server successfully. You will get a warning in the results, but that is only because the new server is still in read-only mode.
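For reference, the whole join can also be scripted instead of clicking through the wizard. A sketch assuming a WID-based farm; the thumbprint and primary server name below are placeholders:

# Join this Server 2016 machine to the existing WID-based farm (placeholder values)
$svcCred = Get-Credential -Message "ADFS service account"
Add-AdfsFarmNode -CertificateThumbprint "0000000000000000000000000000000000000000" -PrimaryComputerName "oldadfs01.domain.local" -ServiceAccountCredential $svcCred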

You can now start testing the new ADFS server: either put it in production by adding it to the load balancer, or test it by adding your ADFS URL to your hosts file.
Once you are confident that the new server is working correctly, you can set the new ADFS server as the primary computer with the following PowerShell command:

Set-AdfsSyncProperties -Role 'PrimaryComputer'

Now, on all other ADFS servers, you will need to change the role to secondary computer with the following PowerShell commands:

On the old primary server:

Set-AdfsSyncProperties -PrimaryComputerName 'ADFS1' -Role 'SecondaryComputer'

All other servers (except the new primary server):

 Set-AdfsSyncProperties -Role SecondaryComputer -PrimaryComputerName ADFS1

ADFS proxy (WAP) installation
As we also need an ADFS proxy in Azure, we can now start configuring ADFS2.

First, add the internal ADFS URL to the local hosts file on ADFS2 and let it point to ADFS1.
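That can be done in Notepad, or with a one-liner like this (a sketch; the IP address is a placeholder for ADFS1's internal address, and the federation service name matches the example used further down):

Add-Content -Path "$env:windir\System32\drivers\etc\hosts" -Value "10.0.0.10    sts.margiestravel.com"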

Now run the following powershell command:

 Install-WindowsFeature Web-Application-Proxy -IncludeManagementTools

You'll then need to install the service communication certificate on ADFS2. Write down the thumbprint, because you will need it in the next PowerShell command.

Now run the following PowerShell command (replace the certificate thumbprint with the thumbprint you just noted):

Install-WebApplicationProxy -CertificateThumbprint 2D125F4D735DEF823178E8F17E6579D04FB97B7A -FederationServiceName sts.margiestravel.com -FederationServiceTrustCredential $(Get-Credential)

You can now start testing the new ADFS proxy server: either put it in production by adding it to the load balancer, or test it by adding your ADFS URL to your hosts file.
Once you are confident that the new server is working correctly, you can change the DNS A record to the new server or change your firewall rules.

Removing old ADFS servers

Removing an old server is as simple as uninstalling the ADFS role.
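For example, something like this on each old farm member:

# Remove the ADFS role from a decommissioned farm member
Uninstall-WindowsFeature ADFS-Federation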

Upgrading ADFS farm functional level

Before you can upgrade the ADFS farm functional level, you need to make sure the Active Directory schema has been upgraded to Server 2016.

Run the following powershell command on your domain controller:

Get-ADObject (Get-ADRootDSE).schemaNamingContext -Property objectVersion | fl objectversion

The object version should be one of the following:

87 = Windows Server 2016
69 = Windows Server 2012 R2
56 = Windows Server 2012
47 = Windows Server 2008 R2
44 = Windows Server 2008
31 = Windows Server 2003 R2
30 = Windows Server 2003
13 = Windows 2000

If it is not version 87, you should upgrade the schema using the Server 2016 image mounted on the domain controller.

Open an administrator command prompt and go to the following folder on the image: drive:\support\adprep
Now run adprep /forestprep
After that, run adprep /domainprep

It is now time to upgrade the ADFS farm functional level

Go to your ADFS server and run the following command in powershell:

Invoke-AdfsFarmBehaviorLevelRaise

When prompted, type "Y" and you're all done.
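To double-check that the raise worked, you can look at the farm behavior level afterwards:

# Should now report the Server 2016 farm behavior level
Get-AdfsProperties | fl CurrentFarmBehavior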

Azure – Recovery Services Vaults powershell

I set up Azure Recovery Services vault notification mails a few months ago, but up until now I had received only one confirmation email that it was configured; after that it was pretty quiet.
You can set up mail notifications for informational messages, warnings and errors. However, this notification functionality is still in preview (beta).

[Screenshot: Recovery Services vault notification settings]

So, as I was not even receiving any informational emails, I decided to build my own script that goes through all Recovery Services vaults and puts the results in a nice table. The results are sent out every day (the script still needs to be scheduled somewhere; see the scheduling sketch at the end of this post).
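The script reads a password that was stored encrypted on disk beforehand. One way to create that file up front (a sketch; the path is the same placeholder used in the script):

# Store the account password encrypted with DPAPI (readable only by this user on this machine)
Read-Host -AsSecureString -Prompt "Password" | ConvertFrom-SecureString | Out-File -FilePath "D:\somelocation\admin.pwd"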
Below you can find an example of my script:

$FilePath = "D:\somelocation\admin.pwd" # fill in the location of the stored password
$username = "username" # fill in username
$securePassword = ConvertTo-SecureString (Get-Content -Path $FilePath)
$cred = New-Object -TypeName System.Management.Automation.PSCredential ($username, $securePassword)
$result = Add-AzureRMAccount -Credential $cred
Select-AzureRmSubscription -SubscriptionId 00000000-0000-0000-0000-000000000000 # fill in the azure subscription id
$vaults = Get-AzureRmRecoveryServicesVault
 
 
# Create a DataTable
$table = New-Object system.Data.DataTable "Backup"
$col1 = New-Object system.Data.DataColumn Workloadname,([string])
$col2 = New-Object system.Data.DataColumn Operation,([string])
$col3 = New-Object system.Data.DataColumn Status,([string])
$col4 = New-Object system.Data.DataColumn StartTime,([string])
$col5 = New-Object system.Data.DataColumn EndTime,([string])
$col6 = New-Object system.Data.DataColumn JobId,([string])
$table.columns.add($col1)
$table.columns.add($col2)
$table.columns.add($col3)
$table.columns.add($col4)
$table.columns.add($col5)
$table.columns.add($col6)
 
# Create an HTML version of the DataTable
$html = '<table border="1" cellpadding="10">
<tbody>
<tr>
<th colspan="5">Azure Recovery service vaults</th>
<th><img src="add some cool logo" alt="Logo"></th>
</tr>'
 
foreach ($subvault in $vaults)
{
    Set-AzureRmRecoveryServicesVaultContext -Vault $subvault
    $results = Get-AzureRmRecoveryServicesBackupJob -From (Get-Date).AddDays(-1).ToUniversalTime() # show all jobs from the last 24 hours
 
    # Add a row with the vault name, followed by the column headers for this vault
    $html += '<tr>
<th colspan="6">Recovery services vault: ' + $subvault.Name + '</th>
</tr>'
    $html += '<tr>
<th>Workloadname</th>
<th>Operation</th>
<th>Status</th>
<th>StartTime</th>
<th>EndTime</th>
<th>JobId</th>
</tr>'
 
    foreach ($line in $results)
    {
        # Fill a DataTable row and use it to build the HTML table row for this job
        $row = $table.NewRow()
        $row.Workloadname = $line.WorkloadName
        $row.Operation = $line.Operation
        $row.Status = $line.Status
        $row.StartTime = $line.StartTime
        $row.EndTime = $line.EndTime
        $row.JobId = $line.JobId
 
        $html += '<tr>
<td>' + $row[0] + '</td>
<td>' + $row[1] + '</td>
<th>' + $row[2] + '</th>
<td>' + $row[3] + '</td>
<td>' + $row[4] + '</td>
<td>' + $row[5] + '</td>
</tr>'
    }
}
 
$html += '</tbody>
</table>'
 
$email = @{
From = ""
To = ""
Subject = "Azure Backup results"
SMTPServer = "smtp server"
Body = $html
}
 
 
Send-MailMessage @email -UseSsl -BodyAsHtml

You will need to change the following parameters yourself:

Select-AzureRmSubscription -SubscriptionId 00000000-0000-0000-0000-000000000000 # fill in the azure subscription id 
 
$FilePath = "D:\somelocation\admin.pwd" # fill in the location of the stored password
$username = "username" # fill in username
 
From = ""
To = ""
Subject = "Azure Backup results"
SMTPServer = "smtp server"

The end result will look something like this: