
  • Managing Local Admin Passwords with LAPS

    How are you managing your local administrator passwords? Are they stored in a spreadsheet on a network share, or worse, is the same password used everywhere? Microsoft LAPS (Local Administrator Password Solution) could be the answer. LAPS is a lightweight tool that, with a few simple GPO settings, automatically randomizes local administrator passwords across your domain. It ensures each client and server has a unique, securely managed password, removing the need for spreadsheets or manual updates. Download LAPS from the Microsoft site. Copy the file to the Domain Controller and ensure the account you are logged on with is a member of 'Schema Admins'. Install only the Management Tools; as it's a DC it's optional whether to install the 'Fat Client UI', and schema updates should always be performed directly on a DC. Open PowerShell and, after seeking approval, run Update-AdmPwdADSchema. SELF will need updating on the OUs for your workstations and servers: add SELF as the Security Principal and select 'Write ms-Mcs-AdmPwd'. Now change the GPO settings on the OUs; the default password length is 14 characters but I would go higher and set it above 20. Install LAPS on a client and select only the AdmPwd GPO Extension. On the Domain Controller open the LAPS UI, search for the client and click Set. Once the password has reset, open the properties of the client and check ms-Mcs-AdmPwd for the new password. Now every 30 days the local Admin password will be automatically updated and unique. Deploy the client with ConfigMgr to the remaining estate. By default Domain Admins have access to read the password attribute, and this can be delegated to a Security Group. AND.....this is the warning.....any delegated privileges that allow Computer management with 'All Extended Rights' can also read 'ms-Mcs-AdmPwd'.
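    A minimal PowerShell sketch of the schema update and OU permissions described above, using the AdmPwd.PS module that ships with the LAPS Management Tools; the OU path and group name are illustrative examples, adjust for your own domain:

    # Run on a DC as a Schema Admin after installing the LAPS Management Tools
    Import-Module AdmPwd.PS
    Update-AdmPwdADSchema
    # Grant computers (SELF) write access to their own password attributes on the OU
    Set-AdmPwdComputerSelfPermission -OrgUnit "OU=Workstations,DC=example,DC=com"
    # Delegate read access to a security group rather than relying on Domain Admins
    Set-AdmPwdReadPasswordPermission -OrgUnit "OU=Workstations,DC=example,DC=com" -AllowedPrincipals "LAPS-Password-Readers"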

  • Using SCOM to Monitor AD and Local Accounts and Groups

    For those that have deployed SCOM without ACS or another monitoring service, but don't have a full-blown IDS\IPS: with a little effort, it's possible to at least monitor and alert when critical groups and accounts are changed. Free alternatives include ELK (Elasticsearch) or Security Onion. The following example configures SCOM to alert when Domain Admins is updated. On the Authoring Tab, Management Pack Objects, Rules, select 'NT Event Log (Alert)'. Create a new Management Pack if required, and don't ever use the default MP. The 'Rule Name' should include a unique element, shared by all subsequent rules, to assist searching later on; rules that monitor Groups or Accounts will be prefixed with 'GpMon'. The 'Rule Target' in this case is 'Windows Domain Controllers', a domain group. Change the 'Log Name' to 'Security'. Add Event ID 4728 (A member was added to a security-enabled global group). Update the Event Source to 'Contains' with a value of 'Domain Admins'. Update the priorities to High and Critical. Sit back, grab a coffee (or 2) and wait whilst the rule is distributed to the Domain Controllers, this can take a while. Test the rule by adding a group or account to Domain Admins; in the SCOM Monitoring tab an alert will almost immediately appear with full details. Now for the laborious bit, create further monitors for the following: Server Operators, Account Operators, Print Operators, Schema and Enterprise Admins, any delegation or roll-up groups, SCCM Administrative groups and CA Administrative groups. That's the obvious groups covered; now target all Windows Servers and Clients (if SCOM has been deployed to the clients) with rules for local account creation, addition to local groups and password resets, plus Applocker to alert on any unauthorised software being installed or accessed. Finally, here's what Microsoft recommends. With a few hours of effort you'll have better visibility of the system and any changes to those critical groups.
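    As a quick sanity check that the event the rule keys on is actually being logged, something like the following can be run on a Domain Controller; this is only a verification sketch, not part of the SCOM configuration itself:

    # Pull recent 'member added to security-enabled global group' events and filter on Domain Admins
    Get-WinEvent -FilterHashtable @{LogName='Security'; Id=4728} -MaxEvents 50 |
        Where-Object { $_.Message -match 'Domain Admins' } |
        Select-Object TimeCreated, Id, MachineName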

  • Always Patch Before Applocker or Device Guard are Deployed.

    Labs don't tend to follow best practices or any security standards, they're quick and dirty installations for developing and messing around. Here's some food for thought the next time you're wanting to test Applocker or Windows Defender Application Control (WDAC), aka Device Guard: you may wish to at least patch first. For the most part, deploying Domain Infrastructure, scripts and services works great, until Device Guard is deployed to an unpatched Windows 11 client. Firstly the steps on how to configure Device Guard, then the fun... DeviceGuardBasic.ps1 script can be downloaded from ( here ). Run the script as Admin and point the Local GPO to Initial.bin following the help. Device Guard is set to enforced, no audit mode for me, that's for wimps, been here hundreds of times......what's the worst that can happen..... arrghhhhh. The first indication Windows 11 had issues was that 'Settings' crashed upon opening. This isn't my first rodeo, straight to the eventlogs. Ah, a bloodbath of red Code Integrity errors complaining that a file hasn't been signed correctly. How could this be.... the files are Microsoft files. This doesn't look good, the digital signature can't be verified, meaning the signing certificate isn't in the Root Certificate Store for the Computer. This is not the first time I've seen the 'Microsoft Development PCA 2014' certificate. A few years back a sub-optimal Office 2016 update prevented Word, PowerPoint and Excel from launching; it was Applocker protecting me from the Microsoft Development certificate at that time. Well done Microsoft, I see your test and release cycle hasn’t improved. A Windows update and all is fine….right.....as if. I'm unable to click on the Install updates button, it's part of Settings and no longer accessible. Bring back Control Panel. No way I’m waiting for Windows to get around to installing the updates by itself. The choices: disable Device Guard by removing the GPO and deleting the SIPolicy.p7b file; create an additional policy based on hashes; or start again, 2 hours of effort, most of that waiting for updates to install. Creating an additional policy based on hashes and then merging it into the ‘initial’ policy allows for testing Device Guard's behaviour. Does Device Guard prevent untrusted and poorly signed files from running when hashes are present? The observed behaviour is for the Device Guard policy to create hashes for unsigned files as a fallback. The new and improved Device Guard script, aptly named 'DeviceGuard-withMerge.ps1', can be downloaded from ( here ). The only additional lines of note are the New-CIPolicy to create hashes only for the "C:\Windows\SystemApps" directory and the Merge-CIPolicy to merge the 2 XML policy files.
    New-CIPolicy -Level Hash -FilePath $HashCIPolicy -UserPEs 3> $HashCIPolicyTxt -ScanPath "C:\Windows\SystemApps\"
    Merge-CIPolicy -PolicyPaths $IntialCIPolicy,$HashCIPolicy -OutputFilePath $MergedCIPolicy
    The result: 'Settings' now works despite Microsoft's best effort to ruin my day. Creating a Device Guard policy based on hashes resolves the issue of files incorrectly signed by Microsoft's internal development CA. Below is the proof, 'Settings' is functional even with those dodgy files. Conclusion: This may come as a shock to some….. Microsoft does make mistakes and release files incorrectly signed… shocking. Device Guard will allow files to run providing the hashes are present, even when the files are incorrectly signed. Did I learn something? Hell yeah! Always patch before deploying Device Guard or Applocker.
    The time spent faffing around resolving the issue far exceeded the time it would have taken to patch in the first place.
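    A slightly fuller sketch of the merge approach; the variable names and paths here are illustrative rather than the exact ones from the downloadable script:

    $InitialCIPolicy = "C:\DG\InitialScan.xml"
    $HashCIPolicy    = "C:\DG\HashPolicy.xml"
    $MergedCIPolicy  = "C:\DG\MergedPolicy.xml"
    # Hash-only scan of the directory holding the incorrectly signed files
    New-CIPolicy -Level Hash -FilePath $HashCIPolicy -UserPEs -ScanPath "C:\Windows\SystemApps\"
    # Fold the hash rules into the initial policy
    Merge-CIPolicy -PolicyPaths $InitialCIPolicy,$HashCIPolicy -OutputFilePath $MergedCIPolicy
    # Convert to the binary format the Code Integrity engine consumes
    ConvertFrom-CIPolicy -XmlFilePath $MergedCIPolicy -BinaryFilePath "C:\DG\SIPolicy.p7b"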

  • LAPS Leaks Local Admin Passwords

    On a previous blog ( here ), LAPS (Local Administrator Password Solution) was installed. LAPS manages and updates the local Administrator passwords on clients and member servers, controlled via GPO. Only Domain Admins have default permission to view the local administrator password for clients and member servers. Access to view the passwords by non-Domain Admins is via delegation, and herein lies the problem: access to the local administrator passwords may be delegated unintentionally. This could lead to a serious security breach, leaking the local admin account passwords for all computer objects to those that shouldn't have access. This article will demonstrate a typical delegation for adding a computer object to an OU and how to tweak the delegation to prevent access to the ms-Mcs-AdmPwd attribute.
    Prep Work: LAPS is required to be installed and configured, follow the above link. At least 1 non-domain joined client, preferably 2, e.g. Windows 10 or 11 Enterprise. A test account, mine's named TestAdmin, with no privileges or delegations, and an OU named 'Workstation Test'. Ideally AD Groups would be used rather than adding TestAdmin to the delegation directly, but it's easier for demonstration purposes.
    Delegation of Account: Open Active Directory Users and Computers or type dsa.msc in the run command. With a Domain Admin account right-click on the 'Workstation Test' OU, Properties, Security Tab and then Advanced. Click Add and select TestAdmin as the principal. Select 'Applies to: This Object and all Descendant Objects' and in the Permission window below select Create Computer Objects and Delete Computer Objects, then apply the change. This is a 2-step process; repeat, this time selecting 'Applies to: Descendant Computer Objects' and Full Control in the Permissions window.
    Test Delegation: Log on to a domain workstation with RSAT installed and open Active Directory Users and Computers. Test by pre-creating a Computer object: right-click on the OU, select New > Computer, and type the hostname of the client to be added. Log on to a non-domain joined Windows client as the administrator and add it to the domain using the TestAdmin credentials, then reboot and wait for the LAPS policy to apply.......I've set a policy to update daily.
    View the LAPS Password: As TestAdmin, from within AD Users and Computers go to View and select Advanced. Right-click the client, select Properties and the Attribute Editor tab, scroll down and locate 'ms-Mcs-AdmPwd'; that's the Administrator password for that client.
    The Fix....: To prevent TestAdmin from reading the ms-Mcs-AdmPwd attribute value, a slight amendment to the delegation is required. As the Domain Admin right-click on the 'Workstation Test' OU, Properties, Security Tab and then Advanced. Select the TestAdmin entry, it should say 'Full Control'. Remove 'All Extended Rights', 'Change Password' and 'Reset Password' and apply the change. As TestAdmin, open the computer's attributes again; ms-Mcs-AdmPwd is no longer visible.
    Did I Just Break Something......: Test the change by adding a computer object to the OU and adding a client to the domain. Introducing computers to the domain is still functional... No harm no foul.
    Final Thoughts: Removing the Extended Rights and Password permissions prevents the delegated account from reading the local administrator password from the ms-Mcs-AdmPwd AD attribute without causing any noticeable problems. Watch for any future delegations, ensuring the permissions aren't restored by accident. Enjoy and hope this was insightful.
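    To audit exactly which principals hold extended rights on an OU (and can therefore read ms-Mcs-AdmPwd), the AdmPwd.PS module includes a cmdlet for this; the OU name below is an example:

    Import-Module AdmPwd.PS
    # Lists every principal with extended rights on the OU, i.e. able to read ms-Mcs-AdmPwd
    Find-AdmPwdExtendedRights -Identity "Workstation Test" | Format-List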

  • Code Signing PowerShell Scripts

    In this article, I'll describe the process of Code Signing PowerShell scripts from a Microsoft CA. I'll not cover how Code Signing adds security; simply put, Code Signing doesn't provide, nor was it intended to provide, a robust security layer. However, Code Signing does provide both Authenticity and Integrity: Authenticity, in that the script was written or reviewed by a trusted entity and then signed; Integrity, in that once signed the script hasn't been modified, useful when deploying scripts or executing scripts via a scheduled task with a service account. Bypassing Code Signing requirements is simple: open ISE, paste in the code and F8, instant bypass. However, my development 'Enterprise' system is not standard, ISE won't work as Constrained Language Mode prevents all but core functionality from loading, meaning no APIs, .NET, COM and most modules. As a note, even with the script code signed, ISE is next to useless with Constrained Language Mode enforced. Scripts require both signing and authorising in Applocker\WDAC and will only execute from native PowerShell. Back to it..... This is a typical message when executing a PowerShell script on a system requiring Code Signing. To successfully execute the script, it must be signed with a digital signature from either a CA or a Self-Signed certificate. I'm not going to Self Sign, it's filth and I've access to a Microsoft Certificate Authority (CA) as part of the Enterprise.
    Log in to the CA, launch 'Manage' and locate the 'Code Signing' template, then 'Duplicate Template'. Complete the new template with the following settings: General: name the new certificate template with something meaningful and up the validity to 3 years or to the maximum the corporate policy allows. Compatibility: update the Compatibility Settings and Certificate Recipient to 'Windows Server 2016' and 'Windows 10/Windows Server 2016' respectively. Request Handling: check 'Allow private key to be exported'. Cryptography: set 'Minimum key size' to at least 2048 (options are 1024, 2048, 4096, 8192 or 16384), select 'Requests must use one of the following providers:' and check 'Microsoft Enhanced RSA and AES Cryptographic Provider' ( description ). Security: ideally, enrolment is controlled via an AD Group with both READ and Enrol permissions; do not under any circumstances allow WRITE or FULL. Save the new template and then issue it by right-clicking on 'Certificate Template' > New and 'Certificate Template to Issue'.
    From a client, logged on with an account that is a member of the 'CRT_PowerShellCodeSigning' group, launch MMC and add the Certificate snap-in for the Current User. Browse to Personal > Certificates and right-click in the empty space to the right, then click on 'All Tasks' > 'Request New Certificate'. Select the 'Toyo Code Signing' template and then click on 'Properties' to add some additional information: a Friendly Name and Description. Enrol the template. Now right-click on the new 'Code Signing' certificate > All Tasks > Export. Select 'Yes, export the private key'. Ensure the 2 PKCS options are selected. Check 'Group or username (recommended)' and on the Encryption drop-down select 'AES256-SHA256'. Complete the wizard by exporting the .pfx file.
    The final step is to sign a script with the .pfx file using PowerShell; note that Set-AuthenticodeSignature expects a certificate object rather than a path, hence the Get-PfxCertificate step.
    $cert = Get-PfxCertificate -FilePath "C:\Downloads\CodeSigning.pfx"
    Set-AuthenticodeSignature -FilePath "C:\Downloads\SecureReport9.4.ps1" -Certificate $cert
    Open the newly signed script and at the bottom of the script is the digital signature. Launch PowerShell.exe and run the script. For those with Applocker\WDAC, the script still requires adding to the allow list by file hash. Now I'll be able to execute my own Pentest script on my allegedly secure system and locate any missing settings..... As always thanks for your support.
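    An alternative sketch that signs with the certificate straight from the user's certificate store and adds a timestamp, so the signature remains valid after the certificate expires; the timestamp URL is an example:

    # Pick up the code-signing certificate enrolled into the Current User store
    $cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
    Set-AuthenticodeSignature -FilePath "C:\Downloads\SecureReport9.4.ps1" -Certificate $cert -TimestampServer "http://timestamp.digicert.com"
    # Confirm the signature is valid
    Get-AuthenticodeSignature -FilePath "C:\Downloads\SecureReport9.4.ps1" | Select-Object Status, StatusMessage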

  • How to Delegate Active Directory OU's with PowerShell

    Today is a quick explanation of OU delegation using PowerShell, with usable examples and how to locate the GUID that identifies the object type being delegated. All the required scripts can be found on my Github ( here ).
    Delegated Test Account: For demonstration purposes, the following is executed directly on the Domain Controller and as a Domain Admin. Create a test user named 'SrvOps' and add it to the 'Server Operators' group; this effectively provides Administrator privileges on the DCs without access to AD. Create the following Global Groups, CompDele, UserDele and GroupDele, and add the SrvOps user to them. Create the following OUs: Computer, User and Group. Shift and right-click 'Active Directory Users and Computers', 'Run as a Different User', and enter the SrvOps credentials. Right-click on the Computer OU and you will notice there are no options under 'New' to select an object type.
    ADSI Edit and Object GUID: Close the AD snap-in. Back as Domain Admin, launch 'adsiedit.msc'. Select 'Schema' from 'Select a well known Naming Context:' and OK. Scroll down and select 'CN=Computer' properties. On the 'Attribute Editor' tab scroll down and locate 'schemaIDGUID'. This is the GUID object identity used for delegating Computer objects. It's not possible to copy the value directly; double-clicking provides Hex or Binary values which can be copied. The following converts the Hex to the required GUID value.
    $trim = ("86 7A 96 BF E6 0D D0 11 A2 85 00 AA 00 30 49 E2").replace(" ","")
    $oct = "$trim"
    $oct1 = $oct.substring(0,2)
    $oct2 = $oct.substring(2,2)
    $oct3 = $oct.substring(4,2)
    $oct4 = $oct.substring(6,2)
    $oct5 = $oct.substring(8,2)
    $oct6 = $oct.substring(10,2)
    $oct7 = $oct.substring(12,2)
    $oct8 = $oct.substring(14,2)
    $oct9 = $oct.substring(16,4)
    $oct10 = $oct.substring(20,12)
    $strOut = "$oct4" + "$oct3" + "$oct2" + "$oct1" + "-" + "$oct6" + "$oct5" + "-" + "$oct8" + "$oct7" + "-" + "$oct9" + "-" + "$oct10"
    write-host $strOut
    #result = BF967A86-0DE6-11D0-A285-00AA003049E2
    The Script: Download the scripts from Github ( here ) and open with PowerShell_ise. Update the DN, the OU path, to the Computer OU created earlier. Execute the script and repeat for the Users and Groups scripts. Relaunch 'Active Directory Users and Computers' as a different user and enter the SrvOps account credentials. Right-click on each of the OUs and 'New'; you will notice SrvOps can now create objects relative to the name of the OU.
    Final Considerations: Retrieving the 'schemaIDGUID' from ADSI Edit allows the delegation of pretty much any object type within AD, and for the most part a couple of minor tweaks to the scripts provided and you're set. Enjoy, and if you find this useful please provide some feedback via the homepage's comment box.
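    As an alternative to converting the hex by hand, the schemaIDGUID can be read straight from the schema with the ActiveDirectory module and cast to a GUID; a minimal sketch:

    Import-Module ActiveDirectory
    $schemaNC = (Get-ADRootDSE).schemaNamingContext
    # Returns BF967A86-0DE6-11D0-A285-00AA003049E2 for the Computer class
    $computerClass = Get-ADObject -SearchBase $schemaNC -LDAPFilter "(lDAPDisplayName=computer)" -Properties schemaIDGUID
    [guid]$computerClass.schemaIDGUID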

  • Failure Deploying Applications with SCCM\MECM with Error 0x87d01106 and 0x80070005

    I encountered an issue with SCCM\MECM failing to deploy the LAPS application to clients and servers. This was previously working fine but was now failing with a Past Due error in Software Center. The AppEnforce.log produced the only meaningful SCCM error events, 0x87d01106 and 0x80070005.
    0x80070005
    CMsiHandler::EnforceApp failed (0x80070005).
    AppProvider::EnforceApp - Failed to invoke EnforceApp on Application handler(0x80070005).
    CommenceEnforcement failed with error 0x80070005.
    Method CommenceEnforcement failed with error code 80070005
    ++++++ Failed to enforce app. Error 0x80070005. ++++++
    CMTrace Error Lookup reported ‘Access denied’
    0x87d01106
    Invalid executable file C:\Windows\msiexec.exe
    CMsiHandler::EnforceApp failed (0x87d01106).
    AppProvider::EnforceApp - Failed to invoke EnforceApp on Application handler(0x87d01106).
    CommenceEnforcement failed with error 0x87d01106.
    Method CommenceEnforcement failed with error code 87D01106
    ++++++ Failed to enforce app. Error 0x87d01106. ++++++
    CMTrace Error Lookup reported: Failed to verify the executable file is valid or to construct the associated command line. Source: Microsoft Endpoint Configuration Manager
    Interestingly, testing revealed that .msi applications, configuration items (aka compliance) and WDAC policy were affected, with .exe deployments remaining unaffected. Executing the install string from the administrator account also worked. Somewhat concerning, as SCCM deployments execute as System, the highest privilege possible, yet all application installs failed across the entire domain. At this point, Google is normally your friend..... but the results suggested PowerShell and the wrong user context; as it's an msi issue, these suggestions were not helpful. Clearly, I'm asking the wrong question...... When in doubt or.... stuck, trawl the eventlogs, the SCCM logs weren't going to give up anything further. Fortunately, in fairly short order the following error was located in the Windows Defender log.
    Microsoft Defender Exploit Guard has blocked an operation that is not allowed by your IT administrator. For more information please contact your IT administrator.
    ID: D1E49AAC-8F56-4280-B9BA-993A6D77406C
    Detection time: 2023-02-23T21:03:46.265Z
    User: NT AUTHORITY\SYSTEM
    Path: C:\Windows\System32\msiexec.exe
    Process Name: C:\Windows\System32\wbem\WmiPrvSE.exe
    Target Commandline: "C:\Windows\system32\msiexec.exe" /i "LAPS.x64.msi" /q /qn
    Parent Commandline: C:\Windows\system32\wbem\wmiprvse.exe -Embedding
    Involved File:
    Inheritance Flags: 0x00000000
    Security intelligence Version: 1.383.518.0
    Engine Version: 1.1.20000.2
    Product Version: 4.18.2301.6
    Now I know the correct question to ask Google, 'D1E49AAC-8F56-4280-B9BA-993A6D77406C', with Attack Surface Reduction (ASR) being the culprit. The following is an extract from the Microsoft page for the rule 'Block process creations originating from PSExec and WMI commands' (D1E49AAC-8F56-4280-B9BA-993A6D77406C): 'This rule blocks processes created through PsExec and WMI from running. Both PsExec and WMI can remotely execute code. There's a risk of malware abusing the functionality of PsExec and WMI for command and control purposes, or to spread infection throughout an organization's network. Warning: Only use this rule if you're managing your devices with Intune or another MDM solution. This rule is incompatible with management through Microsoft Endpoint Configuration Manager because this rule blocks WMI commands the Configuration Manager client uses to function correctly.'
    There is no fix, only a workaround: update the ASR rule from Block Mode to Audit Mode in Group Policy. Open GPO Management and locate the ASR rules under Windows Components/Microsoft Defender Antivirus/Microsoft Defender Exploit Guard/Attack Surface Reduction. Open 'Configure Attack Surface Reduction Rules' and update value name 'D1E49AAC-8F56-4280-B9BA-993A6D77406C' from 1 to 2. Gpupdate /force to refresh the GPOs on the client, then check the eventlog for event ID 5007 recording the change from Block to Audit Mode. Test an SCCM Application deployment to confirm the fix. One final check of the event log confirms event ID 1122 for the deployed application.
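    For a quick local check or test, the same rule can be flipped to audit mode with Defender's PowerShell module; note this is only a sketch, and on a domain-joined client the GPO setting will take precedence:

    # Current ASR rule IDs and their actions (0 = Disabled, 1 = Block, 2 = Audit)
    Get-MpPreference | Select-Object -ExpandProperty AttackSurfaceReductionRules_Ids
    Get-MpPreference | Select-Object -ExpandProperty AttackSurfaceReductionRules_Actions
    # Switch the PsExec/WMI rule to Audit mode
    Add-MpPreference -AttackSurfaceReductionRules_Ids 'D1E49AAC-8F56-4280-B9BA-993A6D77406C' -AttackSurfaceReductionRules_Actions AuditMode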

  • Change MDT Mapped Z: Drive

    When deploying a Windows operating system or installing MDT applications, a mapped network drive is usually mounted temporarily as Z:\. The letter "Z" is chosen because it is typically not used for local drives in most deployments, so it's less likely to conflict with existing drive letters on the target computer. What happens when an application requires the Z:\ drive during the process of deploying an image through MDT? It's often better to overlook your initial reaction..... how can Z: be engaged during an operating system installation? Applications can persist with preconfigured mapped network drives. The illustration provided is a common example of a regular operating system deployment, and it's evident that the drive letter Z: is assigned to the MDT Deployment share. There appear to be two approaches to altering the fixed Z:\ drive mapping to a different letter, although there might be additional methods available as well. During my search for a solution, Google yielded no results, which could potentially be attributed to me asking the wrong questions. Late to the party, and whilst writing this blog, ChatGPT provided a suggestion to address the issue: update the 'CustomSettings.ini' file by incorporating 'DriveLetter=Y'. Had it succeeded on the initial attempt, it would have been a more graceful resolution; unfortunately that wasn't the case, and I haven't delved into the reasons behind the failure. Let's proceed with a working solution by modifying the hardcoded drive letter in ZTIUtility.vbs. I'm using PowerShell_ISE as it conveniently displays the line number. Browse to C:\MDTDeploymentShare\Scripts\ZTIUtility.vbs, search for "z" and, on line 3003 or thereabouts depending on the version of MDT installed, update the hardcoded drive 'Z' to something else, not C: or X: as these are also used by the OS and MDT. In this case, I've designated the letter 'T' as the new MDT mapped network drive. Regenerate the Boot images by Updating the Deployment Share, choose 'Completely regenerate the boot images', then grab a coffee. Launch WDS and Replace the Image, browse to the MDT Share and select LiteTouchPE_x64.wim. Deploy a new Windows OS from MDT PXE and the MDTDeploymentShare is now mapped as "T:\". If you found the content valuable, I encourage you to explore the MDT deployment guides and instructional resources available under the main website sections. Finally, I'm headed off to have strong words with the individual responsible for implementing an application that requires hardcoded drives for configuration components.
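    To find the hardcoded drive letter quickly rather than scrolling, a one-liner along these lines can be pointed at the deployment share; the path is an example:

    # Report the line number(s) in ZTIUtility.vbs where the hardcoded Z: drive appears
    Select-String -Path "C:\MDTDeploymentShare\Scripts\ZTIUtility.vbs" -Pattern '"Z"' | Select-Object LineNumber, Line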

  • Sorting Files into Years and Month

    Thousands of files, no structure; let's get them organised into months and years with PowerShell, with duplicates moved to another directory for review. This script was written in response to trying to manage the tens of thousands of photos and videos being uploaded to a file share each year. Management is near impossible with Synology’s DS Photo Android App automatically uploading new photos to the root of the share, plus any taken with cameras or other mobiles were also dumped into the same share. A bit of a mess. For the purposes of testing and this blog, a Data directory was created off the root of C:\. A few hundred photos and videos have been dumped… oops… copied into the folder, and the files were copied to create duplicates. Download the 'hash and then sort by month' script from @ Tenaka/FileSystem (github.com). Open PowerShell_ise and browse to the downloaded script. Update the $path variable, Ctrl + A and then F8, sit back and wait for the files to be organised. On a serious note, please don't run this without testing. So what does it do: all files are compared based on their file hash to find duplicates; duplicate file names are amended to include an incremental number, preventing potential loss of data from files overwriting each other; files that aren't duplicates are moved, based on their creation date, to a Year\Month directory.
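    The logic boils down to something like the following minimal sketch (not the full Github script; paths are examples, so test against a copy of the data first):

    $path  = "C:\Data"
    $dupes = "C:\Data_Duplicates"
    New-Item -Path $dupes -ItemType Directory -Force | Out-Null
    $seen = @{}
    $count = 0
    Get-ChildItem -Path $path -File | ForEach-Object {
        $hash = (Get-FileHash -Path $_.FullName -Algorithm SHA256).Hash
        if ($seen.ContainsKey($hash)) {
            # Duplicate content: rename with an incremental number and move aside for review
            $count++
            Move-Item -Path $_.FullName -Destination (Join-Path $dupes "${count}_$($_.Name)")
        }
        else {
            $seen[$hash] = $true
            # Unique file: move to a Year\Month folder based on its creation date
            $dest = Join-Path $path ("{0}\{1:00}" -f $_.CreationTime.Year, $_.CreationTime.Month)
            New-Item -Path $dest -ItemType Directory -Force | Out-Null
            Move-Item -Path $_.FullName -Destination $dest
        }
    }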

  • Ivanti Endpoint Manager Initial Setup for Endpoint Protection

    Ivanti's Endpoint Protection's Application Control: Ivanti Endpoint Protection is a comprehensive security solution that provides organizations with a broad set of security tools designed to protect their endpoints, networks, and data. It is designed to protect users from the latest threats, such as malware, ransomware, and phishing attacks. It also provides advanced capabilities, such as patch management, application control, and user privilege management. With Ivanti Endpoint Protection, organizations can ensure their endpoints are secure and protected from the latest threats. This article focuses on the initial setup of Ivanti Endpoint Manager and Endpoint Security Application Control, agent deployment and policy. This will provide the basis for the next round of 'versus' articles, having thoroughly abused Windows Applocker, WDAC and GPO. The following has been extracted from the Ivanti Endpoint Protection user guide, downloadable from ( here ): Ivanti® Endpoint Manager and Endpoint Security for Endpoint Manager consists of a wide variety of powerful and easy-to-use tools you can use to help manage and protect your Windows, Macintosh, mobile, Linux, and UNIX devices. Endpoint Manager and Security tools are proven to increase end user and IT administrator productivity and efficiency. LANDesk Application Control offers the following system-level security: kernel-level, rule-based file-system protection; Registry protection; startup control; detection of stealth rootkits; network filtering; process and file/application certification; and file protection rules that restrict actions that executable programs can perform on specified files. The initial Ivanti setup focuses on Ivanti Endpoint Protection's (EP) Application Control, to compare and pit against Microsoft's Applocker and WDAC. Ivanti's EP Firewall, Device Control and AV policies won't be configured, although it is capable of providing a full management suite of protections from within a single console. The focus is Ivanti EP vs Microsoft's application control, the paid 3rd party tool versus the free inbuilt tools.
    Ivanti Download: The good news, Ivanti provides 45-day, fully featured trial software, allowing plenty of time for EP to be put through its paces. The bad news, the trial software is not current; the download is for the 2020.1 version and not the latest 2022.2 or higher. A little sub-optimal considering it's for endpoint protection and security. Links to access Ivanti Endpoint Manager 2020.1: 45-day trial sign-up ( here ). Installation guide ( here ), a Domain with a SQL server is required.
    Disclaimers: After following the installation guide, Ivanti will require a fair amount of fettling to deploy Application Control in enforcement mode. Remember, it's only for application execution, to provide a direct comparison to Applocker and WDAC and a baseline reference for EP configuration. I'm not an Ivanti expert, I've spent a day installing and learning Ivanti. It's expected that the lack of experience with this product results in some ambiguity; I'm not interested in the journey but the net result of trying to exploit Windows with Ivanti Endpoint Protection enabled.
    Initial Login: Let's get to it...... From the Start Menu launch 'Ivanti Management Console', and enter the account details used during setup.
    Add LDAP Configuration: To integrate AD, providing search and deployment of policy, agent and software: click on 'Configuration' in the lower left pane, right-click on 'Directory' and 'Manage Directory...', then 'Add' and follow the wizard to include the domain structure using the Domain Admin account.
    Initial Agent Audit Policy: Initially, the endpoint and its software are unknown and an agent is required to be deployed. Click 'Configuration' in the bottom left window and then select 'Agent Configuration', top left. In the 'Agent Configuration' window, bottom right, right-click and select 'New Windows agent configuration'. Update the 'Agent Configuration': update 'Configuration Name' with something meaningful and check the 'Endpoint Security' option. Browse and then select 'Endpoint Protection' under 'Distribute and Patch' and then 'Security and Compliance'. Click 'Configure'. Within 'Endpoint Security' check 'Application Control:' and then click on '....' to configure the Application Control policy. Select 'Advanced' under 'Application Protection' and click on 'Learning'. With the initial policy, while Ivanti is 'Learning', there is no reason to tempt fate by locking ourselves out of the client. Select 'Learning' for 'Whitelisting'. Save the changes and close both the 'Application Control' and 'Agent Configuration' wizards.
    Agent Deployment: The agent and EP policy have been created and require deploying to a client. Ivanti Management is fully featured and comes with LANDesk; for those that aren't familiar, it's on par with SCCM\MECM. Here's a guide to assist in deploying the Ivanti agent ( here ). For expedience, I've opted for manual agent deployment. Right-click on the new agent and select 'Advance Agent'. Copy the URL and log on to the Windows 10 or 11 client. Download the .exe and install. Both Windows Defender and SmartScreen GPOs required updating to allow the Ivanti agent to install. Once the agent's installed, launch 'Ivanti Endpoint Security' from the Start Menu for a quick review. Excellent, Application Control and Whitelist learning policies are in effect. In preparation for blocking mode, launch installed applications on the client and run through some user activity. This activity is audited and logged to the Ivanti server for approval. It's time for a long coffee break, the file activity can take a little while to report back to the Ivanti server console. The initial audit results will take a few hours, a full audit will take overnight.
    Audited Files: With the agent installed the 'Win10-01' client becomes available to manage by right-clicking. Top tip, from Diagnostics it's possible to see Ivanti client and core logs. To view the audited files select 'Security and Patch' then 'Application Information'. As this is a new installation of Ivanti Endpoint Protection the audited files are classed as 'undecided'. It's not as simple as clicking and then approving the files, this can only be accomplished by updating the 'Agent Configuration' settings.
    Endpoint Security Policy - Blocking Mode: The agent has been deployed in learning mode, enabling file data collection to be available in the console. At this point those files require authorising and blocking mode enabling. The easiest method of updating the client from learning to blocking was to update the agent and not just the Endpoint Security policy, having failed repeated attempts. Right-click the 'Agent Deployment - Initial Config', Copy and then Paste, maintaining the original agent settings. Rename the agent configuration to reflect its purpose, 'Agent Deployment - Windows Client Blocking'. Right-click the new agent config, 'Properties'. Navigate to 'Endpoint Security' via 'Distribution and Patch' and then 'Security and Compliance'. Click 'Configure...' and in the 'Configure endpoint security setting' click 'New'. Add a meaningful name to the 'Endpoint Security' wizard. Click on 'Default Policy' and select '...' next to the 'Application control' dropdown. Click on 'New...'. On the 'General Settings' update the name. Click on 'Application Protection' and check the following: enable application behaviour protections, prevent master boot record (MBR) encryption, and auto detect and blacklist crypto-ransomware. Under 'File protection rules' select all the options; not all of these options may be suitable for an enterprise, and some trial and error may be required. Under 'Application Protection' click on 'Advanced' and 'Blocking', and remove any checks for 'Learning mode ...'. Under 'Whitelisting' check all options and 'Configure' and select all the script options; scripts will require authorising to work. Again on the 'Advanced' page select 'Blocking', uncheck 'Learning mode ...' and save the changes. Highlight the new policy and then 'Use Selected'. Enable Microsoft * as a trusted signer under 'Digital Signatures'; as Ivanti is authorising files by hash it seems prudent to trust and thus allow all Microsoft files. Ivanti operates at the kernel level, any file not authorised will be denied, including system files; it's reasonable to expect blue screens (BSoD) in this case. Click 'Add...' on the 'Application File List', then 'New'. To authorise the files collected from the client click on the yellow circle with a downward arrow. Click 'Import from other application file lists...'. Check the 'Computer' and select the client. Ctrl + A to highlight all files, right-click, then 'Override reputation...' and enable 'Good'. To ensure that blocking mode is enabled, set CMD.exe's reputation to 'Bad'. Click 'Next', returning to the Application File List. Highlight CMD.exe, click on the pencil, 'Edit Application Files', and set the execution from Allow to Block. OK the changes and close the Application File List, returning to 'Configure Application File Lists'. Highlight the new blocking policy then click 'Use selected'. Update the 'Learning list:' drop-down to that of the Win10 approval file list and save the changes. Ensure the 'Machine Configuration' is configured with the new Windows 10 Client Policy and save the changes. Point of note: no DLLs were listed in the authorised file list; from previous testing, bypassing application protections can be achieved when DLL file types aren't protected. Read this ( here ) where Applocker was successfully bypassed by malware with a DLL file extension.
    Deploy Agent in Blocking Mode: Click on 'Configuration' in the bottom left pane and then 'Agent Configuration'. In the bottom right pane select 'My Configurations'. Right-click and Properties on the 'Agent Deployment - Windows Client Blocking'. As the target client already has the agent installed, a 'scheduled agent deployment' or 'scheduled update to agent settings' should work. I've opted for the agent deployment, removing the old agent and settings and installing the new agent with the new blocking configuration. Click on 'Targets', then 'Targeted Devices', and click on 'Add'. Select the Windows client with the agent installed and ensure the client box is checked. In 'Schedule task', select 'Start Now' and then 'Save'.
    The Client: Log in to the client and after about 15 minutes the Ivanti agent with the blocking configuration will have been deployed. The client is likely to show that the 'Status' is disabled for all components with 'Application Control' also displaying 'Off'. Reboot the client. After the reboot the agent should show the following: launching cmd.exe displays the following Ivanti message, cmd is indeed blocked and the policy and settings are successfully applied. The process of creating and deploying Ivanti EP is understood and is repeatable. The next step is to test how effective Ivanti EP is at protecting Windows from various Remote Code Exploits, Local Code Exploits and Reverse Shells, following the same patterns used when testing Applocker and Device Guard (WDAC). To follow shortly......

  • The Onion Router (TOR) in a Box

    Invizbox: If you’re looking to take your online privacy up a notch, combining Tor with an InvizBox router is a smart move. The InvizBox makes it easy to route your network traffic through the Tor network, giving you anonymity without having to tinker with complex configurations on each device. In this blog I'll walk through how to get Tor running on your InvizBox so you can browse the web more securely and privately.
    TOR: Tor protects your privacy and hides your IP address from your ISP and anyone interested in the traffic leaving the property, by applying multiple layers of encryption to your browser traffic and passing the traffic through a series of random Tor relays. As the traffic progresses through the relays, a layer of encryption is decrypted revealing the next hop, until the exit node where the final layer is decrypted and the original web request is sent on to its final destination. Simplified diagram of Tor, the green lines are encrypted. That's the basics of how Tor works, and I tend to run it from a Linux variant such as Kali or Backbox. A while back I purchased an Invizbox One, tested it and then chucked it in the back of the drawer. But with some extra time on my hands due to CV-19 I thought I would revisit the Invizbox. To start with, the Invizbox didn't power on, a great start; it didn't like being plugged into the USB port of the router and so I moved it to a PC. Once connected to the Admin page, the firmware had to be updated before Tor would start. On the Zyxel I assigned the DMZ to port 5, configured the Firewall, DHCP, DNS and then plugged in the yellow cable. On the Invizbox Admin page, I set the Privacy Mode to 'Tor' and set the country options to Europe and UK, wasn't sure if the UK was considered part of the EU or not...... That was pretty much it, nice and easy. Any client, Windows, Linux or even...Mac (yuck) can connect to the Invizbox wifi and browse from any country in Europe or the UK. Yesterday apparently I was visiting Romania and today it's Germany. To sum up, it's a nifty little device that makes Tor easier and more accessible to more devices, including those you can't install software on. The Invizbox was purchased a few years back at a cost of £50; it's now £80 on Amazon, and direct from Invizbox there's now a subscription for the VPN. There are alternatives like Anonabox. Would I purchase one today at £80? Unlikely, if I had to use a device I would rather build an Onion Pi or Odroid, but more likely I would carry on using Kali with Tor, it's free.
    Now the words of warning: There have been security flaws with Tor devices and with Tor as a browser, so regularly check for updates. To maintain anonymity don't use the computer where you're also logging on to Facebook, Amazon etc.... I would stay away from using Windows as it's a little heavy on the MS spyware and there's the potential for AV and Windows updates to be tampered with on the exit nodes. Only use secure websites to prevent the exit nodes from performing Man In The Middle attacks. The relay nodes are run and maintained by volunteers, which means that the nodes can't be trusted and some may be run by the NSA, FBI or criminals. https://tails.boum.org/ is recommended for maintaining privacy.
    Invizbox and Alternatives: https://www.anonabox.com/buy-anonabox-original.html https://www.invizbox.com/products/invizbox/#pricing https://www.raspberrypi.org/blog/onion-pi-tor-proxy/

  • Basics of Creating Webpages with PowerShell

    Creating a simple web report with PowerShell doesn't need to be a chore; there are limitations and it's definitely not a proper HTML editor, but that doesn't mean the output should look shoddy. Like many, I'm using PowerShell to analyse Windows and display the results. The screen grab below is a section of a report I'm currently working on and soon to be published. The script is a comprehensive vulnerability assessment written entirely in PowerShell and made to look pretty without trawling through copious amounts of log output. This blog will cover the basics of taking PowerShell objects from various sources and creating HTML output. It's not difficult, just fiddly, and a couple of different techniques may be required to successfully convert PowerShell to HTML. Before everyone gets critical regarding the script formatting, some of it is due to how ConvertTo-Html expects the data, and most of it is to help those who aren't familiar with scripting. There is a conscious decision not to use aliases or abbreviations and, where possible, to create variables.
    #Set Output Location Variables
    Nothing challenging here: creates a working directory and sets the variable for the report output. Tests the existence of the path and, if it doesn't exist, creates the directory structure.
    $RootPath = "C:\Report"
    $OutFunc = "SystemReport"
    $tpSec10 = Test-Path "$RootPath\$OutFunc\"
    if ($tpSec10 -eq $false)
        {
        New-Item -Path "$RootPath\$OutFunc\" -ItemType Directory -Force
        }
    $working = "$RootPath\$OutFunc\"
    $Report = "$RootPath\$OutFunc\" + "$OutFunc.html"
    #HTML to Text
    Keep it simple, create a variable and add some text. This is the one that ought to be straightforward and ended up being a bit of a pain; the conversion to HTML ended up producing garbage. Google gave some interesting solutions…. The fix I discovered turned out to be super simple: the fragment needs to be set as a ‘Table’ and not a ‘List’. Doh…..
    $Intro = "The results in this report are a guide and not a guarantee that the tested system is not without further defects or vulnerabilities."
    #Simple WMI
    This is a report about Windows, so we had better collect some WMI attributes. There are 2 methods: dump the attributes into a variable and process them later, or create a variable for each required attribute and hashtable the data; the latter is a lot of effort.
    $hn = Get-CimInstance -ClassName win32_computersystem
    $os = Get-CimInstance -ClassName win32_operatingsystem
    $bios = Get-CimInstance -ClassName win32_bios
    $cpu = Get-CimInstance -ClassName win32_processor
    #Foreach and New-Object
    Now life starts to get interesting. The date format needs updating from “23/11/2021 00:00:00” to “23/11/2021”; to maintain the formatting a ‘foreach’ is required to strip out the additional characters per line, with the result then added to an array. Under normal circumstances, the red code snippet would suffice.
    Foreach ($hfitem in $getHF)
        {
        $hfid = $hfitem.hotfixid
        $hfdate = ($hfitem.installedon).ToShortDateString()
        $hfurl = $hfitem.caption
        $newObjHF = $hfid, $hfdate, $hfurl
        $HotFix += $newObjHF
        }
    When dealing with HTML the correct method requires the use of the ‘New-Object’ command.
    $HotFix=@()
    $getHF = Get-HotFix | Select-Object HotFixID,InstalledOn,Caption
    Foreach ($hfitem in $getHF)
        {
        $hfid = $hfitem.hotfixid
        $hfdate = $hfitem.installedon
        $hfurl = $hfitem.caption
        $newObjHF = New-Object psObject
        Add-Member -InputObject $newObjHF -Type NoteProperty -Name HotFixID -Value $hfid
        Add-Member -InputObject $newObjHF -Type NoteProperty -Name InstalledOn -Value ($hfdate).Date.ToString("dd-MM-yyyy")
        Add-Member -InputObject $newObjHF -Type NoteProperty -Name Caption -Value $hfurl
        $HotFix += $newObjHF
        }
    #Pulling Data from the Registry
    Registry keys require 'Get-ChildItem' followed by 'Get-ItemProperty' to extract the individual settings from the Registry hive. Each setting is then assigned to a variable.
    $getUnin = Get-ChildItem "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\"
    $UninChild = $getUnin.Name.Replace("HKEY_LOCAL_MACHINE","HKLM:")
    $InstallApps =@()
    Foreach ( $uninItem in $UninChild)
        {
        $getUninItem = Get-ItemProperty $uninItem
        $UninDisN = $getUninItem.DisplayName -replace "$null",""
        $UninDisVer = $getUninItem.DisplayVersion -replace "$null",""
        $UninPub = $getUninItem.Publisher -replace "$null",""
        $UninDate = $getUninItem.InstallDate -replace "$null",""
        $newObjInstApps = New-Object -TypeName PSObject
        Add-Member -InputObject $newObjInstApps -Type NoteProperty -Name Publisher -Value $UninPub
        Add-Member -InputObject $newObjInstApps -Type NoteProperty -Name DisplayName -Value $UninDisN
        Add-Member -InputObject $newObjInstApps -Type NoteProperty -Name DisplayVersion -Value $UninDisVer
        Add-Member -InputObject $newObjInstApps -Type NoteProperty -Name InstallDate -Value $UninDate
        $InstallApps += $newObjInstApps
        }
    #Cascading Style Sheets (CSS)
    To apply a consistent style to each element we use a CSS block containing text size, colour and font, as well as spacing and background colours. Each style, for example 'h1', has a set of properties that applies to any number of elements tagged with it, reducing the repeated lines of code required; update the CSS and all elements receive the change. CSS Tutorial (w3schools.com) is a good resource to learn and try out CSS. In the example below h1, h2 and h3 set different sized fonts and colours.
    $style = @"
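    Once the CSS here-string is complete, the fragments and style typically come together with ConvertTo-Html along these lines; a minimal sketch reusing the variables built above, with the fragment names being illustrative:

    # Convert each object collection to an HTML table fragment
    $fragHotFix = $HotFix      | ConvertTo-Html -As Table -Fragment -PreContent "<h2>Installed Hotfixes</h2>"
    $fragApps   = $InstallApps | ConvertTo-Html -As Table -Fragment -PreContent "<h2>Installed Applications</h2>"
    # Stitch the CSS header and fragments into the final report
    ConvertTo-Html -Head $style -Body ($fragHotFix + $fragApps) | Out-File $Report
    Invoke-Item $Report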
