
ASP.NET Core authentication using Microsoft Entra External ID for customers (CIAM)

This article looks at implementing an ASP.NET Core application which authenticates using Microsoft Entra External ID for customers (CIAM). The ASP.NET Core authentication is implemented using the Microsoft.Identity.Web NuGet package. The client implements the OpenID Connect code flow with PKCE as a confidential client.

Code: https://github.com/damienbod/EntraExternalIdCiam

Microsoft Entra External ID for customers (CIAM) is a new Microsoft product for customer (B2C) identity solutions. It changes many things compared to the existing Azure AD B2C solution and adopts many of the features from Azure AD. At present, the product is in public preview.

App registration setup

As with any Azure AD, Azure AD B2C or Azure AD CIAM application, an Azure App registration is created and used to define the authentication client. The ASP.NET Core application is a confidential client and must use a secret or a certificate to authenticate the application as well as the user. The client authenticates using the OpenID Connect (OIDC) confidential code flow with PKCE. The implicit flow does not need to be activated.

User flow setup

In Microsoft Entra External ID for customers (CIAM), the application must be connected to the user flow. In external identities, a new user flow can be created and the application (the Azure App registration) can be added to the user flow. The user flow can be used to define the specific customer authentication requirements.

ASP.NET Core application

The ASP.NET Core application is implemented using the Microsoft.Identity.Web NuGet package. The recommended flow for trusted applications is the OpenID Connect confidential code flow with PKCE. This is set up using the AddMicrosoftIdentityWebApp method together with the EnableTokenAcquisitionToCallDownstreamApi method. The CIAM client configuration is read from the EntraExternalID JSON section.

services.AddDistributedMemoryCache();

services.AddAuthentication(OpenIdConnectDefaults.AuthenticationScheme)
    .AddMicrosoftIdentityWebApp(builder.Configuration.GetSection("EntraExternalID"))
    .EnableTokenAcquisitionToCallDownstreamApi()
    .AddDistributedTokenCaches();

In the appsettings.json, the user secrets or the production setup, the client specific configuration is defined. The settings must match the Azure App registration. The SignUpSignInPolicyId is no longer used compared to Azure AD B2C.

// -- using ciamlogin.com --
"EntraExternalID": {
  "Authority": "https://damienbodciam.ciamlogin.com",
  "ClientId": "0990af2f-c338-484d-b23d-dfef6c65f522",
  "CallbackPath": "/signin-oidc",
  "SignedOutCallbackPath": "/signout-callback-oidc"
  // "ClientSecret": "--in-user-secrets--"
},
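Putting this together, a minimal Program.cs sketch for the CIAM client could look like the following. This is only a sketch assuming a Razor Pages host; the Microsoft.Identity.Web.UI package, the fallback authorization policy and the middleware order are common defaults and are not taken from the post.

using Microsoft.AspNetCore.Authentication.OpenIdConnect;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc.Authorization;
using Microsoft.Identity.Web;
using Microsoft.Identity.Web.UI;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDistributedMemoryCache();

// OpenID Connect confidential client (code flow with PKCE) against the CIAM tenant,
// configuration is read from the "EntraExternalID" section shown above
builder.Services.AddAuthentication(OpenIdConnectDefaults.AuthenticationScheme)
    .AddMicrosoftIdentityWebApp(builder.Configuration.GetSection("EntraExternalID"))
    .EnableTokenAcquisitionToCallDownstreamApi()
    .AddDistributedTokenCaches();

// Require an authenticated user for all Razor Pages and add the sign-in/sign-out UI
builder.Services.AddRazorPages()
    .AddMvcOptions(options =>
    {
        var policy = new AuthorizationPolicyBuilder()
            .RequireAuthenticatedUser()
            .Build();
        options.Filters.Add(new AuthorizeFilter(policy));
    })
    .AddMicrosoftIdentityUI();

var app = builder.Build();

app.UseHttpsRedirection();
app.UseStaticFiles();
app.UseRouting();

app.UseAuthentication();
app.UseAuthorization();

app.MapRazorPages();
app.MapControllers(); // used by the Microsoft Identity UI account controller

app.Run();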
Notes

I always try to implement user flows for B2C solutions and avoid custom setups, as these setups are hard to maintain, expensive to keep updated and hard to migrate when the product reaches end of life. Setting up a CIAM client in ASP.NET Core works without problems. CIAM offers many more features but is still missing some essential ones. This product is starting to look really good and will be a great improvement on Azure AD B2C when it is feature complete. Strong authentication is missing from Microsoft Entra External ID for customers (CIAM) and this makes it hard to test using my Azure AD users. Hopefully FIDO2 and passkeys will be supported soon.

See the following link for the supported authentication methods: https://learn.microsoft.com/en-us/azure/active-directory/external-identities/customers/concept-supported-features-customers

I also require a standard OpenID Connect identity provider (code flow confidential client with PKCE support) in most of my customer solution rollouts. This is not supported at present. With CIAM, new possibilities open up for creating single solutions that support both B2B and B2C use cases. Support for Azure security groups and Azure roles in Microsoft Entra External ID for customers (CIAM) is one of the features which makes this possible.

Links

https://learn.microsoft.com/en-us/azure/active-directory/external-identities/
https://www.microsoft.com/en-us/security/business/identity-access/microsoft-entra-external-id
https://www.cloudpartner.fi/?p=14685
https://developer.microsoft.com/en-us/identity/customers
https://techcommunity.microsoft.com/t5/microsoft-entra-azure-ad-blog/microsoft-entra-external-id-public-preview-developer-centric/ba-p/3823766
https://github.com/AzureAD/microsoft-identity-web


Full time Javascript Developer wanted!
Category: Jobs

An award-winning accounting firm local to Cleveland/Akron, OH is looking for a mid-level (3-6 years ...


Views: 38 Likes: 89
What is Computer Programming
Category: Computer Programming



Views: 0 Likes: 17
Reset user account passwords using Microsoft Graph and application permissions in ASP.NET Core

This article shows how to reset a password for tenant members using a Microsoft Graph application client in ASP.NET Core. An Azure App registration is used to define the application permission for the Microsoft Graph client and the User Administrator role is assigned to the Azure Enterprise application created from the Azure App registration.

Code: https://github.com/damienbod/azuerad-reset

Create an Azure App registration with the Graph permission

An Azure App registration was created which requires a secret or a certificate. The Azure App registration has the User.ReadWrite.All application permission and is used together with the assigned Azure role. This client is only for application clients and not delegated clients.

Assign the User Administrator role to the App Registration

The User Administrator role is assigned to the Azure App registration (the Azure Enterprise application per tenant). You can do this in the User Administrator role assignments, where a new assignment can be added. Choose the Enterprise application corresponding to the Azure App registration and assign the role as always active.

Create the Microsoft Graph application client

In the ASP.NET Core application, a new Graph application client can be created using the Microsoft Graph SDK and Azure.Identity. GetChainedTokenCredentials is used to authenticate using a managed identity for the production deployment or a user secret in development. You could also use a certificate. This is the managed identity from the Azure App Service where the application is deployed in production.

using Azure.Identity; using Microsoft.Graph; namespace SelfServiceAzureAdPasswordReset; public class GraphApplicationClientService { private readonly IConfiguration _configuration; private readonly IHostEnvironment _environment; private GraphServiceClient? _graphServiceClient; public GraphApplicationClientService(IConfiguration configuration, IHostEnvironment environment) { _configuration = configuration; _environment = environment; } /// <summary> /// gets a singleton instance of the GraphServiceClient /// </summary> public GraphServiceClient GetGraphClientWithManagedIdentityOrDevClient() { if (_graphServiceClient != null) return _graphServiceClient; string[] scopes = new[] { "https://graph.microsoft.com/.default" }; var chainedTokenCredential = GetChainedTokenCredentials(); _graphServiceClient = new GraphServiceClient(chainedTokenCredential, scopes); return _graphServiceClient; } private ChainedTokenCredential GetChainedTokenCredentials() { if (!_environment.IsDevelopment()) { // You could also use a certificate here return new ChainedTokenCredential(new ManagedIdentityCredential()); } else // dev env { var tenantId = _configuration["AzureAdGraphTenantId"]; var clientId = _configuration.GetValue<string>("AzureAdGraphClientId"); var clientSecret = _configuration.GetValue<string>("AzureAdGraphClientSecret"); var options = new TokenCredentialOptions { AuthorityHost = AzureAuthorityHosts.AzurePublicCloud }; // https://docs.microsoft.com/dotnet/api/azure.identity.clientsecretcredential var devClientSecretCredential = new ClientSecretCredential( tenantId, clientId, clientSecret, options); var chainedTokenCredential = new ChainedTokenCredential(devClientSecretCredential); return chainedTokenCredential; } } }
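The post does not show how these services are registered. A minimal DI sketch, assuming a Razor Pages host and the classes from this article (the chosen lifetimes are mine, not from the post):

var builder = WebApplication.CreateBuilder(args);

// The Graph client factory shown above plus the two reset services from this article
builder.Services.AddSingleton<GraphApplicationClientService>();
builder.Services.AddScoped<UserResetPasswordApplicationGraphSDK4>();
builder.Services.AddScoped<UserResetPasswordApplicationGraphSDK5>();
builder.Services.AddRazorPages();

var app = builder.Build();
app.MapRazorPages();
app.Run();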
Reset the password with Microsoft Graph SDK 4

Once the client is authenticated, the Microsoft Graph SDK can be used to implement the logic. You need to decide whether SDK 4 or SDK 5 is used to implement the Graph client. Most applications must still use Graph SDK 4, but no documentation exists for this any more; refer to Stack Overflow or trial and error. The application has one method to get the user and a second one to reset the password and force a change on the next authentication. This is OK for low security requirements, but a TAP with strong authentication should always be used if possible.

using Microsoft.Graph; using System.Security.Cryptography; namespace SelfServiceAzureAdPasswordReset; public class UserResetPasswordApplicationGraphSDK4 { private readonly GraphApplicationClientService _graphApplicationClientService; public UserResetPasswordApplicationGraphSDK4(GraphApplicationClientService graphApplicationClientService) { _graphApplicationClientService = graphApplicationClientService; } private async Task<string> GetUserIdAsync(string email) { var filter = $"startswith(userPrincipalName,'{email}')"; var graphServiceClient = _graphApplicationClientService .GetGraphClientWithManagedIdentityOrDevClient(); var users = await graphServiceClient.Users .Request() .Filter(filter) .GetAsync(); return users.CurrentPage[0].Id; } public async Task<string?> ResetPassword(string email) { var graphServiceClient = _graphApplicationClientService .GetGraphClientWithManagedIdentityOrDevClient(); var userId = await GetUserIdAsync(email); if (userId == null) { throw new ArgumentNullException(nameof(email)); } var password = GetRandomString(); await graphServiceClient.Users[userId].Request() .UpdateAsync(new User { PasswordProfile = new PasswordProfile { Password = password, ForceChangePasswordNextSignIn = true } }); return password; } private static string GetRandomString() { var random = $"{GenerateRandom()}{GenerateRandom()}{GenerateRandom()}{GenerateRandom()}-AC"; return random; } private static int GenerateRandom() { return RandomNumberGenerator.GetInt32(100000000, int.MaxValue); } }

Reset the password with Microsoft Graph SDK 5

Microsoft Graph SDK 5 can also be used to implement the logic to reset the password and force a change on the next sign-in.
using Microsoft.Graph; using Microsoft.Graph.Models; using System.Security.Cryptography; namespace SelfServiceAzureAdPasswordReset; public class UserResetPasswordApplicationGraphSDK5 { private readonly GraphApplicationClientService _graphApplicationClientService; public UserResetPasswordApplicationGraphSDK5(GraphApplicationClientService graphApplicationClientService) { _graphApplicationClientService = graphApplicationClientService; } private async Task<string?> GetUserIdAsync(string email) { var filter = $"startswith(userPrincipalName,'{email}')"; var graphServiceClient = _graphApplicationClientService .GetGraphClientWithManagedIdentityOrDevClient(); var result = await graphServiceClient.Users.GetAsync((requestConfiguration) => { requestConfiguration.QueryParameters.Top = 10; if (!string.IsNullOrEmpty(email)) { requestConfiguration.QueryParameters.Search = $"\"userPrincipalName:{email}\""; } requestConfiguration.QueryParameters.Orderby = new string[] { "displayName" }; requestConfiguration.QueryParameters.Count = true; requestConfiguration.QueryParameters.Select = new string[] { "id", "displayName", "userPrincipalName", "userType" }; requestConfiguration.QueryParameters.Filter = "userType eq 'Member'"; // onPremisesSyncEnabled eq false requestConfiguration.Headers.Add("ConsistencyLevel", "eventual"); }); return result!.Value!.FirstOrDefault()!.Id; } public async Task<string?> ResetPassword(string email) { var graphServiceClient = _graphApplicationClientService .GetGraphClientWithManagedIdentityOrDevClient(); var userId = await GetUserIdAsync(email); if (userId == null) { throw new ArgumentNullException(nameof(email)); } var password = GetRandomString(); await graphServiceClient.Users[userId].PatchAsync( new User { PasswordProfile = new PasswordProfile { Password = password, ForceChangePasswordNextSignIn = true } }); return password; } private static string GetRandomString() { var random = $"{GenerateRandom()}{GenerateRandom()}{GenerateRandom()}{GenerateRandom()}-AC"; return random; } private static int GenerateRandom() { return RandomNumberGenerator.GetInt32(100000000, int.MaxValue); } }

Any Razor Page can use the service and update the password. The Razor Page requires protection to prevent any user or bot from updating any other user account. Some type of secret is required to use the service, or an extra ID which can be created by an internal IT admin. DDoS and bot protection are also required if the Razor Page is deployed to a public endpoint, and a delay after each request must also be implemented (a throttling sketch follows below). Extreme caution needs to be taken when exposing this business functionality.

private readonly UserResetPasswordApplicationGraphSDK5 _userResetPasswordApp; [BindProperty] public string Upn { get; set; } = string.Empty; [BindProperty] public string? Password { get; set; } = string.Empty; public IndexModel(UserResetPasswordApplicationGraphSDK5 userResetPasswordApplicationGraphSDK5) { _userResetPasswordApp = userResetPasswordApplicationGraphSDK5; } public void OnGet(){} public async Task<IActionResult> OnPostAsync() { if (!ModelState.IsValid) { return Page(); } Password = await _userResetPasswordApp .ResetPassword(Upn); return Page(); }

The demo application can be started and a password from a local member can be reset. The https://mysignins.microsoft.com/security-info URL can be used to test the new password and add MFA or whatever.
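As a sketch of the throttling mentioned above, the built-in rate limiting middleware could be used together with a fixed delay in the handler. This assumes .NET 7 or later and is illustrative only; the limits and the policy name are examples, not from the post.

using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

// Program.cs: throttle the reset endpoint and return 429 when the limit is hit
builder.Services.AddRateLimiter(options =>
{
    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;
    options.AddFixedWindowLimiter("password-reset", limiter =>
    {
        limiter.PermitLimit = 3;                   // max 3 requests
        limiter.Window = TimeSpan.FromMinutes(1);  // per minute per instance
        limiter.QueueLimit = 0;
    });
});

var app = builder.Build();
app.UseRateLimiter();
app.MapRazorPages().RequireRateLimiting("password-reset");

// In the page handler, a deliberate delay makes automated guessing slower
public async Task<IActionResult> OnPostAsync()
{
    if (!ModelState.IsValid)
    {
        return Page();
    }

    await Task.Delay(TimeSpan.FromSeconds(2)); // fixed slow-down per request

    Password = await _userResetPasswordApp.ResetPassword(Upn);
    return Page();
}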
Notes

You can use this solution for applications with no user. If an administrator or a user is used to reset the passwords, then delegated permissions should be used with different Graph SDK methods and different Graph permissions.

Links

https://aka.ms/mysecurityinfo
https://learn.microsoft.com/en-us/graph/api/overview?view=graph-rest-1.0
https://learn.microsoft.com/en-us/graph/sdks/paging?tabs=csharp
https://learn.microsoft.com/en-us/graph/api/authenticationmethod-resetpassword?view=graph-rest-1.0&tabs=csharp


Solved!! Asset was not declared to be precompiled ...
Category: Technology

When developing using Ruby on Rails, and you try to add Bootstrap to an already existing application ...


Views: 1298 Likes: 81
Software Development Architecture and Good Practic ...
Category: System Design

These notes are used to drill down into the most op ...


Views: 0 Likes: 33
SqlException: A connection was successfully establ ...
Category: .Net 7

Question How do you solve the error that says "SqlException A connection was ...


Views: 0 Likes: 31
Why is Python so popular among Machine-Learning De ...
Category: Research

Python is a popular programming language among machine learning developers for several reasons ...


Views: 0 Likes: 30
MicroServerlessLamba.NET Series – Intro: A Scalable Server-less Architecture in AWS Lambda with ASP.NET Core Microservices

Over the last 18 months I have been working with our team to develop a scalable microservice architecture in AWS Lambda. Starting with very little experience in AWS prior to this undertaking and now feeling MUCH more comfortable with it, I wanted to summarize our journey and share it with others who may find it useful. I would also love to hear and learn from those of you who may have feedback on our approach.

First, why server-less?

While containers seem to be all the rage in the software industry (and I do appreciate their utility), I do not believe they are the silver bullet many present them to be. In our situation, we were building a greenfield solution with no dependencies to shackle us and nothing requiring a hosting mechanism that allowed for OS-level customizations or installs. Server-less offerings were also a new cloud trend in the industry which almost everyone was using in one way or another, so we definitely knew we would utilize them as well. The more we researched it, the more we thought we could use it for everything with little to no compromise.

Every adopted technology has a cost in learning and developing an expertise. This is especially apparent with a small team. We wanted to move fast and were looking for every opportunity to save time. The time we saved NOT learning Docker/Kubernetes/containers/orchestration definitely helped us deliver a working solution faster. I have no doubt we will eventually leverage containers for certain use cases, but we will continue to defer until we find a case where their value outweighs the costs.

.NET Core

The vast majority of our 10 years of code was developed in .NET 4.5, with a brief and unsuccessful detour into NodeJS. Leadership wanted a new scalable API to extend our success with smaller customers to larger customers through API integrations with their software systems. At the time, .NET Core was just starting to gain some traction, offering cross-platform execution, better performance and improved APIs, all developed in open source. These features, coupled with the easier transition from the full .NET Framework, made it the clear choice. We settled on using .NET Core 2.1 on Linux as our base for the architecture.

Cloud Selection – Azure vs AWS

We started our new development effort with a 3-4 week evaluation in both Azure and AWS. We leveraged architect teams from each vendor to expedite our POCs and leverage their expertise to further develop our designs. We successfully developed a simplified version of our planned architecture in each cloud and evaluated the experience. This was a really valuable step in our journey. Thankfully it was fairly easy to set up time with architects from both vendors by working with our cloud representatives. Both also provided access to their product teams for deeper questions and assistance. You should absolutely leverage these services if you are in a similar situation.

In the end we decided to use AWS for a few reasons. First, our team had more AWS experience overall and was more comfortable with it. Next, Azure didn't yet support .NET Core 2.1 and wouldn't for another 4-6 months. And lastly, our legacy products were already in AWS. Overall, both offered very similar capabilities/experiences and either could be an excellent choice for building a new server-less architecture.

Summary

In closing, this was the process we used in choosing the core platforms to build our new architecture upon. I plan on writing a series of posts to continue sharing this journey and how we brought these platforms together.
If you have feedback on this post or questions that you would like covered in future posts please leave a comment below. Cheers


GitKraken Error Applying Stash could not open dire ...
Category: Git

GitKraken Error ...


Views: 1703 Likes: 144
Exited with error code -532462766
Category: .Net 7

Question Every time I start an ...


Views: 0 Likes: 36
Drupal 8 Bootstrap 4 Not Loading CSS Files
Category: Bootstrap



Views: 298 Likes: 90
[EF Core] How to Enable Sensitive Data Logging and ...
Category: Entity Framework

Question: How do you enable sensitive data and detailed errors ...


Views: 0 Likes: 33
Kint Debugging not Showing (Drupal 8 Web Developme ...
Category: Technology

Tips and Tricks ...


Views: 263 Likes: 87
Reset passwords in ASP.NET Core using delegated permissions and Microsoft Graph

This article shows how an administrator can reset passwords for local members of an Azure AD tenant using Microsoft Graph and delegated permissions. An ASP.NET Core application is used to implement the Azure AD client and the Graph client services.

Code: https://github.com/damienbod/azuerad-reset

Setup Azure App registration

The Azure App registration is set up to authenticate with a user and an application (delegated flow). An App registration "Web" setup is used. Only delegated permissions are used in this setup. This implements an OpenID Connect code flow with PKCE and a confidential client. A secret or a certificate is required for this flow. The following delegated Graph permissions are used:

Directory.AccessAsUser.All
User.ReadWrite.All
UserAuthenticationMethod.ReadWrite.All

ASP.NET Core setup

The ASP.NET Core application implements the Azure AD client using the Microsoft.Identity.Web NuGet package and libraries. The following packages are used:

Microsoft.Identity.Web
Microsoft.Identity.Web.UI
Microsoft.Identity.Web.GraphServiceClient (SDK 5) or Microsoft.Identity.Web.MicrosoftGraph (SDK 4)

Microsoft Graph is not added directly because Microsoft.Identity.Web.MicrosoftGraph or Microsoft.Identity.Web.GraphServiceClient adds it with a tested and working version. Microsoft.Identity.Web uses the Microsoft.Identity.Web.GraphServiceClient package for Graph SDK 5. Microsoft.Identity.Web.MicrosoftGraph uses Microsoft.Graph 4.x versions. The official Microsoft Graph documentation is already updated to SDK 5. For application permissions, Microsoft Graph SDK 5 can be used with Azure.Identity.
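The client setup itself is not shown in this excerpt. A minimal sketch of the Microsoft.Identity.Web wiring for the delegated Graph client could look like this; the "AzureAd" and "GraphApi" configuration section names are conventional defaults and the AddMicrosoftGraph overload used here is one of several, so treat this as an assumption rather than the exact code from the repository.

using Microsoft.AspNetCore.Authentication.OpenIdConnect;
using Microsoft.Identity.Web;
using Microsoft.Identity.Web.UI;

var builder = WebApplication.CreateBuilder(args);

// OpenID Connect code flow with PKCE, confidential client, delegated Graph permissions
builder.Services.AddAuthentication(OpenIdConnectDefaults.AuthenticationScheme)
    .AddMicrosoftIdentityWebApp(builder.Configuration.GetSection("AzureAd"))
    .EnableTokenAcquisitionToCallDownstreamApi()
    .AddMicrosoftGraph(builder.Configuration.GetSection("GraphApi"))
    .AddInMemoryTokenCaches();

// Service from this article
builder.Services.AddScoped<UserResetPasswordDelegatedGraphSDK5>();

builder.Services.AddRazorPages()
    .AddMicrosoftIdentityUI();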
Search for users with Graph SDK 5 and reset the password

Graph SDK 5 can be used to search for users using a delegated scope and then to reset the password using an HTTP PATCH request. The Graph QueryParameters are used to find the user and the HTTP PATCH is used to update the password using the PasswordProfile.

using System.Security.Cryptography; using Microsoft.Graph; using Microsoft.Graph.Models; namespace AzureAdPasswordReset; public class UserResetPasswordDelegatedGraphSDK5 { private readonly GraphServiceClient _graphServiceClient; public UserResetPasswordDelegatedGraphSDK5(GraphServiceClient graphServiceClient) { _graphServiceClient = graphServiceClient; } /// <summary> /// Directory.AccessAsUser.All User.ReadWrite.All UserAuthenticationMethod.ReadWrite.All /// </summary> public async Task<(string? Upn, string? Password)> ResetPassword(string oid) { var password = GetRandomString(); var user = await _graphServiceClient .Users[oid] .GetAsync(); if (user == null) { throw new ArgumentNullException(nameof(oid)); } await _graphServiceClient.Users[oid].PatchAsync( new User { PasswordProfile = new PasswordProfile { Password = password, ForceChangePasswordNextSignIn = true } }); return (user.UserPrincipalName, password); } public async Task<UserCollectionResponse?> FindUsers(string search) { var result = await _graphServiceClient.Users.GetAsync((requestConfiguration) => { requestConfiguration.QueryParameters.Top = 10; if (!string.IsNullOrEmpty(search)) { requestConfiguration.QueryParameters.Search = $"\"displayName:{search}\""; } requestConfiguration.QueryParameters.Orderby = new string[] { "displayName" }; requestConfiguration.QueryParameters.Count = true; requestConfiguration.QueryParameters.Select = new string[] { "id", "displayName", "userPrincipalName", "userType" }; requestConfiguration.QueryParameters.Filter = "userType eq 'Member'"; // onPremisesSyncEnabled eq false requestConfiguration.Headers.Add("ConsistencyLevel", "eventual"); }); return result; } private static string GetRandomString() { var random = $"{GenerateRandom()}{GenerateRandom()}{GenerateRandom()}{GenerateRandom()}-AC"; return random; } private static int GenerateRandom() { return RandomNumberGenerator.GetInt32(100000000, int.MaxValue); } }

Search for users with SDK 4

The application allows the user administrator to search for members of the Azure AD tenant and finds users using a select and a filter definition. The search query parameter would probably give a better user experience.

public async Task<IGraphServiceUsersCollectionPage?> FindUsers(string search) { var users = await _graphServiceClient.Users.Request() .Filter($"startswith(displayName,'{search}') AND userType eq 'Member'") .Select(u => new { u.Id, u.GivenName, u.Surname, u.DisplayName, u.Mail, u.EmployeeId, u.EmployeeType, u.BusinessPhones, u.MobilePhone, u.AccountEnabled, u.UserPrincipalName }) .GetAsync(); return users; }

The ASP.NET Core Razor Page supports auto-complete using the OnGetAutoCompleteSuggest method. This returns the found results using the Graph request.

private readonly UserResetPasswordDelegatedGraphSDK4 _graphUsers; public string? SearchText { get; set; } public IndexModel(UserResetPasswordDelegatedGraphSDK4 graphUsers) { _graphUsers = graphUsers; } public async Task<ActionResult> OnGetAutoCompleteSuggest(string term) { if (term == "*") term = string.Empty; var usersCollectionResponse = await _graphUsers.FindUsers(term); var users = usersCollectionResponse!.ToList(); var usersDisplay = users.Select(user => new { user.Id, user.UserPrincipalName, user.DisplayName }); SearchText = term; return new JsonResult(usersDisplay); }

The Razor Page can be implemented using Bootstrap or whatever CSS framework you prefer.

Reset the password for user X using Graph SDK 4

The Graph service supports resetting a password using a delegated permission. The user is requested using the OID and a new PasswordProfile is created, updating the password and forcing a one-time usage.

/// <summary> /// Directory.AccessAsUser.All /// User.ReadWrite.All /// UserAuthenticationMethod.ReadWrite.All /// </summary> public async Task<(string? Upn, string? Password)> ResetPassword(string oid) { var password = GetRandomString(); var user = await _graphServiceClient.Users[oid] .Request().GetAsync(); if (user == null) { throw new ArgumentNullException(nameof(oid)); } await _graphServiceClient.Users[oid].Request() .UpdateAsync(new User { PasswordProfile = new PasswordProfile { Password = password, ForceChangePasswordNextSignIn = true } }); return (user.UserPrincipalName, password); }

The Razor Page sends a POST request and resets the password using the user principal name.

public async Task<IActionResult> OnPostAsync() { var id = Request.Form .FirstOrDefault(u => u.Key == "userId") .Value.FirstOrDefault(); var upn = Request.Form .FirstOrDefault(u => u.Key == "userPrincipalName") .Value.FirstOrDefault(); if(!string.IsNullOrEmpty(id)) { var result = await _graphUsers.ResetPassword(id); Upn = result.Upn; Password = result.Password; return Page(); } return Page(); }

Running the application

When the application is started, a user password can be reset and updated. It is important to block this function for non-authorized users as it is possible to reset any account without further protection. You could PIM this application using an Azure AD security group or something like this.

Notes

Using Graph SDK 4 is hard as almost no docs exist now; Graph has moved to version 5. Microsoft Graph SDK 5 has many breaking changes and is supported by Microsoft.Identity.Web using the Microsoft.Identity.Web.GraphServiceClient package. High user permissions are used in this solution and it is important to protect this API and the users that can use the application.

Links

https://aka.ms/mysecurityinfo
https://learn.microsoft.com/en-us/graph/api/overview?view=graph-rest-1.0
https://learn.microsoft.com/en-us/graph/sdks/paging?tabs=csharp
https://github.com/AzureAD/microsoft-identity-web/blob/jmprieur/Graph5/src/Microsoft.Identity.Web.GraphServiceClient/Readme.md
https://learn.microsoft.com/en-us/graph/api/authenticationmethod-resetpassword?view=graph-rest-1.0&tabs=csharp


Add a #FingerPrint reader to a C# WinForms app

Probably a good way to add extra security to a Windows Forms app, just to make sure that there is a real human user in front of the screen and it's not some bot trying to interact with your software, is to add a fingerprint / "Windows Hello" login. Of course, in the real world, the attacker would generally try to decompile your software and attack whatever underlying API you are using. However, this is a very visible security feature, and if you're looking for a super-quick security addition, then this may be an interesting addition.

Windows Hello is a biometric authentication feature that allows users to log into their Windows device using a fingerprint scanner, facial recognition, or other biometric methods, rather than a password. Some potential use cases for including Windows Hello in a WinForms app include:

Secure login: By using a fingerprint scanner or other biometric method, you can add an additional layer of security to your app and make it more difficult for unauthorized users to access the app.
Convenience: Allowing users to log in with a fingerprint or other biometric method can make the login process more convenient for them, as they don't have to remember a password or enter it manually.
Compliance: Depending on the nature of the app and the industries it serves, biometric authentication may be required by compliance regulations or industry standards.
User experience: For some users, biometric authentication is a preferred way to interact with their devices, and they feel more secure with that kind of security.
Protecting sensitive data: If your app handles sensitive information, such as financial data or personal information, biometric authentication can help ensure that only authorized users have access to this information.

Here is a link to a public GitHub repo that shows a simple example of this in action: https://github.com/infiniteloopltd/FingerPrintReader

The key code being:

var supported = await KeyCredentialManager.IsSupportedAsync();
if (!supported) return;
var result = await KeyCredentialManager.RequestCreateAsync("login", KeyCredentialCreationOption.ReplaceExisting);
if (result.Status == KeyCredentialStatus.Success)
{
    MessageBox.Show("Logged in.");
}
else
{
    MessageBox.Show("Login failed.");
}
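For context, the same calls wrapped in a form event handler. This is only a sketch assuming the project targets a Windows TFM (for example net7.0-windows10.0.19041.0) so that the WinRT Windows.Security.Credentials API is projected into .NET; the form and button names are illustrative.

using System;
using System.Windows.Forms;
using Windows.Security.Credentials; // WinRT API, available with a Windows TFM

public partial class LoginForm : Form
{
    public LoginForm() => InitializeComponent();

    private async void BtnFingerprint_Click(object sender, EventArgs e)
    {
        // Is Windows Hello (fingerprint, face, PIN) configured on this device?
        var supported = await KeyCredentialManager.IsSupportedAsync();
        if (!supported)
        {
            MessageBox.Show("Windows Hello is not set up on this device.");
            return;
        }

        // Prompts the user to verify with Windows Hello and creates/replaces a key credential
        var result = await KeyCredentialManager.RequestCreateAsync(
            "login", KeyCredentialCreationOption.ReplaceExisting);

        MessageBox.Show(result.Status == KeyCredentialStatus.Success
            ? "Logged in."
            : "Login failed.");
    }
}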


.NET DateTime Basics [are trickier than you might think]

#TLDR Handling dates/times in software is deceptively hard. If handling dates, times and zones extensively/accurately is a core concern (you will know if it is), use NodaTime. For the rest of us, the built-in .NET classes will usually suffice if you know how they work. If you fall into the latter camp, read this brief post to gain a working knowledge of DateTime and DateTimeOffset in .NET.

Background

.NET provides classes for handling dates/times which generally work as well as most other programming languages, but they can leave a lot to be desired if you need to take full control. That's why Jon Skeet created NodaTime. While NodaTime is a technically superior framework for handling dates/times and more accurately exposes the complexities involved, this open source project still isn't free: all that power makes it more difficult to understand and implement dates/times for anyone who hasn't used it before, which adds a hidden cost over time. In my particular use case, I decided it didn't warrant the full power/complexity of NodaTime and the built-in .NET classes would work for us if we understood them. If you are interested in a more comprehensive explanation of the shortcomings of date/time handling in .NET, read this excellent post by Jon Skeet.

NOTE – This blog post and my findings assume .NET Core 2.1.

DateTime

The DateTime class represents a date and time value but does not include any time offset. It does have a Kind property which indicates whether the DateTime is a Local, UTC or Unspecified instance. You can use the Now and UtcNow static properties to get a local or UTC DateTime respectively, or you can use the DateTime constructor overloads to create a DateTime. If no Kind is specified, it defaults to Unspecified.

DateTimeOffset

The DateTimeOffset class represents a date and time with an offset. Similar to DateTime, it has Now and UtcNow static properties as well as constructor overloads for setting the offset explicitly. Notice there are also methods to create a new DateTimeOffset converted to local, universal or a custom offset time from an existing DateTimeOffset. The offset value is set from a TimeSpan value, which assumes you know the appropriate offset for a specific time zone.

Daylight Savings Time

When time zone information is available (more details below), DateTimeOffset will adjust to the appropriate DST offset.

Converting from DateTime to DateTimeOffset (don't miss this!)

There is an implicit conversion from DateTime to DateTimeOffset which can be convenient but can also cause confusion if you don't know how it works. The implicit conversion will assign the offset based on the DateTimeKind of the DateTime. Be careful that you don't pass a DateTime assuming it will default to a UTC offset. Instead, explicitly create the DateTime with the correct kind OR the DateTimeOffset with the appropriate offset value using the constructor shown in the earlier section.

TimeZoneInfo

.NET also has a TimeZoneInfo class which provides time zone details. On Windows the time zone info is based on the time zone info in the Windows registry. On Linux, the time zone info uses a different standard and is only optionally installed (it's not installed in many scenarios). Read this Stack Overflow post for more details. You can also look up time zones by their ID; a short sketch covering these types follows below.
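A small self-contained sketch of the points above (the values and the Windows-style time zone ID are only examples, not taken from the original post):

using System;

class DateTimeBasics
{
    static void Main()
    {
        // DateTime: Kind is Local, Utc or Unspecified
        DateTime local = DateTime.Now;                                   // Kind == Local
        DateTime utc = DateTime.UtcNow;                                  // Kind == Utc
        DateTime unspecified = new DateTime(2019, 6, 1, 12, 0, 0);       // Kind == Unspecified
        DateTime explicitUtc = new DateTime(2019, 6, 1, 12, 0, 0, DateTimeKind.Utc);

        // DateTimeOffset: always carries an offset
        DateTimeOffset nowWithOffset = DateTimeOffset.Now;               // local offset
        DateTimeOffset utcWithOffset = DateTimeOffset.UtcNow;            // +00:00
        DateTimeOffset custom = new DateTimeOffset(2019, 6, 1, 12, 0, 0, TimeSpan.FromHours(-5));
        DateTimeOffset asUtc = custom.ToUniversalTime();
        DateTimeOffset shifted = custom.ToOffset(TimeSpan.FromHours(2));

        // Implicit DateTime -> DateTimeOffset conversion uses DateTime.Kind:
        // Local and Unspecified get the machine's local offset, Utc gets +00:00
        DateTimeOffset fromLocal = local;
        DateTimeOffset fromUnspecified = unspecified;
        DateTimeOffset fromUtc = explicitUtc;

        // TimeZoneInfo: look up a zone by ID (IDs differ between Windows and Linux)
        TimeZoneInfo eastern = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
        DateTimeOffset inEastern = TimeZoneInfo.ConvertTime(utcWithOffset, eastern);

        Console.WriteLine($"{utc:o} | {fromLocal:o} | {fromUnspecified:o} | {fromUtc:o} | {asUtc:o} | {shifted:o} | {inEastern:o}");
    }
}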
Lessons Learned

In the end we were able to use the built-in classes for handling dates/times in our use case successfully after digging into the framework and understanding what is available and how it works. Here are a few lessons we learned and practices we will employ going forward to avoid issues:

Unit test: Unit test your date/time code to verify it will work from and across any time zones. Test with multiple offsets to simulate different servers running your code. Remember your local machine will usually have a different default time zone than the server where the code will ultimately run.

Store DateTimes in UTC: You never know where your code will run. Especially as more code moves to the cloud, you never know where the server will be geographically or the default time zone it will be configured with. Whenever possible use UTC everywhere you can; then you always know the source and convert back to the needed time zone/offset.

Be explicit: Don't assume the framework will work the way you want. Explicitly convert DateTimes to DateTimeOffsets with the appropriate offset value to avoid odd behavior.

Feedback

I hope this post helps others who are struggling with date/time issues. If you find it useful or have further suggestions, please leave a comment below. Thanks!


.NET Developer Needed in OH
Category: Jobs

Role You Wi ...


Views: 213 Likes: 85
Unable to start Kestrel. System.InvalidOpera ...
Category: .Net 7

Question: Unable to start Kestrel. System.InvalidOpera ...


Views: 250 Likes: 31
Data structure in C-Sharp Software Development Not ...
Category: Algorithms

In this article, I will keep notes about different #data #structures and why I should use ...


Views: 0 Likes: 39
Asp.Net 5 Development Notes (DotNet Core 3.1 Study ...
Category: Software Development

Study Notes to use when progra ...


Views: 423 Likes: 61
Multiple Java Web Developer Positions Available in ...
Category: Jobs

Java Web Developers. Cynergies Solutions Group is looking ...


Views: 0 Likes: 79
How to Automate Income for a small Business in 202 ...
Category: Research

Diversifying income streams is a smart strategy for small businesses to reduce risk and explore ...


Views: 0 Likes: 6
Software Best Practices Learned by Experience
Category: System Design

[Updated] It is considered good practice to cache your data in memory, either o ...


Views: 0 Likes: 38
Junior/Mid-Level Java Developer
Category: Jobs

Must-Haves: Bachelor's Degree in IT or a related field ...


Views: 149 Likes: 107
Technical Project Manager
Category: Jobs

"IMMEDIATE REQUIREMENT" Please share the suitableprofile to&nbsp;<a href="mailtoelly.jack ...


Views: 0 Likes: 29
Server-Side Development
Category: Computer Programming

The Se ...


Views: 0 Likes: 20
Provision Azure IoT Hub devices using DPS and X.509 certificates in ASP.NET Core

This article shows how to provision Azure IoT Hub devices using Azure IoT Hub device provisioning services (DPS) and ASP.NET Core. The devices are set up using chained certificates created using .NET Core and managed in the web application. The data is persisted in a database using EF Core and the certificates are generated using the CertificateManager NuGet package.

Code: https://github.com/damienbod/AzureIoTHubDps

Setup

To set up a new Azure IoT Hub DPS, enrollment group and devices, the web application creates a new certificate using an ECDsa private key and the .NET Core APIs. The data is stored in two pem files, one for the public certificate and one for the private key. The pem public certificate file is downloaded from the web application and uploaded to the certificates blade in Azure IoT Hub DPS. The web application persists the data to a database using EF Core and SQL. A new certificate is created from the DPS root certificate and used to create a DPS enrollment group. The certificates are chained from the original DPS certificate. New devices are registered and created using the enrollment group. Another new device certificate chained from the enrollment group certificate is created per device and used in the DPS. The Azure IoT Hub DPS creates a new IoT Hub device using the linked IoT Hubs. Once the IoT Hub is running, the private key from the device certificate is used to authenticate the device and send data to the server.

When the ASP.NET Core web application is started, users can create new certificates, enrollment groups and add devices to the groups. I plan to extend the web application to add devices, delete devices, and delete groups. I plan to add authorization for the different user types and better paging for the different UIs. At present all certificates use ECDsa private keys but this can easily be changed to other types. This depends on the type of root certificate used.

The application is secured using Microsoft.Identity.Web and requires an authenticated user. This can be set up in the program file or in the startup extensions. I use EnableTokenAcquisitionToCallDownstreamApi to force the OpenID Connect code flow. The configuration is read from the default AzureAd app settings and the whole application is required to be authenticated. When the enable and disable flows are added, I will add different users with different authorization levels.

builder.Services.AddDistributedMemoryCache();

builder.Services.AddAuthentication(OpenIdConnectDefaults.AuthenticationScheme)
    .AddMicrosoftIdentityWebApp(builder.Configuration.GetSection("AzureAd"))
    .EnableTokenAcquisitionToCallDownstreamApi()
    .AddDistributedTokenCaches();

Create an Azure IoT Hub DPS certificate

The web application is used to create devices using certificates and DPS enrollment groups. The DpsCertificateProvider class is used to create the root self-signed certificate for the DPS enrollment groups. The NewRootCertificate method from the CertificateManager NuGet package is used to create the certificate using an ECDsa private key. This package wraps the default .NET APIs for creating certificates and adds a layer of abstraction. You could just use the lower level APIs directly. The certificate is exported to two separate pem files and persisted to the database.
public class DpsCertificateProvider { private readonly CreateCertificatesClientServerAuth _createCertsService; private readonly ImportExportCertificate _iec; private readonly DpsDbContext _dpsDbContext; public DpsCertificateProvider(CreateCertificatesClientServerAuth ccs, ImportExportCertificate importExportCertificate, DpsDbContext dpsDbContext) { _createCertsService = ccs; _iec = importExportCertificate; _dpsDbContext = dpsDbContext; } public async Task<(string PublicPem, int Id)> CreateCertificateForDpsAsync(string certName) { var certificateDps = _createCertsService.NewRootCertificate( new DistinguishedName { CommonName = certName, Country = "CH" }, new ValidityPeriod { ValidFrom = DateTime.UtcNow, ValidTo = DateTime.UtcNow.AddYears(50) }, 3, certName); var publicKeyPem = _iec.PemExportPublicKeyCertificate(certificateDps); string pemPrivateKey = string.Empty; using (ECDsa? ecdsa = certificateDps.GetECDsaPrivateKey()) { pemPrivateKey = ecdsa!.ExportECPrivateKeyPem(); FileProvider.WriteToDisk($"{certName}-private.pem", pemPrivateKey); } var item = new DpsCertificate { Name = certName, PemPrivateKey = pemPrivateKey, PemPublicKey = publicKeyPem }; _dpsDbContext.DpsCertificates.Add(item); await _dpsDbContext.SaveChangesAsync(); return (publicKeyPem, item.Id); } public async Task<List<DpsCertificate>> GetDpsCertificatesAsync() { return await _dpsDbContext.DpsCertificates.ToListAsync(); } public async Task<DpsCertificate?> GetDpsCertificateAsync(int id) { return await _dpsDbContext.DpsCertificates.FirstOrDefaultAsync(item => item.Id == id); } } Once the root certificate is created, you can download the public pem file from the web application and upload it to the Azure IoT Hub DPS portal. This needs to be verified. You could also use a CA created certificate for this, if it is possible to create child chained certificates. The enrollment groups are created from this root certificate. Create an Azure IoT Hub DPS enrollment group Devices can be created in different ways in the Azure IoT Hub. We use a DPS enrollment group with certificates to create the Azure IoT devices. The DpsEnrollmentGroupProvider is used to create the enrollment group certificate. This uses the root certificate created in the previous step and chains the new group certificate from this. The enrollment group is used to add devices. Default values are defined for the enrollment group and the pem files are saved to the database. The root certificate is read from the database and the chained enrollment group certificate uses an ECDsa private key like the root self signed certificate. The CreateEnrollmentGroup method is used to set the initial values of the IoT Hub Device. The ProvisioningStatus is set to enabled. This means when the device is registered, it will be enabled to send messages. You could also set this to disabled and enable it after when the device gets used by an end client for the first time. A MAC or a serial code from the device hardware could be used to enable the IoT Hub device. By waiting till the device is started by the end client, you could choose a IoT Hub optimized for this client. 
public class DpsEnrollmentGroupProvider { private IConfiguration Configuration { get;set;} private readonly ILogger<DpsEnrollmentGroupProvider> _logger; private readonly DpsDbContext _dpsDbContext; private readonly ImportExportCertificate _iec; private readonly CreateCertificatesClientServerAuth _createCertsService; private readonly ProvisioningServiceClient _provisioningServiceClient; public DpsEnrollmentGroupProvider(IConfiguration config, ILoggerFactory loggerFactory, ImportExportCertificate importExportCertificate, CreateCertificatesClientServerAuth ccs, DpsDbContext dpsDbContext) { Configuration = config; _logger = loggerFactory.CreateLogger<DpsEnrollmentGroupProvider>(); _dpsDbContext = dpsDbContext; _iec = importExportCertificate; _createCertsService = ccs; _provisioningServiceClient = ProvisioningServiceClient.CreateFromConnectionString( Configuration.GetConnectionString("DpsConnection")); } public async Task<(string Name, int Id)> CreateDpsEnrollmentGroupAsync( string enrollmentGroupName, string certificatePublicPemId) { _logger.LogInformation("Starting CreateDpsEnrollmentGroupAsync..."); _logger.LogInformation("Creating a new enrollmentGroup..."); var dpsCertificate = _dpsDbContext.DpsCertificates .FirstOrDefault(t => t.Id == int.Parse(certificatePublicPemId)); var rootCertificate = X509Certificate2.CreateFromPem( dpsCertificate!.PemPublicKey, dpsCertificate.PemPrivateKey); // create an intermediate for each group var certName = $"{enrollmentGroupName}"; var certDpsGroup = _createCertsService.NewIntermediateChainedCertificate( new DistinguishedName { CommonName = certName, Country = "CH" }, new ValidityPeriod { ValidFrom = DateTime.UtcNow, ValidTo = DateTime.UtcNow.AddYears(50) }, 2, certName, rootCertificate); // get the public key certificate for the enrollment var pemDpsGroupPublic = _iec.PemExportPublicKeyCertificate(certDpsGroup); string pemDpsGroupPrivate = string.Empty; using (ECDsa? 
ecdsa = certDpsGroup.GetECDsaPrivateKey()) { pemDpsGroupPrivate = ecdsa!.ExportECPrivateKeyPem(); FileProvider.WriteToDisk($"{enrollmentGroupName}-private.pem", pemDpsGroupPrivate); } Attestation attestation = X509Attestation.CreateFromRootCertificates(pemDpsGroupPublic); EnrollmentGroup enrollmentGroup = CreateEnrollmentGroup(enrollmentGroupName, attestation); _logger.LogInformation("{enrollmentGroup}", enrollmentGroup); _logger.LogInformation("Adding new enrollmentGroup..."); EnrollmentGroup enrollmentGroupResult = await _provisioningServiceClient .CreateOrUpdateEnrollmentGroupAsync(enrollmentGroup); _logger.LogInformation("EnrollmentGroup created with success."); _logger.LogInformation("{enrollmentGroupResult}", enrollmentGroupResult); DpsEnrollmentGroup newItem = await PersistData(enrollmentGroupName, dpsCertificate, pemDpsGroupPublic, pemDpsGroupPrivate); return (newItem.Name, newItem.Id); } private async Task<DpsEnrollmentGroup> PersistData(string enrollmentGroupName, DpsCertificate dpsCertificate, string pemDpsGroupPublic, string pemDpsGroupPrivate) { var newItem = new DpsEnrollmentGroup { DpsCertificateId = dpsCertificate.Id, Name = enrollmentGroupName, DpsCertificate = dpsCertificate, PemPublicKey = pemDpsGroupPublic, PemPrivateKey = pemDpsGroupPrivate }; _dpsDbContext.DpsEnrollmentGroups.Add(newItem); dpsCertificate.DpsEnrollmentGroups.Add(newItem); await _dpsDbContext.SaveChangesAsync(); return newItem; } private static EnrollmentGroup CreateEnrollmentGroup(string enrollmentGroupName, Attestation attestation) { return new EnrollmentGroup(enrollmentGroupName, attestation) { ProvisioningStatus = ProvisioningStatus.Enabled, ReprovisionPolicy = new ReprovisionPolicy { MigrateDeviceData = false, UpdateHubAssignment = true }, Capabilities = new DeviceCapabilities { IotEdge = false }, InitialTwinState = new TwinState( new TwinCollection("{ \"updatedby\":\"" + "damien" + "\", \"timeZone\":\"" + TimeZoneInfo.Local.DisplayName + "\" }"), new TwinCollection("{ }") ) }; } public async Task<List<DpsEnrollmentGroup>> GetDpsGroupsAsync(int? certificateId = null) { if (certificateId == null) { return await _dpsDbContext.DpsEnrollmentGroups.ToListAsync(); } return await _dpsDbContext.DpsEnrollmentGroups .Where(s => s.DpsCertificateId == certificateId).ToListAsync(); } public async Task<DpsEnrollmentGroup?> GetDpsGroupAsync(int id) { return await _dpsDbContext.DpsEnrollmentGroups .FirstOrDefaultAsync(d => d.Id == id); } }

Register a device in the enrollment group

The DpsRegisterDeviceProvider class creates a new device chained certificate using the enrollment group certificate and registers the device using the ProvisioningDeviceClient. The transport ProvisioningTransportHandlerAmqp is set in this example. There are different transport types possible and you need to choose the one which best meets your needs. The device certificate uses an ECDsa private key and stores everything to the database. The PFX for Windows is stored directly to the file system. I use pem files and create the certificate from these in the device client which sends data to the hub; this is platform independent. The created PFX file requires a password to use it.
public class DpsRegisterDeviceProvider { private IConfiguration Configuration { get; set; } private readonly ILogger<DpsRegisterDeviceProvider> _logger; private readonly DpsDbContext _dpsDbContext; private readonly ImportExportCertificate _iec; private readonly CreateCertificatesClientServerAuth _createCertsService; public DpsRegisterDeviceProvider(IConfiguration config, ILoggerFactory loggerFactory, ImportExportCertificate importExportCertificate, CreateCertificatesClientServerAuth ccs, DpsDbContext dpsDbContext) { Configuration = config; _logger = loggerFactory.CreateLogger<DpsRegisterDeviceProvider>(); _dpsDbContext = dpsDbContext; _iec = importExportCertificate; _createCertsService = ccs; } public async Task<(int? DeviceId, string? ErrorMessage)> RegisterDeviceAsync( string deviceCommonNameDevice, string dpsEnrollmentGroupId) { int? deviceId = null; var scopeId = Configuration["ScopeId"]; var dpsEnrollmentGroup = _dpsDbContext.DpsEnrollmentGroups .FirstOrDefault(t => t.Id == int.Parse(dpsEnrollmentGroupId)); var certDpsEnrollmentGroup = X509Certificate2.CreateFromPem( dpsEnrollmentGroup!.PemPublicKey, dpsEnrollmentGroup.PemPrivateKey); var newDevice = new DpsEnrollmentDevice { Password = GetEncodedRandomString(30), Name = deviceCommonNameDevice.ToLower(), DpsEnrollmentGroupId = dpsEnrollmentGroup.Id, DpsEnrollmentGroup = dpsEnrollmentGroup }; var certDevice = _createCertsService.NewDeviceChainedCertificate( new DistinguishedName { CommonName = $"{newDevice.Name}" }, new ValidityPeriod { ValidFrom = DateTime.UtcNow, ValidTo = DateTime.UtcNow.AddYears(50) }, $"{newDevice.Name}", certDpsEnrollmentGroup); var deviceInPfxBytes = _iec.ExportChainedCertificatePfx(newDevice.Password, certDevice, certDpsEnrollmentGroup); // This is required if you want PFX exports to work. newDevice.PathToPfx = FileProvider.WritePfxToDisk($"{newDevice.Name}.pfx", deviceInPfxBytes); // get the public key certificate for the device newDevice.PemPublicKey = _iec.PemExportPublicKeyCertificate(certDevice); FileProvider.WriteToDisk($"{newDevice.Name}-public.pem", newDevice.PemPublicKey); using (ECDsa? ecdsa = certDevice.GetECDsaPrivateKey()) { newDevice.PemPrivateKey = ecdsa!.ExportECPrivateKeyPem(); FileProvider.WriteToDisk($"{newDevice.Name}-private.pem", newDevice.PemPrivateKey); } // setup Windows store deviceCert var pemExportDevice = _iec.PemExportPfxFullCertificate(certDevice, newDevice.Password); var certDeviceForCreation = _iec.PemImportCertificate(pemExportDevice, newDevice.Password); using (var security = new SecurityProviderX509Certificate(certDeviceForCreation, new X509Certificate2Collection(certDpsEnrollmentGroup))) // To optimize for size, reference only the protocols used by your application. 
using (var transport = new ProvisioningTransportHandlerAmqp(TransportFallbackType.TcpOnly)) //using (var transport = new ProvisioningTransportHandlerHttp()) //using (var transport = new ProvisioningTransportHandlerMqtt(TransportFallbackType.TcpOnly)) //using (var transport = new ProvisioningTransportHandlerMqtt(TransportFallbackType.WebSocketOnly)) { var client = ProvisioningDeviceClient.Create("global.azure-devices-provisioning.net", scopeId, security, transport); try { var result = await client.RegisterAsync(); _logger.LogInformation("DPS client created {result}", result); } catch (Exception ex) { _logger.LogError("DPS client created {result}", ex.Message); return (null, ex.Message); } } _dpsDbContext.DpsEnrollmentDevices.Add(newDevice); dpsEnrollmentGroup.DpsEnrollmentDevices.Add(newDevice); await _dpsDbContext.SaveChangesAsync(); deviceId = newDevice.Id; return (deviceId, null); } private static string GetEncodedRandomString(int length) { var base64 = Convert.ToBase64String(GenerateRandomBytes(length)); return base64; } private static byte[] GenerateRandomBytes(int length) { var byteArray = new byte[length]; RandomNumberGenerator.Fill(byteArray); return byteArray; } public async Task<List<DpsEnrollmentDevice>> GetDpsDevicesAsync(int? dpsEnrollmentGroupId) { if(dpsEnrollmentGroupId == null) { return await _dpsDbContext.DpsEnrollmentDevices.ToListAsync(); } return await _dpsDbContext.DpsEnrollmentDevices.Where(s => s.DpsEnrollmentGroupId == dpsEnrollmentGroupId).ToListAsync(); } public async Task<DpsEnrollmentDevice?> GetDpsDeviceAsync(int id) { return await _dpsDbContext.DpsEnrollmentDevices .Include(device => device.DpsEnrollmentGroup) .FirstOrDefaultAsync(d => d.Id == id); } } Download certificates and use The private and the public pem files are used to setup the Azure IoT Hub device and send data from the device to the server. A HTML form is used to download the files. The form sends a post request to the file download API. <form action="/api/FileDownload/DpsDevicePublicKeyPem" method="post"> <input type="hidden" value="@Model.DpsDevice.Id" id="Id" name="Id" /> <button type="submit" style="padding-left0" class="btn btn-link">Download Public PEM</button> </form> The DpsDevicePublicKeyPemAsync method implements the file download. The method gets the data from the database and returns this as pem file. [HttpPost("DpsDevicePublicKeyPem")] public async Task<IActionResult> DpsDevicePublicKeyPemAsync([FromForm] int id) { var cert = await _dpsRegisterDeviceProvider .GetDpsDeviceAsync(id); if (cert == null) throw new ArgumentNullException(nameof(cert)); if (cert.PemPublicKey == null) throw new ArgumentNullException(nameof(cert.PemPublicKey)); return File(Encoding.UTF8.GetBytes(cert.PemPublicKey), "application/octet-stream", $"{cert.Name}-public.pem"); } The device UI displays the data and allows the authenticated user to download the files. The CertificateManager and the Microsoft.Azure.Devices.Client Nuget packages are used to implement the IoT Hub device client. The pem files with the public certificate and the private key can be loaded into a X509Certificate instance. This is then used to send the data using the DeviceAuthenticationWithX509Certificate class. The SendEvent method sends the data using the IoT Hub device Message class. 
var serviceProvider = new ServiceCollection() .AddCertificateManager() .BuildServiceProvider(); var iec = serviceProvider.GetService<ImportExportCertificate>(); #region pem var deviceNamePem = "robot1-feed"; var certPem = File.ReadAllText($"{_pathToCerts}{deviceNamePem}-public.pem"); var eccPem = File.ReadAllText($"{_pathToCerts}{deviceNamePem}-private.pem"); var cert = X509Certificate2.CreateFromPem(certPem, eccPem); // setup deviceCert windows store export var pemDeviceCertPrivate = iec!.PemExportPfxFullCertificate(cert); var certDevice = iec.PemImportCertificate(pemDeviceCertPrivate); #endregion pem var auth = new DeviceAuthenticationWithX509Certificate(deviceNamePem, certDevice); var deviceClient = DeviceClient.Create(iotHubUrl, auth, transportType); if (deviceClient == null) { Console.WriteLine("Failed to create DeviceClient!"); } else { Console.WriteLine("Successfully created DeviceClient!"); SendEvent(deviceClient).Wait(); }

Notes

Using certificates in .NET and Windows is complicated due to how the private keys are handled and loaded. The private keys need to be exported or imported into the stores etc. This is not an easy API to get working and the docs for this are confusing. This type of device transport and the default setup for the device would need to be adapted for your system. In this example, I used ECDsa certificates but you could also use RSA based keys. The root certificate could be replaced with a CA issued one. I created long living certificates because I do not want the devices to stop working in the field. This should be moved to a configuration. A certificate rotation flow would make sense as well. In follow-up articles, I plan to save the events using hot and cold path processing and implement device enable and disable flows. I also plan to write about device twins. Device twins are an excellent way of sharing data in both directions.

Links

https://github.com/Azure/azure-iot-sdk-csharp
https://github.com/damienbod/AspNetCoreCertificates
Creating Certificates for X.509 security in Azure IoT Hub using .NET Core
https://learn.microsoft.com/en-us/azure/iot-hub/troubleshoot-error-codes
https://stackoverflow.com/questions/52750160/what-is-the-rationale-for-all-the-different-x509keystorageflags/52840537#52840537
https://github.com/dotnet/runtime/issues/19581
https://www.nuget.org/packages/CertificateManager
Azure IoT Hub Documentation | Microsoft Learn
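The SendEvent helper referenced above is not included in the post. A minimal sketch, assuming JSON telemetry and the Message type from Microsoft.Azure.Devices.Client (the payload shape and the loop are illustrative only):

using System;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

static class DeviceEvents
{
    public static async Task SendEvent(DeviceClient deviceClient)
    {
        for (var count = 0; count < 5; count++)
        {
            // Example telemetry payload, replace with the real device data
            var payload = JsonSerializer.Serialize(new { temperature = 20 + count, sentUtc = DateTime.UtcNow });

            using var message = new Message(Encoding.UTF8.GetBytes(payload))
            {
                ContentType = "application/json",
                ContentEncoding = "utf-8"
            };

            await deviceClient.SendEventAsync(message);
            Console.WriteLine($"Sent: {payload}");
            await Task.Delay(TimeSpan.FromSeconds(1));
        }
    }
}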


CSS not rendering in Ruby on Rails [Development No ...
Category: Front-End

When Ruby on Rails does not render CSS and images look disproportionate on a Linux Ubuntu distro (W ...


Views: 365 Likes: 118
Onboarding users in ASP.NET Core using Azure AD Temporary Access Pass and Microsoft Graph

The article looks at onboarding different Azure AD users with a temporary access pass (TAP) and some type of passwordless authentication. An ASP.NET Core application is used to create the Azure AD member users, which can then use a TAP to set up the account. This is a great way to onboard users in your tenant.

Code: https://github.com/damienbod/AzureAdTapOnboarding

The ASP.NET Core application needs to onboard different types of Azure AD users. Some users cannot use a passwordless authentication (yet), and so a password setup is also required for these users. TAP only works with members, and we also need to support guest users with some alternative onboarding flow. Different types of user flows are supported or possible:

AAD member user flow with TAP and FIDO2 authentication
AAD member user flow with password using email/password authentication
AAD member user flow with password setup and a phone authentication
AAD guest user flow with federated login
AAD guest user flow with Microsoft account
AAD guest user flow with email code

FIDO2 should be used for all enterprise employees with an office account in the enterprise. If this is not possible, then at least the IT administrators should be forced to use FIDO2 authentication, and the companies should be planning a strategy for moving to phishing-resistant authentication. This could be forced with PIM and a continuous access policy for administration jobs. Using FIDO2, the identities are protected with phishing-resistant authentication. This should be a requirement for any professional solution. Azure AD users with no computer can use an email code or SMS authentication. This is a low security authentication and applications should not expose sensitive information to these user types.

Setup

The ASP.NET Core application uses the Microsoft.Identity.Web and Microsoft.Identity.Web.MicrosoftGraphBeta NuGet packages to implement the Azure AD clients. The ASP.NET Core client is a server rendered application and uses an Azure App registration which requires a secret or a certificate to acquire access tokens. The onboarding application uses Microsoft Graph application permissions to create the users and initialize the temporary access pass (TAP) flow. The following application permissions are used:

User.EnableDisableAccount.All
User.ReadWrite.All
UserAuthenticationMethod.ReadWrite.All

The permissions are added to a separate Azure App registration and require a secret to use. In a second phase, I will look at implementing the Graph API access using Microsoft Graph delegated permissions. It is also possible to use a service managed identity to acquire a Graph access token with the required permissions.

Onboarding members using passwordless

When onboarding a new Azure AD user with passwordless and TAP, this needs to be implemented in two steps. Firstly, a new Microsoft Graph user is created with the type member. This takes an unknown length of time to complete on Azure AD. When this is finished, a new TAP authentication method is created. I used the Polly NuGet package to retry this until the TAP request succeeds. Once successful, the temporary access pass is displayed in the UI. If this was a new employee or something like this, you could print this out and let the user complete the process.
private async Task CreateMember(UserModel userData)
{
    var createdUser = await _aadGraphSdkManagedIdentityAppClient
        .CreateGraphMemberUserAsync(userData);

    if (createdUser!.Id != null)
    {
        if (userData.UsePasswordless)
        {
            var maxRetryAttempts = 7;
            var pauseBetweenFailures = TimeSpan.FromSeconds(3);
            var retryPolicy = Policy
                .Handle<HttpRequestException>()
                .WaitAndRetryAsync(maxRetryAttempts, i => pauseBetweenFailures);

            await retryPolicy.ExecuteAsync(async () =>
            {
                var tap = await _aadGraphSdkManagedIdentityAppClient
                    .AddTapForUserAsync(createdUser.Id);

                AccessInfo = new CreatedAccessModel
                {
                    Email = createdUser.Email,
                    TemporaryAccessPass = tap!.TemporaryAccessPass
                };
            });
        }
        else
        {
            AccessInfo = new CreatedAccessModel
            {
                Email = createdUser.Email,
                Password = createdUser.Password
            };
        }
    }
}

The CreateGraphMemberUserAsync method creates a new Microsoft Graph user. To use a temporary access pass, a member user must be used. Guest users cannot be onboarded like this. Even though we do not use a password in this process, the Microsoft Graph user validation forces us to create one. We just create a random password and do not display it; this password will not be updated.

public async Task<CreatedUserModel> CreateGraphMemberUserAsync(UserModel userModel)
{
    if (!userModel.Email.ToLower().EndsWith(_aadIssuerDomain.ToLower()))
    {
        throw new ArgumentException("A guest user must be invited!");
    }

    var graphServiceClient = _graphService
        .GetGraphClientWithManagedIdentityOrDevClient();

    var password = GetRandomString();
    var user = new User
    {
        DisplayName = userModel.UserName,
        Surname = userModel.LastName,
        GivenName = userModel.FirstName,
        OtherMails = new List<string> { userModel.Email },
        UserType = "member",
        AccountEnabled = true,
        UserPrincipalName = userModel.Email,
        MailNickname = userModel.UserName,
        PasswordProfile = new PasswordProfile
        {
            Password = password,
            // We use TAP if a passwordless onboarding is used
            ForceChangePasswordNextSignIn = !userModel.UsePasswordless
        },
        PasswordPolicies = "DisablePasswordExpiration"
    };

    var createdUser = await graphServiceClient.Users
        .Request()
        .AddAsync(user);

    return new CreatedUserModel
    {
        Email = createdUser.UserPrincipalName,
        Id = createdUser.Id,
        Password = password
    };
}

The TemporaryAccessPassAuthenticationMethod object is created using Microsoft Graph. We create a use-once TAP. The access code is returned and displayed in the UI.

public async Task<TemporaryAccessPassAuthenticationMethod?> AddTapForUserAsync(string userId)
{
    var graphServiceClient = _graphService
        .GetGraphClientWithManagedIdentityOrDevClient();

    var tempAccessPassAuthMethod = new TemporaryAccessPassAuthenticationMethod
    {
        //StartDateTime = DateTimeOffset.Now,
        LifetimeInMinutes = 60,
        IsUsableOnce = true,
    };

    var result = await graphServiceClient.Users[userId]
        .Authentication
        .TemporaryAccessPassMethods
        .Request()
        .AddAsync(tempAccessPassAuthMethod);

    return result;
}

The https://aka.ms/mysecurityinfo link can be used to complete the flow. The new user can click this link and enter the email and the access code. Now that the user is authenticated, he or she can add a passwordless authentication method. I use an external FIDO2 key. Once set up, the user can register and authenticate. You should use at least two security keys. This is an awesome way of onboarding users which allows them to authenticate in a phishing resistant way without requiring or using a password. FIDO2 is the recommended and best way of authenticating users, and with the rollout of passkeys this will become more user friendly as well.
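The GetRandomString helper used to create the throwaway password is not shown in the post. A possible implementation is sketched below; this is an assumption, and any cryptographically random value that satisfies the tenant password policy would do.

using System.Security.Cryptography;

public static class PasswordHelper
{
    // Sketch only: the password is never used for interactive sign-in,
    // it only has to satisfy the Microsoft Graph user validation.
    public static string GetRandomString()
    {
        var bytes = RandomNumberGenerator.GetBytes(32);
        // Append characters so common complexity rules are met.
        return $"{Convert.ToBase64String(bytes)}!2Aa";
    }
}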
Onboarding members using password

Because some companies still use legacy authentication, or because we need to support users with no computer, we also need to onboard users with passwords. When using passwords, the user needs to update the password on first use. The user should add an MFA, if this is not forced by the tenant. Some employees might not have a computer and would like to use a phone to authenticate. An SMS code would be a good way of achieving this. This is of course not very secure, so you should expect these accounts to get lost or breached, and sensitive data should be avoided in applications used by these accounts. The device code flow could be used on a shared PC together with the user's mobile phone. Starting an authentication flow from a QR code is unsafe as it is not phishing resistant, but as SMS is used for these types of users, the flow is already not very secure. Again, sensitive data must be avoided in applications accepting these low security accounts. It's all about balance; maybe someday soon all users will have FIDO2 security keys or passkeys to use and we can avoid these sorts of solutions.

Onboarding guest users (invitations)

Guest users cannot be onboarded by creating a Microsoft Graph user. You need to send an invitation to the guest user for your tenant. Microsoft Graph provides an API for this. There are different types of guest users, depending on the account type and the authentication method type. The invitation returns an invite redeem URL which can be used to set up the account. This URL is mailed to the email used in the invite and does not need to be displayed in the UI.

private async Task InviteGuest(UserModel userData)
{
    var invitedGuestUser = await _aadGraphSdkManagedIdentityAppClient
        .InviteGuestUser(userData, _inviteUrl);

    if (invitedGuestUser!.Id != null)
    {
        AccessInfo = new CreatedAccessModel
        {
            Email = invitedGuestUser.InvitedUserEmailAddress,
            InviteRedeemUrl = invitedGuestUser.InviteRedeemUrl
        };
    }
}

The InviteGuestUser method is used to create the invite object and this is sent as an HTTP POST request to the Microsoft Graph API.

public async Task<Invitation?> InviteGuestUser(UserModel userModel, string redirectUrl)
{
    if (userModel.Email.ToLower().EndsWith(_aadIssuerDomain.ToLower()))
    {
        throw new ArgumentException("user must be from a different domain!");
    }

    var graphServiceClient = _graphService
        .GetGraphClientWithManagedIdentityOrDevClient();

    var invitation = new Invitation
    {
        InvitedUserEmailAddress = userModel.Email,
        SendInvitationMessage = true,
        InvitedUserDisplayName = $"{userModel.FirstName} {userModel.LastName}",
        InviteRedirectUrl = redirectUrl,
        InvitedUserType = "guest"
    };

    var invite = await graphServiceClient.Invitations
        .Request()
        .AddAsync(invitation);

    return invite;
}

Notes

Onboarding users with Microsoft Graph can be complicated because you need to know which APIs are required and how the users need to be created. Azure AD members can be created using the Microsoft Graph user APIs; guest users are created using the Microsoft Graph invitation APIs. Onboarding users with TAP and FIDO2 is a great way of implementing this workflow. As of today, this is still part of the beta release.
Links
https://entra.microsoft.com/
https://learn.microsoft.com/en-us/azure/active-directory/authentication/howto-authentication-temporary-access-pass
https://learn.microsoft.com/en-us/graph/api/authentication-post-temporaryaccesspassmethods?view=graph-rest-1.0&tabs=csharp
https://learn.microsoft.com/en-us/graph/authenticationmethods-get-started
https://learn.microsoft.com/en-us/azure/active-directory/authentication/howto-authentication-passwordless-security-key-on-premises
Create Azure B2C users with Microsoft Graph and ASP.NET Core
Onboarding new users in an ASP.NET Core application using Azure B2C
Disable Azure AD user account using Microsoft Graph and an application client
Invite external users to Azure AD using Microsoft Graph and ASP.NET Core
https://learn.microsoft.com/en-us/azure/active-directory/external-identities/external-identities-overview
https://learn.microsoft.com/en-us/azure/active-directory/external-identities/b2b-quickstart-add-guest-users-portal


Issue and verify BBS+ verifiable credentials using ASP.NET Core and trinsic.id
Issue and verify BBS+ verifiable credentials using ...

This article shows how to implement identity verification in a solution using ASP.NET Core and trinsic.id, built using an id-tech solution based on self sovereign identity principles. The credential issuer uses OpenID Connect to authenticate, implemented using Microsoft Entra ID. The edge or web wallet authenticates using trinsic.id based on a single factor email code. The verifier needs no authentication; it only verifies that the verifiable credential is authentic and valid. The verifiable credentials use JSON-LD ZKP with BBS+ Signatures and selective disclosure to verify. Code https://github.com/swiss-ssi-group/TrinsicV2AspNetCore

Use Case

As a university administrator, I want to create BBS+ verifiable credential templates for university diplomas. As a student, I want to authenticate using my university account (OpenID Connect code flow with PKCE), create my verifiable credential and download this credential to my SSI wallet. As an HR employee, I want to verify that the job candidate has a degree from this university. The HR employee needs verification but does not need to see the data. The sample creates university diplomas as verifiable credentials with BBS+ signatures. The credentials are issued using OIDC with strong authentication (if possible) to a candidate mobile wallet. The issuer uses a trust registry so that the verifier can validate that the VC is authentic. The verifier uses BBS+ selective disclosure to validate a diploma. (ZKP is not possible at present.) The university application requires an admin zone and a student zone.

Setup

The university application plays the role of credential issuer and has its own users. The trinsic.id wallet is used to store the verifiable credentials for students. Company X HR plays the role of verifier and has no connection to the university. The company has a list of trusted universities. (DIDs)

Issuing a credential

The ASP.NET Core issuer application can create a new issuer web (edge) wallet, create templates and issue credentials using the templates. The trinsic.id .NET SDK is used to implement the SSI or id-tech integration. How and where trinsic.id stores the data is unknown, and you must trust them to do this correctly if you use their solution. Implementing your own ledger or identity layer is at present too much effort, so the best option is to use one of the integration solutions. The UI is implemented using ASP.NET Core Razor Pages and the Microsoft.Identity.Web Nuget package with Microsoft Entra ID for the authentication. The required issuer template is used to create the credential offer.

var diploma = await _universityServices.GetUniversityDiploma(DiplomaTemplateId);

var tid = Convert.ToInt32(DiplomaTemplateId, CultureInfo.InvariantCulture);

var response = await _universityServices
    .IssuerStudentDiplomaCredentialOffer(diploma, tid);

CredentialOfferUrl = response!.ShareUrl;

The IssuerStudentDiplomaCredentialOffer method uses the university ecosystem and issues a new offer using its template.
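The Diploma values object which is serialized into the credential offer is not shown in the post. Its shape is an assumption, but based on the attributes revealed later in the selective disclosure proof it could look something like this:

using System.Text.Json.Serialization;

// Assumption: the property names must match the attribute names
// defined in the credential template (firstName, lastName, etc.).
public class Diploma
{
    [JsonPropertyName("firstName")]
    public string FirstName { get; set; } = string.Empty;

    [JsonPropertyName("lastName")]
    public string LastName { get; set; } = string.Empty;

    [JsonPropertyName("dateOfBirth")]
    public string DateOfBirth { get; set; } = string.Empty;

    [JsonPropertyName("diplomaTitle")]
    public string DiplomaTitle { get; set; } = string.Empty;
}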
public async Task<CreateCredentialOfferResponse?> IssuerStudentDiplomaCredentialOffer(Diploma diploma, int universityDiplomaTemplateId)
{
    var templateId = await GetUniversityDiplomaTemplateId(universityDiplomaTemplateId);

    // get the template from the id-tech solution
    var templateResponse = await GetUniversityDiplomaTemplate(templateId!);

    // Auth token from University issuer wallet
    _trinsicService.Options.AuthToken = _configuration["TrinsicOptionsIssuerAuthToken"];

    var response = await _trinsicService
        .Credential.CreateCredentialOfferAsync(
        new CreateCredentialOfferRequest
        {
            TemplateId = templateResponse.Template.Id,
            ValuesJson = JsonSerializer.Serialize(diploma),
            GenerateShareUrl = true
        });

    return response;
}

The UI returns the offer and displays it as a QR code which can be scanned to start the process of adding the credential to the web wallet. This is a magic link and not an SSI OpenID for Verifiable Credential Issuance flow or the starting point for a DIDComm V2 flow. Starting any flow using a QR code is unsafe and further protection is required. Using an authenticated application with a phishing resistant authentication reduces this risk, as does using OpenID for Verifiable Credential Issuance for the VC issuing.

Creating a proof in the wallet

Once the credential is in the holder's wallet, a proof can be created from it. The proof can be used in a verifier application. The wallet authentication of the user is implemented using a single factor email code and creates a selective proof which can be copied to the verifier web application. OpenID for Verifiable Presentations for verifiers cannot be used because there is no connection between the issuer and the verifier. This could be possible if the process was delegated to the wallet using some type of magic URL, string etc. The wallet can authenticate against any known ecosystems.
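A possible calling sequence from the university Razor Page, using the GenerateProofService shown below, could look like this. This is a sketch; the page model and member names are assumptions, and the wallet auth token comes from the email code authentication (AuthenticateInit/AuthenticateConfirm) implemented in the service.

using Microsoft.AspNetCore.Mvc.RazorPages;

public class CreateProofModel : PageModel
{
    private readonly GenerateProofService _generateProofService;

    public CreateProofModel(GenerateProofService generateProofService)
        => _generateProofService = generateProofService;

    // userAuthToken: the wallet auth token obtained from the email code flow.
    public async Task<CreateProofResponse> CreateSelectiveProofAsync(string userAuthToken)
    {
        // List the verifiable credentials in the holder wallet and pick one.
        var items = await _generateProofService.GetItemsInWallet(userAuthToken);
        var credentialItemId = items.First().Value;

        // Create the selective disclosure proof for the chosen credential.
        // The returned proof document is what the user copies to the verifier.
        return await _generateProofService.CreateProof(userAuthToken, credentialItemId);
    }
}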
public class GenerateProofService
{
    private readonly TrinsicService _trinsicService;
    private readonly IConfiguration _configuration;

    public List<SelectListItem> Universities = new();

    public GenerateProofService(TrinsicService trinsicService, IConfiguration configuration)
    {
        _trinsicService = trinsicService;
        _configuration = configuration;
        Universities = _configuration.GetSection("Universities")!.Get<List<SelectListItem>>()!;
    }

    public async Task<List<SelectListItem>> GetItemsInWallet(string userAuthToken)
    {
        var results = new List<SelectListItem>();

        // Auth token from user
        _trinsicService.Options.AuthToken = userAuthToken;

        // get all items
        var items = await _trinsicService.Wallet.SearchWalletAsync(new SearchRequest());

        foreach (var item in items.Items)
        {
            var jsonObject = JsonNode.Parse(item)!;
            var id = jsonObject["id"];
            var vcArray = jsonObject["data"]!["type"];
            var vc = string.Empty;

            foreach (var i in vcArray!.AsArray())
            {
                var val = i!.ToString();
                if (val != "VerifiableCredential")
                {
                    vc = val!.ToString();
                    break;
                }
            }

            results.Add(new SelectListItem(vc, id!.ToString()));
        }

        return results;
    }

    public async Task<CreateProofResponse> CreateProof(string userAuthToken, string credentialItemId)
    {
        // Auth token from user
        _trinsicService.Options.AuthToken = userAuthToken;

        var selectiveProof = await _trinsicService.Credential.CreateProofAsync(new()
        {
            ItemId = credentialItemId,
            RevealTemplate = new()
            {
                TemplateAttributes = { "firstName", "lastName", "dateOfBirth", "diplomaTitle" }
            }
        });

        return selectiveProof;
    }

    public AuthenticateInitResponse AuthenticateInit(string userId, string universityEcosystemId)
    {
        var requestInit = new AuthenticateInitRequest
        {
            Identity = userId,
            Provider = IdentityProvider.Email,
            EcosystemId = universityEcosystemId
        };

        var authenticateInitResponse = _trinsicService.Wallet.AuthenticateInit(requestInit);
        return authenticateInitResponse;
    }

    public AuthenticateConfirmResponse AuthenticateConfirm(string code, string challenge)
    {
        var requestConfirm = new AuthenticateConfirmRequest
        {
            Challenge = challenge,
            Response = code
        };

        var authenticateConfirmResponse = _trinsicService.Wallet.AuthenticateConfirm(requestConfirm);
        return authenticateConfirmResponse;
    }
}

The wallet connect screen can look something like this. The proof can be created using one of the credentials and the proof can be copied.

Verifying the proof

The verifier can use the proof to validate the required information. The verifier has no connection to the issuer; it is in a different ecosystem. Because of this, the verifier must validate the issuer DID. This must be a trusted issuer (university).

public class DiplomaVerifyService
{
    private readonly TrinsicService _trinsicService;
    private readonly IConfiguration _configuration;

    public List<SelectListItem> TrustedUniversities = new();
    public List<SelectListItem> TrustedCredentials = new();

    public DiplomaVerifyService(TrinsicService trinsicService, IConfiguration configuration)
    {
        _trinsicService = trinsicService;
        _configuration = configuration;
        TrustedUniversities = _configuration.GetSection("TrustedUniversities")!.Get<List<SelectListItem>>()!;
        TrustedCredentials = _configuration.GetSection("TrustedCredentials")!.Get<List<SelectListItem>>()!;
    }

    public async Task<(VerifyProofResponse? Proof, bool IsValid)> Verify(string studentProof, string universityIssuer)
    {
        // Verifiers auth token
        // Auth token from trinsic.id root API KEY provider
        _trinsicService.Options.AuthToken = _configuration["TrinsicCompanyXHumanResourcesOptionsApiKey"];

        var verifyProofResponse = await _trinsicService.Credential.VerifyProofAsync(new VerifyProofRequest
        {
            ProofDocumentJson = studentProof,
        });

        var jsonObject = JsonNode.Parse(studentProof)!;
        var issuer = jsonObject["issuer"];

        // check issuer
        if (universityIssuer != issuer!.ToString())
        {
            return (null, false);
        }

        return (verifyProofResponse, true);
    }
}

The ASP.NET Core UI could look something like this.

Notes

Using an SSI based solution to share data securely across domains or ecosystems can be very useful and opens up many business possibilities. SSI or id-tech is a good solution for identity checks and credential checks; it is not a good solution for authentication. Phishing is hard to solve in cross device environments. Passkeys or FIDO2 is the way forward for user authentication. The biggest problem for SSI and id-tech is the interoperability between solutions. For example, the following credential types exist and are only useable on specific solutions:

JSON JWT
SD-JWT (JSON-LD with LD Signatures)
AnonCreds with CL Signatures
JSON-LD ZKP with BBS+ Signatures
mDL ISO/IEC 18013-5

There are multiple standards, multiple solutions, multiple networks and multiple ledgers. No two systems seem to work with each other. In a closed ecosystem it will work, but SSI has few advantages over existing solutions in a closed ecosystem. Interoperability needs to be solved.

Links
https://dashboard.trinsic.id/ecosystem
https://github.com/trinsic-id/sdk
https://docs.trinsic.id/dotnet/
Integrating Verifiable Credentials in 2023 – Trinsic Platform Walkthrough
https://openid.net/specs/openid-4-verifiable-credential-issuance-1_0.html
https://openid.net/specs/openid-4-verifiable-presentations-1_0.html
https://openid.net/specs/openid-connect-self-issued-v2-1_0.html
https://datatracker.ietf.org/doc/draft-ietf-oauth-selective-disclosure-jwt/


Auto sign-out using ASP.NET Core Razor Pages with Azure AD B2C
Auto sign-out using ASP.NET Core Razor Pages with ...

This article shows how an ASP.NET Core Razor Page application could implement an automatic sign-out when a user does not use the application for n minutes. The application is secured using Azure AD B2C. To remove the session, the client must sign out both in the ASP.NET Core application and on the Azure AD B2C identity provider, or whatever identity provider you are using. Code https://github.com/damienbod/AspNetCoreB2cLogout

Sometimes clients require that an application supports automatic sign-out in an SSO environment. An example of this is when a user uses a shared computer and does not click the sign-out button. The session would remain active for the next user. This method is not foolproof as the end user could save the credentials in the browser. If you need a better solution, then SSO and rolling sessions should be avoided, but this leads to a worse user experience. The ASP.NET Core application is protected using Microsoft.Identity.Web. This takes care of the client authentication flows using Azure AD B2C as the identity provider. Once authenticated, the session is stored in a cookie. A distributed cache is added to record the last activity of each user. An IAsyncPageFilter implementation is used and added as a global filter to all requests for Razor Pages. The SessionTimeoutAsyncPageFilter class implements the IAsyncPageFilter interface.

builder.Services.AddDistributedMemoryCache();

builder.Services.AddAuthentication(OpenIdConnectDefaults.AuthenticationScheme)
    .AddMicrosoftIdentityWebApp(builder.Configuration, "AzureAdB2c")
    .EnableTokenAcquisitionToCallDownstreamApi(Array.Empty<string>())
    .AddDistributedTokenCaches();

builder.Services.AddAuthorization(options =>
{
    options.FallbackPolicy = options.DefaultPolicy;
});

builder.Services.AddSingleton<SessionTimeoutAsyncPageFilter>();

builder.Services.AddRazorPages()
    .AddMvcOptions(options =>
    {
        options.Filters.Add(typeof(SessionTimeoutAsyncPageFilter));
    })
    .AddMicrosoftIdentityUI();

The IAsyncPageFilter interface is used to catch the requests for the Razor Pages. The OnPageHandlerExecutionAsync method is used to implement the automatic end session logic. We use the default name identifier claim type to get an ID for the user. If using the standard claims instead of the Microsoft namespace mapping, this would be different. Match the claim returned in the id_token from the OpenID Connect authentication. I check for idle time. If no request was sent in the last n minutes, the application will sign out, both in the local cookie and also on Azure AD B2C. It is important to sign out on the identity provider as well. If the idle time is less than the allowed time span, the DateTime timestamp is persisted to the cache.

public async Task OnPageHandlerExecutionAsync(PageHandlerExecutingContext context, PageHandlerExecutionDelegate next)
{
    var claimTypes = "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier";
    var name = context.HttpContext
        .User
        .Claims
        .FirstOrDefault(c => c.Type == claimTypes)!
        .Value;

    if (name == null)
        throw new ArgumentNullException(nameof(name));

    var lastActivity = GetFromCache(name);
    if (lastActivity != null && lastActivity.GetValueOrDefault()
        .AddMinutes(timeoutInMinutes) < DateTime.UtcNow)
    {
        await context.HttpContext
            .SignOutAsync(CookieAuthenticationDefaults.AuthenticationScheme);
        await context.HttpContext
            .SignOutAsync(OpenIdConnectDefaults.AuthenticationScheme);
    }

    AddUpdateCache(name);

    await next.Invoke();
}

A distributed cache is used to persist the last activity timestamp of each user session; a sketch of the surrounding filter class is shown below.
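The post only shows the OnPageHandlerExecutionAsync method; the surrounding class is not shown. A minimal sketch of how the filter class could be structured follows, assuming the distributed cache is injected and the timeout values are constants (the values themselves are assumptions).

using Microsoft.AspNetCore.Mvc.Filters;
using Microsoft.Extensions.Caching.Distributed;

public class SessionTimeoutAsyncPageFilter : IAsyncPageFilter
{
    private readonly IDistributedCache _cache;

    // Assumptions: the idle timeout and cache lifetime are not shown in the post.
    private const int timeoutInMinutes = 15;
    private const int cacheExpirationInDays = 1;

    public SessionTimeoutAsyncPageFilter(IDistributedCache cache)
        => _cache = cache;

    public Task OnPageHandlerSelectionAsync(PageHandlerSelectedContext context)
        => Task.CompletedTask;

    public async Task OnPageHandlerExecutionAsync(
        PageHandlerExecutingContext context, PageHandlerExecutionDelegate next)
    {
        // Idle time check, sign-out and cache update as shown in the article text.
        await next.Invoke();
    }
}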
This might be expensive for applications with many users. In this demo, the UTC now value is used for the check. This might need to be improved, as might the cache lifetime; it needs to be validated whether this is enough for all the different combinations of timeouts.

private void AddUpdateCache(string name)
{
    var options = new DistributedCacheEntryOptions()
        .SetSlidingExpiration(TimeSpan.FromDays(cacheExpirationInDays));

    _cache.SetString(name, DateTime.UtcNow.ToString("s"), options);
}

private DateTime? GetFromCache(string key)
{
    var item = _cache.GetString(key);
    if (item != null)
    {
        return DateTime.Parse(item);
    }

    return null;
}

When the session times out, the code executes the OnPageHandlerExecutionAsync method and signs the user out. This works for Razor Pages. This is not the only way of supporting this and it is not an easy requirement to fully implement. The next step would be to support this from SPA UIs which send JavaScript or ajax requests.

Links
https://learn.microsoft.com/en-us/azure/active-directory-b2c/openid-connect#send-a-sign-out-request
https://learn.microsoft.com/en-us/aspnet/core/razor-pages/filter?view=aspnetcore-7.0
https://github.com/AzureAD/microsoft-identity-web


Use Azure AD Access Packages to onboard users in an Azure DevOps project
Use Azure AD Access Packages to onboard users in a ...

This post looks at onboarding users into an Azure DevOps team or project using Azure AD access packages. The Azure AD access packages are part of Microsoft Entra Identity Governance and provide a good solution for onboarding internal or external users into your tenant with access to the defined resources.

Flow for onboarding Azure DevOps members

Sometimes we develop large projects with internal and external users which need access to an Azure DevOps project for a fixed length of time, which can be extended if required. These users only need access to the Azure DevOps project and should be automatically removed when the contract or project is completed. Azure AD access packages are a good way to implement this.

Use an Azure AD group

The access to Azure DevOps can be implemented by using an Azure security group in Azure AD. This security group will be used to add team members for the Azure DevOps project. Azure AD access packages are used to onboard users into the Azure AD group, and the Azure DevOps project uses the security group to define the members. The “azure-devops-project-access-packages” security group was created for this.

Setup the Azure DevOps

A new Azure DevOps project was created for this demo. The project has a URL on the dev.azure.com domain. The Azure DevOps organization needs to be attached to the Azure AD tenant. Only an Azure AD member with the required permissions can add a security group to the Azure DevOps project. My test Azure DevOps project was created with the following URL; you can only access this if you are a member. https://dev.azure.com/azureadgroup-access-packages/use-access-packages The project team can now be onboarded.

Create the Azure AD P2 access packages

To create an Azure AD P2 access package, you can use the Microsoft Entra admin center. The access package can be created in the Entitlement management blade. Add the security group from the Azure AD which you use for adding or removing users to the Azure DevOps project, and add the users as members. The users onboarded using the access package are given a lifespan in the tenant for the access, which can be extended or not as needed. The users can be added using an access package link, or you can get an admin to assign users to the package. I created a second access package to assign any users to the package, which can then be approved or rejected by the Azure DevOps project manager. The Azure DevOps administrator can approve the access package and the Azure DevOps team member can access the Azure DevOps project using the public URL. The new member is added to the Azure security group using the access package. An access package link would look something like this: https://myaccess.microsoft.com/@damienbodsharepoint.onmicrosoft.com#/access-packages/b5ad7ec0-8728-4a18-be5b-9fa24dcfefe3

Links
https://learn.microsoft.com/en-us/azure/active-directory/governance/entitlement-management-access-package-create
https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/faq-user-and-permissions-management?view=azure-devops#q-why-cant-i-find-members-from-my-connected-azure-ad-even-though-im-the-azure-ad-global-admin
https://entra.microsoft.com/


Azure AD cross-tenant synchronization
Azure AD cross-tenant synchronization

The article looks at and explores the new Azure AD cross-tenant synchronization. The feature makes it really easy to implement the technical part of synchronization between different Azure AD tenants. Code https://github.com/damienbod/Aad-cross-tenant-synchronization

Requirements

To use this feature, both the source tenant and the target tenant require at least a P1 Azure AD license. The administrator and the application used to implement the feature need to have the required delegated authorization. I used Microsoft Graph Explorer for this. In a second step, I will implement an ASP.NET Core application for this.

Setup

In this use case, I have two separate tenants with users, applications, groups and everything that comes with an Azure AD tenant. I would like to join these two tenants without migrating all the users, groups, and applications to one of the tenants. A typical use case for this would be when a company acquires a second company and both use Azure AD. There are many different strategies for technically merging two companies and every implementation is different. The merge might be an m-n merge, which is more complicated to implement. It is always best to aim for simplicity. I always aim to reduce to a single source of identity for applications and use this to authenticate everything against. The single source of identity can federate to different IAM systems, but the applications should not use the external IAM directly. I also try to have only one source of identity for users, but this is not always possible. With these complex systems, users must have only one primary identity, otherwise it gets really hard to maintain onboarding and offboarding and you require more tools to manage this. One problem with this solution is using on-premise applications or clients with cloud IAM solutions. For some on-premise applications, there is no network connection to the cloud systems, so exceptions need to be implemented here if using a cloud solution as the single source of identity for applications. Azure AD and AD is one such example of this problem and Microsoft provides a good solution for this. (Azure Hybrid) The big decision is deciding where the identities have their primary definition. In this demo, we keep the primary definition of the identities as it was and add the members of the source tenant to the target tenant as external member users. These are then synchronized. The Microsoft documentation for this is excellent and I really recommend using it when implementing the synchronization.

Setup target tenant

To implement the synchronization, the user and application need the assigned privileges for the following Microsoft Graph permissions:

Policy.Read.All
Policy.ReadWrite.CrossTenantAccess

You can view this in the Enterprise applications once set up. This Enterprise application is only required to create the synchronization and can be deleted after the synchronization has been created. The Enterprise application is used to create a second Enterprise application used for the synchronization. You could also just implement this directly in the Azure portal. The different steps to create the synchronization are described in great detail in the Microsoft Azure docs.
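As an example of the Graph calls involved, the partner configuration for the source tenant can be created in the target tenant using the cross-tenant access policy API. The following is a minimal sketch using HttpClient, assuming an access token with the permissions listed above; the full synchronization setup requires further steps described in the linked Microsoft docs.

using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class CrossTenantAccessSetup
{
    // Sketch only: creates the cross-tenant access partner configuration
    // in the target tenant. sourceTenantId is the tenant to synchronize from.
    public static async Task AddPartnerConfigurationAsync(
        HttpClient httpClient, string accessToken, string sourceTenantId)
    {
        httpClient.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        var payload = $"{{ \"tenantId\": \"{sourceTenantId}\" }}";

        var response = await httpClient.PostAsync(
            "https://graph.microsoft.com/v1.0/policies/crossTenantAccessPolicy/partners",
            new StringContent(payload, Encoding.UTF8, "application/json"));

        response.EnsureSuccessStatusCode();
    }
}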
Setup source tenant

To implement the synchronization, the user and application need the assigned privileges for the following Microsoft Graph permissions:

Policy.Read.All
Policy.ReadWrite.CrossTenantAccess
Application.ReadWrite.All
Directory.ReadWrite.All

The different steps to create the synchronization are described in great detail in the Microsoft Azure docs.

Synchronization configurations

The Azure AD provisioning can be set up to add all users, or scoped to a set of users or a group of users as required. The provisioning can be set up to map the user attributes as required. I used the default settings and provision all member users.

Synchronization in operation

Once the synchronization is set up, the identities are automatically updated in the target system using the defined scope and configuration. You can make changes in one tenant and the changes can be viewed in the target tenant. The trust settings can be updated to trust the original MFA from the source tenant, but care needs to be taken here. You do not want to allow weaker MFA than the required company policy. A phishing resistant MFA should be standard for any professional company that cares about its employees and security. Now that the identities are synchronized, the next step would be to move the applications and the groups for the shared services.

Links
https://learn.microsoft.com/en-us/azure/active-directory/multi-tenant-organizations/cross-tenant-synchronization-overview
https://learn.microsoft.com/en-us/azure/active-directory/multi-tenant-organizations/cross-tenant-synchronization-configure
https://learn.microsoft.com/en-us/azure/active-directory/multi-tenant-organizations/cross-tenant-synchronization-configure-graph
https://www.microsoft.com/en/security/business/identity-access/azure-active-directory-pricing


