
Azure Topics

1. Logging......................................................................................................................................1
1.1 App Insights...........................................................................................................................1
1.2 Azure Monitoring................................................................................................................11
1.3 Azure Alerts.........................................................................................................................12
Create a single database.............................................................................................................30
Query the database......................................................................................................................34
USING REDIS CACHE WITH ASP.NET CORE 3.1 USING
STACKEXCHANGE.REDIS.EXTENSIONS.CORE EXTENSIONS...........................................40
Create App registrations in Azure portal....................................................................................74
Create ASP.NET Core Web API in Visual Studio 2019..............................................................79
Create Angular 8 application using Angular CLI.......................................................................82
Conclusions...................................................................................................................................91
Scaling Options: Scale-Up vs. Scale-Out....................................................................................91
F1 (free one), D1, B1, B2, and B3.................................................................................................95
Basic Service Plan (B1, B2 and B3).............................................96

1. Logging
1.1 App Insights

 
 
In the App Service of the SPA app, I have the configuration described below.

Prerequisites

1. Visual Studio 2019 (or Visual Studio code)


2. Create an Azure App Service

In this article, we will learn the steps to integrate a .NET Core Web
API with Application Insights, which helps you log all the application
telemetry and gain insights about failures and performance with very little code.
Let’s start.


Step 1 – Create a new ASP.NET Core Web API project

Create a new ASP.NET Core Web API using Visual Studio as shown below.

Azure Application Insights – Create a new ASP.NET Core Web API Application

Once the application is created, it looks like this.

Run the application locally by pressing F5 to check that it works. It should run on some port as shown below.


Step 2 – Download the Microsoft.ApplicationInsights.AspNetCore NuGet package and configure log levels

Once the application is working in the local environment, let’s add the
library that can be used to log errors and telemetry to Application
Insights.
Install the Microsoft.ApplicationInsights.AspNetCore NuGet package.
Once you install it, you can see the package added to the
project file as shown below.

Azure Application Insights – Project File with Microsoft.ApplicationInsights.AspNetCore NuGet Package
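
For reference, the package entry added to the project file looks roughly like the snippet below (the version number is only illustrative; use whichever version NuGet installs for you):

<ItemGroup>
  <PackageReference Include="Microsoft.ApplicationInsights.AspNetCore" Version="2.15.0" />
</ItemGroup>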

In the appsettings.json, add the Application Insights log category as shown below.

Azure Application Insights – App Settings File with ApplicationInsights Log Level
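
As a sketch, an appsettings.json with an Application Insights log level could look like this (the categories and levels shown are an assumption; set them to whatever you want captured):

{
  "Logging": {
    "LogLevel": {
      "Default": "Information"
    },
    "ApplicationInsights": {
      "LogLevel": {
        "Default": "Information",
        "Microsoft": "Warning"
      }
    }
  },
  "AllowedHosts": "*"
}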

Build the application to see if everything is configured properly.

Step 3 – Enable Application Insights and configure dependency injection of the ILogger interface

In this section, we will enable Application Insights in the ConfigureServices method of the Startup class as shown below.

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();
    services.AddApplicationInsightsTelemetry();
}

In order to enable the controllers to leverage the configured log providers (in our case,
Application Insights), we need to inject the ILogger interface through the constructor
as shown below.

private readonly ILogger<WeatherForecastController> _logger;

public WeatherForecastController(ILogger<WeatherForecastController> logger)
{
    _logger = logger;
}

Advertisements

Step 4 – Log errors, information and traces in your application code

Now, we can start logging errors, information, and traces in your
application code using the _logger reference as shown below.
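
The original post shows this step as a screenshot; as a sketch, logging from the template’s WeatherForecastController Get action might look like the code below (the messages and the try/catch are illustrative, not part of the original sample):

[HttpGet]
public IEnumerable<WeatherForecast> Get()
{
    // Trace that the action was called.
    _logger.LogInformation("WeatherForecast Get action invoked");

    try
    {
        var rng = new Random();
        return Enumerable.Range(1, 5).Select(index => new WeatherForecast
        {
            Date = DateTime.Now.AddDays(index),
            TemperatureC = rng.Next(-20, 55)
        })
        .ToArray();
    }
    catch (Exception ex)
    {
        // Errors logged here show up as failures/exceptions in Application Insights.
        _logger.LogError(ex, "Failed to build the weather forecast");
        throw;
    }
}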

Step 5 – Deploy the Web API application to Azure App Service

As one of the prerequisites was to create an Azure App Service – Web App,
this section assumes that you have created an Azure App Service.
Let’s deploy the application to Azure App Service by right-clicking on the
project, clicking Publish, and choosing the appropriate options in the
Publish window. If you have configured everything properly, you should see
something like the screenshot below.

Step 6 – Create Application Insights and integrate it with the Azure App Service

The final step is to create an Application Insights instance and integrate
it with the Azure App Service by following the steps below.

Create an Application Insights resource and copy the Instrumentation Key as shown below.

Azure Application Insights – Copy Instrumentation Key

Navigate to the Configuration blade of the Azure App Service, add
a new App Setting as shown below, and click Save to save the
changes.
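
As a sketch, the app setting pairs the standard Application Insights setting name with the key copied in the previous step (the value is a placeholder):

APPINSIGHTS_INSTRUMENTATIONKEY = <instrumentation key copied from the Application Insights resource>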

Tip: Ensure you create the Azure App Service and the Application
Insights instance in the same location. If not, it could lead to performance issues.

Finally, access the action method, and after 2 minutes (sometimes up to 5
minutes) you should be able to see the logs as shown below.
That’s it. We have learnt how to integrate Azure Application
Insights with a .NET Core 3.1 application hosted in Azure App
Service.

 
 

1.2 Azure Monitoring

Azure Monitor helps you maximize the availability and performance of your
applications and services. It delivers a comprehensive solution for collecting,
analyzing, and acting on telemetry from your cloud and on-premises environments.
This information helps you understand how your applications are performing and
proactively identify issues affecting them and the resources they depend on.

Just a few examples of what you can do with Azure Monitor include:

 Detect and diagnose issues across applications and dependencies


with Application Insights.
 Correlate infrastructure issues with VM insights and Container insights.
 Drill into your monitoring data with Log Analytics for troubleshooting and deep
diagnostics.
 Support operations at scale with smart alerts and automated actions.
 Create visualizations with Azure dashboards and workbooks.
 Collect data from monitored resources using Azure Monitor Metrics.
What you can do with the collected log data:

 Analyze – Use Log Analytics in the Azure portal to write log queries and interactively analyze log data with a powerful analysis engine.
 Alert – Configure a log alert rule that sends a notification or takes automated action when the results of the query match a particular condition.
 Visualize – Pin query results rendered as tables or charts to an Azure dashboard; create a workbook to combine multiple sets of data in an interactive report; export the results of a query to Power BI to use different visualizations and share with users outside Azure; or export the results of a query to Grafana to leverage its dashboarding and combine with other data sources.
 Insights – Use insights that provide a customized monitoring experience for particular applications and services.

1.3 Azure Alerts

Here are the steps to get started writing queries for alerts:

1. Go to the resource you would like to alert on. Consider setting up alert
rules on multiple resources by selecting a subscription or resource group
scope whenever possible. Alerting on multiple resources reduces costs
and the need to manage multiple alert rules.

2. Under Monitor, select Logs.

3. Query the log data that can indicate the issue. You can use the alert
query examples topic to understand what you can discover or get started
on writing your own query. Also, learn how to create optimized alert
queries.

4. Press on '+ New Alert Rule' button to start the alert creation flow.

 Note

It is recommended that you create alerts at scale when using the resource access mode
for logs, which runs the rule on multiple resources using a resource group or subscription
scope. Alerting at scale reduces rule management overhead. To be able to target the
resources, include the resource ID column in the results. Learn more about
splitting alerts by dimensions.
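
For example, a minimal sketch of a query that keeps the resource ID column so the alert can be split per resource (the Heartbeat table is used here purely for illustration):

// Count heartbeats per resource so the alert can fire per resource
Heartbeat
| summarize AggregatedValue = count() by _ResourceId, bin(TimeGenerated, 15m)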
Log alert for Log Analytics and Application Insights

1. If the query syntax is correct, then historical data for the query appears
as a graph with the option to tweak the chart period from the last six
hours to last week.

If your query results contain summarized data or project specific columns without a time column, the chart shows a single value.
2. Choose the time range over which to assess the specified condition,
using the Period option.

3. Log Alerts can be based on two types of Measures:

1. Number of results - Count of records returned by the query.


2. Metric measurement - Aggregate value calculated using summarize
grouped by expressions chosen and bin() selection. For example:
Kusto
// Reported errors
union Event, Syslog // Event table stores Windows event records, Syslog stores Linux records
| where EventLevelName == "Error" // EventLevelName is used in the Event (Windows) records
    or SeverityLevel == "err"     // SeverityLevel is used in Syslog (Linux) records
| summarize AggregatedValue = count() by Computer, bin(TimeGenerated, 15m)

4. For metric measurement alert logic, you can optionally specify how
to split the alerts by dimensions using the Aggregate on option. The row
grouping expression must be unique and sorted.

 Note

As bin() can result in uneven time intervals, the alert service will automatically convert the bin() function to bin_at() with an appropriate time at runtime, to ensure results with a fixed point.

 Note

Split by alert dimensions is only available for the current scheduledQueryRules API. If you use the legacy Log Analytics Alert API, you will need to switch. Learn more about switching. Resource-centric alerting at scale is only supported in API version 2020-05-01-preview and above.
5. Next, based on the preview data set the Operator, Threshold Value,
and Frequency.
6. You can also optionally set the number of violations to trigger an
alert by using Total or Consecutive Breaches.

7. Select Done.

8. Define the Alert rule name, Description, and select the alert Severity.
These details are used in all alert actions. Additionally, you can choose to
not activate the alert rule on creation by selecting Enable rule upon
creation.

9. If you want to suppress rule actions for a time after an alert is
fired, use the Suppress Alerts option. The rule will still run and create
alerts, but actions won't be triggered, to prevent noise. The Mute actions
value must be greater than the frequency of the alert to be effective.

10. Specify if the alert rule should trigger one or more Action Groups when
alert condition is met.

 Note

Refer to the Azure subscription service limits for limits on the actions that can be performed.

11. You can optionally customize actions in log alert rules:

o Custom Email Subject: Overrides the e-mail subject of email actions. You can't modify the body of the mail, and this field isn't for email addresses.
o Include custom Json payload: Overrides the webhook JSON
used by Action Groups assuming the action group contains a
webhook action. Learn more about webhook action for Log Alerts.

12. If all fields are correctly set, the Create alert rule button can be clicked
and an alert is created.

Within a few minutes, the alert is active and triggers as previously described.
Creating log alert for Log Analytics and Application Insights from the alerts management
 Note

Creation from alerts management is currently not supported for resource centric logs

1. In the portal, select Monitor then choose Alerts.


2. Select New Alert Rule.

3. The Create Alert pane appears. It has four parts:

o The resource to which the alert applies.


o The condition to check.
o The actions to take if the condition is true.
o The details to name and describe the alert.
4. Press on Select Resource button. Filter by choosing
the Subscription, Resource Type, and select a resource. Ensure the
resource has logs available.
5. Next, use the add Condition button to view list of signal options
available for the resource. Select Custom log search option.
 Note

The alerts portal lists saved queries from Log Analytics and Application
Insights and they can be used as template alert queries.

6. Once selected, write, paste, or edit the alerting query in the Search Query field.

7. Continue to the next steps described in the last section.

Log alert for all other resource types

 Note

There are currently no additional charges for API version 2020-05-01-preview and
resource-centric log alerts. Pricing for features that are in preview will be announced
in the future and a notice provided prior to the start of billing. Should you choose to
continue using the new API version and resource-centric log alerts after the notice
period, you will be billed at the applicable rate.

1. Start from the Condition tab:

1. Check that the Measure, Aggregation type, and Aggregation granularity are correct.

1. By default, the rule counts the number of results in the last 5 minutes.
2. If we detect summarized query results, the rule will be updated automatically within a few seconds to capture that.

2. Choose alert splitting by dimensions, if needed:

 The Resource ID column is selected automatically, if detected, and changes the context of the fired alert to the record's resource.
 The Resource ID column can be de-selected to fire alerts on subscriptions or resource groups. De-selecting is useful when query results are based on cross-resources. For example, a query that checks if 80% of the resource group's virtual machines are experiencing high CPU usage.
 Up to six more splittings can also be selected for any number or text column types using the dimensions table.
 Alerts are fired separately according to splitting based on unique combinations, and the alert payload includes this information.

3. The Preview chart shows query evaluation results over time. You can change the chart period or select different time series that resulted from unique alert splitting by dimensions.
4. Next, based on the preview data, set the Alert logic: Operator, Threshold Value, and Frequency.
5. You can optionally set Number of violations to trigger the alert in the Advanced options section.

2. In the Actions tab, select or create the required action groups.

3. In the Details tab, define the Alert rule details and Project details. You can optionally choose not to Start running now, or to Mute Actions for a period after the alert rule fires.

 Note

Log alert rules are currently stateless and fire an action every time an
alert is created unless muting is defined.
4. In the Tags tab, set any required tags on the alert rule resource.

5. In the Review + create tab, a validation will run and inform of any issues. Review and approve the rule definition.

6. If all fields are correct, select the Create button and complete the alert
rule creation. All alerts can be viewed from the alerts management.
2. Azure DB

2.1 Azure SQL


Create a single database
This quickstart creates a single database in the serverless compute tier.
This walkthrough uses the Azure portal (the same database can also be created with the Azure CLI or PowerShell).
To create a single database in the Azure portal, this quickstart starts at the Azure
SQL page.

1. Browse to the Select SQL Deployment option page.


2. Under SQL databases, leave Resource type set to Single database, and
select Create.

3. On the Basics tab of the Create SQL Database form, under Project
details, select the desired Azure Subscription.
4. For Resource group, select Create new, enter myResourceGroup, and
select OK.
5. For Database name enter mySampleDatabase.
6. For Server, select Create new, and fill out the New server form with the
following values:
o Server name: Enter mysqlserver, and add some characters
for uniqueness. We can't provide an exact server name to
use because server names must be globally unique for all
servers in Azure, not just unique within a subscription. So
enter something like mysqlserver12345, and the portal lets
you know if it is available or not.
o Server admin login: Enter azureuser.
o Password: Enter a password that meets requirements, and
enter it again in the Confirm password field.
o Location: Select a location from the dropdown list.
7. Select OK.
8. Leave Want to use SQL elastic pool set to No.
9. Under Compute + storage, select Configure database.
10. This quickstart uses a serverless database, so select Serverless, and
then select Apply.
11. Select Next: Networking at the bottom of the page.

12. On the Networking tab, for Connectivity method, select Public endpoint.
13. For Firewall rules, set Add current client IP address to Yes. Leave Allow
Azure services and resources to access this server set to No.
14. Select Next: Additional settings at the bottom of the page.

15. On the Additional settings tab, in the Data source section, for Use
existing data, select Sample. This creates an AdventureWorksLT sample
database so there's some tables and data to query and experiment with,
as opposed to an empty blank database.
16. Select Review + create at the bottom of the page:

17. On the Review + create page, after reviewing, select Create.

Query the database


Once your database is created, you can use the Query editor (preview) in the Azure
portal to connect to the database and query data.

1. In the portal, search for and select SQL databases, and then select your
database from the list.
2. On the page for your database, select Query editor (preview) in the left
menu.
3. Enter your server admin login information, and select OK.

4. Enter the following query in the Query editor pane:

SELECT TOP 20 pc.Name as CategoryName, p.name as ProductName
FROM SalesLT.ProductCategory pc
JOIN SalesLT.Product p
ON pc.productcategoryid = p.productcategoryid;

5. Select Run, and then review the query results in the Results pane.

6. Close the Query editor page, and select OK when prompted to discard your
unsaved edits.
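
Outside the portal, the same query can be run from code. The sketch below uses the Microsoft.Data.SqlClient package; the server name and password are placeholders for the values chosen when the server was created:

using System;
using Microsoft.Data.SqlClient;

class Program
{
    static void Main()
    {
        // Placeholders: use the server, database, and login created in the steps above.
        var connectionString =
            "Server=tcp:<your-server-name>.database.windows.net,1433;" +
            "Database=mySampleDatabase;User ID=azureuser;Password=<your-password>;" +
            "Encrypt=True;";

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        var sql = @"SELECT TOP 20 pc.Name AS CategoryName, p.Name AS ProductName
                    FROM SalesLT.ProductCategory pc
                    JOIN SalesLT.Product p ON pc.productcategoryid = p.productcategoryid;";

        using var command = new SqlCommand(sql, connection);
        using var reader = command.ExecuteReader();
        while (reader.Read())
        {
            // Print CategoryName - ProductName for each row.
            Console.WriteLine($"{reader.GetString(0)} - {reader.GetString(1)}");
        }
    }
}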

2.2 Redis

RedisStackExchange appsettings.json:

{
  "ConnectionStrings": {
    "RedisConnection": ""
  },
  "Logging": {
    "IncludeScopes": false,
    "LogLevel": {
      "Default": "Warning"
    }
  }
}
Controller:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Distributed;
using Newtonsoft.Json;

namespace AzureRedisCache.Controllers
{
    public class HomeController : Controller
    {
        private IDistributedCache _cache;

        public HomeController(IDistributedCache cache)
        {
            this._cache = cache;
        }

        public IActionResult Index()
        {
            string test = _cache.GetString("Test") ?? "";

            if (string.IsNullOrEmpty(test))
            {
                _cache.SetString("Test", "Tested");
                test = _cache.GetString("Test") ?? "";
            }

            ViewData["Test"] = test;

            return View();
        }
    }
}
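
Note: the controller above uses IDistributedCache rather than the StackExchange.Redis.Extensions client shown in the Startup class that follows. If you go the IDistributedCache route, one way to wire it up (an assumption, using the Microsoft.Extensions.Caching.StackExchangeRedis package) is:

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllersWithViews();

    // Reads the "RedisConnection" connection string defined in appsettings.json above.
    services.AddStackExchangeRedisCache(options =>
    {
        options.Configuration = Configuration.GetConnectionString("RedisConnection");
    });
}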

Startup.cs

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.HttpsPolicy;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using StackExchange.Redis.Extensions.Core.Configuration;
using StackExchange.Redis.Extensions.Newtonsoft;

namespace WebCacheStackExchangeDemo
{
    public class Startup
    {
        public Startup(IConfiguration configuration)
        {
            Configuration = configuration;
        }

        public IConfiguration Configuration { get; }

        // This method gets called by the runtime. Use this method to add services to the container.
        public void ConfigureServices(IServiceCollection services)
        {
            var redisConfiguration = Configuration.GetSection("Redis").Get<RedisConfiguration>();

            services.AddControllersWithViews();

            services.AddStackExchangeRedisExtensions<NewtonsoftSerializer>(redisConfiguration);
        }

        // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }
            else
            {
                app.UseExceptionHandler("/Home/Error");

                // The default HSTS value is 30 days. You may want to change this for production scenarios, see https://aka.ms/aspnetcore-hsts.
                app.UseHsts();
            }

            app.UseHttpsRedirection();
            app.UseStaticFiles();

            app.UseRouting();

            app.UseAuthorization();

            app.UseEndpoints(endpoints =>
            {
                endpoints.MapControllerRoute(
                    name: "default",
                    pattern: "{controller=Demo}/{action=Index}/{id?}");
            });
        }
    }
}

RedisExtension :

USING REDIS CACHE WITH ASP.NET CORE 3.1 USING STACKEXCHANGE.REDIS.EXTENSIONS.CORE EXTENSIONS

In this article, we are going to learn how to Access Azure Redis cache using
StackExchange.Redis.Extensions.Core Extensions.
“StackExchange.Redis.Extensions.Core” Extensions are written by Ugo Lattanzi.
Azure Cache for Redis provides you access to a secure, dedicated Redis cache. Azure Cache
for Redis is managed by Microsoft, hosted within Azure, and accessible to any application
within or outside of Azure.

 Creating application ASP.NET Core Application

 Installing the StackExchange.Redis.Extensions.Core,
StackExchange.Redis.Extensions.AspNetCore, and StackExchange.Redis.Extensions.Newtonsoft
packages from NuGet.
 Creating a Resource “Azure Cache for Redis” on Azure.
 Getting access keys

 Adding Access Keys to appsettings.json file

 Adding “AddStackExchangeRedisExtensions” method in the ConfigureServices


method

 Adding DemoController and injecting IRedisCacheClient for accessing method.


 Implementing AddAsync, AddAllAsync methods

 Implementing GetAsync, GetAllAsync methods

 Implementing RemoveAsync, RemoveAllAsync methods

 Implementing ExistsAsync methods

 Implementing SearchKeysAsync methods

Let’s start with creating an application.

Creating the ASP.NET Core application

Next, we are going to set the project name "WebCacheStackExchangeDemo" and the location. In the last
step, we choose .NET Core and ASP.NET Core version 3.1 as the framework for the application,
along with a few advanced settings such as configuring HTTPS and enabling Docker; we are not
going to enable the Docker settings for this project.
Finally, click on the Create button to create the project.

Project structure
The project structure is generated according to the configuration.

After creating the project, we are going to install the packages below from NuGet:
 StackExchange.Redis.Extensions.Core
 StackExchange.Redis.Extensions.AspNetCore
 StackExchange.Redis.Extensions.Newtonsoft

Creating a Resource “Azure Cache for Redis” on Azure

We are going to create an “Azure Cache for Redis” resource on Azure.
After logging into the Azure portal and clicking on Create a Resource, the view below is displayed.

Now, from the Azure Marketplace, we are going to choose Databases.

After choosing Databases, we are going to select “Azure Cache for Redis“.
After selecting Azure Cache for Redis, we see the Create screen for the Redis cache.
On this screen, we set the DNS name to “CoreCache“ and pick whichever location is right
for you; for this demo I am going to choose the “Central India” location. We also take the
basic tier “Basic C0“.
After configuring the settings, we click on the Create button to create the cache.
This process takes a bit of time, but you can monitor its status. If the status shows
Running, your Redis cache is ready to use.
After the deployment of Redis succeeds, we are going to get the access keys to access the Redis
resource from C#.
Getting Access keys
Here we click on the resource which we have created, “CoreCache”; under Settings you
will find Access keys.

Next, we are going to set the connection string in the appsettings.json file. For that, we are going to use
the primary connection string from Access keys.
"Redis": {
"Password": "uefEg7DelBFFbAIw=",
"AllowAdmin": true,
"Ssl": true,
"ConnectTimeout": 6000,
"ConnectRetry": 2,
"Database": 0,
"Hosts": [
{
"Host": "CoreCacheDemo.redis.cache.windows.net",
"Port": "6380"
}
]
}
appsettings.json file
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft": "Warning",
"Microsoft.Hosting.Lifetime": "Information"
}
},
"AllowedHosts": "*",
"Redis": {
"Password": "uefEg7DezCTgTJDnECMexdKqxTlg+NiWkRilBFFbAIw=",
"AllowAdmin": true,
"Ssl": true,
"ConnectTimeout": 6000,
"ConnectRetry": 2,
"Database": 0,
"Hosts": [
{
"Host": "CoreCache.redis.cache.windows.net",
"Port": "6380"
}
]
}
}
After setting the connection string in the appsettings.json file, we register the
“AddStackExchangeRedisExtensions” service in the ConfigureServices method.
Adding the “AddStackExchangeRedisExtensions” method in the ConfigureServices method
In the ConfigureServices method, first we read the Redis connection settings from the
appsettings.json file.
var redisConfiguration = Configuration.GetSection("Redis").Get<RedisConfiguration>();
After reading it, we pass the values to the AddStackExchangeRedisExtensions method as
shown below.
public void ConfigureServices(IServiceCollection services)
{
    var redisConfiguration = Configuration.GetSection("Redis").Get<RedisConfiguration>();

    services.AddControllersWithViews();

    services.AddStackExchangeRedisExtensions<NewtonsoftSerializer>(redisConfiguration);
}
After registering the service, we add a controller.
Adding DemoController and injecting IRedisCacheClient
We are going to add a controller named DemoController; after adding it, we add a constructor
to it for injecting the IRedisCacheClient dependency.
using Microsoft.AspNetCore.Mvc;
using StackExchange.Redis.Extensions.Core.Abstractions;

namespace WebCacheStackExchangeDemo.Controllers
{
public class DemoController : Controller
{
private IRedisCacheClient _redisCacheClient;
public DemoController(IRedisCacheClient redisCacheClient)
{
_redisCacheClient = redisCacheClient;
}
public IActionResult Index()
{
return View();
}
}
}
After creating the controller and adding a constructor for injecting the dependency, we
add a simple Product model and then implement the methods using it.

Adding Product Model for demo


namespace WebCacheStackExchangeDemo.Models
{
public class Product
{
public int Id { get; set; }
public string Name { get; set; }
public double Price { get; set; }
}
}
Implementing AddAsync, AddAllAsync methods
AddAsync
Storing an object into Redis using AddAsync.
While storing, we are using “Db0”, which is database 0; you can configure up to 16 different
databases. The AddAsync method takes:
 Key

 Object to store

 DateTimeOffset (expiry)
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using StackExchange.Redis.Extensions.Core.Abstractions;
using WebCacheStackExchangeDemo.Models;

namespace WebCacheStackExchangeDemo.Controllers
{
public class DemoController : Controller
{
private readonly IRedisCacheClient _redisCacheClient;
public DemoController(IRedisCacheClient redisCacheClient)
{
_redisCacheClient = redisCacheClient;
}

public async Task<IActionResult> Index()


{
var product = new Product()
{
Id = 1,
Name = "hand sanitizer",
Price = 100
};

bool isAdded = await _redisCacheClient.Db0.AddAsync("Product",


product, DateTimeOffset.Now.AddMinutes(10));

return View();
}
}
}
Output
I was previously using Redis Desktop Manager to see the keys and values stored in
Redis, but that tool is no longer free. You can use another free
tool, AnotherRedisDesktopManager, which has nice features.
Download Another-Redis-Desktop-Manager

AddAllAsync methods
Storing multiple objects with a single round trip.
We are storing a list of products with different keys in a single go; you can use this method to
add multiple keys to Redis in a single request.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using StackExchange.Redis.Extensions.Core.Abstractions;
using WebCacheStackExchangeDemo.Models;
namespace WebCacheStackExchangeDemo.Controllers
{
public class DemoController : Controller
{
private readonly IRedisCacheClient _redisCacheClient;
public DemoController(IRedisCacheClient redisCacheClient)
{
_redisCacheClient = redisCacheClient;
}

public async Task<IActionResult> Index()


{

var values = new List<Tuple<string, Product>>


{
new Tuple<string, Product>("Product1", new Product()
{
Id = 1,
Name = "hand sanitizer 1",
Price = 100
}),
new Tuple<string, Product>("Product2",new Product()
{
Id = 2,
Name = "hand sanitizer 2",
Price = 200
}),
new Tuple<string, Product>("Product3", new Product()
{
Id = 3,
Name = "hand sanitizer 3",
Price = 300
})
};

await _redisCacheClient.Db0.AddAllAsync(values,
DateTimeOffset.Now.AddMinutes(30));

return View();
}
}
}
Output
Implementing GetAsync, GetAllAsync methods
GetAsync
GetAsync retrieves an object which is stored in Redis.
Here we are going to get the stored Product object from the Azure Redis cache. To get it, we
use the GetAsync method and pass the key (‘Product’) to retrieve the object.
Using the GetAsync method, we can retrieve both single objects and lists of objects.
using Microsoft.AspNetCore.Mvc;
using StackExchange.Redis.Extensions.Core.Abstractions;
using System.Threading.Tasks;
using WebCacheStackExchangeDemo.Models;

namespace WebCacheStackExchangeDemo.Controllers
{
public class DemoController : Controller
{
private readonly IRedisCacheClient _redisCacheClient;
public DemoController(IRedisCacheClient redisCacheClient)
{
_redisCacheClient = redisCacheClient;
}

public async Task<IActionResult> Index()


{
var productdata = await
_redisCacheClient.Db0.GetAsync<Product>("Product");
return View();
}
}
}
Object to retrieve

Output
GetAllAsync
Retrieves multiple objects stored in Redis with a single round trip.
Here we are passing multiple keys and getting all objects related to those keys.

using System;
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;
using StackExchange.Redis.Extensions.Core.Abstractions;
using System.Threading.Tasks;
using WebCacheStackExchangeDemo.Models;

namespace WebCacheStackExchangeDemo.Controllers
{
public class DemoController : Controller
{
private readonly IRedisCacheClient _redisCacheClient;
public DemoController(IRedisCacheClient redisCacheClient)
{
_redisCacheClient = redisCacheClient;
}

public async Task<IActionResult> Index()


{
List<string> allKeys = new List<string>()
{
"Product1","Product2","Product3"
};

var listofProducts = await


_redisCacheClient.Db0.GetAllAsync<Product>(allKeys);
return View();
}
}
}
Objects to retrieve
Output

Implementing RemoveAsync, RemoveAllAsync methods


RemoveAsync
Removing a single object which is stored in the Azure Redis cache.
Here we just need to pass the key “Product” to the RemoveAsync method to remove the object.
using Microsoft.AspNetCore.Mvc;
using StackExchange.Redis.Extensions.Core.Abstractions;
using System.Threading.Tasks;

namespace WebCacheStackExchangeDemo.Controllers
{
public class DemoController : Controller
{
private readonly IRedisCacheClient _redisCacheClient;
public DemoController(IRedisCacheClient redisCacheClient)
{
_redisCacheClient = redisCacheClient;
}

public async Task<IActionResult> Index()


{
bool isRemoved = await
_redisCacheClient.Db0.RemoveAsync("Product");
return View();
}
}
}
Object to remove
Output

RemoveAllAsync
Removing multiple objects stored in the Azure Redis cache with a single round trip.
Here we are passing multiple keys to the RemoveAllAsync method.
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;
using StackExchange.Redis.Extensions.Core.Abstractions;
using System.Threading.Tasks;

namespace WebCacheStackExchangeDemo.Controllers
{
public class DemoController : Controller
{
private readonly IRedisCacheClient _redisCacheClient;
public DemoController(IRedisCacheClient redisCacheClient)
{
_redisCacheClient = redisCacheClient;
}

public async Task<IActionResult> Index()


{
List<string> allKeys = new List<string>()
{
"Product1",
"Product2",
"Product3"
};

await _redisCacheClient.Db0.RemoveAllAsync(allKeys);
return View();
}
}
}
Objects to remove
Output

Implementing ExistsAsync methods


The ExistsAsync method is used to check whether an object exists in the Azure Redis Cache.
If the object exists, the ExistsAsync method returns true; otherwise it returns false.
using Microsoft.AspNetCore.Mvc;
using StackExchange.Redis.Extensions.Core.Abstractions;
using System.Threading.Tasks;

namespace WebCacheStackExchangeDemo.Controllers
{
public class DemoController : Controller
{
private readonly IRedisCacheClient _redisCacheClient;
public DemoController(IRedisCacheClient redisCacheClient)
{
_redisCacheClient = redisCacheClient;
}

public async Task<IActionResult> Index()


{
bool isExists = await
_redisCacheClient.Db0.ExistsAsync("Product");
return View();
}
}
}
Objects
Output

Implementing SearchKeysAsync methods


The SearchKeysAsync method is used to search for keys in the Redis cache.
 To search all keys that start with “Product”, use the pattern “Product*“.

 To search all keys that contain “Product”, use the pattern “*Product*“.

 To search all keys that end with “Product”, use the pattern “*Product”.
using System;
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;
using StackExchange.Redis.Extensions.Core.Abstractions;
using System.Threading.Tasks;
using WebCacheStackExchangeDemo.Models;

namespace WebCacheStackExchangeDemo.Controllers
{
public class DemoController : Controller
{
private readonly IRedisCacheClient _redisCacheClient;
public DemoController(IRedisCacheClient redisCacheClient)
{
_redisCacheClient = redisCacheClient;
}

public async Task<IActionResult> Index()


{
// If you want to search all keys that start with 'Product*'
var listofkeys1 = await
_redisCacheClient.Db0.SearchKeysAsync("Product*");

// If you want to search all keys that contain with '*Product*'


var listofkeys2 = await
_redisCacheClient.Db0.SearchKeysAsync("*Product*");

// If you want to search all keys that end with '*Product'


var listofkeys3 = await
_redisCacheClient.Db0.SearchKeysAsync("*Product");

return View();
}
}
}
Objects to search

Output

2.3 Cosmos

2.4 Blob

3. Azure PAAS & SAAS

3.1 App Service

Azure Account

First, we need to create an account on the Azure portal. Only then can we host the application in the
cloud environment. So, please check the following steps to create an Azure account.

Azure Account Registration 

 
Create an account through this link to Azure Portal.

Domain Registration

We need to host our application in a particular domain. Check the following steps -

1. Click on "All resources" on the left side menu and it will open a dashboard with an empty or
already existing list of resources that we have created earlier.

2. Click on the "Add" button and it will open another window with multiple options. We can
choose an appropriate option to host our application.

3. As per our requirement, we choose "Web + Mobile" and click on "Web App" on the
right side.

4. It will open another form to fill in our app details for hosting. We need to give a unique name in
the "App name" section, and it will create a subdomain for our ASP.NET Core application.

5. We choose subscription as "Free Trial" because we created a free account on the Azure
portal.

6. We need to host our app resources in a resource group, so first we need to create a resource
group in our Azure account. Here we choose the existing resource group
"AzureDemo" that we already created in our Azure account.

7. "OS ( Operating System )" we selected as "Windows"( As per our requirement ).

8. We can create our own App Service Plan name.

9. Application Insight will give you the more clarity about your hosted app. Eg. analytics, etc.

10. Click on the “Create” button and wait for the build success.
11. Another way to create the Resource Group Name - click on "Resource groups -> Add".

 
12. Once the deployment has succeeded, we can see this output.

Simple steps to create an ASP.NET Core application

1. Open Visual Studio, then click on File > New > Project.
2. Select Visual C# > Web > ASP.NET Core Web Application.
3. We have given our application name as "MyFirstAzureWebApp".
4. Then, click OK.
5. Press Ctrl+F5 to run the application.

App Publishing into Azure

We created a default ASP.NET Core application (with some changes in the UI section) for the publishing process.

1. Right-click on the application and click on the Publish menu.

2. Click on "Microsoft Azure App Service".

3. We choose our existing resource group ("AzureDemo", which we created earlier)
in our Azure portal, and it will display the app name inside the "AzureDemo"
folder. This is displayed only when we are logged in through Visual Studio using
our Azure credentials (email and password).
 

3.2 Azure VDI

3.3 Azure Functions

4. Azure Hosting & Networking

4.1 Docker

4.2 Azure Load Balancing

Load Balancers distribute incoming traffic over multiple servers. Here, the load balancer will
distribute the load over two Virtual Machines that I have already created.

1. First, click on "Create a Resource", and find Load Balancer under the Networking
category.
2. Set the name of Load Balancer, keep it Public. Set the Public Address. Create
new, keep it dynamic, and give it some name. 

3. I am using an existing resource group where I have already created two Virtual
Machines. Give the location and click on "Create". 
1. First, we are going to configure the Health Probes of our Load Balancer.
These define how the Load Balancer checks the health of the servers behind it.
4. So, inside Load Balancer, go to Health Probes and click on Add. 

 
5. Give a name to the Health Probe. The interval shows that it is going to make a probe
attempt at five-second intervals, and the unhealthy threshold shows that after
two failed probe attempts it will declare the server unhealthy.

1. Now comes the role of Backend Pools.


2. In the Backend Pools, we define the servers which we are going to use in
the Load Balancer.
6. Now click on Backend Pools and click on Add. 

7. Here I have given some name to my Backend Pool. I have associated it with an
availability set that I had created. This AvailSet1 contains two of my VMs. Next,
we need to add target network IP configuration. 

8. First I have selected the VM1 and its network IP configuration. 


9. Then I have selected the second VM and its IP configuration. Finally click on OK. 
10. So the backend pool has been configured now. The next step is configuring Load
Balancing Rules.

11. Click on Load Balancing Rules and click on Add. 


12. Here I have given some name to the rule. The IP version is IPv4, protocol is TCP,
port number is 80, backend port is 80, backend pool is the one that we just
created and the health probe is the one that we just created. 

13. Session persistence is none by default and idle timeout is 4 minutes by default.
Floating IP disabled. Click on OK. 
14. So our Load Balancing Rule has been created. 

15. Both of my VMs which are under the Load Balancer are in running mode. 
16. In the Load Balancer, I have my Public IP address. I am going to copy it and paste
it in the address bar. 

17. On the first hit, it has taken me to VM1’s server. 


18. But if I hit the same address in another browser window, it is taking me to VM2’s
server. 
19. Side by side they would look something like this.


4.3 Azure App Management Middleware

4.4 Azure AD

Azure Active Directory is the Identity and Access Management (IAM) solution
offered by Microsoft.
Azure AD can authenticate accounts from different sources, which are as
follows:

 Azure Subscription
 Office 365 Subscription
 Microsoft Intune and Dynamics Subscription
 Third Party cloud-based SaaS applications (which supports Azure AD
authentication)
 On Premise Active Directory accounts

You can refer to this Microsoft document to get more details about the Azure AD
authentication.
We need two app registrations in Azure portal for AD authentication. One for
Web API application and one for Angular application.
Create App registrations in Azure portal
Login to Azure portal -> click Azure Active Directory blade
 

Choose App registration blade


 
Click + New registration
 
Give a valid name and redirect URI here. We can give the redirect URI in angular
code as well.
 
 
 
Click “Register” button to create the app.
 

 
We can see the app registration details like client id, tenant id etc. We must use
these details later in our Angular application.
 
Click Authentication tab in the left side and select Access Token and Id tokens
and click Save button.
 
 
We will use these tokens for our Authentication and Authorization purpose later.
We can click App registration blade again and create a new app registration for
Web API.
 
We have given the app name only. There is no need to give a redirect URI here because this is
for the API.
 
Click “Register” button to create app.
 
After the app registration is created, click the “Expose an API” blade.
 
Click Add Scope button
 

 
Click Save button.
 
A new window will appear, and we can enter the scope name, consent name,
and consent description in this window.
 

 
We will use this consent scope later in our Angular application.
 
We can link our previously created client application to this API app registration.
 
Click + Add a client application button and give correct client id from previously
created app registration. Don’t forget to select consent scope along with client
id.
 
 
We have successfully created app registration for both UI and API. Now we can
create ASP.NET Core web API and Angular 8 application and enable Azure AD
authentication.

Create ASP.NET Core Web API in Visual Studio 2019


We can create ASP.NET Core Web API application using default API template in
Visual Studio.
 
We must install “Microsoft.AspNetCore.Authentication.AzureAD.UI” library using
NuGet. This is used for AD authentication.
 
We have already created two app registrations in Azure active directory. We can
use the client id and tenant id for API here in appsettings as given below.
 
appsettings.json

1. {  
2.   "Logging": {  
3.     "LogLevel": {  
4.       "Default": "Information",  
5.       "Microsoft": "Warning",  
6.       "Microsoft.Hosting.Lifetime": "Information"  
7.     }  
8.   },  
9.   "AllowedHosts": "*",  
10.   "AzureActiveDirectory": {  
11.     "Instance": "https://login.microsoftonline.com/",  
12.     "Domain": "<your domain>.onmicrosoft.com",  
13.     "TenantId": "adbbbd82-76e5-4952-8531-3cc59f3c1fdd",  
14.     "ClientId": "api://e283d8fb-22ad-4e2c-9541-14f6f118a08f"
15.   }  
16. }  

We can register the authentication service inside the ConfigureServices method in the Startup class. Also add the CORS service.
 
Startup.cs

1. using Microsoft.AspNetCore.Authentication;  
2. using Microsoft.AspNetCore.Authentication.AzureAD.UI;  
3. using Microsoft.AspNetCore.Builder;  
4. using Microsoft.AspNetCore.Hosting;  
5. using Microsoft.Extensions.Configuration;  
6. using Microsoft.Extensions.DependencyInjection;  
7. using Microsoft.Extensions.Hosting;  
8. using System;  
9.   
10. namespace AzureADAPI  
11. {  
12.     public class Startup  
13.     {  
14.         public Startup(IConfiguration configuration)  
15.         {  
16.             Configuration = configuration;  
17.         }  
18.   
19.         public IConfiguration Configuration { get; }  
20.   
21.         // This method gets called by the runtime. Use this method to add services to the container.
22.         public void ConfigureServices(IServiceCollection services)
23.         {
24.             services.AddControllers();
25.
26.             services.AddAuthentication(AzureADDefaults.BearerAuthenticationScheme).AddAzureADBearer(options => Configuration.Bind("AzureActiveDirectory", options));
27.
28.             string corsDomains = "http://localhost:4200";
29.             string[] domains = corsDomains.Split(",".ToCharArray(), StringSplitOptions.RemoveEmptyEntries);
30.
31.             services.AddCors(o => o.AddPolicy("AppCORSPolicy", builder =>
32.             {
33.                 builder.WithOrigins(domains)
34.                        .AllowAnyMethod()
35.                        .AllowAnyHeader()
36.                        .AllowCredentials();
37.
38.             }));
39.
40.         }
41.
42.         // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
43.         public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
44.         {  
45.             if (env.IsDevelopment())  
46.             {  
47.                 app.UseDeveloperExceptionPage();  
48.             }  
49.   
50.             app.UseCors("AppCORSPolicy");  
51.   
52.             app.UseRouting();  
53.   
54.             app.UseAuthentication();  
55.             app.UseAuthorization();  
56.   
57.             app.UseEndpoints(endpoints =>  
58.             {  
59.                 endpoints.MapControllers();  
60.             });  
61.         }  
62.     }  
63. }  

Create an Employee class. This will be used in our Employees controller class to
return some dummy data to Angular application later.
 
Employee.cs

1. namespace AzureADAPI  
2. {  
3.     public class Employee  
4.     {  
5.         public int Id { get; set; }  
6.         public string Name { get; set; }  
7.         public string Company { get; set; }  
8.         public string City { get; set; }  
9.     }  
10. }  
Create Employee controller with single Get method. This method will be called
from Angular application to test AD authentication.
 
EmployeeController.cs

1. using Microsoft.AspNetCore.Authorization;
2. using Microsoft.AspNetCore.Mvc;
3. using System.Collections.Generic;
4.
5. // For more information on enabling Web API for empty projects, visit https://go.microsoft.com/fwlink/?LinkID=397860
6.
7. namespace AzureADAPI.Controllers
8. {
9.     [Authorize]
10.     [Route("api/[controller]")]
11.     public class EmployeesController : Controller
12.     {
13.         [HttpGet]
14.         public IEnumerable<Employee> Get()
15.         {
16.             List<Employee> employees = new List<Employee>
17.             {
18.                 new Employee { Id = 1, Name = "Sarathlal Saseendran", Company = "Orion Business Innovations", City = "Kochi" },
19.                 new Employee { Id = 2, Name = "Anil Soman", Company = "Cognizant", City = "Bangalare" }
20.             };
21.             return employees;
22.         }
23.     }
24. }

Please note, we have decorated the above controller with the [Authorize] attribute.

 
We have completed the API application enabled with AD authentication. We can
now create the Angular application from scratch and add all the components and
services.
 

Create Angular 8 application using Angular CLI


I am still using the stable version of Angular 8 LTS (8.3.26)
 
As we discussed earlier, we are using “@azure/msal-angular” library for AD
authentication. Unfortunately, the latest version is still in preview mode. Hence, I
am using the stable version 0.1.4 in our application.
 
Please use below command to install this stable version.
 
npm i @azure/msal-angular@0.1.4
 
We need RxJs compatible version also in our application.
 
Please install that also.
 
npm i rxjs-compat
 
We can add AD related client id, tenant id, consent scope details inside the
environment variable.
 
environment.ts

1. export const environment = {  
2.   production: false,  
3.   baseUrl:'http://localhost:58980/',  
4.   scopeUri: ['api://e283d8fb-22ad-4e2c-9541-14f6f118a08f/sarath'],
5.   tenantId: 'adbbbd82-76e5-4952-8531-3cc59f3c1fdd',  
6.   uiClienId: '28a65047-6d13-4566-aba6-bd6d6dcd170b',  
7.   redirectUrl: 'http://localhost:4200'  
8. };  

We can create a services folder and create a MsalUserService class.


 
msaluser.service.ts

1. import { Injectable } from '@angular/core';
2. import * as Msal from 'msal';
3. import { environment } from 'src/environments/environment';
4. import { Observable } from 'rxjs';
5.
6. @Injectable()
7. export class MsalUserService {
8.     private accessToken: any;
9.     public clientApplication: Msal.UserAgentApplication = null;
10.     constructor() {
11.         this.clientApplication = new Msal.UserAgentApplication(
12.             environment.uiClienId,
13.             'https://login.microsoftonline.com/' + environment.tenantId,
14.             this.authCallback,
15.             {
16.                 storeAuthStateInCookie: true,
17.                 //cacheLocation: 'localStorage' ,
18.             });
19.     }
20.
21.     public GetAccessToken(): Observable<any> {
22.         if (sessionStorage.getItem('msal.idtoken') !== undefined && sessionStorage.getItem('msal.idtoken') != null) {
23.             this.accessToken = sessionStorage.getItem('msal.idtoken');
24.         }
25.         return this.accessToken;
26.     }
27.
28.     public authCallback(errorDesc, token, error, tokenType) {
29.         if (token) {
30.
31.         } else {
32.             console.log(error + ':' + errorDesc);
33.         }
34.     }
35.
36.     public getCurrentUserInfo() {
37.         const user = this.clientApplication.getUser();
38.         alert(user.name);
39.     }
40.
41.     public logout() {
42.         this.clientApplication.logout();
43.     }
44. }

We use this service for getting the access token (for authentication), for the logout
function, and also for getting the logged-in user name.
 
We can create an employee class
 
Employee.ts

1. export class Employee {  
2.     id: number;  
3.     name: string;  
4.     company: string;  
5.     city: string;  
6. }  

We can create a DataService for getting employee data from Web API
application.
 
data.service.ts

1. import { Injectable } from '@angular/core';
2. import { HttpClient, HttpHeaders } from '@angular/common/http';
3. import { Observable } from 'rxjs';
4. import { environment } from 'src/environments/environment';
5. import { MsalUserService } from './msaluser.service';
6. import { Employee } from './employee';
7.
8. @Injectable({
9.     providedIn: 'root'
10. })
11. export class DataService {
12.     private url = environment.baseUrl + 'api/employees';
13.
14.     httpOptions = {
15.         headers: new HttpHeaders({
16.             'Content-Type': 'application/json'
17.         })
18.     };
19.
20.     constructor(private http: HttpClient, private msalService: MsalUserService
21.     ) { }
22.
23.     getEmployees(): Observable<Employee[]> {
24.
25.         this.httpOptions = {
26.             headers: new HttpHeaders({
27.                 'Content-Type': 'application/json',
28.                 'Authorization': 'Bearer ' + this.msalService.GetAccessToken()
29.             })
30.
31.         };
32.
33.         return this.http.get(this.url, this.httpOptions)
34.             .pipe((response: any) => {
35.                 return response;
36.             });
37.     }
38.
39.     getCurrentUserInfo(){
40.         this.msalService.getCurrentUserInfo();
41.     }
42.
43.     logout(){
44.         this.msalService.logout();
45.     }
46. }

For every request, we get the access token from MsalUserService and add it
to the request header.
 
We can add default MsalGuard inside the AppRoutingModule class.
 
app-routing.module.ts

1. import { NgModule } from '@angular/core';  
2. import { Routes, RouterModule } from '@angular/router';  
3. import { AppComponent } from './app.component';  
4. import { MsalGuard } from '@azure/msal-angular';  
5.   
6.   
7. const routes: Routes = [  
8.   {  
9.     path: '',  
10.     component: AppComponent,  
11.     canActivate: [MsalGuard]  
12.   }  
13. ];  
14.   
15. @NgModule({  
16.   imports: [RouterModule.forRoot(routes)],  
17.   exports: [RouterModule]  
18. })  
19. export class AppRoutingModule { }  

Whenever we route to the App component, MsalGuard will protect the component with AD authentication.
 
We can modify the AppModule class with below code.
 
app.module.ts

1. import { BrowserModule } from '@angular/platform-browser';
2. import { NgModule } from '@angular/core';
3.
4. import { AppRoutingModule } from './app-routing.module';
5. import { AppComponent } from './app.component';
6. import { environment } from 'src/environments/environment';
7. import { MsalModule, MsalInterceptor } from '@azure/msal-angular';
8. import { HttpClientModule, HttpClient, HTTP_INTERCEPTORS } from '@angular/common/http';
9. import { MsalUserService } from './services/msaluser.service';
10.
11. export const protectedResourceMap: any =
12.   [
13.     [environment.baseUrl, environment.scopeUri
14.     ]
15.   ];
16.
17. @NgModule({
18.   declarations: [
19.     AppComponent
20.   ],
21.   imports: [
22.     MsalModule.forRoot({
23.       clientID: environment.uiClienId,
24.       authority: 'https://login.microsoftonline.com/' + environment.tenantId,
25.       //cacheLocation: 'localStorage',
26.       protectedResourceMap: protectedResourceMap,
27.       redirectUri: environment.redirectUrl
28.     }),
29.     BrowserModule,
30.     AppRoutingModule,
31.     HttpClientModule
32.   ],
33.   providers: [
34.     HttpClient,
35.     MsalUserService,
36.     {
37.       provide: HTTP_INTERCEPTORS, useClass: MsalInterceptor, multi: true
38.     }
39.   ],
40.   bootstrap: [AppComponent]
41. })
42. export class AppModule { }

We have registered MsalModule and MsalInterceptor inside the AppModule.


 
We can modify the default App component with below code.
 
app.component.ts

1. import { Component } from '@angular/core';  
2. import { Employee } from './services/employee';  
3. import { DataService } from './services/data.service';  
4.   
5. @Component({  
6.   selector: 'app-root',  
7.   templateUrl: './app.component.html',  
8.   styleUrls: ['./app.component.css']  
9. })  
10. export class AppComponent {  
11.   title = 'AzureMSALAngular';  
12.   
13.   employees: Employee[];  
14.   errorMessage: any;  
15.   
16.   constructor(private dataService: DataService) { }  
17.   
18.   ngOnInit(): void {  
19.     this.dataService.getEmployees().subscribe(  
20.       values => {  
21.         this.employees = values;  
22.       },  
23.       error => this.errorMessage = <any>error  
24.     );  
25.   }  
26.   
27.   getUser(){  
28.     this.dataService.getCurrentUserInfo();  
29.   }  
30.   
31.   logout(){  
32.     this.dataService.logout();  
33.   }  
34. }  

Also modify the template file.


 
app.component.html

1. <h3>Azure AD Authentication with Azure Angular MSAL library</h3>
2.   
3. <hr>  
4.   
5. <table>    
6.   <thead>    
7.     <tr>    
8.       <th>Id</th>    
9.       <th>Name</th>    
10.       <th>Company</th>    
11.       <th>City</th>    
12.     </tr>    
13.   </thead>    
14.   <tbody>    
15.     <tr *ngFor="let employee of employees">    
16.       <td>{{ employee.id }}</td>    
17.       <td>{{ employee.name }}</td>    
18.       <td>{{ employee.company }}</td>    
19.       <td>{{ employee.city }}</td>    
20.     </tr>    
21.   </tbody>    
22. </table>    
23.   
24. <hr>  
25.   
26. <button (click)="getUser()">User Name</button>  
27. <button (click)="logout()">Logout</button>  

We have completed the coding part of the Angular application as well.
 
We can run both the Web API and the Angular application.
 
The application will immediately ask for your AD credentials.
 
After a successful login, it will ask for your consent to access the AD app
registration. Once you approve the consent, it will not ask again.
 
We can see the dummy Employee data has been retrieved from the Web API
successfully.

If you click the User Name button, you can see the logged-in user name.
 
If you click the Logout button, the application will be signed out successfully.

Conclusion
In this post, we have seen how to create an Azure AD enabled ASP.NET Core
Web API application and an Angular 8 application and make them communicate with each
other. We have used the "@azure/msal-angular" library to enable Azure AD in the
Angular application. This library is a wrapper for the base library “msal”. The latest
version of this library is still in preview, so I have used the stable version. We can
create AD-enabled applications using the “msal” library directly as well. We can see those
details in another article.

4.5 Azure App Service Scaling


Scaling Options: Scale-Up vs. Scale-Out
 

Scale-up and scale-out are two primary workflows for scaling.

Scale-Out (Horizontal scaling)

Scaling out means adding more instances of the VM that runs your app. In other words, it increases the number of VM instances, up to 30 depending upon your pricing tier. In an Isolated tier, we can scale out further, up to 100 instances, based on our requirements. The scale-out count can be set manually or through autoscaling based on rules.

We can do scale-out from the Azure portal, as shown below:


 

Autoscale is a built-in feature that helps applications perform their best when demand changes. You can scale your resource manually to a specific instance count, via a custom autoscale policy that scales on metric thresholds, or on a schedule that changes the instance count during designated time windows. Autoscale keeps your resource performant and cost-effective by adding and removing instances based on demand.

Select Scale out from the left navigation. We can then either choose manual scale and set the instance count as per our requirements, or choose custom autoscale based on rules, as shown below:

 
 

We have to give the autoscale setting a name and select its resource group. We then get a few more options, such as scale mode, rules, instance limits, and schedule. This is how we do horizontal scaling (i.e., scale-out) in Azure App Service.

Scale-up (Vertical scaling)

In this scaling option, we change the power of the instance in terms of CPU, memory, and disk space, which directly impacts the cost. Scale-up is done by changing the pricing tier of the App Service plan. Additionally, based on the pricing tier, different features become available with Azure App Service, such as custom domains and certificates, staging slots, autoscaling, more instances, daily backups, and many more.
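Scale-up is effectively just a SKU change on the App Service plan, so it can also be scripted. The following Azure CLI sketch uses placeholder names.

# Move the App Service plan to the Standard S1 tier (placeholder names)
az appservice plan update \
  --name my-appservice-plan \
  --resource-group my-resource-group \
  --sku S1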

Scaling is quite easy to do in Azure; I prefer to do it from the portal (portal.azure.com).

Simply log in to the portal and open your App Service; you will then see the following options for scaling:

 
 

In the scale-up options, we can see three distinct categories: Dev/Test, Production, and Isolated.

Dev/Test

As the name suggests, this category is used for development and testing of applications with Azure App Service.

There are five separate pricing tiers in the Dev/Test category, as shown below:

 
 

F1 (free one), D1, B1, B2, and B3


 

The above illustration shows the resources as well as the tentative cost per month for each of these pricing tiers.

Another important point to notice is that we can see the included features and
hardware after selecting each tier, as depicted below:

Note

I am not including costs, as pricing and plans can change.

 
F1 Pricing Tier (Free Tier)

 No additional features
 Shared infrastructure, 1 GB memory / 1 GB storage
 60 minutes/day of compute

D1 Pricing Tier

 Custom domain feature
 Shared infrastructure, 1 GB memory
 240 minutes/day of compute

These service plans run your apps on the same Azure VMs as other apps, so some of those apps may belong to other customers.

There is no SLA provided for these plans, and they are metered on a per-app basis.

Basic Service Plan (B1, B2 and B3)


 

This plan is for low-traffic applications where we don’t need auto-scaling and traffic
management features. Built-in network load balancing support automatically distributes
traffic across instances.

B1 Pricing Tier

Included Features

 Custom domains / SSL
 Manual scaling up to 3 instances

Hardware

 100 total Azure Compute Units (ACU)
 1.75 GB memory and 10 GB disk storage
 A-series compute equivalent

B2 and B3 have the same features as B1; however, the hardware is different.
 

B2 Hardware

 200 total Azure Compute Units (ACU)
 3.5 GB memory and 10 GB disk storage
 A-series compute equivalent

B3 Hardware

 400 total Azure Compute Units (ACU)
 7 GB memory and 10 GB disk storage
 A-series compute equivalent

Production
 

As the name suggests, we use this category for the production workloads of our applications. There are two groups of tiers under this category: Standard and Premium.

The Standard tier includes S1, S2, and S3; the Premium tier includes P1V2, P2V2, P3V2, P1V3, P2V3, and P3V3.

All these tiers have custom domains/SSL, Auto Scale, Staging Slots, Daily backups, and
Traffic manager features.

Scale Out

In part one, we saw how to manually scale our app; here, we are going to see how to enable autoscale. Autoscale works on metrics such as disk, memory, or CPU usage going beyond a threshold: it automatically adds more instances when the threshold is exceeded. We can also schedule scaling for a specific time or day. Note that there are limits per plan, so you cannot scale beyond a certain number of instances.

We are back in the Scale out section of the app we created in part one, where we set three instances manually; you can see that autoscale is disabled by default.

When we enable autoscale, it creates a default autoscale condition. It is possible to have more than one condition, and you can rename the "Auto created scale condition" to a name of your choice. You also have to give a name to the autoscale setting itself, at the top. So, one autoscale setting may have multiple conditions.
 

When we click on "Add a rule", the following window pops up to define the rule.

Let us define a new rule to increase the instance count when CPU usage exceeds 80% on average over 10 minutes.

For that, I select the metric name CPU Percentage, the operator Greater than, a threshold of 80, and a duration of 10 minutes.

Here, the duration of 10 minutes means that every time autoscale runs, it queries the metrics for the past 10 minutes; this allows the metrics to stabilize and avoids reacting to transient spikes. Next, we set the operation to increase the count, the instance count to 3, and the cool down time to 5 minutes. The cool down time is the waiting period after a scale operation before scaling can start again.
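The same autoscale setting and rule can also be created with the Azure CLI. The commands below are a sketch with placeholder resource names and limits; adjust them to your own plan and resource group.

# Create an autoscale setting for the App Service plan (placeholder names)
az monitor autoscale create \
  --resource-group my-resource-group \
  --resource my-appservice-plan \
  --resource-type Microsoft.Web/serverfarms \
  --name my-autoscale-setting \
  --min-count 1 --max-count 3 --count 1

# Scale out by 1 instance when average CPU > 80% over 10 minutes,
# then wait 5 minutes (cool down) before the next scale action
az monitor autoscale rule create \
  --resource-group my-resource-group \
  --autoscale-name my-autoscale-setting \
  --condition "CpuPercentage > 80 avg 10m" \
  --scale out 1 \
  --cooldown 5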

This default scale condition always applies, irrespective of time, date, and other parameters, but if you want additional scale conditions, you have to add them yourself outside of the default setting. For example, if you want to run a fixed instance count for all weekends (Saturday and Sunday) without any metric condition, you can click 'Add a scale condition' and select 'Scale to a specific instance count' with a start and end time, instead of 'Scale based on a metric'.

It is also possible to schedule a scale for a specific date, such as a festival date like New Year's Eve. You can also define rules to scale with maximum and minimum instance counts.

So, it is possible to have a default scale condition, recurring scale conditions, and fixed-date scale conditions.
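As an illustration, a recurring weekend condition similar to the one described above can also be added from the Azure CLI. This is only a sketch; the profile name, instance count, times, and time zone are placeholders, and the exact parameters may differ between CLI versions.

# Run a fixed count of 4 instances on Saturdays and Sundays (placeholder values)
az monitor autoscale profile create \
  --resource-group my-resource-group \
  --autoscale-name my-autoscale-setting \
  --name weekend-profile \
  --count 4 \
  --recurrence week sat sun \
  --start 00:00 --end 23:59 \
  --timezone "UTC"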

 
4.6 Azure Domain DNS
Introduction

This article shows how to set up a custom domain name for an Azure VM using a DNS zone. A DNS zone in Azure is used to host the DNS records for a domain. To host our domain name in Azure DNS, we need to create a DNS zone.

Content

 Create a DNS Zone
 Customizing the Nameserver in GoDaddy
 Add a DNS record in Zone

Create a DNS Zone

Step 1

Log in to the Azure portal (https://portal.azure.com/#).

Step 2

Search for DNS zone in the Marketplace, select it, and click Create.

Step 3

Give your domain name as the DNS zone name, then select a subscription, resource group, and location.

In the DNS zone Overview, you can find the name servers, as shown in the figure below.
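If you prefer the Azure CLI over the portal, the zone can be created and its name servers listed as shown below; the domain and resource group names are placeholders.

# Create the DNS zone and list its Azure name servers (placeholder names)
az network dns zone create \
  --resource-group my-resource-group \
  --name mydomain.com

az network dns zone show \
  --resource-group my-resource-group \
  --name mydomain.com \
  --query nameServers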

Customizing the Nameserver in GoDaddy

Step 1

Log in to GoDaddy.

Step 2

Go to Domains and click on DNS, as shown in the figure below.

Step 3

In the Nameservers section, click Change, choose Custom from the dropdown, and add the name servers listed in the Azure DNS zone.
Add a DNS record in Zone

Step 1

Go back to the DNS zone in the Azure portal.

Step 2

Click on + Record set, as shown in the figure below.

 
 

Step 3

Give the name as www, choose the type A, and add an IP address (in my case, I added the IP of the VM created in my last article), as shown in the figure below.

DNS propagation usually takes some time; we can confirm the mapping using the nslookup command in PowerShell, as shown in the figure below.
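For reference, the same A record can be added from the Azure CLI and then verified with nslookup; the zone name and IP address below are placeholders.

# Add an A record "www" pointing to the VM's public IP (placeholder values)
az network dns record-set a add-record \
  --resource-group my-resource-group \
  --zone-name mydomain.com \
  --record-set-name www \
  --ipv4-address 20.50.10.10

# Verify the mapping once DNS propagation completes
nslookup www.mydomain.com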
I hope you have enjoyed this article. Your valuable feedback, questions, or comments
about this article are always welcome. 

5. Azure DevOps & Pipeline


5.1 Azure Repository GIT

5.2 Azure Pipeline API / .Net

5.3 Azure Pipeline Angular
