
Unit 1

Short answers:
Q1: What do you mean by web analytics?

Ans: Web analytics is the measurement and analysis of data to inform an understanding of user
behaviour across web pages.

Analytics platforms measure activity and behaviour on a website, for example: how many users visit,
how long they stay, how many pages they visit, which pages they visit, and whether they arrive by
following a link or not.

Businesses use web analytics platforms to measure and benchmark site performance and to look at
key performance indicators that drive their business, such as purchase conversion rate.

Q2: Why are web analytics so important?

Ans: There’s an old business adage that whatever is worth doing is worth measuring.

Website analytics provide insights and data that can be used to create a better user experience for
website visitors.

Understanding customer behaviour is also key to optimizing a website for key conversion metrics.

For example, web analytics will show you the most popular pages on your website, and the most
popular paths to purchase.

With website analytics, you can also accurately track the effectiveness of your online marketing
campaigns to help inform future efforts.

Q3: How do web analytics work?

Ans: Most analytics tools ‘tag’ their web pages by inserting a snippet of JavaScript in the web page’s
code.

Using this tag, the analytics tool counts each time the page gets a visitor or a click on a link. The tag
can also gather other information like device, browser and geographic location (via IP address).

Web analytics services may also use cookies to track individual sessions and to determine repeat
visits from the same browser.

Since some users delete cookies, and browsers have various restrictions around code snippets, no
analytics platform can claim full accuracy of their data and different tools sometimes produce slightly
different results.
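To make the tagging mechanism concrete, here is a minimal sketch (in TypeScript) of what a page tag might do when a page loads. It is not the code of any specific analytics vendor; the collection endpoint and parameter names are hypothetical.

```typescript
// Minimal sketch of a page-tagging snippet. The collection URL and parameter
// names are hypothetical; real tools such as Google Analytics use their own APIs.
function sendPageview(collectUrl: string): void {
  const hit = new URLSearchParams({
    page: window.location.pathname,             // which page was viewed
    referrer: document.referrer,                // where the visitor came from
    screen: `${screen.width}x${screen.height}`, // a simple device hint
    ts: Date.now().toString(),                  // time of the hit
  });
  // A 1x1 image request is the classic way tags report hits to the
  // analytics server without blocking the page.
  const beacon = new Image();
  beacon.src = `${collectUrl}?${hit.toString()}`;
}

sendPageview("https://analytics.example.com/collect");
```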
Q4: What are three ways of collecting web analytics?

Ans: There are three types of Web analytics metrics:

1. Count. These are the raw figures captured that will be used for analysis.
2. Ratio. This is an interpretation of the counted data.
3. KPI (key performance indicator). Either a count or a ratio, these are the figures that help
you to determine your success in reaching your goals.
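To show how the three relate in practice, here is a small sketch that turns two raw counts into a ratio and then treats that ratio as a KPI against a goal; the numbers are purely illustrative.

```typescript
// Counts: raw figures captured by the analytics tool (illustrative numbers).
const visits = 12_500;   // count
const orders = 300;      // count

// Ratio: an interpretation of the counted data.
const conversionRate = orders / visits;              // 0.024

// KPI: a count or ratio tied to a business goal, e.g. a 2% conversion target.
const conversionTarget = 0.02;
const goalMet = conversionRate >= conversionTarget;  // true
console.log({ conversionRate, goalMet });
```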

Q5: What Digital Channels Can You Measure?


Ans: While your ability to measure may be restricted by application architecture, security
requirements, infrastructure limitations, and privacy policies, most online marketing endeavours can
be tracked and measured in detail. The following are the digital channels that we have developed
and measured over the years: websites, microsites, search engines, email, e-Commerce, contests,
wireless devices, and banner advertising.

Q6: What Is the Best Way to Attribute an Offline Sale to An Online Assist?
Ans: Your problem is the primary key. So, use unique phone numbers (specific to campaigns if you
want granular details) … leverage unique coupon / campaign / offer codes… get good at geographic
targeting… become a God of controlled experiments.

Q7: Define Session.


Ans: A session is a group of interactions a user has with your website, and by default it lasts up to a
maximum of 30 minutes of inactivity. A new session is also created when the same user accesses the
website from a different machine. It is important to remember that a single session can include multiple web pages.
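As a rough illustration of how hits could be grouped into sessions using a 30-minute inactivity rule, here is a simplified sketch; the data shapes are hypothetical and real analytics tools apply additional rules (such as the new-device case above).

```typescript
interface Hit { userId: string; timestamp: number } // epoch milliseconds

const SESSION_TIMEOUT_MS = 30 * 60 * 1000; // 30 minutes of inactivity

// Groups one visitor's hits (already sorted by time) into sessions.
function splitIntoSessions(hits: Hit[]): Hit[][] {
  const sessions: Hit[][] = [];
  let current: Hit[] = [];
  for (const hit of hits) {
    const last = current[current.length - 1];
    if (last && hit.timestamp - last.timestamp > SESSION_TIMEOUT_MS) {
      sessions.push(current); // gap too long: the previous session ends
      current = [];
    }
    current.push(hit);
  }
  if (current.length > 0) sessions.push(current);
  return sessions;
}
```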

Q8: Define users


Ans: Users are the visitors who have completed at least one session within the given timeframe.
Generally, “users” refers to the unique visitors of the site.

Q9: What are the capabilities of Web Analytics?


Ans: Web analytics has many capabilities for analysing and presenting user data. We can analyse the
target audience, audience locations, audience behaviour, traffic sources, conversions, revenue,
devices, browser info, navigation paths, etc.

Q10: How does Web Analytics help in market research?


Ans: A successful business these days requires online promotion to thrive in the market. Web
analytics helps gauge the number of visitors to a website and the traffic on the site. From this we can
understand audience behaviour and make decisions to improve our business.

Long Answers
Q1: What is Web analytics? Define the purpose of web analytics.
Ans: As a general term, web analytics means the analysis of the relationship between
a website and the users of that website. However, in the field of web consultancy and
e-commerce, web analytics has a field/industry specific meaning, which we will be
covering in this section.

When e-commerce began to emerge as a significant industry, it came with its own set
of needs, requiring professionals who could measure the effectiveness of strategies,
assess the current state of affairs of a website, and understand the demographics
and demands of a website’s users.

Most of all, as the internet grew, so did what we now call Big Data. To meet these new
challenges and needs, a new field of expertise and knowledge evolved called Web
Analytics. A major part of this new domain of information technology was the software
used to measure and conduct web analytics.

This class of computer programs and tools was instrumental in monitoring the
relationship between websites and their users, the two poles in a dynamic relationship,
which is the basis of web analytics.

The purpose of web analytics

Web analytics is the activity that provides information about different aspects of a
website, as well as helping you answer questions related to its traffic and
overall performance.

To give you a very basic summation of what kind of information a web analyst deals
with, here are a few indicators or signals of a website that are meaningful and
significant for web analytics:

 What is the gross quantity of visitors on a website, or the traffic quantum?
 What is the number of unique visitors, or visitors who are new?
 The routes involved in bringing different categories of traffic to the website. E.g.,
how many arrive through search engine results, and how many get to the website
through online marketing?
 What terms are trending in the search facility on the website? What are users
searching for, and in what quantity?
 What category of users searches for particular keywords or search terms?
 What is the average, minimum and maximum time being spent on the website by
users?
 How many users are going beyond the main page, to deeper links?
 What are the links attracting the most second clicks beyond the homepage?
 What is the bounce rate? How many users arrive at the homepage and leave the
website?

All of this information has to do with understanding the user experience, optimizing it
and troubleshooting areas where a website is not delivering what it is intended to and is
not producing the kind of traffic, revenue or user activity that the site owner wishes to see.
This is the essence of web analytics: helping a website meet the demands of users
and the objectives set forth by the site owner.

As you can see, all the above questions require monitoring, collection, classification,
interpretation and, finally, analysis of vast amounts and types of data, usually raw
statistics, which need to be brought into a meaningful context.

That is a job for the web analyst. At this stage, we can define the web analyst as a highly internet-
literate individual who is required to contribute expertise in four major domains: the
creation of data entry procedures; ensuring the security systems of a website are
sufficient and effective; maintaining and managing the integrity of data; and lastly
(and perhaps most important) analyzing the data generated by the operations of the
website and its interaction with users.

Q2: Elaborate the history of web analytics.


Ans: When the internet began in 1991, web analytics were very basic and not possible in real time.
You might remember seeing one of those counters that read ‘You are the 15,614th visitor to our
website’ or something like that. The list of things we could measure, with some degree of accuracy,
was rather small. Web strategists in those days basically shot arrows in the dark and that is why what
we now call the ‘Internet Boom of the 90’s’, was really like a blind gold rush. If you somehow found
traffic, you struck gold and on the contrary, you could have valuable content and due to lack of the
‘right’ promotion and proper optimized visibility, your website could lead to financial ruin.

Then came 1993, bringing along with it the first step down the path of what IT professionals,
marketing people and web strategists together know as web analytics. The year 1993 brought
WebTrends to the world, the first web analysis tool, which shaped the concept of what we now call the
‘user’ and was designed to keep records and analyze the behaviour of users on the internet.

The first form of web analytics was perhaps basic log files that, at the time, had no marketing or
commercial value or context attached to them at all, and were just a resource possessed and
maintained by the IT departments of businesses or organizations. It is difficult to say when, where or
how, but somewhere down the line, marketing departments began to see the potential
that such log files held for understanding consumer behaviour. However, it goes without saying that
there was not much informational or in-depth research into what these log files contained at the time.

Nevertheless, the point is that the link had been made between commerce and the archiving of user
activity on the web. This was naturally a prize-winning find and a godsend for marketing and
research, instead of traditional market research, where the company literally has to hound the
consumer for information and be careful not to annoy potential customers. Here, with analysing user
behaviour on the internet, you basically had your consumers under a microscope, in a setting which
was, for the most part, controlled and composed of precisely adjustable variables. All consumer
activity was archived by default and there was no need for strategies to bait the consumer into giving
you some time and information, since the very presence of a user on the web generated data.

The next big development in the story of web analytics happened with the help of JavaScript and that
was a little something called page tagging. It is pretty simple and ingenious. Page tagging allowed an
analyst to find out if a user was a new guest or frequent visitor, where they came from and where
they headed to when they exited the site, what areas of the page they hovered over the most and
other similar examples. In essence, page tagging was sketching out the interaction between websites
and their visitors.

On a tangential note, insights of this nature had repercussions for website design and content
development in a major way as well. Marketing was not the only contender for the progression in web
analytics. Web analytics were starting to be understood as the internet looking back at itself – a
virtual mirror that could be customized to the needs of what the analyst was looking into. These early
advances in web analytics were indirectly responsible or at least contributed to giving rise to the user-
friendly and user-centric philosophy of development, which is making a comeback since Google’s
Panda and Penguin updates.
Around the mid-90s web analysis (or analytics) had created a niche for itself and placed itself on the
global R&D agenda, as far as big IT giants were concerned at least. Web analytics became the new
frontier for the big data revolution and a new toy for marketers – a marketing guru famous on
Madison Avenue is said to have remarked: “The net’s changed it all, we used to run after them
asking them their favourite colour, now they come to us and tell us where they were last night!”

Exactly around this time, in the summer of 1995, what was later to be known as Google Analytics,
came into the cyber world, in the form of a software by the name of URCHIN. 1995 is highly
underrated for its importance in the history of marketing, e-commerce, web analytics and the internet
as a whole. Although URCHIN would not be bought, revamped and rebranded as Google Analytics
until 2005, a decade later, history had already been made that summer.

URCHIN was made available free to anyone who owned a website and once injected into the cyber
bloodstream, it changed the World Wide Web for good, resulting in over 90% of page views being
tracked, archived and analysed, as opposed to only 30% of partial tracking and readings pre-
URCHIN.

The end of the 90s and the beginning of the 2000s saw the official rise of big data and the integration
of web analytics into the mainstream in four major industries: marketing, information technology,
online publishing and lastly and most importantly, e-commerce. What started out as a simple tally
counter to measure how many people set their cursors on a website became one of the most refined
and sophisticated practices, sought after by every business small or large.

The first design of Google Analytics, which replaced URCHIN, appeared in 2005.
Q3: What are keywords? How To Find The Best Keywords For Your Business?

Ans: Keywords are the words and phrases that people type into search engines to find what they’re

looking for. 
For example, if you were looking to buy a new jacket, you might type something like “mens leather
jacket” into Google. Even though that phrase consists of more than one word, it’s still a keyword.

Of course, nobody outside of the SEO industry tends to use this terminology. Most people would call
them Google searches or queries. Just know that keywords are synonymous with both of these
things.

So, you found a lot of great keywords, the question now is which keywords should you go after?
Of course, some keywords will be harder to rank for and others will be easier. Some keywords could
generate you more revenue even with less traffic.
So how can you determine the best keywords to go after?
There are 2 factors you should take into consideration when you’re deciding which keywords will be
the best to use for your business.
• Commercial Intent
• Competition

Commercial Intent

Keywords are generally classified into 2 types: informational and commercial keywords.
Informational keywords are keywords people search for to learn more about a given topic, or to
search for a quick solution to their problem.
They’re just searching for something to gather information, like “what is web hosting?”.
That doesn’t mean the searcher is ready to choose a hosting provider today and make a purchase.
On the other hand, commercial keywords are keywords prospects use when they have an intent to
buy a product. For example, “best web hosting company”, “content marketing services” or “best
premium digital marketing courses”, etc.
This isn’t an exact science, but when you’re in the Google Keyword Planner, you’ll
notice a great difference between the bid values of commercial keywords and informational keywords.
You could also apply the old copywriting AIDA formula here.
Here’s what I mean:
• Attention: These are informational keywords most of the time. Good for generating traffic and
letting people know about you and your company.
• Interest: When people start to have interest in a topic, they’ll start looking for comparisons, and see
which service to go after, and so on.
• Desire: After they decide on the service to use, they’ll have the desire to go with that service. They
may start looking for coupons, final reviews to make sure that they chose the right product.
• Action: After they have the desire, they’ll start taking action. They’ll start searching for how to buy
the product and so on.
You don’t need to always go with action keywords, but whenever you can, it’s better.
Keyword Competition:

After you’ve found a great commercially intent keyword, you now should analyse the competition on
the first page of Google’s SERPs.
The less competitive a keyword is, the easier it will be to rank for.
Here’s a step-by-step guide on what to do:
1. Go to the MozBar tool page and install the extension for the browser you use.
2. After that enter your search in Google.
3. Check the PA (Page Authority) of the page; the higher the number, the harder it’ll be to beat that
page.
4. After that check DA (Domain Authority), same as above.
5. Click on link analysis, and observe the links that point to this site.
6. After 20-30 seconds you can easily determine whether the site uses blackhat strategies or not; if it
does, the page won’t last there forever.
7. Check the titles of the results on the first page, and ask yourself, are they optimised for that
keyword? Do they have keywords at the beginning or at the end of the title?
8. After that, open the page in the result, click on the magnifying glass icon in the MozBar and choose page
elements. You’ll find information about the page’s URL, H tags and image alt text.
9. If the keyword is included in the URL, H1/H2 tag, and image alt text, then consider this page a well-
optimised page. If not, this page could be easy to beat.
10. After that, check the quality of the content. If it’s a long informative post that cites research and
educational institutions, it’ll be hard to beat this page.
11. The easiest targets are pages with a PA of less than 10, such as eHow articles, Ezine articles, eBay,
wordpress.com or blogspot pages. This is a sign that you should target this keyword.

Q4: What Is Keyword Analysis? Explain the Importance of Keyword Analysis.

Ans: Keyword analysis is the process of analyzing the keywords or search phrases that bring visitors
to your website through organic and paid search. As such, keyword analysis is the starting point and
cornerstone of search marketing campaigns.

By understanding what queries qualified visitors to your website type into search engines, search
marketers can better customize their content and landing pages to drive more traffic and increase
conversion rates. For this reason, keyword analysis is an important skill for both SEO and PPC
experts.
Keyword analysis helps to increase conversions, find new markets, and optimize spend, but it
requires time-consuming examination and decision making to beat your keyword competition. The
WordStream Keyword Analysis Tool takes the analysis of your website keywords a step further by not
only analyzing your keywords, but also suggesting actions and automating your activity for the best
efficiency and results.

Do you really want to look at spreadsheets and graphs for hours a day every day? And after that,
what comes next? WordStream eliminates this time waste, streamlining the process of analyzing
keywords, highlighting the vital marketing performance metrics, and prioritizing actions to greatly
improve your efficiency while simultaneously improving your PPC performance.

The Importance of Keyword Analysis


Marketing is inherently analytic. Field-testing marketing outreach and marketing performance is key to
optimizing budget allocation and market reach. Search marketing is no different, and since keywords
dictate your entire search campaign, keyword analysis should be your primary focus. Analyzing
keywords allows you to:
 Optimize Spend: Distribute more budget to successful keywords and eliminate wasteful
spending on those that aren’t producing results
 Increase Conversions: Identifying and focusing on well-converting keywords is good
for conversion rate optimization and return on investment (ROI)
 Eye Trends: Knowledge of keyword search frequency provides insight into market behaviour
which you can apply to multiple aspects of your business
 Prioritize Your Time: Keyword performance guides campaign importance–spend your time
optimizing areas that have the biggest impact on your bottom line
 Find New Markets: Use keyword analysis to expand your long tail efforts and discover more
specific keyword queries and corresponding warm leads
Despite all the benefits, most search marketers don’t spend nearly enough time on keyword analysis
because it’s time-consuming and repetitive. With keyword software from WordStream, analyzing
keywords and capitalizing on data is automated and simplified, making you more productive.
Q5: What are the two main categories of web analytics?
Ans: The two main categories of web analytics are off-site web analytics and on-site web analytics.

Off-site web analytics

The term off-site web analytics refers to the practice of monitoring visitor activity outside of an
organization's website to measure potential audience. Off-site web analytics provides an industrywide
analysis that gives insight into how a business is performing in comparison to competitors. It refers to
the type of analytics that focuses on data collected from across the web, such as social
media, search engines and forums.

On-site web analytics

On-site web analytics refers to a narrower focus that uses analytics to track the activity of visitors to a
specific site to see how the site is performing. The data gathered is usually more relevant to a site's
owner and can include details on site engagement, such as what content is most popular. Two
technological approaches to on-site web analytics include log file analysis and page tagging.

Log file analysis, also known as log management, is the process of analyzing data gathered from log
files to monitor, troubleshoot and report on the performance of a website. Log files hold records of
virtually every action taken on a network server, such as a web server, email server, database server
or file server.
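As a small illustration of the log file analysis approach, the sketch below counts pageviews per URL from web-server access-log lines in the Common Log Format; the sample lines are made up and the counting logic is deliberately simplified.

```typescript
// Sketch: counting pageviews per URL from Common Log Format access-log lines.
const logLines = [
  '203.0.113.7 - - [10/Oct/2023:13:55:36 +0000] "GET /pricing HTTP/1.1" 200 2326',
  '203.0.113.9 - - [10/Oct/2023:13:56:01 +0000] "GET /pricing HTTP/1.1" 200 2326',
  '198.51.100.4 - - [10/Oct/2023:13:57:12 +0000] "GET /blog/post-1 HTTP/1.1" 200 5120',
];

const pageviews = new Map<string, number>();
const requestPattern = /"(?:GET|POST) (\S+) HTTP/; // extracts the requested path

for (const line of logLines) {
  const match = requestPattern.exec(line);
  if (match) {
    const path = match[1];
    pageviews.set(path, (pageviews.get(path) ?? 0) + 1);
  }
}
console.log(pageviews); // Map { '/pricing' => 2, '/blog/post-1' => 1 }
```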

Page tagging is the process of adding snippets of code into a website's HyperText Markup Language
code using a tag management system to track website visitors and their interactions across the
website. These snippets of code are called tags. When businesses add these tags to a website, they
can be used to track any number of metrics, such as the number of pages viewed, the number of
unique visitors and the number of specific products viewed.

Q6: Explain the process of Web analytics.

Ans: The web analytics process involves the following steps:

1. Setting goals. The first step in the web analytics process is for businesses to determine
goals and the end results they are trying to achieve. These goals can include increased
sales, customer satisfaction and brand awareness. Business goals can be both quantitative
and qualitative.

2. Collecting data. The second step in web analytics is the collection and storage of data.
Businesses can collect data directly from a website or web analytics tool, such as Google
Analytics. The data mainly comes from Hypertext Transfer Protocol requests -- including
data at the network and application levels -- and can be combined with external data to
interpret web usage. For example, a user's Internet Protocol address is typically associated
with many factors, including geographic location and clickthrough rates.

3. Processing data. The next stage of the web analytics funnel involves businesses
processing the collected data into actionable information.

4. Identifying key performance indicators (KPIs). In web analytics, a KPI is a quantifiable
measure to monitor and analyze user behaviour on a website. Examples include bounce
rates, unique users, user sessions and on-site search queries.
5. Developing a strategy. This stage involves implementing insights to formulate strategies
that align with an organization's goals. For example, search queries conducted on-site can
help an organization develop a content strategy based on what users are searching for on
its website.

6. Experimenting and testing. Businesses need to experiment with different strategies in
order to find the one that yields the best results. For example, A/B testing is a simple
strategy to help learn how an audience responds to different content. The process involves
creating two or more versions of content and then displaying it to different audience
segments to reveal which version of the content performs better.
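As a simple illustration of the A/B testing idea, the sketch below compares the conversion rates of two content variants; the visitor and conversion counts are purely illustrative, and a real test would also check statistical significance before declaring a winner.

```typescript
// Sketch of evaluating a simple A/B test: which content variant converts better?
interface Variant { name: string; visitors: number; conversions: number }

const variants: Variant[] = [
  { name: "A (original headline)", visitors: 5000, conversions: 110 },
  { name: "B (new headline)",      visitors: 5000, conversions: 145 },
];

for (const v of variants) {
  const rate = v.conversions / v.visitors;
  console.log(`${v.name}: ${(rate * 100).toFixed(2)}% conversion rate`);
}
```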

Q7: Explain conversion Metrics. How to Measure it?


Ans: A conversion metric examines how effective you are at converting your online audience into
paying customers. Google defines a conversion as what happens when someone clicks your ad and
then takes an action that you’ve defined as valuable. The conversion measure itself will vary
depending on the type of business you operate. For instance, an ecommerce company will want to
measure online purchases, whereas a SaaS business will want to measure the number of trial sign-ups.
The example image for the online conversion metric shows how a software company may want to
track this metric.
Page Visits and Completion Rate

Digital marketers are typically hyper-aware of online conversion paths on the website they manage.
For many businesses, these conversion paths map to product, feature, and shopping cart pages.
Therefore, measuring traffic to these pages is a strong indicator of the number of people within the
purchase funnel that are showing consideration or intent to purchase. Likewise, tracking on-page
completion rates corroborates this indicator and demonstrates the effectiveness of the website at
converting visitors into leads.
Customer statistics

To complete the picture of the purchase funnel, digital marketers need to track new and returning
customers over time. Again, the example given above is representative of a software company, but
the same logic applies to other business types. Here are a few key stats to consider tracking:

 New Customers: First-time buyers.
 Returning Customers: Buyers that have made at least 1 previous purchase.
 Average Revenue per Buyer: Total revenue for a period divided by the total number of buyers.
 Total Transactions: The total number of purchases made during a period, regardless of whether the
purchase was made by a new or a returning customer.
 Total Revenue: The amount of revenue generated during a period from all customer types.

Conversions vary from business to business, depending on the enterprise’s goals. They can be
anything you want your visitors to do. While some conversions are purchase-related, others are
geared towards pushing the prospect further into the funnel.
Examples of conversions include;
 Filling out a form
 Making a purchase
 Signing up for a newsletter
 Reaching a certain point in the content 
 Downloading a resource; eBook, whitepaper
 Attending a webinar
 Visiting a specific webpage
Conversion Metrics Brands Should be Measuring
Brands and businesses can track almost anything as a conversion; however, a few carry more weight
than others.
1. Lead Inquiries
A lead inquiry can be anything that demonstrates a visitor’s interest in your products or services. This
can come as a form submission, phone call, pricing request, or contact form submission.
2. Click-Through Rate (CTR)
Whether you’re running paid ads or thriving solely on organic traffic, the CTR of your content plays a
key role in driving revenue. CTR is calculated by dividing the number of actual clicks on your ad or
SERP listing by the number of impressions. CTR is a great barometer for gauging how well your ad
copy, headlines, title tags, and meta descriptions are performing.
3. Sales & Revenue
If users can purchase products on your site or perform any type of transaction then measuring sales
is a must. Tracking sales by source, channel, and campaign allows you to adjust marketing efforts to
the most profitable channels quickly.
4. Cost Per Acquisition (CPA)
CPA is the expense you incur to acquire a new customer. You calculate the cost per acquisition by
dividing the total marketing costs by the number of customers acquired.
CPA allows you to gain insight into the profitability of your marketing and business as a whole. If
your CPA is higher than the revenue the new customers bring in, then you may be on a path to losses.
However, if new customers are bringing in more than the cost of acquiring them, then you are on a
path to profitability.
5. Return on Ad Spend (ROAS)
Return on Ad Spend (ROAS) allows marketers to determine if they will get the money they spend on
advertising back. It shows the most and least financially effective advertising campaigns so that you
can adjust your marketing budget accordingly.
A positive ROAS means that you’re making more in revenue or conversion value than you’re spending
(e.g. for every $1 spent on ad spend you make >$2). A negative ROAS indicates a marketing channel
that is in need of attention or should have its budget shifted elsewhere. You calculate ROAS using the
formula: (Revenue − Cost) / Cost.
6. Time on Site
Time on site can sometimes be a misleading metric. If your content is designed to quickly funnel
users through a conversion funnel, or to directly solve their problem on a single page, a user’s time on
site could be quite low. Time on site is often referred to as average session duration and measures
how long visitors spend on your site per visit.
7. Interactions Per Visit
Interactions per visit measures engagement actions users are taking within your site. Even if visitors
are not converting, they’re taking steps towards it with product views, add to carts, link clicks, and
more. Measuring these interactions allows you to better understand user behaviour and how to
improve site conversion.
8. Value Per Visit
Value per visit is a calculation of the monetary value you’re getting from your overall site traffic. It’s
tied directly to the previous metric, interactions per visit. It’s calculated by dividing the total value
created by the total number of visits. If your site is eCommerce focused this is a little easier to track;
however, if your site is primarily focused on lead gen you’ll need to have a monetary value
assigned to each goal within Google Analytics.
9. Conversion Rate
Conversion rate is the percentage of visitors that complete a defined conversion action on your site. This
is a top-level metric that tells you how well your site is turning visitors into leads or customers. The
overall average conversion rate is around 2%; however, depending on your industry, conversion rate
averages could be as low as <1% or as high as 10%.
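To tie several of these definitions together, the sketch below computes CTR, CPA, ROAS (using the formula given above), conversion rate and value per visit from one set of purely illustrative campaign figures.

```typescript
// Illustrative figures for one campaign; formulas follow the definitions above.
const impressions = 40_000;
const clicks = 1_200;
const newCustomers = 60;
const marketingCost = 3_000;   // total spend for the period
const revenue = 9_000;         // revenue attributed to the campaign
const visits = 1_200;
const conversions = 60;

const ctr = clicks / impressions;                        // Click-Through Rate
const cpa = marketingCost / newCustomers;                // Cost Per Acquisition
const roas = (revenue - marketingCost) / marketingCost;  // ROAS, per the formula given above
const conversionRate = conversions / visits;             // defined conversion actions per visit
const valuePerVisit = revenue / visits;                  // monetary value per visit

console.log({ ctr, cpa, roas, conversionRate, valuePerVisit });
```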

Q8: Explain various advantages and disadvantages of web analytics.


Ans: There are several advantages and limitations of web analytics; here we look at the top 5
benefits and limitations. By being aware of them, organizations can take action to
leverage the advantages and modify their way of working to overcome the limitations.

Advantages

 Web analytics helps an organization make better decisions
A lot of the time, decisions within organizations are made on gut feel rather than facts and
data. One of the reasons for this could be a lack of access to quality data that could help with
better decision making. Analytics can help transform the data that is available into
valuable information for executives so that better decisions can be made. This can be a source
of competitive advantage if fewer poor decisions are made, since poor decisions can have a
negative impact on a number of areas including company growth and profitability.
 Increase the efficiency of the work
Analytics can help analyse large amounts of data quickly and display it in a structured manner
to help achieve specific organizational goals. It encourages a culture of efficiency and
teamwork by allowing managers to share the insights from the analytics results with the
employees. The gaps and improvement areas within a company become evident, and actions
can be taken to increase the overall efficiency of the workplace, thereby increasing productivity.
 Analytics keeps you updated on changes in customer behaviour
In today’s world, customers have a lot of choices. If organizations are not tuned to customer
desires and expectations, they can soon find themselves in a downward spiral. Customers
tend to change their minds as they are continuously exposed to new information in this era of
digitization. With vast amounts of customer data, it is practically impossible for organizations to
make sense of all the changes in customer perception data without using the power of
analytics. Analytics gives you insight into how your target market thinks and whether there is any
change. Hence, being aware of shifts in customer behaviour can provide a decisive advantage
to companies so that they can react faster to market changes.
 Personalization of products and services
Gone are the days where a company could sell a standard set of products and services to
customers. Customers crave products and services that can meet their individual needs.
Analytics can help companies keep track of what kind of service, product, or content is
preferred by the customer and then show the recommendations based on their preferences.
For example, in social media, we usually see what we like to see, all of this is made possible
due to the data collection and analytics that companies do. Data analytics can help provide
targeted services to customers based on their individual requirements.
 Improving quality of products and services
Data analytics can help with enhancing the user experience by detecting and correcting errors
or avoiding non-value-added tasks. For example, self-learning systems can use data to
understand the way customers are interacting with the tools and make appropriate changes to
improve user experience. In addition, data analytics can help with automated data cleansing
and improving the quality of data and consecutively benefiting both customers and
organizations.

Limitations

 Lack of alignment within teams
There is often a lack of alignment between different teams or departments within an organization.
Data analytics may be done by a select set of team members, and the analysis may be
shared with a limited set of executives. However, the insights generated by these teams are
either of little value or have limited impact on organizational metrics. This could be
due to a “silos” way of working, with each team only using their existing processes,
disconnected from other departments. The analytics team should be focussed on answering
the right questions for the business, and the results generated by data analytics teams need to
be properly communicated to the right employees to drive the right set of actions and
behaviours so that they can have a positive impact on the organization.
 Lack of commitment and patience
Analytics solutions are not difficult to implement; however, they are costly, and the ROI is not
immediate. Especially if existing data is not available, it may take time to put processes and
procedures in place to start collecting the data. By nature, analytics models improve in
accuracy over time and require dedication to implement the solution. Since the business users
do not see results immediately, they sometimes lose interest, which results in loss of trust and
the models fail. When an organization decides to implement data analytics methods, there
needs to be a feedback loop and mechanism in place to understand what is working and what
is not, and corrective actions are required to fix things that are broken. Without this closed-loop
system, senior management may decide that analytics is not working or not very valuable and
may abandon the entire exercise.
 Low quality of data
One of the biggest limitations of data analytics is lack of access to quality data. It is possible
that companies already have access to a lot of data, but the question is: do they have the right
data that they need? A top-down approach is required, where the business questions that need
to be answered are known first, and the data required to answer these questions
can then be determined. In some cases, data that was collected for historical reasons
may not be suitable to answer the questions that we ask today. At other times, even though we
have the right metrics that we are collecting data on, the quality of the data collection may be
poor. There can be instances where adequate data is not available or is missing for proper
analytics to be done. As they say, garbage in, garbage out. If the data quality is poor, the
decisions made by using this data are also going to be poor. Hence, actions must be taken to fix
the quality of the data before it can be effectively used within organizations.
 Privacy concerns
Sometimes, data collection might breach the privacy of the customers as their information such
as purchases, online transactions, and subscriptions are available to companies whose
services they are using. Some companies might exchange those datasets with other
companies for mutual benefit. Certain data collected can also be used against a person,
country, or community. Organizations need to be cautious of what sort of data they are
collecting from customers and ensure the security and confidentiality of the data. Only the data
required for the analysis needs to be captured and if there is sensitive data, it needs to be
anonymized so that sensitive data is protected. Data breaches can cause customers to lose
trust in the organizations which may result in a negative impact on the organization.
 Complexity & Bias
Some of the analytics tools developed by companies are more like a black-box model. What is
inside the black box is not clear, and the logic the system uses to learn from data and create a
model is not readily evident – for example, a neural network model that learns from various
scenarios to decide who should be given a loan and who should be rejected. The usage of
these tools may be easy, but the logic of how decisions are made is not clear to anyone within
the company. If companies are not careful and a poor-quality data set is used to train the
model, there may be hidden biases in the decisions made by these systems which may not be
readily evident, and organizations may be breaking the law by discriminating on the basis of
race, gender, age, etc.

MCQs

1. The development of the ______ led to the development of web analytics.
a) Social media
b) Internet
c) Google Analytics
d) Insights
2. The first forms of analytics tools were built to monitor ______.
a) Website links
b) User behaviour
c) Social media
d) Interaction and conversions

3. Which of these is not an indicator or signal of a website that is meaningful and
significant for web analytics?
a) The gross quantity of visitors on a website or traffic quantum
b) The number of unique visitors or visitors who are new
c) The number of social media followers
d) The category of users that searches for particular keywords or search terms

4. Which of these is an indicator or signal of a website that is meaningful and
significant for web analytics?
a) The number of users who signed up for a newsletter
b) The number of links you have on the home page
c) The number of links you post in blog articles
d) The number of users who go beyond the main page, to deeper links

5. The first web analytics tool was called ______.
a) Google Ads
b) Google Analytics
c) WebTrends
d) Cookies

6. The first analytics tools shifted their focus to ______ in order to monitor the
performance of websites.
a) User behaviour
b) Search engine ads
c) Search engine results
d) External links

7. The first form of web analytics was ______.
a) Spreadsheets
b) Log files
c) Infographics
d) Tables

8. JavaScript enables ______.
a) Log files
b) Analytics
c) Page tagging
d) HTML

9. Page tagging enabled web analysts to find more information about the users, such as ______.
a) If a user was a new guest or frequent visitor
b) User behaviour
c) Analytics
d) Social media tags

10. The first free analytics software was introduced in 1995 under the name of ______.
a) Google Analytics
b) URCHIN
c) HORSE
d) Bing Analytics

11. In order to avoid the analysis being a bunch of data without any meaning, you have to set up ______.
a) Google AdWords
b) Goals and objectives
c) Social media accounts
d) Traffic and conversions

12. Which of these is not a data segment?
a) Acquisition
b) Behaviour
c) Outcomes
d) Banners

13. Web analytics helps you with different tasks, such as understanding
the audience, which will help you find out ______.
a) How much time users spend on the website
b) Which pages are the most and the least visited
c) Which websites refer the most traffic to your pages
d) What can be done to increase conversions

14. Web analytics helps you with different tasks, such as determining the
strengths and weaknesses of your website, which will help you find out ______.
a) How much time users spend on the website
b) Which pages are the most and the least visited
c) Which websites refer the most traffic to your pages
d) What can be done to increase conversions

15. Two major components of any web analytics process are:
a) Website and search engines
b) Analytics software and the analyst
c) Google Analytics and Bing Analytics
d) Users and conversions

16. The ‘Who’ aspect of web analytics helps you find out:
a) What kind of traffic your website attracts
b) How long the users stayed on your website
c) The data about referrals
d) The data about which keyword or phrase brought users to your website

17. The ‘Where’ aspect of web analytics helps you find out:
a) What kind of traffic your website attracts
b) How long the users stayed on your website
c) The data about referrals
d) The data about which keyword or phrase brought users to your website

18. Metrics are ______.
a) Also known as impressions
b) Indices used to gauge the performance of the website
c) The time the user took to browse a website in one go
d) The number of views of one particular page
Answers
1. b
2. b
3. c
4. d
5. c
6. a
7. b
8. c
9. a
10. b
11. b
12. d
13. a
14. b
15. b
16. a
17. c
18. b
Unit-2
Short Answers
Q1: How are social networking websites useful?

Ans: Social networking websites help people meet and discuss topics or issues online. They
provide a place for people to interact and share their views, photographs and videos.

Q2 : In what way does Web 1.0 differ from Web 2.0?

Ans: Web 1.0 belonged to the 1990s, when the internet or web was a place to read articles,
listen to music and look for information.
Web 2.0 of the present day is a term for social networking websites where people meet and
share their views, videos and photographs. It is a lot more interactive than Web 1.0.

Q3: Mention two special features of Facebook.

Ans: Facebook is the biggest social networking service on the internet, with over two billion
active users. Besides offering everything that other social networking websites provide, it has a
mini chat-box to privately chat with one’s friends.

Q4: Name two popular social networking websites that were launched after MySpace. How did
they score over it?

Ans: Social networking websites that came after MySpace helped people create and share what
they loved, including music and much more. Twitter and Facebook are two of them.

Q5: Find the word from the lines above that means the same as ‘a thing or person that gives you
new ideas and the enthusiasm to create something with them’.

Ans: Inspiration


Q6: Write True or False:

(a) SixDegrees was a social networking website for students – False
(b) LinkedIn is a website for professional networking – True
(b) LinkedIn is a website for professional networking – True
(c) Orkut was the most popular social networking site in the US – False
(d) Facebook has over a billion users – True
(e) There is a constant debate about the benefits and dangers of using social networking
websites – True
(f) MySpace took the world by storm with its focus on sports – False

Q7 What are cookies on the internet?


Ans: Cookies are files that hold information about you, your web browser and your behaviour on
the internet. They are tiny files stored on your PC or device, which can be used by websites or
web apps to tailor your online experience.
Q8: What do cookies do?
Ans: Cookies are sent between a sender (usually a website or a web app) and a receiver (your
device). A cookie is created and interpreted by the sender, while the receiver only holds it and
sends it back if the sender asks for it.
When browsing the web, the sender is the server on which a website runs, and the receiver is the
web browser of the user who visits that website. Their purpose is to identify the user, check for
his or her past activity on the website and provide appropriate content based on this data.
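As a rough, browser-side sketch of that idea, the snippet below writes a first-party cookie on a first visit and recognises it on later visits; the cookie name and expiry are hypothetical, and real analytics cookies carry more information.

```typescript
// Sketch: a first-party cookie used to recognise repeat visits from the same
// browser. The cookie name "visitor_id" is hypothetical.
function getCookie(name: string): string | undefined {
  return document.cookie
    .split("; ")
    .find((part) => part.startsWith(name + "="))
    ?.split("=")[1];
}

let visitorId = getCookie("visitor_id");
if (!visitorId) {
  visitorId = crypto.randomUUID();               // first visit: create an ID
  const oneYear = 60 * 60 * 24 * 365;
  document.cookie = `visitor_id=${visitorId}; max-age=${oneYear}; path=/`;
} else {
  console.log("Returning visitor:", visitorId);  // repeat visit recognised
}
```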

Long Answers:
Q1: What is social network perspective?
Ans: The social network perspective emphasizes multiple levels of analysis. Differences among
actors are traced to the constraints and opportunities that arise from how they are embedded in
networks; the structure and behaviour of networks is grounded in, and enacted by, local interactions
among actors.

Differences among individuals in how connected they are can be extremely consequential for
understanding their attributes and behaviour . More connections often mean that individuals are
exposed to more, and more diverse, information. Highly connected individuals may be more
influential, and may be more influenced by others. Differences among whole populations in how
connected they are can be quite consequential as well. Disease and rumors spread more quickly
where there are high rates of connection. But so too does useful information. More connected
populations may be better able to mobilize their resources, and may be better able to bring
multiple and diverse perspectives to bear to solve problems. In between the individual and the
whole population, there is another level of analysis -- that of "composition." Some populations
may be composed of individuals who are all pretty much alike in the extent to which they are
connected. Other populations may display sharp differences, with a small elite of central and
highly connected persons, and larger masses of persons with fewer connections. Differences in
connections can tell us a good bit about the stratification order of social groups.  A great deal of
recent work by Duncan Watts, Doug White and many others outside of the social sciences is
focusing on the consequences of variation in the degree of connection of actors.

Because most individuals are not usually connected directly to most other individuals in a
population, it can be quite important to go beyond simply examining the immediate connections
of actors, and the overall density of direct connections in populations. The second major (but
closely related) set of approaches that we will examine in this chapter have to do with the idea of
the distance between actors (or, conversely how close they are to one another). Some actors
may be able to reach most other members of the population with little effort: they tell their friends,
who tell their friends, and "everyone" knows. Other actors may have difficulty being heard. They
may tell people, but the people they tell are not well connected, and the message doesn't go far.
Thinking about it the other way around, if all of my friends have one another as friends, my
network is fairly limited -- even though I may have quite a few friends. But, if my friends have
many non-overlapping connections, the range of my connection is expanded. If individuals differ
in their closeness to other actors, then the possibility of stratification along this dimension arises.
Indeed, one major difference among "social classes" is not so much in the number of connections
that actors have, but in whether these connections overlap and "constrain" or extend outward and
provide "opportunity." Populations as a whole, then, can also differ in how close actors are to
other actors, on the average. Such differences may help us to understand diffusion,
homogeneity, solidarity, and other differences in macro properties of social groups.

Q2: what do you mean by Social Networks and Social Ties?

Ans: Social Networks


You already know what a social network is. Are you familiar with Facebook? What about
Instagram, Pinterest, LinkedIn or Twitter? Those are all Internet-based social networks. A social
network is a structure of relationships that links people, or groups of people, together. The
network is simply the structure, or the vehicle. Think of the social network as the overall
organization of the relationships. The network is the basic tool people use to connect to society.
Information and ideas flow between people via the network.
Note that it doesn't matter how the people are linked. People can connect and socialize using a
variety of different social networks, such as those based on live interaction, letter writing or
Internet correspondence.
The network, or structure, connects various actors. Actors are individuals or organizations linked
together through a social network. Actors are also sometimes called nodes. Your Facebook
friends and the people you follow on Twitter are actors, or nodes, in your social network.
The information passed between the actors, or nodes, is known as a narrative. Typically, there
are many narratives being passed within a network at any given time. For example, let's say I
post something on Facebook. I post a picture of me at the beach, on vacation. I write, 'Having a
great time at the beach!' That's my narrative to all of my Facebook friends. I'm narrating my
perspective, or the information I want you to have. At the same time I'm narrating, many other
people in the network are also narrating their information.
What's important to remember is that narratives aren't always objectively true. Narratives
represent just one author's story. Maybe I'm having a terrible time, or maybe I'm not at the beach
at all!

Social Ties
The narratives travel along ties. These are social connections, or links, between the actors. You'll
typically find many ties in a network. Ties represent the relationships between the actors and
range in quality from weak to strong. For example, I barely know my neighbor John. I hardly ever
see him. He's an acquaintance more than a friend. That's a weak tie. However, I still talk to my
mom on the phone at least weekly. That's a close family bond and a strong tie.
If you're familiar with LinkedIn, it provides a great example. You can see the strength of your ties
right there on the screen. The network identifies people who are connected directly to you as 'first
connections.' Friends of friends are identified as 'secondary connections,' and so on. The ties get
weaker the further they progress.
Here's another example: I live in Dallas, Texas, very near where several people were diagnosed
with the Ebola virus. Once 'Patient Zero' tested positive for Ebola, professionals immediately
began researching his social network in order to determine who else might be carrying the virus.
First connections were placed into quarantine. Secondary connections were monitored but not
quarantined. People who came into contact with secondary connections were notified and asked
to self-monitor for symptoms. You can see how the restrictions weakened as the ties to Patient
Zero weakened.

Q3: what are the nodes and edges in Social Networks? Explain.
Ans: Nodes and Edges

Up until now, I have referred to both actors and relationships. In network science, actors are
referred to as nodes (the dots on the graph) and relationships as edges (the lines on the graph).
You will see me use this terminology throughout the rest of this article.

Nodes can represent a variety of ‘actors’. In internet networks, nodes can represent web pages. In
social networks, nodes can represent people. In supply chain networks, nodes can represent
organizations. In foreign relations networks, nodes can represent countries. While nodes can
represent a variety of things, they are all the thing that has a relationship with another thing.

Edges can represent a variety of ‘relationships’. In internet networks, edges can represent
hyperlinks. In social networks, edges can represent connections. In supply chain networks, edges
can represent the transfer of goods. In foreign relations networks, edges can represent policies.
Like nodes, edges can represent a variety of things.

Nodes and edges are a key concept in networks, so make sure you have a good understanding of
them before tackling the other concepts.

Q4: Explain the terms Edge Direction, Edge Weight, Centrality and Degree of nodes.

Ans: Edge Direction
There are two types of edges: directed and undirected. It will be necessary to decipher what type
of edge your data contains when building a network graph.
Directed edges are applied from one node to another with a starting node and an ending node.
For example, when a twitter user tags another twitter user in a tweet, that relationship is directed.
The user who wrote the tweet (starting node) applied that relationship to the user who they tagged
(ending node). The tagged user has not necessarily reciprocated that relationship. Another
example of a directed edge is payments. If a customer (starting node) pays a coffee shop
(ending node) for a coffee, that relationship is not necessarily reciprocated because the coffee
shop has not also paid the customer.

Undirected edges are the opposite of directed edges. These relationships are reciprocated by
both parties without a clear starting node and ending node. For example, if two people are friends
on Facebook, that relationship is undirected. This is because it can be said that Person A is
friends with Person B, but it can also be said that Person B is friends with Person A. Another
example of an undirected edge is Meetup groups. This is because it can be said that Person A is
in a group with Person B, but it can also be said that Person B is in a group with Person A.
Edge Weight
An Edge’s weight is the number of times that edge appears between two specific nodes. For
example, if Person A buys a coffee from a coffee shop 3 times, the edge connecting Person A and
the coffee shop will have a weight of 3. However, if Person B only buys coffee from the coffee
shop once, the edge connecting Person B and the coffee shop will have a weight of 1.
Centrality Measures
Centrality is a collection of metrics used to quantify how important and influential a specific node is
to the network as a whole. It is important to remember that centrality measures are used on
specific nodes within the network, and do not provide information on a network level. There are
several centrality measures, but this guide will cover degree, closeness, and betweenness.
Degree
A node’s degree is the number of edges the node has. In an undirected network (see the edge
direction section), there is only one measure for degree. For example, if node A has edges
connecting it to Node B and Node D, then node A’s degree is 2.
However, in a directed network, there are actually three different degree measures. Because
these edges have a starting and an ending node, the in-degree (the number of edges the node is an end
node of), the out-degree (the number of edges a node is a starting node of), and the degree (the number of edges
a node is either a starting node or an end node of) can be calculated.
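To illustrate these degree measures, here is a small sketch that computes in-degree, out-degree and total degree for nodes of a tiny directed network; the edge list is made up for illustration.

```typescript
// Sketch: in-degree, out-degree and total degree for a small directed network.
type Edge = [string, string]; // [startingNode, endingNode]

const edges: Edge[] = [
  ["A", "B"],
  ["A", "D"],
  ["C", "A"],
  ["D", "B"],
];

const inDegree = new Map<string, number>();
const outDegree = new Map<string, number>();

for (const [from, to] of edges) {
  outDegree.set(from, (outDegree.get(from) ?? 0) + 1); // edge starts at `from`
  inDegree.set(to, (inDegree.get(to) ?? 0) + 1);       // edge ends at `to`
}

const degreeOf = (node: string) =>
  (inDegree.get(node) ?? 0) + (outDegree.get(node) ?? 0);

console.log(outDegree.get("A")); // 2  (A -> B, A -> D)
console.log(inDegree.get("A"));  // 1  (C -> A)
console.log(degreeOf("A"));      // 3
```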

Q5: What are Web Beacons? Explain their uses.

Ans: Web Beacons are minute objects packaged with HTML-formatted web pages, or tiny clear
graphics files, that are specifically designed to track your online activities. They keep a record of
all your navigation activities across various websites.
Also known as web bugs, these are mainly deployed by third-party websites for monitoring their
website traffic and other tracking services. Tech experts also point to an underlying connection
between Web Beacons and cookies: both are used to track users’ online activities to
understand their navigation patterns and tailor the website content accordingly.
A Web Beacon is made up of a clear file, usually a 1×1 pixel, that can track users similar to a
cookie.  A web beacon is a technique used on web pages and emails to unobtrusively check that
a user has accessed some content.
Web beacons are used to help the website owner track the journey of the user navigating through
the website or a series of websites. They can be delivered through a web browser or in an email.
They can be used in conjunction with cookies to understand the user, their behaviour , and how
they interact with the content on the website.
Web beacons are also known as web bugs, and they can help improve the experience for users when
companies analyze the information they gather. They’re used when monitoring online ad
impressions, understanding user behaviour, and tracking the success of ad campaigns. It
is important for website owners who have web beacons to be transparent about how they use the
beacon and what information is collected from the user.

Let’s take a look at an example to understand this better. 


Say, for instance, a company owns several websites and wants to understand its users’
behaviour. Here the company can use a web beacon to track the navigation pattern of its users
among its various websites. The company can then analyze the collected data to improve
browsing across its network, making it more user-friendly and efficient.
Different uses of Web Bugs or Web Beacons
Web Beacons can be used in multiple ways by companies to analyze the information they collect.
Here are some of the common areas where web bugs are used.

 They are used worldwide to understand users’ behaviour.
 They also help in tracking the effectiveness of an ad campaign and give an approximate
idea of its success rate.
 Many companies also use them to monitor and track the impressions of their online ads.
 Web beacons are also widely used by social media platforms to analyze what content is
being shared on their platforms by third-party websites.

To summarize, the main use of web bugs is to monitor customers’ behaviour and preferences.

Are Web Beacons Used To Monitor Web Traffic?

A web beacon is a transparent image file used to monitor your journey around a single website or
collection of sites. Web beacons are also referred to as web bugs and are commonly used by sites
that hire third-party services to monitor traffic. They may be used in association with cookies to
understand how visitors interact with the pages and content of a website.

For example, a company owning a network of sites may use web beacons to count and
recognise users travelling around its network. Being able to recognise you enables the site owner
to personalise your visit and make it more user friendly.
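To make the mechanics concrete, here is a small, hypothetical sketch of the server side of a tracking pixel, written in Python with only the standard library. The /pixel.gif path, the port, and the logged fields are illustrative assumptions, not any particular vendor’s implementation.

import base64
from http.server import BaseHTTPRequestHandler, HTTPServer

# A commonly used 1x1 transparent GIF payload, base64-encoded.
PIXEL = base64.b64decode(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
)

class BeaconHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/pixel.gif"):
            # Log whatever the request reveals: IP address, user agent,
            # referring page, and any query-string parameters.
            print("beacon hit:", self.client_address[0],
                  self.headers.get("User-Agent"),
                  self.headers.get("Referer"), self.path)
            self.send_response(200)
            self.send_header("Content-Type", "image/gif")
            self.send_header("Content-Length", str(len(PIXEL)))
            self.end_headers()
            self.wfile.write(PIXEL)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), BeaconHandler).serve_forever()

A page or email would then reference the invisible image (for example with an img tag pointing at /pixel.gif?campaign=spring-sale), and every load of that image shows up in the log above.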

Q6: What is a Lead? Explain different types of Leads available for a Business. How can we
generate them?

Ans: A lead is any person who indicates interest in a company's product or service in some way,
shape, or form.

Leads typically hear from a business or organization after opening communication (by submitting
personal information for an offer, trial, or subscription) … instead of getting a random cold call
from someone who purchased their contact information.
Let's say you take an online survey to learn more about how to take care of your car. A day or so
later, you receive an email from the auto company that created the survey about how they could
help you take care of your car. This process would be far less intrusive than if they'd just called
you out of the blue with no knowledge of whether you even care about car maintenance, right?
This is what it's like to be a lead.
And from a business perspective, the information the auto company collects about you from your
survey responses helps them personalize that opening communication to address your existing
problems — and not waste time calling leads who aren't at all interested in auto services.
There are different types of leads based on how they are qualified and what lifecycle stage
they're in.
Marketing Qualified Lead (MQL)

Marketing qualified leads are contacts who've engaged with your marketing team's efforts but
aren't ready to receive a sales call. An example of an MQL is a contact who fills out a landing
page form for an offer (like in our lead generation process scenario below).
Sales Qualified Lead (SQL)

Sales qualified leads are contacts who've taken actions that expressly indicate their interest in
becoming a paying customer. An example of an SQL is a contact who fills out a form to ask a
question about your product or service.
Product Qualified Lead (PQL)

Product qualified leads are contacts who've used your product and taken actions that indicate
interest in becoming a paying customer. PQLs typically exist for companies who offer a product
trial or a free or limited version of their product (like HubSpot!) with options to upgrade, which
is where your sales team comes in. An example of a PQL is a customer who uses your free
version but engages or asks about features that are only available upon payment.
Service Qualified Lead

Service qualified leads are contacts or customers who've indicated to your service team that
they're interested in becoming a paying customer. An example of a service qualified lead is a
customer who tells their customer service representative that they'd like to upgrade their product
subscription; at this time, the customer service representative would up-level this customer to the
appropriate sales team or representative.

Lead generation 

Lead generation is the process of attracting prospects to your business and increasing their
interest through nurturing, all with the end goal of converting them into a customer. Some ways to
generate leads are through job applications, blog posts, coupons, live events, and online content.

These lead generators are just a few examples of lead generation strategies you can use to
attract potential customers and guide them towards your offers.

Whenever someone outside the marketing world asks me what I do, I can't simply say, "I create
content for lead generation." It'd be totally lost on them, and I'd get some really confused looks.
So instead, I say, "I work on finding unique ways to attract people to my business. I want to
provide them with enough goodies to get them naturally interested in my company so they
eventually warm up to the brand enough to want to hear from us!"

That usually resonates better, and that's exactly what lead generation is: It's a way of warming
up potential customers to your business and getting them on the path to eventually making a
purchase.
Why do you need lead generation?

When a stranger initiates a relationship with you by showing an organic interest in your business,
the transition from stranger to customer is much more natural.
Lead generation falls within the second stage of the inbound marketing methodology. It
occurs after you've attracted an audience and are ready to convert those visitors into leads for
your sales team (namely sales-qualified leads).

As you can see in the diagram below, generating leads is a fundamental point in an individual's
journey to becoming a delighted customer.

Lead Generation Process


Now that we understand how lead generation fits into the inbound marketing methodology,
let's walk through the steps of the lead generation process.

1. First, a visitor discovers your business through one of your marketing channels, such as
your website, blog, or social media page.

2. That visitor then clicks on your call-to-action (CTA) — an image, button, or message that
encourages website visitors to take some sort of action.
3. That CTA takes your visitor to a landing page, which is a web page that is designed to
capture lead information in exchange for an offer.
(Note: An offer is the content or something of value that's being "offered" on the landing
page, like an ebook, a course, or a template. The offer must have enough perceived value
to a visitor for them to provide their personal information in exchange for access to it.)
4. Once on the landing page, your visitor fills out a form in exchange for the offer. (Forms are
typically hosted on landing pages, although they can technically be embedded anywhere
on your site.) Voila! You have a new lead. That is, as long as you’re following lead-capture
form best practices (a minimal capture-handler sketch follows this list).
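The form-capture step above boils down to a small handler that validates the submission and stores the new lead. Below is a minimal sketch in Python; the field names (email, name, offer), the in-memory list standing in for a CRM, and the default “MQL” lifecycle stage are all illustrative assumptions rather than any specific marketing platform’s behaviour.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Lead:
    email: str
    name: str
    offer: str                       # which landing-page offer was exchanged
    captured_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    lifecycle_stage: str = "MQL"     # starts life as a marketing qualified lead

LEADS = []                           # stand-in for a CRM or database

def capture_lead(form_data: dict) -> Lead:
    """Validate a landing-page form submission and store it as a lead."""
    email = form_data.get("email", "").strip().lower()
    if "@" not in email:
        raise ValueError("a valid email address is required")
    lead = Lead(email=email,
                name=form_data.get("name", "").strip(),
                offer=form_data.get("offer", "unknown"))
    LEADS.append(lead)
    return lead

# Example submission coming from the landing-page form:
print(capture_lead({"email": "visitor@example.com",
                    "name": "Sam Visitor",
                    "offer": "ebook-car-care"}))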

Q7: What is Competitive Intelligence Analysis? Why and How to Choose?


Ans: We already have a lot of data at our disposal in the world of Web Insights. Clickstream,
Outcomes, Surveys, Usability etc. That in itself is hard to wrap our brains around.

But there is one more really great source of data out there that I am sure some of you are already
tapping into, though perhaps not as many as could benefit from it. I am of course talking about the
data that falls broadly into the category of “competitive intelligence”.

The most compelling reason to tap into competitive intelligence is that in many ways it allows you
to step away from your silo, an existence that is pretty much defined by your web analytics tool,
your data warehouse, or all things connected to your website. In optimizing our website,
experience, and expenses based on just our own website data, we might not be optimizing for the
overall business landscape.

Why should you care?

 Revenue for your e-commerce business, selling DVD rentals, is up 30% year over
year (YOY). Cause for celebration? Maybe or maybe not, it would depend on how the
overall DVD rental business is doing on the web. If it is up YOY by 100% then that is
not so great.
 You have a cute pet mascot and the TV ads for dog food you run have increased
traffic by 100% in two months. What is the impact of that, your ads, on your chief
competitor?
 After spending half a million dollars on an SEO (search engine optimization) project
over six months, you have increased your traffic from your top five key phrases by 20%
(that is huge). If the number of people searching is the same, is that increase at the
expense of your affiliates or competitors, or due to the fact that in those six months
traffic on the web in your category increased by yyy%?
 An industry rag reports that your competitor has been eating your breakfast, lunch and
dinner. You look bad. To save your job you need to find out if your competitor has
started new kinds of campaigns (say affiliate), started advertising on websites you
don’t know of, massively ramped up PPC spending, or targeted a certain new
demographic.
 You are looking to provide true business strategic insights in a culture where web
analytics suffers its life as a “reporting” function. Competitive intelligence provides an
option to get ahead of the business with some game changing actionable insights.
These are very simple scenarios, but in each of them competitive intelligence is key to providing
the answers that you need. It helps you understand your performance in the context of the
greater web eco-system and allows you to better understand causality due to “eco-system
trends” vs. your actions (or lack thereof).

It is pretty easy to celebrate the success (or sometimes failure) of our websites based on just our
own numbers (Omniture, ClickTracks, WebTrends, HBX, etc.), but true delight comes from knowing
how you are doing vis-à-vis your competitors or the industry as a whole.

What options are out there?

A search of the phrase web competitive intelligence yields 45 million results. I am sure there
are that many ways to get competitive intelligence. : ) The focus of this post will be on two of the
“big boys” in this space: ComScore and HitWise.

[A poor man’s option for basic competitive data is Alexa. I use it on my blog goal tracking page.
Alexa collects its data via folks who install its toolbar, and the data is extremely basic; for this
reason I am not including it.]

HitWise and ComScore are radically different services, as we’ll outline below, and you should be
very careful in your choice and ensure that you are choosing the right one. Whether your valuable
invested dollars pay off depends on choosing the right service for your company, and of course
then diving in and playing James Bond. : )

How do they capture data?

At a summary level HitWise is “ISP based” and ComScore is “Panel based” in terms of data
capture (vastly different ways of collecting data).

HitWise has agreements with ISPs worldwide whereby the ISPs share the anonymous weblog
data collected on their networks with HitWise. This data is analyzed by HitWise. They also
combine this data with a worldwide opt-in panel to get demographic and lifestyle information.

ComScore on the other hand has a panel of people who opt in to be 100% monitored as they surf
the web (by ComScore installing monitoring software on their Panel Member’s computers and
then funneling 100% of the surfing via their proxy servers). In exchange for being monitored the
Panel Member gets one (or combo) of these benefits: 

 Server-based virus protection
 Attractive sweepstakes prizes
 Opportunity to impact and improve the Internet

Q8: How structured data helps in SEO?


Ans: Structured data is important for SEO because it helps search engines find and understand
your content and website. It’s also an important way to prepare for the future of search, as
Google and other engines continue to personalize the user experience and answer questions
directly on their SERPs.
Google’s SERPs weren’t always as easy on the eye as they are today. Don’t remember? Check
out this Google result for “pool tables” from 2008.

Let’s compare. Here’s the same result from today.


Wow. That’s a world of difference. Not only are these results easier to read, but the extra features
make for a much more informative, intelligent searching — and shopping — experience. Between
the sponsored content and live map (plus the product carousel, question snippets, and related
searches not shown in the screenshot), Google provides pretty much everything I need to know
about pool tables.

Heck, sometimes I search for something and find the answer right on the SERP — I don’t even
have to click on a result. Does that ever happen to you? If it has, you can thank structured data.

How does structured data work?

At this point, you might be asking: How can there exist a language (markup) that is consistently
recognized by search engines and people alike?

In order for this markup to be accurately and universally understood, there are standardized
formats and vocabularies that should be used.

Let’s go back to basics for a minute. When conveying information, whether you’re communicating
with a human or a computer, you need two main things: vocabulary (a set of words with known
meanings) and syntax (a set of rules on how to use those words to convey meaning).
Most terminology surrounding structured data markup can be organized into these two concepts
— vocabularies and syntaxes — and webmasters can combine whichever vocabulary and syntax
they need to structure their data (with the exception of Microformats, which combine both).
Vocabularies: Schema.org, DCMI, FOAF
Syntaxes: Microdata, JSON-LD, RDFa

Okay … that’s enough of the fancy developer speak. What should you be using for your
structured data?
Schema.org is the accepted universal vocabulary standard for structured data. It was founded
and is currently sponsored by Google, Bing, Yahoo, and Yandex. It’s flexible, open-sourced, and
constantly updated and improved.
Note: Schema is called such because it features markup for a wide variety of schemas — or
data models — for different types of content.
Here’s an example of Schema Markup language (which is good for SEO) pulled from my article
on branding.
"@context" : "http://schema.org",
"@type" : "Article",
"name" : "The Ultimate Guide to Branding in 2019"
"author" : {
"@type" : "Person",
"name" : "Allie Decker"
},
"datePublished" : "2019-04-02",
"image" : "https://blog.hubspot.com/hubfs/branding-2.jpg",
"url" : "https://blog.hubspot.com/marketing/branding",
"publisher" : {
"@type" : "Organization",
"name" : "HubSpot"
As for syntax, there’s no single correct answer. Google recommends JSON-LD (and defaults to that
syntax when using its Structured Data Markup Helper — as you see below). JSON-LD uses
JavaScript object notation embedded in a script tag to describe your content, which typically
makes for a simpler development process.

Google also recognizes Microdata and RDFa. Both of these syntaxes use HTML to identify
properties within structured data. Microdata is typically only used in the page body, whereas
RDFa is commonly used in both the page head and body.

On the other hand, JSON-LD is typically placed in the page head, meaning that for certain types of
markup you don't have to navigate subheaders, supporting copy, and related styling that's
included in the page's HTML. This is why JSON-LD is considered simpler
than the other two.
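If your pages are generated programmatically, the JSON-LD block can simply be serialized from a dictionary and dropped into the page head. A minimal sketch in Python, reusing the Article fields from the example above (json is the only dependency):

import json

article = {
    "@context": "http://schema.org",
    "@type": "Article",
    "name": "The Ultimate Guide to Branding in 2019",
    "author": {"@type": "Person", "name": "Allie Decker"},
    "datePublished": "2019-04-02",
    "image": "https://blog.hubspot.com/hubfs/branding-2.jpg",
    "url": "https://blog.hubspot.com/marketing/branding",
    "publisher": {"@type": "Organization", "name": "HubSpot"},
}

script_tag = ('<script type="application/ld+json">\n'
              + json.dumps(article, indent=2)
              + '\n</script>')
print(script_tag)  # paste or template this into the page <head>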

Ultimately, it all depends on the data you're trying to implement, what the benefit is to your
website, and what would be easier to share with your team.

Examples of Structured Data

To the average internet user, structured data can’t be seen. It’s hidden among the code that
makes up our favorite websites and online platforms. So, how does structured data affect what
we (and our customers) see? What does it look like to the “naked” eye?

When webmasters adhere to structured data standards, search engines like Google and Bing
reward their websites and organizations by featuring their content in a variety of SERP features
(another reason to use structured data).

Let’s talk about those features — specifically on Google. Google SERPs display a wide variety
of information, but the ones we talk about below are specifically influenced by structured data.

There are also a couple of ways that structured data can benefit your non-SERP marketing
efforts on social media and email marketing.

Q9: How to identify unique visitors for a website in Google Analytics?

Ans: Identifying unique visitors for a website is a good web analytics practice. It helps you in
many ways. Website owners or marketers can use the data to design or adjust marketing
strategies based on the results. By analyzing unique visitors, website owners can understand
user flow, behaviour, web content response/engagement, sales attrition and product
performance. It is also one of the best ways of measuring the popularity of a website. But the real
question is: what is the best way to identify unique visitors for a website? I am going to share
some steps by which you can find out the analytical meaning of unique visitors, their importance
in web analytics and ways to identify them. Before that, let’s understand what a unique visitor is.

Unique visitors

As the name suggests, unique visitors are people who visit your website or blog for the first time.
Analytics tools such as Google Analytics, Bing Analytics, Yandex and other tracking tools use a
visitor’s IP address, browser cookies, registration ID and user agent to identify a unique visitor.
These are called identifiers. As soon as a visitor lands on your website or blog, these
analytics tools track the data and record it within the panel. Every user generates sessions, which
helps website owners understand user behaviour.

Please note that a single user can create multiple sessions. While navigating the website or
blog, users explore multiple pages, which creates multiple pageviews within those sessions.
Here is how you can analyze page sessions in Google Analytics.

Identify unique visits in Google Analytics

In order to identify unique visits in Google Analytics tool you can follow given below steps.

1. Open Google Analytics


2. Click on Audience from the left hand side menu
3. Choose Overview
4. Set the date range from when till when you want the data.

That’s how you can get an overview of new visitors on your website. However, for a deeper
understanding of behaviour, you can explore the Audience section in the Google Analytics tool.
All you need to do is click on “Behaviour” and choose “New vs. Returning” under the Audience
report menu. You will get the data, and you can adjust the dates to shrink or broaden it.
If you explore the Audience report further, you can do a better analysis of user behaviour.

You can also identify unique visitors by doing a regular traffic check under Acquisition. All you
need to do is click on “Acquisition” in the left-hand menu, choose “All Traffic” and select
“Source/Medium”. In the report you will see a column named “New Users”.
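If you would rather pull these numbers programmatically than through the web interface, the Universal Analytics Reporting API v4 exposes the same figure as the ga:newUsers metric. A rough sketch is below; it assumes the google-api-python-client and google-auth packages, plus a service-account key file and a view ID (both placeholders here).

from google.oauth2 import service_account
from googleapiclient.discovery import build

KEY_FILE = "service-account.json"   # placeholder credentials file
VIEW_ID = "123456789"               # placeholder GA view ID

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analyticsreporting", "v4", credentials=credentials)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": VIEW_ID,
        "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:newUsers"},
                    {"expression": "ga:users"}],
        "dimensions": [{"name": "ga:sourceMedium"}],
    }]
}).execute()

# Print new users vs. all users per source/medium, mirroring the report above.
for row in response["reports"][0]["data"].get("rows", []):
    source_medium = row["dimensions"][0]
    new_users, users = row["metrics"][0]["values"]
    print(f"{source_medium}: {new_users} new users / {users} users")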

Benefits of analyzing unique visitors

As stated above, analyzing unique visitors helps us understand user behaviour and user flow.
By collecting the data, you can design your marketing campaigns and get maximum exposure by
adjusting or re-designing marketing strategies according to the data. A few major benefits of
analyzing unique visits are:

1. It helps us in designing re-marketing campaigns.


2. Identify the attrition rate.
3. Measure website popularity.
4. It can help us in understanding user behaviour/engagement.
5. You can enhance the performance of product/services page by identifying new visits.


Q10: What Are Cookies? Explain its types. What Are Cookies Used For?

Ans: Cookies are text files with small pieces of data — like a username and password — that
are used to identify your computer as you use a computer network. Specific cookies known as
HTTP cookies are used to identify specific users and improve your web browsing experience.

Data stored in a cookie is created by the server upon your connection. This data is labeled with
an ID unique to you and your computer.

When the cookie is exchanged between your computer and the network server, the server reads
the ID and knows what information to specifically serve to you.

Different types of cookies - Magic Cookies and HTTP Cookies

 Magic Cookies
 HTTP Cookies

Cookies generally function the same but have been applied to different use cases:

"Magic cookies" are an old computing term that refers to packets of information that are sent
and received without changes. Commonly, this would be used for a login to computer database
systems, such as a business internal network. This concept predates the modern “cookie” we use
today.
HTTP cookies are a repurposed version of the “magic cookie” built for internet browsing. Web
browser programmer Lou Montulli used the “magic cookie” as inspiration in 1994. He recreated
this concept for browsers when he helped an online shopping store fix their overloaded servers.
The HTTP cookie is what we currently use to manage our online experiences. It is also what
some malicious people can use to spy on your online activity and steal your personal info.
HTTP cookies, or internet cookies, are built specifically for Internet web browsers to track,
personalize, and save information about each user’s session. A “session” just refers to the time
you spend on a site.

Cookies are created to identify you when you visit a new website. The web server — which
stores the website’s data — sends a short stream of identifying info to your web browser.

Browser cookies are identified and read by “name-value” pairs. These tell cookies where to be
sent and what data to recall.

The server only sends the cookie when it wants the web browser to save it. If you’re wondering
“where are cookies stored,” it’s simple: your web browser will store it locally to remember the
“name-value pair” that identifies you.

If a user returns to that site in the future, the web browser returns that data to the web server in
the form of a cookie. This is when your browser will send it back to the server to recall data from
your previous sessions.

To put it simply, cookies are a bit like getting a ticket for a coat check (a small code sketch of this exchange follows the analogy):

 You hand over your “coat” to the cloak desk. In this case, a pocket of data is linked to
you on the website server when you connect. This data can be your personal account,
your shopping cart, or even just what pages you’ve visited.
 You get a “ticket” to identify you as the “coat” owner. The cookie for the website is
given to you and stored in your web browser. It has a unique ID especially for you.
 If you leave and return, you can get the “coat” with your “ticket”. Your browser gives
the website your cookie. It reads the unique ID in the cookie to assemble your activity data
and recall your visit just as you left it.
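In HTTP terms, the coat-check “ticket” above is just the Set-Cookie and Cookie headers carrying a name-value pair. Here is a small sketch using only Python’s standard library; the cookie name visitor_id and the port are arbitrary examples.

import uuid
from http import cookies
from http.server import BaseHTTPRequestHandler, HTTPServer

class CookieDemo(BaseHTTPRequestHandler):
    def do_GET(self):
        # Parse whatever cookies the browser sent back with this request.
        jar = cookies.SimpleCookie(self.headers.get("Cookie", ""))
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")

        if "visitor_id" in jar:
            # Returning visitor: the browser handed its "ticket" back.
            body = f"welcome back, visitor {jar['visitor_id'].value}\n"
        else:
            # New visitor: issue a "ticket" via the Set-Cookie header.
            visitor_id = uuid.uuid4().hex
            jar["visitor_id"] = visitor_id
            jar["visitor_id"]["max-age"] = 60 * 60 * 24 * 365  # one year
            self.send_header("Set-Cookie", jar["visitor_id"].OutputString())
            body = f"hello, new visitor {visitor_id}\n"

        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8001), CookieDemo).serve_forever()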
What Are Cookies Used For?

Websites use HTTP cookies to streamline your web experiences. Without cookies, you’d have to
log in again after you leave a site or rebuild your shopping cart if you accidentally close the page.
This makes cookies an important part of the internet experience.

Based on this, you’ll want to understand why they’re worth keeping — and when they’re not.

Here’s how cookies are intended to be used:

1. Session management. For example, cookies let websites recognize users and recall their
individual login information and preferences, such as sports news versus politics.
2. Personalization. Customized advertising is the main way cookies are used to personalize
your sessions. You may view certain items or parts of a site, and cookies use this data to
help build targeted ads that you might enjoy.
3. Tracking. Shopping sites use cookies to track items users previously viewed, allowing the
sites to suggest other goods they might like and keep items in shopping carts while they
continue shopping.
While this is mostly for your benefit, web developers get a lot out of this set-up as well.

Cookies are stored on your device locally to free up storage space on a website’s servers. In
turn, websites can personalize while saving money on server maintenance and storage costs.

MCQ’s

1. We get list of sites after typing a word in search bar called _______.
a) Key Phrase
b) Single Word
c) word
d) buffering
Ans. a
 
2. The search results are generally presented in a line of results often referred to as _______
a) Category List
b) Tag list
c) search Engine Results Pages
d) Search Engine Pages
Ans. c
 
3. Search engines maintain a heavy database of keywords and URLs.
a) True
b) False
Ans. a
 
4. Web search engines stores information about many web pages by a ______.
a) Web indexer
b) Web Crawler
c) Web Organizer
d) Web Router
Ans. b
 
5. Web Crawler is also called as ______.
a) Link Directory
b) Web Spider
c) web Manager
d) Search Optimizer
Ans. b
 
6. Search engine optimization is the process of _______ of a website or a web page in a
search engine’s search results.
a) Affecting the visibility
b) None of these
c) Generating Cached files
d) Getting Meta Tags
Ans. a
 
7. SEO is all about optimizing a web site for search engines.
a) True
b) False
Ans. a
 
8. SEO is to improve the volume and ______ to a web site from search engines.
a) Look and feel       
b) Advertisement
c) Quality of traffic    
d) None of these
Ans. c
 
9. SEO can be called as art of ranking in the search engines.
a) True
b) False
Ans. a
 
10. A keyword search engines:
a) allows all users to change its content
b) returns a list of sites that have been reviewed by humans
c) returns a list of sites based on the search terms you enter
d) searches a variety of other search engines
Ans. c
 
11. A subject-oriented search engine:
a) allows all users to change its content
b) returns a list of sites based on the list of search terms you enter
c) returns a list of sites that have been reviewed by humans
d) searches a variety of other search engines
Ans. c
 
12.  A meta search engine:
a) returns a list of sites that have been reviewed by humans
b) allows all users to change its content
c) returns a list of sites based on the list of search terms you enter
d) searches a variety of other search engines
Ans. d
 
13. Compared to subject directories, search engines:
a) return fewer hits
b) return better-sorted hits
c) return hits that are reviewed by humans
d) return more hits
Ans. d
 
14. If you have a rough idea of your search in terms of what type of category it might fall into,
it is best to use a:
a) meta search engine
b) search engines such a Google
c) Formula for the keywords you enter
d) subject directory
Ans. d
 
15. SERPs stands for _______
a) Search engine results pages
b) Search engine heetson pages
c) Search engine real pages
d) Search engine results point
Ans. a
 
16. Search engines also maintain _____ information by running an algorithm on a web
crawler.
a) real time    
b) Google
c) www           
d) document
Ans. a
 
17. The Yandex search engine belongs to which country?
a) China
b) Russia
c) USA
d) Canada
Ans. b
 
18. In which year Yahoo search engine launched?
a) 1995
b) 1996
c) 1997
d) 1998
Ans. a
 
19. The most used search engine on the internet is _____
a) Archie
b) Google
c) Yandex
d) Bing
Ans. b
 
20. What is Alta Vista?
a) Hardware
b) Email
c) Search engine
d) Website
Ans. c

21. Search engine are used to ______


a) Software system that is designed to search for information on the world wide web
b) Search videos
c) search documents
d) All of these
Ans. d

22. Which of the following search engine is most popular in China?


a) Google       
b) Yahoo
c) Bing           
d) Baidu
Ans. d

23. Google was originally developed in 1997 by ________, Sergey Brin, and Scott Hassan
a) Larry Page            
b) Mark Zuckerberg
c) Mike Tyson
d) Alan Turing
Ans. a

24. Search Engines are able to search ______ type of information.


a) Documents           
b) Videos
c) Images                   
d) All of these
Ans. d

25. SEO stands for _______.


a) Search Engine Optimization     
b) Search Entry Optimization
c) Search education Optimization   
d) None of these
Ans. a

26. To search for information in Google, you would:


a) enter the search terms in the search box
b) browse the topic categories
c) pay for a search subscription
d) send a request to the search librarian
Ans. a

27. To search more effectively, you should:


a) use a meta search engine
b) enclose your search terms in parentheses
c) use multiple words in your search to help narrow the results
d) use only one word to find sufficient results
Ans. c

28. If you enter the phrase “Cookie Recipes” with quotation marks in a search engine, you
will get results of all pages with:
a) the word cookie and the word Recipes in them
b) the word Cookie without Recipes
c) the word Cookie or the word Recipes in them
d) the words Cookie Recipes on the page together in order
Ans. d

29. Clicking on links from untrusted search engines might take an innocent user to ______.
(a) video sharing websites
(b) audio streaming web sites
(c) social media websites
(d) hackers web site which extracts login ID and password
Ans. d

30. Google is an example of _____


a) Search Engine    
b) Entertainment
c) Social Network     
d) None of these
Ans. a

31. Internet content that is not capable of being searched by a web search engine is
generally described as the______
a) Deep web
b) Dark web
c) Hidden web
d) secret web
Ans. a

32. The first tool used for searching content on the Internet was _____
a) Archie
b) Google
c) Yahoo
d) Bing
Ans. a

33. _______ is a Search engine of Microsoft.


a) Bing
b) Alta Vista
c) Google
d) portal
Ans. a

34. We can search information in Google by the use of _______.


a) Image
b) Voice
c) Text
d) All of these
Ans. d
Unit-3
Short Answers
Q1: What are basic web metrics?
Ans: Website metrics are a variety of measurements made on a given website in order to better
track its performance and statistics.

Q2: How to Measure Website Traffic?


1. Head to Google Analytics.
2. Click Audience in the left navigation.
3. Select Overview.
4. Check the Sessions metric. Here's how Google describes this number: “The period of time
a user is active on your site or app."

Q3: What are hits and page views?


Ans: Hits measure activity from the website server's perspective. Page Views are a much
better measurement of a website's activity. Page Views are a measure of how humans, not
computers, are interacting with your website. The reason there are two measures is due to the
way web pages are written.
Q4: What are hits social media?
Ans: The term “hit” is perhaps the most misused term in online marketing, mistakenly used to
mean unique visitors, visits, page views, or all of the above. A hit is merely a request for a file
from a Web server. A request for a Web page counts as a hit, but so does a request for a
graphic on a Web page.
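The distinction is easy to see in a raw server log: every requested file is a hit, but only requests for pages count as page views. A simplified Python sketch (the log lines and the "page" rule are made up for illustration):

# Every request in the log is a hit; only page requests are page views.
log_lines = [
    "GET /index.html HTTP/1.1",
    "GET /css/site.css HTTP/1.1",
    "GET /img/logo.png HTTP/1.1",
    "GET /products.html HTTP/1.1",
    "GET /img/banner.jpg HTTP/1.1",
]

PAGE_SUFFIXES = (".html", ".htm", "/")   # simplified notion of a "page"

hits = len(log_lines)
page_views = sum(1 for line in log_lines
                 if line.split()[1].endswith(PAGE_SUFFIXES))
print(f"{hits} hits, {page_views} page views")   # 5 hits, 2 page views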
Q5: What is a good number of page views?
Ans: Depending on the type of website you have, the type of advertising you do, and the type of
user on your site, the average number of pageviews per session can range from 1.2 to 10. For
ecommerce sites, 5-10 pageviews per session is a reasonable figure.

Q6: What Is Google Analytics And Main Purpose Of Google Analytics?

Ans: Google Analytics is web analytics software used to track the traffic on a website. The
major purpose of analytics is to analyse information about the site and make decisions to
improve the site’s traffic and revenue.
Q7. What Is Meant By Conversions And How Will You Track Conversions Through Ga?
Ans: Conversions happen when any predefined goals are accomplished, thereby generating
ROI for the business. In other words, if the user takes any desired action on the site, it is
considered a conversion, for example filling in a form or purchasing a product. We use Goals
in Analytics to set up conversion tracking.

Q9. What Are Events In Google Analytics?


Ans: Events are user interactions with content that can be tracked independently from a web
page or a screen load. We can create custom events to track downloads, play-button clicks,
AJAX loads, etc.

Q10. What Are Goals And How Many Goals Can We Create In Analytics?
Ans: A goal defines a completed user activity, called a conversion, that contributes to the
success of your business. We can have only 20 goals per reporting view.

Q11. What Is Benchmarking?


Ans: Benchmarking helps us compare our data with aggregated market data from other
companies in the relevant industry who share their data anonymously.

Q12: What is a random graph?


Ans: Formally, when we are given a graph G and we say it is a random graph, we are wrong.
A given graph is fixed; there is nothing random about it. What we mean through this abuse of
terminology is that the graph was sampled out of a set of graphs according to a probability
distribution. For instance, consider the three possible graphs on vertex set [3] = {1, 2, 3} with
2 edges. If the probability distribution is uniform, each graph has the same probability 1/3 of
being sampled.
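A quick way to make the idea concrete is to sample from such a distribution in code. The sketch below assumes the Python networkx library; nx.gnm_random_graph draws uniformly from the graphs with a given number of vertices and edges, so each of the three graphs above should appear roughly a third of the time.

from collections import Counter
import networkx as nx

counts = Counter()
for _ in range(30_000):
    g = nx.gnm_random_graph(n=3, m=2)     # uniform over 3-vertex, 2-edge graphs
    counts[tuple(sorted(g.edges()))] += 1

for edge_set, count in sorted(counts.items()):
    print(edge_set, round(count / 30_000, 3))   # each close to 0.333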
Long Answers
Q1: Explain Page Views and Unique Page Views. Differentiate between them. How does a
pageview differ from a unique pageview?

Ans: Pageview

A pageview is “an instance of a page being loaded in a browser.” To put it simply, when a user
views one page on a site, it counts as a pageview. If the user reloads the page or goes to another
page and then returns, it is another pageview. For example, if the same user loads the same
page 10 times, it generates ten pageviews. 

Pageviews can offer an indication of how popular a web page or blog post is. However, you need
to consider other factors to determine whether a certain page is actually popular. Even though a
page might have lots of pageviews, it doesn’t necessarily mean the content resonates with your
readers. It might simply be a matter of good SEO that generates lots of hits for a specific page. In
order to dig deeper, you’ll also need to analyse the bounce rate to see whether the content is
being read, and is therefore useful for readers.

Page Views VS Unique Page Views

Pageviews are defined as the total number of times the piece of content was viewed during a
given period of time.  

Unique Pageviews represent an aggregate of pageviews generated by the same user during the
same session (i.e. the number of sessions during which that page was viewed one or more
times).  The time limit for a given session is 24hrs.

In other words, a pageview represents each time a user visits a page.  In this way, a single user
loading the same page 5 times in a single session will generate 5 pageviews. Whereas unique
pageviews are calculated on a session basis, meaning if the same user loads a page 5 times in a
given session, it's only calculated as 1 unique pageview.  This is why pageviews will always
outnumber unique pageviews.  

Unique pageviews refer to the aggregate of pageviews a single user generates during the same
session (more on sessions below). Because unique pageviews are calculated per session, if a
user loads the same page 10 times, this behaviour only generates one unique pageview. As a
result, the number of unique pageviews will always be lower than that of pageviews. 

To put it simply, unique pageviews in Google Analytics show in how many sessions a specific
page was viewed, whereas pageviews show the total number of times the page was viewed,
including multiple views from the same user within a session.
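The difference is easiest to see with a small computation. The sketch below counts both metrics from a toy list of (session_id, page) view records; the records are made up purely for illustration.

from collections import Counter

views = [
    ("s1", "/pricing"), ("s1", "/pricing"), ("s1", "/pricing"),
    ("s1", "/blog"),
    ("s2", "/pricing"),
    ("s2", "/blog"), ("s2", "/blog"),
]

pageviews = Counter(page for _, page in views)               # every load counts
unique_pageviews = Counter(page for _, page in set(views))   # once per session

for page in sorted(pageviews):
    print(f"{page}: {pageviews[page]} pageviews, "
          f"{unique_pageviews[page]} unique pageviews")
# /blog: 3 pageviews, 2 unique pageviews
# /pricing: 4 pageviews, 2 unique pageviews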

Q2: What is a Website’s Bounce Rate? How does SEO affect bounce rate?

Ans: A website’s bounce rate is perhaps one of the most undervalued metrics of a successful
SEO campaign. In general, a bounce rate is the number of visitors to any given website who
navigate off of the site after viewing only one page, typically expressed as a percentage. The
Google Analytics (GA) tracking software keeps track of this bounce rate for you. Time and time
again, Matt Cutts, Google’s head of Webspam, has adamantly denied that Google uses bounce
rates, or any other GA derived metric, in their ranking algorithms. Though this is most likely true
(using GA data would exclude viable results from websites who don’t use their Analytics
software), Cutts tends to avoid directly answering this question. Though Google may not be using
bounce rates from Google Analytics, that doesn’t mean they are not using a similar metric from
their own user data from the SERPs (Search Engine Results Page).

When someone clicks a result on the SERP, Google pays attention to how long they visit the
page. If a user clicks a result in the SERP, determines the page is not satisfying their query, and
quickly hits the back button in their browser to return to the SERP, this is what is referred to in the
SEO industry as a “return-to-SERP”. It’s not known exactly how long a user must “dwell” on a
page to not count as a return-to-SERP (Google is known for their super top-secret ultra classified
Area-51-esque algorithms). What is known, however, is that quickly returning back to the SERPs
most likely does play a role in the rankings, and rightfully so. If a lot of users are bouncing back to
the SERPs, the assumption is there must not be much valuable content on the page to begin
with. Or, the content could be quite rich, but simply not relevant to the searcher’s query. Of
course, there are exceptions to this. For example, if the user searches a quickly answerable
question, clicks the first result, finds their answer in a few seconds, and returns to the SERP to
continue on with something unrelated, this would lead to a high bounce rate. Considering this
example, it’s easy to see why “good” bounce rates depend completely on your website’s goals,
as well as why there are so many different ranking factors in the algorithms.

So if Google is not using your Google Analytics bounce rate, what use can it be to you?
Regardless if you believe Google does or doesn’t use them, bounce rates are still a great
indicator of how engaging your website is. It’s a great metric to use to make sure your content
remains relevant to your targeted keyword plan. If you are suffering from a high bounce rate, you
may be missing out on lots of potential conversions (whether it’s converting a sale, generating a
lead, or just interacting with a user who may go on to share your content with others). After all, a
bounce back to the SERP on an eCommerce site is comparable to someone opening the door to
a traditional brick and mortar storefront, only to take a quick look and turn around without actually
walking inside to see what the store has to offer.
How exactly Google implements return-to-SERP rates, or any other metric for that matter, to
affect their rankings is still up for debate. Google isn’t exactly entirely forthcoming with this
information, and for good reason. The less public they are with the intricacies of their algorithm,
the less prone to spammy SEO practices they are, and thus the better Google can continue to
sensibly order their rankings.
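As a reference point when reading your own Google Analytics reports, the bounce rate itself is just single-page sessions divided by total sessions. A tiny sketch with made-up session data:

# Bounce rate = single-page sessions / total sessions (toy data).
sessions = {
    "s1": ["/landing"],                           # bounce: one page viewed
    "s2": ["/landing", "/pricing"],
    "s3": ["/blog/post"],                         # bounce
    "s4": ["/landing", "/blog/post", "/signup"],
}

bounces = sum(1 for pages in sessions.values() if len(pages) == 1)
bounce_rate = bounces / len(sessions)
print(f"bounce rate: {bounce_rate:.0%}")          # 50%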

Q3: What is Google Analytics, and why is it important to my business?


Ans: Google Analytics is a web analytics service that provides statistics and basic analytical tools
for search engine optimization (SEO) and marketing purposes. The service is part of the Google
Marketing Platform and is available for free to anyone with a Google account.

Google Analytics is used to track website performance and collect visitor insights. It can help
organizations determine top sources of user traffic, gauge the success of their marketing
activities and campaigns, track goal completions (such as purchases, adding products to carts),
discover patterns and trends in user engagement and obtain other visitor information such as
demographics. Small and medium-sized retail websites often use Google Analytics to obtain and
analyze various customer behaviour analytics, which can be used to improve marketing
campaigns, drive website traffic and better retain visitors.

How does Google Analytics work?


Google Analytics acquires user data from each website visitor through the use of page tags.
A JavaScript page tag is inserted into the code of each page. This tag runs in the web browser of
each visitor, collecting data and sending it to one of Google's data collection servers. Google
Analytics can then generate customizable reports to track and visualize data such as the number
of users, bounce rates, average session durations, sessions by channel, page views, goal
completions and more.

The page tag functions as a web bug or web beacon, to gather visitor information. However,
because it relies on cookies, the system can't collect data for users who have disabled them.

Google Analytics includes features that can help users identify trends and patterns in how visitors
engage with their websites. Features enable data collection, analysis, monitoring, visualization,
reporting and integration with other applications. These features include:

 data visualization and monitoring tools, including dashboards, scorecards and motion
charts that display changes in data over time;

 data filtering, manipulation and funnel analysis;

 data collection application program interfaces (APIs);

 predictive analytics, intelligence and anomaly detection;

 segmentation for analysis of subsets, such as conversions;

 custom reports for advertising, acquisition, audience behaviour and conversion;

 email-based sharing and communication; and

 integration with other products, including Google Ads, Google Data Studio, Salesforce
Marketing Cloud, Google AdSense, Google Optimize 360, Google Search Ads 360,
Google Display & Video 360, Google Ad Manager and Google Search Console.

Within the Google Analytics dashboard, users can save profiles for multiple websites and either
see details for default categories or select custom metrics to display for each site. Available
categories for tracking include content overview, keywords, referring sites, visitors overview, map
overlay and traffic sources overview.

Benefits and limitations


Google Analytics has distinct benefits and limitations. Pros generally relate to the platform being
powerful, free and user-friendly. Google Analytics also provides the following benefits:

 The service is free, easy to use and beginner friendly.

 Google Analytics offers a variety of metrics and customizable dimensions. Many
different types of useful insights can be captured using this platform.

 Google Analytics also contains many other tools, such as data visualization,
monitoring, reporting, predictive analysis, etc.

Q4: What kind of data are available on Google Analytics, and what can you do with them?
Ans: There are two types of data that you can collect in Google Analytics:

1. User Acquisition Data: data about your users before they visit your website

2. User Behaviour Data: data about your users when they visit your website

(1) User Acquisition Data

Before users visit your website, you can access data about your user demographics (e.g. their
age, gender, and interests). You can also get data about where they are coming from, whether
that’s Facebook, other websites, or Google search. I call these data “user acquisition data”
because they can help you figure out which user groups and channels to target.

These characteristics of your web visitors, such as what media channels they frequent and their
demographic information, are intrinsic to the users themselves. You really cannot do much to
change these attributes.

Luckily, the internet is huge, so even though you cannot change these intrinsic characteristics of
your visitors, you can choose specific user groups on the internet who have the characteristics
you want to target. You can attract more of them to your site by running targeted ads through
Facebook, Google, and other advertising platforms. Your user acquisition data can serve as the
guiding compass to direct your digital marketing strategy and activities.

(2) User Behaviour Data

The second group of data are “user behaviour” data, which are collected during a user’s session
on your website. “User behaviour” data include:

 how long a user stayed on your website
 what their first and last pages on your website were
 the most common “pathway” through which they move through your website

Unlike “user acquisition” data, “user behaviour” data can easily be changed by the changes you
make to your website. The key here is to use various analyses to identify the pages where your
users get “stuck.” You can then smooth out the user experience on these problem pages so
users can move seamlessly toward converting to paying customers with minimal friction.

“User behaviour” data can serve as a guide for you to improve your website so more of your
users end up converting, whether that means making a purchase on your website or signing up
for your newsletter.

Q5: What is a social network graph?


Ans: A social network graph is a graph where the nodes represent people and the lines
between nodes, called edges, represent social connections between them, such as friendship or
working together on a project. These graphs can be either undirected or directed. For instance,
Facebook can be described with an undirected graph since the friendship is bidirectional, Alice
and Bob being friends is the same as Bob and Alice being friends. On the other hand, Twitter can
be described with a directed graph: Alice can follow Bob without Bob following Alice.

Social networks tend to have characteristic network properties. For instance, there tends to be a
short distance between any two nodes (as in the famous six degrees of separation study where
everyone in the world is at most six degrees away from any other), and a tendency to form
"triangles" (if Alice is friends with Bob and Carol, Bob and Carol are more likely to be friends with
each other.)

Social networks are important to social scientists interested in how people interact as well as
companies trying to target consumers for advertising. For instance, if advertisers connect up
three people as friends, co-workers, or family members, and two of them buy the advertiser's
product, then they may choose to spend more in advertising to the third hold-out, on the belief
that this target has a high propensity to buy their product.

Social scientists can also use social networks to model the way things made by people connect.
Pages on the internet and the links between them form a social network in much the same way
as people form networks with other people. Also, counter-intelligence agencies have used cell-
phone data and calls to map out terrorist cells.
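A short sketch of the two graph types described above, again assuming the Python networkx library; the names are illustrative only.

import networkx as nx

# Facebook-style friendship: undirected, so the relationship is mutual.
friends = nx.Graph()
friends.add_edges_from([("Alice", "Bob"), ("Alice", "Carol"), ("Bob", "Carol")])
print(nx.triangles(friends, "Alice"))                       # 1 closed "triangle"
print(nx.shortest_path_length(friends, "Alice", "Carol"))   # 1 hop apart

# Twitter-style following: directed, so the edge is one-way.
follows = nx.DiGraph()
follows.add_edge("Alice", "Bob")            # Alice follows Bob
print(follows.has_edge("Alice", "Bob"))     # True
print(follows.has_edge("Bob", "Alice"))     # False: not reciprocal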

Q6: What is Social Search?


Ans: "Social search" is an evolving term for the way in which search engines factor a user's
social network -- also referred to as social graph -- into how results are displayed after a search
query. In social search, content that has a social connection to you in some way is prioritized. A
social connection could mean someone you are linked to via Facebook, Twitter, or any other
major social network. Alternately, some forms of social search prioritize content that has been
shared by social media influencers, even if those experts aren't directly tied to you.

Examples of Social Search


Google Plus Your World
In early 2012, Google unveiled Google Plus Your World, a unique integration between Google
search results and the Google Plus social network that, when activated, prioritizes content that
has been shared or received a +1 by your Google network. In addition to Google Plus Your
World, Google social search results from multiple networks are now mixed throughout your
results based on their relevance; and content with ties to your network are displayed with a
higher relevance than their counterparts. Searchers only see social search results when they are
logged into Google and have their social networks connected.

Bing Social Search


This summer, Bing announced a new version of its search engine. It included an entirely new
layout that closely integrates a searcher's social network into the results displayed for a given
search term. According to Bing, the social results -- which include the ability to directly ask advice
from your Facebook network -- "complement the standard search results without compromising
them, offering you the chance to start a conversation and get advice from your friends, experts
and enthusiasts right within the search experience." 

Facebook Social Search


This fall, Facebook CEO Mark Zuckerberg indicated that he is interested in launching a social
search engine powered by Facebook user activity. He explained, "Search engines are really
evolving towards giving you a set of answers… like, I have a specific question, answer this
question for me. And when you think about it from that perspective, Facebook is pretty uniquely
positioned to answer a lot of the questions that people have." 
According to Zuckerberg, Facebook handles close to 1 billion search queries per
day already. Many of these searches are for individuals or company pages, but the potential
exists for inquiries related to decision-making or reviews.  

Q7: Explain Social Search and Inbound Marketing.


Ans:

Even if the social search playing field hasn't been completely defined yet, one of the key
takeaways from the early actions of Google, Bing, and Facebook is that as marketers, we need
to start seeing our search engine optimization strategy and our social media strategy as
utterly intertwined. Here's how you can do just that.
Step 1: Make sure your social media tools are informed by your SEO tools.

The best way to come out on top of social search is to have a fully integrated marketing platform
where social media and SEO are fully linked.

Truly though, having a blog with built-in social sharing and as-you-type SEO recommendations
definitely helps. With or without that kind of technology, however, there are some steps you can
take to leverage the growing use of social search. 
 Audit your existing strengths: Take a look at your top ranking and most shared
content. Is there overlap? If you've found a type of content that is simultaneously strong in
search and frequently shared, it's worth optimizing that content even further.  
 Update your company profiles to be keyword-rich: If, as in the example above, I
search Bing for "Inbound Marketing," a few things will happen. 1) Bing will give me
traditional search results. 2) Bing will show me friends who have written or shared
"inbound marketing" content. 3) Bing will bring in "People Who Know" who include the
keyword "inbound marketing" in their profile or frequently shared content. For the latter
circumstance, it doesn't hurt to put your main keywords as part of your company's profile
online. The combination of that profile and the strength of your content and shares will add
up.
 Make your top keywords more social: Make a list of the keywords for which you want to
rank highly. Does the content you share on social media and your blog cover those
keywords? Zero in on one or two of your most desirable keywords and find ways to make
content under those keywords more shareable. At a bare minimum, include social sharing
buttons on your content. Beyond that you may want to experiment with encouraging social
sharing through pay-by-tweet downloads or using easy share links throughout your posts.
Step 2: Find and encourage your social media influencers.
The reason social is such a natural extension of search is that it adds both relevancy and
authority. Think about this: According to Nielsen Research 92% of consumers worldwide
trust recommendations from friends and family more than any form of advertising. This is
up from 74% in 2007. As recommendations from peers become more prominent online, the
influence they levy will weigh more heavily into activity on search and social sites combined. For
this reason, it's wise to start thinking of your company or organization's fans as extensions of
your inbound marketing team.
 Find your influencers: Spend some time to get to know the people who consistently
share your content. Pull together a list of contacts with more than a thousand followers
and a history of engagement in your content. Knowing your social media influencers will
help you expand your reach online and ultimately increase the rate at which your content
gets found online. 
 Nurture your influencers: Once you've discovered your evangelists, think about ways to
nurture and encourage them. At the simplest (and possibly most meaningful) level, find a
way to thank them for spreading the word about your company. As a second step,
consider inviting them to a special open-house or providing them sneak peeks of
upcoming news or announcements. (Note: Be careful when nurturing your influencers that
you are not offering them benefits in exchange for talking about your company. That's not
inboundy at all and really questionable, ethically. In fact, in some cases, it may even be
illegal.)

(Above: A search for HubSpot's social media influencers)


Step 3: Watch for changing factors in social search.

While there are a few core principles at work in social search, individual factors will continue to
develop in the near future. As you're considering the social channels you use, think about the role
each plays in your search engine of choice. 

 Don't rule out Google+: When Google+ first entered the social media space, many
marketers wondered if it was really worth diverting marketing attention into yet another
social network. But when the parent company of said social network is the biggest search
engine in the world and starts to integrate its content into search results, it's worth dipping
a toe in the water.
 Don't rule out Bing: Not only did Bing account for 30% of all searches this spring
(Source: Experian), Bing also has a more diverse social search offering than any other
search engine. With Facebook, Twitter, Quora, Klout and Foursquare tied in, Bing may
give social active companies an edge. 
 Keep your Facebook pages active: While search is clearly not Facebook's primary
purpose yet, Facebook does have a team of engineers, including former Google engineer
Lars Rasmussen, working on an improved search engine for the site. There's
a tremendous opportunity for Facebook to delve deeper into search. There's also
tremendous opportunity for businesses to grow their reach through Facebook. My
colleague Amanda Sibley just finished a top-notch eBook on attracting customers
through Facebook that could be a good starting point.
Step 4: Remember the golden rule.
Years ago, when HubSpot first started teaching people about search engine optimization, one
rule was essential: Above all else, create good, useful content. 

Even with the rapidly growing influence social sharing has on search results, the good news is if
you're creating good content, you're already half-way there. Useful content is by nature more
search-friendly than sales-oriented content. It is also more likely to be shared. The increasingly
formal relationship between search and social is really just a natural extension to what has
always been true -- content that is relevant and can be trusted as authoritative will continue to
drive both your search and social media marketing.

Q8: What are Realtime Reports? Explain Traffic Sources, Page View, and Events.

Ans: Realtime reports measure live user activity as they move throughout the site. Additionally,
the report will measure the last 30 minutes of activities. With realtime reports, you can get a
glimpse of all of the four common GA reports. Here are the following attributes you can measure:

 Location: This report comes from the audience report. You can see the country the user
is in.
 Content: This report comes from the behavior report. You can see the top landing pages
and page title the user lands on.
 Events: This report also comes in the behavior report. You can see the current events the
user goes through.
 Conversions: This report comes from the conversions report. You can see the goals
(pageview or events) and goal titles that were passed through the site.

Testing Realtime Reports

It normally takes a couple of hours before data starts appearing in standard GA reports, so you
need a way to verify your setup right away to avoid missing valuable insights.

That’s where realtime plays a key role. Within a matter of seconds of setting up GA or any
configurations within GA or Google Tag Manager (GTM), realtime allows you to test the site.

Here are a couple ways you can do testing through realtime:

Pageview Tracking

When creating GA tags in GTM, it’s important to test the site early on to ensure that GA is
populating data as the user navigates through the site.
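
As a rough illustration of that kind of test, here is a minimal sketch (in TypeScript) of a "virtual pageview" being pushed into the GTM dataLayer, for example from a single-page app; the event name and field names are assumptions and would have to match the trigger and variables configured in your own GTM container and GA tag.

// Hedged sketch: push a virtual pageview into the GTM dataLayer (useful for
// single-page apps) so a GA pageview tag can fire on it. Names are assumptions.
const dataLayer = ((window as any).dataLayer = (window as any).dataLayer || []) as Record<string, unknown>[];

function trackVirtualPageview(path: string, title: string): void {
  dataLayer.push({
    event: "virtualPageview", // custom-event trigger name assumed in GTM
    pagePath: path,           // read by a dataLayer variable in the GA tag
    pageTitle: title,
  });
}

// Example: call this whenever the app changes "pages" without a full reload,
// then watch the hit appear in the Realtime > Content report.
trackVirtualPageview("/checkout/step-2", "Checkout - Step 2");
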
Traffic Sources

The traffic sources report allows you to see what source and medium your user is on. You can
use the traffic sources report to make sure GA is reading the correct source as you navigate the
site.

The screenshot below shows how a source/medium would populate in realtime.

Pageviews

In the content and events report, you can test out pageview hits on your site. This would be
useful if you created a pageview tag for a specific page on your site; you can ensure the specific
page is collecting data properly by checking the content report.

Events

If you wanted to see if specific user actions were being tracked on your site, then you can click
through the specific user actions on the site and go under the events report to see if the events
populate.

Here’s an example of how you can test a user event on the site.

In the first screenshot is a common dataLayer custom event for the site; the user wants to track a
“Header Click” (i.e. when a user clicks on any of the header links at the top of the site). In GA, you
would need the event category and the event action to match the dataLayer Category and
Action.
In the screenshot below, it shows the realtime event report for the same site with the dataLayer
event. You would go under the events report and check to see if there’s a hit on the page, and
then look at the active users table at the bottom.
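
To make the “Header Click” example concrete, here is a minimal sketch (in TypeScript) of what that dataLayer push could look like; the event name, Category and Action values are assumptions taken from the description above and must match whatever your GTM trigger and GA event tag are configured with.

// Hedged sketch: push a custom "Header Click" event to the GTM dataLayer when a
// header link is clicked, so it shows up under Realtime > Events once a GA event
// tag with the matching Category/Action is in place. Names are assumptions.
const dataLayer = ((window as any).dataLayer = (window as any).dataLayer || []) as Record<string, unknown>[];

function trackHeaderClick(label: string): void {
  dataLayer.push({
    event: "headerClick",    // custom event name used by the GTM trigger
    eventCategory: "Header", // must match the Category set in the GA event tag
    eventAction: "Click",    // must match the Action set in the GA event tag
    eventLabel: label,       // e.g. which header link was clicked
  });
}

// Attach the tracker to every link inside the site header.
document.querySelectorAll<HTMLAnchorElement>("header a").forEach((link) => {
  link.addEventListener("click", () => trackHeaderClick(link.textContent?.trim() ?? ""));
});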

Q11: Explain Various Web Analytics Tools.

Ans: There are plenty of tools out there that can turn all of that collected information into an easy-
to-understand report that gives you much-needed insight into your unique Web visitors. When
you are armed with this knowledge, you get to see how effective your website is and what
changes you need to make in order to make it even better.

Here are some of the top 10 tools that you can use to gain more understanding about your
website traffic.

1. Google Analytics.
Google Analytics is one of the best free tools that any website owner can use to track and
analyze data about Web traffic. You get to see what keywords are bringing the most visitors to
your pages and what aspects of your designs are turning them off. This tool will generate a report
for your website that includes information about visitors, traffic sources, goals, content and e-
commerce. The downside of Google Analytics is that it can take time to update. (The real-time
version is still in beta testing.) There are other tools that offer real-time updates of your data now.

2. Spring Metrics.
Spring Metrics has taken the analytics tool and made it simpler. You don’t have to be a
professional data-miner to get the answers to your questions. You get real-time conversion
analytics, top converting sources, keyword analytics, landing-page analysis, e-mail performance
reports and simple point-and-click configuration. Unlike Google Analytics, Spring Metrics tracks a
visitor’s path through your website from the time he landed to the time he left. All of this is
included in Spring Metrics’ Standard Plan for $49 a month. When you first sign up, you get to try
it free for 14 days. The simplicity of this tool has a lot of website owners switching over from
Google Analytics.

3. Woopra.
Woopra is another tool that offers real-time analytics tracking, whereas Google Analytics can take
hours to update. It is a desktop application that feeds you live visitor stats, including where they
live, what pages they are on now, where they’ve been on your site and their Web browser. You
also have the ability to chat live with individual site visitors. This can be a great feature for your e-
commerce site to interact with customers. Woopra offers a limited freebie plan as well as several
paid options.

4. Clicky.
Clicky also offers a free service if you have only one website and a Pro account for a monthly fee.
You get real-time analytics, including Spy View, which lets you observe what current visitors are
doing on your site. Clicky's dashboard is simple to use and presents all the information you want
to see clearly. They also have a mobile version that makes it easy for you to check your stats
anywhere.

5. Mint.
Mint is an analytics tool that is self-hosted and costs $30 per website. You get the benefit of real-
time stats, which you don’t get with the free Google Analytics. You can track site visitors, where
they are coming from and what pages they are viewing. And Peppermill, a part of Mint, offers
tons of free add-ons so you can adjust the tool to fit your needs.

6. Chartbeat.
Chartbeat lets users get the most from their data with instant information. They keep constant
watch on your visitors and what they are doing on your website. This gives you the information
you need in order to make the adjustments necessary to your content or design. You get a free
month when you sign up and after that plans start at $9.95 per month.

7. Kissmetrics.
Kissmetrics is another analytics tool that allows clients to track the movements of individual
visitors throughout their websites. You can see how behaviors change over time, identify patterns
and see the most typical and recent referrers, among other stats. It offers a “Timeline View” of
visitor activity in an easy-to-understand visual format. You can try this service free for 30 days.
Plans start at $149 a month, depending on how many events are tracked.

8. UserTesting.
UserTesting.com is a unique way to gather information about site users. You are paying for a
group of participants of your choosing to perform a set of tasks on your site. The user and his
activity will be recorded on video. In about an hour, you will have your feedback. You get to hear
the actual thoughts of users in your target demographic. The cost is $39 per participant you
choose. You may choose anywhere from 1 to 100 testers.

9. Crazy Egg.
Crazy Egg uses the power of Heatmap technology to give you a visual picture of what site visitors
are doing on your Web pages. It shows you where people are moving their mouse on the page
and where they click. There is a link between where people put the mouse and where they are
moving their eyes. So, this kind of tracking helps you see what areas are catching the most
attention and interaction from users. There is a free one-month trial with this service, and prices
start at $9 a month for 10 Heatmaps.

10. Mouseflow.
Mouseflow is somewhat of a combination of UserTesting and Crazy Egg. You can see video of
users interacting with your website, including every mouse click and movement, scrolling and
keystrokes. You also get to view heat maps from different time periods so that you can see the
effect of changes that you make on your page. Pricing varies depending on how many sites you
want to cover and how many sessions you want. For a single site and up to 100 recorded
sessions, there is no cost. Over that, prices start at $13 a month.

Q12: What is A/B Testing?


Ans: A/B Testing is the process of comparing two variations of the same thing to see which
variant yields the best results. A/B testing is often used in marketing to determine which
marketing message, offer or other element is most effective at improving response rates. On the
Web, A/B testing is used in website optimization to determine which variations of a page
element improve conversion rates the most.

A/B testing is commonly used by online companies to improve the performance of their website
and marketing campaigns, because it is relatively easy to create and run tests by updating the
code or design of a website. This makes A/B testing much easier on the Web than it is to test
things like billboards or magazine ads.

A/B testing isn’t just for marketing, however: product teams can A/B test different product
variations, customer service teams can A/B test various responses, and more.

How Does A/B Testing Work?


Marketers use A/B testing on their websites to improve conversion rates. For example, you may
want to determine which call to action on your landing page results in more people clicking to the
next page or step in your funnel. To determine this, you can set up an A/B test with two different
variations of the button. In this hypothetical example, the “A” variant of the button might say
“Learn More” while the “B” variant of the button might say “See How”.

In an A/B test, both variants of the button run at the same time, to a set percentage of website
visitors. In the simplest case, 50% of visitors see variant “A” and 50% see variant “B”. By using
an A/B testing software like Optimizely, you can watch over time to see which button variant gets
the most clicks.

Suppose the results of the A/B test are:


At the end of your test, you might find that one button performs better than another at getting
people to click more frequently. In this example, button variant B performs twice as well as
variant A. This variant is called the “winning variant” or “winner”.
Assuming that the conversion rate holds, by using “See How” for every visitor after the test
concludes, you have successfully doubled your conversion rate. After the conclusion of this test,
you may run another test to see if you can improve upon the “See How” button variation, or test
some other element of the page. This is the essence of conversion rate optimization.

The conversion optimization process looks as follows, where A/B testing is just one step in the
overall optimization effort.
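
To make the arithmetic behind the “winning variant” concrete, here is a minimal sketch (in TypeScript) that picks the better-performing button from raw visitor and conversion counts; the numbers are made up to mirror the hypothetical “Learn More” vs “See How” example above, and in practice you would also run a statistical significance test before declaring a winner.

// Minimal A/B evaluation sketch: compare conversion rates of two variants.
// Visitor and conversion counts below are hypothetical.
interface Variant {
  name: string;
  visitors: number;
  conversions: number;
}

function conversionRate(v: Variant): number {
  return v.conversions / v.visitors;
}

function pickWinner(a: Variant, b: Variant): Variant {
  return conversionRate(a) >= conversionRate(b) ? a : b;
}

const learnMore: Variant = { name: "Learn More", visitors: 5000, conversions: 100 }; // 2.0%
const seeHow: Variant = { name: "See How", visitors: 5000, conversions: 200 };       // 4.0%

const winner = pickWinner(learnMore, seeHow);
console.log(`Winning variant: ${winner.name} at ${(conversionRate(winner) * 100).toFixed(1)}% conversion`);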

Q13: Differentiate between Crawling and Indexing in Search Engine Optimization (SEO)
Ans:
1. Crawling :
Crawling is the discovery process in which search engines send out a team of robots (known as
crawlers or spiders) to find newly updated content.
2. Indexing :
Indexing is the process that stores the information crawlers find in an index, a huge database of all the
content they have discovered and deem good enough to serve up to searchers.

Difference between Crawling and Indexing:

1. Crawling: In the SEO world, crawling means “following your links”. Indexing: Indexing is the process of “adding webpages into Google search”.
2. Crawling: Crawling is the process through which indexing is done; Google crawls the web pages and then indexes them. Indexing: When a search engine crawler visits a link, that is crawling; when the crawler saves or indexes that link in the search engine’s database, that is indexing.
3. Crawling: This is when Google visits your website for tracking purposes, and it is done by Google’s spiders or crawlers. Indexing: After crawling has been done, the result is placed in Google’s index (i.e. web search), which means crawling and indexing are a step-by-step process.
4. Crawling: Crawling is a process done by search engine bots to discover publicly available web pages. Indexing: Indexing is when the search engine saves a copy of all the information from crawled pages on its index servers and then shows the relevant results when a user performs a search query.
5. Crawling: It finds web pages and queues them for indexing. Indexing: It analyses the content of web pages and saves pages with quality content in the index.
6. Crawling: It crawls the web pages. Indexing: It performs analysis on the page content and stores it in the index.
7. Crawling: Crawling is simply when search engine bots are actively crawling your website. Indexing: Indexing is the process of placing a crawled page into the search engine’s index.
8. Crawling: Crawling discovers URLs through the crawler’s recursive visits to web pages. Indexing: Indexing builds the index with every significant word on a web page found in the title, headings, meta tags, alt tags, subtitles, and other important positions.
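
To make the two steps concrete, below is a toy sketch (in TypeScript, assuming a runtime with a global fetch such as Node 18+) of a crawler that follows the links it discovers and stores page text in an in-memory “index”; the start URL is a placeholder, and real search engines add robots.txt handling, politeness rules and far more robust HTML parsing.

// Toy crawl-and-index sketch; not a production crawler.
const index = new Map<string, string>(); // URL -> extracted text (the "index")

async function crawl(url: string, depth = 1): Promise<void> {
  if (depth < 0 || index.has(url)) return;
  let html: string;
  try {
    html = await (await fetch(url)).text(); // crawling: fetch the page
  } catch {
    return; // skip pages that cannot be fetched
  }

  // Indexing: store a plain-text version of the page for later lookup.
  index.set(url, html.replace(/<[^>]+>/g, " ").replace(/\s+/g, " ").trim());

  // Crawling: discover outgoing links and queue them for the next hop.
  const links = [...html.matchAll(/href="(https?:\/\/[^"]+)"/g)].map((m) => m[1]);
  for (const link of links) await crawl(link, depth - 1);
}

crawl("https://example.com").then(() => console.log(`Indexed ${index.size} pages`));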

Q14: What is Text Analysis: Techniques, Applications & Examples

Ans: Text analysis is an umbrella term encompassing AI-empowered techniques that help derive
meaningful information from unstructured data. These insights, in turn, help make informed, data-
backed decisions, enhance productivity, and improve business intelligence.

Text analysis can:

 help analyze customer preferences, trends, and needs, assisting you in developing better
products and features.

 Help study a substantial amount of data in real-time – without occupying your team’s time.
Since text analysis with AI reduces manual work, productivity soars.
 Reduce the scope of error - by encompassing Machine Learning and Natural Language
Processing (NLP) to unify criteria of analysis.

Common use cases for Text Analysis

 Healthcare: The industry uses text analysis to find patterns in doctors’ reports and other
patient data. You can also use it to detect disease outbreaks by discovering cases in social
media data.

 Research: Researchers use text analysis with AI to explore pre-existing literature to
identify trends and patterns, or to categorize research survey answers by topic or
sentiment.

 Product development: By analyzing boatloads of customer reviews and trends, text
analysis helps determine in-demand features. By analyzing customer reviews on Amazon,
for instance, one analyst’s team studied the price customers were happy to pay in a
new market they were tapping into.

 Customer service and experience: By automatically studying various data such as
critical tickets, call notes, surveys, and more, businesses can identify urgent requests to
respond to and discover sentiment around their product/service.
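
As a tiny, hedged illustration of what such analysis involves at its simplest, the sketch below (in TypeScript) counts the most frequent meaningful words across a few made-up customer reviews; real text-analysis pipelines rely on ML and NLP models, but even a word-frequency count hints at which features customers mention most.

// Toy text-analysis sketch: surface the most frequent words in customer reviews.
// The reviews and the stop-word list are made up for illustration.
const reviews: string[] = [
  "Battery life is great but the screen scratches easily",
  "Great camera, battery could be better",
  "Screen is bright, battery life impressive",
];

const stopWords = new Set(["is", "the", "but", "and", "a", "could", "be", "easily"]);
const counts = new Map<string, number>();

for (const review of reviews) {
  for (const word of review.toLowerCase().match(/[a-z]+/g) ?? []) {
    if (!stopWords.has(word)) counts.set(word, (counts.get(word) ?? 0) + 1);
  }
}

// Print the top terms, most frequent first.
[...counts.entries()]
  .sort((a, b) => b[1] - a[1])
  .slice(0, 5)
  .forEach(([word, n]) => console.log(`${word}: ${n}`));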

Q15: What is Natural Language Processing (NLP)?


Ans: Natural Language Processing (NLP) is “the ability of machines to understand and interpret
human language the way it is written or spoken.” The objective of NLP is to make
computers/machines as intelligent as human beings in understanding language. The ultimate goal
of NLP is to bridge the gap between how people communicate (natural language) and what the
computer understands (machine language). There are three different levels of linguistic analysis done
before performing NLP:

 Syntax - Is the given text grammatically correct?


 Semantics - What is the meaning of the given text?
 Pragmatics - What is the purpose of the text?

NLP is a subset technique of Artificial Intelligence which is used to narrow the communication
gap between the Computer and Human. NLP deals with different aspects of language such as:

 Phonology - It is a systematic organization of sounds in language.


 Morphology - It is a study of words formation and their relationship with each other.

Approaches of NLP for understanding semantic analysis.


 Distributional - It employs large-scale statistical tactics of Machine Learning and
Deep Learning.
 Frame-Based - The sentences which are syntactically different but semantically same
are represented inside data structure (frame) for the stereotyped situation.
 Theoretical - This approach builds on the idea that sentences refer to the real world
(the sky is blue) and parts of the sentence can be combined to represent whole
meaning.
 Interactive Learning - It involves a pragmatic approach and the user is responsible
for teaching the computer to learn the language step by step in an interactive learning
environment.

The real success of NLP lies in the fact that humans can be deceived into believing that they are
talking to humans instead of computers.
Importance of Natural Language Processing Applications
With NLP, it is possible to perform certain tasks like Automated Speech and Automated Text
Writing in less time. Given the significant amount of data (text) around, we can use the
computer’s untiring willingness and ability to run several algorithms to perform these tasks in no time.
These tasks include other NLP applications like Automatic Summarization (generating a
summary of a given text) and Machine Translation (translating one language into another).
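
As a small, hedged example of the first step most of these applications share, the sketch below (in TypeScript) tokenizes text into words and scores it with a tiny hand-made sentiment lexicon; production NLP systems use trained models, and the word lists here are assumptions meant only to make the idea concrete.

// Toy NLP sketch: tokenize text and score sentiment with a tiny hand-made lexicon.
const positive = new Set(["good", "great", "love", "excellent", "happy"]);
const negative = new Set(["bad", "poor", "hate", "terrible", "sad"]);

function tokenize(text: string): string[] {
  return text.toLowerCase().match(/[a-z']+/g) ?? [];
}

function sentimentScore(text: string): number {
  return tokenize(text).reduce((score, token) => {
    if (positive.has(token)) return score + 1;
    if (negative.has(token)) return score - 1;
    return score;
  }, 0);
}

console.log(tokenize("I love this phone, the camera is excellent"));        // word tokens
console.log(sentimentScore("I love this phone, the camera is excellent"));  // 2 (positive)
console.log(sentimentScore("Terrible battery and poor support"));           // -2 (negative)
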
MCQ’s

1) In views that don’t have data import enabled, Custom Dimensions values may be
viewed for dates before the Custom Dimension was created.

1. True
2. False
ANSWER: False

2) What feature is required to send data from a web-connected device (like a point-of-sale
system) to Google Analytics?

1. The Measurement Protocol


2. Browser cookies
3. Data Import
4. The Networking Protocol
ANSWER: The Measurement Protocol

3) What would prevent data from appearing in a Custom Report?

1. Custom Report isn’t shared with users in the same view


2. Too many dimensions in a Custom Report
3. Too many metrics in a Custom Report
4. A filter that removes all the data
ANSWER: Custom Report isn’t shared with users in the same view

4) What report shows which types of mobile devices visited a website?

1. All Traffic > Source/Medium report


2. Site Content > Landing Page report
3. Technology > Network report
4. Mobile > Devices report
ANSWER: Mobile > Devices report

5) What is a “dimension” in Google Analytics?

1. A comparison of data between two date ranges


2. The lifetime value of a user in a given date range
3. An attribute of a data set that can be organized for better analysis
4. A report that offers different demographic information about your audience
ANSWER: An attribute of a data set that can be organized for better analysis

6) Which parameters can be included with an event hit for reporting?

1. Category, Action, Label, Unique Events


2. Category, Action, Label, Value
3. Event, Category, Action, Label
4. Category, Action, Label, Total Events
ANSWER: Category, Action, Label, Value

7) Where should the Analytics tracking code be placed in the HTML of a webpage for data
collection?

1. Just before the closing </body> tag


2. Just after the opening <head> tag
3. Just after the opening <body> tag
4. Just before the closing </head> tag
ANSWER: Just before the closing </head> tag

8) What model represents the hierarchical structure of a Google Analytics account?

1. Property > Account > View


2. Account > Property > View
3. View > Account > Property
4. Account > View > Property
ANSWER: Account > Property > View

9) What Remarketing audiences cannot be defined by default?

1. Users who played a video on a website


2. Users who speak a particular language
3. Users who visited a specific page on a website
4. Users who visited a physical store
ANSWER: Users who visited a physical store

10) Which default traffic source dimensions does Google Analytics report for each website
visitor?
1. Campaign and Ad Content
2. Source and Campaign
3. Source and Medium
4. Campaign and Medium
ANSWER: Source and Medium

11) When the same default tracking code is installed on pages with different domains,
what will result?

Analytics will associate users and sessions with a single domain


Analytics will send an alert about duplicate data collection
Analytics will not associate users and sessions with any domain
Analytics will associate users and sessions with their respective domains

ANSWER: Analytics will associate users and sessions with their respective domains

12) View filters are applied in what order?

Creation date
Sequential order
Alphabetical order
Random order
ANSWER: Sequential order

13) Which Goals are available in Google Analytics?

Pageview, Event, Transaction, Social


Location, Event, Time, Users per Session
Destination, Event, Pageview, Social
Destination, Event, Duration, Pages/Screens per Session
ANSWER: Destination, Event, Duration, Pages/Screens per Session

14) Once filters have been applied, what is the option to recover filtered data?

Data may be recovered within 10 days


Filtered data is not recoverable
Data may be recovered within 30 days
Data may be recovered within 5 days
ANSWER: Filtered data is not recoverable

15) What scope would be set for a Custom Dimension that reports membership status for
a customer rewards program?

Product
Hit
Session
User
ANSWER: user

16) To recognize users across different devices, what feature must be enabled?
Google Ads Linking
User ID
Audience Definitions
Attribution Models
ANSWER: User id

17) Custom Dimensions can be used as what?

Secondary dimensions in Standard reports


Primary dimensions in Custom Reports
All of the above
Secondary dimensions in Custom Reports
ANSWER: All of the above

18) Within how many days can a deleted view be restored?

15
25
5
35
ANSWER: 35

19) In Custom Reports, what must metrics and dimensions share in order to report
accurately?

Same scope
Same view
Same index
Same Custom Report
ANSWER: Same Scope

20) Which reports indicate how traffic arrived at a website?

All Traffic
Behavior
Geo
Demographics
ANSWER: All Traffic

21) The default session timeout duration in Google Analytics is how many minutes?

20
10
30
5
ANSWER: 30

22) What criteria could not be used to create a Dynamic Remarketing audience?
Users who viewed a homepage
Users who returned an item they purchased
Users who viewed a search result page on a website
Users who viewed product-detail pages
ANSWER: Users who returned an item they purchased

23) What report shows the percentage of traffic that previously visited a website?

All traffic > Referrals report


Behavior > New vs returning report
Interests > Affinity categories report
Behavior > Frequency and Recency report
ANSWER: Behavior > New vs returning report

24) What data is Google Analytics Goals unable to track?

Customer’s lifetime value


Making a purchase
Watching a video
Signing up for a newsletter
ANSWER: Customer’s lifetime value

25) To increase the speed at which Google Analytics compiles reports, what action could
be taken?

Apply an advanced filter to the report


Choose “Greater precision” in the sampling pulldown menu
Remove any filters on the view
Choose “Faster response” in the sampling pulldown menu

ANSWER: Choose “Faster response” in the sampling pulldown menu

26) In Multi-Channel Funnel Reports, what channel would not be credited with a
conversion?

Television commercials
Social network
Website referrals
Paid and organic search

ANSWER: Television commercials

27) If a web page visitor clears the Analytics cookie from their browser, what will occur?

All of the above


Analytics will not be able to associate user behavior data with past data collected
Analytics will set a new unique ID the next time a browser loads a tracked page
Analytics will set a new browser cookie the next time a browser loads a tracked page
ANSWER: All of the above
28)What is not a benefit of using segments to analyze data?

Permanently modify the data in a view


Analyze users by single or multi-session conditions
Isolate and analyze specific conversion paths using conversion segments
Compare behavior metrics for groups of users like Converters vs non Converters

ANSWER: Permanently modify the data in a view

29) What is the “Bounce Rate” in Google Analytics?

Percentage of total site exits


Number of times users returned to a website in a given time period
Percentage of visits when a user landed on a website and exited without any interactions
Percentage of sessions in which a user exits from a homepage

ANSWER: Percentage of visits when a user landed on a website and exited without any
interactions

30) What is used to create Smart Goals?

Custom Reports
Remarketing audience
Analytics Goals
Machine-learning algorithms

ANSWER: Machine-learning algorithms

Unit 4

Short Answers

Q1: What are Facebook analytics?

Ans: Facebook Analytics is a robust tool that lets marketers explore users' interactions with
advanced goal paths and sales funnels for Facebook ads.

Q2: How do you analyze Facebook Insights?


Ans: Accessing Facebook Insights is simple: just go to the Facebook Page Manager and click
Insights. The default data range displayed on Facebook Insights is 28 days, but you can toggle
this to fit your needs

Q3: What is the purpose of Facebook insight?

Ans: Facebook Audience Insights gives you aggregate information about two groups of
people—people connected to your Page and people on Facebook—so you can create
content that resonates and easily find more people like the ones in your current audience.

Q4: What is the target audience for Facebook?

Ans: Users ages 25–34 years are the largest demographic. In the distribution of global Facebook
users, 19.3% were male users between 25 and 34 years old and 13.1% were female users in the
same age range. While Facebook users can be found at all ages, 72.8% are within the 18–44
years old range.

Q5: What is unique users in Facebook analytics?

Ans: Unique users can be described as the number of individual people who are clicking on
your page or using your product/service. Below this graph, you can see the age and gender
of your audience which can be helpful when planning your campaigns

Q6: How do you track engagement on Facebook?

1. Ans: Open Insights for your Facebook Page.


2. Select Posts.
3. Scroll down to the section titled All Posts Published where you'll be able to see how many
people your posts reached and your engagement data.

Q7: What is Social Media Marketing?

Ans: Social media marketing is the process of gaining attention, building your brand, and increasing
website traffic and sales through social media websites. Brands and individuals achieve
this by publishing engaging content on their social media channels, engaging with their followers,
and running social media campaigns.

Q8: What is the impact of social media on marketing?

Ans: First of all, the consumption of social media is very high, and people use multiple social media
platforms in a given day. So, it is important for brands to use social media marketing as one of
their primary marketing strategies to reach their targeted customers, convert them into users,
keep existing customers loyal to the brand, solve their problems with the product, answer their queries,
and more.
Q9: Why social media is so popular?

Ans: Social media is so popular because there are multiple reasons people around the world use
social media. Primary reasons are:

It gives the possibility to stay connected with friends and family.

It gives the opportunity to find and connect with new people.

Social networking sites are free, and they implement various algorithms to find the type of content
users might like and display similar type of content on their feed section.

People use social media sites like Twitter to stay updated & connect with the brands they like.

People use sites like LinkedIn for professional networking.

People can look at beautiful visuals from all over the world and share their own photos with the help of
sites like Instagram.

Q10: List down some of the popular social media tools.

Ans: Following are the top social media tools to use:

Social Clout: Social clout is a social media analytics tool which helps advertisers to track
engagement and ROI.

OptinMonster: OptinMonster lets the advertiser engage with visitors at the perfect moment.

Audiense: Audiense is a social tool that lets you find new target audiences and categorize them.

Tweepi: Tweepi helps you find relevant users interested in your topic. You can engage
with those users by following them and eventually get them to follow you back.

Socedo: Socedo finds people who fall within your buyer persona. After you find your
audience, you can segregate them into multiple divisions so you can promote content
accordingly.

Socialbakers: Socialbakers is a set of tools to help you make decisions based on your followers.
It lets you measure your performance against your social competitors.

ZeroFOX: ZeroFOX is a tool that helps companies to be safe against hackers.

Followerwonk: Followerwonk helps you optimize your audience. It recommends people
for you to follow.

CrowdBooster: This tool gets real-time data. You can then make reports with your KPI.

Q11: How can LinkedIn be used for marketing?


Ans: LinkedIn is one of the top social media platforms for promoting yourself or your business. It has
around 310 million monthly active users as of now. First of all, the business account needs to be optimized for
search. Publishing engaging content on the company page regularly can increase the followers. Rich
content has proved to be pretty useful and will be helpful in increasing the
engagement rate.

Periodically sponsoring your posts will give the page the regular boosts it needs. You can use
LinkedIn analytics to regularly monitor your performance and improve your strategy. LinkedIn is
the #1 channel B2B marketers use to distribute content, at 94%.

Q12: How will you boost Tweets or Twitter posts?

Ans: The best practices to boost tweets are:

 Find out the best time to post on Twitter

 Reach out to influencers and connect with them. Try to engage in a cross-promotional
activity.

 We can use twitter paid ads to quickly reach out to our targeted audiences.

 Use hashtags properly. Not too much but the right ones to get more reach.

 Schedule your tweets and use calendar to organize everything.

 Use images, links, GIFs to increase retweets.

 Use twitter polls to let users engage with your content.

 Creatively participate in twitter chats to increase your brand awareness.

 Share good content from across the web to increase follower count.

 Use video in the post to improve the reach and engagement rate.

Q13: How social media can benefit a business?

Ans: Social media helps business by building awareness of the business and their products.
Social media can be used for customer engagement as customers can communicate directly with
brands. Social media organic posts don’t cost anything so even smaller companies can afford
them. And, social media paid campaigns are relatively cheaper than many other online
campaigns. Social media reaches all demographics. According to MarketingSherpa, online
adults aged 18-34 are the most likely to follow a brand via social networking (95%).

Social media users are active, and sharing among their friends and family can help you reach
a wider audience. Users who regularly engage with your brand become loyal, which increases brand
loyalty and value. Social media humanizes your brand. Apart from all these, it also increases your
website traffic, generates leads, boosts sales, helps you reach influencers and improve your
visibility, helps you promote your content, helps you tackle negative comments about your
brand, gives you a medium to understand the sentiment of your customers, and helps you keep
an eye on your competitors.

Q14: What is the best time to post on social media?


Ans: It depends upon the social media platforms.
Facebook
The best time to post on Facebook is 9 am to 3 pm on weekdays.
Sunday has the least engagement.
Wednesday is the best day to post on Facebook.
Instagram
The best time to post on Instagram is from Tuesday to Friday, 10 am to 3 pm.
Wednesday is the best day to post on Instagram.
The best times to post on Instagram are Wednesday at 11 am and Friday at 10-11 am.
Twitter
The best times to post on Twitter are Wednesday at 9 am and Friday at 9 am.
If you want consistent performance, post Monday to Friday from 8 am to 4 pm.
Saturday gets the worst engagement.
LinkedIn
The best times to post on LinkedIn are Wednesday 9 to 10 am and 12 pm.
The best day to post is Wednesday.
From Tuesday to Friday, 8 am to 2 pm, you can expect good engagement.

Long Answers

Q1: What is Facebook Analytics 

Ans: Facebook defines Facebook Analytics as the "people-first analytics for an omnichannel
world." It offers automated insights into where and how people interact with a business across its
website, app and Facebook page so that marketers can optimize and grow their business.
Facebook Analytics is accessible via facebook.com/analytics or the Facebook Analytics app.

Facebook’s dominance in the social media space is part of the reason it has become one of the most
efficient ways to generate an audience for your business, whether it’s in content creation or the
restaurant industry. This is why one of the keys to growing your business in the modern age is
appreciating how to use Facebook and what role it can play, whether you’re starting a new
venture or looking to scale up.

Facebook Analytics is the central pillar for how businesses can effectively use Facebook, offering
users the capacity to make informed decisions around marketing strategies by using a rich pool
of customer data. The information provided by users interacting with your posts and content, from
their age, location and gender to their interests, allows you to truly understand your target
demographic and cater your strategy towards their needs.

While Google Analytics offers a sophisticated set of tools for businesses to manage the users
visiting their sites, the analytics capabilities offered by Facebook have quickly become a more
popular alternative, if not a complementary tool. It’s especially important for businesses with the
need to build a strong social media presence as part of their growth and marketing strategies.
Benefits of Facebook Analytics

This tool offers comprehensive insight into your page and audience. It's best suited for more
advanced marketers and takes time to learn. However, this is time well invested, as Facebook
Analytics can come with big rewards, especially for businesses that use Facebook as a channel
to push sales.

Here are some of the key benefits of Facebook Analytics:

Funnel visualization

Facebook Analytics allows you to visualize your entire sales funnel to analyze conversions for a
sequence of actions page visitors took, along with the time it took for them to complete those
actions.

Data visualization over time

Marketers can pull visually attractive reports, as opposed to just numbers in an Excel
spreadsheet. Most reports are for a 90-day period, but some users may be able to pull data as far
back as one or two years. The ability to view data over time is just one more way businesses can
get value from Facebook Analytics.

Access to the full picture

Instead of tracking data from organic and paid marketing separately, this tool gives users the full
picture of Facebook pages, Facebook pixels and apps all in one, which saves time and lets you
see everything in one place, so you don't have to hop back and forth from tool to tool.

Free tool

There are some really great tools on the market that come at a hefty price, but Facebook
Analytics is completely free to use. 

Data to help improve return on investment

Facebook Analytics data can help you grow your business and improve your return on
investment. For example, you can see the following metrics:

 How many views your page gets


 Which devices people use to access your page
 How long users stay on your page

This data can help you learn about opportunities for improvements with your Facebook marketing
strategy. For example, if page views are low, you may need to adjust your Facebook advertising
to drive more traffic to your page, or if users are not staying on your page long, you might want to
experiment with videos to see if they drive more engagement. If a lot of users are accessing your
page from mobile devices but you use images that have small text, you could increase the font
size.

Q2: What Is A Social Media Campaign? How Do Social Media Campaigns Work?
Ans: Social Media Campaign Definition: The execution of a planned social media marketing
campaign and social media advertising strategy to improve brand awareness, social media user
interaction, as well as business goals or KPI’s (key performance indicators), which are measured
through analytics and sales revenue outcomes. Coordinated marketing efforts that develop or
promote a specific business goal by using one or more social media networks. This requires
focus, targeting, and analysis when compared with typical social media marketing use.
How Do Social Media Campaigns Work
 Social. Being social and making a presence online means sharing valuable content.
Respond to any comments and messages you receive. Like others’ posts, host a digital
event, or anything that will increase interactions in your target market.
 Content. Sharing quality content across your social media marketing channel is very
important. It’s something you may want to consider developing yourself. Create well-
designed images and videos, white papers, press releases, blog posts or infographics.
You can grab the attention of your target market and make the most of your ad campaign
budget.
 Advertising. Most social media networks offer a range of paid advertising tools that can
help drive traffic right to your website. Some of the most common examples are Facebook
Ads, Twitter’s Promoted Tweets, and Pinterest’s Promoted Pins.
 Analytics. Measuring your data is key when making improvements. Social media
platforms offer plenty of valuable data when running paid campaigns. But you’ll also want
to set up and track how your visitors interact with your website, too!
How To Set Social Media Campaign Goals:
A successful social media campaign should focus on a particular business goal as it’s structured
as part of a marketing plan. Whether it’s increasing your following on Facebook, or driving sales
through Instagram.

Practical goals for a successful social media campaign might include:

 Getting direct feedback from your target audience


 Generate FB page messages
 Building an email marketing list of engaged consumers
 Improving brand engagement across social media networks
 Increasing website traffic and brand recognition
 Driving sales directly through your campaign advertisements
To get the best results, goals for social media campaigns must be specific and measurable.

Before the launch of your social media campaign, be sure to measure a baseline of your traffic
and your targeted metrics. In doing so, you’ll be able to track any changes in performance
throughout the duration of the campaign and thereafter.

It’s important to note that different audiences will have different preferences depending on which
social media platforms you use. You’ll want to make sure to pick the best platform suited to reach
your target audience.
Q3: List some security steps while using social networking.

Ans: Social networking sites like Facebook and Twitter can be a great way to connect with
friends. But there are some social networking safety tips you should always keep in mind.

 Manage your privacy settings. Learn about and use the privacy and security settings on
your social networking sites. They help you control who sees what you post and manage
your online experience in a positive way. You'll find some information about Facebook
privacy settings at the bottom of this webpage.
 Remember: once posted, always posted. Protect your reputation on social networks.
What you post online stays online. Think twice before posting pictures you wouldn't want
your parents or future employers to see. Recent research found that 70% of job recruiters
rejected candidates based on information they found online.
 Build a positive online reputation. Recent research also found that recruiters respond to
a strong, positive personal brand online. So demonstrate your mastery of the environment
and showcase your talents.
 Keep personal info personal. Be careful how much personal info you provide on social
networking sites. The more information you post, the easier it may be for someone to use
that information to steal your identity, access your data, or commit other crimes such as
stalking.
 Protect your computer. Security starts with protecting your computer. Install antivirus
software. Keep your operating system, web browser, and other software current. You can
use the Pitt Software Update Service to automatically download the latest security updates
for Windows.
 Know what action to take. If someone is harassing or threatening you, remove them
from your friends list, block them, and report them to the site administrator.
 Use strong passwords. Make sure that your password is at least eight characters long
and consists of some combination of letters, numbers, and special characters (for
example, +, @, #, or $).
 Be cautious on social networking sites. Even links that look like they come from friends can
sometimes contain harmful software or be part of a phishing attack. If you are at all
suspicious, don't click it. Contact your friend to verify the validity of the link first.

Q4: What are the Different Types of social media?

Ans: Seeing the popularity and power of Social Media Channels, businesses and marketers look
for different types of Social Media networks that they can use to target and convert their
audiences.
Most people are only aware of social channels like Facebook, Twitter, Snapchat, and
Instagram.
Over 88% of the companies are now marketing on Social Media.
But for a marketer or any brand, many other types of Social Media channels are there to explore,
as they play a significant role in targeting and converting prospects.
Different Types of Social Media Networks
1. Social Networks: Facebook, Twitter, LinkedIn
Such types of Social Media are used to associate with individuals (and brands) on the web. They
help your business via branding, social awareness, relationship building, customer service, lead
generation, and conversion.
You can channelize different types of Social Media campaigns on these networks that will help
you widen your reach. Some of the benefits of these Social Marketing Networks are-
(i) They encourage individuals and businesses to interact online and share data and thoughts for
ensuring mutually productive relationships
(ii) In case you are searching for the best ways to optimize current marketing campaigns then you
will discover a variety of organic and paid ways to do this on Facebook, Twitter, and LinkedIn
sorts of social networks.
2. Media Sharing Networks: Instagram, Snapchat, YouTube
Media sharing types of Social Media are used to find and share photographs, live video, video
and other kinds of media on the web.
They are also going to help you with brand building, lead generation, targeting and so on. They give
individuals and brands a place to discover and share media so that target audiences can be
reached and converted in the most convincing and result-driven way possible.
Social networks nowadays also offer these features, however, for Media Sharing Networks,
sharing of media is their basic role.
(i) Starting with image or video on Instagram, YouTube and Snapchat types of media sharing
networks would be more beneficial for you.
(ii) To decide whether you should use these networks for your business or not, you should
consider your resources and target audiences. These channels will help you run well-planned
campaigns to generate leads and widen your audience base.
3. Discussion Forums: Reddit, Quora, Digg
Such types of Social Media channels are used for finding, sharing and discussing different kinds
of information, opinions, and news.
They help businesses by being a top-notch resource for doing immaculate market research.
These forums are the oldest ways of running Social Media Marketing campaigns.
Before the entry of popular Social Media players like Facebook, these forums were the places
where professionals, experts and enthusiasts used to do different kinds of discussions
concerning a variety of fields.
(i) These discussion forums have a massive number of users and it ensures unprecedented
reach for your business. These are the places that provide the answers to different queries of any
domain.
(ii) In case your business needs deep customer research, then these places would be the most
fitting ones for your business.
(iii) Along with sharing information and knowing answers, these places are very impactful in
advertising as well.
4. Bookmarking & Content Curation Networks: Pinterest, Flipboard
Opting for such types of Social Media will help you find out, share, discuss and save a variety of
latest content and media that are trending as well.
They are very helpful in channelizing brand awareness for your business, plus, choosing this one
to run different types of Social Media Marketing campaigns will help you generate website traffic
and customer engagement.
In case you want to run out-of-the-box, highly creative campaigns that not only inform
your audience but also attract them, then this one is the best fit.
(i) To run a Social Media campaign on Pinterest, you need to have a site that is bookmark-
friendly. You should optimize headlines and images for the feeds that Bookmarking and Content
Curation Networks use for accessing and sharing your content.
(ii) Flipboard lets you create your own Flipboard magazine by using most engaging content and
then you can showcase that to your audiences.
5. Consumer Review Networks: Yelp, Zomato, TripAdvisor
Using Customer Review networks will help you find out, share and review different information
about a variety of products, services or brands.
When a business has positive reviews on these networks, their claims turn more credible
because reviews on these networks act as Social Proof.
For running successful Social Media Marketing campaigns, it is very important for today’s
businesses to have positive reviews on these sites.
In addition, resolving all the issues that your customers are posting on these Review platforms is
another thing that is going to be very important for the positive and productive outcomes for your
business.
(i) These networks offer a place to users for reviewing different kinds of products and services
that they have used.
(ii) Review content adds great value to any brand because it will influence more new buyers
to try your services.
(iii) Yelp and Zomato are the types of social media platforms that offer location-based review
services that will help you run location-based social campaigns.

Q5: What is AdWords and how to use it? How Does Google AdWords Charge?
Ans: Google AdWords is a pay-per-click online advertising platform that allows advertisers to
display their ads on Google’s search engine results page. Based on the keywords they want to
target, businesses pay to get their advertisements ranked at the top of the search results page.
Since the platform runs on pay-per-click (PPC) advertising, you have to pay only when a visitor
clicks your ad.

The Google AdWords marketplaces work like an auction where people bid for clicks. However,
it’s not necessary that the highest bid wins. Apart from money, Google also considers the quality
score to ensure that the people clicking on the ads have the best possible experience.
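
To illustrate the auction idea in a deliberately simplified form, the sketch below (in TypeScript) ranks hypothetical advertisers by bid multiplied by Quality Score; real Google Ads auctions factor in more signals, so treat this purely as a conceptual sketch showing why the highest bid does not automatically win.

// Simplified ad-auction sketch: rank advertisers by bid x Quality Score.
// All bidders and numbers are hypothetical.
interface Advertiser { name: string; maxCpcBid: number; qualityScore: number; }

const bidders: Advertiser[] = [
  { name: "Advertiser A", maxCpcBid: 4.0, qualityScore: 4 },
  { name: "Advertiser B", maxCpcBid: 2.5, qualityScore: 9 },
  { name: "Advertiser C", maxCpcBid: 3.0, qualityScore: 6 },
];

const ranked = bidders
  .map((a) => ({ ...a, adRank: a.maxCpcBid * a.qualityScore }))
  .sort((x, y) => y.adRank - x.adRank);

ranked.forEach((a, i) => console.log(`#${i + 1} ${a.name} (Ad Rank ${a.adRank})`));
// Advertiser B outranks higher bids thanks to a better Quality Score.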

To use Google AdWords, follow the steps below:

1. Establish your account goals. For example, if you are using your Google Ads for brand
building, the account structure and the features that you use will be completely different if
you use ads for lead generation
2. Develop audience personas by determining who your ideal customers are, what they do,
what are they searching for and on what device
3. Conduct keyword research by using keyword tools, such as SEMrush, to discover the cost,
competition, and volume of the search terms at every stage of your search
4. Structure your AdWords account into different ad campaigns and ad groups, each
featuring relevant keywords and ads
5. Once you’ve listed the keywords relevant to your business, you can place your ads in the
search results by bidding on the keywords. If the competition is high for the keywords, your
Cost Per Click (CPC) would be too high to bid. In this case, it’s better to get granular and
bid for long-tail keywords that are relevant for business
6. Create the ad copy. Make sure that you include relevant keywords, a compelling headline,
a clear call-to-action and ad extensions
7. Design a mobile-friendly landing page that focuses on the benefits and features of the
product or service that you’re trying to sell, has good-quality images, a form and a clear
call-to-action
8. Place a Google Analytics code on the website for conversion tracking
9. The key to a successful ad campaign is routine optimization and A/B testing all your ad
copies and landing pages

The amount that Google AdWords charges advertisers depends on what they are advertising.

Since Google AdWords is a pay-per-click advertising program, your ads are displayed for free
and you’re charged only when someone clicks on your ad on Google search results page. Also,
the AdWords system is a live auction; therefore, the click prices are determined by the amount of
competition and how much advertisers are willing to pay for a click.

When done correctly, Google AdWords can drive high-quality traffic to the website at costs that
are much more competitive as compared to other forms of advertising.

However, when you don’t know how to expertly manage the process, costs can rack up fast while
you potentially drive low-quality traffic. The key to running a successful AdWords campaign is to
understand the factors that play into how much each click costs you.

 Keyword competition
 Maximum bid and bid position
 Your average monthly budgets
 Click-through rates
 The quality score of your keywords
If you’re targeting high-volume keywords with lots of monthly searches, you could be paying a
hefty amount for that traffic, which can be anywhere between a few cents to over ten dollars for
each click.

To manage your AdWords costs, set a daily budget at the campaign level. You’re free to make
changes to this when you like. Ideally, beginning advertisers should start small with a low budget.
Based on the insights and the quality of leads, you can determine whether you want to boost your
budget or stop a campaign.

Q6: How to use competitive benchmarking to grow social presence?


Ans: Through benchmarking, you can better understand where your company is winning and
uncover opportunities to improve your content strategy. This guide will break down exactly how.

Simply put, benchmarking means comparing the performance of something against a standard.

In marketing, competitive benchmarking is the process of measuring your company’s
campaigns versus your competitors’. By tracking specific metrics and KPIs, you can
establish realistic benchmarks to assess your performance.

Benchmarking is common practice in business. For example, a company might practice internal
benchmarking within a department so a set number of tasks are completed per week (think: sales
calls). This standard encourages accountability and likewise gives employees a goal to work
toward.
The same rings true when it comes to benchmarking on social media. Rather than second-
guess whether a campaign or piece of content performed well, you can look at your benchmark
and get a data-driven answer as to how it stacks up to similar efforts by competitors.
And, just to drive home the relevance of competitive data to social marketers: In our recent
survey of 250 business executives, 60% identified investing more resources into social media
as a key way to gain a competitive advantage. Read on to learn how you can start creating
benchmarks for your brand.
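
As one simplified way of turning benchmarking into numbers, the sketch below (in TypeScript) computes an average engagement rate per post for your brand and for a competitor from hypothetical post data; the field names, formula and figures are assumptions for illustration, not data pulled from any real platform.

// Simplified sketch: benchmark average engagement rate per post vs a competitor.
// Engagement rate here = (likes + comments + shares) / followers, per post.
interface Post { likes: number; comments: number; shares: number; }

function avgEngagementRate(posts: Post[], followers: number): number {
  if (posts.length === 0) return 0;
  const total = posts.reduce((sum, p) => sum + p.likes + p.comments + p.shares, 0);
  return total / posts.length / followers;
}

// Hypothetical data for illustration only.
const ourPosts: Post[] = [
  { likes: 120, comments: 14, shares: 9 },
  { likes: 80, comments: 6, shares: 4 },
];
const rivalPosts: Post[] = [
  { likes: 300, comments: 25, shares: 30 },
  { likes: 210, comments: 18, shares: 12 },
];

const ours = avgEngagementRate(ourPosts, 10_000);
const rival = avgEngagementRate(rivalPosts, 25_000);
console.log(`Our average engagement rate:        ${(ours * 100).toFixed(2)}%`);
console.log(`Competitor average engagement rate: ${(rival * 100).toFixed(2)}%`);
console.log(ours >= rival ? "At or above the competitive benchmark" : "Below the competitive benchmark");
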
Q7: What Are the Different Types of Web Traffic?
Ans: "Web traffic" refers to the visits that your website receives. Not all visits are created
equal. Knowing the different traffic sources that bring people to your site will help you
understand how to improve it.

To properly analyze web traffic, you must be able to distinguish between the total number of
visits during a given period and the number of unique visitors, since the same person may
visit the site several times. 

You should also take into account factors related to the quality of the visit, such as duration or
number of page views.

1. Organic Traffic

Organic traffic is the number of visitors who enter a website after doing a search on Google
or other search engines and clicking one of the links on the results page.

This type of web traffic can account for a very significant percentage of visits in the long run.
To get more organic traffic, it is necessary to apply search engine optimization
(SEO) techniques.

It is often said that organic traffic is free, but this is not entirely true. Although you are not
paying directly for each click, optimizing your website for SEO involves an initial investment of
time and resources as well as some maintenance.

2. Direct Traffic

Direct traffic includes visitors from several different origins:

 People who have directly typed the URL of your website into their search bar.

 People who have saved your website's URL in their favorites and have arrived
through it.

 People who have clicked on a link in a non-indexed document or in an opened email
using email software.
In order not to lose potential direct traffic, your URLs should be clean, simple, and easy to
remember. You can also invite users to bookmark your site for future visits.

 3. Referral Traffic

This type of web traffic refers to people entering a website by clicking on a link from another
site like a blog or a forum.

Increasing referral traffic involves participating in active link building activities, like guest
blogging or submitting your site to directories. However, it's crucial to take into account
Google’s policies on links to avoid possible penalties.

4. Email Marketing

If you’re doing email marketing campaigns, you can measure their success by tracking how
many visitors come to your site through your messages.
Email marketing management programs provide plenty of information about your delivery rate,
opening rate, clicks on links, total clicks, unique clicks, etc.

To improve web traffic from email marketing, we recommend two best practices:

 Make A/B tests with different versions of the same email to optimize factors such as
subject, images, or the time when the emails are sent out.
 Use email marketing automation solutions to manage the whole process more
efficiently.

5. Social Networks

This traffic source refers to visitors who arrive after clicking on a social media post. You can
distinguish between the different social networks and you can go deeper into the data on clicks
and interactions.

6. Paid Media

This type of traffic refers to visitors who arrive after clicking on a pay-per-click ad on a social
network. 
Social media ads (for example, Facebook Ads or TikTok Ads) can effectively attract visitors to
your site and give you statistics on how users behave when they arrive there. Social ad
platforms also provide you with a lot of data about your campaigns, like information about
demographic and user interests.
 

7. Paid Search

This category would include users who come to your website after clicking on an ad
from Google Ads or other PPC platforms.
Remember that these platforms incorporate different types of ads and locations. For example,
with Google Ads you can place an ad in search results, launch a campaign on YouTube, or
place banners on third party sites, among others. In order to properly evaluate the results, you
will have to distinguish between different types of campaigns.

Search engine ads are a very effective way of getting short-term traffic and an excellent
complement to organic positioning or SEO strategies. To optimize results, pay close attention
to keywords and location targeting.
 

8. Offline Traffic

So far, all the types of traffic we have mentioned come from digital channels. But it's also
possible that visitors have come to your website from offline sources. Some web analytics
programs, such as HubSpot, allow you to identify the traffic that has reached your site through
these channels.
 

9. Other Campaigns

Finally, you may be able to identify traffic coming from web campaigns that do not exactly fit
into any of the types of web traffic we’ve discussed so far.

To do this, we recommend creating tracking URLs to associate each campaign to a unique
URL redirected to a landing page. You can do this using HubSpot. Tracking URLs allow you to
filter traffic according to unique URLs and attribute it to the correct campaign.
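
As a hedged illustration of the tracking-URL idea, the sketch below (in TypeScript) builds a campaign URL by appending the standard UTM parameters (utm_source, utm_medium, utm_campaign) to a landing-page address; the landing page and campaign values are placeholders, and the exact parameters should match whatever your analytics tool expects.

// Minimal sketch: build a tracking URL with UTM parameters so visits can be
// attributed to a specific campaign. All values below are placeholders.
function buildTrackingUrl(
  landingPage: string,
  source: string,   // e.g. "newsletter", "facebook"
  medium: string,   // e.g. "email", "paid-social"
  campaign: string  // e.g. "spring-sale"
): string {
  const url = new URL(landingPage);
  url.searchParams.set("utm_source", source);
  url.searchParams.set("utm_medium", medium);
  url.searchParams.set("utm_campaign", campaign);
  return url.toString();
}

console.log(buildTrackingUrl("https://www.example.com/offer", "newsletter", "email", "spring-sale"));
// -> https://www.example.com/offer?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale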

Q8: What is Google Website Optimizer? How to Use the Website Optimizer Tool in
AdWords?

Ans: Google Website Optimizer is designed for testing two or more similar Web pages to see
which performs best. If you rely on your company's website for lead generation with sign-up
forms, or if you sell products or services online, the Website Optimizer will let you compare
different page designs to see which results in the most sign-ups or purchases. In 2012, Google
began integrating the Website Optimizer into Google Analytics and planned to rename it Google
Analytics Content Experiments.
Signing Up and Preparation
1 Open a new Web browser window and go to the Google Analytics website (link provided in
Resources). Create an account if you don't have one already and follow Google's instructions to
place the analytics code on your website. You may have to wait up to a day for Google to verify
your code before proceeding to the next step. Google Analytics tracks traffic to your website
pages and gives detailed information about your viewers.

2 Decide which website page you want to experiment with. For example, if you have a sales
landing page or an email subscription sign-up page, you may want to change the text, images,
fonts or color scheme.

3 Decide if you want to test one new design of the existing page as an A/B experiment, or
several different designs simultaneously as a multivariate experiment. Google recommends
using an A/B test if your existing page gets less than 1,000 visitors each week. Without
sufficient traffic, it could take several weeks for your pages to get enough visitors for you to
make an informed decision about which design works best.

4 Create the new page design and upload it to your Web server. Make a note of its URL.

Using Website Optimizer


1 Log in to your Google Analytics account and go to the Website Optimizer page (see
Resources). Alternatively, click your website name in the Google Analytics home page and then
click the "Standard Reporting" tab at the top of the page. Click the "Content" button in the left
menu and select "Experiments."

2 Click the "Create a New Experiment" link. Select either the "A/B Experiment" link or the
"Multivariate Experiment" link, depending on which type of test you decided on.

3 Follow the instructions provided by Google for your experiment. You will need to enter the
URLs of your Web pages into the Optimizer, then copy and paste JavaScript code into your
Web pages. Google will verify that the code is in place correctly before you can begin the
experiment.

4 Provide links to your original page and the new test pages using your normal methods, such
as online ads, social media or email newsletters.

5 Log in to your Google Analytics account after at least one day and go to the Website
Optimizer experiment page. Click on the experiment you created to see a report. The report
compares traffic and clicks on each page to determine which works best. The report will also
inform you if there is not yet enough data gathered to make a determination, in which case you
should continue running the experiment.
Q9: Explain some Limitations and Advantages of Google Analytics.

Ans: CONS OF GOOGLE ANALYTICS

1. Ad blockers render analytics useless!

Popular ad blockers, privacy features in browsers such as Firefox, and some antivirus software
mean that millions of people won't even be counted in your Google Analytics. There is no
definitive estimate of how many people will be missed, but it is definitely something to bear in
mind when choosing any analytics software.

2. Time consuming to learn how to use!

If you plan on integrating Google Analytics into your marketing metrics and campaigns, then you
have to understand, at least at a basic level, how it works. This takes time, and if you own your
own business there is little free time to spare!

The time, effort, and experience needed to analyse all the different optional metrics are
considerable, and learning to 'use this language' can be difficult. There are resources available,
such as the Analytics Academy, which has four easy-to-use, step-by-step courses ranging from
beginner to advanced, so if you have some free time I would definitely recommend it.

3. Too much data!

For many businesses and business owners there is simply too much information collected on
website performance. For most site owners, the amount of data Google Analytics collects is
overkill. It's a powerful but complex tool that takes time to understand and requires training. There
are around 300 different metrics available, and many users rely on only a tiny number of them
while ignoring the rest of the data.

4. Questions about reliability

Though Google Analytics is a reliable source for the majority of the data collected, there can be
inconsistencies. Spam and bot traffic are often reported as referrals and cannot easily be
removed, which skews the data and makes the overall analysis less accurate and less reliable.

Another issue is that Google is in effect reporting on Google's own performance, meaning it's not
entirely independent. Acquisition reports show how well Google's search engine drove users to
the site, and the validity of this particular metric is questionable for obvious reasons.

5. Lack of complex long-term performance tracking

If you want to analyse behaviour or evaluate long-term performance, then Google Analytics might
not be the right source of metrics for you. It struggles to identify which blog post attracted
long-term subscribers or leads; similarly, if you want to track and analyse behaviour over months
or years, it does not readily produce results at that level.

Pros of Google Analytics

1. It's completely free!

Google Analytics is free of charge, so anyone with a website, blog, or any other kind of online
presence can use it. However, if you have more than 5 million impressions per month you will be
required to upgrade to a Google Analytics 360 account, which is a price to consider if you are on
a budget.

2. It can be customised to you!

Like other marketing tools, you can customise your Google Analytics with custom dashboards,
adding the widgets that are most useful and beneficial to your particular goals and value
measures. This does mean you tend to overlook the other features and metrics that Google
tracks, but it is easier to streamline your process and make better decisions instead of facing
huge amounts of metrics you never use.

3. Multi-platform usage

You can use Google Analytics on any device (mobile, tablet, or laptop) as long as you have an
internet connection. This is perfect for busy businesses and owners who want to check metrics
on the go and make sure everything on the website is running smoothly.

Another added benefit is you can connect your Google Analytics account with your Google
Ads account. This will streamline your entire process and make the measuring and analysing of
your data much faster and easier.

4. The best at monitoring website performance!

The dashboards in Google Analytics are great for monitoring how your website is performing,
showing aspects such as active visitors and user locations. You can see who is landing on your
website and how they navigate between pages or posts, even how long they stay on each page.
It will track the source and medium down to the social campaign, collecting landing page
information and measuring if they met a goal such as making a purchase or subscribing.

You can see your website audience, audience behaviour, website acquisition, and conversion
metrics, as well as a real-time view of the audience on your site. If you want to confirm your
website is loading smoothly for everyone, monitor the amount of traffic, and quickly detect any
bugs, then it is the ideal tool to use.

Another forgotten benefit is that anyone, anywhere can view your Google Analytics account and
understand how your website is performing. This is particularly attractive for business owners
working with third parties overseas such as international clients or agencies.

5. Increased visibility

The smart interface neatly displays information in a collection of graphs, tables and charts,
helping you digest the information quickly and easily at just a glance. This increased visibility also
means that the tool is easily shareable and digestible by other users, should you need a third
party to take a look at the account.

Google Analytics is clearly a valuable tool that will help you understand and monitor your website
24/7 for free, alongside tracking your own goals and traffic measures, and there are plenty of
online learning resources available.

MCQ’s

1. Can you choose your target group using Facebook ads?

Answers: 
• Yes
• No

2. What is the name for Facebook's ranking algorithm?

Answers: 
• Like Rank
• Face Rank
• Page Rank
• Edge Rank

3. Which of the following is NOT considered in Facebook's engagement metric?

Answers: 
• Share
• Comments
• Likes
• Views

4. Which of the following best describes third-party apps interacting with Facebook?

Answers: 
• AoF calls
• AFI calls
• API calls
• alk3 calls

5. True or False? Mobile users can see a sponsored story in their newsfeed.

Answers: 
• False
• True

6. True or False? On Facebook, you can sponsor a post to a specific audience by age, gender, and location.

Answers: 
• False
• True

7. What is the name of Facebook’s analytic package?

Answers: 
• Princeps
• GlassDoor
• Discovery
• Insights

8. True or False? Cover photos are crucial to establishing respectable company pages.

Answers: 
• False
• True

9. What does NFO stand for?

Answers: 
• Network/Feed Organization
• Novel Feature Orientation
• News Feed Optimization
• No Fee Operation

10. True or False? Half of Facebook page visits come from mobile devices.

Answers: 
• FALSE
• True

11. What does CTR stand for?

Answers: 
• Client/Thought Relation
• Click Through Rate
• Cost Times Response
• Cost Through Recession
12. True or False? Once a budget is set it cannot be changed.

Answers: 
• False
• True

13. True or False? Facebook supports hashtags.

Answers: 
• True
• FALSE

14. True or False? Facebook and Twitter profiles can be connected.

Answers: 
• True
• False

15. True or False? It is possible to embed single Facebook posts in an external blog or
website?

Answers: 
• True
• False

16. What button can be installed on outside webpages to allow consumers to become a
fan of your page?

Answers: 
• Like Button
• Feed Button
• F - Button
• Fan Button

17. How can I get more people to share my posted content?

Answers: 
• (all of these)
• Offer an incentive to users for sharing
• Include an image with your Facebook status update
• Use a Call to Action in the post

18. What allows users to share their location on Facebook?

Answers: 
• Pin-Points
• Check-Ins
• Shout-Outs
• Four-Squares

19. Facebook users spend the most time (40%) on which page?
Answers: 
• News Feed
• Chat Windows
• Collective Friend's Profiles
• Own Profile

20. True or False? Long, rich, and detailed posts get more likes/shares than short or
medium length posts.

Answers: 
• True
• False

21. Which of the following is most likely to get attention on Facebook?

Answers: 
• .GIFs
• Photos
• Text pitches
• Facebook page shares

22. True or False? Facebook's analytic tool shows the names of people who have visited
your page.

Answers: 
• True
• False

23. Where can you see statistics of your posts on Facebook?

Answers: 
• Analytics
• See who like
• Insights
• Settings

24. What symbol allows users to 'tag' other users or companies?

Answers: 
• >
• @
• #
• *

25. What symbol does Facebook use to mark verified pages?

Answers: 
• The letters "OK"
• A star
• A check mark
• A hand giving a thumbs-up signal
26. What is the name of Facebook's Analytics tool?

Answers: 
• Facebook Analytics
• Edge Rank
• Page Rank
• Insights

27. Which of the following is *NOT* a type of promotion offered by Facebook Offers?

Answers: 
• Online
• In Store
• Traditional Media

28. True or False? A hashtag on Facebook should be unique from hashtags you use on
other social networks.

Answers: 
• False
• TRUE

29. What is CTR?

Answers: 
• Cost Through Rate
• Click Through Rate
• Click Table Rate
• Cost Through Rates
• Clicks Through Rate

30. TRUE or FALSE? Integrating apps and third party websites with Facebook does *NOT*
require creating a FB application.

Answers: 
• FALSE
• TRUE
Unit-5
Short Answers
Q1: What is web analytics 2.0?
Ans: Web Analytics 2.0 is (1) the analysis of qualitative and quantitative data from your website
and the competition, (2) to drive a continual improvement of the online experience that your
customers and potential customers have, (3) which translates into your desired outcomes (online
and offline).
Q2: Explain competitive intelligence.
Ans: It is an ethical process for obtaining information on the competitive environment for use in
organizational decision making. As a result, competitive intelligence collection and analysis has
both tactical and strategic importance for companies. 
Q3: What is panel Data?
Ans: Panel data is another well-established method of collecting data. To gather panel data, a
company may recruit participants to be in a panel, and each panel member installs a piece of
monitoring software. The software collects all of the panel's browsing behaviour and reports it to
the company running the panel. Additionally, each panellist is required to self-report demographic
details, salary, household members, hobbies, education level, and other such information.
Q4: Explain ISP (Network) Data.
Ans: We all get our internet access from Internet Service Providers (ISPs), and as we surf the
Web, our requests go through the servers of these ISPs, where they are stored in server log files.

The data collected by the ISP consists of elements that get passed around in URLs, such as
sites, page names, keywords searched, and so on. The ISP servers can also capture information
such as browser types and operating systems.

Q5: Define Search Engine Data.


Ans: Our queries to search engines, such as Bing, Google, Yahoo!, and Baidu, are logged by
those search engines, along with basic connectivity information such as IP address and browser
version. In the past, analysts had to rely on external companies to provide search behavior data,
but increasingly search engines are providing tools to directly mine their data.

You can use search engine data with a greater degree of confidence, because it comes directly
from the search engine (doh!). Remember, though, that the data is specific to that search engine,
and because each search engine has a distinct user base, it is not wise to apply lessons from
one to another.

Q6: What is Self-reported Data?


Ans: It is common knowledge that some methods of data collection, such as panel-based
methods, do not collect data with the necessary degree of accuracy. A site's own analytics tool may
report 10 million visits, and the panel data may report 6 million. To overcome this issue, some
vendors, such as Quantcast and Google’s Ad Planner, allow websites to report their own data
through their tools.
Q7: What is a traffic trend?
Ans: Great for deciding when to leave work. Specify a start address and a destination address,
and Traffic Trend keeps track of the traffic conditions between the two locations. The
current travel time is always displayed in a badge on the extension's icon, and the icon changes
color based on the current traffic trend.

Long Answers:
Q1: How traffic Trends works?
Ans: Traffic trend analysis tracks and records the movement and action trends of your site's
visitors, providing you with all the information you need to follow and analyze the success of your
website!
More so than ever before, internet users around the world are holding websites accountable to
high expectations. They know what they want, and expect to find it quickly, easily, and
conveniently. Is your page performing well? Awesome, use traffic trend analysis to see exactly
how well it's working, why it's working that well, and how you can apply that success to other
parts of your site. Got a page that's not living up to your goals? No problem. Traffic trend
analysis will give you what you need to dissect the problems on the page, make changes, and
gauge your results.
They say that to know where you're going, you have to know where you've been. Well, with traffic
trend analysis, you can see where your users have been AND where they've gone. Is your
website's intent to convert traffic into sales? Traffic trend analysis can help you ensure that your
content and navigation are properly funneling visitors toward that goal. Is your site's goal simply to
entertain or inform? Use traffic trend analysis to see how long visitors stay, what pages they stay
on, and where they go next. Traffic trend analysis provides you with valuable, usable information
about your visitors, and in today's information age, the more you know, the better.

The trafficTrends feature improves recommended routes on Garmin devices by referencing
trends in traffic flow. This helps to provide more accurate time-of-arrival estimates as well as
alternative routes to the destination depending on the time of day and the day of the week. You
do not need an active traffic subscription in order to use this feature.

Garmin has managed to improve this feature by constantly collecting historical traffic data which
is delivered to the device via map updates.

As an example of how this feature works, consider a route provided to the user at 2 PM on a
Thursday versus the same route at 5 PM on the same day, when traffic is more prevalent. In the
latter case, the user is provided with a different route based on historical traffic in that area.

Q2: What Is Competitive Intelligence? How Competitive Intelligence Works?


Ans: Competitive intelligence, sometimes referred to as corporate intelligence, refers to the
ability to gather, analyze, and use information collected on competitors, customers, and other
market factors that contribute to a business's competitive advantage. Competitive intelligence is
important because it helps businesses understand their competitive environment and the
opportunities and challenges it presents. Businesses analyze the information to create effective
and efficient business practices.

 Competitive intelligence refers to the ability to gather and use information on factors that
affect a company's competitive advantage.
 Organizations analyze collected data and information to develop effective and efficient
business practices.
 Competitive intelligence can be classified as myopic-oriented, tactical intelligence, or
long-term focused strategic intelligence.
 Gathering data and information is more complex than conducting a simple Internet
search.

Competitive Intelligence Working

By definition, competitive intelligence assembles actionable information from diverse published
and unpublished sources, collected efficiently and ethically. Ideally, a business successfully
employs competitive intelligence by cultivating a detailed enough portrait of the marketplace so it
may anticipate and respond to challenges and problems before they arise.

Competitive intelligence transcends the simple cliché "know your enemy." Rather, it is a deep-dive
exercise in which businesses unearth the finer points of competitors' business plans, including
the customers they serve and the marketplaces in which they operate. Competitive intelligence
also analyzes how a wide variety of events disrupts rival businesses. It also reveals how
distributors and other stakeholders may be impacted, and it telegraphs how new technologies
can quickly render every assumption invalid.
Within any organization, competitive intelligence means different things to different people and
departments. For example, to a sales representative, it may refer to tactical advice on how best
to bid for a lucrative contract. To top management, it may mean cultivating unique marketing
insights used to gain market share against a formidable competitor.

Q3: What is Website Traffic and how to interpret it?


Ans: Website traffic refers to web users who visit a website. Web traffic is measured in visits,
sometimes called "sessions," and is a common way to measure an online business's
effectiveness at attracting an audience.

When ecommerce took off in the 1990s, the metric of web traffic was first viewed as the most
important means of determining a website's popularity, as other metrics did not yet exist to gauge
online success. As digital marketers got savvier, analyzing a website's performance became
much more comprehensive.

Analysts no longer just ask "how many people visited?" Now, it's just as — if not more
— important to find out:

 How long did users stay? Bringing in huge amounts of traffic is ultimately meaningless if
users leave after mere seconds. Metrics such as bounce rate and time on page paint a
picture of how users behave.
 What % of users made a purchase? For an online business to flourish, it needs a large
audience. But it also needs to be the right audience. Determining how many users buy
products, commonly measured by conversion rate, shows whether an ecommerce store is
effectively selling and marketing its product offerings.
 How much does it cost to bring in a visitor? Some web traffic is free, but many online
stores rely on paid traffic — such as PPC or affiliates — to support and grow their
business. Cost of Acquiring Customers (CAC) and Cost Per Acquisition (CPA) are
arguably the two most important ecommerce metrics. When balanced with AOV (average
order value) and CLV (customer lifetime value), a business can assess and adjust its ad
spend as necessary.

Website traffic is not the be all, end all of ecommerce performance measurements. But it is still a
great starting point to determine a website's popularity and visibility. Consider two contrasting
ecommerce underachievers:

a) Website A: Effective calls to action and concise yet eloquent product descriptions convert a
high percentage of visitors to sales, but they only bring in minimal traffic.

40 sales / 500 monthly visits = 8% Conversion Rate (CR)

b) Website B: Ranks highly in natural Google search listings, puts out well-received content, and
brings in paid advertising. They do outstandingly with web traffic, yet convert a minimal number of
visitors.

40 sales / 5,000 visits = 0.8% CR
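The arithmetic above can be reproduced in a couple of lines; the figures are the illustrative ones from the example, and conversion_rate is just a helper name used here.

```python
# Reproducing the worked example above: conversion rate = sales / visits.
def conversion_rate(sales, visits):
    return sales / visits

site_a = conversion_rate(sales=40, visits=500)    # Website A
site_b = conversion_rate(sales=40, visits=5000)   # Website B
print(f"Website A: {site_a:.1%}")   # 8.0%
print(f"Website B: {site_b:.1%}")   # 0.8%
```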

This example illustrates why marketing metrics such as web traffic cannot be viewed in a
vacuum. The two contrasting websites achieve the same outcome because each fails to
capitalize on what it does well. Focusing only on the one metric where they excel fails to
acknowledge the area for improvement. By studying the whole picture and optimizing areas of
subpar performance, ecommerce stores give their customers the best possible experience while
maximizing revenue.

Q4: How is website traffic actually recorded?

Ans: When someone visits a website, their computer or other web-connected device
communicates with the website's server. Each page on the web is made up of dozens of distinct
files. The site's server transmits each file to user browsers where they are assembled and formed
into a cumulative piece with graphics and text. Every file sent represents a single “hit”, so a single
page viewing can result in numerous hits.

It is not only the traffic on the website's homepage that is monitored. Rather, all segments of the
website are constantly monitored by the server to determine exactly how many hits each
receives. In web vernacular, a single visit is known as a “session”. The minutia of each session
varies, yet each has a beginning and an end point.

Servers are able to compile every request for a web page, arming its operator with the
information needed to determine how popular the site is and which pages receive the most
attention. When a web server processes a file request, it makes an entry in what is known as the
"server log" on the server's hard drive. The log gathers entries over time, forming a valuable
database of information that the site owner can analyze to better understand the website's
visitor activity.
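As a rough illustration of how hits are tallied from the server log, the sketch below parses Apache-style log lines and counts requests per file; the sample lines are invented for the example.

```python
# Minimal sketch: counting hits per page from Apache-style server log lines.
# The sample lines are made up for illustration.
import re
from collections import Counter

LOG_PATTERN = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

sample_log = [
    '203.0.113.5 - - [01/Mar/2023:10:00:01 +0000] "GET /index.html HTTP/1.1" 200 5120',
    '203.0.113.5 - - [01/Mar/2023:10:00:02 +0000] "GET /style.css HTTP/1.1" 200 880',
    '198.51.100.7 - - [01/Mar/2023:10:01:14 +0000] "GET /index.html HTTP/1.1" 200 5120',
]

hits = Counter()
for line in sample_log:
    match = LOG_PATTERN.search(line)
    if match:
        hits[match.group(1)] += 1   # every requested file counts as one "hit"

print(hits.most_common())   # [('/index.html', 2), ('/style.css', 1)]
```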

Q5: Why should we check our website traffic? Why Should You Check Your Competitor’s
Website Traffic? Explain Steps to follow while competitor site analysis.

Ans: By checking your website stats, you can easily see how your website is performing.

Your website traffic data will show you where your traffic is coming from, how visitors engage with
your site, and what digital marketing strategies are working.

If you want to get more email subscribers, more sales for your online store, or just more
traffic overall, then you need to regularly check your website analytics.

By tracking your site’s traffic, you’ll know where your site currently stands and what you can do to
improve.

Analyzing your competitor’s website traffic statistics can reveal a lot of helpful information such
as:

 The pages and posts bringing your competitors the most traffic

 Which keywords your competitors are ranking for

 The channels that are driving them the most traffic


All of this information can be used to improve your content marketing strategy, link
building, keyword research process, and more.

By understanding what brings your competition the most traffic, you’ll be able to target those
same keywords and topics to generate more traffic for your website.

There are a lot of free and paid traffic checking tools that you can use. Each one has unique
features that set them apart.

Most experts use multiple tools to check website traffic estimates for their competitors. We
always recommend that readers try at least two different tools to analyze web traffic stats.

By using various tools, you’ll be able to fill in the gaps and get more accurate traffic stats for any
website.

As your WordPress site and budget grow, you can invest in multiple tools to gain more insights
and dominate your market.

Step #1: Identify your top competitors

You probably already have a decent idea of the other big players within your niche. However,
the shortlist you have in your head might not match the data you uncover.

The best way to know who your top competitors are is to find out which websites share the most
‘audience overlap’ with you. To put it another way, ascertain the other sites your audience visits,
as they will also target many of the same keywords as you.

Step #2: Take a look at your competitor’s top content

Knowing what content drives the most traffic to your competitors is an excellent way to gain
insight into their strategy. You can learn what kind of posts they focus on, how in-depth their
content is, what keywords they’re targeting, and more.

In the real world, you'd need to conduct corporate espionage to find out this information. Online,
all you have to do is sign up for a free account using a service such as SEMRush.

Step #3: Look for keywords you have in common

As an addendum to step number two, it’s good to also look for keywords you have in common
with competitors. This gives you an idea of what content you already have that, if improved,
could bring in more traffic.

You can also use SEMRush to find that data. Log into your account and go to the Domain vs.
Domain tab on the left menu. Here, you can enter multiple domains so SEMRush can compare
them head-to-head.

Step #4: Identify keyword gaps you can target


Once you have a list of your competitor’s top keywords, one thing becomes evident quickly:
they’re probably targeting a lot of terms you aren’t. Those missed spots in your content strategy
constitute ‘keyword gaps.’ Your goal is to fill the gap with content that can rank, so you can also
get a slice of the organic traffic coming in.

However, since we’re talking about your competitor’s top keywords, they’re probably high
difficulty search queries. 
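As a minimal illustration of Steps 3 and 4, keyword overlaps and gaps reduce to simple set operations once you have exported keyword lists from a tool such as SEMRush; the keyword lists below are placeholders standing in for those exports.

```python
# Minimal sketch: common keywords vs. keyword gaps using set operations.
# The keyword lists are placeholders standing in for tool exports (e.g. CSVs).
my_keywords = {"web analytics", "bounce rate", "conversion rate"}
competitor_keywords = {"web analytics", "conversion rate",
                       "keyword research", "backlink checker"}

common = my_keywords & competitor_keywords   # keywords you both rank for (Step 3)
gaps = competitor_keywords - my_keywords     # keywords only the competitor targets (Step 4)

print("Common keywords:", sorted(common))
print("Keyword gaps to target:", sorted(gaps))
```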

Step #5: Analyze your competition’s backlink profile

We’ve talked a lot about keywords so far, but they aren’t the only important factor when it comes
to SEO. As you may know, backlinks are another signal search engines value highly. Knowing
which websites are linking to your competitors can give you a list of publications to target for
additional backlinks of your own.

Q6: Differentiate between Web analytics 1.0 and 2.0.


Ans: Web 1.0 –
Web 1.0 refers to the first stage of the World Wide Web's evolution. In Web 1.0 there were only a
few content creators, with the huge majority of users being consumers of content. Personal web
pages were common, consisting mainly of static pages hosted on ISP-run web servers or on free
web hosting services.
In Web 1.0, advertisements on websites while surfing the internet were banned. Ofoto, an online
digital photography website on which users could store, share, view, and print digital pictures, is
an example of a Web 1.0 site. Web 1.0 acted as a content delivery network (CDN) that enabled
the showcasing of pieces of information on websites. It could be used as a personal website, and
it charged users per page viewed. It had directories that enabled users to retrieve a particular
piece of information.
Four design essentials of a Web 1.0 site include:
1. Static pages.
2. Content is served from the server’s file system.
3. Pages built using Server Side Includes or Common Gateway Interface (CGI).
4. Frames and Tables are used to position and align the elements on a page.
Web 2.0 –
Web 2.0 refers to worldwide websites which highlight user-generated content, usability, and
interoperability for end users. Web 2.0 is also called the participative social web. It does not
refer to a modification of any technical specification, but rather to a change in the way Web
pages are designed and used. The transition is beneficial, but the change is not always obvious
while it is occurring. Web 2.0 allows interaction and collaboration in a social media dialogue, with
users as creators of user-generated content in a virtual community. Web 2.0 is an enhanced
version of Web 1.0.
Web browser technologies are used in Web 2.0 development, including AJAX and JavaScript
frameworks. Recently, AJAX and JavaScript frameworks have become a very popular means of
creating Web 2.0 sites.
Five major features of Web 2.0:
1. Free sorting of information, which permits users to retrieve and classify information
collectively.
2. Dynamic content that is responsive to user input.
3. Information flows between the site owner and site users by means of evaluation and
online commenting.
4. Developed APIs to allow self-usage, such as by a software application.
5. Web access extends beyond the traditional Internet user base to a wider variety of users.
Usage of Web 2.0 –
The social web contains a number of online tools and platforms where people share their
perspectives, opinions, thoughts, and experiences. Web 2.0 applications tend to interact much
more with the end user. As such, the end user is not only a user of the application but also a
participant, through the eight tools mentioned below:
1. Podcasting
2. Blogging
3. Tagging
4. Curating with RSS
5. Social bookmarking
6. Social networking
7. Social media
8. Web content voting

1. Social networks are organized primarily around ____.
(A) Brands
(B) People
(C) Discussion
(D) All of the above
Correct option is B

2. Which social network is considered the most popular for social media marketing?
(A) Twitter
(B) Whatsapp
(C) LinkedIn
(D) Facebook
Correct option is D

3. What is the name for Facebook's ranking algorithm?
(A) Like Rank
(B) Face Rank
(C) Page Rank
(D) Edge Rank
Correct option is D

4. Social networks have an enormous information sharing capacity. As such, they are a great
distribution channel for
(A) Customer feedback
(B) Viral content
(C) Exclusive coupon
(D) Marketing message
Correct option is D

5. Which social network is considered the most popular for business-to-business marketing?
(A) Twitter
(B) Whatsapp
(C) LinkedIn
(D) Facebook
Correct option is C
6. What is the term adopted for updates by
Twitter users?
(A) Tweets
(B) Twoots
(C) Twinks
(D) None of above
Correct option is A

7. What is meant by “guerilla marketing”?


(A) Using resources such as time, energy and
imagination rather than money to market
(B) Using advertising spots which utilize
gorillas to capture the audience
(C) Having a large scale marketing budget
(D) All of above
Correct option is D

8. Which of the following are functions of social media for business?
(A) Are you participating in the conversation
and sharing?
(B) Are you listening and monitoring what is
being said about you?
(C) Both
(D) None of these
Correct option is C

9. What is the name of Facebook's analytics package?
(A) Princeps
(B) Viewership
(C) Insights
(D) None of these
Correct option is C

1. The development of the ____________ led to the development of web analytics. 

a) Social media
b) Internet

c) Google Analytics

d) Insights

2. The first forms of analytics tools were built to monitor _____________.

a) Website links

b) User behavior

c) Social media

d) Interaction and conversions

3. Which of these is not an indicator or signal of a website that is meaningful and significant for
web analytics?

a) The gross quantity of visitors on a website or traffic quantum

b) The number of unique visitors or visitors who are new

c) The number of social media followers

d) The category of users searches for particular keywords or search terms

4. Which of these is an indicator or signal of a website that is meaningful and significant for web
analytics?

a) The number of users who signed up for a newsletter

b) The number of links you have on the home page

c) The number of links you post in blog articles

d) The number of users who go beyond the main page, to deeper links

5. The first web analytics tool was called _____________.

a) Google Ads

b) Google Analytics

c) WebTrends

d) Cookies
6. The first analytics tools shifted their focus to ______________ in order to monitor the
performance of websites.

a) User behavior

b) Search engine ads

c) Search engine results

d) External links

7. The first form of web analytics were _____________.

a) Spreadsheets

b) Log files

c) Infographics

d) Tables

8. JavaScript enables _______________.

a) Log files

b) Analytics

c) Page tagging

d) HTML

9. Page tagging enabled web analysts to find more information about the users, such as
____________.

a) If a user was a new guest or frequent visitor

b) User behavior

c) Analytics

d) Social media tags

10. The first free analytics software was introduced in 1995 under the name of ______________.

a) Google Analytics

b) URCHIN
c) HORSE

d) Bing Analytics

11. In order to avoid the analysis being a bunch of data without any meaning, you have to set up
_____________.

a) Google AdWords

b) Goals and objectives

c) Social media accounts

d) Traffic and conversions

12. Which of these is not a data segment?

a) Acquisition

b) Behavior

c) Outcomes

d) Banners

13. Web analytics helps you with different tasks, such as understanding the audience, which will
help you find out _________________.

a) How much time users spend on the website

b) Which pages are the most and the least visited

c) Which websites refer the most traffic to your pages

d) What can be done to increase conversions

14. Web analytics helps you with different tasks, such as determining the strengths and
weaknesses of your website, which will help you find out _________________.

a) How much time users spend on the website

b) Which pages are the most and the least visited

c) Which websites refer the most traffic to your pages

d) What can be done to increase conversions

15. Two major components of any web analytics process are:


a) Website and search engines

b) Analytics software and the analyst

c) Google Analytics and Bing Analytics

d) Users and conversions

16. The ‘Who’ aspect of web analytics helps you find out:

a) What kind of traffic your website attracts

b) How long the users stayed on your website

c) The data about referrals

d) The data about which keyword or phrase brought users to your website

17. The ‘Where’ aspect of web analytics help you find out:

a) What kind of traffic your website attracts

b) How long the users stayed on your website

c) The data about referrals

d) The data about which keyword or phrase brought users to your website

18. Metrics are ________________.

a) Also known as impressions

b) Indices used to gauge the performance of the website

c) The time the user took to browse a website in one go

d) The number of views of one particular page

19. User session is ________________.

a) Also known as impressions

b) Indices used to gauge the performance of the website

c) The time the user took to browse a website in one go

d) The number of views of one particular page


20. An impression is registered when a user ___________________.

a) Leaves your website

b) Clicks on the search engine ad

c) Requests a file and your server delivers it

d) Clicks on the call-to-action button

21. Some of the aspects to think about when choosing a web analytics tool are the following:

a) Ability to integrate

b) The costs

c) The need and features

d) The need, features, costs and ability to integrate

22. What is another name for proprietary software?

a) Open source software

b) Closed source software

c) On-demand software

d) Primary source software

23. What is another name for hosted software?

a) On-demand software

b) Primary source software

c) Open source software

d) Home source software

24. A web analytics dashboard presents _______________:

a) A snapshot of the lengthy analysis conducted using complicated tools

b) An outline of the data about referrals

c) An independent tool from the rest of the analytics


d) An irrelevant part of the web analytic tool

25. Having a web analytics dashboard helps you ______________.

a) Install web analytics tool

b) Identify and highlight the most important part of the analytics

c) Use several web analytics tool within one dashboard

d) Staying competitive

26. Which analytical tool uses Heatmap Technology?

a) Google Analytics

b) Crazy Egg

c) Clicky

d) Kissmetrics

27. Metrics are commonly divided into four categories. Which ones?

a) Visitor metrics, content metrics, goal metrics, and external metrics

b) Visitor metrics, website metrics, goal metrics, and traffic metrics

c) Visitor metrics, content metrics, goal metrics, and traffic metrics

d) Visitor metrics, content metrics, goal metrics, and social metrics

28. Which one of these is a goal metric?

a) Unique visitors

b) Entrance pages

c) Funnel visualization

d) Referring sites

29. Which one of these is a traffic metric?

a) Unique visitors

b) Entrance pages
c) Funnel visualization

d) Referring sites

30. The process allowing analytics experts to identify site referrals is called:

a) Reverse DNS Lookup

b) Key Performance Indicators

c) Complimentary channel surfing

d) Home source evaluation

ANSWERS

1. b 11. b 21. d

2. b 12. d 22. b

3. c 13. a 23. a

4. d 14. b 24. a

5. c 15. b 25. b

6. a 16. a 26. b

7. b 17. c 27. c

8. c 18. b 28. c

9. a 19. c 29. d

10. b 20. c 30. a


Unit-6

Q1: What is cloud computing?

Ans: Cloud computing is an internet-based, new-age computing technology. It is the next-stage
technology that uses the cloud to provide services whenever and wherever the user needs them.
It provides a method to access several servers worldwide.

Q2: What are the benefits of cloud computing?

Ans: The main benefits of cloud computing are:

o Data backup and storage of data.


o Powerful server capabilities.
o Incremented productivity.
o Very cost effective and time saving.
o Software as Service known as SaaS.

Q3: What is a cloud?

Ans: A cloud is a combination of networks, hardware, services, storage, and interfaces that helps
in delivering computing as a service. It has three types of users:

1. End users
2. Business management users
3. Cloud service providers

Q4: What is the difference between cloud computing and mobile computing?

Ans: Mobile computing and cloud computing are similar in concept. Mobile computing uses the
concept of cloud computing. Cloud computing provides users with the data they require, while in
mobile computing, applications run on a remote server and give the user access to storage and
management.

Q5: What is the usage of utility computing?

Ans: Utility computing is a plug-in managed by an organization which decides what types of
services have to be deployed from the cloud. It facilitates users to pay only for what they use.

Q6: What are the various data types that cloud computing uses?

Answer: With the exponential hike in data creation, the cloud offers a wide variety of unstructured
data types like hyperlinks, emails, images, multimedia data, contacts, blogs, etc.
Q7:  What are the different cloud deployment models?

Answer: The different cloud deployment models are –

a.  Public Cloud

b.  Private Cloud

c.   Hybrid Cloud

d.  Community Cloud

Q8: What is the benefit of the on-demand cloud feature?

Answer: Cloud computing gives its users on-demand access to virtualized IT resources. It
provides a shared pool of configurable resources that users can access as and when required.

Long Answers:

Q1:Discuss in detail about Roots of Cloud computing technology.

Ans: Cloud computing consists of multiple servers that host web services and data
storage. This technology allows companies to eliminate the need for expensive and
powerful computers.

The information and data of the company can be stored on low-cost servers, and workers can
easily access that data via a common network.

In the traditional system, the company owns and maintains physical hardware, which also costs
much more, while cloud computing delivers a virtual platform.

In a virtual platform, each server hosts applications and data powered by a separate provider.
Hence, you just have to pay them for the services.

Cloud computing has many applications like infrastructure management, application execution,
and data access management.

In addition, it supports central management of virtual systems, business intelligence tools,
desktop services, real-time processing, and rich content delivery.

To understand the roots of cloud computing, note that there are mainly four:
1. Internet Technologies
2. Distributed computing
3. Hardware
4. System management

First Root: Internet Technologies

The first root of cloud computing is Internet technologies, which covers service-oriented
architecture (SOA), Web 2.0, and web services.

Internet technologies are widely accessible to the public. People can access content and run
applications that depend on the network connection.

Second Root: Distributed Computing

The second root of cloud computing is distributed computing, which covers grids, utility
computing, and clusters.

To understand the second root, consider an example: a computer is like a general store, and
documents are stored in the form of files.

Each document stored on the computer has a specific location, either on the local hard disk or
over the internet.

Now, when someone visits your website over the internet, that person browses through the files
in the browser without downloading them.

This means users can access files at a specific location after processing; they can also send a
file back to the server.

Thus, it is known as the distributed computing root of the cloud. It is distributed in a manner that
lets people access it anywhere in the world.

Third Root: Hardware

The third root of cloud computing is hardware, which covers multi-core chips and virtualization.

When we talk about hardware for cloud computing, it is usually virtual, and people do not need to
buy it.

Generally, computers require hardware such as a CPU, RAM, ROM, and motherboard to
process, store, analyze, and manage data.

In cloud computing there are no local hardware devices or components, because the applications
are all managed via the internet.

If you are using a large amount of data, it becomes very difficult for your computer to manage the
constant increase in data.

The cloud, on the other hand, stores data on its own computers rather than requiring the
computer that accesses the data to hold it physically.

Fourth Root: System Management

The fourth root of cloud computing is system management, which covers data center automation
and autonomic computing.

The system management root handles the operations needed to improve the productivity and
efficiency of the system.

To achieve this, system management ensures all employees have easy access to all the
necessary information.

For that, employees can change configurations, obtain or resend information, and perform other
related functions from any location.

This makes it possible for the system admin to respond to any user demand instantly. Moreover,
the admin can restrict or deny access for different users.

In an autonomic system, the admin's work becomes easier, as the system is autonomic or
self-managing. Additionally, data analysis and monitoring are handled by sensors.

Based on that data, the system responds by performing various tasks such as optimization,
adaptation, configuration, and protection.

Hence, in this root, human involvement is minimal and the computing system handles most of
the operations.

Q2: Explain about various features of Cloud computing with an example.

Ans: Cloud computing is becoming popular day by day. Continuous business expansion and
growth requires huge computational power and large-scale data storage systems. Cloud
computing can help organizations expand and securely move data from physical locations to the
'cloud' that can be accessed anywhere.

Cloud computing has many features that make it one of the fastest growing industries at present.
The flexibility offered by cloud services in the form of their growing set of tools and technologies
has accelerated its deployment across industries. This blog will tell you about the essential
features of cloud computing.

1. Resources Pooling

Resource pooling is one of the essential features of cloud computing. Resource pooling means
that a cloud service provider can share resources among multiple clients, providing each with a
different set of services according to their needs. It is a multi-client strategy that can be applied to
data storage, processing, and bandwidth-delivered services. The real-time process of allocating
resources does not conflict with the client's experience.

2. On-Demand Self-Service

It is one of the important and essential features of cloud computing. This enables the client to
continuously monitor server uptime, capabilities and allocated network storage. This is a
fundamental feature of cloud computing, and a customer can also control the computing
capabilities according to their needs.

3. Easy Maintenance

This is one of the best cloud features. Servers are easily maintained, and downtime is minimal or
sometimes zero. Resources powered by cloud computing often undergo several updates to
optimize their capabilities and potential. The updates are more compatible with devices and
perform faster than previous versions.

4. Scalability And Rapid Elasticity

A key feature and advantage of cloud computing is its rapid scalability. This cloud feature
enables cost-effective handling of workloads that require a large number of servers but only for a
short period. Many customers have workloads that can be run very cost-effectively due to the
rapid scalability of cloud computing.

5. Economical

This cloud feature helps in reducing the IT expenditure of organizations. In cloud computing,
clients need to pay the administration only for the space they use. There are no hidden or
additional charges to be paid. Administration is economical, and more often than not, some
space is allocated for free.

6. Measured And Reporting Service

Reporting Services is one of the many cloud features that make it the best choice for
organizations. The measurement and reporting service is helpful for both cloud providers and
their customers. This enables both the provider and the customer to monitor and report which
services have been used and for what purposes. It helps in monitoring billing and ensuring
optimum utilization of resources.
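As a minimal illustration of how a measured (metered) service translates recorded usage into a bill, the sketch below multiplies each metered resource by a unit price; the resource names and prices are invented for the example and do not reflect any specific provider's pricing.

```python
# Minimal sketch: turning metered usage into a bill (pay only for what you use).
# Resource names and unit prices are invented for illustration only.
USAGE = {"compute_hours": 120, "storage_gb_month": 50, "egress_gb": 10}
UNIT_PRICE = {"compute_hours": 0.05, "storage_gb_month": 0.02, "egress_gb": 0.09}

def monthly_bill(usage, prices):
    """Sum metered usage multiplied by its unit price."""
    return sum(usage[resource] * prices[resource] for resource in usage)

print(f"Total: ${monthly_bill(USAGE, UNIT_PRICE):.2f}")   # Total: $7.90
```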

7. Security

Data security is one of the best features of cloud computing. Cloud services make a copy of the
stored data to prevent any kind of data loss. If one server loses data by any chance, the copied
version is restored from the other server. This feature comes in handy when multiple users are
working on a particular file in real-time, and one file suddenly gets corrupted.

8. Automation

Automation is an essential feature of cloud computing. The ability of cloud computing to
automatically install, configure, and maintain a cloud service is known as automation in cloud
computing. In simple words, it is the process of making the most of the technology and
minimizing the manual effort. However, achieving automation in a cloud ecosystem is not that
easy. It requires the installation and deployment of virtual machines, servers, and large storage.
On successful deployment, these resources also require constant maintenance.

9. Resilience
Resilience in cloud computing means the ability of a service to quickly recover from any
disruption. The resilience of a cloud is measured by how fast its servers, databases and network
systems restart and recover from any loss or damage. Availability is another key feature of cloud
computing. Since cloud services can be accessed remotely, there are no geographic restrictions
or limits on the use of cloud resources.

10. Large Network Access

A big part of the cloud's characteristics is its ubiquity. The client can access cloud data or transfer
data to the cloud from any location with a device and internet connection. These capabilities are
available everywhere in the organization and are achieved with the help of internet. Cloud
providers deliver that large network access by monitoring and guaranteeing measurements that
reflect how clients access cloud resources and data: latency, access times, data throughput, and
more.

Q3: Discuss about advantage and disadvantages of Cloud computing


Ans:

Advantages of Cloud Computing

1) Back-up and restore data

Once the data is stored in the cloud, it is easier to back up and restore that data using the cloud.

2) Improved collaboration

Cloud applications improve collaboration by allowing groups of people to quickly and easily share
information in the cloud via shared storage.

3) Excellent accessibility

The cloud allows us to quickly and easily access stored information anywhere, anytime in the
whole world, using an internet connection. An internet cloud infrastructure increases
organizational productivity and efficiency by ensuring that our data is always accessible.

4) Low maintenance cost

Cloud computing reduces both hardware and software maintenance costs for organizations.

5) Mobility

Cloud computing allows us to easily access all cloud data via mobile.

6) Services in the pay-per-use model

Cloud computing offers Application Programming Interfaces (APIs) that allow users to access
services on the cloud and pay charges according to the usage of the service.
7) Unlimited storage capacity

The cloud offers us a huge amount of storage capacity for storing our important data such as
documents, images, audio, video, etc. in one place.

8) Data security

Data security is one of the biggest advantages of cloud computing. Cloud offers many advanced
features related to security and ensures that data is securely stored and handled.

Disadvantages of Cloud Computing

A list of the disadvantage of cloud computing is given below -

1) Internet Connectivity

As you know, in cloud computing, all data (images, audio, video, etc.) are stored in the cloud,
and we access these data through the cloud by using an internet connection. If you do not have
good internet connectivity, you cannot access these data, and there is no other way to access
data from the cloud.

2) Vendor lock-in

Vendor lock-in is the biggest disadvantage of cloud computing. Organizations may face problems
when transferring their services from one vendor to another. As different vendors provide
different platforms, that can cause difficulty moving from one cloud to another.

3) Limited Control

As we know, cloud infrastructure is completely owned, managed, and monitored by the service
provider, so the cloud users have less control over the function and execution of services within a
cloud infrastructure.

4) Security

Although cloud service providers implement the best security standards to store important
information, before adopting cloud technology you should be aware that you will be sending all of
your organization's sensitive information to a third party, i.e., a cloud computing service provider.
While sending data to the cloud, there is a chance that your organization's information could be
compromised by hackers.

Q4: Discuss about the History of Cloud Computing.

Ans: Before cloud computing emerged, there was client/server computing, which is basically
centralized storage in which all the software applications, all the data, and all the controls reside
on the server side.

If a single user wants to access specific data or run a program, he/she needs to connect to the
server, gain appropriate access, and can then do his/her business.
Later, distributed computing came into the picture, where all the computers are networked
together and share their resources when needed.

On the basis of these computing models, the concept of cloud computing emerged and was
later implemented.

Around 1961, John McCarthy suggested in a speech at MIT that computing could be sold like a
utility, just like water or electricity. It was a brilliant idea, but like all brilliant ideas, it was ahead of
its time; for the next few decades, despite interest in the model, the technology simply was not
ready for it.

Of course, time passed, the technology caught up with that idea, and a few years later we can
note the following:

In 1999, Salesforce.com started delivering applications to users through a simple website. The
applications were delivered to enterprises over the Internet, and in this way the dream of
computing sold as a utility came true.

In 2002, Amazon started Amazon Web Services, providing services like storage, computation,
and even human intelligence. However, only with the launch of the Elastic Compute Cloud in
2006 did a truly commercial service open to everybody exist.

In 2009, Google Apps also started to provide cloud computing enterprise applications.

Of course, all the big players are present in the cloud computing evolution; some arrived earlier,
some later. In 2009, Microsoft launched Windows Azure, and companies like Oracle and HP
have since joined the game. This shows that today, cloud computing has become mainstream.

Q5: Elaborate on Cloud Infrastructure Management.


Ans: Cloud infrastructure consists of servers, storage devices, network, cloud management
software, deployment software, and platform virtualization.

Hypervisor

A hypervisor is firmware or a low-level program that acts as a Virtual Machine Manager. It allows
a single physical instance of cloud resources to be shared between several tenants.

Management Software
It helps to maintain and configure the infrastructure.

Deployment Software

It helps to deploy and integrate the application on the cloud.

Network

It is the key component of cloud infrastructure. It allows cloud services to be connected over the
Internet. It is also possible to deliver the network as a utility over the Internet, which means the
customer can customize the network route and protocol.

Server

The server helps in computing resource sharing and offers other services such as resource
allocation and de-allocation, resource monitoring, security, etc.

Storage

The cloud keeps multiple replicas of storage. If one of the storage resources fails, the data can
be retrieved from another one, which makes cloud computing more reliable.

Infrastructural Constraints

The fundamental constraints that cloud infrastructure should implement are the following:

Transparency

Virtualization is the key to sharing resources in a cloud environment. But it is not possible to
satisfy the demand with a single resource or server. Therefore, there must be transparency in
resources, load balancing, and applications, so that we can scale them on demand.

Scalability
Scaling up an application delivery solution is not as easy as scaling up an application, because it
involves configuration overhead or even re-architecting the network. So, the application delivery
solution needs to be scalable, which requires a virtual infrastructure such that resources can be
provisioned and de-provisioned easily.

Intelligent Monitoring

To achieve transparency and scalability, application solution delivery will need to be capable of
intelligent monitoring.

Security

The mega data center in the cloud should be securely architected. Also, the control node, an
entry point into the mega data center, needs to be secure.

MCQ’s
Question 1 : Which of these is NOT a technical characteristic of the cloud?
(a) High scalability
(b) Resource pooling
(c) Resource scheduling
(d) Multi-tenancy

Answer : (d) Multi-tenancy


Question 2 : Which of the following is associated with cluster computing?
(a) Loose coupling
(b) Tight coupling
(c) Distributed job management
(d) Diversity

Answer : (a) Loose coupling

Question 3 : Which of these is a key business driver of cloud computing?


(a) Costs reduction
(b) Scalability
(c) Sharing
(d) Mobility

Answer : (a) Costs reduction


Question 4 : In which year was Oracle Cloud launched?
(a) 2014
(b) 2012
(c) Later 1990s
(d) Early 2000s

Answer : (b) 2012


Question 5 : Who proposed Intergalactic computer network?
(a) IBM
(b) Oracle
(c) J.C.R. Licklider
(d) Salesforce

Answer : (c) J.C.R. Licklider


Question 6 : Remote use of applications via network-based subscriptions is known as
(a) Cloud service
(b) Infrastructure-as-a-Service
(c) On-demand service
(d) Software-as-a-Service

Answer : (d) Software-as-a-Service


Question 7 : Metered service to access and use computing resources is tagged
(a) Pay service
(b) Utility computing
(c) Metered access
(d) Pay-as-to-go

Answer : (b) Utility computing


Question 8 : Efficiency in cloud computing is termed
(a) Energy-aware computing
(b) Energy virtualization
(c) Green computing
(d) High performance computing

Answer : (c) Green computing


Question 9 : Difficulty in locating faults is common to?
(a) Mainframe systems
(b) Cloud computing
(c) Grid computing
(d) Cluster computing

Answer : (d) Cluster computing


Question 10 : Which of the following costs is associated with the server costs?
(a) Network cost
(b) Storage cost
(c) Support cost
(d) Recovery cost

Answer : (c) Support cost

Question 11 : Data protection is often managed using …………………. with defined roles
and privileges on data encryption keys management.
Answer : data encryption

Question 12 : The main stakeholders of the cloud ecosystem are the …………………. .
Answer : cloud customers
Question 13 : The consolidation of fixed cloud services into one or new services is known
as ………………… .
Answer : service aggregation

Question 14 : ………………….. are entities that facilitate efficient and effective use of cloud
services while ensuring peak performance and seamless delivery of such services.
Answer : Cloud brokers

Question 15 : ………………. makes an expansion of cloud service provisioning business globally a reality in the cloud market.
Answer : Cloud resellers

Question 16 : Which of these companies is not a leader in cloud computing?


(a) Google
(b) Amazon
(c) Intel
(d) Microsoft

Answer : (c) Intel

Question 17 : YARN is an acronym for ?


(a) Yet Another Resource Negotiator
(b) Yet Another Resource Navigator
(c) Yet Again Resource Negotiator
(d) Yet Again Resource Navigator

Answer : (a) Yet Another Resource Negotiator

Question 18 : What are the three main factors of the cloud computing adoption?
(a) Technological, governmental and environmental
(b) Technological, organizational and environmental
(c) Technological, organizational and cultural
(d) Technological, cultural and industrial

Answer : (b) Technological, organizational and environmental

Question 19 : A public cloud is defined by ?


(a) Portability
(b) Ownership
(c) Storage size
(d) Accessibility

Answer : (a) Portability

Question 20 : Which of these techniques is vital for creating cloud computing centers?
(a) Virtualization
(b) Transubstantiation
(c) Cannibalization
(d) Insubordination
Answer : (a) Virtualization

Question 21 : Which of the following is a type of virtualization?


(a) Storage
(b) Desktop
(c) CPU
(d) All of the above

Answer : (d) All of the above

Question 22 : What are the core components of a web service?


(a) XML and XML Schema
(b) SOAP and HTTP
(c) WSDL and XML
(d) ESB and WS-Policy

Answer : (c) WSDL and XML

Question 23 : Potential defense mechanisms against repudiation include all of the following except
(a) Digital signatures
(b) Time stamps
(c) Audit trails
(d) Filtering

Answer : (d) Filtering

Question 24 : Which is not a major cloud computing platform?


(a) Apple iCloud
(b) IBM Deep Blue
(c) Microsoft Azure
(d) Amazon EC2

Answer : (b) IBM Deep Blue

Question 25 : Which of the following is not a component of the OpenStack cloud platform?
(a) Nova
(b) Keystone
(c) Neutron
(d) Walrus

Answer : (d) Walrus

Question 26 : Google provides the following internationally recognized certifications:


(a) Associate Cloud Engineer
(b) Professional Cloud Architect
(c) Professional Data Engineer
(d) Professional Cloud Developer
(e) All of the above
Answer : (e) All of the above

Question 27 : One of the following is not a characteristic of the public cloud.


(a) Homogenous resources
(b) Multi-tenancy
(c) Economies of scale
(d) Heterogeneous resources

Answer : (d) Heterogeneous resources

Question 28 : Which of the following cloud concept is related to pooling and sharing of
resources?
(a) Polymorphism
(b) Abstraction
(c) Virtualization
(d) None of the mentioned

Answer : (c) Virtualization

Question 29 : Type …….. VM is  full virtualization.


(a) 1
(b) 2
(c) 3
(d) All of the above

Answer : (a) 1

Question 30 : What is true in case of SOA and BPM?


(a) Tasks can be rearranged with user interface
(b) Business rules can be dynamically changed
(c) New services can be created at runtime
(d) None of the above

Answer : (a) Tasks can be rearranged with user interface

Question 31 : Authentication, authorization, throttling, and filtering are potential defense mechanisms for
(a) Data tampering
(b) Repudiation
(c) Information Disclosure
(d) Denial of Service

Answer : (d) Denial of Service


Question 32 : Which of the following is a cloud platform by Amazon?
(a) Azure
(b) AWS
(c) Cloudera
(d) All of the above

Answer : (b) AWS


Question 33 : Which is open source data query system?
(a) Walrus
(b) Hive
(c) Pig
(d) Glance

Answer : (b) Hive

Question 34 : The term MOOCs refers to …………… .


(a) Massive Open Online Courses
(b) Massive Open Online Curriculum
(c) Massive Open Source Online Courses
(d) None of the above

Answer : (a) Massive Open Online Courses

Question 35 : One of the following is true about the on-premise private cloud.
(a) High standardized process
(b) Also referred to as the internal cloud
(c) High scalability
(d) High level of security

Answer : (b) Also referred to as the internal cloud


Unit -7
Short Answers
Q1. What are the deployment models?
Ans: a) Private b) Public c) Hybrid d) Community
Q2. What is the public deployment model?
 It is a huge data centre that offers the same services to all its users.
 The services are accessible to everyone and are used for the consumer segment.
 E.g., Facebook, Google, LinkedIn
Q3. What is the private deployment model?
 A private cloud is built within the domain of an intranet owned by a single
organization.
 It is client owned and managed, and its access is limited to the owning clients
and their partners.

Q4. What is the hybrid deployment model?


 A hybrid cloud is built with both public and private clouds.
 The Research Compute Cloud (RC2) is a private cloud, built by IBM, that
interconnects the computing and IT resources at eight IBM Research Centers
scattered throughout the United States, Europe, and Asia.

Q5. What is the community deployment model?


Ans: More than one group with common and specific needs shares the cloud infrastructure. This
can include environments such as a U.S. federal agency cloud with stringent security
requirements, or a health and medical cloud with regulatory and policy requirements for
privacy matters.

Q6: List the design objectives of cloud computing.


Ans:
1. Shifting computing from desktops to data centers
2. Service provisioning and cloud economics
3. Scalability in performance
4. Data privacy protection
5. High quality of cloud services
6. New Standards and interfaces

Q7: Define Public Cloud.


Ans: A public cloud is built over the Internet, which can be accessed by any user who has paid
for the service. Public clouds are owned by service providers. They are accessed by
subscription, e.g., Google App Engine (GAE), Amazon Web Services (AWS), Microsoft
Azure, IBM Blue Cloud, etc.

Q8: Which cloud model is costlier?

Ans:
 Public Cloud: Less expensive as the cloud provider offers all the resources.
 Private Cloud: More expensive as the organization must solely purchase all the
resources.
 Hybrid Cloud: This cloud can combine on-premise and off-premise data centers
to create a custom environment that meets the organization's specific needs and budget.

Q9: Which cloud model has the best resources?

Ans: This again depends on how you want to allocate your spending on cloud services.

 Public Cloud: Offers you the best resources as they provide access to unlimited
resources – operational expenses.
 Private Cloud: Deployment of more resources requires renting or purchasing more
software or hardware components – capital expenses.
 Hybrid Cloud: Provides customers with the choice of choosing operational
expenses to scale out or capital expenditures to scale up.

Long Answers:
Q1: BASIC REQUIREMENTS OF A CLOUD COMPUTING SERVICE
Ans: Cloud computing is here to rule. Right now, many small and medium enterprises have gone
100% to the cloud, and several startups use cloud services for all their computing needs. But large
enterprises are reluctant to move to cloud services, and rightly so. Many companies are just testing
the waters and have held back on full-scale deployment of cloud IT services.

Today, cloud services have several deficiencies which, from an enterprise perspective, are
the basic requirements for them to consider cloud services. The following are six basic
requirements for enterprises to adopt cloud services in a big way.

1. Availability - with lossless DR

Customers want their IT services to be up and available at all times. But in reality, computers
sometimes fail. This implies that the service provider should have implemented a reliable disaster
recovery (DR) mechanism, wherein the service provider can move the customer from one data
center to another seamlessly and the customer does not even have to know about it.

As a cloud service provider, there will be enormous pressure to minimise costs by optimally
utilizing all the IT infrastructure. The traditional Active-Passive DR strategy is expensive and cost
inefficient. Instead, service providers will have to create an Active-Active disaster recovery
mechanism, where more than one data center is active at all times, ensuring that the data and
services can be accessed by the customer from either data center seamlessly.

Today, there are several solutions available to do just that, for example the EMC VPLEX solution
for maintaining an Active-Active data centre. Another approach is to implement a Hadoop/Hive
stack for data-intensive applications such as email, messaging, data stores and services.

In an ideal scenario, the customer on the cloud services should not even notice any change at all
and the movement of all his data & applications from one data centre to another must be
transparent to the end user.
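A minimal sketch of this "seamless to the end user" idea: the client tries each active data center in turn, so the failure of one site is invisible to the caller. The endpoints are hypothetical placeholders, and the requests library is used only for illustration; a real Active-Active setup would typically hide this behind DNS or a global load balancer rather than client code.

import requests  # third-party HTTP library

# Hypothetical endpoints for two active data centers; not real URLs.
DATA_CENTERS = [
    "https://dc1.example.com/api/orders",
    "https://dc2.example.com/api/orders",
]

def fetch_with_failover(path_suffix: str = "") -> requests.Response:
    """Try each active data center in turn; the caller never sees which one answered."""
    last_error = None
    for base_url in DATA_CENTERS:
        try:
            resp = requests.get(base_url + path_suffix, timeout=2)
            resp.raise_for_status()
            return resp                # first healthy data center wins
        except requests.RequestException as exc:
            last_error = exc           # remember the failure, try the next DC
    raise RuntimeError(f"All data centers unavailable: {last_error}")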

2. Portability of Data & Applications

Customers hate to be locked into a service or a platform. Ideally a cloud offering must be able to
allow customers to move out their data & applications from one service provider to another - just
like customers can switch from one telephone service provider to another.

As applications are written on standard platforms (Java, PHP, Python, etc.), it should be
possible to move customer-owned applications from one service provider to another.
Customers should also take care to use only open standards and tools, and avoid vendor-
specific tools. Azure or Google services offer several tools/applications/utilities which are
valuable, but they also create customer lock-in, as a customer who uses these vendor-
specific tools cannot migrate to another service provider without rewriting the applications.

To illustrate this: today in India, customers can move from one cell phone service provider to
another without changing their handsets, but in the US, if one were to move from AT&T to Verizon,
one needs to pay for the handset, which acts as a customer lock-in instrument.

With public cloud services, customers should be able to move their data & applications from one
cloud to another - without disrupting the end user's IT services. This movement should be
transparent to the end user.
The Cloud Computing Interoperability Forum (CCIF) was formed by organizations such as Intel,
Sun, and Cisco in order to enable a global cloud computing ecosystem whereby organizations
are able to work together seamlessly for the purpose of wider industry adoption of cloud computing
technology. The development of the Unified Cloud Interface (UCI) by CCIF aims at creating a
standard programmatic point of access to an entire cloud infrastructure.

Recently, at EMC World 2011, EMC demonstrated moving several active VMs and applications from
an EMC data center to a CSC data center without disruption of service. This was just a proof of
concept, but to make this commonplace, some amount of regulation and business coordination
will be required.
will be required.

However, in their current form, most of cloud computing services and platforms do not employ
standard methods of storing user data and applications. Consequently, they do not interoperate
and user data are not portable.

3. Data Security

Security is the key concern for all customers. Since the applications and the data are residing in
the public cloud, it is the responsibility of the service provider to provide adequate security. In
my opinion, security for customer data/applications becomes a key differentiator when it comes to
selecting the cloud service provider. When it comes to IT security, customers tend to view
cloud service providers like they view banks. The service provider carries primary responsibility
for user security, but there are certain responsibilities that the customer also needs to take.

The service provider must have a robust information security risk management process which is
well understood by the customer, and the customer must clearly know their responsibilities as well.
As there are several types of cloud offerings (SaaS, PaaS, IaaS, etc.), there will be different sets of
responsibilities for the customer and the service provider depending on the cloud service offering.

When it comes to security, cloud service providers can offer better security than the customer's
own data center. This is akin to banks, which can offer far greater security than any individual
or company. The security in the cloud is much higher due to: centralized monitoring, enhanced
incident detection/forensics, logging of all activity, greater security/vulnerability testing,
centralized authentication testing (aka password protection/assurance), secure builds and
testing of patches before deployment, and lastly better security software/systems.

Cloud service providers know that security is the key to their success, and hence invest
more in security. The amount of effort and money invested by cloud service providers will always
be greater than the amount most individual companies can spend.
Security issues will also be addressed through legal and regulatory systems. Despite the best IT
security, breaches can happen, and when they happen, the laws and rules of the land where the
data resides play an important role. For example, specific cryptography techniques may not be
usable because they are not allowed in some countries. Similarly, a country's laws can mandate
that sensitive data, such as patient health records, be stored within national borders. Therefore,
customers need to pay attention to legal and regulatory issues when selecting service providers.
4. Manageability

Managing the cloud infrastructure from the customer perspective must be under the control of the
customer admin. Customers of cloud services must be able to create new accounts, provision
various services, and do all the user account monitoring: monitoring of end-user usage, SLA
breaches, data usage, etc. End users would like to see the availability, performance and
configuration/provisioning data for the set of infrastructure they are using in the cloud.

The cloud service provider will have various management tools for availability management,
performance management, configuration management and security management of applications
and infrastructure (storage, servers, and network). Customers want to know how the entire
infrastructure is being managed and, if possible, have that management information shared
with them, with alerts on any outage, slow service, or breach of SLA as it happens.
This allows customers to take corrective actions: either move the applications to another cloud or
enable their contingency plans.

Sharing the application performance and resource management information will help improve
utilization and consequently optimize usage by customers. This will improve ROI for the
customers and encourage them to adopt cloud services.

As customers buy cloud services from multiple vendors, it will become necessary to have a
unified management system to manage all the cloud services they have. This implies that cloud
service providers should embrace an XML-based reporting format to provide management
information to customers, who can then build their own management dashboards.
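To show what consuming such an XML-based report might look like, here is a small sketch using Python's standard xml.etree module. The report layout and the element and attribute names (cloudReport, service, availability, slaBreaches) are invented for illustration; a real provider would publish its own schema.

import xml.etree.ElementTree as ET

# A hypothetical management report a provider might expose; the schema is invented.
REPORT = """
<cloudReport period="2024-01">
  <service name="email"   availability="99.95" slaBreaches="0"/>
  <service name="storage" availability="99.20" slaBreaches="2"/>
</cloudReport>
"""

root = ET.fromstring(REPORT.strip())
for svc in root.findall("service"):
    name = svc.get("name")
    availability = float(svc.get("availability"))
    breaches = int(svc.get("slaBreaches"))
    status = "investigate" if breaches > 0 or availability < 99.9 else "healthy"
    print(f"{name}: availability={availability}% breaches={breaches} -> {status}")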

5. Elasticity

Customers of cloud computing have dynamic computing loads. At times of high load, they need
a greater amount of computing resources available to them on demand, and when the workloads
are low, the computing resources are released back to the cloud pool. Customers expect the
service provider to charge them for what they have actually used in the process.

Customers also want a self-service on-demand resource provisioning capability from the service
provider. This feature enables users to directly obtain services from clouds, such as spawning the
creation of a server and tailoring its software, configurations, and security policies, without
interacting with a human system administrator. This eliminates the need for more time-
consuming, labour-intensive, human driven procurement processes familiar to many in IT.

This implies that the dynamic provisioning system should be the basic part of cloud management
software - through which users can easily interact with the system.

To provide an elastic computing resource, the service provider must be able to dynamically
provision resources as needed and have adequate charge back systems to bill the customer.

In reality, it may not be possible for any single cloud service provider to build an infinitely scalable
infrastructure, and hence customers will have to rely on a federated system of multiple cloud
service providers sharing the customer loads (just like a power grid, where the load gets
distributed to other power plants during peak loads).
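A minimal sketch of the elasticity idea under simple assumptions: a threshold rule grows the fleet under load and shrinks it when demand drops, so the customer pays only for what is actually running. The thresholds, limits and simulated loads are arbitrary example values, not part of any provider's system.

def desired_instances(current: int, cpu_percent: float,
                      scale_up_at: float = 75.0, scale_down_at: float = 25.0,
                      min_instances: int = 1, max_instances: int = 20) -> int:
    """Toy elasticity rule: grow under load, shrink when idle, bill only what runs."""
    if cpu_percent > scale_up_at:
        return min(current + 1, max_instances)      # provision on demand
    if cpu_percent < scale_down_at:
        return max(current - 1, min_instances)      # release back to the pool
    return current

fleet = 2
for load in [80, 85, 90, 40, 15, 10]:               # simulated average CPU per interval
    fleet = desired_instances(fleet, load)
    print(f"load={load}% -> fleet size {fleet}")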
6. Federated System

There are several reasons why customers will need a federated cloud system. Customers
may have to buy services from several cloud service providers for various services: email from
Google, online sales transaction services from Amazon, ERP from another vendor, and so on. In
such cases customers want their cloud applications to interact with services from several
vendors to provide seamless end-to-end IT services.

This implies that each of the cloud services must have an interface with other cloud services for
load sharing and application interoperability.

In a federated environment there is potentially an infinite pool of resources. To build such a
system, there should be inter-cloud framework agreements between multiple service providers,
and adequate chargeback systems in place.

Having a federated system helps customers to move their data/applications across different cloud
service providers and prevents customer locking.

Interoperability of applications across different cloud services has led to the creation of standard
APIs. But these APIs are cumbersome to use, and that has led to the creation of the Cloud
Integration Bus, based on the Enterprise Service Bus (ESB).

Q2: What do you mean by dynamic cloud infrastructure?

Ans: Dynamic infrastructure relies primarily on software to identify, virtualize, classify and track
data center resources. These resources are grouped into pools, regardless of their physical
location within one or multiple data centers. By classifying data center resources, IT teams can
establish and monitor multiple service tiers to ensure more demanding workloads receive more
compute and storage resources.

In most cases, the software used in dynamic infrastructures can automatically allocate resources
from the appropriate pools to meet workload demands. The software adds resources when
workload demands increase, and then returns resources to the pool when demands decrease – a
process known as workload balancing.
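A toy sketch of the pool-based allocation just described, assuming made-up tier names and abstract capacity "slots": resources are drawn from the appropriate pool when workload demand rises and returned to it when demand falls.

from collections import defaultdict

# Hypothetical capacity pools per service tier (units are abstract "slots").
POOLS = {"gold": 100, "silver": 60, "bronze": 40}
allocated = defaultdict(int)

def allocate(workload: str, tier: str, slots: int) -> bool:
    """Draw resources for a workload from its tier's pool, if available."""
    if allocated[tier] + slots > POOLS[tier]:
        return False
    allocated[tier] += slots
    print(f"{workload}: +{slots} slots from {tier} pool ({allocated[tier]}/{POOLS[tier]} used)")
    return True

def release(workload: str, tier: str, slots: int) -> None:
    """Return resources to the pool when demand drops (workload balancing)."""
    allocated[tier] = max(0, allocated[tier] - slots)
    print(f"{workload}: -{slots} slots back to {tier} pool ({allocated[tier]}/{POOLS[tier]} used)")

allocate("billing-batch", "gold", 40)
allocate("intranet-wiki", "bronze", 10)
release("billing-batch", "gold", 40)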

Dynamic infrastructure helps align IT use with business policies. For example, a critical workload
can retain more resources longer to ensure top performance, while less-important business
applications can use fewer resources or release unneeded resources sooner. Such behaviors
help maximize resource use and reduce the need for new IT purchases.
Although dynamic infrastructure can work with any data center hardware, it is often deployed with
highly integrated and expandable hardware systems, known as a hyper-converged infrastructure
(HCI). These are typically appliances that include compute, storage and network capabilities.
Examples of HCI systems include VMware EVO:RAIL, Nutanix Acropolis and SimpliVity
OmniCube.

The evolving combination of software and hardware has facilitated the notion of software-defined
data centers (SDDCs). Similarly, the ability to autonomously provision resources for new
workloads, scale resources to meet changing demands and recover unused resources are
important attributes of cloud computing.

Q3: What is the public cloud?


Ans: The public cloud refers to the cloud computing model in which IT services are delivered via
the internet. As the most popular model of cloud computing services, the public cloud offers vast
choices in terms of solutions and computing resources to address the growing needs of
organizations of all sizes and verticals.

The defining features of a public cloud solution include:

 High elasticity and scalability


 A low-cost subscription-based pricing tier

Services on the public cloud may be free, freemium, or subscription-based, wherein you’re
charged based on the computing resources you consume.
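As a rough, hypothetical illustration of consumption-based charging (the rates below are invented and are not any provider's real pricing), a monthly bill could be approximated like this:

# Hypothetical pay-per-use rates; real provider pricing differs and changes often.
RATE_PER_VCPU_HOUR = 0.04        # USD
RATE_PER_GB_STORAGE_MONTH = 0.02

def monthly_bill(vcpus: int, hours: float, storage_gb: float) -> float:
    """Charge only for the resources actually consumed in the month."""
    compute = vcpus * hours * RATE_PER_VCPU_HOUR
    storage = storage_gb * RATE_PER_GB_STORAGE_MONTH
    return round(compute + storage, 2)

# e.g. 4 vCPUs running 200 hours plus 500 GB of object storage
print(monthly_bill(vcpus=4, hours=200, storage_gb=500))   # 42.0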

The computing functionality may range from common services—email, apps, and storage—to the
enterprise-grade OS platform or infrastructure environments used for software development and
testing.
The cloud vendor is responsible for developing, managing, and maintaining the pool of computing
resources shared between multiple tenants from across the network.

When to use the public cloud


The public cloud is most suitable for these types of environments:

 Predictable computing needs, such as communication services for a specific number of users
 Apps and services necessary to perform IT and business operations
 Additional resource requirements to address varying peak demands
 Software development and test environments
Advantages of public cloud
People appreciate these public cloud benefits:

 No CapEx. No investments required to deploy and maintain the IT infrastructure.


 Technical agility. High scalability and flexibility to meet unpredictable workload
demands.
 Business focus. Complexity and the requirement for in-house IT expertise are reduced,
as the cloud vendor is responsible for infrastructure management.
 Affordability. Flexible pricing options based on different SLA offerings
 Cost agility. The cost agility allows organizations to follow lean growth strategies and
focus their investments on innovation projects

Drawbacks of public cloud


The public cloud does come with limitations:

 Lack of cost control. The total cost of ownership (TCO) can rise exponentially for
large-scale usage, specifically for midsize to large enterprises.
 Lack of security. Public cloud is the least secure, by nature, so it isn’t best for
sensitive mission-critical IT workloads.
 Minimal technical control. Low visibility and control into the infrastructure may not
meet your compliance needs.

Q4: What is the private cloud?


Ans: The private cloud refers to any cloud solution dedicated for use by a single organization. In
the private cloud, you’re not sharing cloud computing resources with any other organization.

The data center resources may be located on-premise or operated by a third-party vendor off-
site. The computing resources are isolated and delivered via a secure private network, and not
shared with other customers.

Private cloud is customizable to meet the unique business and security needs of the
organization. With greater visibility and control into the infrastructure, organizations can operate
compliance-sensitive IT workloads without compromising on the security and performance
previously only achieved with dedicated on-premise data centers.

When to use the private cloud


The private cloud is best suited for:

 Highly regulated industries and government agencies


 Sensitive data
 Companies that require strong control and security over their IT workloads and the
underlying infrastructure
 Large enterprises that require advanced data center technologies to operate efficiently
and cost-effectively
 Organizations that can afford to invest in high performance and availability
technologies

Advantages of private cloud


The most popular benefits of private cloud include:

 Exclusive environments. Dedicated and secure environments that cannot be accessed by other organizations.
 Custom security. Compliance to stringent regulations as organizations can run
protocols, configurations, and measures to customize security based on unique
workload requirements
 Scalability without tradeoffs. High scalability and efficiency to meet unpredictable
demands without compromising on security and performance
 Efficient performance. The private cloud is reliable for high SLA performance and
efficiency.
 Flexibility. The private cloud is flexible as you transform the infrastructure based on
ever-changing business and IT needs of the organization.

Drawbacks of private cloud


The private cloud has drawbacks that might limit use cases:

 Price. The private cloud is an expensive solution with a relatively high TCO compared
to public cloud alternatives, especially for short-term use cases.
 Mobile difficulty. Mobile users may have limited access to the private cloud
considering the high security measures in place.
 Scalability depends. The infrastructure may not offer high scalability to meet
unpredictable demands if the cloud data center is limited to on-premise computing
resources

Q5: Define Hybrid Cloud.


Ans: The hybrid cloud is any cloud infrastructure environment that combines both public and
private cloud solutions.

The resources are typically orchestrated as an integrated infrastructure environment. Apps and
data workloads can share the resources between public and private cloud deployment based on
organizational business and technical policies around aspects like:

 Security
 Performance
 Scalability
 Cost
 Efficiency

This is a common example of hybrid cloud: Organizations can use private cloud environments for
their IT workloads and complement the infrastructure with public cloud resources to
accommodate occasional spikes in network traffic.

Or, perhaps you use the public cloud for workloads and data that aren’t sensitive, saving cost, but
opt for the private cloud for sensitive data.

As a result, access to additional computing capacity does not require the high CapEx of a private
cloud environment but is delivered as a short-term IT service via a public cloud solution. The
environment itself is seamlessly integrated to ensure optimum performance and scalability to
changing business needs.
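A minimal sketch of such a policy-driven placement decision, with rules invented purely for illustration of the sensitivity and burst criteria mentioned above:

def place_workload(sensitive: bool, traffic_spike: bool) -> str:
    """Toy placement policy for a hybrid cloud; the rules below are illustrative only."""
    if sensitive:
        return "private-cloud"      # regulated or confidential data stays on dedicated resources
    if traffic_spike:
        return "public-cloud"       # occasional peaks burst to rented capacity, avoiding CapEx
    return "public-cloud"           # non-sensitive steady workloads run where it is cheapest

print(place_workload(sensitive=True,  traffic_spike=False))  # private-cloud
print(place_workload(sensitive=False, traffic_spike=True))   # public-cloud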

When you do pursue a hybrid cloud, you may have another decision to make: whether to
be homogeneous or heterogenous with your cloud. That is—are you using cloud services from a
single vendor or from several vendors?

When to use the hybrid cloud


Here’s who the hybrid cloud might suit best:

 Organizations serving multiple verticals facing different IT security, regulatory, and performance requirements
 Optimizing cloud investments without compromising on the value that public or private
cloud technologies can deliver
 Improving security on existing cloud solutions such as SaaS offerings that must be
delivered via secure private networks
 Strategically approaching cloud investments to continuously switch and tradeoff
between the best cloud service delivery model available in the market

Advantages of hybrid cloud

 Policy-driven option. Flexible policy-driven deployment to distribute workloads across public and private infrastructure environments based on security, performance, and cost requirements.
 Scale with security. Scalability of public cloud environments is achieved without
exposing sensitive IT workloads to the inherent security risks.
 Reliability. Distributing services across multiple data centers, some public, some
private, results in maximum reliability.
 Cost control. Sensitive IT workloads run on dedicated resources in private clouds, while
regular workloads are spread across inexpensive public cloud infrastructure, trading off cost
against security requirements.

Drawbacks of hybrid cloud


Common drawbacks of the hybrid cloud include:

 Price. Toggling between public and private can be hard to track, resulting in wasteful
spending.
 Management. Strong compatibility and integration is required between cloud
infrastructure spanning different locations and categories. This is a limitation with
public cloud deployments, for which organizations lack direct control over the
infrastructure.
 Added complexity. Additional infrastructure complexity is introduced as organizations
operate and manage an evolving mix of private and public cloud architecture.

Q6: Differentiate between Public, private and Hybrid Cloud.


Ans: Below is the table of critical differences of each cloud:

Data Tenancy
 Public Cloud: Multi-tenancy – the data of numerous companies is stored in a shared environment.
 Private Cloud: Single tenancy – the data of only a single organization is stored in the cloud.
 Hybrid Cloud: Data stored in the public cloud is shared, while data stored in the private cloud is not shared and is kept confidential.

Cloud Services
 Public Cloud: Open to the public.
 Private Cloud: Only that specific organization can use the cloud services.
 Hybrid Cloud: Services on the public cloud can be accessed by everyone, whereas services in the private cloud can be accessed only by that organization.

Connectivity
 Public Cloud: Over the internet.
 Private Cloud: Over the organization's private network.
 Hybrid Cloud: Over the internet for public cloud services and over the organization's private network for private cloud services.

Management of Cloud Services
 Public Cloud: Managed by the cloud service provider.
 Private Cloud: Managed by the administrators of that specific organization.
 Hybrid Cloud: The public cloud is managed by the cloud service provider, whereas the administrators of that particular organization manage the private cloud.

Software and Hardware Components
 Public Cloud: The cloud service provider manages these components.
 Private Cloud: That particular organization operates these components.
 Hybrid Cloud: Public cloud components – cloud service provider; private cloud components – organization.

Costs
 Public Cloud: Less expensive, as the cloud service provider offers all the resources.
 Private Cloud: Very expensive, as the organization has to purchase all the resources.
 Hybrid Cloud: Less costly for public cloud resources and more expensive for private cloud resources.

Scalability and Flexibility
 Public Cloud: High.
 Private Cloud: High.
 Hybrid Cloud: High.

Security
 Public Cloud: Low.
 Private Cloud: High.
 Hybrid Cloud: Public cloud – low; private cloud – high.

Q7: What is Community Cloud? Benefits & Examples with Use Cases.

Ans: Community Cloud is a hybrid form of private cloud. They are multi-tenant platforms that
enable different organizations to work on a shared platform.

The purpose of this concept is to allow multiple customers to work on joint projects and
applications that belong to the community, where it is necessary to have a centralized cloud
infrastructure. In other words, Community Cloud is a distributed infrastructure that solves the
specific issues of business sectors by integrating the services provided by different types of cloud
solutions.

The communities involved in these projects, such as tenders, business organizations, and
research companies, focus on similar issues in their cloud interactions. Their shared interests
may include concepts and policies related to security and compliance considerations, and the
goals of the project as well.

Community Cloud computing facilitates its users to identify and analyze their business demands
better. Community Cloud may be hosted in a data center, owned by one of the tenants, or by a
third-party cloud services provider and can be either on-site or off-site.
Community Cloud Examples and Use Cases

Cloud providers have developed Community Cloud offerings, and some organizations are
already seeing the benefits. The following list shows some of the main scenarios of the
Community Cloud model that is beneficial to the participating organizations.

 Multiple governmental departments that perform transactions with one another can have
their processing systems on shared infrastructure. This setup makes it cost-effective to the
tenants, and can also reduce their data traffic.
 Federal agencies in the United States. Government entities in the U.S. that share similar
requirements related to security levels, audit, and privacy can use Community Cloud. As it
is community-based, users are confident enough to invest in the platform for their projects.

 Multiple companies may need a particular system or application hosted on cloud services.
The cloud provider can allow various users to connect to the same environment and
segment their sessions logically. Such a setup removes the need to have separate servers
for each client who has the same intentions.
 Agencies can use this model to test applications with high-end security needs rather than
using a Public Cloud. Given the regulatory measures associated with Community Clouds,
this could be an opportunity to test features of a Public Cloud offering.

Benefits of Community Clouds

Community Cloud provides benefits to organizations in the community, individually as well as
collectively. Organizations do not have to worry about the security concerns linked with Public
Cloud because of the closed user group.

This recent cloud computing model has great potential for businesses seeking cost-effective
cloud services to collaborate on joint projects, as it comes with multiple advantages.

Openness and Impartiality


Community Clouds are open systems, and they remove the dependency organizations have on
cloud service providers. Organizations can achieve many benefits while avoiding the
disadvantages of both public and private clouds.

Flexibility and Scalability

 Ensures compatibility among each of its users, allowing them to modify properties
according to their individual use cases. They also enable companies to interact with their
remote employees and support the use of different devices, be it a smartphone or a tablet.
This makes this type of cloud solution more flexible to users’ demands.
 Consists of a community of users and, as such, is scalable in different aspects such as
hardware resources, services, and manpower. It takes into account demand growth, and
you only have to increase the user-base.

High Availability and Reliability

Your cloud service must be able to ensure the availability of data and applications at all times.
Community Clouds secure your data in the same way as any other cloud service, by replicating
data and applications in multiple secure locations to protect them from unforeseen
circumstances.

Cloud possesses redundant infrastructure to make sure data is available whenever and wherever
you need it. High availability and reliability are critical concerns for any type of cloud solution.

Security and Compliance

Two significant concerns discussed when organizations rely on cloud computing are data security
and compliance with relevant regulatory authorities. Compromising each other’s data security is
not profitable to anyone in a Community Cloud.

Users can configure various levels of security for their data. Common use cases include (a hedged configuration sketch follows this list):

 The ability to block users from editing and downloading specific datasets.
 Making sensitive data subject to strict regulation of who has access to it; sharing sensitive data unique to a particular organization would bring harm to all the members involved.
 Controlling which devices can store sensitive data.
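A hedged sketch of what such per-dataset security configuration could look like; the dataset names, roles and rules are entirely hypothetical and only illustrate the access checks described above.

# Hypothetical per-dataset policy table for a community cloud tenant.
DATASET_POLICIES = {
    "patient-records": {"view": {"clinician"}, "edit": set(), "download": set()},
    "joint-research":  {"view": {"clinician", "researcher"},
                        "edit": {"researcher"}, "download": {"researcher"}},
}

def is_allowed(dataset: str, action: str, role: str) -> bool:
    """Check whether a role may perform an action on a shared dataset."""
    return role in DATASET_POLICIES.get(dataset, {}).get(action, set())

print(is_allowed("patient-records", "download", "clinician"))  # False - downloads blocked
print(is_allowed("joint-research", "edit", "researcher"))      # True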

Convenience and Control

Conflicts related to convenience and control do not arise in a Community Cloud. Democracy is a
crucial factor the Community Cloud offers as all tenants share and own the infrastructure and
make decisions collaboratively. This setup allows organizations to have their data closer to them
while avoiding the complexities of a Private Cloud.

Less Work for the IT Department

Having data, applications, and systems in the cloud means that you do not have to manage them
entirely. This convenience eliminates the need for tenants to employ extra human resources to
manage the system. Even in a self-managed solution, the work is divided among the participating
organizations.
Environment Sustainability

In the Community Cloud, organizations use a single platform for all their needs, which dissuades
them from investing in separate cloud facilities. With fewer organizations running their own
separate clouds, resources are used more efficiently, leading to a smaller carbon footprint.

MCQ’s

1. _________ computing refers to applications and services that run on a distributed network using virtualized resources.
a) Distributed
b) Cloud
c) Soft
d) Parallel
Answer: b

2. Point out the wrong statement.


a) The massive scale of cloud computing systems was enabled by the popularization of the
Internet
b) Soft computing represents a real paradigm shift in the way in which systems are deployed
c) Cloud computing makes the long-held dream of utility computing possible with a pay-as-you-
go, infinitely scalable, universally available system
d) All of the mentioned
Answer: b

3. ________ as a utility is a dream that dates from the beginning of the computing industry
itself.
a) Model
b) Computing
c) Software
d) All of the mentioned
Answer: b

4. Which of the following is essential concept related to Cloud?


a) Reliability
b) Productivity
c) Abstraction
d) All of the mentioned
Answer: c

5. Point out the wrong statement.


a) All applications benefit from deployment in the cloud
b) With cloud computing, you can start very small and become big very fast
c) Cloud computing is revolutionary, even if the technology it is built on is evolutionary
d) None of the mentioned
Answer: a

6. Which of the following cloud concept is related to pooling and sharing of resources?
a) Polymorphism
b) Abstraction
c) Virtualization
d) None of the mentioned
Answer: c

7. ________ has many of the characteristics of what is now being called cloud computing.
a) Internet
b) Softwares
c) Web Service
d) All of the mentioned
Answer: a

8. Which of the following can be identified as cloud?


a) Web Applications
b) Intranet
c) Hadoop
d) All of the mentioned
Answer: c

9. Cloud computing is an abstraction based on the notion of pooling physical resources and presenting them as a ________ resource.
a) real
b) virtual
c) cloud
d) none of the mentioned
Answer: b

10. Which of the following is Cloud Platform by Amazon?


a) Azure
b) AWS
c) Cloudera
d) All of the mentioned
Answer: b

11. _________ model consists of the particular types of services that you can access on a
cloud computing platform.
a) Service
b) Deployment
c) Application
d) None of the mentioned
Answer: a

12. Point out the correct statement.


a) The use of the word “cloud” makes reference to the two essential concepts
b) Cloud computing abstracts systems by pooling and sharing resources
c) cloud computing is nothing more than the Internet
d) All of the mentioned
Answer: b

13. ________ refers to the location and management of the cloud’s infrastructure.
a) Service
b) Deployment
c) Application
d) None of the mentioned
Answer: b

14. Which of the following is the deployment model?


a) public
b) private
c) hybrid
d) all of the mentioned
Answer: d

15. Point out the wrong statement.


a) Cloud Computing has two distinct sets of models
b) Amazon has built a worldwide network of datacenters to service its search engine
c) Azure enables .NET Framework applications to run over the Internet
d) None of the mentioned
Answer: b

16. Which of the following is best known service model?


a) SaaS
b) IaaS
c) PaaS
d) All of the mentioned
Answer: d

17. The __________ model originally did not require a cloud to use virtualization to pool
resources.
a) NEFT
b) NIST
c) NIT
d) All of the mentioned
Answer: b

18. _______ model attempts to categorize a cloud network based on four dimensional
factors.
a) Cloud Square
b) Cloud Service
c) Cloud Cube
d) All of the mentioned
Answer: c

19. How many types of dimensions exists in Cloud Cube Model?


a) 1
b) 2
c) 3
d) 4
Answer: d

20. Which of the following dimension is related to organization’s boundaries?


a) Physical location of data
b) Ownership
c) Security boundary
d) All of the mentioned
Answer: a

21. How many types of security boundary values exist in Cloud Cube model?
a) 1
b) 2
c) 3
d) None of the mentioned
Answer: b

22. Point out the correct statement.


a) A deployment model defines the purpose of the cloud and the nature of how the cloud is
located
b) Service model defines the purpose of the cloud and the nature of how the cloud is located
c) Cloud Square Model is meant to show is that the traditional notion of a network boundary
being the network’s firewall no longer applies in cloud computing
d) All of the mentioned
Answer: a

23. Which of the following is provided by ownership dimension of Cloud Cube Model?
a) Proprietary
b) Owner
c) P
d) All of the mentioned
Answer: a

24. __________ is a measure of whether the operation is inside or outside the security
boundary or network firewall.
a) Per
b) P
c) Pre
d) All of the mentioned
Answer: d

25. Point out the wrong statement.


a) Public cloud may be managed by the constituent organization(s) or by a third party
b) A community cloud may be managed by the constituent organization(s) or by a third party
c) Private clouds may be either on- or off-premises
d) None of the mentioned
Answer: a

26. Which of the following is related to the service provided by Cloud?


a) Sourcing
b) Ownership
c) Reliability
d) AaaS
Answer: a

27. ________ dimension corresponds to two different states in the eight possible cloud
forms.
a) Physical location of data
b) Ownership
c) Security boundary
d) None of the mentioned
Answer: d

28. The ________ cloud infrastructure is operated for the exclusive use of an organization.
a) Public
b) Private
c) Community
d) All of the mentioned
Answer: b

29. __________ cloud is one where the cloud has been organized to serve a common
function or purpose.
a) Public
b) Private
c) Community
d) All of the mentioned
Answer: c
Explanation: A community cloud may be managed by the constituent organization(s) or by a third
party.

30. A ________ cloud combines multiple clouds where those clouds retain their unique
identities but are bound together as a unit.
a) Public
b) Private
c) Community
d) Hybrid
Answer: d

31. Which of the following is owned by an organization selling cloud services?


a) Public
b) Private
c) Community
d) Hybrid
Answer: a

32. Point out the wrong statement.


a) Everything from the application down to the infrastructure is the vendor’s responsibility
b) In the deployment model, different cloud types are an expression of the manner in which
infrastructure is deployed
c) AaaS provides virtual machines, operating systems, applications, services, development
frameworks, transactions, and control structures
d) All of the mentioned
Answer: c
Unit-8

Short Answers:

Q1: What is a cloud service?


Ans: A cloud service is used to build cloud applications using servers in a network over the
internet. It provides the facility of using a cloud application without installing it on the computer.
It also reduces the maintenance and support of applications that are developed using the cloud
service.

Q2 List down the three basic clouds in cloud computing?

 Professional cloud
 Personal cloud
 Performance cloud

Q3: As Infrastructure as a Service, what resources are provided by it?

Ans: IaaS (Infrastructure as a Service) provides the virtual and physical resources that are used to
build a cloud. It deals with the complexities of deploying and maintaining the services provided
by this layer. Here the infrastructure consists of servers, storage and other hardware systems.

Q4: What are the business benefits involved in cloud architecture?

Ans: The benefits involved in cloud architecture are:

 Zero infrastructure investment


 Just in time infrastructure
 More efficient resource utilization

Q5 What are the characteristics of cloud architecture that separate it from traditional
architecture?

Ans: The characteristics that set cloud architecture apart from traditional architecture are:

 Cloud architecture provides the hardware required on demand
 Cloud architecture is capable of scaling resources on demand
 Cloud architecture is capable of managing and handling dynamic workloads without failure

Q6: Mention the difference between elasticity and scalability in cloud computing.

Ans: Scalability is a characteristic of cloud computing through which an increasing workload can be
handled by increasing the amount of resource capacity in proportion. Elasticity, on the other hand,
is the characteristic that highlights the concept of commissioning and decommissioning large
amounts of resource capacity dynamically.

Q7 Mention the services that are provided by the Windows Azure operating system.

Ans: Windows Azure provides three core services, which are given as:
 Compute
 Storage
 Management

Q8: In cloud architecture what are the different components that are required?

 Cloud Ingress
 Processor Speed
 Cloud storage services
 Cloud provided services
 Intra-cloud communications

Q9: In cloud architecture what are the different phases involved?

 Launch Phase
 Monitor Phase
 Shutdown Phase
 Cleanup Phase

Q10: List down the basic characteristics of cloud computing?

 Elasticity and Scalability


 Self-service provisioning and automatic de-provisioning
 Standardized interfaces
 Billing self service based usage model

Q11: In cloud architecture what are the building blocks?

 Reference architecture
 Technical architecture
 Deployment operation architecture

Q12: Mention the ways in which cloud architecture provides automation and performance
transparency.

Ans: To provide performance transparency and automation, there are many tools used by cloud
architecture. They allow administrators to manage the cloud architecture and monitor reports, and
also to share applications using the cloud architecture. Automation is the key component of cloud
architecture, which helps to improve the degree of quality.

LONG Questions:
Q1: What Is Cloud Computing Architecture and its Benefits. Explain in detail.
Ans: Cloud Computing Architecture is divided into two parts, i.e., front-end and back-end. Front-
end and back-end communicate via a network or internet. A diagrammatic representation of
cloud computing architecture is shown below:
[Figure: Cloud Computing Architecture]

Front-End 

 It provides applications and the interfaces that are required for the cloud-based service.
 It consists of client-side applications, such as web browsers like Google
Chrome and Internet Explorer.
 Cloud infrastructure is the only component of the front-end. Let's understand it in detail.

[Figure: Front-end - Cloud Computing Architecture]

 Cloud infrastructure consists of hardware and software components such as data storage, servers, virtualization software, etc.
 It also provides a Graphical User Interface to the end-users to perform respective tasks.
Moving ahead, let’s understand what the back-end is.

Back-End
It is responsible for monitoring all the programs that run the application on the front-end.
It has a large number of data storage systems and servers. The back-end is an important and
huge part of the whole cloud computing architecture, as shown below:
[Figure: Back-end - Cloud Computing Architecture]
The components of the back-end cloud architecture are mentioned below. Let's understand them
in detail one by one. 

Application 

 It can either be software or a platform
 Depending upon the client's requirements, the application provides the result to the end user (with resources) in the back end

Service 

 Service is an essential component in cloud architecture


 Its responsibility is to provide utility in the architecture
 In a Cloud, few widely used services among the end-users are storage application
development environments and web services

Storage

 It stores and maintains data like files, videos, documents, etc. over the internet
 Some of the popular examples of storage services are below:
 Amazon S3
 Oracle Cloud-Storage 
 Microsoft Azure Storage
 Its capacity varies depending upon the service providers available in the market

Management
 Its task is to allot specific resources to specific tasks; it simultaneously performs
various functions of the cloud environment
 It helps in the management of components like application, task, service, security, data
storage, and cloud infrastructure
 In simple terms, it establishes coordination among the cloud resources

Security

 Security is an integral part of back-end cloud infrastructure


 It provides secure cloud resources, systems, files, and infrastructure to end-users
 Also, it implements security management to the cloud server with virtual firewalls which
results in preventing data loss

Benefits of Cloud Computing Architecture


The cloud computing architecture is designed in such a way that:

 It solves latency issues and improves data processing requirements
 It reduces IT operating costs and gives easy access to data and digital tools
 It helps businesses to easily scale up and scale down their cloud resources
 It has a flexibility feature which gives businesses a competitive advantage
 It results in better disaster recovery and  provides high security
 It automatically updates its services
 It encourages remote working and promotes team collaboration 

Q2: Explain Various components of Cloud Computing Architecture.

Ans: Some of the important components of Cloud Computing architecture that we will be looking
into are as follows:

 Hypervisor
 Management Software
 Deployment Software
 Network
 Cloud Server
 Cloud Storage

Hypervisor
 It is a virtual machine monitor which provides Virtual Operating Platforms to every user 
 It also manages guest operating systems in the cloud 
 It runs a separate virtual machine on the back end which consists of software and
hardware
 Its main objective is to divide and allocate resources

Management Software 

 Its responsibility is to manage and monitor cloud operations with various strategies to
increase the performance of the cloud
 Some of the operations performed by the management software are: 
 compliance auditing
 management of overseeing disaster
 contingency plans

Deployment Software

 It consists of all the mandatory installations and configurations required to run a cloud
service
 Every deployment of cloud services are performed using a deployment software
 The three different models which can be deployed are the following:

 SaaS - Software as a service hosts and manages applications of the end-user.


Example: Gmail


 PaaS - Platform as a service helps developers to build, create, and manage


applications.
Example: Microsoft Azure

 IaaS - Infrastructure as a service provides services on a pay-as-you-go pricing model.

Network

 It connects the front-end and back-end, and allows every user to access cloud resources
 It helps users to connect to the cloud and customize the route and protocol

Cloud Server

 It is a virtual server which is hosted on the cloud computing platform
 It is highly flexible, secure, and cost-effective

Cloud Storage
 Here, every bit of data is stored and accessed by a user from anywhere over the
internet
 It is scalable at run-time and is automatically accessed
 Data can be modified and retrieved from cloud storage over the web
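As an illustration of storing and retrieving an object in cloud storage, here is a small sketch using the AWS SDK for Python (boto3) against Amazon S3. The bucket and key names are placeholders, and valid AWS credentials are assumed to be configured; other storage services (Oracle Cloud Storage, Azure Storage) have their own SDKs.

import boto3  # AWS SDK for Python

# Placeholder bucket/key names; credentials must already be configured.
s3 = boto3.client("s3")
BUCKET = "example-team-bucket"

# Store a document in cloud storage ...
s3.put_object(Bucket=BUCKET, Key="reports/q1.txt", Body=b"Quarterly notes")

# ... and retrieve (or modify and re-upload) it later from anywhere over the web.
obj = s3.get_object(Bucket=BUCKET, Key="reports/q1.txt")
print(obj["Body"].read().decode())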

Q3: What is NIST Cloud Computing Reference Architecture in detail?


Ans: The NIST SP 500-292 breaks down into several sections that define and explain all
elements of cloud computing. These form a taxonomy with four distinct levels, each representing
a more nuanced, niche set of terms. The first two levels define the most essential terms:
 The Level 1 terms – A set of Roles that collectively comprise the cloud Reference Model
 The Level 2 terms – A set of Activities that define the model’s Architectural Components
By understanding these terms and the relationships between them, any company can begin to
optimize its cloud computing security architecture in response to ever-evolving cloud threats.

The NIST’s Cloud Computing Architecture Model


The first portion of NIST SP 500-292 defines the relationships between all stakeholders involved
in cloud computing. There are five major roles detailed within NIST SP 500-292:
 Cloud Consumer
 Cloud Provider
 Cloud Auditor
 Cloud Broker
 Cloud Carrier
As a disclaimer, these roles may be less stable today than they were in 2011, as providers and
consumers alike have changed drastically in nature and scale. Still, the definitions are useful as
templates for understanding the basis of stakeholders’ differing roles and responsibilities.

Cloud Consumers in the NIST Cloud Computing Reference Architecture


NIST designates Cloud Consumers as the principal stakeholders for cloud computing services.
The category includes three Cloud Consumer distinctions according to the services used:
 Software as a service (SaaS) consumers who rely on cloud computing for general office or
productivity services (e.g., HR and accounting tasks)
 Platform as a service (PaaS) consumers who rely on cloud computing for their business
intelligence needs (e.g., database management and application integration)
 Information technology as a service (ITaaS) consumers who rely on cloud computing for IT
needs (e.g., storage, backups, content delivery, and other general computing tasks)
Cloud Providers in the NIST Cloud Computing Reference Architecture
Cloud providers are the parties most closely associated with cloud consumers. They are
responsible for making cloud services available. Cloud providers’ offerings correspond to the
types of consumers, along with the “Activities” or “Components.”

SaaS cloud providers generally deploy or manage the configuration of given software on cloud
infrastructure. PaaS cloud providers generally manage the cloud infrastructure while also
developing tools for optimizing workflows. ITaaS cloud providers generally facilitate distribution,
maintenance, and monitoring of cloud infrastructure.

Cloud Auditors in the NIST Cloud Computing Reference Architecture


The NIST defines cloud auditors as parties who can execute independent audits or assessments
on a company’s cloud infrastructure. Audits are typically done to determine whether the
infrastructure meets cybersecurity or compliance benchmarks. Critically, auditing services must
be delivered separately from any cloud services when partnering with the same vendor or by
another third party.

However, in the contemporary cloud environment, a provider may integrate a secure and logically
separate auditing functionality into a suite of services. As a result, consumers might seek out
providers who integrate this functionality for efficiency’s sake.

Cloud Brokers in the NIST Cloud Computing Reference Architecture


Cloud brokers are defined as managing service providers. Consumers may contact cloud brokers
instead of cloud providers. Brokers tend to handle three cloud categories:
 Intermediation – Enhancing access, performance monitoring, identity management, etc.
 Aggregation – Integrating a provider’s cloud services into a comprehensive cloud suite
 Arbitrage – Integrating services from multiple providers into a uniform service suite
These parties may be distinct from providers, but providers may also conduct such activity.

Cloud Carriers in the NIST Cloud Computing Reference Architecture


The NIST defines cloud carriers as the parties facilitating consumers’ and providers’ data
transmissions and their connectivity to cloud services.

Cloud carriers’ responsibilities include providing and operating the physical and virtual network
resources needed to keep cloud services reachable. These responsibilities cover the transport
networks and hardware that keep cloud connections up and running, along with the endpoints or
network access devices used to access cloud data safely.

Q4: Explain IaaS – PaaS – SaaS in detail.

Ans: Cloud services are a trending topic in the business world, and almost every organization
now relies on some form of cloud computing. Managing large volumes of sensitive and
confidential data in-house is hard, and many organizations do not have enough resources to
build their own data centres, which makes cloud services an essential part of most organizations.

However, the main question is: which cloud service model should you use? An organization has
three service models of cloud computing to choose from:

 Software as a Service (SaaS)


 Infrastructure as a Service (IaaS)
 Platform as a Service (PaaS)

All of the services mentioned above have their own benefits and limitations, so it is necessary to
understand the main differences between them.

[Figure: On-premise vs IaaS vs PaaS vs SaaS – key differences, illustrated with a transport analogy]

Infrastructure as a Service (IaaS)


With IaaS, a brand essentially buys or rents server space from a vendor. They can then take
advantage of the scaling potential guaranteed by the vendor while managing every detail of their
applications — from operating system to middleware to runtime —  without any assistance from
the IaaS vendor.  

“IaaS or ‘infrastructure-as-a-service’ is often used to describe ‘cloud services’ or ‘managed
infrastructure services’. IaaS involves providers offering dedicated or cloud-based server and
network infrastructure where the provider generally manages the hardware (swapping failed hard
drives and that sort of thing) and sometimes the operating system of the infrastructure itself,” said
David Vogelpohl, VP of Web Strategy at Austin, TX-based WP Engine.
“Many IaaS providers provide additional offerings like easy deployment options, stand-alone
products which can be used on the IaaS provider’s platform, and some application layer services
and support. IaaS is generally a good fit for organizations looking for high levels of customization
for their infrastructure. [Thus], IaaS normally requires a high degree of technical proficiency within
an organization,” he explained.

Jonathan Whiteside, CTO of L.A.-based Building Blocks, expanded on Vogelpohl’s definition,
stating that “IaaS gives you total control by managing your own infrastructure in a public cloud,
but you'll need the resources and expertise to comfortably configure and manage your own
infrastructure.”

IaaS Examples
 AWS (Amazon Web Services)

 Google Compute Engine

 Microsoft Azure
Platform as a Service (PaaS)
With platform-as-a-service or PaaS, the vendor gives its clients or customers the same server
space and flexibility, but with some additional tools to help build or customize applications more
rapidly. Furthermore, a PaaS vendor handles things like runtime, middleware, operating system,
virtualization and storage — although the client or customer manages their own applications and
data.

“PaaS describes [an offering made up of] both the infrastructure and software for building digital
applications. PaaS providers generally specialize in creating certain types of applications, like
eCommerce applications for example,” Vogelpohl told CMSWire. He went on to explain how some
PaaS providers offer dedicated or virtualized hardware, and some hide the infrastructure layer
from the customer for ease of use. “PaaS is generally a good fit for organizations building a
particular type of application which would benefit from the additional features and management
offered by the PaaS for that type of application. PaaS can require a high degree of technical
proficiency; however, PaaS providers often include products and features that make it easier for
non-technical customers to create digital applications and experiences,” he continued.

Whiteside also gave his two cents, telling CMSWire that PaaS is a “good fit if you don't have the
level of expertise in infrastructure needed for IaaS but still want your development teams to
deploy their applications and websites themselves.”

PaaS Examples
 Google App Engine

 Heroku

 OutSystems
Software as a Service (SaaS)
Software-as-a-service basically handles all the technical stuff while at the same time providing an
application (or a suite of applications) that the client or customer can use to launch projects
immediately — or at least, faster than they would do with an IaaS or PaaS solution, both of which
require more technical input from the client or customer. Coincidentally, most, if not all, SaaS
vendors use IaaS or PaaS Solutions to support their suite of applications, handling the technical
elements so their customers don’t have to.
Whiteside told CMSWire that SaaS is the least hands-on of the three cloud computing solutions
and is good if you don't have developer resources but need to provide capabilities to end users.
"You won't have visibility or control of your infrastructure and are restricted by the capabilities and
configuration of the software tools. This can be restrictive when you want to integrate with other
systems you may own and run, but does allow you to get up and running quickly,” he continued.

WP Engine’s Vogelpohl chimed in once again, explaining that “SaaS providers allow customers
to take advantage of the features of the software they provide without the customer having to
purchase infrastructure or use IaaS or PaaS solutions.”

SaaS services are generally chosen based on the features and quality of the software.
"Customers choosing server software on IaaS and PaaS providers generally do so over SaaS
because of the need for high levels of customization to the core software or aggressive security
requirements,” he said.

SaaS Examples
 Google G Suite

 Microsoft Office 365

 Mailchimp
Q5: Differentiate between IaaS, PaaS and SaaS.
Ans: The following table summarizes the differences between IaaS, PaaS and SaaS with respect
to various comparison parameters.

Parameter: IaaS | PaaS | SaaS

Full Name: Infrastructure as a Service | Platform as a Service | Software as a Service

Who uses it?: System administrators | Developers | End users

What service users get: Virtual data center to store information and create platforms for services
and app development, testing and deployment | Virtual platform and tools to create, test and
deploy apps and services | Web software and apps to complete business tasks

Provider controls what?: Servers, Storage, Networking, Virtualization | Servers, Storage,
Networking, Virtualization, OS, Middleware, Runtime | Servers, Storage, Networking,
Virtualization, OS, Middleware, Runtime, Applications, Data

User controls what?: OS, Middleware, Runtime, Applications, Data | Applications, Data | –

Cost: Most expensive | Mid-level cost | Cheapest

Flexibility: Very flexible | Flexible but with some limitations | Least flexible (lowest level of
modification)

Security: Most control over data, but requires advanced security knowledge | Secure, but a higher
level of risk than SaaS | Secure, but data can be accessed by the provider

MCQ’s
1. What second programming language did Google add for App Engine development?
A. C++
B. Flash
C. Java
D. Visual Basic
Ans:C
2. What facet of cloud computing helps to guard against downtime and determines
costs?
A. Service-level agreements
B. Application programming interfaces
C. Virtual private networks
D. Bandwidth fees
ANS:A
3. Which of these is not a major type of cloud computing usage?
A. Hardware as a Service
B. Platform as a Service
C. Software as a Service
D. Infrastructure as a Service
ANS:A
4. Cloud Services have a____ relationship with their customers.
A. Many-to-many
B. One-to-many
C. One-to-one
ANS:B
5. What is the name of Rackspace’s cloud service?
A. Cloud On-Demand
B. Cloud Servers
C. EC2
ANS:B
6. What is the name of the organization helping to foster security standards for cloud
computing?
A. Cloud Security Standards Working Group
B. Cloud Security Alliance
C. Cloud Security WatchDog
D. Security in the Cloud Alliance
ANS:B
7. Which of these companies specializes in cloud computing management tools and
services?
A. RightScale
B. Google
C. Salesforce.com
D. Savis
ANS:A
8. What’s the most popular use case for public cloud computing today?
A. Test and development
B. Website hosting
C. Disaster recovery
D. Business analytics
ANS:A
9. Virtual Machine Ware (VMware) is an example of
A. Infrastructure Service
B. Platform Service
C. Software Service
ANS:A
10. Cloud Service consists of
A. Platform, Software, Infrastructure
B. Software, Hardware, Infrastructure
C. Platform, Hardware, Infrastructure
ANS:A
11. Google App Engine is a type of
A. SaaS
B. PaaS
C. IaaS
D. NA
ANS:B
12. Which vendor recently launched a cloud-based test and development service for
enterprises?
A. HP
B. Cisco
C. IBM
D. Oracle
ANS:C
13. Geographic distribution of data across a cloud provider’s network is a problem for
many enterprises because it:
A. Breaks compliance regulations
B. Adds latency
C. Raises security concerns
D. Makes data recovery harder
ANS:A
14. Amazon Web Services is which type of cloud computing distribution model?
A. Software as a Service
B. Platform as a Service
C. Infrastructure as a Service
ANS:C
15. Cloud computing networks are designed to support only private or hybrid clouds.
A. True
B. False
Ans: B
16. A good cloud computing network can be adjusted to provide bandwidth on
demand.
A. True
B. False
Ans: A
17. A larger cloud network can be built as either a layer 3 or layer 4 network.
A. True
B. False
Ans: B
18. The typical three-layer switching topology will not create latency within a cloud
network.
A. True
B. False
Ans: B
19. The term ‘Cloud’ in cloud-computing refers to __.
A. The Internet
B. Cumulus Clouds
C. A Computer
D. Thin Clients
Ans: A
20. In order to participate in cloud-computing, you must be using the following OS
___ .
A. Windows
B. Mac OS
C. Linux
D. All of the above
Ans: D
21. Which of the following is true of cloud computing?
A. It’s always going to be less expensive and more secure than local computing.
B. You can access your data from any computer in the world, as long as you have an
Internet connection.
C. Only a few small companies are investing in the technology, making it a risky
venture.
Ans:B
22. What is private cloud?
A. A standard cloud service offered via the Internet
B. A cloud architecture maintained within an enterprise data center.
C. A cloud service inaccessible to anyone but the cultural elite
Ans:B
23. Amazon Web Services is which type of cloud computing distribution model?
A. Software as a Service (SAAS)
B. Platform as a Service (PAAS)
C. Infrastructure as a Service (IAAS)
Ans:C
24. Google Docs is a type of cloud computing.
A. True
B. False
Ans: A
25. What is Cloud Foundry?
A. A factory that produces cloud components
B. An industry wide PaaS initiative
C. VMware-led open source PaaS.
Ans: C
26. This is a software distribution model in which applications are hosted by a vendor
or service provider and made available to customers over a network, typically the
Internet.
A. Platform as a Service (PaaS)
B. Infrastructure as a Service (IaaS)
C. Software as a Service (SaaS).
Ans:C
27. Which of the following statements about Google App Engine (GAE) is
INCORRECT.
A. It’s a Platform as a Service (PaaS) model.
B. Automatic Scalability is built in with GAE. As a developer you don’t need to worry
about application scalability
C. You can decide on how many physical servers required for hosting your
application.
D. The applications deployed on GAE have the same security, privacy and data
protection policies as that of Google’s applications. So, applications can take
advantage of reliability, performance and security of Google’s infrastructure.
Ans:C
28. I’ve a website containing all static pages. Now I want to provide a simple
Feedback form for end users. I don’t have software developers, and would like to
spend minimum time and money. What should I do?
A. Hire software developers, and build dynamic page.
B. Use ZOHO creator to build the required form, and embed in html page.
C. Use Google App Engine (GAE) to build and deploy dynamic page.
Ans:B
29. What is the name of the organization helping to foster security standards for cloud
computing?
A. Cloud Security Standards Working.
B. Cloud Security Alliance.
C. Cloud Security WatchDog.
D. Security in the Cloud Alliance.
Ans:B
30. “Cloud” in cloud computing represents what?
A. Wireless
B. Hard drives
C. People
D. Internet
Ans:D
Unit-9
Short Answers
Q1. What is Google App Engine?
Google App Engine (often referred to as GAE or simply App Engine) is a web framework
and cloud computing platform for developing and hosting web applications in Google-managed
data centers. Applications are sandboxed and run across multiple servers
Q2. What are the key features in Google App Engine application environment?
 dynamic web serving, with full support for common web technologies
 persistent storage with queries, sorting and transactions
 automatic scaling and load balancing
 APIs for authenticating users and sending email using Google Accounts
 a fully featured local development environment that simulates Google App Engine on the
user’s computer
 task queues for performing work outside of the scope of a web request
 scheduled tasks for triggering events at specified times and regular intervals
Q3. What are the advantages of Google App Engine ?
 Scalability
 Lower total cost of ownership
 Rich set of APIs
 Fully featured SDK for local development
 Ease of deployment
 Web administration console and diagnostic utilities
Q4. What are the services provided by Google App Engine?
Wide range of services available
 User service
 Blobstore
 Task Queues
 Mail Service
 Images
 Memcache, etc
Q5. Describe the services available in the Users service.
It provides a simple API for authentication and authorization.
It detects whether a user is signed in to the app.
It detects whether a user is an admin.
Q6. What are the three authentication options in the Users service?
Google Account
Google Apps domains users
OpenID - experimental
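As a rough illustration, the Users service can be called from a request handler with only a few
lines. This is a minimal sketch assuming the legacy App Engine Python runtime with bundled
services; the handler path and messages are illustrative, not part of the original answer:

# Minimal sketch of the App Engine Users API (legacy runtime, webapp2).
from google.appengine.api import users
import webapp2

class MainPage(webapp2.RequestHandler):
    def get(self):
        user = users.get_current_user()          # None if not signed in
        if user:
            greeting = 'Hello, %s' % user.nickname()
            if users.is_current_user_admin():    # admin check
                greeting += ' (admin)'
            self.response.write(greeting)
        else:
            # Redirect to the Google Accounts sign-in page, then back here.
            self.redirect(users.create_login_url(self.request.uri))

app = webapp2.WSGIApplication([('/', MainPage)], debug=True)
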
Q7. Describe the services available in the Blobstore service.
o The Blobstore service allows an application to serve binary objects (blobs) that are larger
than the entities allowed in the Datastore.
o Blobs are created by uploading files through HTTP.
o The upload and store logic is handled by the service.
o The upload HTTP request is then redirected to a dispatcher specified by the developer.

Q8. Describe the services available in Task Queues?


o Task queues allow the application to perform work, initiated by a user request,
outside of that request.
o They are suitable for triggering background processes.
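For example, a handler can push work onto a queue so that it runs after the user-facing request
completes. This is a minimal sketch using the legacy bundled taskqueue API; the worker URL and
parameter name are hypothetical:

from google.appengine.api import taskqueue

def enqueue_resize(image_key):
    # Adds a task to the default push queue. App Engine will later POST the
    # params to /worker/resize, outside the scope of the current request.
    taskqueue.add(url='/worker/resize', params={'image_key': image_key})
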

Q9. What are the different ways of storing application data in Google App Engine?
o Datastore
o Google Cloud SQL
o Google Cloud Storage
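A minimal Datastore sketch using the legacy ndb client library (the model and property names
are illustrative only):

from google.appengine.ext import ndb

class Greeting(ndb.Model):
    content = ndb.StringProperty()
    created = ndb.DateTimeProperty(auto_now_add=True)

# Write an entity, then query the ten most recent greetings.
Greeting(content='Hello from the Datastore').put()
recent = Greeting.query().order(-Greeting.created).fetch(10)
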

Q10: What is Amazon Web Services (AWS)?

Amazon Web Services is a collection of remote computing services (web services) that
together make up a cloud computing platform offered over the internet by Amazon.com.
Q11. What does Amazon Web Services offer?
o Low ongoing cost
o Instant Elasticity and Flexible capacity (Scaling up and down)
o Speed and Agility
o Apps not Ops
o Global Reach
o Open and flexible
o Secure

Q12. What is Amazon Elastic Compute Cloud(EC2)?


A web service that provides resizable compute capacity in the cloud. EC2 allows creating virtual
machines on demand.
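As a hedged sketch of what “on demand” means in practice, a virtual machine can be launched
with a single API call using the boto3 SDK; the AMI ID and Region below are placeholders:

import boto3

ec2 = boto3.client('ec2', region_name='us-east-1')

# Launch one small on-demand instance from a (hypothetical) machine image.
response = ec2.run_instances(
    ImageId='ami-0123456789abcdef0',   # placeholder AMI ID
    InstanceType='t3.micro',
    MinCount=1,
    MaxCount=1,
)
print('Launched', response['Instances'][0]['InstanceId'])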

Q13. What is Amazon Elastic Block Store(EBS)?


EBS provides block-level storage volumes for use with Amazon EC2 instances (originally 1 GB
to 1 TB; current volume types support far larger sizes).
 Multiple volumes can be mounted to the same instance.
 EBS volumes are network-attached and persist independently from the life of an instance.
 Storage volumes behave like raw, unformatted block devices, allowing users to create a
file system on top of Amazon EBS volumes or use them in any other way you would use
a block device.
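A short boto3 sketch of creating and attaching a volume (the Availability Zone and instance ID
are placeholders):

import boto3

ec2 = boto3.client('ec2', region_name='us-east-1')

# Create a 100 GiB gp3 volume in the same AZ as the target instance.
volume = ec2.create_volume(AvailabilityZone='us-east-1a', Size=100, VolumeType='gp3')

# Wait until the volume is available, then attach it as a raw block device
# that the instance's OS can format with any file system.
ec2.get_waiter('volume_available').wait(VolumeIds=[volume['VolumeId']])
ec2.attach_volume(
    VolumeId=volume['VolumeId'],
    InstanceId='i-0123456789abcdef0',   # placeholder instance ID
    Device='/dev/sdf',
)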

Q14. What is Amazon Simple Storage Service (S3)?


Amazon S3 provides a simple web services interface that can be used to store and retrieve
any amount of data, at any time, from anywhere on the web.

Long Answers
Q1: What is Storage as a Service (STaaS) and What is it Used For?

Ans: Storage as a service (STaaS) is a managed service in which the provider supplies the
customer with access to a data storage platform. The service can be delivered on premises from
infrastructure that is dedicated to a single customer, or it can be delivered from the public cloud
as a shared service that's purchased by subscription and is billed according to one or more
usage metrics.

STaaS customers access individual storage services through standard system interface protocols
or application program interfaces (APIs). Typical offerings include bare-metal storage capacity;
raw storage volumes; network file systems; storage objects; and storage applications that support
file sharing and backup lifecycle management.

Storage as a service was originally seen as a cost-effective way for small and mid-size
businesses that lacked the technical personnel and capital budget to implement and maintain
their own storage infrastructure. Today, companies of all sizes use storage as a service.

Uses of STaaS
Storage as a service can be used for data transfers and redundant storage, as well as to restore
any corrupted or lost data. CIOs may want to use STaaS for the ability to deploy resources at an
instant or to replace some existing storage space -- leaving room for on-premises storage
hardware. CIOs may also appreciate the ability to tailor storage capacity and performance per
workload.

As an example, instead of maintaining a large tape library and arranging to vault (store) tapes off
site, a network administrator that uses STaaS for backups could specify what data on the
network should be backed up and how often it should be backed up. Their company would sign a
service-level agreement (SLA) whereby the STaaS provider agrees to rent storage space on a
cost-per-gigabyte-stored and cost-per-data-transfer basis, and the company's data would then be
automatically transferred at the specified time over the storage provider's proprietary wide area
network (WAN) or the internet. If the company's data were to ever become corrupt or get lost, the
network administrator could contact the STaaS provider and request a copy of the data.
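As a simplified sketch of that workflow, a scheduled backup job might push selected files to a
provider’s object storage. The bucket name and file paths below are hypothetical, and a real
deployment would also handle scheduling, encryption, and retention according to the SLA:

import boto3
from pathlib import Path

s3 = boto3.client('s3')
BACKUP_BUCKET = 'example-staas-backups'          # hypothetical bucket name

def backup(paths):
    # Billed per GB stored and per GB transferred, as agreed in the SLA.
    for path in paths:
        s3.upload_file(path, BACKUP_BUCKET, 'nightly/' + Path(path).name)

def restore(name, destination):
    # Request a copy of the data back from the provider.
    s3.download_file(BACKUP_BUCKET, 'nightly/' + name, destination)

backup(['/var/exports/orders.csv', '/var/exports/customers.csv'])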

Storage as a service in cloud computing


Instead of storing data on-premises, organizations that use STaaS will typically utilize a public
cloud for storage and backup needs. Public cloud storage may also use different storage
methods for STaaS. These storage methods include backup and restore, disaster recovery, block
storage, SSD storage, object storage and bulk data transfer. Backup and restore refers to the
backing up of data to the cloud, which provides protection in case of data loss. Disaster recovery
may refer to protecting and replicating data from virtual machines (VMs).

Block storage enables customers to provision block storage volumes for lower-latency I/O. SSD
storage is another storage type that is typically used for intensive read/write and I/O operations.
Object storage systems are used in data analytics, disaster recovery and cloud applications and
tend to have higher latency. Cold storage is used for rarely accessed data that can tolerate longer
retrieval times at the lowest cost. Bulk data transfers use disks and other hardware to physically
transfer data.

Advantages of STaaS
Key advantages to STaaS in the enterprise include the following:

 Storage costs. Personnel, hardware and physical storage space expenses are


reduced.

 Disaster recovery. Having multiple copies of data stored in different locations can


better enable disaster recovery measures.

 Scalability. With most public cloud services, users only pay for the resources that they
use.

 Syncing. Files can be automatically synced across multiple devices.

 Security. Security can be both an advantage and a disadvantage, as security methods


may change per vendor. Data tends to be encrypted during transmission and while at
rest.
Disadvantages of STaaS
Common disadvantages of STaaS include the following:

 Security. Users may end up transferring business-sensitive or mission-critical data to


the cloud, which makes it important to choose a service provider that's reliable.

 Potential storage costs. If bandwidth limitations are exceeded, these could be


expensive.

 Potential downtimes. Vendors may go through periods of downtime where the


service is not available, which can be trouble for mission-critical data.
 Limited customization. Since the cloud infrastructure is owned and managed by the
service provider, it is less customizable.

 Potential for vendor lock-in. It may be difficult to migrate from one service to another.

Q2: What is S3 in AWS? List various features.


Ans: Amazon Simple Storage Service (Amazon S3) is an object storage service that offers
industry-leading scalability, data availability, security, and performance. Customers of all sizes
and industries can use Amazon S3 to store and protect any amount of data for a range of use
cases, such as data lakes, websites, mobile applications, backup and restore, archive, enterprise
applications, IoT devices, and big data analytics. Amazon S3 provides management features so
that you can optimize, organize, and configure access to your data to meet your specific
business, organizational, and compliance requirements.

Features of Amazon S3

Storage classes

Amazon S3 offers a range of storage classes designed for different use cases. For example, you
can store mission-critical production data in S3 Standard for frequent access, save costs by
storing infrequently accessed data in S3 Standard-IA or S3 One Zone-IA, and archive data at the
lowest costs in S3 Glacier Instant Retrieval, S3 Glacier Flexible Retrieval, and S3 Glacier Deep
Archive.

You can store data with changing or unknown access patterns in S3 Intelligent-Tiering, which
optimizes storage costs by automatically moving your data between four access tiers when your
access patterns change. These four access tiers include two low-latency access tiers optimized
for frequent and infrequent access, and two opt-in archive access tiers designed for
asynchronous access for rarely accessed data.

Storage management

Amazon S3 has storage management features that you can use to manage costs, meet
regulatory requirements, reduce latency, and save multiple distinct copies of your data for
compliance requirements.

 S3 Lifecycle – Configure a lifecycle policy to manage your objects and store them cost
effectively throughout their lifecycle. You can transition objects to other S3 storage classes
or expire objects that reach the end of their lifetimes (a configuration sketch follows this list).
 S3 Object Lock – Prevent Amazon S3 objects from being deleted or overwritten for a fixed
amount of time or indefinitely. You can use Object Lock to help meet regulatory
requirements that require write-once-read-many (WORM) storage or to simply add another
layer of protection against object changes and deletions.
 S3 Replication – Replicate objects and their respective metadata and object tags to one or
more destination buckets in the same or different AWS Regions for reduced latency,
compliance, security, and other use cases.
 S3 Batch Operations – Manage billions of objects at scale with a single S3 API request or
a few clicks in the Amazon S3 console. You can use Batch Operations to perform
operations such as Copy, Invoke AWS Lambda function, and Restore on millions or
billions of objects.
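For example, the S3 Lifecycle feature above can be configured with a few lines of boto3; the
bucket name, prefix, and transition schedule are illustrative, not prescriptive:

import boto3

s3 = boto3.client('s3')

# Transition objects under logs/ to Standard-IA after 30 days, to Glacier
# after 90 days, and expire them after a year.
s3.put_bucket_lifecycle_configuration(
    Bucket='example-bucket',                      # hypothetical bucket
    LifecycleConfiguration={
        'Rules': [{
            'ID': 'archive-logs',
            'Filter': {'Prefix': 'logs/'},
            'Status': 'Enabled',
            'Transitions': [
                {'Days': 30, 'StorageClass': 'STANDARD_IA'},
                {'Days': 90, 'StorageClass': 'GLACIER'},
            ],
            'Expiration': {'Days': 365},
        }]
    },
)
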

Access management

Amazon S3 provides features for auditing and managing access to your buckets and objects. By
default, S3 buckets and the objects in them are private. You have access only to the S3
resources that you create. To grant granular resource permissions that support your specific use
case or to audit the permissions of your Amazon S3 resources, you can use the following
features.

 S3 Block Public Access – Block public access to S3 buckets and objects. By default, Block
Public Access settings are turned on at the account and bucket level.
 AWS Identity and Access Management (IAM) – Create IAM users for your AWS account to
manage access to your Amazon S3 resources. For example, you can use IAM with
Amazon S3 to control the type of access a user or group of users has to an S3 bucket that
your AWS account owns.
 Bucket policies – Use IAM-based policy language to configure resource-based
permissions for your S3 buckets and the objects in them (see the policy sketch after this list).
 Access control lists (ACLs) – Grant read and write permissions for individual buckets and
objects to authorized users. As a general rule, we recommend using S3 resource-based
policies (bucket policies and access point policies) or IAM policies for access control
instead of ACLs. ACLs are an access control mechanism that predates resource-based
policies and IAM. For more information about when you'd use ACLs instead of resource-
based policies or IAM policies, see Access policy guidelines.
 S3 Object Ownership – Disable ACLs and take ownership of every object in your bucket,
simplifying access management for data stored in Amazon S3. You, as the bucket owner,
automatically own and have full control over every object in your bucket, and access
control for your data is based on policies.
 Access Analyzer for S3 – Evaluate and monitor your S3 bucket access policies, ensuring
that the policies provide only the intended access to your S3 resources.
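To illustrate the bucket policy feature above, a read-only policy can be attached with boto3; the
bucket name, account ID, and role name are placeholders:

import json
import boto3

s3 = boto3.client('s3')
bucket = 'example-bucket'                          # hypothetical bucket

# Grant a single (hypothetical) IAM role read-only access to objects in the bucket.
policy = {
    'Version': '2012-10-17',
    'Statement': [{
        'Sid': 'AllowReadOnly',
        'Effect': 'Allow',
        'Principal': {'AWS': 'arn:aws:iam::123456789012:role/ReportReader'},
        'Action': 's3:GetObject',
        'Resource': 'arn:aws:s3:::%s/*' % bucket,
    }],
}
s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))
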

Data processing
To transform data and trigger workflows to automate a variety of other processing activities at
scale, you can use the following features.

 S3 Object Lambda – Add your own code to S3 GET requests to modify and process data
as it is returned to an application. Filter rows, dynamically resize images, redact
confidential data, and much more.
 Event notifications – Trigger workflows that use Amazon Simple Notification Service
(Amazon SNS), Amazon Simple Queue Service (Amazon SQS), and AWS Lambda when
a change is made to your S3 resources.

Storage logging and monitoring

Amazon S3 provides logging and monitoring tools that you can use to monitor and control how
your Amazon S3 resources are being used. For more information, see Monitoring tools.

Automated monitoring tools

 Amazon CloudWatch metrics for Amazon S3 – Track the operational health of your S3
resources and configure billing alerts when estimated charges reach a user-defined
threshold.
 AWS CloudTrail – Record actions taken by a user, a role, or an AWS service in Amazon
S3. CloudTrail logs provide you with detailed API tracking for S3 bucket-level and object-
level operations.
Manual monitoring tools

 Server access logging – Get detailed records for the requests that are made to a bucket.
You can use server access logs for many use cases, such as conducting security and
access audits, learning about your customer base, and understanding your Amazon S3
bill.
 AWS Trusted Advisor – Evaluate your account by using AWS best practice checks to
identify ways to optimize your AWS infrastructure, improve security and performance,
reduce costs, and monitor service quotas. You can then follow the recommendations to
optimize your services and resources.

Analytics and insights

Amazon S3 offers features to help you gain visibility into your storage usage, which empowers
you to better understand, analyze, and optimize your storage at scale.
 Amazon S3 Storage Lens – Understand, analyze, and optimize your storage. S3 Storage
Lens provides 29+ usage and activity metrics and interactive dashboards to aggregate
data for your entire organization, specific accounts, AWS Regions, buckets, or prefixes.
 Storage Class Analysis – Analyze storage access patterns to decide when it's time to
move data to a more cost-effective storage class.
 S3 Inventory with Inventory reports – Audit and report on objects and their corresponding
metadata and configure other Amazon S3 features to take action in Inventory reports. For
example, you can report on the replication and encryption status of your objects. For a list
of all the metadata available for each object in Inventory reports, see Amazon S3
Inventory list.

Strong consistency

Amazon S3 provides strong read-after-write consistency for PUT and DELETE requests of
objects in your Amazon S3 bucket in all AWS Regions. This behavior applies to both writes of
new objects as well as PUT requests that overwrite existing objects and DELETE requests. In
addition, read operations on Amazon S3 Select, Amazon S3 access control lists (ACLs), Amazon
S3 Object Tags, and object metadata (for example, the HEAD object) are strongly consistent. For
more information, see Amazon S3 data consistency model.

Q3: How does S3 in AWS works? Explain Buckets, Objects and Keys.

Ans: Amazon S3 is an object storage service that stores data as objects within buckets.
An object is a file and any metadata that describes the file. A bucket is a container for objects.

To store your data in Amazon S3, you first create a bucket and specify a bucket name and AWS
Region. Then, you upload your data to that bucket as objects in Amazon S3. Each object has
a key (or key name), which is the unique identifier for the object within the bucket.

S3 provides features that you can configure to support your specific use case. For example, you
can use S3 Versioning to keep multiple versions of an object in the same bucket, which allows
you to restore objects that are accidentally deleted or overwritten.

Buckets and the objects in them are private and can be accessed only if you explicitly grant
access permissions. You can use bucket policies, AWS Identity and Access Management (IAM)
policies, access control lists (ACLs), and S3 Access Points to manage access.

Buckets

A bucket is a container for objects stored in Amazon S3. You can store any number of objects in
a bucket and can have up to 100 buckets in your account.
Every object is contained in a bucket. For example, if the object named photos/puppy.jpg is
stored in the DOC-EXAMPLE-BUCKET bucket in the US West (Oregon) Region, then it is
addressable using the
URL https://DOC-EXAMPLE-BUCKET.s3.us-west-2.amazonaws.com/photos/puppy.jpg.
When you create a bucket, you enter a bucket name and choose the AWS Region where the
bucket will reside. After you create a bucket, you cannot change the name of the bucket or its
Region. Bucket names must follow the bucket naming rules.

Buckets also:

 Organize the Amazon S3 namespace at the highest level.


 Identify the account responsible for storage and data transfer charges.
 Provide access control options, such as bucket policies, access control lists (ACLs), and
S3 Access Points, that you can use to manage access to your Amazon S3 resources.
 Serve as the unit of aggregation for usage reporting.

Objects

Objects are the fundamental entities stored in Amazon S3. Objects consist of object data and
metadata. The metadata is a set of name-value pairs that describe the object. These pairs
include some default metadata, such as the date last modified, and standard HTTP metadata,
such as Content-Type. You can also specify custom metadata at the time that the object is
stored.

An object is uniquely identified within a bucket by a key (name) and a version ID (if S3 Versioning
is enabled on the bucket).

Keys

An object key (or key name) is the unique identifier for an object within a bucket. Every object in a
bucket has exactly one key. The combination of a bucket, object key, and optionally, version ID (if
S3 Versioning is enabled for the bucket) uniquely identify each object. So you can think of
Amazon S3 as a basic data map between "bucket + key + version" and the object itself.

Every object in Amazon S3 can be uniquely addressed through the combination of the web
service endpoint, bucket name, key, and optionally, a version. For example, in the
URL https://DOC-EXAMPLE-BUCKET.s3.us-west-2.amazonaws.com/photos/puppy.jpg, DOC-
EXAMPLE-BUCKET is the name of the bucket and /photos/puppy.jpg is the key.
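A minimal boto3 sketch of the bucket/key workflow described above (the bucket name is a
placeholder, since bucket names must be globally unique):

import boto3

s3 = boto3.client('s3', region_name='us-west-2')
bucket = 'doc-example-bucket-12345'                # hypothetical bucket name

# Create the bucket in the chosen Region.
s3.create_bucket(
    Bucket=bucket,
    CreateBucketConfiguration={'LocationConstraint': 'us-west-2'},
)

# Upload an object; 'photos/puppy.jpg' is its key within the bucket.
s3.upload_file('puppy.jpg', bucket, 'photos/puppy.jpg')

# Retrieve the same object by bucket + key.
body = s3.get_object(Bucket=bucket, Key='photos/puppy.jpg')['Body'].read()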

Q4: How Google App Engine helps in cloud computing?


Ans: Building applications on the cloud is gaining traction as it accelerates your business
opportunities while ensuring availability, security, accessibility, and scalability. However, to start
with creating web applications, you would require a suitable cloud computing technology. This is
where Google App Engine fits in by allowing you to build and host web applications on a fully-
managed serverless platform.

[Figure: The App Engine architecture in cloud computing]

Services provided by App Engine include:


 Platform as a Service (PaaS) to build and deploy scalable applications

 Hosting facility in fully-managed data centers

 A fully-managed, flexible environment platform for managing application server and


infrastructure

 Support in the form of popular development languages and developer tools


Major Features of Google App Engine in Cloud Computing

1. Collection of Development Languages and Tools

The App Engine supports numerous programming languages for developers and offers the
flexibility to import libraries and frameworks through docker containers. You can develop and test
an app locally using the SDK containing tools for deploying apps. Every language has its SDK
and runtime.
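For instance, a minimal App Engine standard environment service in Python is just a small web
app plus an app.yaml file naming the runtime. The sketch below is illustrative (runtime version,
route, and file names are assumptions) and would be deployed with the gcloud SDK command
gcloud app deploy:

# main.py - a minimal App Engine standard environment service (illustrative).
from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello():
    return 'Hello from App Engine'

# app.yaml (a separate file) would contain, at minimum, a line such as:
#   runtime: python39
# App Engine then builds, deploys, and autoscales the service; no server
# configuration is written by the developer.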

2. Fully Managed

Google allows you to add your web application code to the platform while managing the
infrastructure for you. The engine ensures that your web apps are secure and running and saves
them from malware and threats by enabling the firewall.
3. Pay-as-you-Go

The app engine works on a pay-as-you-go model, i.e., you only pay for what you use. The app
engine automatically scales up resources when the application traffic picks up and vice-versa.
4. Effective Diagnostic Services

Cloud Monitoring and Cloud Logging help run app scans to identify bugs, and the app
reporting document helps developers fix bugs immediately.
5. Traffic Splitting

The app engine automatically routes the incoming traffic to different versions of the apps as a
part of A/B testing. You can plan the consecutive increments based on what version of the app
works best.

Q5: Explain various benefits of Google App Engine in Cloud Computing.

Ans: Adopting the App Engine is a smart decision for your organization — it will allow you to
innovate and stay valuable. Here is why Google App Engine is a preferable choice for building
applications:
1. All Time Availability

When you develop and deploy your web applications on the cloud, you enable remote access for
your applications. Considering the impact of COVID-19 on businesses, Google App Engine is the
right choice that lets the developers develop applications remotely, while the cloud service
manages the infrastructure needs.
2. Ensure Faster Time to Market

For your web applications to succeed, ensuring faster time to market is imperative as the
requirements are likely to change if the launch time is extended. Using Google App Engine is as
easy as it can get for developers. The diverse tool repository and other functionalities ensure that
the Google Cloud application development and testing time gets reduced, which, in turn, ensures
faster launch time for MVP and consecutive launches.
3. Easy to Use Platform

Developers only need to write code. With zero configuration and server management, you
eliminate the burden of managing and deploying the code. Google App Engine is an easy-to-use
platform, which gives you the flexibility to focus on other concurrent web applications and
processes. The best part is that GAE automatically handles traffic increases through patching,
provisioning, and monitoring.
4. Diverse Set of APIs

Google App Engine has several built-in APIs and services that allow developers to build robust
and feature-rich apps. These features include:

 Access to the application log

 Blobstore, serve large data objects

 Google App Engine Cloud Storage

 SSL Support

 Page Speed Services

 Google Cloud Endpoint, for mobile application

 URL Fetch API, User API, Memcache API, Channel API, XMPP API, File API
5. Increased Scalability

Scalability is synonymous with growth — an essential factor that assures success and
competitive advantage. The good news is that the Google App Engine cloud development
platform is automatically scalable. Whenever the traffic to the web application increases, GAE
automatically scales up the resources, and vice-versa.
6. Improved Savings

With Google App Engine, you do not have to spend extra on server management of the app;
the Google Cloud service handles the backend processes.
Google App Engine pricing is also flexible: resources automatically scale up or down based on
the app’s usage and on how the app performs in the market, so you only pay for what the app
actually consumes.
7. Smart Pricing

A major concern for organizations is how much Google App Engine costs. For your
convenience, Google App Engine has both a daily and a monthly billing cycle:
 Daily: You are charged daily for the resources you use.
 Monthly: All the daily charges are totalled, applicable taxes are added, and the amount
is debited from your payment method.
Also, the App Engine has a dedicated billing dashboard, “App Engine Dashboard” to view and
manage your account and subsequent billings.

Q6: What Is Azure? Why Use Azure?

Ans: Microsoft Azure is a cloud computing service from Microsoft. Azure offers a range of
software as a service (SaaS), platform as a service (PaaS) and infrastructure as a service (IaaS)
options for deploying applications and services on Microsoft-managed data center infrastructure.

Microsoft Azure has more than a hundred services to help you quickly solve your toughest
challenges. Azure’s agility and built-in Development Operations (DevOps) allow you to iterate
quickly and deliver code using an end-to-end cloud development platform. Whatever language
you utilize, whether Microsoft Azure’s Visual Studio Team Services or another open-source tool
like Chef or Jenkins, you will be able to debug faster and easier than ever before.

Microsoft Azure supports private cloud, public cloud, and hybrid cloud deployments. Azure’s
robust Information Security (InfoSec) services provide general, storage, database, and
networking security, identity and access management, backup, and Disaster Recovery (DR).

Microsoft Azure supports any tool, language, or framework: Node.js, Java, .NET, and more.
Microsoft’s best-in-class development tools help you write great code. Visual Studio and Visual
Studio Code are supported to improve your productivity.

Microsoft Azure offers more than 100 turn-key services, and the latest in AI and data to bring
intelligence to your operations. More than 150 Azure Logic Apps connections are available right
away, including favorites like Office 365, Dropbox, Google Services, Salesforce, and Twitter.

Things that you should know about Azure:

 It was launched on February 1, 2010, significantly later than its main competitor, AWS.

 It’s free to start and follows a pay-per-use model, which means you pay only for the
services you opt for.

 Interestingly, 80 percent of the Fortune 500 companies use Azure services for their
cloud computing needs.

 Azure supports multiple programming languages, including Java, Node Js, and C#.

 Another benefit of Azure is the number of data centers it has around the world. There
are 42 Azure data centers spread around the globe, which is the highest number of
data centers for any cloud platform. Also, Azure is planning to get 12 more data
centers, which will increase the number of data centers to 54, shortly.

Why Use Azure?


Now that you know more about Azure and the services it provides, you might be interested in
exploring the various uses of Azure.
 Application development: You can create any web application in Azure.
 Testing: After developing an application successfully on the platform, you can test it.
 Application hosting: Once the testing is done, Azure can help you host the application.
 Create virtual machines: You can create virtual machines in any configuration you want
with the help of Azure. 
 Integrate and sync features: Azure lets you integrate and sync virtual devices and
directories. 
 Collect and store metrics: Azure lets you collect and store metrics, which can help you
find what works. 
 Virtual hard drives: These are extensions of the virtual machines; they provide a huge
amount of data storage.

Q7: What are the Various Azure Services 


Ans: Azure provides more than 200 services, divided into 18 categories. These categories
include computing, networking, storage, IoT, migration, mobile, analytics, containers, artificial
intelligence and machine learning, integration, management tools, developer tools, security,
databases, DevOps, media, identity, and web services. Let’s take a look at some of the
major Azure services by category:
Compute Services  
 Virtual Machine
This service enables you to create a virtual machine in Windows, Linux or any other
configuration in seconds.
 Cloud Service
This service lets you create scalable applications within the cloud. Once the application
is deployed, everything, including provisioning, load balancing, and health monitoring, is
taken care of by Azure. 
 Service Fabric
With Service Fabric, the process of developing and operating microservices is immensely
simplified. A microservice-based application is composed of many small, independently
deployable services rather than one large bundled application.
 Functions
With functions, you can create applications in any programming language. The best
part about this service is that you need not worry about hardware requirements while
developing applications because Azure takes care of that. All you need to do is provide
the code.
Networking
 Azure CDN
Azure CDN (Content Delivery Network) is for delivering content to users. It uses a high
bandwidth, and content can be transferred to any person around the globe. The CDN
service uses a network of servers placed strategically around the globe so that the
users can access the data as soon as possible.
 Express Route 
This service lets you connect your on-premise network to the Microsoft cloud or any
other services that you want, through a private connection. So, the only
communications that will happen here will be between the enterprise network and the
service that you want. 
 Virtual network
The virtual network allows you to have any of the Azure services communicate with one
another privately and securely. 
 Azure DNS
This service allows you to host your DNS domains or system domains on Azure.
Storage
 Disk Storage 
This service allows you to choose from either HDD (Hard Disk Drive) or SSD (Solid
State Drive) as your storage option along with your virtual machine.
 Blob Storage 
This service is optimized to store a massive amount of unstructured data, including text
and even binary data (a short upload sketch follows this list). 
 File Storage
This is a managed file storage service that can be accessed via industry SMB (server
message block) protocol. 
 Queue Storage 
With queue storage, you can provide stable message queuing for a large workload.
This service can be accessed from anywhere in this world.
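As an illustration of the Blob Storage service, the Azure SDK for Python (azure-storage-blob)
can upload unstructured data to a container. The connection string, container, and file names
below are placeholders, and the container is assumed to already exist:

from azure.storage.blob import BlobServiceClient

# Placeholder connection string, normally copied from the storage account's access keys.
conn_str = 'DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net'

service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client('reports')      # hypothetical container

# Upload a local file as a block blob.
with open('monthly-report.pdf', 'rb') as data:
    container.upload_blob(name='2024/monthly-report.pdf', data=data, overwrite=True)
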

Q7: What is cloud analytics? How can cloud analytics help drive growth and scalability?

Ans: Cloud analytics describes the application of analytic algorithms in the cloud against data in
a private or public cloud to then deliver a result of interest. Cloud analytics involves deployment
of scalable cloud computing with powerful analytic software to identify patterns in data and to
extract new insights. More and more businesses rely on data analysis to gain a competitive
advantage, to advance scientific discovery, or to improve life in all sorts of ways. Data analytics
has therefore become an increasingly valuable tool as the quantity and the value of data continue
to climb.

Cloud analytics is often associated with artificial intelligence (AI), machine learning (ML),


and deep learning (DL). And it is commonly used in industry applications such as scientific
research in genomics or in oil and gas fields, business intelligence, security, Internet of Things
(IoT), and many others. In fact, any industry can benefit from data analytics to improve
organizational performance and to drive new value.
By leveraging AI and other analytics approaches, organizations of all sizes can quickly make
data-driven decisions to gain efficiencies in their products and services. The cloud is an
indispensable platform that enables quick experimentation of ideas through proofs of concept
(POCs) and provides a rich software ecosystem for building AI applications and for training DL
models.

AI is increasingly being used in multiple industry verticals to support important business needs
such as automating business processes, providing cognitive insights through data analysis, and
interacting with customers with natural language processing. DL, the next level of ML, is effective
at learning from large volumes of data to mimic the human brain’s pattern recognition (for
example, images, speech, and text).

Cloud infrastructure analytics, a subset of cloud analytics, focuses on the analysis of data that’s
associated with IT infrastructure, on the premises or in the cloud. The goal is to identify I/O
patterns, to evaluate application performance, to identify policy compliance, and to support
capacity management and infrastructure resilience.

How cloud analytics can help drive growth and scalability


Data analytics isn’t a new concept. The term “big data” was introduced in the late 90’s to describe
large data sets often found in specific industries such as energy, financial services, healthcare,
space travel, and other scientific disciplines. The ability to analyze and extract insights from large
data sets, data analytics, accelerated with the introduction of analytics software, such as Apache
Hadoop. As analytics technologies and workloads moved to the cloud they became known as
Cloud Analytics. Cloud Analytics has rapidly increased the ease, accessibility, and capability of
performing complex data analysis on very large data sets.

Cloud Analytics is particularly interesting for several reasons:

 The amount of data collected around the world is growing at staggering rates and
much of it is being created and pooled in the cloud or at IOT endpoints

 Services delivered in the cloud are much easier to deploy as they are delivered as an
automated service and they don’t require deployment and maintenance of physical
hardware

 The cloud business model enables a user to turn services on and off as needed. This
consumption approach allows customers to pay only for what they use when they use
it, thereby removing the responsibility of procuring and managing capital infrastructure
as well as reducing data center space
 The cloud allows users to deploy the right quantity of IT resources to match the
problem at hand. Dynamic resizing of resources means that users can easily apply
compute and storage and scale them as needs change. Users are spared the
requirement to procure a fixed capacity of physical IT equipment for all of their data
analysis projects

 Building a hybrid analytics solution is effective for users who wish to leverage the
cloud to test a new analytics project as a POC before committing to investments on-
premises

Cloud Analytics empowers organizations to:

 Test genomic data to better understand genetic disease and how to offer cures

 Identify patterns in speech, images and videos in order to improve customer


satisfaction and improve customer service

 Study buying behavior to improve product availability and delivery

 Identify patterns of disease reporting to improve availability of medicine and vaccines

 Analyze hybrid cloud infrastructures to improve application performance and optimize


IT costs

Q8: What is software as a service and security in cloud computing?


Ans: The popularity of the software-as-a-service (SaaS) model means that it’s here to stay for the
foreseeable future. This means that companies will need to develop at least a basic understanding
of what it means for security in general and data protection in particular. 

SaaS makes security a dual responsibility


With SaaS, as with pretty much everything else to do with the cloud, security is a dual
responsibility. The provider is responsible for securing their cloud platform against external
threats (environmental and human) and the customer is responsible for managing their own
accesses to ensure that they are not misused by their own staff or anyone else.
With SaaS you are very much dependent on your service provider’s security
SaaS is typically offered in a public cloud environment, in other words, customers access a
service that is used by other, unrelated, customers, of whom they have no knowledge and over
whom they have no control.
This means that in addition to the standard threat of cyber attackers, you have the theoretical
possibility that another customer will either be leaked your data or use their access to breach the
“invisible walls”, which are supposed to separate customers from each other and give the illusion
of complete privacy. For the sake of completeness, at present, we are not aware of any real-
world instances where this has happened, just “proof-of-concept” demonstrations.
Because of this, it’s really important to make sure that you are working with a reputable SaaS
provider and make sure that you understand as much as possible about their security and how
they implement it. Obviously, the exact details of the security system will almost certainly be a
closely-guarded secret but many providers will give customers a thorough overview of what sort
of processes and tools they use to make sure that their platform has robust protection.
Unauthorized use of SaaS can be a major security threat
You can only scrutinize a vendor’s security when you know you are using a vendor and you can
only decide what additional security processes are required for a SaaS platform when you are
aware that you are using it. It’s not enough just to tell employees that they must get permission
before signing up to a SaaS platform (especially since they may not grasp what a SaaS platform
actually is). You need to keep monitoring and auditing your network usage so that you quickly
identify who is using what and take remedial action as necessary.
For the sake of completeness, if you have a fairly relaxed network-usage policy, employees may
use unauthorized SaaS services for their own purposes, for example, during their lunch break. In
principle, you may be fine with this, but you will still need to check firstly that their activity is
personal and does not involve any of your data and secondly that there is no other way that their
SaaS usage might compromise your security.
Your SaaS security is only as good as your identity and access management
Identity and access management is core to all forms of security and SaaS is no exception. The
good news is that it’s fairly straightforward to implement a very granular level of access control –
once you have defined exactly who needs access to exactly what. There also needs to be a
process in place to ensure that accesses are reviewed periodically, even if people stay within the
same role.

Q9: What is security governance in cloud computing? Explain various challenges.


Ans: Cloud security governance refers to the management model that facilitates effective and
efficient security management and operations in the cloud environment so that an enterprise’s
business targets are achieved. This model incorporates a hierarchy of executive mandates,
performance expectations, operational practices, structures, and metrics that, when
implemented, result in the optimization of business value for an enterprise. Cloud security
governance helps answer leadership questions such as:
 Are our security investments yielding the desired returns?
 Do we know our security risks and their business impact?
 Are we progressively reducing security risks to acceptable levels?
 Have we established a security-conscious culture within the enterprise?
Strategic alignment, value delivery, risk mitigation, effective use of resources, and performance
measurement are key objectives of any IT-related governance model, security included. To
successfully pursue and achieve these objectives, it is important to understand the operational
culture and business and customer profiles of an enterprise, so that an effective security
governance model can be customized for the enterprise.        
Cloud Security Governance Challenges
Whether developing a governance model from the start or having to retrofit one on existing
investments in cloud, these are some of the common challenges:
Lack of senior management participation and buy-in
The lack of a senior management influenced and endorsed security policy is one of the common
challenges facing cloud customers. An enterprise security policy is intended to set the executive
tone, principles and expectations for security management and operations in the cloud. However,
many enterprises tend to author security policies that are often laden with tactical content, and
lack executive input or influence. The result of this situation is the ineffective definition and
communication of executive tone and expectations for security in the cloud. To resolve this
challenge, it is essential to engage enterprise executives in the discussion and definition of tone
and expectations for security that will feed a formal enterprise security policy. It is also essential
for the executives to take full accountability for the policy, communicating inherent provisions to
the enterprise, and subsequently enforcing compliance  
Lack of embedded management operational controls
Another common cloud security governance challenge is lack of embedded management
controls into cloud security operational processes and procedures. Controls are often interpreted
as an auditor’s checklist or repackaged as procedures, and as a result, are not effectively
embedded into security operational processes and procedures as they should be, for purposes of
optimizing value and reducing day-to-day operational risks. This lack of embedded controls may
result in operational risks that may not be apparent to the enterprise. For example, the security
configuration of a device may be modified (change event) by a staffer without proper analysis of
the business impact (control) of the modification. The net result could be the introduction of
exploitable security weaknesses that may not have been apparent with this modification. The
enterprise would now have to live with an inherent operational risk that could have been avoided
if the control had been embedded in the change execution process.
Lack of operating model, roles, and responsibilities
Many enterprises moving into the cloud environment tend to lack a formal operating model for
security, or do not have strategic and tactical roles and responsibilities properly defined and
operationalized. This situation stifles the effectiveness of a security management and operational
function/organization to support security in the cloud. Simply, establishing a hierarchy that
includes designating an accountable official at the top, supported by a stakeholder committee,
management team, operational staff, and third-party provider support (in that order) can help an
enterprise to better manage and control security in the cloud, and protect associated investments
in accordance with enterprise business goals. This hierarchy can be employed in an in-sourced,
out-sourced, or co-sourced model depending on the culture, norms, and risk tolerance of the
enterprise.
Lack of metrics for measuring performance and risk
Another major challenge for cloud customers is the lack of defined metrics to measure security
performance and risks – a problem that also stifles executive visibility into the real security risks
in the cloud. This challenge is directly attributable to the combination of other challenges
discussed above. For example, a metric that quantitatively measures the number of exploitable
security vulnerabilities on host devices in the cloud over time can be leveraged as an indicator of
risk in the host device environment. Similarly, a metric that measures the number of user-
reported security incidents over a given period can be leveraged as a performance indicator of
staff awareness and training efforts. Metrics enable executive visibility into the extent to which
security tone and expectations (per established policy) are being met within the enterprise and
support prompt decision-making in reducing risks or rewarding performance as appropriate.
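
As an illustration only (not part of the source material), the sketch below shows how such metrics might be computed from raw scan and incident records; the record formats and figures are hypothetical.

```python
from collections import Counter
from datetime import date

# Hypothetical vulnerability scan findings: (host, discovery_date, severity)
findings = [
    ("web-01", date(2023, 1, 14), "high"),
    ("web-01", date(2023, 2, 3), "medium"),
    ("db-01", date(2023, 2, 20), "high"),
]

# Risk indicator: exploitable vulnerabilities found per month
vulns_per_month = Counter(f[1].strftime("%Y-%m") for f in findings)

# Performance indicator: user-reported security incidents per quarter
incidents = [date(2023, 1, 5), date(2023, 3, 9), date(2023, 3, 30)]
incidents_per_quarter = Counter(f"Q{(d.month - 1) // 3 + 1}-{d.year}" for d in incidents)

print(vulns_per_month)        # Counter({'2023-02': 2, '2023-01': 1})
print(incidents_per_quarter)  # Counter({'Q1-2023': 3})
```

Trending such counters over time is what gives executives the visibility into risk and performance described above.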

Q10: What are the Key Objectives for Cloud Security Governance?
Ans: Building a cloud security governance model for an enterprise requires strategic-level
security management competencies in combination with the use of appropriate security
standards and frameworks (e.g., NIST, ISO, CSA) and the adoption of a governance framework
(e.g., COBIT). The first step is to visualize the overall governance structure and its inherent
components, and to direct their effective design and implementation. The use of appropriate
security standards and frameworks allows for a minimum standard of security controls to be
implemented in the cloud, while also meeting customer and regulatory compliance obligations
where applicable. A governance framework provides referential guidance and best practices for
establishing the governance model for security in the cloud. The following represents key
objectives to pursue in establishing a governance model for security in the cloud. These
objectives assume that appropriate security standards and a governance framework have been
chosen based on the enterprise’s business targets, customer profile, and obligations for
protecting data and other information assets in the cloud environment.
1. Strategic Alignment
Enterprises should mandate that security investments, services, and projects in the cloud
are executed to achieve established business goals (e.g., market competitiveness,
financial, or operational performance).
 
2. Value Delivery
Enterprises should define, operationalize, and maintain an appropriate security
function/organization with appropriate strategic and tactical representation, and charged
with the responsibility to maximize the business value (Key Goal Indicators, ROI) from the
pursuit of security initiatives in the cloud.  
 
3. Risk Mitigation
Security initiatives in the cloud should be subject to measurements that gauge
effectiveness in mitigating risk to the enterprise (Key Risk Indicators). These initiatives
should also yield results that progressively demonstrate a reduction in these risks over
time.
 
4. Effective Use of Resources
It is important for enterprises to establish a practical operating model for managing and
performing security operations in the cloud, including the proper definition and
operationalization of due processes, the institution of appropriate roles and
responsibilities, and use of relevant tools for overall efficiency and effectiveness.
 
5. Sustained Performance
Security initiatives in the cloud should be measurable in terms of performance, value and
risk to the enterprise (Key Performance Indicators, Key Risk Indicators), and yield results
that demonstrate attainment of desired targets (Key Goal Indicators) over time.

MCQ’s

1. Which of the following service creates an application hosting environment?


a) EBS
b) Azure AppFabric
c) ESW
d) All of the mentioned

Answer: b
Clarification: AppFabric is a cloud-enabled version of the .NET Framework.

2. Point out the wrong statement.


a) Microsoft’s approach is to view cloud applications as software plus service
b) Microsoft calls their cloud operating system the Windows Azure Platform
c) Azure is a combination of virtualized infrastructure to which the .NET Framework has been
added as a set of .NET Services
d) None of the mentioned

Answer: d
Clarification: In SaaS model, the cloud is another platform and applications can run locally and
access cloud services or run entirely in the cloud and be accessed by browsers using standard
Service Oriented Architecture (SOA) protocols.

3. Which of the following is also known as Compute?


a) set of virtual machine instances
b) set of replicas
c) set of commodity servers
d) all of the mentioned

Answer: a
Clarification: Azure and its related services were built to allow developers to extend their
applications into the cloud.
4. Database marketplace based on SQL Azure Database is code-named ________
a) Akamai
b) Dallas
c) Denali
d) None of the mentioned

Answer: b
Clarification: Azure is a virtualized infrastructure to which a set of additional enterprise services
has been layered on top.

5. Point out the correct statement.


a) The Windows Azure service itself is a hosted environment of virtual machines enabled by a
fabric called Windows Azure AppFabric
b) Windows Azure service is a Compliance as a Service offering
c) Windows Live Services is a collection of applications and services that run on the Web
d) All of the mentioned

Answer: c
Clarification: Some of these applications called Windows Live Essentials are add-ons to Windows
and downloadable as applications.

6. ________ Live Services can be used in applications that run in the Azure cloud.
a) Microsoft
b) Windows
c) Yahoo
d) Ruby

Answer: b
Clarification: Windows Live Services is a collection of services that runs on Windows Live.

7. Which of the following is based on Microsoft Dynamics?


a) Static CRM
b) Social CRM
c) Dynamics CRM
d) None of the mentioned

Answer: c
Clarification: Dynamics CRM is an xRM (Anything Relationship Management) service.

8. Which of the following is based on Microsoft Sharepoint?


a) Sharepoint Services
b) .NET Services
c) Windows Services
d) All of the mentioned

Answer: a
Clarification: A document and collaboration service based on SharePoint is called SharePoint
Service.
9. Azure is Microsoft’s ___________ as a Service Web hosting service.
a) Platform
b) Software
c) Infrastructure
d) All of the mentioned

Answer: a
Clarification: Windows Azure Platform is a competitor to Google’s App Engine.

10. Which of the following is a pure infrastructure play?


a) Azure
b) Google App Engine
c) AWS
d) None of the mentioned

Answer: c
Clarification: Microsoft has a very different vision for cloud services than either Amazon or
Google does.

Question-11 : Which of these is a business concern in the Cloud?

(A) : Expertise

(B) : Cost Management

(C) : Outages

(D) : All of these

Answer : (d)

Question-12 : Point out the wrong statement.

(A) : Soft computing represents a real paradigm shift in the way in which systems are deployed

(B) : The massive scale of cloud computing systems was enabled by the popularization of the
Internet

(C) : Cloud computing makes the long-held dream of utility computing possible with a pay-as-you-
go, infinitely scalable, universally available system

(D) : All of these

Answer : (a)

Question-13 : ________ as a utility is a dream that dates from the beginning of the
computing industry itself.

(A) : Model
(B) : Software

(C) : Computing

(D) : All of these

Answer : (c)

Question-14 : Which of the following is an essential concept related to Cloud?

(A) : Reliability

(B) : Abstraction

(C) : Productivity

(D) : None of these

Answer : (b)

Question-15 : Point out the wrong statement.

(A) : All applications benefit from deployment in the cloud

(B) : With cloud computing, you can start very small and become big very fast

(C) : Cloud computing is revolutionary, even if the technology it is built on is evolutionary

(D) : None of these

Answer : (a)

Question-16 : Which of the following cloud concept is related to pooling and sharing of
resources?

(A) : Polymorphism

(B) : Abstraction

(C) : Virtualization

(D) : None of these

Answer : (c)

Question-17 : Which of the following is Cloud Platform by Microsoft?

(A) : Azure

(B) : AWS
(C) : Cloudera

(D) : Rackspace

Answer : (a)

Question-18 : Which of the following benefits relates to creating resources that are pooled
together in a system that supports multi-tenant usage?

(A) : On-demand self-service

(B) : Broad network access

(C) : Resource pooling

(D) : All of these

Answer : (c)

Question-19 : Point out the wrong statement.

(A) : The cost advantages of cloud computing have enabled new software vendors to create
productivity applications

(B) : A client can provision computer resources without the need for interaction with cloud service
provider personnel

(C) : All infrastructure management must be done by the client

(D) : None of these

Answer : (c)

Question-20 : All cloud computing applications suffer from the inherent _______ that is
intrinsic in their WAN connectivity.

(A) : propagation

(B) : latency

(C) : noise

(D) : All of these

Answer : (b)

Question-21 : Which of the following is the most important area of concern in cloud
computing?

(A) : Storage
(B) : Scalability

(C) : Availability

(D) : Security

Answer : (d)

Question-22 : You can’t count on a cloud provider maintaining your _____ in the face of
government actions.

(A) : scalability

(B) : reliability

(C) : privacy

(D) : availability

Answer : (c)

Question-23 : ________ refers to the location and management of the cloud's infrastructure.

(A) : Service

(B) : Application

(C) : Deployment

(D) : None of these

Answer : (c)

Question-24 : Point out the wrong statement.

(A) : Cloud Computing has two distinct sets of models

(B) : Amazon has built a worldwide network of data centers to service its search engine

(C) : Azure enables .NET Framework applications to run over the Internet

(D) : None of these

Answer : (b)

Question-25 : Which of the following is the best-known service model?

(A) : IaaS
(B) : PaaS

(C) : SaaS

(D) : All of these

Answer : (d)

Question-26 : The ________ cloud infrastructure is operated for the exclusive use of an
organization.

(A) : Public

(B) : Private

(C) : Community

(D) : All of these

Answer : (b)

Question-27 : Creating more logical IT resources within one physical system is called
________.

(A) : Load balancing

(B) : Hypervisor

(C) : Virtualization

(D) : None of these

Answer : (c)

Question-28 : What is most commonly used for managing the resources for every virtual
system?

(A) : Load balancer

(B) : Hypervisor

(C) : Router

(D) : Cloud

Answer : (b)

Question-29 : Which is not a benefit of virtualization?

(A) : Flexible and efficient allocation of resources


(B) : Lowers the cost of IT infrastructure

(C) : Remote access and rapid scalability

(D) : Run on single operating system

Answer : (d)
Unit-10
Short Answers
Q1. Mention the types of virtualization?
Answer:

Types of Virtualization are as follows:

 User Virtualization

 Application Virtualization

 Hardware virtualization

 Desktop Virtualization

 Network Virtualization

 Server Virtualization

Q2. Benefits of virtualization?


Answer:

Below are the benefits of Virtualization :

 Cost and resource reduction.

 Reduces hardware dependency for running giant applications.

 Allows installing multiple systems on a single platform.

 Reduces the amount of space involved in installing data centers.

Q3. What are the different types of server software VMware provides?


Answer:

They are as follows :

 VMware ESX server

 VMware server

 VMware ESXi server


Q4. What is VM cloning?
Answer:

VM cloning is the process of creating a copy of an existing VM with an identical configuration;

once the cloning process is completed, the clone becomes a separate virtual machine and the
original VM becomes its parent.

Q5 How does virtualization work in cloud computing?

Ans: Virtualization plays a very important role in cloud computing technology. Normally, in
cloud computing, users share the data present in the cloud (applications and so on), but with
the help of virtualization users actually share the infrastructure.

The main use of virtualization technology is to provide standard versions of applications to
cloud users; whenever the next version of an application is released, the cloud provider has to
provide that latest version to all of its cloud users, which is practically not feasible because it is
expensive.

To overcome this problem we use virtualization technology: all the servers and software
applications required by the cloud providers are maintained by a third party, and the cloud
providers pay for this on a monthly or annual basis.

Q6: What is the need for a virtualization platform to implement the cloud?

Ans. Virtualization in cloud computing is very much required to –

 Decouple hardware from software


 Save the cost for components like hardware and servers
 Store the data in the virtual server
 Reduce the wastage, electricity bills, and maintenance costs

Q7. Name different types of virtualization in cloud computing.

Ans. Different types of virtualization in cloud computing are –

 Hardware Virtualization
 Software Virtualization
 Memory Virtualization
 Storage Virtualization
 Data Virtualization
 Network Virtualization
 Desktop Virtualization

Q8. How would you secure the data while it is being transported in the cloud?
Ans. We can secure the data while it is being transported in the cloud by encrypting it (and using
an encryption key that stays under our control). This not only helps in ensuring data security but
also prevents data leakage.
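
For illustration only (not from the source), here is a minimal sketch of encrypting data on the client side before it travels to cloud storage. It assumes the third-party Python "cryptography" package; key handling is simplified and would normally sit in a key-management service.

```python
# Client-side encryption before upload: only ciphertext ever leaves our control.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, store/manage this key in a key-management service
cipher = Fernet(key)

plaintext = b"customer record: account=1234, balance=500"
ciphertext = cipher.encrypt(plaintext)   # this is what actually travels to / rests in the cloud

# ... later, after downloading the ciphertext back ...
assert cipher.decrypt(ciphertext) == plaintext
```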

Q9. What is the most crucial concern people often have in mind regarding the use of
Cloud Computing?

Ans. The prevailing concern is about the security of data in Cloud Computing. Most people
remain worried about the misuse of data.

Q10. What is utility computing?

Ans. Utility computing is a service-provisioning model that enables users to pay only for the
services they actually use, with no upfront costs. It provides computing resources and
infrastructure management on demand, and the organization decides which services it consumes.
Most organizations these days use a hybrid strategy.
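
As a hedged illustration of the pay-only-for-what-you-use idea, the short calculation below uses made-up rates and usage figures; real providers publish their own rate cards.

```python
# Illustrative only: metered billing in a utility computing model.
hourly_rate = 0.05          # $ per VM-hour (hypothetical)
vm_hours_used = 3 * 720     # 3 VMs running for a 720-hour month
storage_gb_months = 200     # GB-months of block storage used
storage_rate = 0.10         # $ per GB-month (hypothetical)

bill = vm_hours_used * hourly_rate + storage_gb_months * storage_rate
print(f"Monthly bill: ${bill:.2f}")   # Monthly bill: $128.00
```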

Long Answers:

Q1: What is the role of virtualization in cloud-based services?


Ans: The basic concept of virtualization is that a piece of software will function as a physical
object, that is, it will “look” and “behave” like hardware. Thus, it will perform all of the functions
that a piece of hardware performs without the hardware in place. As such, the software emulates
a desktop PC on a server.
And this, in fact, is what cloud-based IT service provides – a place where business functions can
occur and be stored without the need for in-house hardware.
Virtualization is Different from Cloud Computing
Virtualization software allows multiple operating systems and applications to run on the same
server at the same time, and, as a result, lowers costs and increases efficiency of a company’s
existing hardware. It’s a fundamental technology that powers cloud computing.
Virtualization thus emulates hardware. Cloud computing is a service that results from that
manipulation and is an external service. Cloud computing almost always assumes virtualization
of certain resources (storage or data) that will be then delivered to the customer on-demand.
The Main Types of Virtualization
There are several types of virtualization, categorized according to the elements they are used on.
1. Server Virtualization
Server space is conserved by consolidating multiple machines onto a single server that
then runs multiple virtual environments. It’s a method by which businesses can run the same
applications on multiple servers, so that there is a “failsafe” position. Because each server is
independent, running software on one will not affect the other. Another emerging trend in server
virtualization is migration. A server environment can be moved from one place to another, even if
the machines have different operating systems. The obvious benefit is the savings on hardware.
2. Storage Virtualization
Disk storage used to be a simple matter. If a business needed more, it simply purchased a larger
disk drive. But storage needs continue to grow, and managing them becomes much harder.
Virtualization is a great answer. It adds an additional layer of software between systems and
servers, and applications no longer need to know where specific data resides. It is managed as if
it is a single resource. Servers will see the virtualization layer as one single storage device, and
each individual storage device sees the layer as its only server.
3. Network Virtualization
This type of virtualization allows management and monitoring of an entire network as a single
entity. Primarily, it is designed to automate administrative tasks, disguising the complexity of the
network. Each server (and service) is considered a part of one pool of resources to be used
without worry about its physical components.
Understanding the Advantages of Virtualization
The best way to think about the role of virtualization is to understand the difference between
private and public clouds. Basically, in a private cloud environment, a business owns/leases both
the hardware and the software that provides the service consumption. This is in-house
virtualization, and the business maintains full management and control.
The public cloud environment is one in which all of the virtualization is housed somewhere else,
and a vendor provides the service to clients on a fee basis. In the public cloud, there are “co-
tenants” in the same cloud, and clients pay for the specific services they use, as they use them.

Q2: Difference between Private and Public Cloud. List advantages and disadvantages of
both.

Ans: The Private Cloud

A private cloud is thus its own virtualized world. It gives users more control, along with flexibility
to manage their own systems, while still having all of the benefits of the cloud. Plus, the owner
does not have to worry about coexisting “bad neighbors” or possible slowdowns in performance.
Virtualization results in the following benefits:

 Maximizing existing resources: Virtualization will allow a user to keep physical systems to
a minimum, getting greater value out of existing servers.

 Running multiple applications and their operating systems on the same hardware.

 Costs are direct but are fixed. All costs for management, administration, and other
requirements are within the in-house IT budget.

For a business to consider whether to use virtualization (a private cloud), it must consider who
will be providing the support and how it will be integrated with other in-house systems. Cost
(operational expenditures), of course, is a consideration. How much management is a business
willing to do? What about scalability and security needs?
In general, businesses that need greater control and security and that have large IT staffs for
these purposes will probably find virtualization preferable.

The Public Cloud

Virtualized services through a public cloud environment are usually preferable for businesses that
have smaller IT staffs and that tend to have fewer security concerns. A cloud-based solution will
provide the following:

 IT is basically outsourced. Because there is a service provider, administration and


supportive services are taken care of elsewhere. In-house IT staff remain available for
other business purposes.

 Setup is easy and fast. And servers, hardware, and software licenses are eliminated.

 Pay-as-you-use. Cloud-based services are charged based upon scope of use, and, while
they can sometimes seem pricey, businesses do not have to put dollars into supportive
products (spam/anti-virus resources, data archiving, encryption, off-site storage, etc.)

 Scalability. Cloud services allow both permanent and temporary scaling. Thus, a


business can off-load high-demand requirements at any time, even on a temporary basis,
and pay only for the time of that off-load.

It’s important to note that virtualization via private cloud or the move to cloud computing services
are not mutually exclusive nor are they competitive.

Many businesses have in-house virtualization for some functions and move to the cloud for
others. Still others who begin with virtualization of their own servers may ultimately end up in the
cloud, as an evolutionary matter. They simply want more service delivery, scale, and agility.

Private Cloud Virtualization: Advantages and Disadvantages

As discussed above, there are several advantages of private cloud virtualization – in-house
control and the flexibility to manage one’s own systems, being the most important. And the cost
benefits are obvious as well – minimizing the need for physical systems.

Taking a more specific look at the advantages and disadvantages will provide CIO’s with the
information they need as they make decisions about virtualization.

Advantages

 Businesses that “live” in a regulatory environment (e.g. financial services, health) have
critical data and protection responsibilities. Building virtualization infrastructure
themselves, rather than sharing it with others in a public cloud, avoids the compliance
issues that sharing can raise.
 Likewise, companies that have data which they wish to remain confidential (e.g.,
research), can feel a bit better about in-house virtualization, in which they can protect that
data. No other company has access to that infrastructure.

 Private Cloud Virtualization has greater reliability. When public clouds are considered,
potential users must conduct solid research to determine if the server they select can
provide premiere performance for the types of applications and services they need. In
building a private cloud, predictable and reliable service for users is generally most
assured.

 Cost and Flexibility. There are always trade-offs when implementing new hardware and
software. In the case of a private cloud, the initial expense of installing servers and storage
can be high. On the other hand, great flexibility can be built in so that workloads can easily
be shifted during peak usage spikes and when new applications are deployed. There is no
need to make a request of a cloud service provider, before changes can be accomplished.

Disadvantages

No software or hardware solution is perfect, and that is certainly the case with private cloud
virtualization. Before building and deploying, there are disadvantages to be considered:

 Integration with other in-house systems can be an issue.

 Managing and supporting virtualization will often require dedicated IT staff, and that may
bring costs up, if there is already not a good-sized department. This is the primary reason
why smaller businesses opt for external cloud services.

 Scaling and security will require specific expertise.

Q3: What is server virtualization with example? List various types of Server Virtualization.
Mention Advantages and Disadvantages.
Ans: Server Virtualization

Server Virtualization is the process of dividing a physical server into several virtual servers,
called virtual private servers. Each virtual private server can run independently.

The concept of Server Virtualization is widely used in IT infrastructure to minimize costs by
increasing the utilization of existing resources.

Types of Server Virtualization

1. Hypervisor

In the Server Virtualization, Hypervisor plays an important role. It is a layer between the  operating
system (OS) and hardware. There are two types of hypervisors.

o Type 1 hypervisor ( also known as bare metal or native hypervisors)


o Type 2 hypervisor ( also known as hosted or Embedded hypervisors)
The hypervisor is mainly used to perform various tasks such as allocate physical hardware
resources (CPU, RAM, etc.) to several smaller independent virtual machines, called "guest" on
the host machine.

2. Full Virtualization

Full Virtualization uses a hypervisor to directly communicate with the CPU and physical server.
It provides the best isolation and security mechanism to the virtual machines.

The biggest disadvantage of using hypervisor in full virtualization is that a hypervisor has its own
processing needs, so it can slow down the application and server performance.

VMWare ESX server is the best example of full virtualization.

3. Para Virtualization

Para Virtualization is quite similar to Full Virtualization. The advantages of using this
virtualization are that it is easier to use, offers enhanced performance, and does not require
emulation overhead. Xen and UML primarily use Para Virtualization.

The difference between full and para virtualization is that, in para virtualization, the hypervisor
does not need as much processing power to manage the guest OS.

4. Operating System Virtualization

Operating system virtualization is also called system-level virtualization. It is a server
virtualization technology that divides one operating system into multiple isolated user spaces
called virtual environments. The biggest advantage of using this form of server virtualization is
that it reduces the use of physical space, so it saves money.

Linux OS Virtualization and Windows OS Virtualization are the types of Operating System
virtualization.

FreeVPS, OpenVZ, and Linux Vserver are some examples of System-Level Virtualization.

5. Hardware Assisted Virtualization

Hardware Assisted Virtualization was presented by AMD and Intel. It is also known as Hardware
virtualization, AMD virtualization, and Intel virtualization. It is designed to increase the
performance of the processor. The advantage of using Hardware Assisted Virtualization is that it
requires less hypervisor overhead.

6. Kernel-Level Virtualization

Kernel-level virtualization is one of the most important types of server virtualization. It is an  open-
source virtualization which uses the Linux kernel as a hypervisor. The advantage of using
kernel virtualization is that it does not require any special administrative software and has very
little overhead.
User Mode Linux (UML) and Kernel-based virtual machine are some examples of kernel
virtualization.
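
As an illustrative sketch only (not from the source), the snippet below lists the virtual servers defined on a kernel-level virtualization (KVM) host. It assumes a Linux machine with the hypervisor configured and the third-party libvirt-python bindings installed.

```python
# Inspect the virtual private servers running on the local KVM hypervisor.
import libvirt

conn = libvirt.open("qemu:///system")   # connect to the local hypervisor
try:
    for dom in conn.listAllDomains():   # every defined virtual machine ("domain")
        state = "running" if dom.isActive() else "shut off"
        print(f"{dom.name():20s} {state}")
finally:
    conn.close()
```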

Advantages of Server Virtualization

There are the following advantages of Server Virtualization -

1. Independent Restart

In Server Virtualization, each virtual server can be restarted independently and this does not
affect the working of other virtual servers.

2. Low Cost

Server Virtualization can divide a single server into multiple virtual private servers, so it reduces
the cost of hardware components.

3. Disaster Recovery

Disaster Recovery is one of the best advantages of Server Virtualization. In Server Virtualization,
data can easily and quickly move from one server to another and these data can be stored and
retrieved from anywhere.

4. Faster deployment of resources

Server virtualization allows us to deploy our resources in a simpler and faster way.

5. Security

It allows users to store their sensitive data inside the data centers.

Disadvantages of Server Virtualization

There are the following disadvantages of Server Virtualization -

1. The biggest disadvantage of server virtualization is that when the server goes offline, all
the websites that are hosted by the server will also go down.
2. There is no way to measure the performance of virtualized environments.
3. It consumes a large amount of RAM.
4. It is difficult to set up and maintain.
5. Some core applications and databases do not support virtualization.
6. It requires extra hardware resources.

Uses of Server Virtualization

A list of uses of server virtualization is given below -


o Server Virtualization is used in the testing and development environment.
o It improves the availability of servers.
o It allows organizations to make efficient use of resources.
o It reduces redundancy without purchasing additional hardware components.

Q4: What is Storage Virtualization in Cloud Computing? List Types & Benefits.
Ans: Storage virtualization in Cloud Computing is the pooling of physical storage from multiple
storage devices into what appears to be a single storage device. It can also be described as a
group of available storage devices that is managed from a central console. This virtualization
provides numerous benefits such as easy backup, archiving, and recovery of the data.

This whole process requires very little time and works in an efficient manner. Storage
virtualization in Cloud Computing hides the actual complexity of the Storage Area
Network (SAN). This virtualization is applicable at all levels of a SAN.

Types of Storage Virtualization

Here, we are going to list all the types of storage virtualization in Cloud Computing:

 Hardware Assisted Virtualization


 Kernel Level Virtualization
 Hypervisor Virtualization
 Para-Virtualization
 Full Virtualization
i. Hardware Assisted Virtualization

This type of virtualization requires hardware support, and is similar to full virtualization and
para-virtualization. Here, an unmodified OS can run because hardware support for virtualization
is used to handle hardware access requests and to protect privileged operations.
ii. Kernel Level Virtualization

It runs a separate version of the Linux kernel. Kernel-level virtualization allows running multiple
virtual servers on a single host. It uses a device driver to communicate between the main Linux
kernel and the virtual machines. This virtualization is a special form of Server Virtualization.
iii. Hypervisor Virtualization

A hypervisor is a layer between the Operating system and hardware. With the help of hypervisor
multiple operating systems can work. Moreover, it provides features and necessary services
which help OS to work properly.
iv. Para-Virtualization

It is based on hypervisor which handles emulation and trapping of software. Here, the guest
operating system is modified before installing it to any further machine. The modified system
communicates directly with the hypervisor and improves the performance.
v. Full Virtualization

This virtualization is similar to Para-Virtualization. In this, the hypervisor traps the machine
operations that the operating system uses to perform operations or read and modify system
status. After trapping these operations, it emulates them in software and returns the status codes.
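
The following is a purely conceptual sketch (not a real storage API) of the idea common to all of these types: several physical devices are presented as one logical pool managed from a single point.

```python
# Conceptual illustration of storage virtualization: many devices, one logical pool.
class StoragePool:
    def __init__(self, devices):
        # devices: mapping of physical device name -> capacity in GB
        self.devices = dict(devices)

    @property
    def total_capacity_gb(self):
        # consumers see the pool as a single storage device
        return sum(self.devices.values())

    def add_device(self, name, capacity_gb):
        # adding physical capacity is invisible to the servers using the pool
        self.devices[name] = capacity_gb


pool = StoragePool({"disk-array-1": 2000, "disk-array-2": 4000})
pool.add_device("ssd-shelf-1", 1000)
print(pool.total_capacity_gb)   # 7000 -> presented as one 7 TB volume
```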

Q6: List various advantages of Storage Virtualization. How should we implement Storage
Virtualization?

Ans: Let’s discuss some benefits of Storage Virtualization in Cloud Computing:

i. Easy Retrieval and Upload of Data


In storage virtualization, the data is quickly retrieved from virtual storage; it is as easy as
accessing a file on the local computer. The data is also stored very easily with the help of an
application and an internet connection.

ii. Better Management


The data can be migrated based upon utilization: data that is frequently used can be stored on a
high-performance storage system, while data that is rarely used can be placed on a somewhat
slower system.

This is an example of better storage management, and the customer won't face any issues
regarding storage.

iii. Security
In storage virtualization, the data is stored in different places and secured with maximum
security. If any disaster takes place, the data can be retrieved from another location and the
customer is not affected.

The system also has the ability to meet real utilization requirements rather than provisioning
additional, unused storage.

How Is Storage Virtualization Applied?

Following are the different ways in which storage virtualization can be applied:

 Host-Based
 Network-Based
 Array-Based
i. Host-Based Storage Virtualization

Here, all the virtualization and management is done at the host level with the help of software;
the physical storage can be any device or array.

A host, or a set of multiple hosts, presents virtual drives to the guest machines, regardless of
whether they are VMs in an enterprise or PCs.
ii.  Network-Based Storage Virtualization

Network-based storage virtualization is the most common form in use nowadays. Devices such
as a smart switch or a purpose-built server connect to all the storage devices in a Fibre Channel
storage network and present the storage as a virtual pool.

iii. Array-Based Storage Virtualization

Here, the storage array provides different types of physical storage that are used as storage
tiers. Software within the array handles tiering, for example across tiers made up of solid-state
drives and hard drives.

Q7: What are the benefits of virtualized security?


Ans: Virtualized security, or security virtualization, refers to security solutions that are
software-based and designed to work within a virtualized IT environment. This differs from
traditional, hardware-based network security, which is static and runs on devices such as
traditional firewalls, routers, and switches. 

In contrast to hardware-based security, virtualized security is flexible and dynamic. Instead of


being tied to a device, it can be deployed anywhere in the network and is often cloud-based. This
is key for virtualized networks, in which operators spin up workloads and applications
dynamically; virtualized security allows security services and functions to move around with those
dynamically created workloads. 

Cloud security considerations (such as isolating multitenant environments in public cloud


environments) are also important to virtualized security. The flexibility of virtualized security is
helpful for securing hybrid and multi-cloud environments, where data and workloads migrate
around a complicated ecosystem involving multiple vendors.

Benefits of virtualized security?


Virtualized security is now effectively necessary to keep up with the complex security demands of
a virtualized network, plus it’s more flexible and efficient than traditional physical security. Here
are some of its specific benefits:
 Cost-effectiveness: Virtualized security allows an enterprise to maintain a secure
network without a large increase in spending on expensive proprietary hardware.
Pricing for cloud-based virtualized security services is often determined by usage,
which can mean additional savings for organizations that use resources efficiently.
 Flexibility: Virtualized security functions can follow workloads anywhere, which is
crucial in a virtualized environment. It provides protection across multiple data centers
and in multi-cloud and hybrid cloud environments, allowing an organization to take
advantage of the full benefits of virtualization while also keeping data secure.
 Operational efficiency: Quicker and easier to deploy than hardware-based security,
virtualized security doesn’t require IT teams to set up and configure multiple hardware
appliances. Instead, they can set up security systems through centralized software,
enabling rapid scaling. Using software to run security technology also allows security
tasks to be automated, freeing up additional time for IT teams.
 Regulatory compliance: Traditional hardware-based security is static and unable to
keep up with the demands of a virtualized network, making virtualized security a
necessity for organizations that need to maintain regulatory compliance.
How does virtualized security work?
Virtualized security can take the functions of traditional security hardware appliances (such as
firewalls and antivirus protection) and deploy them via software. In addition, virtualized security
can also perform additional security functions. These functions are only possible due to
the advantages of virtualization, and are designed to address the specific security needs of a
virtualized environment. 

For example, an enterprise can insert security controls (such as encryption) between the
application layer and the underlying infrastructure, or use strategies such as micro-segmentation
to reduce the potential attack surface. 

Virtualized security can be implemented as an application directly on a bare metal hypervisor (a


position it can leverage to provide effective application monitoring) or as a hosted service on a
virtual machine. In either case, it can be quickly deployed where it is most effective, unlike
physical security, which is tied to a specific device. 
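
As one hedged example of such a software-defined control, the sketch below creates a firewall rule programmatically rather than on a hardware appliance. It assumes an AWS account, the boto3 SDK, and a hypothetical VPC id; it is an illustration, not a statement about any particular product.

```python
# Provision a "virtualized" security control (a security group) in software.
import boto3

ec2 = boto3.client("ec2")

sg = ec2.create_security_group(
    GroupName="web-tier-sg",
    Description="Allow HTTPS to the web tier only",
    VpcId="vpc-0123456789abcdef0",            # hypothetical VPC id
)

# Allow inbound HTTPS (TCP 443) from anywhere to the group just created.
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)
```

Because the control is defined in software, it can be attached to workloads wherever they are spun up, which is the flexibility described above.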

Q8: WHAT IS IDENTITY AND ACCESS MANAGEMENT?

Ans: Identity and access management (IAM) is a framework of business processes, policies and
technologies that facilitates the management of electronic or digital identities. With an IAM
framework in place, information technology (IT) managers can control user access to critical
information within their organizations. Systems used for IAM include single sign-on systems, two-
factor authentication, multifactor authentication and privileged access management. These
technologies also provide the ability to securely store identity and profile data as well as data
governance functions to ensure that only data that is necessary and relevant is shared.

IAM systems can be deployed on premises, provided by a third-party vendor through a cloud-
based subscription model or deployed in a hybrid model. 

Features

Cloud IAM typically includes the following features:

 Single Access Control Interface. Cloud IAM solutions provide a clean and consistent
access control interface for all cloud platform services. The same interface can be used for
all cloud services.
 Enhanced Security. You can define increased security for critical applications.
 Resource-level Access Control. You can define roles and grant permissions to users to
access resources at different granularity levels.
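
To make resource-level access control concrete, here is a minimal sketch (assuming AWS and the boto3 SDK; the bucket and policy names are hypothetical) of a policy that grants read-only access to a single storage bucket through a cloud IAM API.

```python
# Create an IAM policy granting read-only access to one bucket.
import json
import boto3

policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::example-reports-bucket",      # hypothetical bucket
            "arn:aws:s3:::example-reports-bucket/*",
        ],
    }],
}

iam = boto3.client("iam")
iam.create_policy(
    PolicyName="ReportsReadOnly",
    PolicyDocument=json.dumps(policy_document),
)
```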
Importance of CLOUD IAM
It can be difficult for a company to start using cloud Identity and Access Management solutions
because they don’t directly increase profitability, and it is hard for a company to cede control over
infrastructure. However, there are several perks that make using an IAM solution very valuable,
such as the following:

 The ability to spend less on enterprise security by relying on the centralized trust model to
deal with Identity Management across third-party and own applications.
 It enables your users to work from any location and any device.
 You can give them access to all your applications using just one set of credentials
through Single Sign-On.
 You can protect your sensitive data and apps: Add extra layers of security to your mission-
critical apps using Multifactor Authentication.
 It helps maintain compliance of processes and procedures. A typical problem is that
permissions are granted based on employees’ needs and tasks, and not revoked when
they are no longer necessary, thus creating users with lots of unnecessary privileges.

MCQ’s

Question-1 : It helps a user to have remote access to an application from a server.

(A) : Application virtualization

(B) : Network virtualization

(C) : Desktop virtualization

(D) : Storage virtualization

Answer : (a)

Question-2 : It is an array of servers that are managed by a virtual storage system.

(A) : Application virtualization

(B) : Network virtualization

(C) : Desktop virtualization

(D) : Storage virtualization


Answer : (d)

Question-3 : It has the ability to run multiple virtual networks, with each having a separate
control and data plane.

(A) : Application virtualization

(B) : Network virtualization

(C) : Desktop virtualization

(D) : Storage virtualization

Answer : (b)

Question-4 : Identify the correct option: The main benefits of it are user mobility,
portability, and easy management of software installation, updates, and patches.

(A) : Application virtualization

(B) : Network virtualization

(C) : Desktop virtualization

(D) : Storage virtualization

Answer : (c)

Question-5 : Which is the correct type of full virtualization?

(A) : Software assisted

(B) : IoT assisted

(C) : Hardware-assisted

(D) : Software assisted and Hardware-assisted both

Answer : (d)

Question-6 : Identify virtualizations: It emulates the hardware using the software instruction sets.

(A) : Software assisted full virtualizations

(B) : IoT assisted full virtualizations

(C) : Hardware-assisted full virtualizations

(D) : None of these


Answer : (a)

Question-7 : Identify virtualizations: Due to binary translation, it is often criticized for
performance issues.

(A) : Software assisted full virtualizations

(B) : IoT assisted full virtualizations

(C) : Hardware-assisted full virtualizations

(D) : None of these

Answer : (a)

Question-8 : Identify virtualizations: Guest OS’s instructions might allow a virtual context
to execute privileged instructions directly on the processor, even though it is virtualized.

(A) : Software assisted full virtualizations

(B) : IoT assisted full virtualizations

(C) : Hardware-assisted full virtualizations

(D) : None of these

Answer : (c)

Question-9 : ________ refers to the use of a remote computer from a local computer where
the actual computer user is located.

(A) : Virtual machine

(B) : Virtual computing

(C) : Virtual cloud

(D) : Virtual OS

Answer : (b)

Question-10 : ________ is an operating system or application environment that is installed on
software, which reproduces dedicated hardware virtually.

(A) : Virtual machine

(B) : Virtual computing

(C) : Virtual cloud


(D) : None of these

Answer : (a)

Question-11 : In computing, ________ improves the distribution of workloads across multiple
computing resources, such as computers, a computer cluster, network links, central processing
units, or disk drives.

(A) : Virtual machine

(B) : Virtual computing

(C) : Virtual cloud

(D) : Load balancer

Answer : (d)

Question-12 : What is the need for load balancing in cloud computing?

(A) : Increased scalability

(B) : Ability to handle sudden traffic spikes

(C) : A and B both

(D) : None of these

Answer : (c)

Question-13 : Which of these network resources can be load balanced?

(A) : Servers

(B) : Routing mechanism

(C) : A and B both

(D) : None of these

Answer : (c)

Question-14 : The _________ is what controls and allocates the portion of hardware
resources each operating system should get, so that every one of them gets what it
needs and they do not disrupt each other.

(A) : hypervisor

(B) : virtual machine


(C) : virtual cloud

(D) : None of these

Answer : (a)

Question-15 : A ___________ makes a copy or a clone of the entire computer system inside a
single file.

(A) : virtualization

(B) : machine imaging

(C) : parallel processing

(D) : None of these

Answer : (b)

Question-16 : Full form of AMI

(A) : Amazon Memory Image

(B) : Amazon Memory Instance

(C) : Amazon Machine Instance

(D) : Amazon Machine Image

Answer : (d)

Question-17 : Which of these is an example of a cloud marketplace?

(A) : AWS marketplace

(B) : Oracle marketplace

(C) : Microsoft Windows Azure marketplace

(D) : All of these

Answer : (d)

Question-18 : Point out the wrong statement.

(A) : Abstraction enables the key benefit of cloud computing: shared, ubiquitous access

(B) : Virtualization assigns a logical name for a physical resource and then provides a pointer to
that physical resource when a request is made
(C) : All cloud computing applications combine their resources into pools that can be assigned on
demand to users

(D) : All of the mentioned

Answer : (c)

Question-19 : Point out the correct statement.

(A) : A client can request access to a cloud service from any location

(B) : A cloud has multiple application instances and directs requests to an instance based on
conditions

(C) : Computers can be partitioned into a set of virtual machines with each machine being
assigned a workload

(D) : All of the mentioned

Answer : (d)

Question-20 : Point out the wrong statement.

(A) : Load balancing virtualizes systems and resources by mapping a logical address to a
physical address

(B) : Multiple instances of various Google applications are running on different hosts

(C) : Google uses hardware virtualization

(D) : All of these

Answer : (c)

Question-21 : Which of the following provide system resource access to virtual machines?

(A) : VMM

(B) : VMC

(C) : VNM

(D) : All of these

Answer : (a)

Question-22 : An operating system running on a Type ______ VM is full virtualization.

(A) : 1
(B) : 2

(C) : 3

(D) : All of these

Answer : (a)

Question-23 : In _______ the virtual machine simulates hardware, so it can be independent of
the underlying system hardware.

(A) : paravirtualization

(B) : full virtualization

(C) : emulation

(D) : None of these

Answer : (c)

Question-24 : Which of the following operating systems supports operating system
virtualization?

(A) : Windows NT

(B) : Sun Solaris

(C) : Windows XP

(D) : Compliance

Answer : (b)

Question-25 : AWS platform was launched in:

(A) : Jul-01

(B) : Jul-02

(C) : Jul-03

(D) : Jul-04

Answer : (b)

Question-26 : In ________, Amazon launched an auto-scaling service on AWS.

(A) : Jan-18
(B) : Feb-18

(C) : Mar-18

(D) : Apr-18

Answer : (a)

Question-27 : In January 2015, Amazon Web Services acquired ___________, an Israel-based
microelectronics company reputedly for US$350–370M.

(A) : Sun Micro Systems

(B) : Max Computing Services

(C) : Annapurna Labs

(D) : Sun Cloud Infra

Answer : (c)

Question-28 : Which of these is not a component that makes up the AWS Global
Infrastructure?

(A) : Availability Zones

(B) : Regions

(C) : Regional Edge Caches

(D) : Rural Edge Caches

Answer : (d)

Question-29 : This is where the actual compute, storage, network, and database resources
are hosted that we as consumers provision within our Virtual Private Clouds (VPCs).

(A) : Availability Zones

(B) : Regions

(C) : Regional Edge Caches

(D) : Rural Edge Caches

Answer : (a)

Question-30 : State whether the statements below are true or false for AWS: 1. A single
availability zone (AZ) is equal to a single data center. 2. Each AZ will always have at least
one other AZ that is geographically located within the same area.
(A) : 1. True, 2. True

(B) : 1. True, 2. False

(C) : 1. False, 2. True

(D) : 1. False, 2. False

Answer : (c)

Question-31 : ______ is a collection of availability zones that are geographically located close
to one another.

(A) : Edge Locations

(B) : Regions

(C) : Regional Edge Caches

(D) : Rural Edge Caches

Answer : (b)

Question-32 : Edge Locations are used for:

(A) : Physical Infrastructure deployment

(B) : Reduce latency

(C) : A and B both

(D) : None of these
