
CLOUD COMPUTING FUNDAMENTALS

MODULE 1: How do computers communicate?


Analogy between computer networking and a road system (computer network
components):
Data: the vehicles that travel on the roadway (the computer network) in an
efficient form.
Cables, fiber optics, and wireless signals: the roads that connect locations
(they connect devices to the computer network).
Network adapters: the interface between the computer and the
external network.
Routers, switches, and hubs: direct the flow of data and
ensure it arrives at its destination.
Network protocols (such as TCP/IP): ensure that devices
transmit data correctly and efficiently between themselves on
the network.
Moving data: data is divided at the source into packets and then
reassembled at the destination in the correct order.
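The last point — splitting data into packets and reassembling them in order — can be sketched in a few lines of Python. This is a simplified illustration, not a real protocol implementation: each packet carries a sequence number, packets may arrive out of order, and the receiver uses the numbers to restore the original data.

```python
import random

def to_packets(data: bytes, size: int = 4):
    """Split data into (sequence_number, chunk) packets."""
    return [(i, data[i:i + size]) for i in range(0, len(data), size)]

def reassemble(packets):
    """Sort packets by sequence number and rejoin the chunks."""
    return b"".join(chunk for _, chunk in sorted(packets))

packets = to_packets(b"hello, cloud!")
random.shuffle(packets)          # packets may arrive out of order
message = reassemble(packets)    # the receiver restores the original order
print(message)                   # b'hello, cloud!'
```

Real protocols such as TCP do the same bookkeeping with sequence numbers, plus acknowledgements and retransmission, which this sketch omits.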
What is the cloud?
In the beginning we would talk about internet servers: engineers
designed servers to deliver websites along with all the services a website
needs, such as storage, networking, and security. Cloud
computing improved this model by separating the features of one
server into distinct functions on specially designed and dedicated
servers. This makes it easier for engineers to use only the resources
they need, when they need them (pay for what you use).

MODULE 2: How does cloud computing work?
Cloud computing makes many powerful technical tools and
innovations easy to access in everyday life, such as: dynamic
mapping (Google Maps) / global identity and access management /
personal digital assistants like Siri or Alexa / files and photos
everywhere and anywhere / video and music streaming / natural
language processing and AI.
The components of a cloud service:
Physical computers (process and store data) / virtual machines
(run on physical computers) / networking devices (ensure data
moves quickly and accurately) / management
software (manages and monitors the computers and
enforces security policies) / service software (the graphical user
interface).
These five components work together; note that administrators use the
management software to monitor both the physical and the virtual
computers.
The role of data centers in cloud computing:
How a data center works
Data centers are essential for the cloud; they function as the hub
through which the cloud does its work.
Internet data: data is transferred from computers and systems to data
centers.
Management software: manages how the data is processed.
Security: data centers have many security measures, such as firewalls, to
ensure that malicious data doesn't infect the data center.
Data storage: some data is stored for later use; some is processed by
database management and analytics software and then sent back to users
as services.
How cloud computing provides its benefits to users and
businesses:
1) Single sign-on: users have a single account with a cloud
provider that works on all their devices; they can even log in to
their devices using their cloud account.
2) Cloud storage: once logged in, users can store files, pictures, and all
sorts of data.
3) Apps: companies publish their internal corporate apps to the cloud
and deliver them to employees all over the world.
4) Workloads: a workload is a specific job that a computer performs,
such as automatic backups. Businesses launch microservices on the
cloud that they can scale up and down based on their needs.
5) Connected devices: the cloud enables connectivity between devices
even miles apart.
6) Performance: since the cloud uses thousands of computers, users
and businesses get highly performant apps and services with reduced
network latency and delays.

Advances in cloud technology:

Engineers keep improving the cloud through several advances:
Computer technology: this technology is the backbone of cloud
computing. Improving computer components such as the CPU, RAM,
network interface, long-term storage (hard disk drives (HDD)
and solid-state drives (SSD)), ports, and cooling systems will help make
the cloud more efficient. The advances being made:
- Improving efficiency: more processing power while consuming less
energy and generating less heat.
- Reducing component size.
- Using more solid-state components, as they consume less power.
Virtualization technology: by using VMs, the cloud reduces
wasted resources.
The advances that are made:
- Containerization: a lightweight form of virtualization that enables
applications to run in isolated environments on a single host
operating system. Containers are portable and easier to
manage than VMs and physical computers.
- Serverless computing: a type of cloud computing where the cloud
provider manages the infrastructure and automatically allocates the
resources as needed.
- Hardware virtualization: the creation of a virtual machine inside a
physical server.
- Network virtualization: creating a virtual network independent from
the physical one.
Networking technology: enables connectivity between computers
in a data center and between data centers.
The advances that are made:
- Improving network protocols
- Edge computing: data is processed closer to where it was
generated rather than being sent to a data center. This approach
reduces latency and improves performance.
- IBN (intent-based networking): an approach that combines AI and
ML to automate network tasks.

Key points to remember:

1) Computer networking simplifies transferring data from one
computer to another.
2) Cloud computing evolved from computer networking and
server farms to provide benefits to developers, users, and
businesses.
3) Cloud computing separates the features of one server into
distinct functions on specially designed and dedicated servers
to make it easier to use only the resources you need when you
need them.
4) Cloud computing enables many of the modern technologies
you use every day, including mapping, digital assistants,
video and music streaming, and natural language processing
and AI.
5) There are five main components in cloud services: physical
computers, virtual computers, networking devices,
management software, and service software.
6) Data centers enable cloud providers to store and process vast
amounts of data, to run complex applications, and to provide
reliable connectivity to users.
7) Cloud computing provides distributed computing, scalability,
availability, and resilience to users and businesses.
8) Cloud computing enables all users to access their data and
applications reliably and securely from anywhere in the world
on any device, paying only for the resources they need at that
time.
9) New developments in computer technology, virtualization,
and networking will enable better, faster, cheaper, and more
reliable systems in the future.
MODULE 2: Understanding cloud computing services
Building a business that scales with IaaS:
• Scaling up:
Scaling up addresses a problem by adding new resources to an
existing server configuration, such as memory, CPUs, and storage.
• Scaling out:
Creating copies of a specific resource so that they can share
the workload (replicating the same configuration of a virtual machine).
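The scaling-out idea — identical replicas sharing a workload — can be sketched with a round-robin dispatcher. The replica names and request strings below are invented for illustration; real load balancers add health checks and smarter routing policies.

```python
from itertools import cycle

# Hypothetical replicas produced by "scaling out" one VM configuration.
replicas = ["vm-replica-1", "vm-replica-2", "vm-replica-3"]

def make_dispatcher(replicas):
    """Round-robin: each incoming request goes to the next replica in turn."""
    ring = cycle(replicas)
    return lambda request: (next(ring), request)

dispatch = make_dispatcher(replicas)
assignments = [dispatch(f"request-{n}") for n in range(6)]
for replica, request in assignments:
    print(replica, "handles", request)
```

With three replicas, six requests land two-per-replica, which is exactly how scaling out spreads load without making any single server bigger.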
Deploying solutions with PaaS:
• Deploying a website:
A CMS (content management system) enables users to create
and publish websites; a cloud service provider then offers all
the tools to deploy the website to the cloud.
• Developing and deploying a software solution:
1. Application dev:
Developing the application without worrying about the tools needed to
build the software.
2. API dev:
An API enables developers to get data from, and send data to, the
company's systems. The company uses PaaS to develop its APIs,
which are an essential component of any modern distributed
application. The built-in frameworks that PaaS provides greatly
simplify API development and management. The company also uses
PaaS for microservices, which complement APIs and are a great aid in
deployment.
3. Testing:
PaaS provides environments with different operating
systems for testing applications before deploying them to users.
4. Securing data:
PaaS is used for database hosting which enables the company to store
data securely.
5. Analytics and communication:
Analyzing data from the application to make informed choices about
how to improve the business.
6. Internal tools:
The company uses PaaS services for internal tools and private
dashboards which allow them to monitor the performance of their
application and make sure it is running smoothly.
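The kind of API a team would build and host on PaaS can be sketched with Python's standard library alone. The endpoint path and JSON payload here are invented for illustration; a real PaaS deployment would swap this for a managed runtime and framework.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Hypothetical endpoint that returns application data as JSON.
        if self.path == "/api/status":
            body = json.dumps({"service": "demo", "status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Port 0 asks the OS for any free port; the server thread runs in the background.
server = HTTPServer(("127.0.0.1", 0), ApiHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A client (another service, or a test) calls the API over HTTP.
url = f"http://127.0.0.1:{server.server_port}/api/status"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
server.shutdown()
print(data)   # {'service': 'demo', 'status': 'ok'}
```

The point of PaaS is that the provider runs everything below `ApiHandler` — the server process, scaling, and patching — so the team only writes the handler logic.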

Key points to remember:


1) AI technologies provide many useful features of modern life,
including medical diagnoses, virtual assistants, chatbots, and
fraud detection.
2) The cloud enables AI technologies by providing data storage,
processing power, and connections.
3) The IoT is a networking model that describes the interconnection
of physical devices that owners can access and control through the
internet, and the cloud is crucial for storing, processing, and
analyzing the vast amounts of data generated by IoT devices.
4) There are four types of cloud service models:
- SaaS delivers software applications to users over the internet.
- IaaS provides access to computing resources like servers,
storage, and networking over the internet.
- PaaS enables users to develop, run, and manage applications over
the internet.
- DBaaS provides users with access to fully managed databases.

5) Proprietary software is software that a company or individual
owns and controls.
6) Open source software is software that developers, often
collaboratively as a community of programmers, create and
distribute under a special type of license.
7) The source code for open source software is available for users to
access, modify, and distribute alongside their own code.
8) Most cloud providers have a global presence with strong security,
reliability, and scalability. They offer standard services such as
IaaS, SaaS, PaaS, and DBaaS.
9) The major cloud providers—IBM Cloud, Microsoft Azure, and
Amazon Web Services (AWS)—also offer services that
differentiate them from each other.
10) Open source cloud software enables developers to create their
own private clouds on personal or company computers.

MODULE 3: Understanding cloud deployment models


Cloud deployment models (the way that cloud services are delivered
and hosted):
• Public cloud:
- Resources available to anyone.
- The service provider owns the infrastructure, software and security of
the cloud.
- Cheaper and more scalable, but with less control and customization.
- Suitable for businesses and individuals who want to use open source
software without modifying it.
• Private cloud:
- Dedicated to a group of users.
- The service provider owns and manages the infrastructure, software,
and security.
- More expensive and less scalable, but with more control and customization.
• Hybrid cloud:
- Combines public and private clouds.
- The service provider or the organization can choose which resources
run on which cloud.
- More flexible, but more complex and can face security challenges.
• Community cloud:
- Owned by a group of users or organizations that share the same
goals or interests.
- The service provider or one of the organizations owns and manages
the infrastructure.
- More cost-effective, but less scalable and available.

Key points to remember:

1) Large organizations use modern mainframe computers to process large amounts
of data and keep information safe and secure. Modern mainframes provide the
high processing power, high availability, security, scalability, and integration that
today's businesses need.
2) On-premises hosting is an infrastructure model where businesses own and
operate the systems, including hardware and software, needed to run business
operations.
3) Making the move to the cloud involves choosing the right cloud services. This
depends on various factors, including the business's specific needs, scalability of
the services to grow, security, and integration with existing hardware and
software.
4) When businesses are choosing a cloud provider, they must consider the types of
cloud services they will need. They must also consider the cost, reliability, and
support the cloud providers offer.
5) A cloud migration plan is critical for a business to have a smooth transition to the
cloud.
6) Overall, a move to the cloud involves: migrating company data and applications,
testing the migrated system, monitoring the cloud infrastructure,
decommissioning the on-premises infrastructure, and optimizing the new cloud
infrastructure to ensure good performance.
7) It's critical for businesses to monitor metrics and optimize performance to ensure
their cloud solution is efficient, reliable, meeting performance goals, and cost-
effective.
8) The four types of cloud deployment models are public, private, hybrid, and
community.
9) A main factor in choosing a public or a private cloud is the level of control and
security a business needs over their data and applications.
10) A public cloud offers lower costs, scalability, and flexibility. But the business
must share resources and infrastructure with other businesses, which might pose
some risks or limitations.
11) A private cloud offers more customization, privacy, and compliance. But it also
requires more investment, maintenance, and expertise.
12) A hybrid cloud uses public and private cloud services together to achieve
the best results for a business. It requires careful planning and management to
ensure compatibility, security, and performance across different environments.

MODULE 4: Virtualization in the cloud


Cloud providers offer three key types of virtualization:

• Virtual machines (1)
• Virtual storage (2)
• Virtual networks (3)

1→ VMs run on physical hardware, and one physical computer can run or
host several VMs. VMs have their own virtual hardware, such as CPU,
storage, and memory, that is isolated from the host OS and other VMs.

2→ How does virtual storage work?

Virtual storage uses remote servers to store data, such as files, business
information, videos, and images. But how do users interact with that
data?
Users access data in cloud storage through an internet connection. The
software they use to do this might be a web portal, a browser, or a
mobile app.
3→ A virtual network is a network that uses software instead of
physical hardware. It behaves and operates like a regular network, but it
exists only in the digital world, not in the physical world.
Virtual networks:

o Use virtualization technology, which allows engineers to
create a network using software-based network components.
o Can simulate all the functionality of a physical network,
including routing, switching, firewalling, and load balancing.

How do virtual networks work?
Virtual networks use virtual routers, virtual switches, and virtual firewalls
in the same way that physical networks use physical components. They
exist in a layer between physical network components and the workloads
that use them.
Containers and virtualization
Containerization and virtualization are two methods of creating
isolated environments for running software applications. They both
aim to improve the efficiency, security, and portability of software
development and deployment.
However, they differ in significant ways:

o Virtualizing computer resources typically means creating one or more virtual machines
to provide a resource for performing various tasks.

o Containerization means creating containers that run applications and services for
performing tasks.
Key difference between virtualization and containerization:
o Virtualization uses the hardware of the host computer but has its own operating
system and software. Developers call this the hardware abstraction level.
o Containerization uses the hardware and operating system of the host computer
but has its own software. Developers call this the operating system abstraction
level.
o VMs vs containers:
→ Containers show better resource utilization, startup time, and
performance because they share the same OS with the host computer.
→ Because they share the same OS with the physical computer,
containers are less secure than VMs.
→ And because containers can only run on hosts that use the same
operating system as themselves, they are less compatible with host
software than VMs, which have their own operating system.

Container tools:
There are many container applications in use today. Some are proprietary and others are open
source. Three widely used container applications to be familiar with are:

o Docker, which is the most popular tool for creating and running
containers.
o Kubernetes, which is a system for orchestrating multiple containers
across different machines or clouds. Orchestration is a way to manage and automate
tasks across multiple cloud platforms.
o Podman, which is a newer alternative to Docker for creating and
running containers that offers some advantages in security and simplicity.

There are three key benefits of using container applications.

▪ They can run on any cloud platform that supports them, without requiring changes to
the application code. This means that developers can focus on creating and updating
their applications, without worrying about the underlying infrastructure or
compatibility issues.
▪ Because users can create and destroy containers in seconds, containers also facilitate
faster deployment and easier updates.

▪ Developers can group them together into clusters. Clusters are collections of
containers that work together to provide a service or function. For example, a cluster
of containers can run a web server, a database, and a load balancer. Developers can
use tools such as Kubernetes to manage their clusters.

Microservices:
Microservices communicate using APIs (application programming
interfaces).
An application programming interface (API) is a set of definitions
and protocols that enable two software components to communicate
with each other.

APIs are like contracts that define the rules of communication between
microservices. They specify what types of data can be sent and received.
Best practices when using containers and microservices:


• Design each microservice to be loosely coupled and highly
cohesive. This means that each microservice should have a clear and
single responsibility, and should minimize its dependencies on other
services. This way, businesses can develop, test, deploy, and update
each microservice independently, without affecting the rest of the
system.
• Use a consistent and standardized approach to communicate
between microservices. Businesses can do this by using common
protocols, formats, and patterns for inter-service communication, such
as RESTful APIs, JSON, and message queues. This can help to ensure
interoperability, reliability, and security of the communication.
• Implement proper logging, monitoring, and tracing mechanisms for
each microservice and the whole system. This can help to detect and
troubleshoot errors, performance issues, and anomalies in the system. It
can also provide useful insights into the behavior and health of the
system.
• Use a container orchestration tool to automate and manage the
deployment, scaling, networking, and configuration of the
containers. This can help to simplify the complexity of managing many
containers across different environments. Some of the popular container
orchestration tools are Kubernetes, Docker Swarm, and Amazon ECS.
• Apply security best practices at every level of the system. This
includes securing the communication between microservices using
encryption and authentication. It also includes securing the containers
using techniques such as scanning for vulnerabilities, limiting
privileges, and enforcing policies.
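The message-queue pattern mentioned in the communication bullet can be sketched with Python's standard library: a producer service publishes JSON messages to a queue, and a consumer service processes them without either service calling the other directly. The service names and message shape are invented for illustration; in production the in-process queue would be a broker such as RabbitMQ or a cloud queue service.

```python
import json
import queue
import threading

orders = queue.Queue()  # stands in for a real message broker

def order_service():
    """Producer: publishes order events to the queue."""
    for n in range(3):
        orders.put(json.dumps({"order_id": n, "item": "disk"}))
    orders.put(None)  # sentinel: no more messages

processed = []

def billing_service():
    """Consumer: reads and processes events; it never calls the producer."""
    while True:
        raw = orders.get()
        if raw is None:
            break
        processed.append(json.loads(raw)["order_id"])

t1 = threading.Thread(target=order_service)
t2 = threading.Thread(target=billing_service)
t1.start(); t2.start()
t1.join(); t2.join()
print(processed)  # [0, 1, 2]
```

Because the two services share only the queue and the JSON message format, they stay loosely coupled: either one can be redeployed, scaled, or rewritten independently, which is exactly the property the best practices above aim for.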
