
IT APPLICATIONS - END TERM

Q1. Compare Cloud computing to Traditional Computing on Prem in Global enterprise.

Traditional Computing: The conventional approach of managing IT infrastructure within an organization's premises. The organization owns and maintains hardware, software, and networking resources, requiring significant capital investment and operational effort.
Cloud Computing: A model for accessing on-demand computing resources over the internet. Cloud
service providers offer virtual machines, storage, and applications hosted in remote data centers.
Users pay for resources consumed, benefiting from scalability, flexibility, and shared
infrastructure. CSPs handle provisioning, maintenance, security, and updates, allowing
organizations to focus on core business.

User Demand Flexibility:

Cloud Computing: Cloud services provide elastic scalability, allowing enterprises to quickly
adjust their resources based on user demand. It enables enterprises to scale up or down
seamlessly, ensuring optimal performance during peak periods while avoiding overprovisioning
during off-peak times.

Traditional Computing: On-premises infrastructure has limited user demand flexibility. Scaling
up or down requires physical hardware procurement and configuration, leading to longer lead
times and potential underutilization or resource constraints.
Price Flexibility:

Cloud Computing: Cloud services offer a pay-as-you-go model, allowing enterprises to align
their expenses with actual usage. This pricing flexibility enables cost optimization, as enterprises
can scale resources according to their budget and adjust usage patterns as needed.

Traditional Computing: On-premises computing involves upfront capital expenditure on hardware and infrastructure, making it less flexible in terms of pricing. Enterprises need to predict their capacity requirements and invest accordingly, which may result in underutilized resources or the need for additional investments during periods of growth.
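The pricing difference can be made concrete with a small Python sketch: on-prem capacity must be sized for the peak, while pay-as-you-go tracks actual demand. All demand figures and per-server prices below are illustrative assumptions, not vendor quotes.

```python
# Hypothetical cost comparison: fixed on-prem capacity vs. pay-as-you-go cloud.
# All prices and demand figures are illustrative assumptions.

hourly_demand = [10, 12, 50, 80, 45, 15, 8]  # servers needed in each period

ON_PREM_CAPACITY = 80          # must be provisioned for the peak demand
ON_PREM_COST_PER_SERVER = 1.0  # amortized cost per server per period
CLOUD_COST_PER_SERVER = 1.4    # per-period price, typically higher per unit

# On-prem pays for full capacity every period; cloud pays only for usage
on_prem_cost = ON_PREM_CAPACITY * ON_PREM_COST_PER_SERVER * len(hourly_demand)
cloud_cost = sum(d * CLOUD_COST_PER_SERVER for d in hourly_demand)

print(f"on-prem: {on_prem_cost:.1f}, cloud: {cloud_cost:.1f}")
```

Even with a higher per-unit price, the cloud total is lower here because demand is bursty; with steady near-peak demand the comparison can flip, which is why this is a trade-off rather than a universal saving.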

Time-to-Market Agility:

Cloud Computing: Cloud services enable rapid deployment of applications and services,
reducing time-to-market. With preconfigured infrastructure and managed services, enterprises
can focus on development rather than infrastructure setup, allowing for faster innovation and
quicker response to market demands.

Traditional Computing: On-premises computing requires time for infrastructure provisioning, software installation, and configuration. This can lead to longer development and deployment cycles, delaying time-to-market and potentially impacting competitive advantage.
Location Flexibility:

Cloud Computing: Cloud services offer location flexibility, allowing enterprises to deploy their
applications and services across multiple regions and data centers. This provides the ability to
serve customers in different geographic locations efficiently and ensures low latency and better
user experience.

Traditional Computing: On-premises infrastructure limits location flexibility. Enterprises need to establish physical data centers or infrastructure in each desired location, resulting in increased complexity and cost.

Asset Optimization:
Cloud Computing: Cloud services enable resource optimization through dynamic allocation and
efficient utilization of computing resources. Enterprises can scale resources as needed, ensuring
optimal usage and reducing the risk of underutilized or idle hardware.

Traditional Computing: On-premises infrastructure requires careful capacity planning and resource allocation to optimize asset utilization. Scaling resources may involve significant upfront investments and lead times, making it challenging to adapt to changing demands efficiently.

Q2. Draw Cloud Architecture and define main layers.

Cloud Architecture:

Cloud architecture is the design and structure of a cloud computing environment. It encompasses layers like infrastructure, platform, software, management, and security. These layers work together to deliver scalable resources, platforms, and applications, enabling organizations to leverage the benefits of cloud computing for their IT infrastructure and services.
Main Layers of Cloud:

Infrastructure Layer: This layer forms the foundation of cloud architecture and includes
physical resources such as servers, storage devices, networking equipment, and data centers. It
provides the underlying infrastructure on which the higher layers operate.

Platform Layer: The platform layer, also known as the platform as a service (PaaS) layer,
provides a platform for developing, deploying, and managing applications. It offers tools,
runtime environments, and services that abstract away the complexities of infrastructure
management, enabling developers to focus on application development.

Software Layer: The software layer, also referred to as the software as a service (SaaS) layer,
encompasses the cloud-based applications and services that end-users directly interact with.
These applications are hosted and provided by cloud service providers, who handle maintenance,
updates, and scalability, while users access the software via web browsers or dedicated
interfaces.

Q3. Compare Private and Public Cloud computing.

Definition:
Public cloud (off-site and remote) describes cloud computing where resources are dynamically
provisioned on an on-demand, self-service basis over the Internet, via web applications/web
services, open API, from a third-party provider who bills on a utility computing basis.
A private cloud environment is often the first step for a corporation prior to adopting a public cloud initiative. Corporations have discovered the benefits of consolidating shared services on virtualized hardware deployed from a primary datacenter to serve local and remote users.

Ownership and Control:


Private Cloud: Private cloud infrastructure is owned, managed, and operated by a single
organization. It provides dedicated resources and infrastructure for exclusive use by that
organization. The organization has full control over the private cloud, allowing for greater
customization, security, and compliance adherence.
Public Cloud: Public cloud infrastructure is owned and operated by a third-party cloud service
provider (CSP). Multiple organizations share the same pool of resources provided by the CSP.
The organization using public cloud services has limited control over the underlying
infrastructure but gains the benefits of scalability and cost efficiency.

Security and Privacy:


Private Cloud: Private clouds offer greater control and customization over security measures.
Organizations can implement stringent security protocols, access controls, and data encryption
tailored to their specific requirements. Private clouds are generally considered more suitable for
sensitive data and compliance-driven industries.
Public Cloud: Public clouds provide built-in security measures implemented by the CSPs. They
invest heavily in security protocols, physical infrastructure protection, data encryption, and
compliance certifications. However, organizations may have concerns regarding data privacy and
security due to the shared nature of the infrastructure.

Scalability and Resource Flexibility:


Private Cloud: Private clouds offer scalability but with limitations based on the organization's
infrastructure capacity. Scaling up private cloud resources often requires additional investments
in hardware and infrastructure.
Public Cloud: Public clouds provide high scalability and resource flexibility. Organizations can
quickly scale resources up or down based on demand without upfront investments. The shared
nature of the public cloud allows for efficient resource allocation and cost optimization.

Cost:
Private Cloud: Private clouds involve higher initial capital expenditure as organizations need to
procure and maintain their own infrastructure. The costs also include ongoing management and
maintenance expenses.
Public Cloud: Public clouds operate on a pay-as-you-go model, allowing organizations to pay for
the resources they consume. This eliminates upfront capital costs and provides cost flexibility as
organizations can scale resources according to their needs.
Customization and Flexibility:
Private Cloud: Private clouds offer greater customization options, allowing organizations to
tailor the infrastructure and services to their specific needs and requirements.
Public Cloud: Public clouds provide standardized services and limited customization options.
While organizations have flexibility in selecting available services, they may have limited
control over the underlying infrastructure and configurations.

Rationale for Private Cloud:


● Security and privacy of business data was a big concern
● Potential for vendor lock-in
● SLAs required for real-time performance and reliability
● Cost savings of the shared model achieved because of the multiple projects involving
semantic technologies that the company is actively developing

Q4. Define BigData and Hadoop Architecture.

Big data refers to extremely large and complex datasets that cannot be effectively processed
using traditional data processing techniques. These datasets typically involve massive volumes of
structured, semi-structured, and unstructured data, generated from various sources such as social
media, sensors, transaction records, and more. Big data is characterized by the three V's: volume
(large amount of data), velocity (high speed at which data is generated and processed), and
variety (diverse types of data).

Hadoop Architecture: Hadoop is an open-source framework designed for distributed storage and processing of Big Data across clusters of commodity hardware. It consists of two main components:

Hadoop Distributed File System (HDFS): HDFS is a distributed file system that stores data
across multiple machines in a cluster. It breaks data into blocks and replicates them for fault
tolerance. HDFS enables high-throughput access to large datasets and provides reliability and
scalability.
Architecture: The NameNode stores metadata for the files, such as the directory structure of a typical file system. The server holding the NameNode instance is critical, as there is only one. DataNodes store the actual data in HDFS; they can run on any underlying filesystem (ext3/4, NTFS, etc.) and notify the NameNode of which blocks they hold.
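The block-splitting and replication scheme can be sketched in a few lines of Python. A 128 MB block size and a replication factor of 3 are common HDFS defaults, but both are configurable, so treat the numbers as illustrative.

```python
import math

def hdfs_blocks(file_size_mb, block_size_mb=128, replication=3):
    """Return the number of HDFS blocks for a file and the total
    storage consumed across the cluster after replication."""
    n_blocks = math.ceil(file_size_mb / block_size_mb)   # last block may be partial
    total_storage_mb = file_size_mb * replication        # every block copied 3x
    return n_blocks, total_storage_mb

# A 1000 MB file: 8 blocks, 3000 MB of raw storage across DataNodes
blocks, storage = hdfs_blocks(1000)
print(blocks, storage)  # 8 3000
```

This is why HDFS trades raw capacity for fault tolerance: losing any single DataNode still leaves two replicas of every block.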

MapReduce: MapReduce is a programming model and processing paradigm used for distributed
data processing in Hadoop. It allows parallel execution of tasks across the nodes in a cluster. The
MapReduce framework divides a task into smaller subtasks, performs mapping and reducing
functions, and aggregates the results.
Architecture: MapReduce is distributed, with some centralization. The main nodes of the cluster are where most of the computational power and storage of the system lie; they run a TaskTracker to accept and reply to MapReduce tasks, and a DataNode to store the needed blocks as close to the computation as possible. A central control node runs the NameNode to keep track of HDFS directories and files, and the JobTracker to dispatch compute tasks to the TaskTrackers. Hadoop is written in Java and also supports languages such as Python and Ruby via Hadoop Streaming.
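The map/shuffle/reduce flow can be illustrated with the classic word-count example, here simulated in plain Python rather than on an actual Hadoop cluster; the grouping step that the Hadoop framework performs between the phases is written out explicitly.

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit an intermediate (word, 1) pair for every word
    for word in document.lower().split():
        yield word, 1

def reduce_phase(word, counts):
    # Reduce: sum all counts emitted for one key
    return word, sum(counts)

documents = ["the quick brown fox", "the lazy dog", "the fox"]

# Shuffle: group intermediate pairs by key (done by the framework in Hadoop)
grouped = defaultdict(list)
for doc in documents:
    for word, count in map_phase(doc):
        grouped[word].append(count)

result = dict(reduce_phase(w, c) for w, c in grouped.items())
print(result["the"], result["fox"])  # 3 2
```

On a real cluster each mapper processes one HDFS block in parallel, and the shuffle moves intermediate pairs across the network so that all counts for a given word land on the same reducer.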

Hadoop architecture also includes additional components such as YARN (Yet Another Resource
Negotiator) for resource management and job scheduling, and various ecosystem tools like Hive,
Pig, Spark, and HBase that extend the capabilities of Hadoop for data processing, querying,
analytics, and real-time applications. Overall, Hadoop architecture provides a scalable, fault-
tolerant, and cost-effective solution for processing and analyzing Big Data, enabling
organizations to harness the power of massive datasets for valuable insights and decision-
making.
Q5. What is SaaS and PaaS? Give 2 business deployment examples.

Software as a Service (SaaS) and Platform as a Service (PaaS) are two prominent models in
cloud computing that have transformed the way businesses access and utilize software
applications and development platforms.

SaaS eliminates the need for organizations to install and maintain software locally, as it is hosted
and managed by a third-party provider. Users can access the software through web browsers,
making it convenient and accessible from anywhere. SaaS allows businesses to focus on their
core operations while relying on reliable and up-to-date software applications. One exemplary
SaaS platform is Salesforce, a leading customer relationship management (CRM) solution.
Salesforce offers a comprehensive suite of CRM tools, including sales, marketing, and customer
support functionalities. By leveraging Salesforce as a SaaS solution, businesses can efficiently
manage customer interactions, track sales activities, and optimize marketing campaigns. The
cloud-based infrastructure of Salesforce ensures seamless updates and maintenance, providing
organizations with the latest features and security enhancements. Another notable SaaS example
is Workday, an enterprise resource planning (ERP) system. Workday streamlines various aspects
of business operations, such as finance, human resources, and supply chain management. It
offers a user-friendly interface and robust analytics capabilities, enabling organizations to
improve efficiency, streamline processes, and make data-driven decisions. With Workday as a
SaaS solution, businesses can leverage a scalable and flexible ERP system without the need for
extensive infrastructure management.

Moving on to PaaS: it provides a platform for developing, deploying, and managing applications without the complexities of infrastructure management. PaaS offerings simplify the
development process by providing pre-configured environments, frameworks, and tools that
enable developers to focus on coding and application logic. Heroku is a prime example of a PaaS
platform that allows businesses to develop and deploy web applications with ease. It offers a
scalable and fully managed environment that supports various programming languages and
frameworks. Heroku's streamlined deployment process and automatic scaling capabilities enable
developers to focus on building robust and innovative applications, without worrying about
infrastructure provisioning or management. Furthermore, Google Cloud's BigQuery is a PaaS
offering that excels in data analytics and processing. It provides a powerful and scalable platform
for analyzing large datasets using SQL queries and distributed computing. Businesses can
leverage BigQuery's capabilities to gain valuable insights, perform complex data analysis, and
drive data-based decision-making.
Q6. Define and draw IoT Architecture.

IoT (Internet of Things): The Internet of Things (IoT) refers to a network of interconnected
physical devices, vehicles, appliances, and other objects embedded with sensors, software, and
network connectivity that enables them to collect and exchange data. IoT enables these devices
to interact and communicate with each other and with the cloud, allowing for seamless
integration of the physical and digital worlds. IoT Architecture: The architecture of IoT involves
multiple layers that work together to enable the communication, data processing, and
functionality of IoT systems. Here is a high-level overview of the IoT architecture:

Perception Layer: The perception layer consists of physical devices or sensors that gather data
from the environment. These devices can include sensors, actuators, cameras, RFID (Radio
Frequency Identification) tags, and more. They collect data such as temperature, humidity,
pressure, location, and other relevant information.

Network Layer: The network layer is responsible for transmitting the data collected by the
perception layer to the appropriate destinations. It involves various network technologies such as
Wi-Fi, Bluetooth, Zigbee, cellular networks, or even satellite communication. This layer ensures
the connectivity and communication between the devices and the Internet.

Middleware Layer: The middleware layer acts as a bridge between the devices/sensors and the
application layer. It handles tasks such as data filtering, data aggregation, protocol translation,
and device management. The middleware layer enables seamless integration of diverse devices
and protocols into a unified system.

Application Layer: The application layer comprises the software applications and services that
process and analyze the data collected from the devices. It involves data storage, real-time
analytics, machine learning algorithms, and visualization tools. The application layer provides
actionable insights and enables decision-making based on the collected data.

Business Layer: The business layer represents the domain-specific applications and services
built on top of the IoT infrastructure. It encompasses various use cases and applications, such as
smart homes, industrial automation, healthcare monitoring, transportation systems, and more.
The business layer leverages the capabilities of the underlying IoT architecture to deliver specific
functionalities and value to end users.
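The flow of data up through these layers can be sketched in Python. The sensor names, the validity range, and the heating rule below are purely illustrative assumptions, not part of any IoT standard.

```python
# Minimal sketch of data flowing up the IoT stack (perception -> middleware
# -> application); all values and rules are illustrative.

def perception_layer():
    # Sensors produce raw readings (here, hard-coded temperatures in Celsius;
    # t2 is a faulty sensor emitting an implausible value)
    return [{"sensor": "t1", "temp": 21.5},
            {"sensor": "t2", "temp": 95.0},
            {"sensor": "t3", "temp": 22.1}]

def middleware_layer(readings):
    # Filter out implausible values and aggregate one reporting cycle
    valid = [r for r in readings if -40 <= r["temp"] <= 60]
    return {"count": len(valid),
            "avg_temp": sum(r["temp"] for r in valid) / len(valid)}

def application_layer(summary):
    # Turn aggregated data into an actionable decision
    return "heating on" if summary["avg_temp"] < 22 else "heating off"

summary = middleware_layer(perception_layer())
print(application_layer(summary))  # heating on
```

The network layer is implicit here (a function call stands in for Wi-Fi/Zigbee/cellular transport), but the division of labor matches the layers above: sensing, cleaning/aggregating, then deciding.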
Q7. Draw Application Architecture of AI in Healthcare.

In the context of AI in healthcare, the application architecture typically involves multiple layers
and components that work together to enable various AI capabilities. Here's a general overview
of the architecture:

Data Collection: The architecture starts with data collection from various sources such as
electronic health records (EHRs), medical devices, wearables, and imaging systems. These data
sources provide the foundation for AI models and algorithms.

Data Preprocessing: Once the data is collected, it needs to be preprocessed and cleaned to ensure
accuracy and consistency. Preprocessing steps may include data normalization, handling missing
values, removing outliers, and anonymization to protect patient privacy.

Data Storage: Processed data is then stored in secure and scalable data repositories such as data
lakes or data warehouses. These repositories enable efficient data management and retrieval for
AI processing.

AI Model Development: This layer involves the development and training of AI models and
algorithms. Techniques like machine learning, deep learning, natural language processing, and
computer vision are employed to create models that can extract insights, make predictions, and
assist in clinical decision-making.
Model Training and Validation: The developed AI models are trained using the collected and
preprocessed data. Training involves feeding the model with labeled examples to learn patterns
and make predictions. The trained models are then validated using separate datasets to ensure
their accuracy, performance, and generalizability.
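The train/validate split described above can be shown with a deliberately tiny example in plain Python. The one-feature threshold "model" is a hypothetical stand-in for a real ML algorithm; the point is only that the model is fit on one subset and scored on data it has never seen.

```python
# Toy labeled dataset: (feature, label), where label = 1 when feature > 0.5
features = [0.05, 0.12, 0.23, 0.31, 0.44, 0.48, 0.51, 0.58, 0.66, 0.73,
            0.81, 0.90, 0.15, 0.95, 0.37, 0.62, 0.08, 0.88, 0.41, 0.70]
data = [(x, int(x > 0.5)) for x in features]

# Hold out a separate validation set, as described above
split = int(0.8 * len(data))            # 16 training, 4 validation examples
train, validation = data[:split], data[split:]

# "Training": pick the threshold with the best accuracy on the training set
best_threshold = max((t / 100 for t in range(100)),
                     key=lambda t: sum(int(x > t) == y for x, y in train))

# Validation: score the fitted model on unseen data
val_accuracy = sum(int(x > best_threshold) == y
                   for x, y in validation) / len(validation)
print(f"threshold={best_threshold:.2f}, validation accuracy={val_accuracy:.2f}")
```

In a real healthcare setting the same discipline applies at much larger scale: a model that scores well only on its own training data cannot be trusted for clinical decision support, which is why the validation set must be kept separate.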

Inference and Decision Support: Once trained and validated, the AI models are deployed for
real-time inference. They analyze new data, provide predictions, and offer decision support to
healthcare professionals. This layer includes integrating AI capabilities into existing healthcare
systems and workflows.

User Interface: The final layer of the architecture encompasses the user interface, which enables
healthcare professionals to interact with the AI system. This interface can take the form of a
web-based application, a mobile app, or integration into existing clinical systems. It allows users
to input data, view AI-generated insights, and access recommendations for diagnosis, treatment
planning, or patient monitoring.
BLOCKCHAIN: Foundation, Application and Future:

Blockchain technology has emerged as a transformative innovation with the potential to revolutionize various industries and sectors. At its core, blockchain is a decentralized and
distributed ledger that enables secure and transparent transactions and information sharing. This
technology has gained significant attention due to its foundations, wide-ranging applications, and
promising future directions.

Foundations: The foundation of blockchain lies in its key features. First, it operates on a
decentralized network of computers, known as nodes, which collectively maintain and validate the
blockchain. This decentralized nature eliminates the need for intermediaries and enhances trust
and security. Each node in the network holds a copy of the entire blockchain, ensuring redundancy
and resilience. Second, blockchain relies on cryptographic techniques to ensure data integrity,
immutability, and privacy. Transactions recorded on the blockchain are cryptographically linked
and stored in blocks, forming a chain of blocks that cannot be altered without consensus from the
network. Cryptographic hash functions ensure the integrity of the data, making it virtually
impossible to tamper with the recorded transactions. Third, blockchain utilizes consensus
algorithms, such as Proof of Work (PoW) or Proof of Stake (PoS), to validate and agree on the
order of transactions, ensuring the integrity of the ledger. In PoW, miners compete to solve
complex mathematical puzzles, while in PoS, validators are selected based on the amount of
cryptocurrency they hold, reducing energy consumption and increasing scalability.
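The hash-linked chain and Proof of Work described above can be demonstrated with a toy Python example; a real blockchain adds networking, distributed consensus, and vastly higher difficulty, so this is a sketch of the data structure only.

```python
import hashlib
import json

def block_hash(block):
    # Cryptographic link: hash the block's full contents, including prev_hash
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def mine(block, difficulty=2):
    # Toy Proof of Work: find a nonce so the hash starts with `difficulty` zeros
    while not block_hash(block).startswith("0" * difficulty):
        block["nonce"] += 1
    return block

chain = [mine({"index": 0, "prev_hash": "0" * 64, "tx": "genesis", "nonce": 0})]
chain.append(mine({"index": 1, "prev_hash": block_hash(chain[0]),
                   "tx": "alice->bob: 5", "nonce": 0}))

def chain_valid(chain):
    # Tampering with any earlier block breaks every later prev_hash link
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

print(chain_valid(chain))            # True
chain[0]["tx"] = "genesis (tampered)"
print(chain_valid(chain))            # False
```

Changing even one character in block 0 changes its hash, so block 1's stored `prev_hash` no longer matches; this is the immutability property the text describes, and the nonce search is the (here trivial) Proof of Work that makes rewriting history expensive at scale.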

Applications: Blockchain has found applications across various sectors. In finance, cryptocurrencies like Bitcoin and Ethereum utilize blockchain to enable secure and transparent
digital transactions without relying on centralized financial institutions. Blockchain-based smart
contracts enable self-executing agreements, eliminating the need for intermediaries and reducing
costs. Supply chain management has benefited from blockchain's ability to provide an immutable
record of product provenance, ensuring transparency and traceability. By recording every step of
the supply chain on the blockchain, stakeholders can verify the authenticity and quality of
products, detect counterfeit items, and streamline logistics processes. Healthcare is exploring
blockchain for secure storage and sharing of patient records, enabling interoperability and
enhancing data privacy. Blockchain can provide a decentralized and tamper-proof system for
managing electronic health records, enabling patients to have control over their data and granting
healthcare providers access to accurate and up-to-date medical information. Other sectors,
including logistics, real estate, voting systems, and intellectual property, are also exploring
blockchain applications.
Future Directions: The future of blockchain technology holds immense potential for further
advancements and innovations. One direction is the advancement of interoperability among
different blockchains, allowing seamless communication and information exchange. Currently,
blockchains operate in isolation, hindering the efficient flow of data and assets between different
networks. Interoperability protocols, such as Polkadot and Cosmos, aim to bridge this gap by
enabling cross-chain transactions and interoperability of smart contracts. Another area of focus is
scalability, as current blockchain networks face challenges in processing large volumes of
transactions quickly. Solutions like sharding, where the blockchain is divided into smaller
partitions, and layer 2 protocols, such as Lightning Network, aim to address this issue by increasing
transaction throughput and reducing latency. Additionally, advancements in privacy-enhancing
technologies, such as zero-knowledge proofs and secure multiparty computation, can enable
private transactions on public blockchains, striking a balance between transparency and
confidentiality. The integration of blockchain with other emerging technologies, such as artificial
intelligence (AI) and the Internet of Things (IoT), opens up new possibilities for decentralized and
autonomous systems. Blockchain can provide a secure and transparent framework for managing
AI algorithms, ensuring data integrity and privacy. IoT devices can leverage blockchain for secure
device identity, data integrity, and decentralized data sharing. Moreover, there is a growing interest
in the development of blockchain-based decentralized finance (DeFi) platforms, enabling peer-to-
peer lending, decentralized exchanges, and other financial services without the need for traditional
intermediaries. DeFi protocols, built on blockchain networks like Ethereum, provide open and
permissionless access to financial services, allowing individuals to have greater control over their
assets and participate in decentralized lending and borrowing. Blockchain is also being explored
for its potential in addressing social and environmental challenges. For example, blockchain can
be used to verify the authenticity of sustainable products by recording their supply chain and
certifications on the blockchain. It can also be used to track carbon emissions and facilitate carbon
offset transactions, enabling transparency and accountability in environmental initiatives.
