
Amity Institute of Information Technology

Name : Sumit Ravindra Bhavsar

Enrollment No : A710145023046

Programme:
MCA (Semester II)

Course Name: Power BI

Course Code: IFT4106


Subject Teacher: Dr. Karuna Sitaram Kirwale

AMITY UNIVERSITY MAHARASHTRA

Established vide Maharashtra Act No. 13 of 2014, of Government
of Maharashtra, and recognized under Section 2(f) of UGC Act 1956

CERTIFICATE

This is to certify that Sumit Ravindra Bhavsar Enrollment No.


A710145023046 of class MCA, Semester II, has satisfactorily completed the
Power BI Practical Lab Manual prescribed by Amity University Maharashtra
during the academic year 2023-2024.

Sign of Faculty Sign of HOI

Name: Dr. Karuna Sitaram Kirwale Name: Dr. Manoj Devare



Index
Sr. No. Topic

1. Installation of Power BI

2. Import legacy data from different sources and load it into the target system

3. Perform the Extraction, Transformation and Loading (ETL) process to construct the database using Power BI

4. Creation of a data cube with suitable dimension tables and fact table

5. Extracting data from the data warehouse using MDX queries

6. Visualization of data using pivot table and pivot chart

7. Application of what-if analysis for data visualization

8. Implementation of data classification with decision tree

9. Implementation of data clustering

10. Implementation of association rule mining

11. Implementation of regression and neural network

12. Preparation of report in SQL Server Reporting Services



Practical 1: Installation of Power BI


Step 1: System Requirements
Before you start the installation process, ensure that your system meets the minimum requirements:
Operating System: Windows 10, Windows 7, Windows 8, or Windows Server 2012 or later.
Memory: At least 2 GB of RAM; 4 GB or more recommended.
.NET Framework: Version 4.6.2 or later.
Step 2: Download Power BI Desktop
Open your web browser and go to the official Microsoft Power BI website: Power BI Desktop Download.
Click on the Try for free button.
You will be redirected to the Microsoft Store page for Power BI Desktop. Click on Get.

Step 3: Installation via Microsoft Store


If prompted, sign in with your Microsoft account.
Click Install to download and install Power BI Desktop.

Once the download is complete, Power BI Desktop will automatically install on your machine.
You can launch Power BI Desktop directly from the Microsoft Store or find it in the Start menu.

Step 4: Launch Power BI Desktop


After installation, you can launch Power BI Desktop from the Start menu.
The first time you run Power BI Desktop, you may be prompted to sign in with your Microsoft account.

Practical 2: Import legacy data from different sources and load it into the target system
Step 1: Create an Excel sheet with data

Step 2: Open Power BI Desktop

Step 3: Go to the Home ribbon -> Get Data -> Excel and browse to your Excel file

Step 4: In the Navigator tab, select your table (Sheet1) from your dataset (Student_Marks.xlsx)

Step 5: Click Transform Data

Step 6: The Power Query Editor screen shown below will open for your queries



Practical 3: Perform the Extraction, Transformation and Loading (ETL) process to construct the
database using Power BI
Step 1: Open Power BI.

Step 2: Get Data -> Excel



Step 3: Choose your Excel data.

Step 4: Loading might take time.


Step 5: Select Sheet1.

Step 6: Click on Transform data


Step 7: Power Query Editor window will open.

Step 8: Reduce Rows -> Remove Rows -> Remove Top Rows.

Step 9: Enter the number of rows you want to remove (here it is just one row from the top)

Step 10: Transform -> Use First Row as Headers.



Step 11: Your table has appropriate headers now.

Step 12: Right-click the columns which you don’t want to use -> Remove Columns.

Step 13: Replace values by selecting that cell -> right-click -> Replace Values…

Step 14: Provide the new value and click on OK.

Step 15: All cells containing the value you chose to replace are now replaced with the new value.
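For reference only, the same clean-up can also be expressed as a script. Below is a minimal R sketch of equivalent transformations (an illustration, not part of the Power BI workflow; it assumes the readxl package is installed, uses the Student_Marks.xlsx file from Practical 2, and the column name "Remarks" and the replaced values are hypothetical):

# Illustrative R equivalent of the Power Query steps above
library(readxl)

# Read Sheet1 as plain text, with no header row assigned yet
raw <- as.data.frame(read_excel("Student_Marks.xlsx", sheet = "Sheet1",
                                col_names = FALSE, col_types = "text"))

# Steps 8-9: remove the top row
raw <- raw[-1, ]

# Step 10: use the first remaining row as headers
names(raw) <- as.character(unlist(raw[1, ]))
dat <- raw[-1, ]

# Step 12: remove an unwanted column ("Remarks" is a hypothetical column name)
dat$Remarks <- NULL

# Steps 13-15: replace one value with another (hypothetical old/new values)
dat[dat == "AB"] <- "Absent"

head(dat)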

Practical 4: Create the cube with suitable dimension and fact tables based on OLAP.

Step 1: Creating Data Warehouse


Let us execute the T-SQL script to create the data warehouse with fact and dimension tables and populate
them with appropriate test values.
Download the T-SQL script for creating the Sales data warehouse from the article “Create First Data
Warehouse” and run it in your SQL Server.

Downloading "Data_WareHouse SQLScript.zip" from the article


https://www.codeproject.com/Articles/652108/Create-First-Data-WareHouse

After downloading, extract the file into a folder.


Follow the given steps to run the query in SSMS (SQL Server Management Studio).

1. Open SQL Server Management Studio 2012

2. Connect Database Engine



Click Connect.

3. Open New Query editor

4. Copy and paste the scripts given in the various steps into the new query editor window one by one

5. To run the given SQL Script, press F5

6. It will create and populate the “Sales_DW” database on your SQL Server


OR

1. Go to the extracted SQL file and double-click it.

2. A new SQL query editor will open containing the Sales_DW script.

3. Execute the queries one by one by selecting a query and pressing F5, or directly click Execute to run the whole script.
4. After execution completes, save and close SQL Server Management Studio, then reopen it to see
Sales_DW in the Databases tab.
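The downloaded script defines the actual Sales_DW objects. Purely as an illustration of the kind of star-schema DDL such a script contains, here is a minimal sketch (the database, table and column names below are simplified examples and do not match the real script exactly):

-- Minimal star schema sketch: one fact table referencing two dimension tables
CREATE DATABASE Sales_DW_Demo;
GO
USE Sales_DW_Demo;
GO

-- Dimension: one row per product
CREATE TABLE DimProduct (
    ProductKey   INT IDENTITY(1,1) PRIMARY KEY,
    ProductName  NVARCHAR(100),
    ProductPrice DECIMAL(10,2)
);

-- Dimension: one row per calendar date
CREATE TABLE DimDate (
    DateKey   INT PRIMARY KEY,   -- e.g. 20130101
    FullDate  DATE,
    [Year]    INT,
    [Quarter] INT,
    [Month]   INT
);

-- Fact: one row per sales line, with foreign keys into the dimensions
CREATE TABLE FactProductSales (
    SalesKey       INT IDENTITY(1,1) PRIMARY KEY,
    ProductKey     INT REFERENCES DimProduct(ProductKey),
    DateKey        INT REFERENCES DimDate(DateKey),
    Quantity       INT,
    SalesTotalCost DECIMAL(10,2)
);

-- A few test rows so the cube has data to aggregate
INSERT INTO DimProduct (ProductName, ProductPrice) VALUES ('Sample Product', 25.00);
INSERT INTO DimDate (DateKey, FullDate, [Year], [Quarter], [Month])
VALUES (20130101, '2013-01-01', 2013, 1, 1);
INSERT INTO FactProductSales (ProductKey, DateKey, Quantity, SalesTotalCost)
VALUES (1, 20130101, 2, 50.00);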

Step 2: Open Visual Studio 2022



Select the Analysis Services Multidimensional and Data Mining Project template, give the project an appropriate name, and click OK

Right click on Data Sources in solution explorer -> New Data Source

Data Source Wizard appears

Click on New

Select Server Name -> select Use SQL Server Authentication -> Select or enter a database name
(Sales_DW)
Click Next

Select Inherit -> Next


Finish

Sales_DW.ds gets created under Data Sources in Solution Explorer


Step 3: Creating New Data Source View

In Solution Explorer right click on Data Source Views -> Select New Data Source View
Click Next

Select FactProductSales (dbo) from Available objects and move it to Included objects by clicking
the > button
Click Finish

Sales DW.dsv appears in Data Source Views in Solution Explorer.


Step 4: Creating new cube

Right click on Cubes -> New Cube


Select Use existing tables in Select Creation Method -> Next
In Select Measure Group Tables -> Select FactProductSales -> Click Next

In Select Measures -> check all measures -> Next


In Select New Dimensions -> Check all Dimensions -> Next

Click on Finish
Sales_DW.cube is created

Step 5: Dimension Modification

In dimension tab -> Double Click Dim Product.dim

Drag and drop Product Name from the table in the Data Source View and add it to the Attributes pane on the left side
Step 6: Creating Attribute Hierarchy in Date Dimension

Double click on the Dim Date dimension -> drag and drop fields from the table shown in the Data Source View to
Attributes -> drag and drop attributes from the leftmost Attributes pane to the middle Hierarchy pane.

Drag the fields in sequence from Attributes to the Hierarchy window (Year, Quarter Name, Month Name, Week of
the Month, Full Date UK)

Step 7: Deploy Cube

Right click on Project name -> Properties


This window appears

Make the following changes and click Apply and OK


Right click on project name -> Deploy
Deployment successful

To process the cube, right click on Sales_DW.cube -> Process


Click run

Browse the cube for analysis in solution explorer


Practical 5: Extracting data from the data warehouse using MDX queries

Step 1: Open SQL Server Management Studio and connect to Analysis Services.
Server type: Analysis Services
Server Name: (according to base machine)

Click on connect

Step 2: Click on New Query and type the following queries based on Sales_DW
select [Measures].[Sales Time Alt Key] on columns
from [Sales DW]
Click on execute

select [Measures].[Quantity] on columns from [Sales DW]


select [Measures].[Sales Invoice Number] on columns from [Sales DW]

select [Measures].[Sales Total Cost] on columns from [Sales DW]


select [Measures].[Sales Total Cost] on columns , [Dim Date].[Year].[Year]
on rows from [Sales DW]

select [Measures].[Sales Total Cost] on columns,
NONEMPTY({[Dim Date].[Year].[Year]}) on rows
from [Sales DW]

select [Measures].[Sales Total Cost] on columns
from [Sales DW]
where [Dim Date].[Year].[Year].&[2013]
Practical 6: Visualization of Data using Pivot Table and Pivot Chart.

Connect to the SQL Server database.

Open Microsoft Excel and import the data into Excel.

Click Insert -> PivotTable -> From External Data Source.


Create a new source
Click on pivot chart and choose the chart
[Pivot table output: Sum of CustomerID for invoices IMI-001 to IMI-005, Year 2014, Product "Arial Washing
Powder 1kg", Region Guj]
Practical 7: Application of what if analysis for data visualization
You own a book store and have 100 books in storage. You sell a certain % for the highest price of $50 and a
certain % for the lower price of $20.

If you sell 60% for the highest price, cell D10 calculates a total profit of 60 * 50 + 40 * 20 = 3800.
Create Different Scenarios
But what if you sell 70% for the highest price? And what if you sell 80% for the highest price? Or 90%, or
even 100%? Each different percentage is a different scenario. You can use the Scenario Manager to create
these scenarios.

Note: Instead of typing a different percentage into cell C4 each time to see the corresponding result of a
scenario in cell D10, we use what-if analysis.
What-if analysis enables you to easily compare the results of different scenarios.
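To see what the Scenario Manager is computing, the profit formula can also be sketched outside Excel. The following small R snippet (an illustration only, not part of the Excel exercise) evaluates the same arithmetic for each percentage scenario, using the 100 books and the $50/$20 prices from the example above:

# Illustrative check of the scenario arithmetic (not part of the Excel exercise)
books      <- 100
high_price <- 50
low_price  <- 20

# Percentage of books sold at the highest price in each scenario
scenarios <- c(0.6, 0.7, 0.8, 0.9, 1.0)

# Profit = (books sold high * $50) + (books sold low * $20), as computed in cell D10
profit <- books * scenarios * high_price + books * (1 - scenarios) * low_price

data.frame(percent_high = scenarios * 100, profit = profit)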

Step 1: In Excel, on the Data tab, in the Data Tools group, click What-If Analysis.
Step 2: Click What-If Analysis and select Scenario Manager.
The Scenario Manager Dialog box appears.

Step 3: Add a scenario by clicking on Add.

Step 4: Type a name (60percent), select cell F11 (% sold for the highest price) for the Changing cells and
click on OK.

Click on the icon which is circled.

Click the icon again and then click OK.

Step 5: Enter the corresponding value 0.6 and click on OK again.


Step 6: To apply scenarios, click on Show

Step 7: Next, add 4 other scenarios (70%, 80%, 90% and 100%).
Practical 8: Implementation of data classification with decision tree

 Time series is a series of data points in which each data point is associated with a timestamp.
 A simple example is the price of a stock in the stock market at different points of time on a given
day.
 Another example is the amount of rainfall in a region at different months of the year.
 R language uses many functions to create, manipulate and plot the time series data.
 The data for the time series is stored in an R object called a time-series object. It is also an R data
object, like a vector or data frame. The time series object is created by using the ts() function.

Syntax:

The basic syntax for the ts() function in time series analysis is:

timeseries.object.name <- ts(data, start, end, frequency)

Following is the description of the parameters used −

• data is a vector or matrix containing the values used in the time series.

• start specifies the start time for the first observation in time series.

• end specifies the end time for the last observation in time series.

• frequency specifies the number of observations per unit time.

Except for the "data" parameter, all other parameters are optional.

Consider the annual rainfall details at a place starting from January 2012. We create an R time series object
for a period of 12 months and plot it. Code to run in R

# Get the data points in the form of a R vector

rainfall <- c(799,1174.8,865.1,1334.6,635.4,918.5,685.5,998.6,784.2,985,882.8,1071)

# Convert it to a time series object

rainfall.timeseries <- ts(rainfall, start = c(2012, 1), frequency = 12)


# Print the timeseries data

print(rainfall.timeseries)

# Give the chart file a name

png(file = "rainfall.png")

# Plot a graph of the time series

plot(rainfall.timeseries)

# Save the file

dev.off()

# Plot the time series again to display the chart in the R console

plot(rainfall.timeseries)

Output:
Practical 9: Implementation of data clustering
# Apply k-means to the iris data and store the result

# Load the iris dataset
data(iris)

# Remove the Species column for clustering
newiris <- iris
newiris$Species <- NULL

# Apply k-means clustering to the data
kc <- kmeans(newiris, 3)

# Print the k-means clustering result
print(kc)

# Compare the Species label with the clustering result
comparison_table <- table(iris$Species, kc$cluster)
print(comparison_table)

# Plot the clusters and their centers and save the plot to a file
png(file = "iris_clusters.png")
plot(newiris[c("Sepal.Length", "Sepal.Width")], col = kc$cluster,
     main = "K-means Clustering of Iris Data")
points(kc$centers[, c("Sepal.Length", "Sepal.Width")], col = 1:3, pch = 8, cex = 2)
dev.off()

# Plot the clusters and their centers again to display in the R console
plot(newiris[c("Sepal.Length", "Sepal.Width")], col = kc$cluster,
     main = "K-means Clustering of Iris Data")
points(kc$centers[, c("Sepal.Length", "Sepal.Width")], col = 1:3, pch = 8, cex = 2)

Output:
Practical 10: Implementation of association rule mining.
Code:
# Load the arules package, which provides the transactions class and the apriori() function
library(arules)

# Sample data: Each list represents a transaction
data <- list(
c("Milk", "Bread", "Eggs"),
c("Milk", "Diapers", "Beer", "Bread"),
c("Bread", "Eggs"),
c("Milk", "Diapers", "Beer", "Cola"),
c("Diapers", "Bread", "Milk", "Eggs")
)

# Convert the data into transactions
trans <- as(data, "transactions")

# Display a summary of the transactions
summary(trans)

# Generate frequent itemsets with a minimum support threshold
frequent_itemsets <- apriori(trans, parameter = list(supp = 0.6, target = "frequent itemsets"))

# Display the frequent itemsets
inspect(frequent_itemsets)

# Generate association rules with minimum support and confidence thresholds
rules <- apriori(trans, parameter = list(supp = 0.6, conf = 0.7, target = "rules"))

# Display the association rules
inspect(rules)
Practical 11: Implementation of regression and neural network.
Code:
# Load the packages used below: ggplot2 for plotting, neuralnet for the neural network
library(ggplot2)
library(neuralnet)

# Linear Regression Example

# Load the mtcars dataset


data(mtcars)

# Display the first few rows of the dataset


head(mtcars)

# Fit the linear regression model


model <- lm(mpg ~ wt, data = mtcars)

# Display the summary of the model


summary(model)

# Create a scatter plot with the regression line


ggplot(mtcars, aes(x = wt, y = mpg)) +
  geom_point() +
  geom_smooth(method = "lm", col = "blue") +
  labs(title = "Linear Regression: MPG vs Weight",
       x = "Weight (1000 lbs)", y = "Miles per Gallon (mpg)")

# Neural Network Example

# Normalize the data


normalize <- function(x) {
return((x - min(x)) / (max(x) - min(x)))
}

mtcars_normalized <- as.data.frame(lapply(mtcars, normalize))


# Split the data into training and testing sets
set.seed(123)
training_indices <- sample(1:nrow(mtcars_normalized), 0.7 * nrow(mtcars_normalized))
training_data <- mtcars_normalized[training_indices, ]
testing_data <- mtcars_normalized[-training_indices, ]

# Train the neural network


nn <- neuralnet(mpg ~ wt + hp, data = training_data, hidden = c(5, 3), linear.output = TRUE)

# Plot the neural network


plot(nn)

# Make predictions on the testing data


predictions <- compute(nn, testing_data[, c("wt", "hp")])$net.result

# Denormalize the predictions


denormalize <- function(x, min, max) {
return(x * (max - min) + min)
}

actual_mpg <- denormalize(testing_data$mpg, min(mtcars$mpg), max(mtcars$mpg))


predicted_mpg <- denormalize(predictions, min(mtcars$mpg), max(mtcars$mpg))

# Evaluate the model


results <- data.frame(actual_mpg, predicted_mpg)
correlation <- cor(results)
print(correlation)

# Visualize the results


ggplot(results, aes(x = actual_mpg, y = predicted_mpg)) +
  geom_point() +
  geom_abline(slope = 1, intercept = 0, col = "red") +
  labs(title = "Neural Network: Actual vs Predicted MPG",
       x = "Actual MPG", y = "Predicted MPG")
Output:
Practical 12: Preparation of report in SQL Server Reporting Services

Step 1: Install SQL Server and SSRS

Install SQL Server

1. Download SQL Server: Go to the Microsoft SQL Server downloads page and download
the appropriate version of SQL Server.
2. Run the Installer: Follow the installation wizard to install SQL Server. Make sure to
select the Database Engine Services.

Install SQL Server Reporting Services (SSRS)

1. Download SSRS: Download SSRS from the Microsoft website.


2. Run the Installer: Follow the installation wizard to install SSRS.


Step 2: Configure SSRS

Open Reporting Services Configuration Manager

1. Open the Tool: Go to Start > SQL Server > SQL Server Reporting Services
Configuration Manager.
2. Connect to the Report Server: Select the appropriate server name and instance, then
click Connect.
Configure SSRS

1. Service Account:
o Click on Service Account.
o Select the built-in account or specify a domain account to run the Report Server
service.
o Click Apply.
2. Web Service URL:
o Click on Web Service URL.
o Configure the URL for the Report Server Web Service (e.g.,
http://<ServerName>/ReportServer).
o Click Apply.

Step 3: Create Report Server Database

1. Database:
o Click on Database.
o Click Change Database.
o Select Create a new report server database, then click Next.
Database Server:

 Specify the SQL Server instance where the Report Server Database will be hosted.
 Click Test Connection to ensure connectivity.
 Click Next.
Database Name:

 Provide a name for the Report Server Database, typically ReportServer.


 Optionally, you can create a ReportServerTempDB for temporary data storage.
 Click Next.

Credentials:

 Specify the credentials that the Report Server will use to connect to the database. You
can use Windows credentials or SQL Server credentials.
 Click Next.
 Review the summary of configurations.
 Click Next to create the database.
 Wait for the creation process to complete, then click Finish.
Step 4: Configure the Report Manager URL

1. Report Manager URL:


o Click on Report Manager URL.
o Configure the URL for the Report Manager (e.g.,
http://<ServerName>/Reports).
o Click Apply.
Step 5: Verify and Test the Configuration

1. Open a Web Browser:


o Navigate to the Report Manager URL configured earlier (e.g.,
http://<ServerName>/Reports).
2. Verify Access:
o You should see the SQL Server Reporting Services web interface.
o You can manage folders, upload reports, and configure security settings from
this interface.
 Now create a folder
 Now download Report Builder
Step 6: Create the Expenditure Table and Insert Data

1. Create the Expenditure Table:


CREATE TABLE Expenditure (
Person NVARCHAR(50),
Expenses DECIMAL(8,2)
);
2. Insert Data into the Expenditure Table:
INSERT INTO Expenditure (Person, Expenses) VALUES ('John Doe', 150.75);
INSERT INTO Expenditure (Person, Expenses) VALUES ('Jane Smith', 200.50);
INSERT INTO Expenditure (Person, Expenses) VALUES ('Mike Johnson', 99.99);
INSERT INTO Expenditure (Person, Expenses) VALUES ('Emily Davis', 450.00);
INSERT INTO Expenditure (Person, Expenses) VALUES ('Chris Brown', 300.25);
INSERT INTO Expenditure (Person, Expenses) VALUES ('Anna White', 250.80);
INSERT INTO Expenditure (Person, Expenses) VALUES ('David Harris', 175.45);
INSERT INTO Expenditure (Person, Expenses) VALUES ('Susan Clark', 120.00);
INSERT INTO Expenditure (Person, Expenses) VALUES ('Robert Lewis', 600.60);
INSERT INTO Expenditure (Person, Expenses) VALUES ('Linda Walker', 320.10);
Step 7: Create a Report in SQL Server Reporting Services

1. Open SQL Server Data Tools (SSDT) or SQL Server Report Builder.
2. Create a New Report:
o Select New Project > Report Server Project.
o Define your Data Source (pointing to your SQL Server instance and the
database where Expenditure table is located).
o Create a Data Set using a query like SELECT Person, Expenses FROM
Expenditure.
3. Design the Report:

 Use the Report Designer to drag and drop fields into the report layout.
 Add necessary charts, tables, and formatting as needed.

4. Deploy the Report:

 Configure the target server URL in the project properties.


 Deploy the report to the configured SSRS instance.
5. View the Report:
Open a web browser and navigate to the Report Manager URL (e.g.,
http://<ServerName>/Reports).
Find and view your deployed report.
