Ai - PRCT - Nov 2022
PRACTICAL EXAMINATION
ARTIFICIAL INTELLIGENCE
INSTRUCTIONS TO CANDIDATES:
I. Assume suitable data wherever necessary.
II. Upload source code file on Moodle.
III. Upload a .pdf of readable source code with visible output and additional
information/algorithm/explanation/comments for the Python program on Moodle.
IV. Comments: Add comments to explain each program step and any necessary
additional information about the concepts implemented (you can use a text
block in Colab to write additional information).
V. Presentation slides: Upload presentation slides on the allocated algorithm
and its implementation on Moodle.
VI. Upload the lab information as per the template provided, in .doc format. (Use
the Google Doc [ Lab Assingment Template.docx ] to prepare it and submit the
downloaded .doc on Moodle.)
VII. Write your name, seat number, PRN number, and the problem statement you
are implementing at the top of every page/file.
VIII. Plagiarism is strictly prohibited. If it is found that the contents
(both information and program) are copied from internet sources, other
students, or any project, the assignment will be evaluated with zero
marks.
IX. All students need to give a presentation on the allocated practical assignment.
Folder for sharing:
https://drive.google.com/drive/folders/1hriXixNuTACxWz6W4PmYuAjfHozl7TE3?usp=sharing
Q.1 Build a statistical model by using collected data and applying machine
learning algorithms. Follow the typical workflow of machine-learning-based
software development. [30 Marks] (CO4, L4)
A. Data Collection: Use a self-created dataset, pre-collected data, real-world
domain-specific data, or datasets from Kaggle, UCI, etc. (Do not use datasets
frequently used in class examples and regular assignments.) Mention additional
information about the dataset and the properties of its attributes. [2 Marks]
B. Statistical summary or descriptive statistics of the dataset used (at
least three statistical operations), using Python. Highlight the
inferences drawn from statistical patterns in the data. [2 Marks]
C. Data Exploration and Visualisation for understanding the data (at least
three to five visualisation methods) using Python. Highlight the
inferences drawn from these visualisations. [2 Marks]
D. Data Preparation or preprocessing using Python: data wrangling and cleaning
to convert raw data into ready-to-use data. Mention reasons for selecting
the specific operations. [2 Marks]
E. Data Splitting and Cross-Validation using Python; include remarks on
cross-validation. [2 Marks]
F. Implement the Machine Learning Algorithm allocated to your roll number
(see the following list). Train the model, tune parameters to improve
performance, and justify the tuning. [6 Marks]
G. Evaluate Model: Use evaluation metrics to measure the performance of the
model and show them (e.g. confusion matrix, MSE, MAE, ROC, etc.). [4 Marks]
H. Test the model on previously unseen data and find the accuracy of the
model. Provide a comparative analysis with any other machine learning
model. [2 Marks]
I. Presentation on the assigned problem statement, oral examination, and
question-and-answer session. [8 Marks]
https://docs.google.com/spreadsheets/d/1xnNistugn9cBwTAS_d__Bxpb30F0Ds9vYN7m7VZ74mU/edit?usp=sharing
PRN | Seat No. | Name | Email | Problem Statement
--- | --- | --- | --- | ---
120200217 | T224065 | WADAVALE YUVARAJ BHAGWAT | yuvraj.wadavale@mitaoe.ac.in | Implement Logistic Regression with Gradient Descent
120200221 | T224068 | MANE SIDDHARTH DATTATRAY | siddharth.mane@mitaoe.ac.in | Implement Feature Engineering and compare all classification algorithms in the syllabus
120200244 | T224074 | NARWADE NEERAJ ANIL | neeraj.narwade@mitaoe.ac.in | Implement Agglomerative Clustering and show performance graphically
120200245 | T224075 | PANDEY NEHAAL ANIL | nehaal.pandey@mitaoe.ac.in | Implement and simulate Reinforcement Learning for any real-time application
120200253 | T224078 | ANIKAT RAJU | anikat.raju@mitaoe.ac.in | Implement the Apriori Algorithm
120200255 | T224079 | ANIKET KUMAR | aniket.kumar@mitaoe.ac.in | Compare at least five clustering algorithms
120200270 | T224082 | SAWANT DIVYANSH VILAS | divyansh.sawant@mitaoe.ac.in | Implement regularization techniques in regression and compare their performance
120200284 | T224084 | GUPTA SAHIL NITESH | sahil.gupta@mitaoe.ac.in | Implement Multiple Linear Regression with Gradient Descent and show how feature scaling/standardization affects optimization
120200307 | T224087 | PAWAR ABHISHEK NENSING | abhishek.pawar@mitaoe.ac.in | Implement and compare Polynomial Regression with different degrees and show how the degree affects error
120200323 | T224089 | KSHATRIYA AJINKYA DHANANJAY | ajinkya.kshatriya@mitaoe.ac.in | Logistic Regression with parameter estimation
120200326 | T224090 | BHONDAVE PIYUSH LAXMAN | piyush.bhondave@mitaoe.ac.in | Linear Regression with parameter estimation
120200330 | T224092 | TEJAS GAJANAN MOHOD | tejas.mohod@mitaoe.ac.in | Implement Gradient Descent and graphically show how the global minimum is reached
120200337 | T224093 | GAYAKE PAVAN MOHAN | pavan.gayake@mitaoe.ac.in | Implement Linear Regression and show the bias-variance tradeoff with respect to the training and testing datasets
120200373 | T224102 | PATIL CHETAN SHEKHAR | chetan.patil@mitaoe.ac.in | Implement different Cross-Validation techniques with respect to Linear Regression and compare their performance
120200384 | T224104 | THORAT TEJAS RAJENDRA | tejas.thorat@mitaoe.ac.in | Implement Polynomial Regression and show how it affects overfitting and underfitting of data
120200402 | T224108 | MANUSMARE CHINMAY RAJESH | chinmay.manusmare@mitaoe.ac.in | Implement Lasso and Ridge Regression
120200416 | T224111 | KADAM VASUDHA SURESH | vasudha.kadam@mitaoe.ac.in | Implement K Nearest Neighbour for classification and compare different k values
120200471 | T224119 | MADAMANCHI AKANKSHA SHRINIWAS | akanksha.madamanchi@mitaoe.ac.in | Implement Naive Bayes for prediction of a discrete variable
120200641 | T224145 | BIRADAR MITHILESH | mithilesh.biradar@mitaoe.ac.in | Implement Density-Based Clustering and show performance graphically
120200250 | T226011 | HAKKE YASHODHAN DEEPAK | yashodhan.hakke@mitaoe.ac.in | Implement Divisive Clustering and show performance graphically
120200364 | T227076 | PARTH KUMAR | parth.kumar@mitaoe.ac.in | Implement Lasso and Ridge Regression
120200584 | T227112 | MOHABE SAURABH RAMPRASAD | saurabh.mohabe@mitaoe.ac.in | Implement Logistic Regression with Gradient Descent
120200058 | T228005 | SAWANT ABHIJEET SANJAY | abhijeet.sawant@mitaoe.ac.in | Implement Feature Engineering and compare all classification algorithms in the syllabus
120200124 | T228012 | CHANDURKAR SHARWARI NITIN | sharwari.chandurkar@mitaoe.ac.in | Implement and compare k-Means Clustering with different k values
120200125 | T228013 | GURNULE AUMKAR SATISH | aumkar.gurnule@mitaoe.ac.in | Implement k-Medoid Clustering and show performance graphically
120200341 | T228033 | PHANSE ASHUTOSH AVINASH | asuhtosh.phanse@mitaoe.ac.in | Implement the Apriori Algorithm
120200353 | T228035 | AVISH KHANDELWAL | avish.khandelwal@mitaoe.ac.in | Compare at least five clustering algorithms
120200650 | T228061 | HARDIK KEWDA | hardik.kewda@mitaoe.ac.in | Linear Regression with parameter estimation
202102080006 | T228074 | GANDHI PRAJVAL VINOD | prajval.gandhi@mitaoe.ac.in | Implement spam filtering using Naive Bayes on real-time messages/emails
202102080007 | T228075 | GAIKWAD HARSHADA DATTATRAY | harshada.gaikwad@mitaoe.ac.in | Implement Naive Bayes for prediction of a discrete variable
120200258 | T229048 | WAGHMARE SANSKAR AMIT | sanskar.waghmare@mitaoe.ac.in | Implement Logistic Regression with Gradient Descent
120200260 | T229049 | NIKAM RONAK DINESH | ronak.nikam@mitaoe.ac.in | Implement Feature Engineering and compare all classification algorithms in the syllabus
120200360 | T229068 | AVISHKAR HARIBHAU PADALE | avishkar.padale@mitaoe.ac.in | Implement k-Medoid Clustering and show performance graphically
202102090099 | T229179 | JAWADE ABHISHEK SATISH | abhishek.jawade@mitaoe.ac.in | Compare at least five clustering algorithms