
MIT ACADEMY OF ENGINEERING

COURSE CODE: CS352L NOVEMBER 2022

THIRD YEAR BTECH SEMESTER - V EXAMINATION 2022 – 2023


DEPARTMENT OF COMPUTER ENGINEERING (OPEN ELECTIVE)

PRACTICAL EXAMINATION
ARTIFICIAL INTELLIGENCE

DATE: 23 and 24 NOV 2022                    MAX MARKS: 30

INSTRUCTIONS TO CANDIDATES:
I. Assume suitable data wherever necessary
II. Upload source code file on Moodle.
III. Upload a .pdf of readable source code with visible output and additional
information/algorithm/explanation/comments for the Python program on Moodle.
IV. Comments: Add comments to explain each program step and necessary
additional information about the concepts implemented (you can use a text
block in Colab to write additional information).
V. Presentation slides: Upload Presentation Slides on allocated algorithm
and its implementation on Moodle.
VI. Upload Lab information as per the template provided, in .doc format. (Use
the Google Doc [ Lab Assignment Template.docx ] to prepare it and submit the
downloaded .doc on Moodle.)
VII. Write your Name, Seat Number, PRN number and the problem statement you
are implementing at the top of every page/file.
VIII. Plagiarism is strictly prohibited. If it is found that the contents
(both information and program) are copied from internet sources, other
students, or any project, the assignment will be evaluated directly with zero
marks.
IX. All students need to give a presentation on the allocated practical
assignment.

Folder for sharing:
https://drive.google.com/drive/folders/1hriXixNuTACxWz6W4PmYuAjfHozl7TE3?usp=sharing
Q.1 Build a statistical model by using collected data and applying machine
learning algorithms. Follow the typical workflow of Machine Learning based
software development. [30 Marks] [CO4, L4]
A. Data Collection: Use a self-created dataset, pre-collected data, real-world
domain-specific data, or datasets from Kaggle, UCI, etc. (Do not use datasets
frequently used in class examples and regular assignments.) Mention additional
information about the dataset and the properties of its attributes. [2 Marks]
B. Statistical summary or descriptive statistics of the dataset used (at least
three statistical operations), using Python. Highlight the inferences drawn
from the statistical patterns in the data. [2 Marks]
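As a minimal sketch of item B (using pandas on a small made-up dataset; in the assignment, load your own data instead), three such statistical operations might look like:

```python
import pandas as pd

# Small made-up dataset for illustration only; replace with your own data,
# e.g. df = pd.read_csv("your_dataset.csv")
df = pd.DataFrame({"age": [23, 45, 31, 35, 62],
                   "income": [30, 85, 52, 60, 110]})

summary = df.describe()        # count, mean, std, min, quartiles, max per attribute
correlations = df.corr()       # pairwise Pearson correlation between attributes
skewness = df.skew()           # asymmetry of each attribute's distribution

print(summary)
print(correlations)
print(skewness)
```

Inferences can then be read off directly, e.g. a strong positive correlation between two attributes suggests one may be predictive of the other.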
C. Data Exploration and Visualisation for understanding the data (at least
three to five visualization methods), using Python. Highlight the inferences
drawn from these visualizations. [2 Marks]
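A minimal sketch of item C with matplotlib, again on a made-up stand-in dataset (substitute your own), showing three common visualization methods:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the script also runs without a display
import matplotlib.pyplot as plt
import pandas as pd

# Made-up stand-in data; substitute your own dataset
df = pd.DataFrame({"age": [23, 45, 31, 35, 62],
                   "income": [30, 85, 52, 60, 110]})

fig, axes = plt.subplots(1, 3, figsize=(12, 3))
axes[0].hist(df["age"])                   # histogram: distribution of one attribute
axes[0].set_title("age distribution")
axes[1].scatter(df["age"], df["income"])  # scatter: relationship between attributes
axes[1].set_title("age vs income")
axes[2].boxplot(df["income"])             # boxplot: spread and outliers
axes[2].set_title("income spread")
fig.savefig("exploration.png")
```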
D. Data Preparation or preprocessing using Python: data wrangling and cleaning
to convert raw data into ready-to-use data. Mention the reasons for selecting
specific operations. [2 Marks]
E. Data Splitting and Cross Validation using Python, with remarks on cross
validation. [2 Marks]
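A minimal sketch of item E with scikit-learn, using synthetic data standing in for your own dataset: a hold-out split followed by k-fold cross validation on the training portion.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score, train_test_split

# Synthetic regression data standing in for your own dataset
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.7]) + rng.normal(scale=0.1, size=100)

# Hold out 20% as a test set the model never sees during training
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# 5-fold cross validation on the training portion only
scores = cross_val_score(LinearRegression(), X_train, y_train,
                         cv=KFold(n_splits=5, shuffle=True, random_state=0))
print("fold scores:", scores, "mean:", scores.mean())
```

Cross validation estimates how the model generalizes before it is ever exposed to the held-out test set, which is reserved for the final accuracy check in item H.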
F. Implement the Machine Learning algorithm allocated to your roll number
(see the following list). Train the model, tune parameters to improve
performance, and justify the tuning. [6 Marks]
G. Evaluate the model: use evaluation metrics to measure the performance of
the model and show it (e.g. confusion matrix, MSE, MAE, ROC, etc.). [4 Marks]
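A minimal sketch of item G with scikit-learn, computing a confusion matrix for a classification model and MSE/MAE for a regression model, on small made-up label/prediction pairs:

```python
from sklearn.metrics import (confusion_matrix, mean_absolute_error,
                             mean_squared_error)

# Classification example: confusion matrix from true vs. predicted labels
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 0, 1]
cm = confusion_matrix(y_true, y_pred)  # rows: true class, columns: predicted class

# Regression example: MSE and MAE from true vs. predicted values
mse = mean_squared_error([3.0, 5.0, 2.5], [2.5, 5.0, 3.0])
mae = mean_absolute_error([3.0, 5.0, 2.5], [2.5, 5.0, 3.0])
print(cm)
print("MSE:", mse, "MAE:", mae)
```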
H. Test the model on previously unseen data and find the accuracy of the
model. Provide a comparative analysis with any other machine learning
model. [2 Marks]
I. Presentation on the assigned problem statement, oral, and
question-and-answer session. [8 Marks]

It is recommended to use your own dataset in your domain of interest. Please
write the name of the dataset, the domain of the dataset, and any additional
information about the dataset you are using in the Google Sheet linked below.
All students are informed to take different datasets; adding your dataset
information to this sheet as early as possible will avoid others taking the
same dataset.

https://docs.google.com/spreadsheets/d/1xnNistugn9cBwTAS_d__Bxpb30F0Ds9vYN7m7VZ74mU/edit?usp=sharing

PRN | Name | Seat_num | Address | Problem Statement
120200064 | ANSARI AFSANA MOHAMMAD HUSSAIN | T224013 | afsana.ansari@mitaoe.ac.in | Implement Multiple Linear Regression with Gradient Descent and show how feature scaling/standardization affects optimization
120200083 | JOSHI PRERNA DHARAMRAJ | T224016 | prerna.joshi@mitaoe.ac.in | Implement and compare Polynomial Regression with different degrees and how it affects error
120200090 | MAGDUM SHREYA SHIVANAND | T224020 | shreya.magdum@mitaoe.ac.in | Implement Locally Weighted Linear Regression and compare its loss function with that of Linear Regression
120200105 | TAVHARE AVDHOOT ANKUSH | T224024 | avdhoot.tavhare@mitaoe.ac.in | Logistic Regression with parameter estimation
120200127 | HAJARE ABHISHEK SHIRISH | T224029 | abhishek.hajare@mitaoe.ac.in | Linear Regression with parameter estimation
120200145 | NARWADE AALHAD RAVINDRA | T224032 | aalhad.narwade@mitaoe.ac.in | Implement Gradient Descent and graphically show how the global minimum is reached
120200149 | CHILKAWAR PRAVIN BALAJI | T224034 | pravin.chilkawar@mitaoe.ac.in | Implement Linear Regression and show the Bias-Variance Tradeoff with respect to training and testing datasets
120200156 | KHANDARE YOGESH KESHAV | T224035 | yogesh.khandare@mitaoe.ac.in | Implement different Cross Validation techniques with respect to Linear Regression and compare their performance
120200160 | MAHAJAN RUTUJA YOGESH | T224039 | rutuja.mahajan@mitaoe.ac.in | Implement Polynomial Regression and show how it affects overfitting and underfitting of data
120200164 | BORLE KUNAL RAJENDRA | T224041 | kunal.borle@mitaoe.ac.in | Implement Lasso and Ridge Regression
120200167 | CHAUDHARI LOKESH DEVCHAND | T224043 | lokesh.chaudhari@mitaoe.ac.in | Implement K Nearest Neighbour for Classification and compare for different k values
120200176 | SIDDHANT CHOUDHARY | T224046 | siddhant.chaudhary@mitaoe.ac.in | Implement K Nearest Neighbour for Regression and compare for different k values
120200200 | SUNIT TRIVEDI | T224058 | trivedi.sunit@mitaoe.ac.in | Implement spam filtering using Naive Bayes on real-time messages/emails
120200209 | BHISE AKASH MADHUKAR | T224062 | akash.bhise@mitaoe.ac.in | Implement Naive Bayes for prediction of a discrete variable
120200217 | WADAVALE YUVARAJ BHAGWAT | T224065 | yuvraj.wadavale@mitaoe.ac.in | Implement Logistic Regression with Gradient Descent
120200221 | MANE SIDDHARTH DATTATRAY | T224068 | siddharth.mane@mitaoe.ac.in | Implement Feature Engineering and compare all classification algorithms in the syllabus
120200234 | KINAGE LALIT JAYANT | T224072 | lalit.kinage@mitaoe.ac.in | Implement and compare k-Means Clustering with different k values
120200240 | WASNIK RACHI SATISH | T224073 | rachi.wasnik@mitaoe.ac.in | Implement k-Medoid Clustering and show performance graphically
120200244 | NARWADE NEERAJ ANIL | T224074 | neeraj.narwade@mitaoe.ac.in | Implement Agglomerative Clustering and show performance graphically
120200245 | PANDEY NEHAAL ANIL | T224075 | nehaal.pandey@mitaoe.ac.in | Implement and simulate Reinforcement Learning for any real-time application
120200251 | CHATAP ISHA ASHOK | T224077 | isha.chatap@mitaoe.ac.in | Implement Density Based Clustering and show performance graphically
120200253 | ANIKAT RAJU | T224078 | anikat.raju@mitaoe.ac.in | Implement Apriori Algorithm
120200255 | ANIKET KUMAR | T224079 | aniket.kumar@mitaoe.ac.in | Compare at least five clustering algorithms
120200263 | PATIL PRANJAL SANJIV | T224080 | pranjal.patil@mitaoe.ac.in | Implement Divisive Clustering and show performance graphically
120200270 | SAWANT DIVYANSH VILAS | T224082 | divyansh.sawant@mitaoe.ac.in | Implement regularization techniques in regression and compare their performance
120200284 | GUPTA SAHIL NITESH | T224084 | sahil.gupta@mitaoe.ac.in | Implement Multiple Linear Regression with Gradient Descent and show how feature scaling/standardization affects optimization
120200307 | PAWAR ABHISHEK NENSING | T224087 | abhishek.pawar@mitaoe.ac.in | Implement and compare Polynomial Regression with different degrees and how it affects error
120200320 | PATIL ADITYA NIWAS | T224088 | aditya.patil@mitaoe.ac.in | Implement Locally Weighted Linear Regression and compare its loss function with that of Linear Regression
120200323 | KSHATRIYA AJINKYA DHANANJAY | T224089 | ajinkya.kshatriya@mitaoe.ac.in | Logistic Regression with parameter estimation
120200326 | BHONDAVE PIYUSH LAXMAN | T224090 | piyush.bhondave@mitaoe.ac.in | Linear Regression with parameter estimation
120200330 | TEJAS GAJANAN MOHOD | T224092 | tejas.mohod@mitaoe.ac.in | Implement Gradient Descent and graphically show how the global minimum is reached
120200337 | GAYAKE PAVAN MOHAN | T224093 | pavan.gayake@mitaoe.ac.in | Implement Linear Regression and show the Bias-Variance Tradeoff with respect to training and testing datasets
120200373 | PATIL CHETAN SHEKHAR | T224102 | chetan.patil@mitaoe.ac.in | Implement different Cross Validation techniques with respect to Linear Regression and compare their performance
120200384 | THORAT TEJAS RAJENDRA | T224104 | tejas.thorat@mitaoe.ac.in | Implement Polynomial Regression and show how it affects overfitting and underfitting of data
120200402 | MANUSMARE CHINMAY RAJESH | T224108 | chinmay.manusmare@mitaoe.ac.in | Implement Lasso and Ridge Regression
120200416 | KADAM VASUDHA SURESH | T224111 | vasudha.kadam@mitaoe.ac.in | Implement K Nearest Neighbour for Classification and compare for different k values
120200437 | BOHARI MOIZ ALTAF | T224114 | moiz.bohari@mitaoe.ac.in | Implement K Nearest Neighbour for Regression and compare for different k values
120200460 | PATIL SANKET AJAY | T224116 | sanket.patil@mitaoe.ac.in | Implement spam filtering using Naive Bayes on real-time messages/emails
120200471 | MADAMANCHI AKANKSHA SHRINIWAS | T224119 | akanksha.madamanchi@mitaoe.ac.in | Implement Naive Bayes for prediction of a discrete variable
120200520 | INGLE SAINATH SHAMRAO | T224126 | sainath.ingle@mitaoe.ac.in | Implement Logistic Regression with Gradient Descent
120200532 | DESHMUKH SOHAM BAJIRAO | T224130 | soham.deshmukh@mitaoe.ac.in | Implement Feature Engineering and compare all classification algorithms in the syllabus
120200570 | DEWASHISH SOLANKE | T224133 | dewashish.solanke@mitaoe.ac.in | Implement and compare k-Means Clustering with different k values
120200581 | KADAM BHAGYASHRI AVINASH | T224135 | bhagyashri.kadam@mitaoe.ac.in | Implement k-Medoid Clustering and show performance graphically
120200595 | MANDAVE PRATHAMESH SANJAY | T224136 | prathamesh.mandave@mitaoe.ac.in | Implement Agglomerative Clustering and show performance graphically
120200625 | MALI SNEHAL CHANDRAKANT | T224142 | snehal.mali@mitaoe.ac.in | Implement and simulate Reinforcement Learning for any real-time application
120200641 | BIRADAR MITHILESH | T224145 | mithilesh.biradar@mitaoe.ac.in | Implement Density Based Clustering and show performance graphically
120200155 | ZADE ATHARVA SANDIP | T225012 | atharva.zade@mitaoe.ac.in | Implement Apriori Algorithm
120200236 | MOKASHI ALFIYA AMIN | T226009 | alfiya.mokashi@mitaoe.ac.in | Compare at least five clustering algorithms
120200250 | HAKKE YASHODHAN DEEPAK | T226011 | yashodhan.hakke@mitaoe.ac.in | Implement Divisive Clustering and show performance graphically
120200271 | ANUSHKA TIWARI | T226013 | anushka.tiwari@mitaoe.ac.in | Implement regularization techniques in regression and compare their performance
120200567 | HULKE ABHISHEK APPAJI | T226030 | abhishek.hulke@mitaoe.ac.in | Implement Multiple Linear Regression with Gradient Descent and show how feature scaling/standardization affects optimization
120200012 | LATKAR OMKESH DARSHAN | T227006 | omkesh.latkar@mitaoe.ac.in | Implement and compare Polynomial Regression with different degrees and how it affects error
120200044 | KAMBLE ANIKET SURESH | T227012 | aniket.kamble@mitaoe.ac.in | Implement Locally Weighted Linear Regression and compare its loss function with that of Linear Regression
120200302 | BHERJE AKANKSHA ANANDA | T227060 | akanksha.bherje@mitaoe.ac.in | Logistic Regression with parameter estimation
120200304 | KOKATE KRISHNA BABARAO | T227061 | krishna.kokate@mitaoe.ac.in | Linear Regression with parameter estimation
120200311 | JADHAV YOGESH UDDHAV | T227063 | yogesh.jadhav@mitaoe.ac.in | Implement Gradient Descent and graphically show how the global minimum is reached
120200332 | JIDGE AVANTIKA GANESH | T227065 | avantika.jidge@mitaoe.ac.in | Implement Linear Regression and show the Bias-Variance Tradeoff with respect to training and testing datasets
120200350 | KHUTE SARTHAK DAULAT | T227071 | sarthak.khute@mitaoe.ac.in | Implement different Cross Validation techniques with respect to Linear Regression and compare their performance
120200358 | DEVLE SATWIK PRAMOD | T227074 | satwik.devle@mitaoe.ac.in | Implement Polynomial Regression and show how it affects overfitting and underfitting of data
120200364 | PARTH KUMAR | T227076 | parth.kumar@mitaoe.ac.in | Implement Lasso and Ridge Regression
120200456 | HIWANJ RIYA NITIN | T227087 | riya.hiwanj@mitaoe.ac.in | Implement K Nearest Neighbour for Classification and compare for different k values
120200492 | KOPARDE PRATHAMESH PRALHAD | T227093 | prathamesh.koparde@mitaoe.ac.in | Implement K Nearest Neighbour for Regression and compare for different k values
120200550 | JAMDAR SARTHAK VITHALDAS | T227103 | sarthak.jamdar@mitaoe.ac.in | Implement spam filtering using Naive Bayes on real-time messages/emails
120200565 | SORTE APURVA DILIP | T227106 | apurva.sorte@mitaoe.ac.in | Implement Naive Bayes for prediction of a discrete variable
120200584 | MOHABE SAURABH RAMPRASAD | T227112 | saurabh.mohabe@mitaoe.ac.in | Implement Logistic Regression with Gradient Descent
120200058 | SAWANT ABHIJEET SANJAY | T228005 | abhijeet.sawant@mitaoe.ac.in | Implement Feature Engineering and compare all classification algorithms in the syllabus
120200124 | CHANDURKAR SHARWARI NITIN | T228012 | sharwari.chandurkar@mitaoe.ac.in | Implement and compare k-Means Clustering with different k values
120200125 | GURNULE AUMKAR SATISH | T228013 | aumkar.gurnule@mitaoe.ac.in | Implement k-Medoid Clustering and show performance graphically
120200169 | PATIL RUCHA HILAL | T228022 | rucha.patil@mitaoe.ac.in | Implement Agglomerative Clustering and show performance graphically
120200242 | SALUNKHE JANHVI MAHENDRA | T228026 | janhvi.salunkhe@mitaoe.ac.in | Implement and simulate Reinforcement Learning for any real-time application
120200259 | PRATEEK BHAT | T228027 | prateek.bhat@mitaoe.ac.in | Implement Density Based Clustering and show performance graphically
120200341 | PHANSE ASHUTOSH AVINASH | T228033 | asuhtosh.phanse@mitaoe.ac.in | Implement Apriori Algorithm
120200353 | AVISH KHANDELWAL | T228035 | avish.khandelwal@mitaoe.ac.in | Compare at least five clustering algorithms
120200388 | GAIKWAD YASH DATTATRAYA | T228041 | yash.gaikwad@mitaoe.ac.in | Implement Divisive Clustering and show performance graphically
120200421 | JADHAV OMKAR SUNIL | T228043 | omkar.jadhav@mitaoe.ac.in | Implement regularization techniques in regression and compare their performance
120200475 | FAROOQUI MOHAMMAD WAQAR NIZAMUDDIN | T228049 | mohammad.waqar@mitaoe.ac.in | Implement Multiple Linear Regression with Gradient Descent and show how feature scaling/standardization affects optimization
120200486 | LOKHANDE HRUSHIKESH AMBADAS | T228052 | hrushikesh.lokhande@mitaoe.ac.in | Implement and compare Polynomial Regression with different degrees and how it affects error
120200487 | HIWARKAR HRUTUJA RAJENDRA | T228053 | hrutuja.hiwarkar@mitaoe.ac.in | Implement Locally Weighted Linear Regression and compare its loss function with that of Linear Regression
120200504 | JADHAV TEJAS NARAYAN | T228054 | tejas.jadhav@mitaoe.ac.in | Logistic Regression with parameter estimation
120200650 | HARDIK KEWDA | T228061 | hardik.kewda@mitaoe.ac.in | Linear Regression with parameter estimation
120200651 | PAWAN SINGH BHANDARI | T228062 | pawan.bhandari@mitaoe.ac.in | Implement Gradient Descent and graphically show how the global minimum is reached
120200652 | DIVANSHU SINGH | T228063 | divanshu.singh@mitaoe.ac.in | Implement Linear Regression and show the Bias-Variance Tradeoff with respect to training and testing datasets
120200653 | HARSH BAWASKAR | T228064 | harsh.bawaskar@mitaoe.ac.in | Implement different Cross Validation techniques with respect to Linear Regression and compare their performance
120200658 | JAGDALE SHRIRAM RAMESH | T228066 | shriram.jagdale@mitaoe.ac.in | Implement Polynomial Regression and show how it affects overfitting and underfitting of data
120200661 | MOHAMMAD ANAS | T228068 | mohammad.anas@mitaoe.ac.in | Implement Lasso and Ridge Regression
202102080003 | MANKAR SIDDHI DINESH | T228071 | siddhi.mankar@mitaoe.ac.in | Implement K Nearest Neighbour for Classification and compare for different k values
202102080005 | MANE PRATHMESH MAHADEO | T228073 | prathmesh.mane@mitaoe.ac.in | Implement K Nearest Neighbour for Regression and compare for different k values
202102080006 | GANDHI PRAJVAL VINOD | T228074 | prajval.gandhi@mitaoe.ac.in | Implement spam filtering using Naive Bayes on real-time messages/emails
202102080007 | GAIKWAD HARSHADA DATTATRAY | T228075 | harshada.gaikwad@mitaoe.ac.in | Implement Naive Bayes for prediction of a discrete variable
120200258 | WAGHMARE SANSKAR AMIT | T229048 | sanskar.waghmare@mitaoe.ac.in | Implement Logistic Regression with Gradient Descent
120200260 | NIKAM RONAK DINESH | T229049 | ronak.nikam@mitaoe.ac.in | Implement Feature Engineering and compare all classification algorithms in the syllabus
120200316 | SATBHAI NIKHIL SANTOSH | T229061 | nikhil.satbhai@mitaoe.ac.in | Implement and compare k-Means Clustering with different k values
120200360 | AVISHKAR HARIBHAU PADALE | T229068 | avishkar.padale@mitaoe.ac.in | Implement k-Medoid Clustering and show performance graphically
120200548 | PRITESH NISTANE | T229099 | pritesh.nistane@mitaoe.ac.in | Implement Agglomerative Clustering and show performance graphically
120200551 | MORE NAVTEJ VIJAY | T229100 | navtej.more@mitaoe.ac.in | Implement and simulate Reinforcement Learning for any real-time application
120200615 | PAWAR SURAJ RAJESH | T229109 | suraj.pawar@mitaoe.ac.in | Implement Density Based Clustering and show performance graphically
202102090094 | BIRHADE ROHIT SATISH | T229175 | rohit.birhade@mitaoe.ac.in | Implement Apriori Algorithm
202102090099 | JAWADE ABHISHEK SATISH | T229179 | abhishek.jawade@mitaoe.ac.in | Compare at least five clustering algorithms
 |  |  |  | Implement Divisive Clustering and show performance graphically
 |  |  |  | Implement regularization techniques in regression and compare their performance
Rubrics (Criteria):

Design of the Logic as per the requirement specified in problem statement

Code Elegance and Desired Output

Creativity and Originality

Artificial Intelligence and Machine Learning Conceptual understanding

Oral
