5th NATIONAL CONFERENCE ON FUTURISTIC COMPUTING AND

COMMUNICATION TECHNOLOGIES NCFC2T15

1. EVALUATION OF FRUIT RIPENESS USING ELECTRONIC NOSE


N.Sameena Begum, R.Suveenashri, R.Tamil Bharathi, J.Vanmathi, R.Sudha
Department of Electronics And Communication Engineering
Avinashilingam University

This paper describes the use of an electronic nose, an artificial nose that mimics the
behavior of the human nose. An electronic nose is defined as an instrument comprising sensors
for recognizing simple or complex odors. One of the main concerns of the food industry is the
systematic determination of fruit ripeness under harvest and post-harvest conditions, because
variability in ripeness is identified by consumers as a lack of quality. Most of the traditional
methods that have been used to assess fruit ripeness are destructive and thus cannot be readily
applied. Hence we use an ethylene gas sensor to detect fruit ripeness, as ethylene gas is the key
component in fruit maturation. A good correlation between sensor signals and some fruit quality
indicators was also found. These results prove that the E-NOSE can be used as a quality control
tool, i.e., for continuous monitoring of fruit freshness at the point of sale and during shipment.
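
As a quick illustration of the quality-indicator correlation mentioned above, the Pearson coefficient between sensor output and a ripeness index can be computed directly; the readings below are invented for the sketch, not taken from the paper.

import numpy as np

# Hypothetical ethylene-sensor readings (ppm) and a ripeness index (0-10).
ethylene_ppm = np.array([0.4, 0.9, 1.6, 2.8, 4.1, 5.5])
ripeness     = np.array([1.0, 2.2, 3.9, 6.1, 7.8, 9.3])

r = np.corrcoef(ethylene_ppm, ripeness)[0, 1]
print(f"Pearson r = {r:.3f}")   # r close to 1 indicates a strong correlation
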
2. MALICIOUS NODE DETECTION IN A QOS ORIENTED DISTRIBUTED
APPROACH
Anitha.M, Gobinath T, Gomathi B J, Gowthaman D, Prabhakaran T
Department of Electronics and Communication Engineering
SNS College of Technology

As wireless communications gain popularity, significant research has been devoted to
supporting real-time transmission with stringent quality of service (QoS) requirements for
wireless applications. At the same time, a wireless hybrid network that integrates a mobile
wireless ad hoc network and a wireless infrastructure network has been proven to be a better
alternative for the next-generation wireless network. The Quality of service Oriented Distributed
routing protocol (QOD) enhances the QoS capability of hybrid networks. In our project, we
improve on QOD and develop a protocol named EQOD (Enhanced Quality of service Oriented
Distributed routing protocol). EQOD is more mobility-resilient than QOD, improves the
throughput and decreases the overhead in the network; hence, it can provide high QoS
performance in terms of overhead, transmission delay and mobility resilience. It also increases
the energy efficiency of QOD and avoids the energy harvesting problem. Along with this, we
deploy several nodes and use the three modes of EQOD to detect any malicious node present in
the network; once detected, the node is isolated from the network. This leads to an effective
hybrid network with improved QoS and high security.


3. FEATURE EXTRACTION AND IMAGE DISPLAY USING LOSSLESS MULTIWAVELET EEG COMPRESSION
Srisabitha.N, Vasudevan.S, Rajesh Kumar.T, Geetha.K
Department of Computer Science And Engineering
Karpagam Institute of Technology

This paper explains a method for compressing multi-channel electroencephalogram (EEG)
signals using a lossless multiwavelet compression technique. The multichannel wavelet
transform is used to exploit the inter-correlation among the EEG channels, and Huffman coding
is applied to further minimize the temporal redundancy. A compression algorithm is built on the
principle of lossy-plus-residual coding, with a matrix/tensor-decomposition-based coder in the
lossy layer; this approach guarantees a specifiable maximum absolute error between the original
and reconstructed signals. The compression algorithm is applied to three different scalp EEG
datasets and an intracranial EEG dataset, each with a different sampling rate and resolution. The
compression of electroencephalographic (EEG) signals is of great interest to many in the
biomedical community, motivated by the large amount of data involved in collecting EEG
information, which requires more memory for storage and high bandwidth for transmission.
Lossless compression of EEG is essential due to the necessity of exact recovery of the data for
diagnostic purposes.
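
Since the abstract names Huffman coding as the entropy stage, a minimal sketch of building a Huffman code over residual symbols may help; the toy residual stream and dictionary-merging construction below are illustrative, not the authors' implementation.

import heapq
from collections import Counter

def huffman_code(symbols):
    # Build a prefix code (symbol -> bitstring) from symbol frequencies.
    freq = Counter(symbols)
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate one-symbol stream
        return {s: "0" for s in heap[0][2]}
    tie = len(heap)                          # unique tie-breaker for the heap
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

residuals = [0, 0, 1, -1, 0, 2, 0, 1, 0, -1]      # toy prediction residuals
code = huffman_code(residuals)
bitstream = "".join(code[s] for s in residuals)   # entropy-coded output
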
4. SECURED AND RELEVANT INFORMATION RETRIEVAL OVER ENCRYPTED
CLOUD DATA USING MRSE
Mahalakshmi K, Ummusalma.S
Department of Computer Science and Engineering
Bannari Amman Institute of Technology

With the advent of cloud computing, data owners are motivated to outsource their
complex data management systems from local sites to the commercial public cloud for great
flexibility and economic savings. But for protecting data privacy, sensitive data have to be
encrypted before outsourcing, which obsoletes traditional data utilization based on plaintext
keyword search. Thus, enabling an encrypted cloud data search service is of paramount
importance. Considering the large number of data users and documents in the cloud, it is
necessary to allow multiple keywords in the search request and return documents in the order of
their relevance to these keywords. Related works on searchable encryption focus on single
keyword search or Boolean keyword search, and rarely sort the search results. In this paper, for
the first time, we define and solve the challenging problem of privacy-preserving multi-keyword
ranked search over encrypted data in cloud computing (MRSE). We establish a set of strict
privacy requirements for such a secure cloud data utilization system. Among various multi-keyword semantics, we choose the efficient similarity measure of coordinate matching, i.e., as
many matches as possible, to capture the relevance of data documents to the search query. We
further use inner product similarity to quantitatively evaluate such similarity measure. We first
propose a basic idea for the MRSE based on secure inner product computation, and then give
two significantly improved MRSE schemes to achieve various stringent privacy requirements in
two different threat models. To improve search experience of the data search service, we further
extend these two schemes to support more search semantics. Thorough analysis investigating
privacy and efficiency guarantees of the proposed schemes is given. Experiments on a real-world
data set further show that the proposed schemes indeed introduce low overhead on computation
and communication.
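
A plaintext sketch of the coordinate-matching measure may clarify the ranking step: each document and query is a binary keyword-presence vector, and their inner product counts matched keywords. In the actual MRSE schemes these vectors are split and multiplied by secret matrices so the server computes scores without seeing the vectors; the dictionary and documents below are placeholders.

# Coordinate matching via inner product (shown here on plaintext vectors).
DICTIONARY = ["cloud", "privacy", "keyword", "ranked", "search"]

def to_vector(keywords):
    return [1 if w in keywords else 0 for w in DICTIONARY]

def score(doc_vec, query_vec):
    # Inner product = number of query keywords the document contains.
    return sum(d * q for d, q in zip(doc_vec, query_vec))

docs = {"doc1": {"cloud", "privacy"}, "doc2": {"ranked", "search", "cloud"}}
query = to_vector({"cloud", "search"})
ranking = sorted(docs, key=lambda d: score(to_vector(docs[d]), query),
                 reverse=True)   # documents ordered by relevance
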
5. AN EMBEDDED SYSTEM-ON-CHIP ARCHITECTURE FOR REAL-TIME
FEATURE DETECTION AND MATCHING
E.Sabarinathan, M.Senthilkumar
Department of Electrical and Electronics Engineering
K.S.R. College of Engineering

Effective detection and reliable identification of visual features is an essential
problem in applications such as object recognition, structure from motion, image indexing and
visual localization. The input may come from numerous sources, such as video systems, views
from multiple cameras or multi-dimensional records from a scanner. Real-time performance is an
extreme demand in most of these applications, which require the detection and matching of
visual features in real time. Although feature detection and matching approaches have been
studied in the literature, their computational complexity means that pure software
implementations unaided by dedicated hardware are far from adequate for real-time
applications. The existing scheme comprises the Scale Invariant Feature Transform (SIFT) for
feature detection and Binary Robust Independent Elementary Features (BRIEF) for feature
description and matching; this scheme fails when the features are not invariant to scale, changes
in perspective and illumination, or the addition of noise. The proposed system uses a wavelet
feature extraction technique together with subtractive clustering as the classification method.
This design reduces processing time and total system complexity. This paper focuses on a
dedicated hardware strategy to support real-time establishment of correspondences among
consecutive frames of high-resolution 720p (1280 x 720) video. With these aids, the proposed
system achieves feature detection and matching at 60 frames/s for 720p video; its processing
speed can meet and even exceed the demands of most practical real-time video analytics
applications.
6. CLASSIFICATION OF PULMONARY NODULE ON LUNG CT IMAGE USING SVM
CLASSIFIER ALONG WITH OTSU'S THRESHOLDING
S.Shanmugalakshmi, K.Sagarban Nisha, K.Vaiyammal, Ms.A.Alaimahal
Department of Electronics And Communication Engineering
Velammal College of Engineering & Technology

Lung cancer has killed many people in recent years, and early diagnosis is very important
to enhance the patient's chance of survival: the overall 5-year survival rate for lung cancer
patients increases from 14% to 49% if the disease is detected in time. A pulmonary nodule is the
initial indication of lung cancer. A CAD system adopted for the diagnosis of lung cancer takes
lung CT images as input and, based on an algorithm, helps radiologists perform image analysis.
The algorithm begins with a preprocessing step that improves the images by removing distortion
and enhancing the important features. This preprocessing step feeds the following image
segmentation stage, by Otsu's thresholding, and the image classification stage, by an SVM
classifier. This paper proposes GLCM feature extraction and an SVM classifier to check, at an
early stage, whether a given lung CT image is benign or malignant.
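
For reference, Otsu's method chooses the gray level that maximizes the between-class variance of the image histogram; a minimal NumPy sketch is given below, assuming an 8-bit grayscale CT slice (the abstract does not give the pipeline's actual code).

import numpy as np

def otsu_threshold(img):
    # Threshold maximizing between-class variance of an 8-bit image.
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                   # class-0 probability mass
    mu = np.cumsum(prob * np.arange(256))     # cumulative first moment
    mu_t = mu[-1]                             # global mean gray level
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0.0      # ignore empty classes
    return int(np.argmax(sigma_b))

# mask = ct_slice > otsu_threshold(ct_slice)   # rough nodule/background split
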
7. SECURE IRIS AUTHENTICATION USING VISUAL CRYPTOGRAPHY
S.Lavanya
Department of Master of Computer Applications
K.S. Rangasamy College of Technology

This project, entitled SECURE IRIS AUTHENTICATION USING VISUAL
CRYPTOGRAPHY, is an iris-based authentication system. Security of data has been a major
issue for many years, and the age-old techniques of encryption and decryption have become easy
for attackers to break, so providing security to data using a new technique is the need of the
hour. This project uses the technique of visual cryptography and provides biometric
authentication. Biometric authentication is getting more attention for automated personal
identification. Biometrics is the detailed measurement of the human body; it deals with
automated methods of identifying a person or verifying the identity of a person based on
physiological or behavioral characteristics. There are various applications where personal
identification is required, such as passport control, computer login control, secure electronic
banking, bank ATMs, credit cards, airports, mobile phones, and health and social services. Many
biometric techniques are available, such as facial thermogram, hand vein, odour, ear, hand
geometry, fingerprint, face, retina, iris, palm print, voice and signature. Among these, iris
recognition is one of the most promising approaches because of its stability, uniqueness and
non-invasiveness.
8. ONTOLOGY BASED AUTOMATIC MODULE GENERATION FROM THE E-BOOK
Nikhila.K.N, Manjula.P.M, Rajeshwari.R, Keerthana.M
Department of Computer Science And Engineering
Jayam College of Engineering And Technology

Technology-supported learning systems have proved to be helpful in many learning
situations. These systems require an appropriate representation of the knowledge to be learned,
the Domain Module, and the authoring of the Domain Module is cost- and labor-intensive.
DOM-Sortze is a novel system that uses natural language processing techniques, heuristic
reasoning, and ontologies for the semiautomatic construction of the Domain Module from
electronic textbooks. To determine how it might help in the Domain Module authoring process,
it has been tested with an electronic textbook. This work presents DOM-Sortze and describes the
experiment carried out. Ongoing work comprises improving the generation of the LDO, and it is
planned to enhance the grammar for identifying pedagogical relationships. Although DOM-Sortze
is currently able to process images in the electronic document, it only considers their position in
the text, not where the image is referenced. DOM-Sortze is also being enhanced to support
multilingual Domain Module generation: the LDO ontology supports the multilingual
representation of the domain topics, and machine translation might be used to get approximate
translations of the gathered LOs, used for searching and retrieving from the LOR or web pages.


9. GENERATION OF METADATA USING CONTENT BASED IMAGE RETRIEVAL SYSTEM - IMAGE PROCESSING
R.Ganesa, R.Geetha
Department of Master of Computer Application
K.S. Rangasamy College of Technology

Content-based image retrieval (CBIR) is one of the most popular, rising research
areas of digital image processing. Most of the available image search tools, such as Google
Images and Yahoo! Image Search, are based on textual annotation of images: images are
manually annotated with keywords and then retrieved using text-based search methods, and the
performance of these systems is not satisfactory. The goal of CBIR is to extract the visual
content of an image automatically, such as color, texture, or shape. This project aims to
introduce the problems and challenges concerned with the design and creation of CBIR systems
based on an accurate image search mechanism. For efficient data management, a system is
proposed which generates metadata for image contents, using a content-based image retrieval
system built on MPEG-7 descriptors. First, low-level features are extracted from the query image
without metadata, and images with similar low-level features are retrieved from the CBIR
system. Metadata of the result images that are similar to the query image are extracted from the
metadata database; from the resulting metadata, common keywords are extracted and proposed
as the keywords for the query image. The extraction of color features from digital images
depends on an understanding of color theory and the representation of color in digital images.
Color spaces are an important component for relating color to its representation in digital form,
and the transformations between different color spaces and the quantization of color information
are primary determinants of a given feature extraction method. The approach is found to be
robust and achieves 92.4% accuracy across five categories.
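
As a toy version of the low-level color-feature step (the system's MPEG-7 descriptors are richer), the sketch below computes a quantized RGB histogram and compares images by histogram intersection; the bin count and similarity choice are assumptions for illustration.

import numpy as np

def color_histogram(img, bins=8):
    # Quantized RGB histogram of an HxWx3 uint8 image, L1-normalized.
    q = (img.astype(int) // (256 // bins)).reshape(-1, 3)
    idx = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    hist = np.bincount(idx, minlength=bins ** 3).astype(float)
    return hist / hist.sum()

def intersection(h1, h2):
    # 1.0 means identical color distributions, 0.0 means disjoint.
    return float(np.minimum(h1, h2).sum())

# Retrieval: rank database images by intersection with the query histogram,
# then propose the most frequent keywords among the top matches as metadata.
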
10. DATA INTEGRITY IN CLOUD COMPUTING
K.Latha
Department of Master of Computer Applications
K.S. Rangasamy College of Technology

Cloud computing has been envisioned as the de-facto solution to the rising storage costs
of IT enterprises. With the high costs of data storage devices, as well as the rapid rate at which
data is being generated, it proves costly for enterprises or individual users to frequently update
their hardware. Apart from reducing storage costs, data outsourcing to the cloud also helps in
reducing maintenance. Cloud storage moves the user's data to large, remotely located data
centers over which the user does not have any control. However, this unique feature of the
cloud poses many new security challenges which need to be clearly understood and resolved. We
provide a scheme which gives a proof of data integrity in the cloud, which the customer can
employ to check the correctness of his data. This proof can be agreed upon by both the cloud
and the customer and can be incorporated in the service level agreement (SLA).


11. AUTHORIZED ATM APPLICATION USING FINGER PRINT


S.Ramya
Department of Master of Computer Applications
K.S. Rangasamy College of Technology

This project, entitled AUTHORIZED ATM APPLICATION USING FINGERPRINT, is
authorized using a fingerprint authentication system. First, customer personal details such as
name, address, phone number and account details are collected along with fingerprint images
and stored in a database. After fingerprint registration comes the fingerprint verification process,
during which the registered fingerprint image stored in the database is compared with the
presented image. If the two images match, the system displays authentication success and moves
to the ATM transaction window; otherwise it displays not authenticated. After a successful login
to the ATM application, the user is able to change the password, withdraw money and view the
balance. Normalization and segmentation methods are used for security purposes, and a user can
upload two fingerprint images for alternate transactions. The project is developed on the Java
platform with MySQL as the database: Java is an object-oriented language, easy to understand
and user-friendly for both programmer and user compared with other web technologies, and
MySQL provides the essential queries for developing web-based applications and is very simple
to create and maintain.
12. PRIVACY PRESERVING INFLUENCER MINING IN INTEREST-BASED SOCIAL
MEDIA NETWORKS
P.Sugapriya, K.Devipriya, B.Gayathri, K.Mohana Vidhya
Department of Information Technology
V.S.B Engineering College

OSNs provide built-in mechanisms enabling users to communicate and share content
with other members. OSN users can post statuses and notes, upload photos and videos in their
own spaces, tag others in their content, and share the content with their friends; on the other
hand, users can also post content in their friends' spaces, so shared content may be connected
with multiple users. In interest-based online social media networks, users can easily create and
share personal content of interest, such as tweets, photos, music tracks, and videos. The
large-scale user-contributed content contains rich social media information such as tags, views,
favorites, and comments, which is very useful for mining social influence, and social links such
as views, favorites, and retweets indicate certain influence in the community. Since the content
of interest is essentially topic-specific, the underlying social influence is topic-sensitive. A novel
Topic-Sensitive Influencer Mining (TSIM) framework is therefore used for interest-based social
media networks: TSIM aims to mine topic-specific influential nodes in the networks, finding
topical influential users and images. The influence estimation is determined with a hypergraph
learning approach: in the hypergraph, the vertices represent users and images, and the hyperedges
are utilized to capture multi-type relations, including visual-textual content relations among
images and social links between users and images. In particular, we take Flickr, one of the most
popular photo-sharing websites, as the social media platform in our study.
13. COLLEGE BUS LOCATOR
V.Surryaprabbha, M.R.Subashini, S.Sangavi, N.Sivaranjani
Department of Information Technology
V.S.B. Engineering College

The Mobile Phone Tracker application is a tracking application with which you can
track mobile phones. Using this application we can build a variety of tracking systems that are
ideal for personal tracking, child tracking and elderly tracking, or business uses like vehicle
tracking, fleet management and bus tracking. It also allows your relatives or college bus operator
to track your location, which is especially useful for security and business applications. Instead
of using complex and costly GPS tracking devices, you can convert your mobile phone into a
powerful tracking device, and the live tracking can be viewed via Google Earth or GMAP on a
website.
14. INTEGRATE OF ASPECT BASED ON OPINION MINING USING NAIVE BAYES
CLASSIFIER FOR PRODUCT REVIEW
V.Priyadharsini, N.Subashree, M.Sindhuja, V.Kavitha
Department of Information Technology
Nandha College of Technology
It is common practice that merchants selling products on the Web ask their customers to review
the products and associated services. As e-commerce becomes more and more popular, the number of
customer reviews that a product receives grows rapidly; for a popular product, the number of reviews
can be in the hundreds. This makes it difficult for a potential customer to read them all in order to
decide whether to buy the product. In this project, we aim to summarize all the customer reviews of
a product. This summarization task is different from traditional text summarization because we are only
interested in the specific features of the product that customers have opinions on, and also whether the
opinions are positive or negative; we do not summarize the reviews by selecting or rewriting a subset of
the original sentences to capture their main points, as in classic text summarization. In this paper, we
focus only on mining the opinion/product features that the reviewers have commented on, and a number
of techniques are presented to mine such features. The proposed system examines the customer reviews,
finds the aspects discussed in each review, and classifies whether the review is positive or negative; it
then compares products and ranks them automatically based on the reviews.
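
A minimal sketch of the Naive Bayes step named in the title: class priors and smoothed word likelihoods are estimated from labeled reviews, and a new review is assigned the higher-scoring class. The four training lines and whitespace tokenizer are placeholders, not the paper's data.

import math
from collections import Counter

train = [("battery life is great", "pos"), ("screen is awful", "neg"),
         ("great camera", "pos"), ("awful battery", "neg")]

counts, labels = {"pos": Counter(), "neg": Counter()}, Counter()
for text, y in train:
    labels[y] += 1
    counts[y].update(text.split())
vocab = {w for c in counts.values() for w in c}

def classify(text):
    best, best_lp = None, -math.inf
    for y in counts:
        lp = math.log(labels[y] / sum(labels.values()))   # class prior
        total = sum(counts[y].values())
        for w in text.split():
            # Laplace smoothing: every word gets a pseudo-count of 1.
            lp += math.log((counts[y][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = y, lp
    return best

print(classify("great battery"))   # -> 'pos'
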


15. DYNAMIC BIG DATA STORAGE USING DYNAMIC AUDITING PROTOCOL WITH MERKLE HASH TREE FOR BLOCK TAG AUTHENTICATION
M.Karthick, P.Lakshmanan, V.M.Naveen, T.Sivakumar, C.Rajavenkateswaran
Department of Information Technology
Nandha College of Technology

Big data is an evolving term that describes any voluminous amount of structured,
semi-structured and unstructured data; it is often used when speaking about petabytes and
exabytes of data. Security and privacy are huge standing challenges in big data storage. There
are many ways data can be compromised because of insufficient authentication, authorization,
and audit (AAA) controls, such as deletion or alteration of records without a backup of the
original content. Existing research work showed that it can fully support authorized auditing and
fine-grained update requests; however, such schemes suffer from several common drawbacks:
(1) maintaining the storage can be a difficult task, and (2) they require high resource costs for
implementation. This paper proposes a formal analysis technique based on fine-grained updates.
It includes efficient search for downloading the uploaded files and also focuses on designing the
auditing protocol to improve server-side protection for data confidentiality and data availability.
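
For orientation, the Merkle hash tree named in the title commits to every data block through a single root hash: leaves are block hashes, each parent hashes the concatenation of its children, and any block can be verified against the root with a logarithmic number of sibling hashes. The hash choice and odd-node duplication below are assumptions.

import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    # Fold the leaf hashes pairwise until a single root remains.
    level = [h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node if odd count
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"block-0", b"block-1", b"block-2", b"block-3"])
# The owner keeps `root`; to audit block i, the server returns the block plus
# the sibling hashes on its path, and the auditor recomputes up to `root`.
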
16. NETWORK SECURITY
Prathicksha Viswanathan, Sushmitha Selvam
Department of Information Technology
Nandha College of Technology

In the field of networking, the specialist area of network security consists of the
provisions made in an underlying computer network infrastructure, the policies adopted by
the network administrator to protect the network and the network-accessible resources from
unauthorized access, and consistent, continuous monitoring and measurement of their
effectiveness (or lack thereof). That way, if something terrible is happening, you can detect it.
All the tasks of network security therefore break down into three phases or classes: protection,
where we configure our systems and networks as correctly as possible; detection, where we
identify that the configuration has changed or that some network traffic indicates a problem; and
reaction, where, after identifying problems quickly, we respond and return to a safe state as
rapidly as possible.
17. COMBINING TAGGING SHARE FOR SOCIAL NETWORKS USING MULTI
ACCESS CONTROL AND COMMENT SPAM FILTER
Aishwaryamanju.S, Nithya.E, Soundharyam.S, Sowndharya.S, Saveetha.P
Department of Information Technology
Nandha College of Technology

In most group key management protocols, group members are authenticated by the group
leader one by one; that is, n authentication messages are required to authenticate n group
members. These members then share one common group key for the group communication. In
the proposed authentication protocol, users are simultaneously authenticated by the requester;
that is, one authentication message is required to authenticate n session peers, and the requester
then negotiates one secret key with each user instead of sharing one group key among all users.
Spam is commonly defined as unsolicited messages, and the goal of spam categorization is to
distinguish between spam and legitimate messages. Spam used to be considered a mere nuisance,
but due to the abundant amounts of spam being sent today, it has progressed from a nuisance to
a major problem. Naive Bayes spam filters calculate the probability of a message being spam
based on its contents. Unlike simpler filters, Bayesian spam filtering learns from spam and from
good messages, resulting in a very robust, adaptive and efficient anti-spam approach that, best
of all, returns hardly any false positives.
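
The core computation is Bayes' rule over word likelihoods under a naive independence assumption; the sketch below uses invented probabilities purely to show the arithmetic.

# P(spam | words) is proportional to P(spam) * product of P(word | spam).
p_spam, p_ham = 0.4, 0.6                     # toy prior class rates
p_w_spam = {"free": 0.30, "meeting": 0.02}   # toy P(word | spam)
p_w_ham  = {"free": 0.03, "meeting": 0.20}   # toy P(word | ham)

def spam_probability(words):
    ps, ph = p_spam, p_ham
    for w in words:
        ps *= p_w_spam.get(w, 0.01)          # small floor for unseen words
        ph *= p_w_ham.get(w, 0.01)
    return ps / (ps + ph)

print(spam_probability(["free", "meeting"]))   # -> 0.4 with these numbers
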
18. SUMMARIZED AUTOMATED HASH-TAG TWEET SEGMENTATION
K.K.Deepa, K.Gowthamapriya, B.Keerthana, T.Krishnakaarthik
Department of Information Technology
Nandha College of Technology

An increasing amount of data on online social networks has created tremendous issues,
becoming hard to access and use, so there is a need to improve the efficiency of tweet
maintenance and searching. While information extraction algorithms facilitate the extraction of
structured relations, they are often expensive and inaccurate. The proposed system presents an
automatic annotation approach with semantic content extraction. It first aligns the data units in a
cluster into different groups such that the data in the same group have the same semantics. The
main objective of the project is extracting labels from the documents; the documents are
predicted and prioritized by labels. Then, for each group, the system annotates it from different
aspects and aggregates the different annotations to predict a final annotation label for it. An
annotation wrapper for the document is automatically constructed and can be used to annotate
new result pages from the same web database. The proposed system extracts data in a better way
than the existing one.
19. AN ITRUST MISBEHAVIOR DETECTION SCHEME IN DELAY-TOLERANT
NETWORKS
P.Sakthivel, C.Selvarathi
Department of Computer Science And Engineering
M.Kumarasamy College of Engineering

Network security consists of the provisions and policies adopted by a network
administrator to prevent and monitor unauthorized access. Delay-tolerant networks (DTNs)
operate in regions where end-to-end network connectivity is not available; the intermediate
nodes along a communication path are expected to store, carry and forward the in-transit
messages (or bundles) in opportunistic data forwarding. Malicious and selfish behaviors
represent a serious threat against routing in DTNs, and because of the unique network
characteristics, designing a misbehavior detection scheme for DTNs is regarded as a great
challenge. The existing system proposes iTrust, a probabilistic misbehavior detection scheme for
secure DTN routing toward efficient trust establishment. The idea of iTrust is to introduce a
periodically available Trusted Authority (TA) that judges a node's behavior based on the
collected routing evidence and probabilistic checking. We model iTrust as an inspection game
and use game-theoretic analysis to demonstrate that, by setting an appropriate investigation
probability, the TA can ensure the security of DTN routing at a reduced cost. To improve the
efficiency of the proposed scheme, the TA collects the forwarding history evidence from a
node's upstream and downstream nodes, which further reduces the cost of detection. Extensive
analysis results demonstrate the effectiveness and efficiency of the proposed scheme.
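
The probabilistic-checking idea reduces to auditing each node only with some probability p, chosen (via the inspection-game analysis) just high enough that misbehaving stops paying off; the evidence format and messages below are placeholders, not the paper's protocol.

import random

CHECK_PROB = 0.3   # TA's investigation probability (tuned by game analysis)

def audit(node_id, claimed, witnessed):
    # claimed: messages the node says it forwarded;
    # witnessed: evidence collected from its upstream/downstream neighbors.
    if random.random() > CHECK_PROB:
        return f"{node_id}: not inspected this round"
    if claimed != witnessed:
        return f"{node_id}: misbehaving -> lower trust / exclude from routing"
    return f"{node_id}: honest -> reward"

print(audit("n7", claimed={"m1", "m2", "m3"}, witnessed={"m1", "m2"}))
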
20. ADVANCED APPLICATIONS FOR SMART MOBILE USING 9 AXIS
GYROSCOPIC SENSOR ON ANDROID PLATFORM
G.Senthil Kumar,
Dr.Nallini Institute of Engineering And Technology
S.Sridharan
Department of Computer Science And Engineering
Pollachi Institute of Engineering And Technology

Nowadays one of the most important devices in our lives is the mobile phone. Mobile
phones were designed primarily to support voice communication. The rapid development of
technology is placing an enormous demand on mobile phones and similar devices, as we require
more and more applications from them, such as weather forecasting, navigation, gaming,
entertainment, health monitoring, seismic exploration, map orientation, gesture recognition, and
vibration, tap and tilt detection. These applications can be supported by a 9-axis gyroscopic
sensor, which is a combination of three sensors: an accelerometer, a gyroscope and a compass.
By using 9-axis gyroscopic sensors in smart mobiles we can achieve smart sensing such as
sensitive gesture detection, tap detection, tilt detection, angular-velocity detection, direction
detection and vibration detection.
21. IMPLEMENTATION OF M-LEARNING SYSTEMS IN SMARTPHONES
S.Krishna, R.Ramya
Department of Information Technology
Tejaa Shakthi Institute of Technology For Women

This paper first analyzes the concept and features of micro lecture, mobile learning, and
ubiquitous learning, then presents the combination of micro lecture and mobile learning to
propose an overlay of micro-learning through mobile terminals. Details are presented of a
micro lecture mobile learning system (MMLS) that can support multiple platforms, including PC
terminals and smartphones. The system combines intelligent push, speech recognition, video
annotation, Lucene full-text search, clustering analysis, Android development and other
technologies. The platform allows learners to access micro lecture videos and other high-quality
micro lecture resources wherever and whenever they like, in whatever time intervals they have
available. Teachers can obtain statistical analysis results of the micro lectures in MMLS to
provide teaching/learning feedback and an effective communication platform. MMLS promotes
the development of micro lecture and mobile learning. A statistical analysis of the
implementation of the system shows that students using MMLS to assist their learning had
improved results on their final exams and gave a higher evaluation of the curriculum than those
who did not. The advantages and disadvantages of MMLS are also analyzed.

22. DESIGN AND ANALYSIS OF LOW POWER ADDERS FOR ARITHMETIC APPLICATIONS
N.Aseer Mary, M.Nivetha, D.Mohanapriya
Department of Computer Science And Engineering
Bannari Amman Institute of Technology

The prevalent blocks used in digital signal processing hardware are adders, multipliers
and delay elements: the better the performance of the adder structure, the better the overall
performance of the multipliers. Reducing power dissipation, delay and area at the circuit level is
considered one of the major factors in developing low power systems. In this paper we introduce
(i) a new 8-transistor (8T) full adder and (ii) a proposed Shannon-based 8T adder using
pass-transistor logic, which have better power and delay performance than the existing adders.
A performance comparison of the proposed 8T adder has been made against the 10T SERF, 10T
CLRCL, and existing 14T full adders. The proposed 8T full adder structure has improved
performance characteristics and is suitable for Array, Carry Save and Dadda multipliers. Three
versions of a 3-tap FIR filter, namely the Broadcast, Unfolded Broadcast, and Unfolded and
Retimed Broadcast structures, have also been implemented using three different multipliers;
each of the multipliers used for the filters is implemented using all the existing full adders and
the proposed 8T full adder. Results show that circuits implemented using the proposed 8T full
adder have better power, delay and cascaded performance when compared with their peers. All
simulations were carried out using the TSMC Complementary Metal Oxide Semiconductor
(CMOS) 120 nm technology file with a supply voltage of 1.8 V, using Tanner EDA tools.
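
All the adder cells compared above realize the same Boolean function (sum = a XOR b XOR cin, carry = majority); only the transistor-level realization differs. A quick behavioral check of the truth table, in Python since no HDL source accompanies the abstract:

def full_adder(a, b, cin):
    # Behavioral model: s = a^b^cin, cout = ab + cin(a^b).
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            print(a, b, cin, "->", full_adder(a, b, cin))
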
23. ENHANCED AUTHENTICATED ANONYMOUS SECURE ROUTING FOR
MANETS
Divya.D, Girija.M, Gokul Krishnan.T, Manoj Kumar.P, Jagadhesh.M
Department of Electronics And Communication Engineering
SNS College of Technology

Anonymous communications are important for many applications of mobile ad hoc
networks (MANETs). A major requirement on the network is to provide unidentifiability and
unlinkability for mobile nodes and their traffic. MANETs use anonymous routing protocols that
hide node identities and routes from outside observers in order to provide anonymity protection.
However, the existing routing protocol, Authenticated Anonymous Secure Routing (AASR),
does not provide anonymity protection to data sources, destinations and routes. In this paper, we
propose an Enhanced Authenticated Anonymous Secure Routing (EAASR) protocol to offer
high anonymity protection and to prevent attacks. It dynamically partitions the network field
into zones and randomly chooses nodes in the zones as intermediate relay nodes, which form a
non-traceable anonymous route. In addition, it hides the data initiator and receiver to strengthen
source and destination anonymity protection, and it effectively counters intersection and timing
attacks. We implemented node creation, route discovery, data transmission and attacker
prevention using the Network Simulator 2 (NS2) tool; performance analysis is then done for
different parameters and shown in Xgraph.
24. ADOPTION OF SELF CONTAINED PUBLIC KEY MANAGEMENT SCHEME IN
AN ACKNOWLEDGEMENT BASED INTRUSION DETECTION SYSTEM FOR
MANETS
Deepa.M, Parvathi.M
Department of Computer Science And Engineering
Nandha Engineering College

MANETs have become a popular trend nowadays in the migration from wired networks
to wireless networks. MANETs gain popularity because of their self-configuring ability and
dynamic topology, and they are highly used in mission-critical applications and emergency
disasters. Because of its dynamic nature, a MANET is prone to security risks and attacks. In this
paper, a self-contained public key management scheme is presented for use in an
acknowledgement-based intrusion detection system to authenticate the acknowledgement
packets. This scheme achieves near-zero communication overhead while providing security
services. A small number of cryptographic keys is given as input at all nodes prior to
deployment in the network. Mathematical combinations of pairs of keys, both public and private,
are used for better utilization of storage space; this means a combination of more than one key
pair is utilized by nodes for the encryption and decryption of messages.
25. VEHICLE SHOWROOM MANAGEMENT SYSTEM
C.Dinesh Kumar, G.Manikandan, C.Kirubakaran, R.Sengottaiyan, M.Karthika
Department of Computer Science And Engineering
Nandha Engineering College

Data mining is the process of extracting hidden knowledge from databases. Clustering is
one of the important functionalities of data mining: an adaptive methodology in which objects
are grouped together, based on the principle of maximizing the intra-class similarity and
minimizing the inter-class similarity. Various clustering algorithms have been developed,
resulting in better performance on datasets for clustering. In k-means clustering, we are given a
set of n data points in d-dimensional space R^d and an integer k, and the problem is to determine
a set of k points in R^d, called centers, so as to minimize the mean squared distance from each
data point to its nearest center. A popular heuristic for k-means clustering is Lloyd's algorithm.
In this paper, we present a simple and efficient implementation of Lloyd's k-means clustering
algorithm, which we call the filtering algorithm, along with a genetic k-means clustering
algorithm and a pre-k-means clustering model.
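
A minimal NumPy sketch of Lloyd's algorithm as described above: alternately assign each point to its nearest center and move each center to the mean of its points until the centers stop moving. Random-sample initialization is an assumption.

import numpy as np

def lloyd_kmeans(points, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        # Assignment step: nearest center by squared Euclidean distance.
        d = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Update step: move each center to the mean of its assigned points.
        new = np.array([points[labels == j].mean(axis=0)
                        if (labels == j).any() else centers[j]
                        for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers, labels

pts = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
centers, labels = lloyd_kmeans(pts, k=2)
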

26. DIFFERENT SECURITY POLICIES IN ANDROID BASED SMARTPHONES



K.Rajamurugan, K.Gunasekar
Department of Computer Science And Engineering
Nandha Engineering College

Nowadays the smartphone is an effective tool for increasing the productivity of business
users. With their increasing computational power and storage capacity, smartphones allow end
users to perform several tasks and always be updated while they are on the move. Companies are
eager to support employee-owned smartphones because of the growing productivity of their
employees. Still, security concerns about data distribution, leakage and loss have blocked the
adoption of smartphones for corporate use. MOSES is a policy-based framework for enforcing
software isolation of applications and data on the Android platform, with which it is possible to
specify distinct security profiles within a single smartphone. Each security profile is associated
with a set of policies that control the access to applications and data. Profiles are not predefined
or hard-coded; they can be defined and applied at any time. One of the main features of MOSES
is the dynamic switching from one security profile to another. We ran a thorough set of
experiments using our full implementation of MOSES, and the results confirm the feasibility of
our proposal.
27. PRIVACY PROTECTION IN BIG DATA USING ENHANCED TOP-DOWN
SPECIALIZATION APPROACH
S.Sivasankar, T.Prabhakaran
Department of Computer Science And Engineering
Nandha Engineering College

Data sizes have grown rapidly in recent years with the development of the internet,
so the big data era has arrived; with cloud computing, users are able to store large
amounts of data with ease. Users now use both structured and unstructured data. In big
data, because of its large size, all tasks consume more time. Internet users also share
private data such as health records and financial transaction records for mining or data
analysis; during that time, data anonymization is used to hide identities or sensitive
information so that data owners do not suffer economic loss. Anonymizing large-scale
data within a short span of time is a challenging task; to overcome it, an Enhanced
Top-Down Specialization (ETDS) approach can be developed, which is an enhancement
of the Two-Phase Top-Down Specialization (TPTDS) approach.

28. PATIENT HEALTH MONITORING USING WIRELESS POWER TRANSFER TECHNOLOGY


R.Angamuthu, S.Anusiya, Y.Beula Sherlin, B.Gokul, S.Vigneswaran


Department of Computer Science And Engineering
SNS College of Technology

With the advancement of biomedical instrumentation technologies, the sensor-based
healthcare monitoring system is gaining more attention day by day. As wireless power transfer
(WPT) technology gains popularity, a broad range of applications and research is being pursued
in the medical field. The condition of a human body is assessed by measuring the vital signs,
which are very useful in understanding health status and detecting a person's medical problems.
The vital signs are assessed through a temperature sensor, heart-beat sensor and ECG sensor;
these sensors have associated circuits for signal processing and data transmission. Powering the
circuit is always a crucial design issue: batteries cannot be used in implantable sensors, which
can come in contact with the blood, resulting in serious health risks. An alternative approach is
to supply power wirelessly for tether-less and battery-less operation of the circuits. Inductive
power transfer is an efficient method of wireless power transfer in sensor-based monitoring
systems.
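
For orientation, an inductive link is normally driven at the LC resonant frequency of the coupled coils, f = 1 / (2 * pi * sqrt(L * C)); the component values below are purely illustrative, not from the paper.

import math

L = 10e-6    # coil inductance in henries (illustrative)
C = 100e-12  # tuning capacitance in farads (illustrative)
f = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"resonant frequency = {f / 1e6:.2f} MHz")   # about 5.03 MHz
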
29. SEARCH ENGINE OPTIMIZER
B.Geethamani
Department of Computer Science And Engineering
Park College of Technology

Search engine optimization (SEO) is the process of affecting the visibility of a website or
a web page in a search engine's "natural" or un-paid ("organic") search results. In general, the
earlier (or higher ranked on the search results page), and the more frequently, a site appears in
the search results list, the more visitors it will receive from the search engine's users. SEO may
target different kinds of search, including image search, local search, video search, academic
search, news search and industry-specific vertical search engines. As an Internet marketing
strategy, SEO considers how search engines work, what people search for, the actual search
terms or keywords typed into search engines, and which search engines are preferred by the
targeted audience. Optimizing a website may involve editing its content, HTML and associated
coding both to increase its relevance to specific keywords and to remove barriers to the indexing
activities of search engines. Promoting a site to increase the number of backlinks, or inbound
links, is another SEO tactic.

30. GENERATING SUMMARY RISK SCORES SPAWNING ARBITRARY PERIL NOTCHES FOR MOBILE APPLICATION

Parvadhavarthni S, Lakshmi Narayeni S
Department of Computer Science And Engineering
Tejaa Shakthi Institute of Technology For Women

One of Android's main defense mechanisms against malicious apps is a risk
communication mechanism which, before a user installs an app, warns the user about the
permissions the app requires, trusting that the user will make the right decision. This approach
has been shown to be ineffective, as it presents the risk information of each app in a
stand-alone fashion and in a way that requires too much technical knowledge and time to
distill useful information. We discuss the desired properties of risk signals and relative risk
scores for Android apps in order to generate another metric that users can utilize when choosing
apps. We present a wide range of techniques to generate both risk signals and risk scores, based
on heuristics as well as principled machine learning techniques. Experimental results conducted
using real-world data sets show that these methods can effectively identify malware as very
risky, are simple to understand, and are easy to use.
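
One simple risk signal in this spirit, sketched under stated assumptions rather than as the paper's exact model, scores an app by how rare its requested permissions are among known-benign apps, so that unusual, dangerous permission sets float to the top.

import math

# Fraction of benign apps requesting each permission (illustrative numbers).
benign_rate = {"INTERNET": 0.83, "READ_CONTACTS": 0.08,
               "SEND_SMS": 0.02, "RECEIVE_BOOT_COMPLETED": 0.05}

def risk_score(permissions, floor=0.001):
    # Sum of -log p over requested permissions; higher = riskier.
    return sum(-math.log(benign_rate.get(p, floor)) for p in permissions)

print(round(risk_score(["INTERNET"]), 2))                    # low score
print(round(risk_score(["SEND_SMS", "READ_CONTACTS"]), 2))   # much higher
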
31. FEATURE EXTRACTION BASED BREAST CANCER DETECTION USING
MAMMOGRAM ON MRI IMAGES
Anu.P, Roja.P, Jaya priya.S
Department of Electronics And Communication Engineering
Bannari Amman Institute of Technology

Early detection of breast cancer can improve survival rates to a great extent.
Inter-observer and intra-observer errors occur frequently in the analysis of medical images, given
the high variability between interpretations of different radiologists. To offset this variability and
to standardize the diagnostic procedures, efforts are being made to develop automated techniques
for the diagnosis and grading of breast cancer images. This review aims at providing an overview
of recent advances and developments in the field of Computer Aided Diagnosis (CAD) of breast
cancer using mammograms, focusing specifically on the mathematical aspects and aiming to act
as a mathematical primer for intermediates and experts in the field.
32. A RELIABLE DATA TRANSMISSION FOR CLUSTER-BASED WIRELESS
SENSOR NETWORKS
Jeevitha.A, Satheeshkumar.S
Department of Computer Science And Engineering

Nandha Engineering College


A wireless sensor network (WSN) is a network system comprised of distributed devices
using wireless sensor nodes to monitor physical or environmental conditions such as sound,
temperature, and motion; such networks have enabled many devices and are in ever more
widespread use. Secure data transmission is a critical issue for WSNs, and clustering is an
effective and practical way to enhance their system performance. The sensors used for these
purposes often need to be deployed in a random fashion, and clustering is a technique employed
to increase the various capabilities of a sensor network. We propose two secure and efficient data
transmission (SET) protocols for cluster-based wireless sensor networks (CWSNs), called
SET-IBS, using the identity-based digital signature (IBS) scheme, and SET-IBOOS, using the
identity-based online/offline digital signature (IBOOS) scheme. Since these applications require
packet delivery from one or more senders to multiple receivers, provisioning security in group
communications is a critical and challenging goal. In this paper, we study secure data
transmission for cluster-based wireless sensor networks (CWSNs). The results show that the
proposed protocols perform better than the existing secure protocols for CWSNs in terms of
security overhead and energy consumption.
33. A LIGHTWEIGHT PROACTIVE SOURCE ROUTING PROTOCOL TO IMPROVE
OPPORTUNISTIC DATA FORWARDING IN MANET
Elackya E.C, Sasirekha.S
Department of Computer Science And Engineering
Nandha Engineering College

Opportunistic data forwarding has drawn much attention in the research community of
multihop wireless networking, with most research conducted for stationary wireless networks.
One of the reasons why opportunistic data forwarding has not been widely utilized in mobile ad
hoc networks (MANETs) is the lack of an efficient, lightweight proactive routing scheme with
strong source routing capability. In this paper, a lightweight proactive source routing (PSR)
protocol is proposed. PSR can maintain more network topology information than distance vector
(DV) routing to facilitate source routing, although it has much smaller overhead than traditional
DV-based protocols [e.g., destination-sequenced DV (DSDV)], link state (LS)-based routing
[e.g., optimized link state routing (OLSR)], and reactive source routing [e.g., dynamic source
routing (DSR)]. PSR yields similar or better data transportation performance than all the other
baseline protocols.
34. ANALYSIS OF ITEMS IN LARGE TRANSACTIONAL DATABASE USING
FREQUENT AND UTILITY MINING
B.Sangameshwari, P.Uma
Department of Computer Science And Engineering
Nandha Engineering College

There are groups of techniques, strategies and distinct areas of investigation that are
valuable and marked as essential fields of data mining advancement. Various MNCs and large
organizations operate at different sites in different countries, and each site of operation may
generate large volumes of data. Corporate executives require access to all such sources to take
crucial decisions. The data warehouse delivers basic business value by improving the
effectiveness of managerial decision making. In an uncertain and highly competitive
environment, the value of strategic information systems such as these is easily recognized;
however, in today's environment, efficiency or speed is by no means the only key to
competitiveness. Such huge amounts of data, available at tera- to petabyte scale, have markedly
changed the scope of science and engineering. To analyze, manage and make decisions over
such huge amounts of data requires the techniques called data mining, which are bringing change
to many fields. This paper discusses the scope of data mining that is helpful in the business
arena.
35. A SECURE AUDITING PROTOCOL FOR DATA SHARING IN
CLOUD COMPUTING ENVIRONMENT
T.Esther Dyana, S.Maheswari
Department of Computer Science And Engineering
Nandha Engineering College

Cloud computing is internet-based computing where virtual shared servers provide
software, infrastructure, platform, devices and many other resources and hosting to customers
on a pay-as-you-use basis. Cloud computing customers do not own the physical infrastructure;
rather, they rent the usage from a third-party provider. Data owners host their data on cloud
servers and users access the data from those servers. However, data outsourcing also introduces
some new security challenges, which require an auditing service to check the data integrity in
the cloud. Some of the existing remote integrity checking methods can only serve static archive
data and thus cannot be applied to the auditing service, since the data in the cloud can be
dynamically updated. An efficient and secure dynamic auditing protocol is therefore preferred to
convince data owners that their data are correctly stored in the cloud. In this paper, we first
design an auditing framework for cloud storage systems and then propose an efficient and
privacy-preserving auditing protocol that supports dynamic data operations and is efficient and
provably secure in the random oracle model.
36. SEMANTIC ENHANCED WEB-PAGE RECOMMENDATION BASED ON
ONTOLOGY USING WEB USAGE MINING
Shrigowtham.M.N, S.Kavitha
Department of Computer Science And Engineering
Nandha Engineering College

Web-page recommendation plays an important role in smart Web systems. It not
only reduces Web traffic, but also gives effective personalized Web-page
recommendations for every user. Effective knowledge discovery from Web usage data
and attaining exact knowledge representation are typical tasks. This paper proposes a
novel method to provide better Web-page recommendation through semantic
enhancement, based on a knowledge representation model using domain and Web usage
knowledge. An ontology-based model is proposed to represent the domain knowledge,
ontologies being a powerful way of knowledge representation used today in many
websites. Finally, Web usage data obtained from Web log files is integrated with the
domain knowledge representation model to provide effective Web-page recommendation.


37. IDENTIFYING SPECIALITY IN SENTIMENT ANALYSIS VIA INHERENT AND EXTERNAL DOMAIN RELEVANCE
S.Veeramani, S.Karuppusamy
Department of Computer Science And Engineering
Nandha Engineering College

Opinion mining (also known as sentiment analysis) aims to analyze people's opinions,
sentiments, and attitudes toward entities such as products, services, and their attributes.
Information retrieval is the process of extracting information based on the occurrences of terms
in documents. We discuss a method to identify opinion features from online reviews by
exploiting the difference in opinion-feature statistics across two large sets of documents: a
domain-specific corpus and a domain-independent corpus. Using a set of syntactic dependence
rules, we extract a list of candidate opinion features from the domain review corpus. For each
extracted candidate feature, we estimate an intrinsic domain relevance (IDR) score, which
represents the statistical association of the candidate with the given domain corpus, and an
extrinsic domain relevance (EDR) score, which reflects the statistical relevance of the candidate
to the domain-independent corpus. Candidates with IDR scores exceeding a predefined intrinsic
relevance threshold and EDR scores below another, extrinsic relevance threshold are confirmed
as valid opinion features.
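
The final selection step is two threshold tests per candidate; a minimal sketch follows, with the IDR/EDR scores assumed to be precomputed from the two corpora (the values and thresholds here are placeholders).

# candidate -> (intrinsic domain relevance, extrinsic domain relevance)
candidates = {"battery": (0.92, 0.10), "thing": (0.40, 0.85),
              "screen": (0.88, 0.15), "day": (0.55, 0.70)}

IDR_MIN, EDR_MAX = 0.80, 0.30   # placeholder thresholds

features = [c for c, (idr, edr) in candidates.items()
            if idr >= IDR_MIN and edr <= EDR_MAX]
print(features)   # -> ['battery', 'screen']
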
38. AN IMPROVED RESOURCEALLOCATION MECHANISM FOR VM-BASED
DATA CENTERS USING QUALITY OF SERVICE
V.Manimaran, S.Prabhu
Department of Computer Science And Engineering
Nandha Engineering College
Cloud computing grants business clients the ability to scale their resource usage up and
down based on their needs. Many of the touted benefits of the cloud model come from resource
multiplexing through virtualization. Dynamic consolidation of virtual machines (VMs) is an
effective approach to improve the utilization of resources and energy efficiency in cloud data
centers, and finding out when it is best to reallocate VMs from an overloaded host is an aspect of
dynamic VM consolidation that directly influences the resource utilization and quality of service
(QoS) delivered by the system. In this paper, we introduce VM migration with optimal
precedence, a technique that migrates only the working set of an idle VM and supports green
computing by optimizing the number of servers in use. We use a maximum-precedence
algorithm to reduce the load on overloaded virtual machines, and we develop a set of heuristics
that prevent overload in the system efficiently while saving energy.

39. A TOOLKIT FOR MODELLING AND SIMULATION OF CLOUD COMPUTING ENVIRONMENT USING CONTEMPORARY VM LOAD BALANCERS
B.Uvaraj, N.Shanthi

Department of Computer Science And Engineering
Nandha Engineering College

This paper addresses job scheduling in the cloud environment. Cloud computing
is a model for delivering information technology services in which resources are
retrieved from the internet through web-based tools and applications, rather than
through a direct connection to a server. Users can set up and boot the needed resources
and pay only for the resources they require. Thus, providing a mechanism for efficient
resource management and assignment will be an important objective of cloud
computing. The issue is to achieve the goal of managing multiple virtualization
platforms and multiple virtual-machine migrations across physical machines without
disruption. We discuss how to ensure load balance when multiple virtual machines
run on multiple physical machines, and we present a system implementing
optimization with Dynamic Resource Allocation (DRA) for virtual machines running
on physical machines. The results show that when a virtual machine's load becomes
too high, it is automatically migrated to another lightly loaded physical machine
without service interruption. We find that this approach yields a tractable solution for
scheduling applications in the public cloud. It also saves electricity, which contributes
a significant portion of the operational expenses in large data centers. We develop a set
of heuristics that prevent overload in the system effectively while saving energy.
Trace-driven simulation and experimental results demonstrate that our algorithm
achieves good performance.
40. A FRAMEWORK TO CREDIT CARD ENDORSEMENT USING FINGERPRINT
AND ONE TIME PASSWORD FOR AUTHENTICATION COMBINED WITH SSO
PROTOCOL IN CLOUD
V.Karunya, Dr.S.Prabhadevi
Department of Computer Science And Engineering
Nandha Engineering College

Cloud computing is one of the emerging technologies that takes network users to the
next level. Cloud is a technology where resources are paid for per usage rather than owned. One of
the biggest challenges in this technology is security. Though users use service providers'
resources, there is a great level of reluctance from the users' end because of the significant security
threats packed with this technology. Research in this area has provided a number of solutions to
overcome these security barriers; each of these has its own pros and cons. This paper brings
about a new model of a security system wherein users provide multiple biometric
fingerprints during enrolment for a service. These templates are stored at the cloud provider's end. The
users are authenticated based on these fingerprint templates, which have to be provided in the
order of random numbers that are generated every time. Both the fingerprint templates and the images
provided every time are encrypted for enhanced security. For credit card
transactions, SSO solutions allow users to sign on only once and have their identities automatically
verified by each application or service they want to access afterwards. We build on proxy
signature schemes to introduce a public key cryptographic approach to single sign-on
frameworks, which represents an important milestone towards the construction of provably
secure single sign-on applications for online transactions.
41. SECURITY ENHANCEMENTS FOR MOBILE AD HOC
NETWORKS WITH TRUST MANAGEMENT
R.Arthi, E.Padma,Dr.N.Shanthi
Department of Computer Science And Engineering
Nandha Engineering College

Mobile ad hoc networks (MANETs) are dynamic networks in which the mobile nodes
do not have any infrastructure. Link breakages occur due to the high mobility of nodes, which
leads to frequent path failures and route discoveries. The neighbor coverage and probabilistic
mechanism significantly decreases the number of retransmissions so as to reduce the routing
overhead. Since security is also a challenging factor in ad hoc networks, a concept of secure,
efficient routing is included with NCPR: a new trust approach based on the extent
of friendship between the nodes is proposed, which makes the nodes cooperate and prevents
flooding attacks in an ad hoc environment. All the nodes in an ad hoc network are categorized as
friends, acquaintances or strangers based on their relationships with their neighboring nodes.
During network initiation all nodes will be strangers to each other. A trust estimator is used in
each node to evaluate the trust level of its neighboring nodes. This approach combines the
advantages of the neighbor coverage knowledge and the probabilistic mechanism, which can
significantly decrease the number of retransmissions so as to reduce the routing overhead, and
improve security. Specifically, throughput and packet delivery ratio can be improved
significantly.
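
A minimal sketch of the friendship-based categorization and a simple trust estimator follows; the two thresholds and the exponential update rule are illustrative assumptions rather than the paper's exact mechanism.

def categorize(trust_level):
    """Map an estimated trust level in [0, 1] to a relationship class
    (thresholds 0.3 and 0.7 are assumed for illustration)."""
    if trust_level >= 0.7:
        return "friend"
    if trust_level >= 0.3:
        return "acquaintance"
    return "stranger"   # every node starts here at network initiation

def update_trust(old, cooperated, alpha=0.1):
    """Exponential trust estimator: reward observed cooperation."""
    return (1 - alpha) * old + alpha * (1.0 if cooperated else 0.0)

trust = 0.0   # a stranger at network initiation
for cooperated in [True, True, True, False, True] * 4:
    trust = update_trust(trust, cooperated)
print(round(trust, 2), categorize(trust))
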
42. LINK SCHEDULING FOR EXPLOITING SPATIAL REUSE IN MULTIHOP MIMO
NETWORKS
A.Mohan Kumar,K.Gunasekaran
Department of Computer Science And Engineering
Nandha Engineering College

Multiple-Input-Multiple-Output (MIMO) has great potential for enhancing the throughput
of multihop wireless networks via spatial multiplexing or spatial reuse. Spatial reuse with Stream
Control (SC) provides a considerable improvement of the network throughput over spatial
multiplexing. The gain of spatial reuse, however, is still not fully exploited. There exist large
numbers of additional data streams, which could be transmitted concurrently with those data
streams scheduled by stream control at certain time slots and vicinities. In this paper, we address
the issue of MIMO link scheduling to maximize the gain of spatial reuse and thus network
throughput. We propose a Receiver-Oriented Interference Suppression model (ROIS), based on
which we design both centralized and distributed link scheduling algorithms to fully exploit the
gain of spatial reuse in multihop MIMO networks. Further, we address the traffic-aware link
scheduling problem by injecting non-uniform traffic load into the network. Through theoretical
analysis and comprehensive performance evaluation, we achieve the following results: 1) link
scheduling based on ROIS achieves significantly higher network throughput than that based on
stream control, with any interference range, number of antennas, and average hop length of data
flows. 2) The traffic-aware scheduling is enticingly complementary to the link scheduling based
on ROIS model. Accordingly, the two scheduling schemes can be combined to further enhance
the network throughput.
43. EFFICIENT AND SECURE WIRELESS COMMUNICATIONS FOR ADVANCED
METERING INFRASTRUCTURE IN SMART GRIDS
Venkateswaran.D ,Satheeshkumar.S
Department of Computer Science And Engineering
Nandha Engineering College

An experiment is carried out to measure the power consumption of households. The
analysis of the real measurement data shows that significant changes in power consumption
arrive in a Poisson manner. Based on this experiment, a novel wireless communication scheme
is proposed for the advanced metering infrastructure (AMI) in smart grid that can significantly
improve the spectrum efficiency. The main idea is to transmit only when a significant power
consumption change occurs. On the other hand, the policy of transmitting only when change
occurs may bring a security issue; i.e., an eavesdropper can monitor the daily life of the house
owner, particularly the information of whether the owner is at home. Hence, a defense scheme is
proposed to combat this vulnerability by adding artificial spoofing packets. It is shown by
numerical results that the defense scheme can effectively counter this security challenge.
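
The reporting policy can be sketched as below; the 200 W change threshold and the spoofing-packet rate are placeholders, and a real deployment would tune both to balance spectrum efficiency against privacy.

import random

THRESHOLD = 200.0   # watts; minimum change worth reporting (assumed)
SPOOF_PROB = 0.05   # per-slot probability of an artificial packet (assumed)

def meter_stream(readings):
    """Yield (kind, value) packets: real ones on significant change,
    plus random spoofing packets as cover traffic."""
    last_sent = readings[0]
    yield ("real", readings[0])
    for r in readings[1:]:
        if abs(r - last_sent) >= THRESHOLD:
            last_sent = r
            yield ("real", r)
        elif random.random() < SPOOF_PROB:
            yield ("spoof", last_sent)   # indistinguishable cover packet

random.seed(1)
for kind, value in meter_stream([300, 310, 900, 880, 250, 260]):
    print(kind, value)
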

44. SLA-BASED RESOURCE ALLOCATION FOR SOFTWARE AS A SERVICE
PROVIDER (SAAS) IN CLOUD COMPUTING ENVIRONMENTS
T.Gothandeshwaran , K.Shanmuga Priya
Department of Computer Science And Engineering
Nandha Engineering College

Cloud computing has been considered as a solution for solving enterprise application
distribution and configuration challenges in the traditional software sales model. Migrating from
traditional software to Cloud enables on-going revenue for software providers. However, in order
to deliver hosted services to customers, SaaS companies have to either maintain their own
hardware or rent it from infrastructure providers. This requirement means that SaaS providers
will incur extra costs. In order to minimize the cost of resources, it is also important to satisfy a
minimum service level to customers. Therefore, this paper proposes resource allocation
algorithms for SaaS providers who want to minimize infrastructure cost and SLA violations. Our
proposed algorithms are designed in a way to ensure that SaaS providers are able to manage the
dynamic change of customers, mapping customer requests to infrastructure level parameters and
handling heterogeneity of Virtual Machines. We take into account the customers' Quality of
Service parameters such as response time, and infrastructure level parameters such as service
initiation time. This paper also presents an extensive evaluation study to analyse and demonstrate
that our proposed algorithms minimize the SaaS providers' cost and the number of SLA
violations in a dynamic resource sharing Cloud environment.


45. DYNAMIC RESOURCE ALLOCATION USING VIRTUAL MACHINES FOR
CLOUD COMPUTING ENVIRONMENT
P.Malathi,Dr.S.Arumugam
Department of Computer Science And Engineering
Nandha Engineering College

Cloud computing allows business customers to scale up and down their resource usage
based on needs. Many of the touted gains in the cloud model come from resource multiplexing
through virtualization technology. In this paper, we present a system that uses virtualization
technology to allocate data center resources dynamically based on application demands and
support green computing by optimizing the number of servers in use. We introduce the concept
of skewness to measure the unevenness in the multidimensional resource utilization of a
server. By minimizing skewness, we can combine different types of workloads nicely and
improve the overall utilization of server resources. We develop a set of heuristics that prevent
overload in the system effectively while saving energy used. Trace-driven simulation and
experiment results demonstrate that our algorithm achieves good performance.
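
A small Python sketch of such a skewness computation follows; the quadratic form used here (the root of summed squared deviations of per-resource utilization ratios from 1) is a common formulation in this line of work and is assumed for illustration.

from math import sqrt

def skewness(utilizations):
    """Unevenness of one server's multidimensional resource usage:
    sqrt(sum_i (r_i / r_mean - 1)^2); lower means more even usage."""
    r_mean = sum(utilizations) / len(utilizations)
    return sqrt(sum((r / r_mean - 1.0) ** 2 for r in utilizations))

balanced = [0.6, 0.6, 0.6]   # CPU, memory, network equally utilized
skewed = [0.9, 0.2, 0.1]
print(round(skewness(balanced), 3))   # 0.0: workloads mix well
print(round(skewness(skewed), 3))     # about 1.541: uneven usage
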
46. WORKLOAD BALANCING AND ADAPTIVE RESOURCE MANAGEMENT FOR
THE SWIFT STORAGE SYSTEM ON CLOUD
E.Keerthi , S.Maheshwari
Department of Computer Science And Engineering
Nandha Engineering College

The demand for big data storage and processing has become a challenge in today's
industry. To meet the challenge, there is an increasing number of enterprises adopting distributed
storage systems. Frequently, in these systems, storage nodes intensively holding hotspot data
could become system bottlenecks while storage nodes without hotspot data might result in low
utilization of computing resource. This stems from the fact that almost all the typical distributed
storage systems only provide data-amount-oriented balancing mechanisms without considering
the different access load of data. To eliminate the system bottlenecks and optimize the resource
utilization, there is a demand for such distributed storage systems to employ a workload
balancing and adaptive resource management framework. In this paper, we propose a framework
of workload balancing and resource management for Swift, a widely used and typical
distributed storage system on cloud. In this framework, we design workload monitoring and
analysis algorithms for discovering overloaded and underloaded nodes in the cluster. To balance
the workload among those nodes, Split, Merge and Pair Algorithms are implemented to regulate
physical machines, while the Resource Reallocate Algorithm is designed to regulate virtual
machines on cloud. In addition, by leveraging the mature architecture of distributed storage
systems, the framework resides in the hosts and operates through API interception. To
demonstrate its effectiveness, we conduct experiments to evaluate it, and the experimental
results show the framework can achieve its goals.


47. BIG DATA: ISSUES, CHALLENGES, TOOLS AND GOOD PRACTICES
G.Deepak, T.Prabhakaran
Department of Computer Science And Engineering
Nandha Engineering College

Big data is defined as large amounts of data which require new technologies and
architectures so that it becomes possible to extract value from them through capture and analysis.
Due to such a large size of data it becomes very difficult to perform effective analysis
using existing traditional techniques. Big data, due to its various properties like volume,
velocity, variety, variability, value and complexity, puts forward many challenges. Since Big data
is a recent upcoming technology in the market which can bring huge benefits to the business
organizations, it becomes necessary that various challenges and issues associated in bringing and
adapting to this technology are brought into light. This paper introduces the Big data technology
along with its importance in the modern world and existing projects which are effective and
important in changing the concept of science into big science and society too. The various
challenges and issues in adapting and accepting Big data technology, its tools (Hadoop) are also
discussed in detail along with the problems Hadoop is facing. The paper concludes with the
Good Big data practices to be followed.
48. A SIMPLE BUT POWERFUL HEURISTIC METHOD FOR ACCELERATING K-MEANS
CLUSTERING OF LARGE-SCALE DATA IN LIFE SCIENCE
S.Monisha,D.Vanathi
Department of Computer Science And Engineering
Nandha Engineering College

K-means clustering has been widely used to gain insight into biological systems from
large-scale life science data. To quantify the similarities among biological data sets, Pearson
correlation distance and standardized Euclidean distance are used most frequently; however,
optimization methods have been largely unexplored. These two distance measurements are
equivalent in the sense that they yield the same k-means clustering result for identical sets of k
initial centroids. Thus, an efficient algorithm used for one is applicable to the other. Several
optimization methods are available for the Euclidean distance and can be used for processing the
standardized Euclidean distance; however, they are not customized for this context. We instead
approached the problem by studying the properties of the Pearson correlation distance, and we
invented a simple but powerful heuristic method for markedly pruning unnecessary computation
while retaining the final solution. Tests using real biological data sets with 50-60K vectors of
dimensions 10-2001 (about 400 MB in size) demonstrated marked reduction in computation time
for k = 10-500 in comparison with other state-of-the-art pruning methods such as Elkan's and
Hamerly's algorithms.
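
The equivalence the method relies on can be checked numerically: for z-normalized vectors of length n, the squared Euclidean distance equals 2n(1 - r), where r is the Pearson correlation, so both distances induce the same nearest-centroid assignments. A short Python check:

import math

def znorm(v):
    # standardize to zero mean and unit (population) standard deviation
    n = len(v)
    mean = sum(v) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in v) / n)
    return [(x - mean) / std for x in v]

def pearson(x, y):
    n = len(x)
    return sum(a * b for a, b in zip(znorm(x), znorm(y))) / n

x, y = [1.0, 3.0, 5.0, 4.0], [2.0, 2.5, 6.0, 5.0]
n = len(x)
xs, ys = znorm(x), znorm(y)
sq_euclid = sum((a - b) ** 2 for a, b in zip(xs, ys))
print(round(sq_euclid, 6), round(2 * n * (1 - pearson(x, y)), 6))   # identical
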
49. HIGHLY COMPARATIVE FEATURE-BASED TIME-SERIES CLASSIFICATION
Subhashini Padma.M, Vanitha.D
Department of Computer Science And Engineering
Nandha Engineering College

A highly comparative, feature-based approach to time series classification is introduced
that uses an extensive database of algorithms to extract thousands of interpretable features from
time series. These features are derived from across the scientific time-series analysis literature,
and include summaries of time series in terms of their correlation structure, distribution, entropy,
stationarity, scaling properties, and fits to a range of time-series models. After computing
thousands of features for each time series in a training set, those that are most informative of the
class structure are selected using greedy forward feature selection with a linear
classifier. The resulting feature-based classifiers automatically learn the differences between
classes using a reduced number of time-series properties, and circumvent the need to calculate
distances between time series. Representing time series in this way results in orders of
magnitude of dimensionality reduction, allowing the method to perform well on very large data
sets containing long time series or time series of different lengths. For many of the data sets
studied, classification performance exceeded that of conventional instance-based classifiers,
including one-nearest-neighbor classifiers using Euclidean distances and dynamic time warping,
and, most importantly, the features selected provide an understanding of the properties of the
data set, insight that can guide further scientific investigation.
Index Terms: Time-series analysis, classification, data mining
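
The selection step can be sketched as below, with a linear classifier and synthetic data standing in for the thousands of extracted time-series features; the choice of logistic regression and all data here are assumptions made for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)          # two classes
X = rng.normal(size=(200, 10))       # ten candidate features
X[:, 3] += 2.0 * y                   # feature 3 strongly class-informative
X[:, 7] += 1.0 * y                   # feature 7 weakly informative

selected, remaining = [], list(range(X.shape[1]))
for _ in range(3):                   # greedily pick up to three features
    score_of = {f: cross_val_score(LogisticRegression(max_iter=1000),
                                   X[:, selected + [f]], y, cv=5).mean()
                for f in remaining}
    best = max(score_of, key=score_of.get)
    selected.append(best)
    remaining.remove(best)
    print("selected feature", best, "cv accuracy", round(score_of[best], 3))
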
50. ON THE USE OF SIDE INFORMATION FOR MINING TEXT DATA
M.K.Vijaymeena, V.Aruna
Department of Computer Science And Engineering
Nandha Engineering College

In many text mining applications, side-information is available along with the text
documents. Such side-information may be of different kinds, such as document provenance
information, the links in the document, user-access behavior from web logs, or other non-textual
attributes which are embedded into the text document. Such attributes may contain a tremendous
amount of information for clustering purposes. However, the relative importance of this side-information may be difficult to estimate, especially when some of the information is noisy. In
such cases, it can be risky to incorporate side-information into the mining process, because it can
either improve the quality of the representation for the mining process, or can add noise to the
process. Therefore, we need a principled way to perform the mining process, so as to maximize
the advantages from using this side information. In this paper, we design an algorithm which
combines classical partitioning algorithms with probabilistic models in order to create an
effective clustering approach. We then show how to extend the approach to the classification
problem. We present experimental results on a number of real data sets in order to illustrate the
advantages of using such an approach.
51. A COOPERATIVE SEARCH BASED SOFTWARE ENGINEERING APPROACH
FOR CODE SMELL DETECTION
S.Kiruthika, S.Karuppusamy
Department of Computer Science And Engineering
Nandha Engineering College

In this paper, we propose to consider code-smell detection as a distributed optimization problem.
The idea is that different methods are combined in parallel during the optimization process to
find a consensus regarding the detection of code-smells. To this end, we used Parallel
Evolutionary algorithms (P-EA) where many evolutionary algorithms with different adaptations
(fitness functions, solution representations, and change operators) are executed, in a parallel
cooperative manner, to solve a common goal, which is the detection of code-smells. We conduct
an empirical evaluation comparing the implementation of our cooperative P-EA approach with
random search, two single-population-based approaches and two code-smell detection techniques
that are not based on meta-heuristic search. The statistical analysis of the obtained results provides
evidence to support the claim that cooperative P-EA is more efficient and effective than
state-of-the-art detection approaches, based on a benchmark of nine large open source systems
where more than 85 percent precision and recall scores are obtained on a variety of eight different
types of code-smells.
52. A SURVEY ON DDOS ATTACK DETECTION METHODOLOGY
T.K.Subramaniam, B.Deepa
Department of Computer Science And Engineering
Nandha Engineering College

Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks are major threats to
network security. A network is a collection of nodes that interconnect with each other to exchange
information, and the information a node requires must be kept confidential. An attacker in the
network may capture this confidential information and misuse it, so security is a major issue.
There are many security attacks in networks, and one of the major threats to internet services is
the DDoS attack: a malicious attempt to suspend or interrupt services to a target node, making a
network resource or machine unavailable to its intended users. Many ideas have been developed
for avoiding DDoS or DoS attacks, which can arise naturally or be mounted through botnets, and
various defence schemes have been developed against them. The main aim of this paper is to
present the basics of DDoS attacks: the types and components of DDoS attacks, and a survey of
different mechanisms to detect a DDoS or DoS attack.
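
As a concrete example of the kind of detection mechanism such surveys cover, the sketch below monitors the entropy of source addresses in a traffic window, since a flood of spoofed sources sharply raises it; the alert margin is an assumption for the example.

from collections import Counter
from math import log2

def entropy(addresses):
    """Shannon entropy of the source-address distribution in a window."""
    counts = Counter(addresses)
    total = len(addresses)
    return -sum((c / total) * log2(c / total) for c in counts.values())

normal = ["10.0.0.1"] * 40 + ["10.0.0.2"] * 35 + ["10.0.0.3"] * 25
attack = ["198.51.100.%d" % i for i in range(100)]   # 100 spoofed sources

baseline = entropy(normal)
for window in (normal, attack):
    h = entropy(window)
    print(round(h, 2), "ALERT" if h > baseline + 2.0 else "ok")
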

53. PROTECT PERVASIVE SOCIAL NETWORKING BASED ON TWO-DIMENSIONAL TRUST LEVELS


S.Indhumathi, M.Senthaamarai
Department of Computer Science And Engineering
Nandha Engineering College


Social networking has extended its popularity from the Internet to mobile domains.
Nowadays, the Internet can work collaboratively with cellular networks and self-organized
mobile ad hoc networks to offer advanced pervasive social networking (PSN) at any time and in
any place. It is important to secure data communications in PSN for protecting crucial instant
social activities and supporting reliable social computing and data mining. Obviously, trust plays
an important role in PSN for reciprocal activities among strangers. It helps people overcome
perceptions of uncertainty and risk and engage in trusted social behaviors. In this paper, we
utilize two dimensions of trust levels, evaluated by either a trusted server or individual PSN nodes
or both to control PSN data access in a heterogeneous manner on the basis of attribute-based
encryption. We formally prove the security of our scheme and analyze its communication and
computation complexity. Extensive analysis and performance evaluation based on
implementation show that our proposed scheme is highly efficient and provably secure under
relevant system and security models.
54. SECURE AND EFFICIENT DATA TRANSMISSION FOR CLUSTER-BASED
WIRELESS SENSOR NETWORK
B.Sharmila, M.Parvathi
Department of Computer Science And Engineering
Nandha Engineering College

Secure data transmission is a critical issue for wireless sensor networks (WSNs).
Clustering is an effective and practical way to enhance the system performance of WSNs. In
this paper, we study a secure data transmission for cluster-based WSNs (CWSNs), where the
clusters are formed dynamically and periodically. We propose two secure and efficient data
transmission (SET) protocols for CWSNs, called SET-IBS and SET-IBOOS, by using the
identity-based digital signature (IBS) scheme and the identity-based online/ offline digital
signature (IBOOS) scheme, respectively. In SET-IBS, security relies on the hardness of the
Diffie-Hellman problem in the pairing domain. SET-IBOOS further reduces the
computational overhead for protocol security, which is crucial for WSNs, while its security
relies on the hardness of the discrete logarithm problem. We show the feasibility of the SET-IBS and SET-IBOOS protocols with respect to the security requirements and security
analysis against various attacks. The calculations and simulations are provided to illustrate
the efficiency of the proposed protocols. The results show that the proposed protocols have
better performance than the existing secure protocols for CWSNs, in terms of security overhead
and energy consumption.
55. INFORMATION SECURITY IN BIG DATA: PRIVACY AND DATA MINING
R.Gowthamy, P.Uma
Department of Computer Science And Engineering
Nandha Engineering College

The growing popularity and development of data mining technologies bring a serious threat
to the security of individuals' sensitive information. An emerging research topic in data mining,
known as privacy-preserving data mining (PPDM), has been extensively studied in recent years.
The basic idea of PPDM is to modify the data in such a way so as to perform data mining
algorithms effectively without compromising the security of sensitive information contained in
the data. Current studies of PPDM mainly focus on how to reduce the privacy risk brought by
data mining operations, while in fact, unwanted disclosure of sensitive information may also
happen in the process of data collecting, data publishing, and information (i.e., the data mining
results) delivering. In this paper, we view the privacy issues related to data mining from a wider
perspective and investigate various approaches that can help to protect sensitive information. In
particular, we identify four different types of users involved in data mining applications, namely,
data provider, data collector, data miner, and decision maker. For each type of user, we discuss
his privacy concerns and the methods that can be adopted to protect sensitive information. We
briefly introduce the basics of related research topics, review state-of-the-art approaches, and
present some preliminary thoughts on future research directions. Besides exploring the privacy-preserving approaches for each type of user, we also review the game theoretical approaches,
which are proposed for analyzing the interactions among different users in a data mining
scenario, each of whom has his own valuation on the sensitive information. By differentiating the
responsibilities of different users with respect to security of sensitive information, we would like
to provide some useful insights into the study of PPDM.
56. ISSUES, CURRENT PROPOSALS AND FUTURE ENHANCEMENTS IN WIRELESS
SENSOR NETWORKS
S.Anbumalar, Dr.S.Prabhadevi
Department of Computer Science And Engineering
Nandha Engineering College

Recent developments in wireless sensor networks, wireless communications, and digital
electronics have enabled the development of low-cost, low-power, multifunctional sensor nodes
that are small and free to communicate over short distances. However, it has still remained an
open challenge to deploy sensor nodes in a wireless environment, as we have to deal with innumerable
constraints for their complete implementation. In this paper, a detailed survey has been carried
out to analyze various techniques, which could be used to address present unresolved issues in
wireless sensor networks. In recent times wireless sensor networks have received significant
attention from researchers due to their unlimited potential. However, despite all this effort at
providing improved services in these areas, issues still remain unsolved and many more have
arisen based on the proffered solutions. This paper is focused on presenting to the reader arising
issues in wireless sensor networks. We have concentrated on grouping issues in wireless sensor
networks into broad groups, namely technical issues, design issues, system issues,
communication protocol issues, topology issues, key management issues, other WSN issues
and recent research issues. By solving these issues, we can close the wide gap between
Wireless Sensor Network technology invention and its deployment in the field.
57. INTERFERENCE-BASED TOPOLOGY CONTROL ALGORITHM FOR DELAY-
CONSTRAINED MOBILE AD HOC NETWORKS


R.Saranya , E.Padma
Department of Computer Science And Engineering
Nandha Engineering College

As the foundation of routing, topology control should minimize the interference among
nodes, and increase the network capacity. With the development of mobile ad hoc networks
(MANETs), there is a growing requirement of quality of service (QoS) in terms of delay. In order
to meet the delay requirement, it is important to consider topology control in delay constrained
environment, which is contradictory to the objective of minimizing interference. In this paper,
we focus on the delay-constrained topology control problem, and take into account delay and
interference jointly. We propose a cross-layer distributed algorithm called interference-based
topology control algorithm for delay-constrained (ITCD) MANETs with considering both the
interference constraint and the delay constraint, which is different from the previous work. The
transmission delay, contention delay and the queuing delay are taken into account in the
proposed algorithm. Moreover, the impact of node mobility on the interference-based topology
control algorithm is investigated and the unstable links are removed from the topology. The
simulation results show that ITCD can reduce the delay and improve the performance effectively
in delay-constrained mobile ad hoc networks.
58. SECURE DATA AGGREGATION TECHNIQUE FOR WIRELESS SENSOR
NETWORK IN PRESENCE OF COLLUSION ATTACK
P. Bhuvaneswari , P. Thirumoorthi
Department of Computer Science And Engineering
Nandha Engineering College

Due to limited computational power and energy resources, aggregation of data from
multiple sensor nodes done at the aggregating node is usually accomplished by simple methods
such as averaging. However, such aggregation is known to be highly vulnerable to node
compromising attacks. Since WSNs are usually unattended and without tamper-resistant
hardware, they are highly susceptible to such attacks. Thus, ascertaining trustworthiness of data
and reputation of sensor nodes is crucial for WSNs. As the performance of very low power
processors dramatically improves, future aggregator nodes will be capable of performing more
sophisticated data aggregation algorithms, thus making WSNs less vulnerable. Iterative filtering
algorithms hold great promise for such a purpose. Such algorithms simultaneously aggregate data
from multiple sources and provide trust assessment of these sources, usually in a form of
corresponding weight factors assigned to data provided by each source. In this paper we
demonstrate that several existing iterative filtering algorithms, while significantly more robust
against collusion attacks than the simple averaging methods, are nevertheless susceptible to a
novel sophisticated collusion attack we introduce. To address this security issue, we propose an
improvement for iterative filtering techniques by providing an initial approximation for such
algorithms, which makes them not only collusion robust, but also more accurate and faster
converging.
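
The iterative filtering idea can be sketched in a few lines of Python: the aggregate estimate and the per-sensor weights are refined together, with each weight inversely related to that sensor's deviation from the current estimate. The reciprocal weight rule below is one common variant, used here only for illustration.

readings = [20.1, 19.8, 20.3, 20.0, 35.0]   # the last sensor is compromised
weights = [1.0] * len(readings)
eps = 1e-6                                   # avoids division by zero

for _ in range(10):
    estimate = sum(w * r for w, r in zip(weights, readings)) / sum(weights)
    # sensors far from the consensus estimate lose weight (trust)
    weights = [1.0 / ((r - estimate) ** 2 + eps) for r in readings]

print(round(estimate, 2))   # close to 20: the outlier is down-weighted
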
59. PRIVACY-PRESERVING MULTI-KEYWORD RANKED SEARCH OVER
ENCRYPTED CLOUD DATA

K.Deepa,S.Prabhu
Department of Computer Science And Engineering
Nandha Engineering College

With the advent of cloud computing, data owners are motivated to outsource their
complex data management systems from local sites to commercial public cloud for great
flexibility and economic savings. But for protecting data privacy, sensitive data has to be
encrypted before outsourcing, which obsoletes traditional data utilization based on plaintext
keyword search. Thus, enabling an encrypted cloud data search service is of paramount
importance. Considering the large number of data users and documents in cloud, it is crucial for
the search service to allow multi-keyword query and provide result similarity ranking to meet the
effective data retrieval need. Related works on searchable encryption focus on single keyword
search or Boolean keyword search, and rarely differentiate the search results. In this paper, for
the first time, we define and solve the challenging problem of privacy-preserving multi-keyword
ranked search over encrypted cloud data (MRSE), and establish a set of strict privacy
requirements for such a secure cloud data utilization system to become a reality. Among various
multi-keyword semantics, we choose the efficient principle of coordinate matching, i.e., as
many matches as possible, to capture the similarity between search query and data documents,
and further use inner product similarity to quantitatively formalize such principle for similarity
measurement. We first propose a basic MRSE scheme using secure inner product computation,
and then significantly improve it to meet different privacy requirements in two levels of threat
models. Thorough analysis investigating privacy and efficiency guarantees of proposed schemes
is given, and experiments on the real-world dataset further show proposed schemes indeed
introduce low overhead on computation and communication.
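
A plaintext sketch of the coordinate-matching principle follows; the dictionary and documents are invented placeholders, and the actual MRSE scheme evaluates the same inner product over encrypted vectors so the cloud learns neither keywords nor contents.

dictionary = ["cloud", "privacy", "search", "keyword", "encryption", "ranking"]

def to_vector(words):
    # binary indicator vector over the fixed keyword dictionary
    return [1 if term in words else 0 for term in dictionary]

docs = {"doc1": {"cloud", "privacy", "encryption"},
        "doc2": {"search", "ranking"},
        "doc3": {"cloud", "search", "keyword", "ranking"}}
query = to_vector({"cloud", "search", "ranking"})

scores = {name: sum(q * d for q, d in zip(query, to_vector(words)))
          for name, words in docs.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(name, score)   # doc3 ranks first with 3 matching coordinates
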
60. A STUDY ON MULTIMODAL BIOMETRICS AUTHENTICATION AND
TEMPLATE PROTECTION
N.Aravindhraj, Dr.N.Shanthi
Department of Computer Science And Engineering
Nandha Engineering College

Biometric cryptosystems provide an innovative solution for cryptographic key
generation, encryption as well as biometric template protection. Besides high authentication
accuracy, a good biometric cryptosystem is expected to protect biometric templates effectively,
which requires that helper data does not reveal significant information about the templates. This
work involves the review of several authentication and template protection techniques such as
fuzzy vault, social network analysis, the Delaunay quadrangle mechanism and topology code.
These mechanisms are compared and the results are examined using parameters such as Genuine
Acceptance Rate (GAR), False Acceptance Rate (FAR), False Reject Rate (FRR) and Equal Error
Rate (EER).
61. PREDICTION OF RELATIONSHIP BETWEEN INTEGRATED SURFACE
DROUGHT INDEX (ISDI) DETECTED DROUGHT CONDITION AND CROP YIELD
R.Kalpana, Dr.S.Arumugam
Department of Computer Science And Engineering

Nandha Engineering College

Drought affects a large number of people and causes more losses to society than
other natural disasters. India is a drought-prone country, and the frequent occurrence of
drought poses an increasingly severe threat to Indian agricultural production. Drought is a very
complex phenomenon and it is difficult to quantify accurately, because it has immense
spatial and temporal variability. The existing system implements the ISDI model for
evaluating drought detection accuracy and effectiveness. Although the ISDI model uses a variety
of methods and data, some work still needs to be done in our future research because of the
complex spatial and temporal characteristics of drought. To overcome this limitation, the
performance of drought detection can be measured by using the spatial and temporal
characteristics of the information. We collect datasets from different regions and also collect
time-varying information. In the proposed system, we predict drought conditions by using a
supervised learning mechanism, implemented with a Bayesian supervised machine learning
algorithm. Through this algorithm we can achieve good accuracy and performance, and also
improve the effectiveness of predicting various drought conditions.
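
As a hedged illustration of the proposed supervised step, the sketch below fits a Gaussian naive Bayes classifier mapping drought-index features to a drought class; the feature columns, labels and values are invented placeholders, not the paper's dataset.

import numpy as np
from sklearn.naive_bayes import GaussianNB

# columns: vegetation index, temperature anomaly, rainfall deficit (assumed)
X_train = np.array([[0.70, 0.1, 0.0], [0.65, 0.3, 0.1], [0.40, 1.2, 0.6],
                    [0.35, 1.5, 0.7], [0.20, 2.1, 0.9], [0.25, 1.9, 0.8]])
y_train = ["none", "none", "moderate", "moderate", "severe", "severe"]

model = GaussianNB().fit(X_train, y_train)
print(model.predict(np.array([[0.22, 2.0, 0.85]])))   # ['severe']
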
62. FIDELITY-BASED PROBABILISTIC Q-LEARNING FOR CONTROL OF
QUANTUM SYSTEMS
A.Gomala Priya, S.Sasireka
Department of Computer Science And Engineering
Nandha Engineering College

The balance between exploration and exploitation is a key problem for reinforcement
learning methods, especially for Q-learning. In this paper, a fidelity-based probabilistic Q-learning (FPQL) approach is presented to naturally solve this problem and applied for learning
control of quantum systems. In this approach, fidelity is adopted to help direct the learning
process and the probability of each action to be selected at a certain state is updated iteratively
along with the learning process, which leads to a natural exploration strategy instead of a pointed
one with configured parameters. A probabilistic Q-learning (PQL) algorithm is first presented to
demonstrate the basic idea of probabilistic action selection. Then the FPQL algorithm is
presented for learning control of quantum systems. One example (a spin-1/2 system) is
demonstrated to test the performance of the FPQL algorithm. The results show that FPQL
algorithms attain a better balance between exploration and exploitation, and can also avoid local
optimal policies and accelerate the learning process.
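
The probabilistic action selection idea can be sketched for a one-state toy problem as below; the update constants and the rule for shifting probability mass toward the currently best action are illustrative assumptions, with a scalar reward standing in for the fidelity measure that FPQL uses to direct learning.

import random

actions = ["a0", "a1"]
Q = {a: 0.0 for a in actions}
P = {a: 1.0 / len(actions) for a in actions}   # action-selection probabilities
alpha, k = 0.2, 0.1                            # learning and probability steps (assumed)

def reward(a):
    return 1.0 if a == "a1" else 0.2           # toy task: a1 is better

random.seed(0)
for _ in range(200):
    a = random.choices(actions, weights=[P[x] for x in actions])[0]
    Q[a] += alpha * (reward(a) - Q[a])         # one-state Q update
    best = max(Q, key=Q.get)
    for x in actions:                          # shift mass toward the best action
        P[x] = (1 - k) * P[x] + (k if x == best else 0.0)

print({a: round(P[a], 2) for a in actions})    # exploration decays naturally
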
63. PRIVACY CONSERVATION FRAMEWORK FOR SECURE ENCRYPTED DATA
USING OSN
R.Krishnamoorthy,V.Aruna
Department of Computer Science And Engineering
Nandha Engineering College

We present a secure data sharing scheme in OSNs based on ciphertext-policy attribute-based
proxy re-encryption and secret sharing. This system presents a multiparty access control model,
which enables the disseminator to update the access policy of a ciphertext if their attributes
satisfy the existing access policy. It also provides checkability on the outputs returned from the
OSN service provider to guarantee the correctness of partially decrypted ciphertexts. The
security and performance analysis results indicate that the proposed scheme is secure and
efficient in OSNs. The key-policy attributes are used for describing the encrypted data and the
policy employed in the user's key, while the ciphertext policy is the access structure on the
ciphertext, which can be either monotonic or non-monotonic. We achieve this protection through
a new approach to building secure systems: building practical systems that compute on
encrypted data, without access to the decryption key. In this setting, we designed and built a
database system (CryptDB), a web application platform (Mylar), and two mobile systems, as
well as developed new cryptographic schemes for them.
64. AN ENHANCED AND RELIABLE AUTHENTICATION PROTOCOL AGAINST
KEYLOGGING ATTACKS
B.Amutha ,R.Indhumathi ,M.Jayashree ,M.Mathumathi
Department of Computer Science And Engineering
Sasurie College of Engineering

The project entitled An Enhanced and Reliable Authentication Protocol Against
Keylogging Attacks is designed and developed using Microsoft Visual Studio 2010 as the
front-end tool and MS SQL Server 2008 as the back-end tool. Data access control and data
authentication is an efficient way to ensure the data security in the internet. Online services are
providing an effective solution for sharing information through website. This project initiates the
study of two specific security threats on online security based password authentication in
distributed systems. SMS-based password authentication is one of the most commonly used
security mechanisms to determine the identity of a remote client, who must hold a valid phone
number and the corresponding password to carry out a successful authentication with the server.
The authentication is usually integrated with a key establishment protocol and yields SMS-based
password-authenticated key agreement. The project creates a new authentication scheme with
conditional key verification. In sensitive web sites like defense and banking, users need to enter
their passwords and other sensitive details for access, but this information can be stolen
by adversaries. In this case the proposed system provides a new authentication scheme
against such password and information stealing attacks using QR images and mobile devices.
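
A sketch of the one-time-password primitive that such a scheme combines with its other factors is shown below; it follows the standard HMAC-based OTP construction (HOTP, RFC 4226), with the shared secret and counter handling simplified for illustration.

import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                    # dynamic truncation
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

secret, counter = b"shared-secret", 42
sent_by_sms = hotp(secret, counter)            # server generates and texts this
print(hmac.compare_digest(sent_by_sms, hotp(secret, counter)))   # True
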

65. AN ADROIT SURVEILLANCE CAMERA WITH K-MEANS ALGORITHM


K.Aravinth, S.Gowtham Kumar, S.Gowtham, T.Kapil
Department of Computer Science And Engineering
Sasurie College of Engineering

Intelligent video surveillance systems deal with the real-time monitoring of persistent
and transient objects within a specific environment. An automated video surveillance and
alarming system provides surveillance and alerts the security guard of any undesired activity via
his cell phone. It would be a promising replacement of traditional human video surveillance
system. It provides a high degree of security. Detection of moving objects in video streams is the
first relevant step of information extraction in many computer vision applications. Aside from the
intrinsic usefulness of being able to segment video streams into moving and background
components, detecting moving objects provides a focus of attention for recognition,
classification, and activity analysis, making these later steps more efficient. We propose an
approach based on self-organization through artificial neural networks, widely applied in human
image processing systems and more generally in cognitive science. The proposed approach can
handle scenes containing moving backgrounds, gradual illumination variations and camouflage;
it has no bootstrapping limitations and can include in the background model shadows cast by
moving objects.
66. AUTOMATIC SUMMARISATION OF BUG REPORT
Deivasigamani.M, Mukesh Kumar.K, Bibin K Biju, Jafar Shadiq.S
Department of Computer Science And Engineering
Sasurie College of Engineering

A bug tracking and reporting system, or defect tracking system, is a software application
that keeps track of reported software bugs in software development projects. It may be regarded
as a type of issue tracking system. A Bug Tracking System enables Defects to be detected;
it not merely detects the Defects but provides complete information regarding the
Defects detected. The Bug Tracking System ensures that a user who needs to know about an
identified Defect is provided with information regarding it. Using this, no Defect will remain
unfixed in the developed application. The developer develops the project as per customer
requirements. The Defect details in the database table are accessible to both the project manager
and the developer.
67. AUTOMATIC LICENSE PLATE RECOGNITION USING ADAPTIVE
THRESHOLDING AND CC ANALYSIS TECHNIQUE
Aiswarya.V, Nandhine Shree.N, Rajesh Kumar.M, Prasanna Manikandan.S
Department of Information Technology
Karpagam Institute of Technology

The project presents license plate recognition system using connected component
analysis and template matching model for accurate identification. Automatic license plate
recognition (ALPR) is the extraction of vehicle license plate information from an image and
verify with stored license plate samples. The system model uses already captured images for this
recognition process. First the recognition system starts with character identification based on
number plate extraction, Splitting characters and template matching. Here, Adaptive thresholding
used to suppress background and detect foreground region having brighter pixels. Morphological
filtering is used to reduce noise objects from background. Connected component analysis is
utilized to split the segmented objects for extraction of individual objects geometric features.
ALPR as a real life application has to quickly and successfully process license plates under
different environmental conditions, such as indoors, outdoors, day or night time. It plays an
important role in numerous real-life applications, such as automatic toll collection, traffic law
enforcement, parking lot access control, and road traffic monitoring. The system uses different
templates for identifying the characters from input image. After character recognition, an
identified group of characters will be compared with database number plates for authentication.
The proposed model has low complexity and is less time consuming in terms of number plate
segmentation and character recognition. This can improve the system performance and make the
system more efficient by taking relevant samples.
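
The preprocessing chain described above can be sketched with OpenCV as follows; the block size, kernel, and character-size limits are illustrative assumptions, and "plate.png" is a placeholder input image.

import cv2
import numpy as np

gray = cv2.imread("plate.png", cv2.IMREAD_GRAYSCALE)
# adaptive thresholding suppresses the background locally
binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY_INV, 21, 10)
# morphological opening removes small noise objects
clean = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

# connected component analysis isolates character-sized regions
n, labels, stats, _ = cv2.connectedComponentsWithStats(clean)
chars = []
for i in range(1, n):                          # label 0 is the background
    x, y, w, h, area = stats[i]
    if 10 < h < 100 and 5 < w < 80 and area > 50:   # assumed character sizes
        chars.append((x, gray[y:y + h, x:x + w]))
chars.sort(key=lambda c: c[0])                 # left-to-right reading order
print(len(chars), "candidate characters for template matching")
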
68. PRIVACY PRESERVING PUBLIC AUDITING FOR SHARED DATA IN THE
CLOUD USING MD5
A.Afzal Ahmed,K.Jayaram,R.Logeshwaran,K.SamGodwin, P.UmaMaheshwari
Department of Information Technology
Karpagam Institute of Technology

Using cloud storage, users can remotely store their data and enjoy the on-demand high-quality applications and services from a shared pool of configurable computing resources without
the burden of local data storage and maintenance. However, the fact that users no longer have
physical possession of the outsourced data makes the data integrity protection in cloud computing a
formidable task, especially for users with constrained computing resources. Moreover, users should
be able to just use the cloud storage as if it is local, without worrying about the need to verify its
integrity. Sharing data in a multi-owner manner while preserving data and identity privacy from
an untrusted cloud is still a challenging issue, due to the frequent change of the membership. Data
access control is an effective way to ensure the data security in the cloud. Due to data outsourcing
and untrusted cloud servers, the data access control becomes a challenging issue in cloud storage
systems. MD5 is regarded as one of the most suitable technologies for data access control in cloud
storage, because it gives data owners more direct control on access policies. In this paper, we
design an expressive, efficient data access control scheme for multi-authority cloud storage
systems, where there are multiple authorities co-exist and each authority is able to issue attributes
independently. This efficiently audits the integrity of shared data with dynamic groups.
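
A minimal sketch of the MD5 integrity primitive underlying such auditing is given below; a full public-auditing scheme verifies integrity without retrieving the data, so this only illustrates the digest comparison itself.

import hashlib

def md5_digest(block: bytes) -> str:
    return hashlib.md5(block).hexdigest()

stored_digest = md5_digest(b"shared file block v1")   # recorded at upload time

retrieved = b"shared file block v1"
print("intact:", md5_digest(retrieved) == stored_digest)    # True

tampered = b"shared file block v2"
print("intact:", md5_digest(tampered) == stored_digest)     # False
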
69. SECURED DATA AGGREGATION SCHEME IN SENSOR NETWORKS
K.E.Eswari,K.Gunasekar
Department of Computer Science And Engineering
Nandha Engineering College

Mobile devices such as smart phones are gaining an ever-increasing popularity. Most
smart phones are equipped with a rich set of embedded sensors such as camera, microphone,
GPS, accelerometer, ambient light sensor, gyroscope, and so on. The data generated by these
sensors provide opportunities to make sophisticated inferences about not only people but also
their surroundings and thus can help improve people's health as well as life. This paper studies
how an untrusted aggregator in mobile sensing can periodically obtain desired statistics over the
data contributed by multiple mobile users, without compromising the privacy of each user.
Although there are some existing works in this area, they either require bidirectional
communications between the aggregator and mobile users in every aggregation period, or have
high-computation overhead or cannot support large plaintext spaces. Also, they do not consider
the Min aggregate, which is quite useful in mobile sensing. To address these problems, we
propose an efficient protocol to obtain the Sum aggregate, which employs an additive
homomorphism encryption and a novel key management technique to support large plaintext
space. We also extend the sum aggregation protocol to obtain the Min aggregate of time-series
data. The proposed scheme has three contributions. First, it is designed for a multi-application
environment. The base station extracts application-specific data from aggregated cipher texts.
Next, it mitigates the impact of compromising attacks in single application environments.
Finally, it degrades the damage from unauthorized aggregations.
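
The additive cancellation that makes such Sum aggregation possible can be illustrated with simple modular masking, where per-user key shares sum to zero so the aggregator recovers only the total; the paper's protocol builds this from additive homomorphic encryption, for which the sketch below is a simplified stand-in.

import random

M = 2 ** 32                        # plaintext space large enough for the sum
readings = [12, 7, 25, 3]          # one private value per mobile user

random.seed(7)
keys = [random.randrange(M) for _ in readings[:-1]]
keys.append((-sum(keys)) % M)      # key shares sum to 0 mod M

ciphertexts = [(r + k) % M for r, k in zip(readings, keys)]
aggregate = sum(ciphertexts) % M   # masks cancel; individual readings stay hidden
print(aggregate, "==", sum(readings))
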
70. AUTHENTICATE EMAIL BY FRACTAL RECOGNITION AND SECURE SPAM
FILTERING
S.Merlin Subidha
Department of Computer Science And Engineering
Jerusalem Engineering College

Effective network security targets a variety of threats and stops them from entering or
spreading on your network. Network security consists of the provisions and policies to prevent
and monitor unauthorized access, misuse, modification, or the denial of computer network and
network accessible resources. Network Security hinges on simple goals such as keeping
unauthorized persons from gaining access to resources and ensuring that authorized persons can
access the resource they need. Authentication is the process of confirming the identification of a
user that is trying to log on or access resources. Network operating systems require that a user be
authenticated in order to log onto the network. This can be done by entering a password,
inserting a smartcard and entering the associated PIN, or providing a fingerprint, voice pattern
sample, or retinal scan, etc., but still the user faces insecurity in authentication. Spam is
increasingly a core problem affecting network security and performance. Indeed, it has been
estimated that 80% of all email messages are spam. In the existing system the facial recognition
technique is introduced for authentication and spam filtering technique is implemented for
detecting spam mail. However, these techniques are not effective because, in case of any changes
in the face, the system results in false authentication, and in case of spam detection, spammers
use different names or words to intrude into the user's mail. The proposed system introduces a
fractal recognition technique for authentication and blocks the spammer's domain to avoid
intrusion. In the fractal recognition technique the user's skull is detected for providing effective
authentication, and in spam blocking the spammer's domain ID is traced and blocked for
providing secure spam filtering. This project ensures efficient similarity matching, reducing
storage utilization in mails and securing email from spam.
