Volume 5 Number 3 December 2015

1

Received Signal Strength Based Energy and Location Aware Cluster Routing Protocol (RSSELACRP) for Wireless Sensor Networks
K. Vignesh, Dr. N. Radhika

Abstract: This article proposes a location-aware protocol termed the received signal strength, energy and location aware cluster routing protocol (RSSELACRP), based on signal strength and residual energy. Among the wireless sensor nodes, the cluster-head node is chosen based on residual energy and received signal strength. Clustering is performed using the nodes' location information obtained from GPS. Relay node selection is carried out over a chain of cluster-head nodes, and an adaptive inter-cluster mechanism is proposed. Performance metrics such as the number of packets delivered to the sink, energy utilization rate, energy standard deviation and average hop count are used to evaluate RSSELACRP against the AELAR [13] and ELACRP protocols. Simulation results show that RSSELACRP outperforms AELAR [13] and ELACRP in terms of the chosen performance metrics.
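The cluster-head rule described in the abstract can be sketched as a weighted score over residual energy and received signal strength. The weights, field names and RSS normalization range below are illustrative assumptions, not values from the paper.

```python
# Hypothetical cluster-head selection: prefer nodes with more residual
# energy and stronger received signal strength (RSS). Weights are assumed.

def select_cluster_head(nodes, w_energy=0.6, w_rss=0.4):
    """Pick the node with the best weighted score of residual energy
    (joules, assumed already in [0, 1]) and RSS (dBm)."""
    def score(node):
        # Normalize RSS from an assumed range of [-100, -30] dBm to [0, 1].
        rss_norm = (node["rss_dbm"] + 100) / 70
        return w_energy * node["energy_j"] + w_rss * rss_norm
    return max(nodes, key=score)

nodes = [
    {"id": "n1", "energy_j": 0.8, "rss_dbm": -72},
    {"id": "n2", "energy_j": 0.5, "rss_dbm": -40},
    {"id": "n3", "energy_j": 0.9, "rss_dbm": -65},
]
head = select_cluster_head(nodes)
```

Here n3 wins: its high residual energy outweighs n2's stronger signal under the assumed weighting.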

2

Artificial Neural Network Architectures for Solving the Double Dummy Bridge Problem in an Imperfect Information Game
Dr. M. Dharmalingam, Dr. R. Amalraj

Abstract: Contract Bridge is an intelligent game that enhances creativity and demands varied skills to master its intricacies, because no player knows exactly what moves the other players are capable of during their turns. Bridge, being a game of imperfect information, is hard to define exactly, since the outcome at any intermediate phase depends purely on the decision made at the immediately preceding stage. The key idea behind the Double Dummy Bridge Problem (DDBP) implemented with the neural network paradigm is that an Artificial Neural Network (ANN) architecture is trained on sample deals and used to estimate the number of tricks to be taken by one pair of bridge players. This study mainly focuses on the Cascade-Correlation Neural Network (CCNN) and the Elman Neural Network (ENN), which are used to solve the Bridge problem with the Resilient Back-Propagation (R-prop) algorithm. The Work Point Count System (WPCS), an exclusive and popular system, is used to bid the final contract in the Bridge game.
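The Work Point Count System mentioned in the abstract assigns the standard high-card points A=4, K=3, Q=2, J=1, summed over a 13-card hand. The evaluation below is the standard count; its use as a feature for the paper's networks is our reading, not a stated detail of the paper.

```python
# Standard Work (Milton Work) high-card point count, summed over a hand.
HCP = {"A": 4, "K": 3, "Q": 2, "J": 1}

def work_points(hand):
    """hand: list of card ranks, e.g. ['A', 'K', '7', ...]; suits ignored."""
    return sum(HCP.get(rank, 0) for rank in hand)

# 13-card example hand (suits omitted): two aces, two kings, a queen, a jack.
hand = ["A", "K", "Q", "J", "10", "9", "5", "4", "3", "2", "A", "K", "7"]
pts = work_points(hand)  # 4+3+2+1+4+3 = 17
```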

3

An Improved Cuckoo Search Based Robust Ensemble Co-Clustering Algorithm (ICS-RECCA) for Gene Data Clustering
Ms. R. Rajeswari, Dr. G. GunaSekaran

Abstract: This research work proposes a system with an improved cuckoo search based robust ensemble co-clustering algorithm (ICS-RECCA) for enzyme clustering. The cuckoo search algorithm is inspired by the obligate brood parasitism of some cuckoo species, which lay their eggs in the nests of host birds of other species; some host birds engage in direct conflict with the intruding cuckoos. The improved cuckoo search optimization technique is applied to a spectral co-clustering ensemble with constructive mathematical modeling. The proposed algorithm (ICS-RECCA) is capable of performing co-clustering with the objective function as the primary component. Simulation results show that the proposed ICS-RECCA mechanism performs better in terms of accuracy and computation time.
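A minimal sketch of the plain cuckoo search loop the abstract builds on: each cuckoo lays a new solution via a random step, the egg replaces a randomly chosen nest only if it is better, and a fraction p_a of the worst nests is abandoned each generation. The Gaussian step (a stand-in for a Lévy flight), parameters and test function are assumptions; the paper's improved variant and co-clustering objective are not shown.

```python
import random

def cuckoo_search(f, dim=2, n_nests=15, p_a=0.25, generations=200, seed=1):
    """Minimize f over R^dim with a basic cuckoo search sketch."""
    rng = random.Random(seed)
    nests = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_nests)]
    for _ in range(generations):
        for i in range(n_nests):
            # New egg via a Gaussian step (stand-in for a Levy flight).
            new = [x + rng.gauss(0, 0.5) for x in nests[i]]
            j = rng.randrange(n_nests)
            if f(new) < f(nests[j]):           # keep only if better
                nests[j] = new
        # Abandon the worst fraction p_a of nests; rebuild them randomly.
        nests.sort(key=f)
        for k in range(int(n_nests * (1 - p_a)), n_nests):
            nests[k] = [rng.uniform(-5, 5) for _ in range(dim)]
    return min(nests, key=f)

sphere = lambda v: sum(x * x for x in v)
best = cuckoo_search(sphere)
```

Because a nest is only ever replaced by a better egg and the best nest is never abandoned, the best objective value is non-increasing over generations.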

4

ETCOT: A Novel Architecture for Proficient Web Information Retrieval
A. V. Seetha Lakshmi, Dr. S. P. Victor, MCA, ME, Ph.D

Abstract: Web-based data retrieval must interpret content with respect to keywords, and day-to-day activity leads to a huge collection of information on the web. IR systems help retrieve the necessary information from massive databases over the internet. The key concern of this paper is to reduce the number of assessments, which in turn reduces time consumption and yields an optimized search result. This difficulty is addressed using a novel architecture known as the Enhanced Theme Condensation and Optimization Technique (ETCOT) for information retrieval. The technique optimizes web content to reduce assessment analysis; its key objective is to obtain the required information from the IR system using a categorization technique that summarizes the contents of the web into a normalized form. The assessment analysis is minimized by condensing content that recurs across various web pages, a process known as theme condensation. The new technique thus deals with IR systems in a formulated manner and provides an optimized search outcome.

5

A Novel Approach of Implementing Radix Sort by Avoiding Zero Bits Based on Divide and Conquer Technique
T. Mathialakan, S. Mahesan

Abstract: Sorting plays a major role in theoretical computation and makes data analysis easy. Even though existing algorithms show good performance in time and space consumption, new techniques are still being devised. Radix sort, an efficient non-comparative integer sorting algorithm that clusters on individual digits, appears to take O(n) time, whereas the best comparison-based algorithms take O(n log n). In order to outperform existing algorithms, we introduce a technique inspired by radix sort. The technique divides a set of integers into two groups with respect to the bit in the same position of their binary representations; this step is repeated within each subgroup until the list is sorted. The most significant non-zero bit of the maximum value is used as the pivot bit for the partition in each step, which avoids running through leading zero bits. This consumes O(cn) time, where c is significantly less than log n.
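The partitioning scheme described above can be sketched directly: partition on the most significant set bit of the maximum value, recurse on each half with the next lower bit, and never visit the leading zero bits. This follows the abstract's description; the recursive list-based form is our reading of it, not the paper's implementation.

```python
def bit_partition_sort(values):
    """Sort non-negative integers by recursive binary partitioning,
    starting at the highest non-zero bit of the maximum value."""
    if len(values) <= 1:
        return list(values)
    pivot_bit = max(values).bit_length() - 1   # skip all leading zero bits
    return _sort(values, pivot_bit)

def _sort(values, bit):
    if len(values) <= 1 or bit < 0:
        return list(values)
    # Stable split on the current bit, then recurse with the next lower bit.
    zeros = [v for v in values if not (v >> bit) & 1]
    ones = [v for v in values if (v >> bit) & 1]
    return _sort(zeros, bit - 1) + _sort(ones, bit - 1)

data = [170, 45, 75, 90, 2, 802, 24, 66]
result = bit_partition_sort(data)
```

Each element is examined only at bit positions below the pivot bit, which is the source of the claimed O(cn) behavior when the values' significant bit-width c is small.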

6

An Efficient Bayes Classifiers Algorithm on 10-fold Cross Validation for Heart Disease Dataset
R. Nithya, Dr. D. Ramyachitra, P. Manikandan

Abstract: Classification techniques forecast categorical labels, while prediction models predict continuous-valued functions. Generally, classification is the process of organizing data into categories for its most effective use; the data classification method makes essential data easy to find and retrieve. In this paper the performance of three Bayes classifier algorithms, namely Naïve Bayes, Bayes Net and Naïve Bayes Multinomial, is analyzed. The heart disease dataset is used to estimate the performance of the algorithms using the cross validation parameter. Finally, a comparative analysis of all algorithms is performed using factors such as classification accuracy and error rate.
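The evaluation scheme described above can be sketched as k-fold cross validation wrapped around a simple categorical Naïve Bayes classifier. The toy data, Laplace smoothing and binary-valued-feature denominator are illustrative assumptions; the paper uses the heart disease dataset and Weka-style Bayes variants.

```python
from collections import Counter, defaultdict

def nb_train(X, y):
    priors = Counter(y)                       # class frequencies
    cond = defaultdict(Counter)               # (feature idx, class) -> value counts
    for row, label in zip(X, y):
        for i, v in enumerate(row):
            cond[(i, label)][v] += 1
    return priors, cond

def nb_predict(model, row):
    priors, cond = model
    def score(c):
        p = priors[c]
        for i, v in enumerate(row):
            counts = cond[(i, c)]
            # Laplace smoothing; "+2" assumes binary-valued features.
            p *= (counts[v] + 1) / (sum(counts.values()) + 2)
        return p
    return max(priors, key=score)

def cross_val_accuracy(X, y, k=10):
    folds = [range(i, len(X), k) for i in range(k)]   # interleaved folds
    accs = []
    for fold in folds:
        test = set(fold)
        Xtr = [X[i] for i in range(len(X)) if i not in test]
        ytr = [y[i] for i in range(len(X)) if i not in test]
        model = nb_train(Xtr, ytr)
        hits = sum(nb_predict(model, X[i]) == y[i] for i in test)
        accs.append(hits / len(test))
    return sum(accs) / k

# Toy perfectly-separable data standing in for the heart disease dataset.
X = [("high",)] * 10 + [("low",)] * 10
y = ["sick"] * 10 + ["well"] * 10
acc = cross_val_accuracy(X, y)
```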

7

Implementation of a High-Quality Image Scaling Processor
Jennifer Eunice R.

Abstract: A high-quality algorithm is proposed for VLSI implementation of an image scaling processor. Scaling is a non-trivial process that involves a trade-off among efficiency, smoothness and sharpness. Enlarging an image (up-sampling, or interpolation) is commonly used to make smaller images fit a full-screen display. The proposed algorithm comprises a spatial sharpening filter, a clamp filter and a bilinear interpolator. The sharpening filter and the clamp filter together act as a pre-filter that removes blurring and aliasing effects from the bilinear interpolator. To minimize the memory buffers and computational complexity of the proposed design, both the sharpening and clamp filters are realized with T-model and inverse T-model filter structures, which reduce the memory buffer requirement to a single line buffer.
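A minimal software sketch of the bilinear interpolation stage of the pipeline above; the sharpening and clamp pre-filters, and of course the hardware T-model structure, are omitted. The coordinate mapping (aligning corner pixels) is one common convention, assumed here.

```python
def bilinear_upscale(img, new_h, new_w):
    """img: 2D list of grayscale values; returns a new_h x new_w image.
    Assumes new_h >= 2 and new_w >= 2 (corner-aligned mapping)."""
    h, w = len(img), len(img[0])
    out = []
    for r in range(new_h):
        y = r * (h - 1) / (new_h - 1)          # map output row to source coords
        y0 = int(y); y1 = min(y0 + 1, h - 1); fy = y - y0
        row = []
        for c in range(new_w):
            x = c * (w - 1) / (new_w - 1)
            x0 = int(x); x1 = min(x0 + 1, w - 1); fx = x - x0
            # Blend horizontally on both rows, then vertically.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

src = [[0, 100], [100, 200]]
big = bilinear_upscale(src, 3, 3)   # center pixel becomes the corner average
```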

8

Integrated Attribute Based Multi Level Encryption Framework for Improved Cloud Security using Hybrid Algorithms TOR-RCT-TREM
V. Poongodi, Dr. K. Thangadurai

Abstract: The growth of service oriented architecture (SOA) has had a great impact on different application domains. The cloud environment is such an SOA, providing services at different levels to solve the problem of data management; users can access these services to store and retrieve data whenever necessary. In such a loosely coupled environment, data security is essential, and various encryption standards have been described earlier to provide it. These earlier methods suffer from the problem of securing data against unauthorized access and various threats. To address these issues across various cloud QoS parameters, we propose an integrated framework that enforces an attribute based multi-level encryption standard. The method classifies the data attributes into three classes and enforces a different encryption standard for each class. The proposed model improves security performance and reduces the overall time complexity across various factors.
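The attribute-class dispatch idea above can be sketched as: assign each attribute to one of three sensitivity classes, then route each class to its own cipher. The class table, the default class and the stand-in cipher functions are all placeholders; the abstract does not detail the hybrid TOR-RCT-TREM algorithms, and a real deployment would use vetted primitives (e.g. AES-GCM) per class.

```python
# Hypothetical sensitivity table: 1 = low, 2 = medium, 3 = high.
SENSITIVITY = {"name": 1, "city": 1, "email": 2, "ssn": 3, "card_no": 3}

def classify_attribute(attr):
    """Return the sensitivity class; unknown attributes default to medium."""
    return SENSITIVITY.get(attr, 2)

def encrypt_record(record, ciphers):
    """ciphers: dict mapping class -> encryption function (placeholder)."""
    return {attr: ciphers[classify_attribute(attr)](value)
            for attr, value in record.items()}

# Stand-in "ciphers" for illustration only, NOT real encryption.
ciphers = {1: lambda v: v[::-1], 2: str.upper, 3: lambda v: "*" * len(v)}
out = encrypt_record({"name": "alice", "ssn": "123456789"}, ciphers)
```

The point of the dispatch is that the expensive cipher is paid only for the high-sensitivity attributes, which is where the abstract's claimed time savings would come from.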

9

Contour Generalization Based Nearest Neighbor Approximation for Mining Uncertain Data
M. Kalavathi, Dr. P. Suresh

Abstract: Recently, data uncertainty has become an accepted topic in the database and data mining areas due to the enormous amount of uncertainty involved. Previous methods extend traditional spatiotemporal tolerance to cope with data uncertainty; they use Continuous Range Queries and Superseding Nearest Neighbor search and thus rely on conventional multidimensional indexes. Such methods cannot handle uncertain objects in NN classifier based continuous queries. At the same time, the nearest neighbor, an essential characteristic of uncertain objects, has not been considered when measuring contour segments in uncertain data. In this paper, a Contour Generalization based NN approximation (CG-NN Approximation) technique is developed with the objective of solving the spatiotemporal tolerance problem. The Contour Generalization technique reduces the overhead of analyzing NN based continuous queries and uses a distance function to support high dimensionality in spatiotemporal queries. An optimal Contour Generalization algorithm produces an optimal generalization for every input query point. CG-NN Approximation considers various types of queries on an uncertain database, determines the query type, and applies a distance function to answer queries on uncertain data. Contour Generalization approximates a polygonal contour so that it is sufficiently close (i.e., a nearest neighbor) while having fewer contour segments; fewer segments take less storage space and thus minimize the computational overhead. Experimental evaluation is measured in terms of computation overhead, spatiotemporal tolerance, storage space and query processing efficiency. The analysis shows that the CG-NNA technique reduces the computation overhead for continuous queries on uncertain data by 19.35% and the average operation cost by 21.24% compared to state-of-the-art works.
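The abstract does not give the contour generalization algorithm itself; the classic Douglas-Peucker simplification below illustrates the same goal, approximating a polygonal contour with fewer segments while staying within a distance tolerance of the original. It is an illustration of contour simplification in general, not the paper's CG algorithm.

```python
import math

def _point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * (px - ax) - dx * (py - ay)) / math.hypot(dx, dy)

def simplify(points, tol):
    """Douglas-Peucker: drop points closer than tol to the chord."""
    if len(points) <= 2:
        return list(points)
    # Find the point farthest from the chord between the endpoints.
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _point_line_dist(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tol:                       # whole span fits one segment
        return [points[0], points[-1]]
    left = simplify(points[:idx + 1], tol)
    right = simplify(points[idx:], tol)
    return left[:-1] + right              # avoid duplicating the split point

contour = [(0, 0), (1, 0.05), (2, -0.04), (3, 1.5), (4, 0.02), (5, 0)]
reduced = simplify(contour, tol=0.1)
```

The near-collinear points are dropped while the prominent vertex at (3, 1.5) survives, so the reduced contour needs less storage, mirroring the storage-space argument in the abstract.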

10

BPN Based Ensemble Classifier for Gas Sensor Array Drift Dataset
D. Arul Pon Daniel, K. Thangavel

Abstract: Much work has been done on classification over the past fifteen years to develop adapted techniques and robust algorithms. The problem of data correction in the presence of simultaneous sources of drift, other than sensor drift, should also be investigated, since this is often the case in practical situations. Most classification systems, however, do not work on the gas sensor domain, where the benefit of correctly classifying chemical components and the cost of misclassification differ for every pair of predicted and actual classes. BPN is a competitive machine learning technique that has been applied for classification in different domains. In this paper a BPN has been implemented for the Gas Sensor Array Drift Dataset. The experimental results show that the BPN classifies the drift dataset with an average accuracy of 97%, higher than the other classifiers. The proposed method is compared with C4.5 and SVM.
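A minimal back-propagation network (BPN) of the kind applied above: one hidden sigmoid layer trained by online gradient descent. The toy logical-OR task, layer sizes and learning rate are illustrative assumptions; the paper's network for the multi-feature gas sensor data would be considerably larger.

```python
import math, random

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def train_bpn(data, n_hidden=4, lr=0.5, epochs=2000, seed=0):
    """data: list of (input list, target in [0,1]); returns a predictor."""
    rng = random.Random(seed)
    n_in = len(data[0][0])
    # Weight rows include a trailing bias weight (input vector gets a 1 appended).
    w1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hidden)]
    w2 = [rng.uniform(-1, 1) for _ in range(n_hidden + 1)]
    def forward(x):
        h = [sigmoid(sum(w * v for w, v in zip(row, x + [1]))) for row in w1]
        o = sigmoid(sum(w * v for w, v in zip(w2, h + [1])))
        return h, o
    for _ in range(epochs):
        for x, t in data:
            h, o = forward(x)
            d_o = (o - t) * o * (1 - o)                # output delta
            for j in range(n_hidden):
                d_h = d_o * w2[j] * h[j] * (1 - h[j])  # hidden delta (old w2)
                for i in range(n_in):
                    w1[j][i] -= lr * d_h * x[i]
                w1[j][n_in] -= lr * d_h                # hidden bias
            for j in range(n_hidden):
                w2[j] -= lr * d_o * h[j]
            w2[n_hidden] -= lr * d_o                   # output bias
    return lambda x: forward(x)[1]

or_gate = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
predict = train_bpn(or_gate)
```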