Volume 1 Number 4 March 2012


Student Performance Prediction Modeling: A Bayesian Network Approach
M. Ramaswami, R. Rathinasabapathy

Abstract: In recent times, the prediction of students' academic performance in the higher secondary level examination has remained a difficult task. It requires a vast number of academic, socio-economic and other environmental factors to propose an efficient prediction model. Moreover, the selection of an appropriate algorithm for model construction is also a non-trivial process. To this end, we explore the use of Bayesian networks for predicting the academic performance of higher secondary students in India, based on the values of socio-economic and other academic attributes. We present an analysis of data obtained from students of higher secondary schools, containing 35 attributes and 5650 data objects. The paper explains the application of the Bayesian approach in the field of education and shows that the Bayesian network classifier has the potential to be used as a tool for predicting students' academic performance.
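The abstract does not fix a particular network structure, so as a minimal illustration, a naive Bayes model (the simplest Bayesian network, where all attributes are conditionally independent given the result) can be sketched over categorical student attributes. The toy records and attribute names below are invented for illustration, not taken from the paper's 35-attribute dataset.

```python
from collections import Counter, defaultdict

# Toy records: (parental_education, attendance, result) -- illustrative only.
data = [
    ("low", "poor", "fail"), ("low", "good", "pass"),
    ("high", "good", "pass"), ("high", "poor", "pass"),
    ("low", "poor", "fail"), ("high", "good", "pass"),
]

def train(records):
    """Estimate class counts and per-attribute value counts per class."""
    prior = Counter(r[-1] for r in records)
    cond = defaultdict(Counter)          # (attr_index, class) -> value counts
    for r in records:
        for i, v in enumerate(r[:-1]):
            cond[(i, r[-1])][v] += 1
    return prior, cond

def predict(prior, cond, attrs):
    """Return the class maximising the smoothed naive Bayes score."""
    total = sum(prior.values())
    def score(c):
        s = prior[c] / total
        for i, v in enumerate(attrs):
            counts = cond[(i, c)]
            # Add-one smoothing so unseen values do not zero the product.
            s *= (counts[v] + 1) / (sum(counts.values()) + len(counts) + 1)
        return s
    return max(prior, key=score)

prior, cond = train(data)
print(predict(prior, cond, ("low", "poor")))   # -> "fail"
```

A full Bayesian network would additionally model dependencies between the attributes themselves, which naive Bayes deliberately ignores.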


Geometric Attack Resistant Image Watermarking Scheme For Providing High Security
J. Veerappan, G. Pitchammal

Abstract: The main aim of this work is to provide a color-image watermarking algorithm that can withstand attacks such as rotation, scaling and translation. In existing watermarking algorithms, the exploited robust features are more or less tied to pixel position, so they cannot remain robust against these attacks. To solve this problem, this work focuses on position-independent parameters rather than pixel positions for watermarking. Two statistical features, the histogram shape and the mean of the Gaussian-filtered low-frequency component of the image, are used in the proposed scheme to make the watermarking algorithm robust to such attacks; an interpolation technique is also used to increase the number of bits that can be embedded.
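The abstract does not spell out the embedding rule, but a common histogram-shape approach encodes one watermark bit in the population ratio of a pair of adjacent intensity bins, since the global histogram is unchanged by pixel reordering. The bin indices and pixel list below are illustrative assumptions, not the paper's actual parameters.

```python
# Minimal sketch of histogram-based embedding: one watermark bit is encoded in
# the population ratio of two adjacent intensity bins (an assumed rule).

def embed_bit(pixels, bit, bin_a=100, bin_b=101):
    """Move pixels between two adjacent bins so that count(bin_a) >= count(bin_b)
    encodes bit 1, and count(bin_a) < count(bin_b) encodes bit 0."""
    out = list(pixels)
    a = sum(1 for p in out if p == bin_a)
    b = sum(1 for p in out if p == bin_b)
    want_a_ge_b = (bit == 1)
    for i, p in enumerate(out):
        if want_a_ge_b and a < b and p == bin_b:
            out[i] = bin_a; a += 1; b -= 1
        elif not want_a_ge_b and a >= b and p == bin_a:
            out[i] = bin_b; a -= 1; b += 1
    return out

def extract_bit(pixels, bin_a=100, bin_b=101):
    a = sum(1 for p in pixels if p == bin_a)
    b = sum(1 for p in pixels if p == bin_b)
    return 1 if a >= b else 0

marked = embed_bit([100, 100, 101, 101, 101, 50], 1)
print(extract_bit(marked))           # -> 1
print(extract_bit(sorted(marked)))   # still 1: the histogram ignores position
```

Because extraction depends only on bin counts, any attack that permutes pixel positions (rotation, translation) leaves the embedded bit intact.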


Two-Dimensional Object Recognition using SVM-KNN Augmented by Local and Global Features
R. Muralidharan, C. Chandrasekar

Abstract: The performance of SVM-KNN as a classifier for recognizing 2D objects in images is discussed in this paper. The classifier is supported by features extracted from the image. The feature vector proposed in this paper is a fusion of local and global features. For feature vector formation, Hu's moment invariants, which are invariant to translation, rotation and scaling, are computed to represent the image as a global feature, while the Hessian-Laplace detector and the PCA-SIFT descriptor supply the local features. Since the constructed feature vector is high-dimensional, which reduces the performance of the classifier, Kernel Principal Component Analysis is applied to reduce the size of the feature vector without losing the important features. The proposed method is implemented in MATLAB and tested on the COIL-100 and CALTECH 101 databases, and results are reported for both. The experiment is also performed on noisy versions of the images from both databases. To demonstrate the efficiency of the proposed method, a Back-Propagation Neural Network (BPN) model is also evaluated and comparative results are given.
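The global feature named above is well defined: Hu's invariants are polynomials in the normalised central moments of the image. As a sketch, the first two of the seven invariants can be computed for a toy binary image; equality under translation is easy to verify directly.

```python
# First two Hu moment invariants (phi1, phi2) of a binary image, computed
# from normalised central moments. The 4x4 images below are toy examples.

def central_moment(img, p, q):
    m00 = sum(v for row in img for v in row)
    mx = sum(x * v for row in img for x, v in enumerate(row)) / m00
    my = sum(y * v for y, row in enumerate(img) for v in row) / m00
    return sum((x - mx) ** p * (y - my) ** q * v
               for y, row in enumerate(img) for x, v in enumerate(row))

def hu12(img):
    m00 = central_moment(img, 0, 0)
    def eta(p, q):  # normalised central moment
        return central_moment(img, p, q) / m00 ** (1 + (p + q) / 2)
    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return phi1, phi2

square = [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
shifted = [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
print(hu12(square))    # identical for both: translation invariance
print(hu12(shifted))
```

Centring on the centroid gives translation invariance, and the `m00` normalisation gives scale invariance; the remaining five invariants add rotation invariance in the same style.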


Remote Sensing Image Compression Using 3D-SPIHT Algorithm
D. Napoleon, S. Sathya, M. Praneesh, M. Siva Subramanian

Abstract: The technique of remote sensing is used to measure the characteristics of an object from a distance; it allows us to collect information about an object without making any physical contact with it. The 3D SPIHT algorithm codes objects of arbitrary shape and adds spatial and temporal scalability to the coder, while keeping important features of the original 3D SPIHT coder such as compression efficiency, full embeddedness and rate scalability. We apply the Set Partitioning In Hierarchical Trees (SPIHT) algorithm to remote sensing images using a wavelet decomposition, where the wavelet filters are implemented with the lifting method. Results show that our algorithm performs as well as, or better than, lossless coding systems using 3D SPIHT and wavelet transforms on remote sensing images.
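The lifting method mentioned above factors a wavelet filter into split, predict and update steps, each exactly invertible in integer arithmetic, which is what makes lossless coding possible. A one-level 1D sketch in the style of the integer 5/3 wavelet (the specific filter used by the paper is not stated in the abstract):

```python
# One level of an integer lifting wavelet transform in 1D (split, predict,
# update), with its exact inverse. Input length is assumed even.

def lift_forward(x):
    even, odd = x[0::2], x[1::2]
    # Predict: detail = odd sample minus a prediction from even neighbours.
    d = [odd[i] - (even[i] + even[min(i + 1, len(even) - 1)]) // 2
         for i in range(len(odd))]
    # Update: smooth the even samples using the details.
    s = [even[i] + (d[max(i - 1, 0)] + d[min(i, len(d) - 1)] + 2) // 4
         for i in range(len(even))]
    return s, d

def lift_inverse(s, d):
    # Undo the steps in reverse order; integer arithmetic makes this exact.
    even = [s[i] - (d[max(i - 1, 0)] + d[min(i, len(d) - 1)] + 2) // 4
            for i in range(len(s))]
    odd = [d[i] + (even[i] + even[min(i + 1, len(even) - 1)]) // 2
           for i in range(len(d))]
    out = []
    for e, o in zip(even, odd):
        out += [e, o]
    return out

x = [10, 12, 14, 13, 9, 7, 5, 8]
s, d = lift_forward(x)
print(lift_inverse(s, d) == x)   # True: the integer transform is invertible
```

Because the update step reads only the details and the predict step reads only the (recovered) evens, reversing the two steps reconstructs the signal bit-exactly, regardless of the rounding inside each step.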


Mammogram Image Segmentation using Rough Set Theory
K. Thangavel, C. Velayutham

Abstract: The presence of microcalcifications in a mammogram image is considered an indicator of malignant types of breast cancer, and their detection is important to prevent and treat the disease. This paper proposes an effective approach, a relative dependency measure based on Rough Set Theory (RST), for the automatic detection of microcalcifications in digitized mammogram images. Preprocessing of the mammogram image is essential before the detection and segmentation of microcalcifications: the presence of artifacts and the pectoral muscle can disturb detection and reduce the accuracy of Computer Aided Diagnosis (CAD), and their inclusion can affect the results of intensity-based image processing methods, so they need to be identified and removed before further analysis. These steps are performed in the preprocessing stage. 117 mammogram images from the MIAS database have been used for evaluation, and the computational results are compared with the reports already available in the MIAS database.
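The rough-set dependency measure underlying the approach is standard: gamma(C, D) is the fraction of objects whose condition-attribute values determine the decision uniquely (the positive region). A sketch over an invented decision table (the real attributes come from the mammogram features, not shown in the abstract):

```python
# Rough-set dependency gamma(C, D) = |POS_C(D)| / |U|: partition the objects
# into equivalence classes under the condition attributes C, and count the
# objects in classes whose decision value is unique.

from collections import defaultdict

def dependency(table, cond_idx, dec_idx):
    blocks = defaultdict(list)                 # equivalence classes under C
    for row in table:
        blocks[tuple(row[i] for i in cond_idx)].append(row[dec_idx])
    pos = sum(len(b) for b in blocks.values() if len(set(b)) == 1)
    return pos / len(table)

# Toy rows: (intensity_level, texture, has_microcalcification)
table = [
    ("high", "coarse", 1), ("high", "coarse", 1),
    ("low",  "fine",   0), ("low",  "coarse", 0),
    ("high", "fine",   1), ("high", "fine",   0),   # inconsistent block
]
print(dependency(table, (0, 1), 2))   # 4/6: two rows are undecidable
print(dependency(table, (0,), 2))     # intensity alone discriminates less
```

Relative dependency compares such gamma values for different attribute subsets, which is how RST-based methods rank and select features before segmentation.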


Energy Efficient and Secure Communication Protocol in MANET
G. Siva Kumar, M. Kaliappan, L. Jerart Julus

Abstract: In recent years, the rapid development and widespread application of mobile ad hoc networks has been hampered by security attacks and privacy issues, which dramatically impede their deployment. To cope with these attacks, a large variety of intrusion detection techniques such as authentication, authorization, cryptographic protocols and key management schemes have been developed. Clustering methods allow fast connection setup and better routing and topology management in mobile ad hoc networks (MANETs). In this paper, we introduce a new mechanism, the Energy Efficient and Secure Communication Protocol (EESCP), which divides the MANET into a set of 2-hop clusters where each node belongs to at least one cluster. The nodes in each cluster elect a leader node (cluster head) to serve as the IDS for the entire cluster. To balance resource consumption, a weight-based leader election model is used, which elects an optimal collection of leaders that minimizes the overall resource consumption, and secure communication is obtained using the Diffie-Hellman key exchange protocol.
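The Diffie-Hellman exchange named above lets a cluster head and a member node derive a shared key from public values alone. A minimal sketch with a deliberately tiny prime (a real deployment would use a standardized 2048-bit group and an authenticated exchange):

```python
# Diffie-Hellman key agreement sketch: both parties compute the same shared
# secret from each other's public values. Parameters are demo-sized only.

import secrets

p = 0xFFFFFFFB  # 4294967291, a small prime modulus (demo only)
g = 5           # generator

a = secrets.randbelow(p - 2) + 1      # cluster head's private exponent
b = secrets.randbelow(p - 2) + 1      # member node's private exponent

A = pow(g, a, p)                      # public values exchanged over the air
B = pow(g, b, p)

k_head = pow(B, a, p)                 # each side derives the shared secret
k_node = pow(A, b, p)
print(k_head == k_node)               # True: g^(ab) mod p on both sides
```

The equality holds because both sides compute g^(ab) mod p; an eavesdropper sees only g, p, A and B, and recovering the secret requires solving the discrete logarithm problem.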


Text-independent Mono and Cross Lingual Speaker Identification with the Constraint of Limited Data
B. G. Nagaraja, H. S. Jayanna

Abstract: Speaker recognition is a biometric process of automatically recognizing who is speaking on the basis of speaker-dependent features of the speech signal. Nowadays, speaker identification systems play a very important role in the fast-growing field of internet-based communication and transactions. In this paper, closed-set text-independent speaker identification in both Mono- and Cross-lingual contexts is demonstrated for Indian languages under the constraint of limited data. The languages considered for the study are English, Hindi and Kannada. Since no standard multi-lingual database is available, experiments are carried out on our own database of 30 speakers who can speak the three languages. A speaker identification system based on the Mel-frequency cepstral coefficients–Vector Quantization (MFCC-VQ) framework is considered. The experimental study found that Mono-lingual speaker identification gives better performance with English as the training and testing language, even though it is not a native language of the speakers considered. Further, the cross-lingual study showed that using English either in training or in testing gives better identification performance.
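In the MFCC-VQ framework, each speaker is represented by a codebook clustered from their training feature vectors, and a test utterance is assigned to the speaker whose codebook quantizes it with the least total distortion. A sketch with invented 2-D vectors standing in for MFCC frames:

```python
# VQ speaker identification sketch: per-speaker codebooks via a naive k-means,
# identification by minimum quantization distortion. Toy 2-D "features" only;
# real systems cluster MFCC vectors extracted from speech frames.

def dist(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

def train_codebook(vectors, k=2, iters=10):
    centers = vectors[:k]                         # naive initialisation
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in vectors:
            groups[min(range(k), key=lambda i: dist(v, centers[i]))].append(v)
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

def identify(codebooks, test_vectors):
    def distortion(cb):
        return sum(min(dist(v, c) for c in cb) for v in test_vectors)
    return min(codebooks, key=lambda spk: distortion(codebooks[spk]))

codebooks = {
    "spk1": train_codebook([(0, 0), (0, 1), (5, 5), (5, 6)]),
    "spk2": train_codebook([(9, 9), (9, 8), (2, 9), (3, 9)]),
}
print(identify(codebooks, [(0, 0.5), (5, 5.5)]))   # -> "spk1"
```

The limited-data constraint in the paper corresponds to training each codebook from very few frames, which is exactly where small codebooks like this are preferred over richer models.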


Improved Bootstrapping Scheme with Cache Consistency for Invalidation Attack Resistance on MANET
P. Parameswari, C. Chandrasekar

Abstract: A bootstrap protocol initiates neighbour-node relationships and exchanges knowledge with other nodes in a dynamic MANET. The security properties of a bootstrap protocol must account for an adversary's ability to undermine its correctness, and must improve in anticipation of such an adversary. Our previous work presented an improved bootstrap security model with an efficient cache consistency scheme for proper and correct information sharing between the mobile nodes of the ad hoc network. The cache consistency scheme is a server-controlled mechanism that adapts the process of caching a data item and instructs the query node to update the data in the respective mobile cache nodes. In this paper, the proposed framework presents a Secure Bootstrapping Scheme that serves as a uniform basis for developing new distributed bootstrapping schemes in MANETs, covering intrusion identification and damage recovery with cache consistency strategies. The proposal adds invalidation-resistance techniques that identify frequently accessed data, for which updates are pushed as aggressively as possible. A multicast tree is used for pushing cache invalidations and frequently accessed data, improving system performance. The framework is general enough to accommodate existing schemes, which allows us to compare those schemes against a common baseline. In this paper, the multicast tree is built to validate the cached data with faster updates in ad hoc networks. Experiments are carried out with the NS-2 simulator to show the framework's effectiveness at restricting invalidation attacks in terms of response time, attack resistance rate, false positive rate and bandwidth.
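The push mechanism can be pictured as a versioned invalidation walking down the multicast tree: every node that holds a copy older than the pushed version evicts it, while up-to-date copies are left alone. The topology, item names and versions below are invented for illustration:

```python
# Invalidation push over a multicast tree: the server propagates an
# (item, version) message down the tree; nodes evict stale cached copies.

tree = {"server": ["n1", "n2"], "n1": ["n3"], "n2": [], "n3": []}
caches = {
    "n1": {"routeTable": 3},
    "n2": {"routeTable": 4},
    "n3": {"routeTable": 2, "keyList": 1},
}

def push_invalidation(root, item, version):
    """Walk the multicast tree and evict copies of `item` older than `version`."""
    stack = [root]
    while stack:
        node = stack.pop()
        cache = caches.get(node, {})
        if item in cache and cache[item] < version:
            del cache[item]                 # stale entry evicted
        stack.extend(tree.get(node, []))

push_invalidation("server", "routeTable", 4)
print(caches)   # n1 and n3 lose routeTable; n2 keeps the current version
```

Carrying the version number is what gives invalidation-attack resistance in this sketch: a replayed or forged invalidation with an old version number cannot evict a fresher entry.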


Cluster Based Evaluation of Image Fusion Algorithms
R. Barani, M. Sumathi

Abstract: The general requirements of an image fusion process are that it should preserve all valid and useful pattern information from the source images while not introducing any artifacts. In practice, however, fusion algorithms cannot combine all the information from the source images without introducing some form of artifacts. As image fusion has found use in many applications in recent years, such as remote sensing, medical imaging, machine vision and military applications, methods that can assess or evaluate the performance of fusion algorithms have become very important. Since the various image fusion algorithms combine images at different levels, they may introduce different kinds of artifacts into the fused image, and hence a larger set of methods or quality metrics is required to evaluate the quality of the fused image. This paper describes twelve image quality metrics used to evaluate the quality of an image. Using these metrics, several image fusion algorithms are evaluated and clustered into three groups (good, average and worst) with the Fuzzy C-Means (FCM) clustering technique, taking the cumulative metric value as the objective criterion.
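The final step above reduces to running FCM on one scalar per algorithm (its cumulative metric value) with three clusters. A 1-D sketch with invented scores, using the standard membership and centre updates with fuzziness exponent m = 2:

```python
# 1-D Fuzzy C-Means over cumulative quality scores, grouping algorithms into
# three clusters (good / average / worst). Scores are invented.

def fcm(values, c=3, m=2.0, iters=50):
    centers = [min(values), sum(values) / len(values), max(values)]
    for _ in range(iters):
        # Membership of each value in each cluster (standard FCM update).
        u = []
        for x in values:
            d = [abs(x - ck) + 1e-9 for ck in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** (2 / (m - 1)) for j in range(c))
                      for i in range(c)])
        # Centre update: fuzzy weighted means of the values.
        centers = [sum(u[k][i] ** m * values[k] for k in range(len(values)))
                   / sum(u[k][i] ** m for k in range(len(values)))
                   for i in range(c)]
    labels = [max(range(c), key=lambda uk_i: uk[uk_i]) for uk in u]
    return centers, labels

scores = [0.91, 0.88, 0.52, 0.49, 0.55, 0.12, 0.15]   # per-algorithm totals
centers, labels = fcm(scores)
print(sorted(round(ck, 2) for ck in centers))   # one centre per score band
```

Unlike hard k-means, every algorithm retains a graded membership in all three groups; the hard label shown is just the argmax, which matches the good/average/worst assignment in the paper.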


Genetic Algorithm and J48 Based Link Spamdexing Classifier for Web Search Engine
S. K. Jayanthi, S. Sasikala

Abstract: Search engines act as the doorway through which web surfers seek information on the WWW. Web spam is the technique of manipulating the content and links of a website to improve its visibility in search engines; the sole intention behind web spam is the commercial goal of promoting the website. This paper proposes two classifiers for detecting spamdexing: one based on a genetic algorithm (GA) and another based on the C4.5 algorithm, the latter implemented as the J48 classification model in WEKA [8]. The WEBSPAM-UK2007 link-based features dataset is used for the experiments. As a result, both a GA decision tree and a J48 decision tree are produced by inferring the vital link-based features. A decision tree can easily spot which feature influences the spamicity measure; by concentrating on that feature, user visits to spam webpages can be minimized. Only link-based attributes are considered in this paper. A comparison between the classifiers shows that the GA-based classifier is the better discriminator for spam, yielding an accuracy of 0.912 against 0.891 for the J48 classifier.
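The "vital feature" ranking both trees rely on comes from the split criterion: C4.5/J48 grows its tree by choosing, at each node, the feature with the best information-based score. A sketch of plain information gain (C4.5 actually normalizes this to gain ratio) over invented link-based features:

```python
# Information gain of a link-based feature for separating spam from non-spam
# hosts -- the quantity behind C4.5/J48's split choice. Data is invented.

from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((labels.count(c) / n) * log2(labels.count(c) / n)
                for c in set(labels))

def info_gain(rows, feature_idx, label_idx=-1):
    labels = [r[label_idx] for r in rows]
    gain = entropy(labels)
    for v in set(r[feature_idx] for r in rows):
        subset = [r[label_idx] for r in rows if r[feature_idx] == v]
        gain -= len(subset) / len(rows) * entropy(subset)
    return gain

# Toy rows: (high_outdegree, reciprocal_links, class)
hosts = [
    (1, 1, "spam"), (1, 1, "spam"), (1, 0, "spam"),
    (0, 1, "ham"),  (0, 0, "ham"),  (0, 0, "ham"),
]
print(info_gain(hosts, 0))   # 1.0 bit: feature 0 separates perfectly
print(info_gain(hosts, 1))   # much lower: feature 1 is noisier
```

The feature at the root of the learned tree is exactly the one with the highest such score over the whole training set, which is why the tree "spots" the most spam-influential link feature.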


Cryptography Based on Combination of Hybridization and Cube’s Rotation
D. Rajavel, S. P. Shantharajah

Abstract: We propose a new and improved cryptographic algorithm based on a combination of cubical data hybridization and rotation. Hybridization is performed with shuffled cubes, which are generated from a randomly chosen rotation square. The resulting hybrid cube is again shuffled via a rotation square, which is in turn generated from another randomly selected rotation square. The cube rotation is performed in the same way as the shuffling of a simple Rubik's cube. In general, four phases of rotation are carried out: the first rotates cubes holding the number sequence 1 to n*n*n; the second rotates the hybrid cube to obtain the cubical keys; the third develops the cipher message by rotating the original message; and the last rotates the cipher cube to regain the original message. Random selection is used both in generating the rotation squares and in choosing the cubical encryption key. The generated key is in cubical form, and the cipher text produced by this encryption algorithm is harder to cryptanalyse because the rotation and hybridization make the data unpredictable.
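Every Rubik-style rotation of the 27 cells of a 3×3×3 cube is a permutation, so decryption is simply applying the inverse permutations in reverse order. The abstract does not give the concrete rotation schedule, so this sketch stands in a seeded sequence of cell permutations for the key-derived rotations; only the invertibility property is claimed to match the paper.

```python
# Permutation-cipher sketch of the cube scheme: the key seed derives a
# sequence of cell permutations (stand-ins for the paper's rotations);
# decryption applies the exact inverse permutations in reverse order.

import random

N = 27                               # 3*3*3 cube, flattened to a list

def key_schedule(seed, rounds=4):
    """Derive `rounds` pseudo-random cell permutations from the key seed."""
    rng = random.Random(seed)
    return [rng.sample(range(N), N) for _ in range(rounds)]

def apply_perm(block, perm):
    return [block[perm[i]] for i in range(N)]

def invert(perm):
    inv = [0] * N
    for i, p in enumerate(perm):
        inv[p] = i
    return inv

def encrypt(block, perms):
    for p in perms:
        block = apply_perm(block, p)
    return block

def decrypt(block, perms):
    for p in reversed(perms):
        block = apply_perm(block, invert(p))
    return block

msg = (list(b"ATTACK AT DAWN") + [0] * N)[:N]   # pad to one cube
perms = key_schedule(1234)
cipher = encrypt(msg, perms)
print(decrypt(cipher, perms) == msg)   # True: exact roundtrip
```

Note that a pure permutation only transposes symbols; the hybridization step in the paper is what additionally mixes values, which this sketch does not model.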


Lossless BMP Image Compression With New Efficient Arithmetic Approaches
Suranjan Ganguly, Anupam Bera, Anirban Kundu

Abstract: In this paper, efficient approaches are selected from among several candidates for developing an image compression [1][2] algorithm based on arithmetic approaches. Image processing techniques are used to pre-process the images, to which the arithmetic rules are then applied. Image data are stored and then processed, with the values expressed as a power of 2 plus a remainder; the Huffman [8] and LZW (Lempel-Ziv-Welch) algorithms are then applied to the processed data, achieving a satisfactory result. In another approach, the extracted image values are divided by 26 and the quotient and remainder parts are compressed separately using Huffman and RLE: on the quotient part the RLE (Run Length Encoding) algorithm is applied first and then the Huffman algorithm, while on the remainder part the Huffman algorithm alone is applied. A further approach spreads the values extracted from the image across ARGB channels to form a new compressed image. BMP (Windows bitmap) and JPEG (Joint Photographic Experts Group) file formats are considered for this compression [2][4] algorithm. The maximum compression achieved by this new arithmetic-based approach is 27.75% on BMP images.
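The quotient/remainder pipeline can be sketched end to end: each byte is split by the divisor 26 (the value given in the abstract), the quotient stream, which clusters in smooth image regions, is run-length encoded, and the original bytes are rebuilt exactly. The follow-on Huffman coding of both streams is omitted here; the pixel values are invented.

```python
# Quotient/remainder split with RLE on the quotient stream. The split is
# exactly reversible: value = quotient * DIV + remainder.

DIV = 26

def split(values):
    return [v // DIV for v in values], [v % DIV for v in values]

def rle_encode(stream):
    out = []
    for v in stream:
        if out and out[-1][0] == v:
            out[-1][1] += 1                 # extend the current run
        else:
            out.append([v, 1])              # start a new run
    return out

def rle_decode(pairs):
    return [v for v, n in pairs for _ in range(n)]

pixels = [200, 201, 200, 199, 10, 10, 12, 240, 241]
q, r = split(pixels)
packed = rle_encode(q)          # quotients cluster, so RLE shortens them
restored = [qq * DIV + rr for qq, rr in zip(rle_decode(packed), r)]
print(restored == pixels)       # True: the split is lossless
print(len(packed), "runs for", len(q), "quotients")
```

The design intuition is that neighbouring pixels usually share a quotient even when their remainders differ, so RLE compresses the quotient stream well while the remainder stream is left for entropy coding.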