Volume 4 Number 1 June 2014


Suitable QoS Parameters Survey for Standard Web Services & Web Applications to Understand their Cloud Deployability
S K V Jayakumar, Jayraj Singh, K Suresh Joseph

Abstract: The web plays a vital role in application domains such as information distribution, business, education, industry, entertainment, e-commerce, surveys, online shopping, and core banking. In recent years there has been huge demand for porting, or deploying, web services and web applications to the cloud, driven by the rapid growth of e-business solutions; the cloud, in turn, offers its services through web services and web applications. Users choose these on the basis of their properties, which may be functional, non-functional, or both. In this paper, the standards, qualities, and suitability of web services and web applications are thoroughly surveyed and analysed to understand whether or not they are cloud-deployable. Specifically, the qualities and standards required for the effective deployment of web services and web applications over the cloud are examined.


Detection of Lung Nodules Obscured by Ribs and Clavicles with a Multi-Scale Substantially Trained Artificial Neural Network
S Sri Iswarya Lakshmi, E Sudha

Abstract: When lung nodules overlap with ribs or clavicles in chest radiographs, it can be difficult for radiologists, as well as for computer-aided diagnostic (CAD) schemes, to detect these nodules. In this work, we developed an image-processing technique for suppressing the contrast of ribs and clavicles in chest radiographs by means of a multi-scale substantially trained artificial neural network (s-TANN). An s-TANN is a highly nonlinear filter that can be trained using input chest radiographs and corresponding "training" images. We used "bone" images obtained with a dual-energy subtraction technique as the training images. For adequate suppression of ribs of various spatial frequencies, we developed a multi-scale s-TANN consisting of multi-scale integration and decomposition techniques and three s-TANNs for three different-scale images. After being trained with input chest radiographs and the corresponding dual-energy bone images, the multi-scale s-TANN was able to produce "bone-image-like" images similar to the training bone images. By subtracting the bone-image-like images from the corresponding chest radiographs, we were able to produce "soft-tissue-image-like" images in which ribs and clavicles were substantially suppressed. In this study we used a validation test database consisting of 118 chest radiographs with pulmonary nodules and an independent test database consisting of 136 digitized screen-film chest radiographs with 136 solitary pulmonary nodules collected from 14 medical institutions. When our technique was applied to non-training cases, ribs and clavicles in the chest radiographs were suppressed substantially, while the visibility of nodules and lung vessels was maintained. Thus, our image-processing technique for rib suppression by means of a multi-scale s-TANN is likely to be useful for radiologists as well as for CAD schemes in the detection of lung nodules on chest radiographs.
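The core subtraction step above (removing a network-predicted bone-like image from the radiograph) can be sketched as follows. This is a minimal illustration only: the `bone_like` array here is a fabricated stand-in, since producing a real one requires the trained multi-scale s-TANN.

```python
import numpy as np

def suppress_bones(radiograph, bone_like, weight=1.0):
    """Subtract a bone-like image (in the paper, the output of the
    trained multi-scale s-TANN) from a chest radiograph to obtain a
    soft-tissue-like image; `weight` tunes the degree of suppression."""
    soft = radiograph - weight * bone_like
    return np.clip(soft, 0.0, 1.0)  # keep pixel values in [0, 1]

# Toy illustration with a hypothetical bone-like image.
rng = np.random.default_rng(0)
radiograph = rng.random((8, 8))
bone_like = 0.3 * radiograph        # stand-in for the s-TANN output
soft_tissue = suppress_bones(radiograph, bone_like)
```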


Mammogram Image Classification: Non-Shannon Entropy based Ant-Miner
R Roselin, K Thangavel, C Velayutham

Abstract: Mammography plays a central role in the early detection of breast cancer because it can show changes in the breast up to two years before a patient or physician can feel them. A mammogram is a radiograph of the breast tissue. Finding breast tumours before they turn deadly is a challenge that medical technology has so far failed to master, and a newly developed system should help radiologists reach a more accurate diagnosis. This paper applies a data mining technique to mammogram image processing; more specifically, it applies a swarm-intelligence-based classification algorithm called Ant-Miner. The original Ant-Miner algorithm uses Shannon entropy in its heuristic function. The novel idea of using a non-Shannon entropy measure in the heuristic function is analysed because of its non-extensiveness, and a comparative analysis is made with the decision tree algorithm C4.5 in terms of accuracy, number of rules, True Positive Rate (TPR), and False Positive Rate (FPR). Based on its performance, a Tsallis entropy based Ant-Miner is proposed for mammogram image classification.
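To make the heuristic-function change concrete, the two entropy measures can be compared directly. The formulas below are the standard textbook definitions, not code from the paper:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def tsallis_entropy(probs, q=2.0):
    """Non-extensive Tsallis entropy S_q = (1 - sum p**q) / (q - 1);
    as q -> 1 it recovers Shannon entropy (in nats)."""
    if q == 1.0:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

# Class distribution covered by a candidate rule term.
probs = [0.5, 0.25, 0.25]
print(shannon_entropy(probs))         # 1.5 bits
print(tsallis_entropy(probs, q=2.0))  # (1 - 0.375) / 1 = 0.625
```

In Ant-Miner, such an entropy value scores how mixed the class distribution of a candidate term is, so swapping the measure changes which terms the ants prefer.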


Swarm Optimized Feature Selection of EEG Signals for Brain-Computer Interface
K Akilandeswari, G M Nasira

Abstract: A Brain-Computer Interface (BCI) is a communication system which uses cerebral activity to control external devices or computers. The goal of BCI research is to provide communication capability to people who are totally paralysed or suffer from neuromuscular disorders such as amyotrophic lateral sclerosis, brain stem stroke, or spinal cord injury. A BCI system records brain signals and applies machine learning algorithms to classify them, triggering a computer-controlled action. This study investigates the effects of feature selection on Electroencephalograph (EEG) signals. Feature extraction using the Walsh Hadamard Transform (WHT) and feature selection with Principal Component Analysis (PCA) are also studied. Feature selection through Particle Swarm Optimization (PSO) is proposed. Classification of the features is achieved through Bagging and decision tree classifiers.
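A binary PSO for feature selection of the kind proposed can be sketched as follows. Everything here is illustrative: the fitness function is a toy stand-in for classifier accuracy on selected EEG features, and the swarm parameters are common textbook defaults, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
INFORMATIVE = np.array([1, 1, 0, 0, 1, 0])   # toy ground truth

def fitness(mask):
    """Stand-in for classifier accuracy on the selected features:
    reward informative features, penalise subset size."""
    if mask.sum() == 0:
        return 0.0
    hit = (mask & INFORMATIVE).sum() / INFORMATIVE.sum()
    return hit - 0.05 * mask.sum()

def binary_pso(n_features=6, n_particles=10, iters=30,
               w=0.7, c1=1.5, c2=1.5):
    pos = rng.integers(0, 2, (n_particles, n_features))
    vel = rng.normal(0, 1, (n_particles, n_features))
    pbest = pos.copy()
    pbest_f = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        prob = 1.0 / (1.0 + np.exp(-vel))        # sigmoid transfer
        pos = (rng.random(vel.shape) < prob).astype(int)
        f = np.array([fitness(p) for p in pos])
        improved = f > pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[pbest_f.argmax()].copy()
    return gbest

mask = binary_pso()   # 0/1 vector marking the selected features
```

The selected 0/1 mask would then index the WHT feature columns handed to the Bagging and decision tree classifiers.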


A Comparative Study on K-Means and Fuzzy C-Means Algorithm for Breast Cancer Analysis
A Sathish, J Mohana Sundaram

Abstract: Fuzzy C-Means is a clustering method which allows one piece of data to belong to two or more clusters, and it is frequently used in pattern recognition. The method is based on minimising an objective function; fuzzy partitioning is carried out through an iterative optimisation of that function, with updates of the membership values and the cluster centres. In this study, we applied these clustering algorithms to the data reduction process. We performed performance evaluation experiments on the standard Chainlink and Iris datasets, while the main application was conducted on the Wisconsin Breast Cancer dataset taken from the UCI Machine Learning Repository. The comparison is made using two tools: WEKA for K-Means and MATLAB for Fuzzy C-Means. WEKA (Waikato Environment for Knowledge Analysis) is a data mining tool which contains a collection of algorithms for association, clustering, classification, etc., with visualization. MATLAB (matrix laboratory) is a numerical computing environment and fourth-generation programming language which allows matrix manipulations, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages.
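The alternating updates described above (centres, then memberships) can be sketched minimally; the two-cluster, one-dimensional data below are made up for illustration.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal Fuzzy C-Means: alternately update cluster centres and
    memberships to minimise sum_i sum_k u_ik**m * ||x_i - v_k||**2."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)             # each row sums to 1
    for _ in range(iters):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)                  # avoid division by zero
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)  # membership update
    return centres, U

X = np.array([[0.0], [0.2], [9.8], [10.0]])       # two obvious groups
centres, U = fuzzy_c_means(X)
```

Unlike K-Means' hard assignments, each row of `U` gives a point's graded membership in every cluster, which is the behavioural difference the paper's comparison turns on.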


Analysis of NNBP model for SNE learning process
Sangita Babu, S Gokila

Abstract: Learning processes have matured to the point of applying contemporary algorithms and models to modern teaching and learning across all subjects. The education system adopts technology to support learners in accessing content and evaluation systems. Sophisticated models provide online teaching using multimedia content, but learning models also contribute to enhancing the learning process itself: they evaluate the learner's performance and its dependence on learning-related factors. This paper continues our previously constructed neural network back-propagation model, optimising the BPNN for an effective learning process. It covers a memory model which determines the weights of the learner's internal attributes and their influences.


Comparative Study of Job Scheduling in Grid Environment Based on BEE`S Algorithm
M Abdullah, P Selvi

Abstract: Grid computing is a new computing model that performs bulk operations across a connected network. Since multiple applications may run simultaneously and require multiple resources that are not always available, a scheduling system is essential for allocating resources. Scheduling algorithms must be designed to maximise utilisation of the resources and minimise the execution time. In this paper, we compare the makespan and execution time of bee-inspired algorithms: the Artificial Bee Colony (ABC) algorithm, GA-based ABC, and an Efficient Modified Artificial Bee Colony algorithm. The results show that the Efficient Modified Artificial Bee Colony algorithm gives better results than the other algorithms.
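The makespan objective that these bee algorithms minimise can be stated in a few lines; the job lengths and resource speeds below are made-up numbers for illustration.

```python
def makespan(schedule, job_lengths, speeds):
    """Makespan of a grid schedule: the finish time of the busiest
    resource. schedule[i] gives the resource job i is assigned to."""
    finish = [0.0] * len(speeds)
    for job, res in enumerate(schedule):
        finish[res] += job_lengths[job] / speeds[res]
    return max(finish)

jobs = [4.0, 2.0, 6.0, 3.0]   # job lengths (arbitrary work units)
speeds = [1.0, 2.0]           # resource speeds (units per second)
print(makespan([0, 0, 1, 1], jobs, speeds))  # max(6.0, 4.5) = 6.0
print(makespan([1, 0, 1, 0], jobs, speeds))  # max(5.0, 5.0) = 5.0
```

An ABC-style scheduler searches over candidate `schedule` vectors (the food sources), keeping those with smaller makespan.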


Gratuitous Power Reduction in LTE Networks
A Santhosh, A Shakin Banu

Abstract: Long Term Evolution (LTE) networks have large bandwidth and correspondingly large power consumption for transmission. Dynamic base station switching reduces power consumption, but it requires high computational complexity as well as large signaling overhead. We propose a practically implementable switching-based energy saving (SES) algorithm that operates with low computational complexity. The SES algorithm can run in a distributed manner; its key design principle is to turn off base stations one by one during low-traffic periods. Using the introduced notion of network impact, the users of a switched-off base station are handed over to neighbouring base stations so that the network is affected minimally. To further reduce signaling and implementation overhead, a heuristic version of SES can be used. In our analysis, 50-60% of power consumption can be saved in a metropolitan area network.
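The one-by-one switch-off idea can be sketched greedily. This is a simplification only: load is used as a crude stand-in for the paper's network-impact metric, capacities are made-up, and handed-over traffic is split evenly (a real scheme would respect per-cell limits and radio conditions).

```python
def ses_greedy(loads, capacity=1.0):
    """Greedy sketch of switching-based energy saving (SES): switch
    off the base station with the lowest load (a crude stand-in for
    'network impact') whenever its users fit on the remaining ones."""
    on = set(range(len(loads)))
    while len(on) > 1:
        bs = min(on, key=lambda i: loads[i])
        rest = on - {bs}
        spare = sum(capacity - loads[i] for i in rest)
        if loads[bs] > spare:
            break                        # no room left: keep this BS on
        share = loads[bs] / len(rest)    # hand users over evenly
        for i in rest:
            loads[i] += share
        loads[bs] = 0.0
        on = rest
    return on

active = ses_greedy([0.2, 0.3, 0.9])
print(active)  # {1, 2}: the lightest cell is switched off
```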


Viscosity: A Challenge for the Injection Mould Designer
Muralithar Lakkanna, Ravikiran Kadoli, G C Mohan Kumar

Abstract: Mould design is a vital activity in the injection moulding of thermoplastics, with direct repercussions for yield quality, productivity, and thereby frugality. It involves various critical decisions, one prominent decision being the specification of runner size. Unfortunately, mould designers tend to specify it intuitively and then struggle to optimise or manipulate it through independent control parameters [1]. This manuscript therefore leverages computational intelligence through a generic, simple, and inexpensive preventive design methodology for a given thermoplastic. A dedicated runner cross-section sizing criterion was first deduced from first principles; to enhance comprehensiveness, its empirical relationship was then formulated as an explicit function of the in-situ injectant state for the available machine specifications and the desired moulded component features. Recognising that the behaviour of the apparent viscosity range classifies almost all thermoplastics relative to the in-situ spatiotemporal injectant state [2], the Continuous Sensitivity Method (CSM) was adopted to assess sensitivity over an infinite-dimensional range. The resulting inferences show a direct exponential proportionality of runner size, with a distinct slope and intercept for each thermoplastic behaviour; we therefore conclude that all thermoplastics are injection mouldable, subject to a properly designed feed system.


Adaptive TXOP Allocation Based on DiffServ for QoS Enhancement in IEEE 802.11e WLAN
V Ilayaraja, C V Baskar

Abstract: IEEE 802.11e WLAN has provided a new access mechanism called the hybrid coordination function (HCF) for quality-of service (QoS) support. Here, we provide HCF scheduler to allocate transmission opportunities (TXOPs) for the stations. TXOP is the time under which a station can send its burst of data packets to other stations. An station can be either in the good or bad channel state and it can be either in constant bit rate or variable bit rate. In the HCF scheduler, channel interference, busty traffic streaming and packet loss causes insufficiency in the strict periodic TXOP allocation scheme. In this, we propose a mechanism of adaptive TXOP allocation. This method works in accordance with channel and traffic conditions, adaptively to compute. The computation is carried by using Differentiated Service (DiffServ) and prediction, providing more access to the time-bounded multimedia applications such as Voice over IP (VoIP), videoconferencing, etc.,