International Science Index

International Journal of Computer, Electrical, Automation, Control and Information Engineering

The Problem of Using the Calculation of the Critical Path to Solve Instances of the Job Shop Scheduling Problem
Abstract:

A procedure commonly used in the Job Shop Scheduling Problem (JSSP) to evaluate neighborhood functions in non-deterministic algorithms is the calculation of the critical path in a digraph. This paper presents an experimental study of the computational cost incurred when the critical path is calculated for solutions of large JSSP instances. The results indicate that if the critical path is used to generate neighborhoods in the meta-heuristics applied to the JSSP, a high computational cost arises, despite the fact that computing the critical path in any digraph has polynomial complexity.
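For orientation, the critical path of a JSSP solution digraph is the longest source-to-sink path in a weighted directed acyclic graph, which can be computed in O(V + E) time by relaxing arcs in topological order. The sketch below is a minimal, generic illustration of that polynomial-time computation (the node numbering and edge encoding are assumptions), not the instrumentation used in the paper's experiments.

```python
# Minimal sketch: longest (critical) path in a weighted DAG via topological order.
# Generic illustration, not the paper's experimental code.
from collections import defaultdict, deque

def critical_path(n, edges):
    """n: number of nodes (0..n-1); edges: list of (u, v, weight) arcs."""
    adj = defaultdict(list)
    indeg = [0] * n
    for u, v, w in edges:
        adj[u].append((v, w))
        indeg[v] += 1
    dist = [0] * n          # longest distance from any source to each node
    pred = [None] * n       # predecessor on that longest path
    queue = deque(i for i in range(n) if indeg[i] == 0)
    while queue:
        u = queue.popleft()
        for v, w in adj[u]:
            if dist[u] + w > dist[v]:
                dist[v] = dist[u] + w
                pred[v] = u
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    end = max(range(n), key=dist.__getitem__)   # node where the makespan is attained
    path, node = [], end
    while node is not None:
        path.append(node)
        node = pred[node]
    return dist[end], path[::-1]

# Example: a tiny 4-node DAG; prints (7, [0, 1, 3]).
print(critical_path(4, [(0, 1, 3), (0, 2, 2), (1, 3, 4), (2, 3, 1)]))
```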

Combining ILP with Semi-supervised Learning for Web Page Categorization
Abstract:

This paper presents a semi-supervised learning algorithm called Iterative Cross-Training (ICT) to solve Web page classification problems. We apply Inductive Logic Programming (ILP) as the strong learner in ICT. The objective of this research is to evaluate the potential of the strong learner to boost the performance of ICT's weak learner. We compare the results with supervised Naive Bayes, a well-known algorithm for text classification. The performance of our learning algorithm is also compared with other semi-supervised learning algorithms, namely Co-Training and EM. The experimental results show that the ICT algorithm outperforms those algorithms and that the performance of the weak learner can be enhanced by the ILP system.

On the Noise Distance in Robust Fuzzy C-Means
Abstract:
In recent decades, a number of robust fuzzy clustering algorithms have been proposed to partition data sets affected by noise and outliers. Robust fuzzy C-means (robust-FCM) is certainly one of the best known among these algorithms. In robust-FCM, noise is modeled as a separate cluster characterized by a prototype that has a constant distance δ from all data points. The distance δ determines the boundary of the noise cluster and is therefore a critical parameter of the algorithm. Although some approaches have been proposed to automatically determine the most suitable δ for a specific application, no efficient and fully satisfactory solution exists to date. The aim of this paper is to propose a novel method to compute the optimal δ based on the analysis of the distribution of the percentage of objects assigned to the noise cluster in repeated executions of robust-FCM with decreasing values of δ. The extremely encouraging results obtained on some data sets from the literature are shown and discussed.
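For reference, in the standard noise-clustering formulation that robust-FCM follows (the exact variant used by the authors is assumed here), the noise prototype at constant distance δ enters the membership update as

$$ u_{ik} = \left[ \sum_{j=1}^{c} \left( \frac{d_{ik}}{d_{jk}} \right)^{\frac{2}{m-1}} + \left( \frac{d_{ik}}{\delta} \right)^{\frac{2}{m-1}} \right]^{-1}, \qquad u_{\ast k} = 1 - \sum_{i=1}^{c} u_{ik}, $$

so points that are far from all c regular clusters relative to δ receive a large noise membership $u_{\ast k}$; this is why the choice of δ directly controls the percentage of objects assigned to the noise cluster.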
Forecasting Enrollment Model Based on First-Order Fuzzy Time Series
Abstract:

This paper proposes a novel improvement of a forecasting approach based on time-invariant fuzzy time series. In contrast to traditional forecasting methods, fuzzy time series can also be applied to problems in which the historical data are linguistic values. It is shown that the proposed time-invariant method improves the performance of the forecasting process. Furthermore, the effect of using different numbers of fuzzy sets is tested as well. As in most of the cited papers, the historical enrollment of the University of Alabama is used in this study to illustrate the forecasting process. Subsequently, the performance of the proposed method is compared with existing time-invariant fuzzy time series models in terms of forecasting accuracy. The comparison reveals a certain performance superiority of the proposed method over methods described in the literature.
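As a point of reference only, the sketch below implements a classic Chen-style first-order, time-invariant fuzzy time series forecast (interval partition, fuzzification, fuzzy logical relationship groups, centroid defuzzification); the paper's improved variant, its number of fuzzy sets and its data are not reproduced here, and the values below are illustrative only.

```python
# Hedged sketch of a classic first-order, time-invariant fuzzy time series
# forecast (Chen-style); not the paper's improved method.
def fuzzy_ts_forecast(series, n_sets=7, margin=200):
    lo, hi = min(series) - margin, max(series) + margin
    width = (hi - lo) / n_sets
    mids = [lo + (i + 0.5) * width for i in range(n_sets)]

    def fuzzify(x):                # index of the interval (fuzzy set) containing x
        return min(int((x - lo) // width), n_sets - 1)

    states = [fuzzify(x) for x in series]
    groups = {}                    # fuzzy logical relationship groups A_i -> {A_j, ...}
    for a, b in zip(states, states[1:]):
        groups.setdefault(a, set()).add(b)

    def forecast(last_value):      # defuzzify as the mean of the group's interval midpoints
        rhs = groups.get(fuzzify(last_value), {fuzzify(last_value)})
        return sum(mids[j] for j in rhs) / len(rhs)

    return forecast

# Toy usage on illustrative enrollment-like values (not a claim about the real data set):
data = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807, 16919]
f = fuzzy_ts_forecast(data)
print(round(f(data[-1])))          # forecast for the next period
```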

Application of Neural Networks in Financial Data Mining
Abstract:

This paper deals with the application of a well-known neural network technique, the multilayer back-propagation (BP) neural network, to financial data mining. A modified neural network forecasting model is presented, and an intelligent mining system is developed. The system can forecast buy and sell signals according to the predicted future trend of the stock market, and thus provide decision support for stock investors. A simulation over seven years of the Shanghai Composite Index shows that the return achieved by this mining system is about three times as large as that achieved by the buy-and-hold strategy, so it is advantageous to apply neural networks to forecast financial time series, and different investors could benefit from it.
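As a generic illustration of the underlying technique (not the paper's modified model, trading rules or data), the sketch below trains a small back-propagation network on a sliding window of past values to predict the next value of a series; the window size, layer sizes and synthetic series are assumptions.

```python
# Hedged sketch: a small BP network predicting the next value of a series
# from a sliding window of past values. Generic illustration only.
import numpy as np

rng = np.random.default_rng(0)

def make_windows(series, window=5):
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = np.array(series[window:])
    return X, y

def train_bp(X, y, hidden=8, lr=0.01, epochs=2000):
    n_in = X.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1));    b2 = np.zeros(1)
    y = y.reshape(-1, 1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)          # hidden layer (tanh)
        out = h @ W2 + b2                 # linear output
        err = out - y                     # gradient of the squared error
        dW2 = h.T @ err / len(X); db2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h ** 2)  # back-propagate through tanh
        dW1 = X.T @ dh / len(X); db1 = dh.mean(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return lambda x: (np.tanh(x @ W1 + b1) @ W2 + b2).ravel()

# Toy usage on a synthetic "index" series normalized to [0, 1]:
series = (np.sin(np.linspace(0, 20, 200)) + 1) / 2
X, y = make_windows(series)
predict = train_bp(X, y)
print(float(predict(X[-1:])[0]), float(y[-1]))   # predicted vs. actual next value
```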

An Evaluation of Algorithms for Single-Echo Biosonar Target Classification
Abstract:

A recent neural-spiking coding scheme for feature extraction from biosonar echoes of various plants is examined with a variety of stochastic classifiers. The derived feature vectors are employed in well-known stochastic classifiers, including nearest-neighbor, single Gaussian, and Gaussian mixture models with EM optimization. The classifiers' performances are evaluated using cross-validation and bootstrapping techniques. It is shown that the various classifiers perform equivalently and that the modified preprocessing configuration yields considerably improved results.

Restartings: A Technique to Improve Classic Genetic Algorithms Performance
Abstract:

In this contribution, a way to enhance the performance of the classic Genetic Algorithm is proposed. The idea of restarting a Genetic Algorithm is applied in order to obtain better knowledge of the solution space of the problem. A new 'insertion' operator is introduced so as to exploit the information that has already been collected before the restarting procedure. Finally, numerical experiments comparing the performance of the classic Genetic Algorithm and the Genetic Algorithm with restartings on some well-known test functions are given.
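A minimal sketch of the restarting idea follows, under the assumption (made here for illustration) that 'insertion' means seeding each restarted population with the best individuals collected so far; the authors' exact operator, encoding and parameters may differ.

```python
# Hedged sketch: a real-coded GA that restarts several times and "inserts"
# the best individuals found so far into each fresh population.
import random

def sphere(x):                       # a well-known test function (to be minimized)
    return sum(v * v for v in x)

def ga_run(pop, fitness, gens=100, bounds=(-5.0, 5.0), mut=0.1):
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[:len(pop) // 2]
        children = []
        while len(children) < len(pop) - len(parents):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]             # arithmetic crossover
            child = [min(max(v + random.gauss(0, mut), bounds[0]), bounds[1])
                     for v in child]                                # Gaussian mutation
            children.append(child)
        pop = parents + children
    return sorted(pop, key=fitness)

def ga_with_restarts(dim=5, pop_size=30, restarts=5, n_insert=3):
    elite = []                                        # best individuals across restarts
    for _ in range(restarts):
        pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
        pop[:len(elite[:n_insert])] = [list(e) for e in elite[:n_insert]]  # "insertion"
        pop = ga_run(pop, sphere)
        elite = sorted(elite + pop[:n_insert], key=sphere)[:n_insert]
    return elite[0], sphere(elite[0])

best, value = ga_with_restarts()
print(value)       # should be close to the global minimum 0 of the sphere function
```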

Evaluation of Algorithms for Sequential Decision in Biosonar Target Classification
Abstract:

A sequential decision problem, based on the task of identifying the species of trees given acoustic echo data collected from them, is considered with well-known stochastic classifiers, including single and mixture Gaussian models. Echoes are processed with a preprocessing stage based on a model of mammalian cochlear filtering, using a new discrete low-pass filter characteristic. The stopping-time performance of the sequential decision process is evaluated and compared. It is observed that the new low-pass filter processing results in faster sequential decisions.
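For orientation only, the sketch below shows a generic sequential decision loop of the kind described: per-class log-likelihoods are accumulated echo by echo and a decision is made as soon as the leading class is sufficiently ahead. The Gaussian models, stopping margin and data are placeholders, not the paper's classifiers or cochlear preprocessing.

```python
# Generic illustration of a sequential decision with a stopping rule.
import math, random

def gauss_loglik(x, mean, var):
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def sequential_classify(echo_features, class_models, margin=5.0):
    scores = {c: 0.0 for c in class_models}
    for t, x in enumerate(echo_features, start=1):
        for c, (mean, var) in class_models.items():
            scores[c] += gauss_loglik(x, mean, var)
        ranked = sorted(scores.items(), key=lambda kv: -kv[1])
        if ranked[0][1] - ranked[1][1] > margin:      # stopping rule
            return ranked[0][0], t                    # decision and stopping time
    return ranked[0][0], len(echo_features)           # forced decision at the end

# Toy usage: two 1-D Gaussian "species" models and echoes drawn from species "A".
models = {"A": (0.0, 1.0), "B": (2.0, 1.0)}
random.seed(1)
echoes = [random.gauss(0.0, 1.0) for _ in range(50)]
print(sequential_classify(echoes, models))
```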

A Fuzzy Classifier with Evolutionary Design of Ellipsoidal Decision Regions
Abstract:

A fuzzy classifier that uses multiple ellipsoids to approximate decision regions for classification is designed in this paper. The Gustafson-Kessel algorithm (GKA), which uses an adaptive distance norm based on the covariance matrices of prototype data points, is adopted to learn the ellipsoids. GKA is able to adapt the distance norm to the underlying distribution of the prototype data points, except that the sizes of the ellipsoids need to be determined a priori. To overcome GKA's inability to determine an appropriate ellipsoid size, a genetic algorithm (GA) is applied to learn the size of each ellipsoid. With the GA combined with GKA, it is shown in this paper that the proposed method outperforms the benchmark algorithms as well as other algorithms in the field.
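For reference, the adaptive norm in the Gustafson-Kessel algorithm measures the squared distance of a data point $x_k$ to a prototype $v_i$ as

$$ d^{2}_{ik} = (x_k - v_i)^{\top} A_i\, (x_k - v_i), \qquad A_i = \left[ \rho_i \det(F_i) \right]^{1/n} F_i^{-1}, $$

where $F_i$ is the fuzzy covariance matrix of cluster $i$, $n$ is the data dimension, and $\rho_i$ is a constant fixing the volume of the $i$-th ellipsoid; this $\rho_i$ corresponds to the ellipsoid size that standard GKA requires a priori and that the GA is used to learn here.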

C@sa: Intelligent Home Control and Simulation
Abstract:

In this paper, we present C@sa, a multi-agent system aimed at modeling, controlling and simulating the behavior of an intelligent house. The developed system aims at providing architects, designers and psychologists with a simulation and control tool for understanding the impact of embedded and pervasive technology on people's daily life. In this vision, the house is seen as an environment made up of independent and distributed devices, controlled by agents, that interact to support the user's goals and tasks.

Context for Simplicity: A Basis for Context-aware Systems Based on the 3GPP Generic User Profile
Abstract:

The paper focuses on the area of context modeling with respect to the specification of context-aware systems supporting ubiquitous applications. The proposed approach, followed within the SIMPLICITY IST project, uses a high-level system ontology to derive context models for system components, which are consequently mapped to the system's physical entities. For the definition of user- and device-related context models in particular, the paper suggests a standards-based process consisting of an analysis phase using the Common Information Model (CIM) methodology, followed by an implementation phase that defines 3GPP-based components. The benefits of this approach are further illustrated by preliminary examples of XML grammars defining profiles and components, component instances, and descriptions of the respective ubiquitous applications.

Automatic Camera Calibration for Images of Soccer Match
Abstract:

Camera calibration plays an important role in the analysis of sports video. In soccer video, the cross-points at the center of the soccer field that can be used for calibration are in most cases not sufficient, so this paper introduces a new automatic camera calibration algorithm focused on solving this problem by using the properties of the images of the center circle, the halfway line and a touch line. After a theoretical analysis, a practicable automatic algorithm is proposed. Although very little information is used, results of experiments with both synthetic data and real data show that the algorithm is applicable.
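As background (a generic sketch, not the authors' algorithm): once point correspondences between the field model and the image are available, for example on the center circle, the field-to-image mapping can be estimated as a planar homography with the direct linear transform. The correspondences below are placeholders.

```python
# Generic sketch: estimate a planar homography (field plane -> image plane)
# from point correspondences with the direct linear transform (DLT).
import numpy as np

def homography_dlt(src, dst):
    """src, dst: (N, 2) arrays of corresponding points, N >= 4, no 3 collinear."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Placeholder correspondences: four points on the center circle (radius 9.15 m)
# and their assumed image positions; not data from the paper.
field = np.array([[9.15, 0.0], [0.0, 9.15], [-9.15, 0.0], [0.0, -9.15]])
image = np.array([[820.0, 350.0], [645.0, 250.0], [460.0, 370.0], [635.0, 470.0]])
H = homography_dlt(field, image)
p = H @ np.array([0.0, 0.0, 1.0])
print(p[:2] / p[2])   # projected position of the circle centre in the image
```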

Approaches and Schemes for Storing DTD-Independent XML Data in Relational Databases
Abstract:
The volume of XML data exchange is increasing explosively, and the need for efficient mechanisms for XML data management is vital. Many XML storage models have been proposed for storing DTD-independent XML documents in relational database systems. Benchmarking is the best way to highlight the pros and cons of the different approaches. In this study, we use a common benchmarking scheme, known as XMark, to compare the most cited and newly proposed DTD-independent methods in terms of logical reads, physical I/O, CPU time and duration. We show the effect of the label path, of extracting values and storing them in a separate table, and of the type of join needed for answering each method's queries.
Self-Organization of Clusters Having Locally Distributed Patterns for Highly Synchronized Inputs
Abstract:

Many experimental results suggest that more precise spike timing is significant in neural information processing. We construct a self-organization model using spatiotemporal patterns, where Spike-Timing Dependent Plasticity (STDP) tunes the conduction delays between neurons. We show that, for highly synchronized inputs, the fluctuation of conduction delays causes globally continuous and locally distributed firing patterns through the self-organization.

A Flexible and Scalable Agent Platform for Multi-Agent Systems
Abstract:
A multi-agent system is composed of several agents capable of reaching a goal cooperatively. Such a system needs an agent platform for efficient and stable interaction between intelligent agents. In this paper we propose a flexible and scalable agent platform that composes containers out of multiple hierarchical agent groups. Unlike JADE, it also allows efficient implementation of multiple domain presentations of the agents. The proposed platform provides both group management and individual management of agents for efficiency. The platform has been implemented and tested, and it can be used as a flexible foundation for dynamic multi-agent systems targeting seamless delivery of ubiquitous services.
Affine Projection Algorithm with Variable Data-Reuse Factor
Abstract:

This paper suggests a new Affine Projection (AP) algorithm with a variable data-reuse factor that uses the condition number as a decision factor. To reduce the computational burden, we adopt a recently reported technique that estimates the condition number of the input data matrix. Several simulations show that the new algorithm performs better than the conventional AP algorithm.
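For context, the conventional AP recursion with projection order (data-reuse factor) $K$ is

$$ \mathbf{e}(n) = \mathbf{d}(n) - \mathbf{X}^{\top}(n)\,\mathbf{w}(n), \qquad \mathbf{w}(n+1) = \mathbf{w}(n) + \mu\, \mathbf{X}(n) \left[ \mathbf{X}^{\top}(n)\mathbf{X}(n) + \epsilon \mathbf{I} \right]^{-1} \mathbf{e}(n), $$

where $\mathbf{X}(n)$ collects the $K$ most recent input vectors and $\mathbf{d}(n)$ the corresponding desired samples. This summary of the standard recursion is given for orientation only; the paper's contribution is to vary $K$ on-line according to the estimated condition number of the input data matrix.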
