International Science Index
Climate Safe House: A Community Housing Project Tackling Catastrophic Sea Level Rise in Coastal Communities
New Zealand, an island nation, has an extensive coastline peppered with small communities of iconic buildings known as bachs. Post WWII, these modest buildings were constructed by their owners as retreats; they were generally small and low cost, often built from recycled materials, and often fell below current acceptable building standards. In the latter part of the 20th century, real estate prices in many of these communities remained low, and the areas became permanent residences for people attracted to this affordable lifestyle. The Blueskin Resilient Communities Trust (BRCT) is an organisation that recognises the vulnerability of communities in low-lying settlements, which are now prone to increased flood threat brought about by climate change and sea level rise. Some inhabitants of Blueskin Bay, Otago, NZ have already found their properties to be uninsurable because of the increased frequency of flood events, and property values have slumped accordingly. Territorial authorities also acknowledge this increased risk and have created additional compliance measures for new buildings that are less than 2 m above tidal peaks. Community resilience becomes an additional concern where inhabitants are attracted to a lifestyle associated with a specific location and its people, a lifestyle that cannot be replicated in a suburban or city context. Traditional models of social housing fail to provide the sense of community connectedness and identity enjoyed by the current residents of Blueskin Bay. BRCT has partnered with the Otago Polytechnic Design School to design a new form of community housing that can react to this environmental change. It is a longitudinal project incorporating participatory approaches as a means of getting people 'on board', understanding complex systems, and co-developing solutions. In the first period, they are seeking industry support and funding to develop a transportable and fully self-contained housing model that exploits current technologies.
BRCT also hopes that the building will become an educational tool to highlight the climate change issues facing us today. This paper uses the Climate Safe House (CSH) as a case study for education in architectural sustainability through experiential learning, offered as part of the Otago Polytechnic Bachelor of Design. Students engage with the project through research methodologies including site surveys, resident interviews, data sourced from government agencies, and physical modelling. The process involves collaboration across design disciplines, including product and interior design, and also includes connections with industry, both within the educational institution and among stakeholder industries introduced through BRCT. This project offers a rich learning environment where students become engaged through project-based learning within a community of practice spanning architecture, construction, energy and other related fields. The design outcomes are expressed in a series of public exhibitions and forums where community input is sought in a truly participatory process.
Parametric Design as an Approach to Respond to Complexity
A city is an intertwined texture arising from the relationships of its different components within a unified whole, so designing and planning such a complex entity is not an easy matter. Given that a city is a complex system with countless components and interconnections, providing flexible layouts that can respond to its unpredictable character, which is a result of this complexity, is unavoidable. The parametric design approach can produce flexible and transformable layouts at any stage of design. This study aimed to introduce parametric design as a modern approach to responding to complex urban issues, using descriptive and analytical methods. The paper first introduces complex systems and gives a brief account of their characteristics. Flexible design and layout flexibility are further concerns in the response to, and simulation of, complex urban systems, and are discussed in this study. In this regard, after describing the nature of the parametric approach as a flexible approach, and as a tool and appropriate way to respond to features such as limited predictability, feedback behaviour, complex interconnections, sensitivity to initial conditions, and hierarchy, the paper introduces parametric design.
Forthcoming Big Data on Smart Buildings and Cities: An Experimental Study on Correlations among Urban Data
Cities are complex systems of diverse and intertangled activities. These activities and their complex interrelationships create diverse urban phenomena, which in turn have considerable influence on the lives of citizens. This research aimed to develop a method of revealing the causes and effects among diverse urban elements in order to enable a better understanding of urban activities and, from that, better urban planning strategies. Specifically, this study was conducted to solve a data-recommendation problem found on a Korean public data homepage. First, a correlation analysis was conducted to find the correlations among random urban data. Then, based on the results of that correlation analysis, a weighted data network for each urban dataset was provided to users. It is expected that the weights of urban data thereby obtained will provide insights into cities and show how diverse urban activities influence each other and induce feedback.
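The correlation-to-weighted-network step described above can be sketched as follows. This is a minimal illustration on synthetic data: the three indicator columns are invented stand-ins, not the study's actual Korean public datasets.

```python
import numpy as np

# Synthetic stand-in for urban indicator time series (rows = observations,
# columns = indicators such as traffic volume, energy use, population density).
rng = np.random.default_rng(0)
base = rng.normal(size=100)
data = np.column_stack([
    base + rng.normal(scale=0.1, size=100),   # strongly tied to `base`
    -base + rng.normal(scale=0.1, size=100),  # strongly anti-correlated with it
    rng.normal(size=100),                     # unrelated indicator
])

# Pairwise Pearson correlations among the indicators.
corr = np.corrcoef(data, rowvar=False)

# Edge weights of the data network: absolute correlation, self-loops removed.
weights = np.abs(corr)
np.fill_diagonal(weights, 0.0)
```

A recommendation step would then suggest, for a chosen dataset, the other datasets with the largest edge weights.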
Structuring and Visualizing Healthcare Claims Data Using Systems Architecture Methodology
Healthcare delivery systems around the world are in crisis. The need to improve health outcomes while decreasing healthcare costs has led to an imminent call to action to transform the healthcare delivery system. While bioinformatics and biomedical engineering have primarily focused on biological-level data and biomedical technology, there is clear evidence of the importance of the delivery of care for patient outcomes. Classic singular decomposition approaches from reductionist science are not capable of explaining complex systems. Approaches and methods from systems science and systems engineering are utilized to structure healthcare delivery system data. Specifically, systems architecture is used to develop a multi-scale and multi-dimensional characterization of the healthcare delivery system, defined here as the Healthcare Delivery System Knowledge Base. This paper is the first to contribute a new method of structuring and visualizing a multi-dimensional and multi-scale healthcare delivery system using systems architecture in order to better understand healthcare delivery.
A Self-Organized Map Method to Classify Auditory-Color Synesthesia from Frontal Lobe Brain Blood Volume
Absolute pitch is the ability to identify a musical note without a reference tone. Training for absolute pitch often occurs in preschool education. It is necessary to clarify how well the trainee can make use of synesthesia in order to evaluate the effect of the training. To the best of our knowledge, there are no existing methods for objectively confirming whether a subject is using synesthesia. Therefore, in this study, we present a method to distinguish the use of color-auditory synesthesia from the separate use of color and audition during absolute pitch training. The method measures blood volume in the prefrontal cortex using functional near-infrared spectroscopy (fNIRS) and assumes that the cognitive step has two parts: a non-linear step and a linear step. For the linear step, we assume a second-order ordinary differential equation. For the non-linear part, it is extremely difficult, if not impossible, to create an inverse filter of such a complex system as the brain. Therefore, we apply a method based on a self-organizing map (SOM), guided by the available data. The presented method was tested on 15 subjects, and the estimation accuracy is reported.
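The SOM classification step can be illustrated with a minimal sketch. The two-unit map, the four-dimensional feature vectors, and the two synthetic "condition" clusters below are invented stand-ins for the paper's fNIRS blood-volume features, not the authors' actual pipeline; with only two units, the neighborhood update reduces to adjusting the best-matching unit alone.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 4-dimensional feature vectors standing in for prefrontal-cortex
# blood-volume features under two cognitive conditions (invented clusters).
cond_a = rng.normal(loc=0.0, scale=0.3, size=(50, 4))
cond_b = rng.normal(loc=2.0, scale=0.3, size=(50, 4))
data = np.vstack([cond_a, cond_b])

# A tiny 1-D self-organizing map with two units, initialized from one sample
# of each condition (a simplification; random initialization is also common).
weights = np.vstack([data[0], data[-1]]).astype(float)
for epoch in range(30):
    lr = 0.5 * (1.0 - epoch / 30)  # decaying learning rate
    for x in rng.permutation(data):
        bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))  # best-matching unit
        weights[bmu] += lr * (x - weights[bmu])                    # pull BMU toward sample

# Classify each sample by its best-matching unit.
labels = np.argmin(
    np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2), axis=1
)
```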
Systems Engineering Management Using Transdisciplinary Quality System Development Lifecycle Model
The successful realization of complex systems depends not only on technology issues and the process for implementing them, but on management issues as well. Managing the systems development lifecycle requires technical management; systems engineering management is that technical management, and it is accomplished by incorporating many activities. The three major activities are development phasing, the systems engineering process, and lifecycle integration. Systems engineering management activities are performed across the system development lifecycle. Due to the ever-increasing complexity of systems, as well as the difficulty of managing and tracking development activities, new ways to achieve systems engineering management activities are required. This paper presents a systematic approach used as a design management tool applied across systems engineering management roles. In this approach, the Transdisciplinary System Development Lifecycle (TSDL) Model has been modified and integrated with Quality Function Deployment (QFD). The resulting systematic approach is named the Transdisciplinary Quality System Development Lifecycle (TQSDL) Model. QFD translates the voice of the customer (VOC) into measurable technical characteristics. The modified TSDL model is based on the Axiomatic Design developed by Suh, which is applicable to all designs: products, processes, systems and organizations. The TQSDL model aims to provide a robust structure and systematic thinking to support the implementation of systems engineering management roles. This approach ensures that customer requirements are fulfilled and that all systems engineering manager roles and activities are satisfied.
Application of Systems Engineering Tools and Methods to Improve Healthcare Delivery Inside the Emergency Department of a Mid-Size Hospital
The emergency department (ED) can be considered a complex system of interacting entities: patients, human resources, software and hardware systems, interfaces, and other systems. This paper presents research on implementing a detailed Systems Engineering (SE) approach in a mid-size hospital in central Indiana. The methodology will be applied by the Initiative for Product Lifecycle Innovation (IPLI) at Indiana University to study and solve the crowding problem, with the aim of increasing patient throughput and enhancing the treatment experience; the nature of the crowding problem therefore needs to be investigated together with all the other problems that lead to it. The SE methods presented are workflow analysis and systems modeling, where SE tools such as Microsoft Visio are used to construct a group of system-level diagrams that capture: patient workflow, documentation and communication flow, data systems, human resources workflow and requirements, the leadership involved, and integration between the different ED systems. Finally, the ultimate goal is to manage the process through the implementation of an executable model using commercial software tools, which will identify bottlenecks, improve documentation flow, and help make the process faster.
Frequency Response of Complex Systems with Localized Nonlinearities
Finite Element Models (FEMs) are widely used to study and predict the dynamic properties of structures, and usually the prediction can be obtained with much more accuracy for a single component than for assemblies. For structural dynamics studies in the low and middle frequency range in particular, most complex FEMs can be seen as assemblies of linear components joined together at interfaces. From a modelling and computational point of view, these joints can be seen as localized sources of stiffness and damping, and can be modelled as lumped spring/damper elements, most of the time characterized by nonlinear constitutive laws. On the other hand, most FE programs are able to run nonlinear analysis in the time domain. They treat the whole structure as nonlinear even if there is only one nonlinear degree of freedom (DOF) among thousands of linear ones, making the analysis unnecessarily expensive from a computational point of view. In this work, a methodology is presented for obtaining the nonlinear frequency response of structures whose nonlinearities can be considered as localized sources. The work extends the well-known Structural Dynamic Modification Method (SDMM) to a nonlinear set of modifications, and allows the Nonlinear Frequency Response Functions (NLFRFs) to be obtained through an 'updating' process of the Linear Frequency Response Functions (LFRFs). A brief summary of the analytical concepts is given, starting from the linear formulation and examining the implications of the nonlinear one. The response of the system is formulated in both the time and frequency domains. First, the modal database is extracted and the linear response is calculated. Secondly, the nonlinear response is obtained through the nonlinear SDMM, by updating the underlying linear behavior of the system. The methodology, implemented in MATLAB, has been successfully applied to estimate the nonlinear frequency response of two systems: a two-DOF spring-mass-damper system and a full aircraft FE model. In spite of the different levels of complexity, both examples show the reliability and effectiveness of the method. The results highlight a feasible and robust procedure that allows a quick estimation of the effect of localized nonlinearities on the dynamic behavior. The method is particularly powerful when most of the FE model can be considered to act linearly and the nonlinear behavior is restricted to a few degrees of freedom. The procedure is very attractive from a computational point of view because the FEM needs to be run just once, which allows faster nonlinear sensitivity analysis and easier implementation of optimization procedures for the calibration of nonlinear models.
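The linear baseline of such an analysis, the receptance FRF of a two-DOF spring-mass-damper system, can be sketched as follows. The mass, stiffness and damping values are illustrative, and the nonlinear SDMM updating step is omitted here.

```python
import numpy as np

# 2-DOF spring-mass-damper: M x'' + C x' + K x = f(t). Illustrative values.
M = np.diag([1.0, 1.0])
K = np.array([[2000.0, -1000.0],
              [-1000.0, 1000.0]])
C = 0.001 * K  # light stiffness-proportional damping

# Receptance FRF H(w) = (K - w^2 M + i w C)^{-1}; track the driving-point
# entry H[0, 0] over a frequency grid (rad/s).
omega = np.linspace(1.0, 80.0, 2000)
H00 = np.array([np.linalg.inv(K - w**2 * M + 1j * w * C)[0, 0] for w in omega])

# Undamped natural frequencies from the eigenproblem K v = w^2 M v.
wn = np.sort(np.sqrt(np.linalg.eigvals(np.linalg.solve(M, K)).real))
peak = omega[np.argmax(np.abs(H00))]  # resonance picked off the FRF magnitude
```

With light damping, the FRF magnitude peaks essentially at the first undamped natural frequency; a nonlinear modification would then iteratively update this linear FRF.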
Augmented Reality for Maintenance Operator for Problem Inspections
Production-oriented factories need maintenance operators to work in shifts, monitoring and inspecting complex systems and varied equipment in situations of mechanical breakdown. Augmented reality (AR) is an emerging technology that embeds data into the environment for situation awareness, helping maintenance operators make decisions and solve problems. An application was designed to identify problems in steam generators and to inspect centrifugal pumps. The objective of this research was to find the best AR medium and the most suitable problem-solving strategy among analogy, the focal object method, and means-ends analysis. The two leakage-inspection scenarios concerned temperature and vibration. Two experiments were used in usability evaluation and future innovation, covering the decision-making process and problem-solving strategy. This study found that maintenance operators prefer a built-in magnifier to zoom in on components (55.6%), a 3D exploded view to track the problem parts (50%), and a line chart to find altered data or information (61.1%). There is a significant difference in the use of analogy (44.4%), focal objects (38.9%), and the means-ends strategy (16.7%). The marked differences between maintainers and operators lie in the application of a problem-solving strategy. Future work should explore multimedia information retrieval to support maintenance operators in decision-making.
Integrated Design in Additive Manufacturing Based on Design for Manufacturing
Nowadays, manufacturers face producing different versions of products due to quality, cost and time constraints. On the other hand, Additive Manufacturing (AM), as a production method based on the CAD model, disrupts the design and manufacturing cycle with new parameters. To address these issues, researchers have applied the Design for Manufacturing (DFM) approach to AM, but until now there has been no integrated approach for the design and manufacture of a product through AM. This paper therefore aims to provide a general methodology for managing the different production issues, as well as supporting interoperability between the AM process and different Product Lifecycle Management tools. The problem is that the systems engineering models used for managing complex systems cannot support product evolution and its impact on the product life cycle. It therefore seems necessary to provide a general methodology for managing the product diversity created by using AM. This methodology must consider manufacture and assembly during product design, as early as possible in the design stage. The latest DFM approach, as a methodology for analyzing the system comprehensively, integrates manufacturing constraints into the numerical model upstream. DFM for AM is thus used to import the characteristics of AM into the design and manufacturing process of a hybrid product, in order to manage the criteria coming from AM. The research also presents an integrated design method that takes into account knowledge of layer manufacturing technologies. For this purpose, an interface model based on the skin and skeleton concepts is provided: the usage and manufacturing skins are used to show the functional surfaces of the product, while the material flow and the links between the skins are demonstrated by usage and manufacturing skeletons. This integrated approach is therefore a helpful methodology for designers and manufacturers in decisions such as material and process selection, as well as the evaluation of product manufacturability.
Complex Network Approach to International Trade of Fossil Fuel
Energy has a prominent role in the development of nations. Countries with energy resources also have strategic power in the international trade of energy, since energy is essential for all stages of production in the economy. It is therefore important for countries to analyze the weaknesses and strengths of the system. At the same time, international trade is one of the fields that is analyzed as a complex network via network analysis. A complex network is one of the tools for analyzing complex systems with heterogeneous agents and interactions between them; it consists of nodes and the interactions between these nodes. In complex systems, the global properties that emerge as a result of these interactions are distinct from the (approximate) sum of the parts, so standard approaches to international trade are too superficial to analyze these systems. Network analysis provides a new way to analyze international trade as a network in which countries constitute the nodes and trade relations (exports or imports) constitute the edges. It then becomes possible to analyze the international trade network in terms of high-level indicators specific to complex networks, such as connectivity, clustering, assortativity/disassortativity, and centrality. In this analysis, the international trade of crude oil and coal, two types of fossil fuel, has been analyzed from 2005 to 2014 via network analysis. First, the networks are examined in terms of topological parameters such as density, transitivity, and clustering. Afterwards, fit to a Pareto distribution is analyzed via the Kolmogorov-Smirnov test. Finally, the weighted HITS algorithm is applied to the data as a centrality measure to determine the real prominence of countries in these trade networks. The weighted HITS algorithm is a strong tool for analyzing the network by ranking countries with regard to the prominence of their trade partners. We have calculated both an export centrality and an import centrality by applying the w-HITS algorithm to the data. As a result, the impacts of the trading countries are presented in terms of these high-level indicators.
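The weighted HITS iteration on a trade matrix can be sketched as below. The four-country trade values are invented for illustration, not the study's 2005-2014 fossil fuel data; hub scores play the role of export centrality and authority scores of import centrality.

```python
import numpy as np

# Toy weighted trade matrix W: W[i, j] = value exported from country i to j.
countries = ["A", "B", "C", "D"]
W = np.array([
    [0.0, 5.0, 2.0, 0.0],   # A exports mainly to B
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 4.0],
    [3.0, 0.0, 0.0, 0.0],
])

# Weighted HITS power iteration: hubs ~ export prominence,
# authorities ~ import prominence.
hubs = np.ones(4)
auth = np.ones(4)
for _ in range(100):
    auth = W.T @ hubs               # strong importers buy from strong exporters
    auth /= np.linalg.norm(auth)
    hubs = W @ auth                 # strong exporters sell to strong importers
    hubs /= np.linalg.norm(hubs)

export_rank = countries[int(np.argmax(hubs))]
import_rank = countries[int(np.argmax(auth))]
```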
Computer-Assisted Management of Building Climate and Microgrid with Model Predictive Control
Accounting for 40% of total world energy consumption, building systems are developing into technically complex, large energy consumers suitable for the application of sophisticated power management approaches that can greatly increase energy efficiency and even make buildings active energy market participants. Centralized control of building heating and cooling managed by economically optimal model predictive control shows promising results, with an estimated 30% increase in energy efficiency. The research is focused on implementing such a method in a case study performed on two floors of our faculty building, with corresponding wireless sensor data acquisition, remote heating/cooling units, and a central climate controller. The building walls are mathematically modeled with their corresponding material types, surface shapes and sizes. The models are then exploited to predict thermal characteristics and changes in different building zones. Exterior influences such as environmental conditions and the weather forecast, occupant behavior, and comfort demands are all taken into account in deriving price-optimal climate control. Finally, a DC microgrid with photovoltaics, a wind turbine, a supercapacitor, batteries, and fuel cell stacks is added to make the building a unit capable of active participation in a price-varying energy market. The computational burden of applying model predictive control to such a complex system is relaxed through a hierarchical decomposition of the microgrid and climate control: the former is designed as the higher hierarchical level, with pre-calculated price-optimal power flow control, and the latter as the lower-level control responsible for ensuring thermal comfort while exploiting the optimal supply conditions enabled by microgrid energy flow management. Such an approach is expected to enable more complex building subsystems to be taken into consideration in order to further increase energy efficiency.
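The receding-horizon idea can be illustrated with a deliberately tiny sketch: a one-zone discrete thermal model, exhaustive search standing in for a proper optimizer, and invented coefficients, prices and comfort bounds. A real MPC controller would re-solve this problem at every step and apply only the first input.

```python
import numpy as np
from itertools import product

# Simple discrete one-zone thermal model: T[k+1] = a*T[k] + b*u[k] + c*T_out.
a, b, c = 0.9, 0.3, 0.1                  # assumed wall/heater coefficients
T_out = 0.0                              # constant outdoor temperature, degC
T0 = 18.0                                # initial indoor temperature
comfort = (19.0, 23.0)                   # allowed band over the horizon
price = np.array([1.0, 3.0, 3.0, 1.0])   # energy price per step
u_levels = [0.0, 5.0, 10.0]              # admissible heating powers

def simulate(u_seq, T=T0):
    temps = []
    for u in u_seq:
        T = a * T + b * u + c * T_out
        temps.append(T)
    return temps

# Exhaustive search over the 4-step horizon: cheapest plan meeting comfort.
best_cost, best_plan = float("inf"), None
for plan in product(u_levels, repeat=4):
    temps = simulate(plan)
    if all(comfort[0] <= T <= comfort[1] for T in temps):
        cost = float(np.dot(price, plan))
        if cost < best_cost:
            best_cost, best_plan = cost, plan
```

The price profile makes the controller heat just enough during expensive steps, which is the economic behavior the abstract describes at building scale.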
Synchronization of Semiconductor Laser Networks
In this paper, synchronization of multiple chaotic semiconductor lasers is achieved by appealing to complex systems theory. In particular, we consider dynamical networks composed of semiconductor lasers as interconnected nodes, where the interactions in the networks are defined by coupling the first state of each node. A case of interest is synchronization in a master-slave configuration with star topology. The nodes of these networks are modeled as lasers and simulated in MATLAB. These results are applicable to private communication.
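The master-slave coupling through the first state can be sketched with a generic chaotic oscillator. The Lorenz system below is only a stand-in for the semiconductor laser dynamics (which would normally be modeled with laser rate equations), and the coupling gain and star size are illustrative.

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

dt, steps, k = 0.001, 20000, 30.0    # k: coupling gain on the first state
master = np.array([1.0, 1.0, 1.0])
# Star topology: every slave is driven by the master only.
slaves = [np.array([5.0, -2.0, 10.0]), np.array([-3.0, 4.0, 20.0])]

for _ in range(steps):
    new_master = master + dt * lorenz(master)
    for i, s in enumerate(slaves):
        drive = np.array([k * (master[0] - s[0]), 0.0, 0.0])  # first state only
        slaves[i] = s + dt * (lorenz(s) + drive)              # Euler step
    master = new_master

errors = [np.linalg.norm(master - s) for s in slaves]  # sync errors at t = 20
```

With a sufficiently large gain the slaves lock onto the master's chaotic trajectory, which is the property exploited for private communication.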
Quality Control of Automotive Gearbox Based On Vibration Signal Analysis
In a more complex system, such as an automotive gearbox, rigorous treatment of the data is necessary because there are several moving parts (gears, bearings, shafts, etc.) and thus several possible sources of error and noise. The basic objective of this work is the detection of damage in automotive gearboxes. The detection methods used are the wavelet method, the bispectrum, advanced filtering techniques (selective filtering) of vibration signals, and mathematical morphology. Gearbox vibration tests were performed on gearboxes in good condition and with defects, taken from the production line of a large vehicle assembler. The vibration signals were obtained using five accelerometers in different positions on the sample. The results obtained using kurtosis, the bispectrum, wavelets, and mathematical morphology showed that it is possible to identify the existence of defects in automotive gearboxes.
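As a minimal illustration of why kurtosis flags gear damage, the sketch below compares a synthetic "healthy" vibration signal with one containing a periodic impulse train; the signals and rates are invented, not the assembler's test data.

```python
import numpy as np

rng = np.random.default_rng(2)
fs, n = 10_000, 10_000          # sample rate (Hz) and number of samples
t = np.arange(n) / fs

# Healthy gearbox stand-in: a gear-mesh tone plus broadband noise.
healthy = np.sin(2 * np.pi * 300 * t) + 0.3 * rng.normal(size=n)

# Defective stand-in: same signal plus periodic impacts from a damaged tooth.
faulty = healthy.copy()
faulty[::500] += 6.0            # impulse train at the shaft rate

def kurtosis(x):
    """Excess kurtosis: near -1.5..0 for tone/noise mixes, high for impulsive signals."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

k_healthy, k_faulty = kurtosis(healthy), kurtosis(faulty)
```

The impulsive faults inflate the fourth moment far more than the variance, so the faulty signal's kurtosis stands clearly above the healthy baseline.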
Bounded Rational Heterogeneous Agents in Artificial Stock Markets: Literature Review and Research Direction
In this paper, we provide a literature survey on the artificial stock market (ASM) problem. The paper begins by exploring the complexity of the stock market and the need for ASMs. An ASM aims to investigate the link between individual behaviors (the micro level) and financial market dynamics (the macro level). The variety of patterns at the macro level is a function of the ASM's complexity. The financial market is a complex system in which the relationship between the micro and macro levels cannot be captured analytically; computational approaches, such as simulation, are expected to capture this connection, and agent-based simulation is the simulation technique commonly used to build ASMs. The paper proceeds by discussing the components of an ASM. We consider the role of behavioral finance (BF) alongside the traditional risk-aversion assumption in the construction of agents' attributes. The influence of social networks on the development of agent interactions is also addressed; network topologies such as small-world, distance-based, and scale-free networks may be utilized to outline economic collaborations. In addition, the primary methods for developing agents' learning and adaptive abilities are summarized, including genetic algorithms, genetic programming, artificial neural networks, and reinforcement learning. The most common statistical properties (the stylized facts) of stock markets that are used for calibration and validation of ASMs are also discussed. Moreover, we review the major related previous studies and categorize the approaches utilized in them. Finally, research directions and potential research questions are discussed: future ASM research may focus on the macro level, by analyzing market dynamics, or on the micro level, by investigating the wealth distributions of the agents.
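The micro-to-macro link can be sketched with a minimal heterogeneous-agent market; the chartist/fundamentalist split, reaction strengths, and linear price-impact rule below are illustrative assumptions, far simpler than the ASMs surveyed.

```python
import numpy as np

rng = np.random.default_rng(3)
n_agents, steps = 100, 300
fundamental = 100.0

# Heterogeneous, boundedly rational agents: chartists extrapolate the last
# return, fundamentalists expect reversion toward the fundamental value.
is_chartist = rng.random(n_agents) < 0.5
strength = rng.uniform(0.1, 1.0, n_agents)  # heterogeneous reaction strengths

prices = [fundamental, fundamental * 1.01]
for _ in range(steps):
    last_return = prices[-1] - prices[-2]
    demand = np.where(
        is_chartist,
        strength * last_return,                        # trend following
        strength * (fundamental - prices[-1]) * 0.05,  # mean reversion
    )
    # Market-maker price impact plus small exogenous noise (the macro outcome).
    prices.append(prices[-1] + 0.01 * demand.sum() + rng.normal(scale=0.1))

prices = np.array(prices)
returns = np.diff(prices)
```

Even this toy setup yields an endogenous price series from individual rules; calibrating such models against stylized facts is the validation step the survey discusses.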
Maximizing Performance of the Membranes Based on Quaternized Polysulfone/Polyvinyl Alcohol for Biomedical Applications: Rheological Investigations
The rheological response of blends obtained from quaternized polysulfone and polyvinyl alcohol in N-methyl-2-pyrrolidone was investigated in relation to the structural peculiarities of the polymers in the blend, the composition of the polymer mixtures, and the types of interactions involved. The results show that varying the polyvinyl alcohol composition in the studied system changes the rheological properties, suggesting that the PVA acts as a plasticizer. Consequently, the rheological behavior of the complex system, described by the nonlinear flow curve, indicates the impact of the polyvinyl alcohol content on the polysulfone solution, facilitating the subsequent preparation of bioactive membranes.
Handling Complexity of a Complex System Design: Paradigm, Formalism and Transformations
Current systems complexity has reached a degree that requires conception and design issues to be addressed while taking into account environmental, operational, social, legal, and financial aspects. One of the main challenges is therefore the way complex systems are specified and designed. The exponentially growing effort, cost, and time investment of complex systems in the modeling phase emphasize the need for a paradigm, a framework, and an environment to handle system model complexity. To this end, it is necessary to understand the expectations of the model's human users and their limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model needs to fulfill to meet human user expectations, and suggests a graph-based formalism for modeling complex systems. Finally, a set of transformations is defined to handle model complexity.
A New Model to Perform Preliminary Evaluations of Complex Systems for the Production of Energy for Buildings: Case Study
The building sector is responsible, in many industrialized countries, for about 40% of total energy requirements, so it seems necessary to devote some effort to this area in order to achieve a significant reduction in energy consumption and greenhouse gas emissions. The paper presents a study aiming to provide a design methodology able to identify the best configuration of the building/plant system from a technical, economic, and environmental point of view. Normally, the classical approach involves analyzing the building's energy loads under steady-state conditions and subsequently selecting measures aimed at improving energy performance, based on the previous experience of the architects and engineers in the design team. Instead, the proposed approach uses a sequence of two well-known, scientifically validated calculation methods (TRNSYS and RETScreen) that allow quite a detailed feasibility analysis. To assess the validity of the calculation model, an existing historical building in Central Italy, which will be the object of restoration and preservative redevelopment, was selected as a case study. The building consists of a basement and three floors, with a total floor area of about 3,000 square meters. The first step was the determination of the heating and cooling energy loads of the building in a dynamic regime by means of TRNSYS, which allows the real energy needs of the building to be simulated as a function of its use. Traditional methodologies, based as they are on steady-state conditions, cannot faithfully reproduce the effects of varying climatic conditions and of the inertial properties of the structure. With this model, it is possible to obtain quite accurate and reliable results that allow effective building-HVAC combinations to be identified. The second step consisted of using the output data obtained as input to the second calculation model, which enables different system configurations to be compared from the energy, environmental, and financial points of view, with an analysis of investment and operation and maintenance costs, thus allowing the economic benefit of each configuration to be determined. The classical methodology often leads to the choice of conventional plant systems, while our calculation model provides a financial-economic assessment for innovative energy systems with low environmental impact. Computational analysis can help in the design phase, particularly in the case of complex structures with centralized plant systems, by comparing the data returned by the calculation model for the different configurations.
A Temporal QoS Ontology for ERTMS/ETCS
Ontologies offer a means for representing and sharing
information in many domains, particularly in complex domains. For
example, it can be used for representing and sharing information
of System Requirement Specification (SRS) of complex systems
like the SRS of ERTMS/ETCS written in natural language. Since
this system is a real-time and critical system, generic ontologies,
such as OWL and generic ERTMS ontologies provide minimal
support for modeling temporal information omnipresent in these SRS
documents. To support the modeling of temporal information, one
of the challenges is to enable representation of dynamic features
evolving in time within a generic ontology with a minimal redesign
of it. The separation of temporal information from other information
can help to predict system runtime operation and to properly design
and implement them. In addition, it is helpful to provide a reasoning
and querying techniques to reason and query temporal information
represented in the ontology in order to detect potential temporal
inconsistencies. To address this challenge, we propose a lightweight
3-layer temporal Quality of Service (QoS) ontology for representing,
reasoning and querying over temporal and non-temporal information
in a complex domain ontology. Representing QoS entities in separated
layers can clarify the distinction between the non QoS entities
and the QoS entities in an ontology. The upper generic layer of
the proposed ontology provides an intuitive knowledge of domain
components, specially ERTMS/ETCS components. The separation of
the intermediate QoS layer from the lower QoS layer allows us to
focus on specific QoS Characteristics, such as temporal or integrity
characteristics. In this paper, we focus on temporal information that
can be used to predict system runtime operation. To evaluate our
approach, an example of the proposed domain ontology for the handover
operation, together with a reasoning rule over temporal relations in this
domain-specific ontology, is presented.
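A reasoning rule over temporal relations, such as the one evaluated for the handover operation, can be illustrated in miniature (the events, timings and rule below are hypothetical sketches, not the paper's actual ERTMS/ETCS ontology):

```python
# Minimal temporal-consistency check over interval relations, in the
# spirit of a reasoning rule applied to an ontology's temporal layer.
# Event names and timing values are illustrative only.

def before(intervals, a, b):
    """Allen's 'before' relation: interval a ends strictly before b starts."""
    return intervals[a][1] < intervals[b][0]

def check_precedence(intervals, required):
    """Return the required (a, b) precedence pairs that are violated."""
    return [(a, b) for a, b in required if not before(intervals, a, b)]

# hypothetical handover timeline: event -> (start, end) in seconds
timeline = {
    "announce_handover": (0.0, 1.0),
    "send_key":          (1.5, 2.0),
    "switch_channel":    (1.8, 2.5),  # overlaps send_key -> inconsistency
}
required = [("announce_handover", "send_key"),
            ("send_key", "switch_channel")]
violations = check_precedence(timeline, required)
```

A query over the temporal layer thus reduces to evaluating such relations; here the overlap between `send_key` and `switch_channel` is flagged as a temporal inconsistency.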
Expert Based System Design for Integrated Waste Management
Recently, an increasing number of researchers have
been focusing on working out realistic solutions to sustainability
problems. As sustainability issues gain higher importance for
organisations, the management of such decisions becomes critical.
Knowledge representation is a fundamental issue of complex
knowledge based systems. Many types of sustainability problems
would benefit from models based on experts’ knowledge. Cognitive
maps have been used for analyzing and aiding decision making. A
cognitive map can be made of almost any system or problem. A
fuzzy cognitive map (FCM) can successfully represent knowledge
and human experience, introducing concepts to represent the essential
elements and the cause and effect relationships among the concepts to
model the behaviour of any system. Integrated waste management
systems (IWMSs) are complex systems that can be decomposed into
related and unrelated subsystems and elements, where many factors
that may be complementary, contradictory, or competitive have to be
taken into consideration; these factors influence each other and
determine the overall decision process of the system. The goal of the
present paper is to construct an efficient IWMS which considers
various factors. The authors’ intention is to propose an expert-based
system design approach for implementing expert decision support in
the area of IWMSs and to introduce an appropriate methodology for
the development and analysis of group FCMs. A framework for such a
methodology, consisting of development and application phases, is presented.
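The FCM update rule at the heart of such a model can be sketched as follows (the concepts and causal weights are hypothetical, purely for illustration):

```python
import math

def fcm_step(activations, weights):
    """One synchronous FCM update: each concept integrates the weighted
    influence of the others plus its own current activation, squashed
    by a sigmoid into (0, 1)."""
    n = len(activations)
    new = []
    for i in range(n):
        s = activations[i] + sum(weights[j][i] * activations[j]
                                 for j in range(n) if j != i)
        new.append(1.0 / (1.0 + math.exp(-s)))
    return new

# Hypothetical IWMS concepts: recycling rate, landfill load, cost.
# weights[j][i] is the causal influence of concept j on concept i.
weights = [
    [0.0, -0.7, 0.4],   # recycling suppresses landfill, raises cost
    [0.0,  0.0, 0.3],   # landfill load raises cost
    [-0.2, 0.0, 0.0],   # cost pressure dampens recycling
]
state = [0.5, 0.5, 0.5]
for _ in range(30):      # iterate towards a fixed point
    state = fcm_step(state, weights)
```

Iterating the map until the activations settle gives the system's inferred equilibrium, which is how an FCM turns experts' causal judgments into a decision aid.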
Application of Neural Network on the Loading of Copper onto Clinoptilolite
The study investigated the implementation of neural
network (NN) techniques for predicting the loading of Cu
ions onto clinoptilolite. An experimental design using analysis of
variance (ANOVA) was chosen for testing the adequacy of the
neural network and for optimizing the effective input parameters
(pH, temperature and initial concentration). A feed-forward, multi-layer
perceptron (MLP) NN successfully tracked the non-linear behavior of
the adsorption process versus the input parameters, with a mean squared
error (MSE), correlation coefficient (R) and mean squared relative error
(MSRE) of 0.102, 0.998 and 0.004, respectively. The results showed
that NN modeling techniques can effectively predict and simulate
highly complex and non-linear processes such as ion exchange.
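The reported fit statistics can be computed with their standard definitions (a minimal sketch on toy data; MSRE is assumed here to be the mean squared relative error):

```python
import math

def fit_metrics(y_true, y_pred):
    """MSE, Pearson correlation coefficient R, and mean squared
    relative error (MSRE) between observed and predicted values."""
    n = len(y_true)
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    mt = sum(y_true) / n
    mp = sum(y_pred) / n
    cov = sum((t - mt) * (p - mp) for t, p in zip(y_true, y_pred))
    st = math.sqrt(sum((t - mt) ** 2 for t in y_true))
    sp = math.sqrt(sum((p - mp) ** 2 for p in y_pred))
    r = cov / (st * sp)
    msre = sum(((t - p) / t) ** 2 for t, p in zip(y_true, y_pred)) / n
    return mse, r, msre

# toy data: predictions close to observations give R near 1
y_obs = [1.0, 2.0, 3.0, 4.0]
y_hat = [1.1, 1.9, 3.2, 3.9]
mse, r, msre = fit_metrics(y_obs, y_hat)
```

Low MSE/MSRE together with R close to 1, as in the abstract, indicate that the MLP tracks the adsorption data closely.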
Role and Relative Effectiveness of the Immune System in Combating Smallpox and AIDS
The human body has a complex system of innate and adaptive mechanisms for combating infection. This article discusses the role and relative effectiveness of these mechanisms in relation to smallpox and AIDS.
Use of Gaussian-Euclidean Hybrid Function Based Artificial Immune System for Breast Cancer Diagnosis
Since only a few approaches within artificial immune systems (AIS) handle nonlinear problems, nonlinear AIS approaches need to be developed alongside the well-known solution techniques. The Gaussian function is usually used for similarity estimation in classification problems and pattern recognition. In this study, diagnosis of breast cancer, the second most widespread cancer in women, was performed with different distance calculation functions (Euclidean, Gaussian and a Gaussian-Euclidean hybrid) in the clonal selection model of classical AIS on the Wisconsin Breast Cancer Dataset (WBCD), taken from the University of California, Irvine Machine Learning Repository. We used 3-fold cross-validation to train and test on the dataset. According to the results, the maximum test classification accuracy was 97.35%, obtained with the Gaussian-Euclidean hybrid function on fold 3. The mean test classification accuracies were 94.78%, 94.45% and 95.31% for the Euclidean, Gaussian and Gaussian-Euclidean functions, respectively. These results suggest that the Gaussian-Euclidean hybrid function is a promising distance calculation method and may be considered as an alternative for hard nonlinear classification problems.
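The three distance measures can be illustrated as follows; since the abstract does not define the hybrid combination or the Gaussian width, both are assumptions here:

```python
import math

def euclidean(x, y):
    """Plain Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def gaussian(x, y, sigma=1.0):
    """Gaussian kernel turned into a dissimilarity in [0, 1):
    1 - exp(-d^2 / (2*sigma^2)).  sigma is an assumed parameter."""
    d = euclidean(x, y)
    return 1.0 - math.exp(-d * d / (2.0 * sigma * sigma))

def gaussian_euclidean(x, y, sigma=1.0):
    """Hypothetical hybrid: a simple average of the two measures
    (the paper's exact combination is not specified in the abstract)."""
    return 0.5 * (euclidean(x, y) + gaussian(x, y, sigma))

a, b = [0.0, 0.0], [3.0, 4.0]
d_e = euclidean(a, b)            # 5.0
d_g = gaussian(a, b)             # close to 1 for distant points
d_h = gaussian_euclidean(a, b)
```

In a clonal selection classifier, any of these functions plays the role of the affinity measure between an antigen (sample) and a memory cell.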
Risk Management and Security Practice in Customs Supply Chain: Application of Cross ABC Method to the Moroccan Customs
The customs supply chain is widely regarded as a complex system, owing not only to the variety and large number of actors but also to their complex structural links and interactions; as a result, it is subject to various types of risk. The economic, political and social impacts of those risks are highly detrimental to countries, businesses and the public; for this reason, risk management in the customs supply chain is becoming a crucial issue for ensuring sustainability, security and safety. The main questions of a customs risk management approach are which goods and means of transport should be examined, to what extent, and where future compliance resources should be directed. The purposes of this article are, firstly, to discuss the concept of the customs supply chain and, secondly, to present our risk management approach based on the Cross Activity Based Costing (ABC) method as an interactive tool to support decision making in customs risk management. Finally, a case study of Moroccan customs puts the theory into practice and draws together the various elements of a structured and efficient risk management approach.
Cloud Computing Support for Diagnosing Researches
One of the main biomedical problems lies in detecting dependencies in semi-structured data. Our solution comprises a biomedical portal and algorithms (integral health rating criteria and multidimensional data visualization methods). The biomedical portal processes diagnostic and research data in parallel using Microsoft System Center 2012 and Windows HPC Server cloud technologies. The service does not expose internal calculations to the user; instead, it provides a practical interface. When data is sent for processing, the user may track the status of the task and will receive the results as soon as computation is completed. The service includes its own algorithms and supports diagnosing and predicting medical cases. The approved methods are based on complex-system entropy methods, algorithms for determining the energy patterns of development, trajectory models of biological systems, and a logical-probabilistic approach with image blurring.
The Estimation of Human Vital Signs Complexity
Nonstationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals with special analysis methods is suitable for observing physiological processes. We demonstrate the possibility of using a deep physiological model, based on interpreting changes in the human body's functional states, combined with an analytical method based on matrix theory for physiological signal analysis, applied to high-risk cardiac patients. Evaluation of the interactions of cardiac signals reveals functional changes peculiar to each individual at the onset of a hemodynamic restoration procedure. We therefore suggest that assessments of the alterations in the functional state of the body after patients undergo surgery can be complemented by the data obtained from the suggested approach of evaluating the interactions of functional variables.
Mathematical Modeling of Uncompetitive Inhibition of Bi-Substrate Enzymatic Reactions
Currently, mathematical and computer modeling are widely used in biological studies to predict or assess the behavior of complex systems such as biological ones. This study deals with mathematical and computer modeling of bi-substrate enzymatic reactions, which play an important role in different biochemical pathways. The main objective is to present the results of an in silico investigation of bi-substrate enzymatic reactions in the presence of uncompetitive inhibitors, and to describe the inhibition effects in detail. Four models of uncompetitive inhibition were designed using different software packages. In particular, uncompetitive inhibition of the first [ES1] and the second ([ES1S2]; [FS2]) enzyme-substrate complexes has been studied. Simulation using the same kinetic parameters for all models allowed us to investigate the behavior of the reactions and revealed some interesting aspects of the influence of the different cases of uncompetitive inhibition. It has also been shown that uncompetitive inhibitors exhibit specific selectivity depending on the mechanism of the bi-substrate enzymatic reaction.
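For the simpler single-substrate case, the effect of an uncompetitive inhibitor on the Michaelis-Menten rate law can be sketched as follows (parameter values are illustrative; the paper's bi-substrate models are more elaborate):

```python
def uncompetitive_rate(s, i, vmax=1.0, km=0.5, ki=0.2):
    """Michaelis-Menten rate with an uncompetitive inhibitor, which
    binds only the enzyme-substrate complex:
        v = Vmax*[S] / (Km + [S]*(1 + [I]/Ki))
    Both the apparent Vmax and the apparent Km are reduced by the
    same factor (1 + [I]/Ki), the hallmark of uncompetitive inhibition."""
    return vmax * s / (km + s * (1.0 + i / ki))

v0 = uncompetitive_rate(1.0, 0.0)   # no inhibitor present
vi = uncompetitive_rate(1.0, 0.1)   # inhibitor always slows the reaction
```

In the bi-substrate case studied in the paper, the same (1 + [I]/Ki) factor appears against whichever enzyme-substrate complex the inhibitor targets.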
Tuberculosis Modelling Using Bio-PEPA Approach
Modelling is a widely used tool to facilitate the evaluation of disease management. The interest of epidemiological models lies in their ability to explore hypothetical scenarios and provide decision makers with evidence to anticipate the consequences of disease incursion and impact of intervention strategies.
All models are, by nature, simplifications of more complex systems. Models that involve diseases can be classified into different categories depending on how they treat the variability, time, space, and structure of the population. Approaches range from simple deterministic mathematical models to complex, spatially explicit stochastic simulations.
Thus, epidemiological modelling is now a necessity for epidemiological investigations, surveillance, testing hypotheses and generating follow-up activities necessary to perform complete and appropriate analysis.
The state of the art presented in the following allows us to identify the most appropriate approaches for the epidemiological study.
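As an example of the simplest model class mentioned above, a deterministic SIR model can be integrated in a few lines (this is not the Bio-PEPA formalism used in the paper; the parameters are illustrative):

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One Euler step of the deterministic SIR model: no space, no
    population structure, continuous state variables."""
    n = s + i + r
    ds = -beta * s * i / n          # new infections leave S
    di = beta * s * i / n - gamma * i
    dr = gamma * i                  # recoveries leave I
    return s + ds * dt, i + di * dt, r + dr * dt

# illustrative outbreak: 10 infectious among 1000, R0 = beta/gamma = 3
s, i, r = 990.0, 10.0, 0.0
for _ in range(1000):               # integrate to t = 100 with dt = 0.1
    s, i, r = sir_step(s, i, r, beta=0.3, gamma=0.1, dt=0.1)
```

Stochastic and spatially explicit approaches, such as Bio-PEPA, refine exactly this kind of compartmental skeleton.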
Reliability Approximation through the Discretization of Random Variables using Reversed Hazard Rate Function
Sometimes it is difficult to determine the exact reliability of complex systems by analytical procedures. An approximate solution to this problem can be obtained through the discretization of random variables. In this paper, we describe the usefulness of discretizing a random variable using the reversed hazard rate function of its continuous version. The discretization of the exponential distribution is demonstrated, and applications of this approach are cited. Numerical calculations indicate that the proposed approach gives a very good approximation of the reliability of complex systems under a stress-strength set-up. The performance of the proposed approach is better than that of the existing discrete concentration method of discretization. The approach is conceptually simple, handles analytic intractability, and reduces computational time. It can be applied in manufacturing industries for producing highly reliable items.
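The stress-strength computation itself can be sketched with a simple CDF-based discretization of the exponential distribution (the paper's reversed-hazard-rate scheme differs in detail; the grid step and rate parameters below are illustrative):

```python
import math

def discretize_exponential(lam, h=0.1, n=2000):
    """pmf of Exp(lam) discretized onto the grid 0, h, 2h, ...:
    P(X = k) = F((k+1)*h) - F(k*h).  A simple CDF-based scheme, used
    here only to illustrate the stress-strength set-up."""
    F = lambda x: 1.0 - math.exp(-lam * x)
    return [F((k + 1) * h) - F(k * h) for k in range(n)]

def stress_strength_reliability(p_strength, p_stress):
    """R = P(X > Y) for independent discretized strength X and stress Y
    on the same grid (ties count as failure, a conservative choice)."""
    r, cum = 0.0, 0.0               # cum accumulates P(Y < k)
    for px, py in zip(p_strength, p_stress):
        r += px * cum
        cum += py
    return r

# strength ~ Exp(0.5) (mean 2), stress ~ Exp(1.0) (mean 1);
# the exact continuous answer is P(X > Y) = 1.0/1.5
p_x = discretize_exponential(0.5)
p_y = discretize_exponential(1.0)
r = stress_strength_reliability(p_x, p_y)
```

Refining the grid step h drives the discrete estimate toward the exact continuous reliability, which is the sense in which discretization "approximates" reliability here.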
Radiation Damage as Nonlinear Evolution of Complex System
Irradiated material is a typical example of a complex
system with nonlinear coupling between its elements. During
irradiation, radiation damage develops, and this development
exhibits bifurcations and qualitatively different kinds of behavior.
The accumulation of primary defects in irradiated crystals is
considered within the framework of the nonlinear evolution of a complex
system. The nonlinear thermo-concentration feedback is treated as
the mechanism driving the development of self-oscillations.
It is shown that there are two regimes of defect density evolution
under stationary irradiation. In the first, defects accumulate: for
some system parameters the defect density grows monotonically and
tends to its stationary state. In the second, which occurs for
suitable parameters, self-oscillations of the defect density develop.
The stationary state, its stability and its type are found. The
bifurcation values of the parameters (environment temperature, defect
generation rate, etc.) are obtained. The frequency of the self-oscillations
and the conditions of their development are found and
estimated. It is shown that the defect density, heat fluxes and temperature
during self-oscillations can reach much higher values than the
expected steady-state values, which can lead to a departure from normal
operation and to an accident, e.g. in nuclear equipment.