• A k-nearest neighbour technique for experience-based adaptation of assembly stations

      Scrimieri, Daniele; Ratchev, S.M. (2014-01)
      We present a technique for automatically acquiring operational knowledge on how to adapt assembly systems to new production demands or recover from disruptions. Dealing with changes and disruptions affecting an assembly station is a complex process which requires deep knowledge of the assembly process, the product being assembled and the adopted technologies. Shop-floor operators typically perform a series of adjustments by trial and error until the expected results in terms of performance and quality are achieved. With the proposed approach, such adjustments are captured and their effect on the station is measured. Adaptation knowledge is then derived by generalising from individual cases using a variant of the k-nearest neighbour algorithm. The operator is informed about potential adaptations whenever the station enters a state similar to one contained in the experience base, that is, a state on which adaptation information has been captured. A case study is presented, showing how the technique reduces adaptation times. The general system architecture in which the technique has been implemented is described, including the role of the different software components and their interactions.
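      A minimal sketch of the retrieval step described above, assuming station states are encoded as numeric feature vectors and that each captured case stores the adjustment made together with a scalar measure of its effect; the paper's actual state encoding, distance metric and ranking are not specified here, so all of those are assumptions.

          import numpy as np

          class ExperienceBase:
              """Stores (station state, adjustment, measured effect) cases."""

              def __init__(self, k=3):
                  self.k = k
                  self.states, self.adjustments, self.effects = [], [], []

              def record(self, state, adjustment, effect):
                  self.states.append(np.asarray(state, dtype=float))
                  self.adjustments.append(adjustment)
                  self.effects.append(effect)

              def suggest(self, current_state):
                  """Return adjustments from the k most similar past states, best effect first."""
                  if not self.states:
                      return []
                  dist = np.linalg.norm(np.stack(self.states) - np.asarray(current_state), axis=1)
                  nearest = np.argsort(dist)[: self.k]
                  return sorted(((self.adjustments[i], self.effects[i]) for i in nearest),
                                key=lambda pair: -pair[1])

          eb = ExperienceBase(k=2)
          eb.record([1.0, 0.2], "increase feed rate by 5%", effect=0.8)   # hypothetical case
          eb.record([0.9, 0.3], "re-calibrate gripper", effect=0.5)       # hypothetical case
          print(eb.suggest([0.95, 0.25]))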
    • Kernel P systems: from modelling to verification and testing

      Gheorghe, Marian; Ceterchi, R.; Ipate, F.; Konur, Savas; Lefticaru, Raluca (2018-05)
      A kernel P system integrates in a coherent and elegant manner some of the most successfully used features of the P systems employed in modelling various applications. It also provides a theoretical framework for analysing these applications and a software environment for simulating and verifying them. In this paper, we illustrate the modelling capabilities of kernel P systems by showing how other classes of P systems can be represented with this formalism and providing a number of kernel P system models for a sorting algorithm and a broadcasting problem. We also show how formal verification can be used to validate that the given models work as desired. Finally, a test generation method based on automata is extended to non-deterministic kernel P systems.
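      As a rough illustration of the membrane-computing style of computation that kernel P systems generalise, the sketch below applies multiset-rewriting rules in a maximally parallel way inside a single compartment. It is a toy rendering under assumed semantics (greedy rule application, no membrane structure, no guards), not the kP-system semantics of the paper.

          from collections import Counter

          def apply_rules_max_parallel(contents, rules):
              """One evolution step: keep firing rules until none is applicable."""
              contents = Counter(contents)
              produced = Counter()           # products only become visible after the step
              changed = True
              while changed:
                  changed = False
                  for lhs, rhs in rules:
                      need = Counter(lhs)
                      if all(contents[s] >= n for s, n in need.items()):
                          contents -= need
                          produced += Counter(rhs)
                          changed = True
              return contents + produced

          # Rule a a -> b applied to a^5: two firings, leaving one a and two b
          print(apply_rules_max_parallel("aaaaa", [("aa", "b")]))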
    • Kinetic model development and simulation of simultaneous hydrodenitrogenation and hydrodemetallization of crude oil in trickle bed reactor

      Jarullah, Aysar Talib; Mujtaba, Iqbal M.; Wood, Alastair S. (2011)
      Hydrotreating (HDT) of crude oil is one of the more difficult tasks in the petroleum refining industry and one that has received little attention in the literature. Accurate kinetic models of the relevant reaction scheme are required to obtain useful models of HDT reactions, which can then be confidently used for reactor design, operation and control. In this work, an optimization technique is employed to evaluate the best kinetic models of a trickle bed reactor (TBR) process utilized for hydrodenitrogenation (HDN) and hydrodemetallization (HDM), the latter comprising hydrodevanadization (HDV) and hydrodenickelation (HDNi), of crude oil based on pilot plant experiments. The minimization of the sum of squared errors (SSE) between the experimental and estimated concentrations of nitrogen (N), vanadium (V) and nickel (Ni) compounds in the products is used as the objective function in the optimization problem to determine the kinetic parameters. A series of experiments was conducted in a continuous-flow isothermal trickle bed reactor, using crude oil as the feedstock and commercial cobalt-molybdenum on alumina (Co-Mo/γ-Al2O3) as the catalyst. A three-phase heterogeneous model based on two-film theory is developed to describe the behaviour of crude oil hydroprocessing in a pilot-plant TBR system. The hydroprocessing reactions have been modelled by power-law kinetics with respect to nitrogen, vanadium and nickel compounds, and with respect to hydrogen. In this work, the gPROMS (general PROcess Modelling System) package has been used for modelling, simulation and parameter estimation via optimization. The model simulation results were found to agree well with experiments carried out over a wide range of operating conditions. The model is employed to predict the concentration profiles of hydrogen, nitrogen, vanadium and nickel along the catalyst bed length in three phases.
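      The parameter-estimation idea can be pictured with a much-simplified stand-in for the reactor model: an isothermal plug-flow balance with power-law kinetics, -dC/dz = (k/LHSV) * C^n, fitted by minimising the SSE over measured outlet concentrations. This is a sketch under those assumptions; the run data are hypothetical and the paper's three-phase, two-film gPROMS model is not reproduced.

          import numpy as np
          from scipy.integrate import solve_ivp
          from scipy.optimize import minimize

          def outlet_conc(params, c_in, lhsv):
              """Integrate -dC/dz = (k/LHSV) * C^n over a unit dimensionless bed length."""
              k, n = params
              sol = solve_ivp(lambda z, c: -(k / lhsv) * np.maximum(c, 0.0) ** n,
                              (0.0, 1.0), [c_in], rtol=1e-8)
              return sol.y[0, -1]

          def sse(params, runs):
              """Sum of squared errors between measured and predicted outlet concentrations."""
              return sum((c_out - outlet_conc(params, c_in, lhsv)) ** 2
                         for c_in, lhsv, c_out in runs)

          # Hypothetical runs: (inlet N, wppm; LHSV, 1/h; measured outlet N, wppm)
          runs = [(1000.0, 0.5, 420.0), (1000.0, 1.0, 610.0), (1000.0, 1.5, 700.0)]
          fit = minimize(sse, x0=[1.0, 1.5], args=(runs,), method="Nelder-Mead")
          k_est, n_est = fit.x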
    • A knowledge based methodology for planning and designing of a flexible manufacturing system (FMS)

      Khan, M. Khurshid; Hussain, I.; Noor, S. (2011)
      This paper presents a Knowledge-Based (KB) integrated approach to planning and designing the number of machining centres, the selection of the material handling system, the layout and networking architecture, and the cost analysis for a Flexible Manufacturing System (FMS). The KB model can be applied to integrate the decision issues at both the planning and design stages of an FMS for three types of layout (single row, double row and loop) and three MHS types (robot-conveyor, AGV-conveyor and a hybrid AGV-robot-conveyor). The KB methodology starts from suitable input information, which includes annual demand for each part type, part-type information, machining-centre calculations, Material Handling System (MHS) selection, machining-centre layout selection, networking selection and financial analysis. The KB methodology is developed using AM, an expert system shell, and contains over 1500 KB rules. The performance of the system has been verified and validated through four published and four industrial case studies, respectively. The validation results from industry show that the KB methodology is capable of considering detailed design inputs and is able to assist in designing and selecting a practical FMS. It is concluded that a KB system for the present FMS application is a viable and efficient methodology.
    • Knowledge based system implementation for lean process in low volume automotive manufacturing (LVAM) with reference to process manufacturing

      Mohamed, N.M.Z.Nik; Khan, M. Khurshid (2011)
      The global manufacturing industry depends largely on new product development and processes to remain competitive. The product development process in the automotive industry is normally complicated, lengthy, expensive and risky. Hence, a study of lean manufacturing processes for low volume manufacturing in the automotive industry is proposed to address this issue by eliminating waste throughout the lengthy process. This paper presents a conceptual design approach to the development of a hybrid Knowledge Based (KB) system for lean process in Low Volume Automotive Manufacturing (LVAM). The research concentrates on low volume processes using a hybrid KB system, which is a blend of a KB system and Gauging Absences of Pre-requisites (GAP). The hybrid KB/GAP system identifies all potential waste elements of low volume process manufacturing. The KB system analyses the difference between the existing and the benchmark standards for lean process, for effective implementation, through the GAP analysis technique. The proposed model explores three major lean process components, namely Employee Involvement, Waste Elimination, and Kaizen (continuous improvement). These three components provide valuable information for decision makers to design and implement an optimised low volume manufacturing process, one that can also be applied in all process manufacturing, including chemical processing.
    • Knowledge Exchange, Technology Transfer and the Academy

      Earnshaw, Rae A. (2012)
      The relationship between the academy and the business community is currently perceived to be important to the future of both parties. Universities provide graduates to meet the needs and requirements of society and industry, and the latter supplies products and services to meet the needs of the market place. Whether public or private, industry increasingly seeks to use tools and techniques that increase efficiency and effectiveness, whilst at the same time maximizing quality and minimizing cost. The current trend towards companies outsourcing their R & D requirements to reduce corporate overheads and optimize staffing levels means that Universities can utilize the opportunity and bid to supply this expertise. Universities also generate their own spin-outs from intellectual property they create, as well as licensing technology to industry, rather than transferring it. However, the relationship between university and industry is not without its challenges, chief of which is the historical commitment of the academy to advance knowledge whether it is directly applicable or not. In addition, there are many fundamental and important long term research issues that many would argue are the primary duty of the academy to address, which may have no direct application in the short to medium term. This is resulting in increasing tensions in the academy, and in the priorities for national and international funding agencies. There can also be significant differences in culture and reward models between the academy and industry, which give rise to difficult issues for staff at the interface. This chapter reviews the current developments and the issues at the interface between business and the academy.
    • A knowledge-based genetic algorithm for unit commitment

      Aldridge, C.J.; McKee, S.; McDonald, J.R.; Galloway, S.J.; Dahal, Keshav P.; Bradley, M.E.; Macqueen, J.F. (2001)
      A genetic algorithm (GA) augmented with knowledge-based methods has been developed for solving the unit commitment economic dispatch problem. The GA evolves a population of binary strings which represent commitment schedules. The initial population of schedules is chosen using a method based on elicited scheduling knowledge. A fast rule-based dispatch method is then used to evaluate candidate solutions. The knowledge-based genetic algorithm is applied to a test system of ten thermal units over 24-hour time intervals, including minimum on/off times and ramp rates, and achieves lower cost solutions than Lagrangian relaxation in comparable computational time.
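      The schedule encoding and knowledge-based seeding might look like the sketch below: a commitment schedule is a bit-matrix (units x hours) and the initial population is built around a priority-list heuristic instead of uniform random strings. Unit capacities and the load profile are illustrative, and the rule-based dispatch and cost evaluation of the paper are not reproduced.

          import random

          UNITS, HOURS = 10, 24
          priority = list(range(UNITS))               # units in assumed merit order

          def seeded_schedule(load_profile, capacity):
              """Commit units in priority order until the forecast load is covered."""
              bits = [[0] * HOURS for _ in range(UNITS)]
              for t, load in enumerate(load_profile):
                  met = 0.0
                  for u in priority:
                      if met >= load:
                          break
                      bits[u][t] = 1
                      met += capacity[u]
              return bits

          def mutate(bits, rate=0.01):
              """Flip a few bits so the population is diverse around the heuristic seed."""
              return [[b ^ (random.random() < rate) for b in row] for row in bits]

          capacity = [455, 455, 130, 130, 162, 80, 85, 55, 55, 55]          # MW, illustrative
          load = [700 + 300 * (1 - abs(12 - t) / 12) for t in range(HOURS)]  # hypothetical demand
          population = [mutate(seeded_schedule(load, capacity)) for _ in range(50)]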
    • Knowledge-based Lean Six Sigma Maintenance System for Sustainable Buildings

      Al Dairi, Jasim S.S.; Khan, M. Khurshid; Munive-Hernandez, J. Eduardo (2017)
      Purpose – This paper develops a Knowledge-based (KB) System for Lean Six Sigma (LSS) Maintenance in environmentally Sustainable Buildings (Lean6-SBM). Design/methodology/approach – The Lean6-SBM conceptual framework has been developed using the rule-base approach of KB systems and joint integration with the Gauge Absence Prerequisites (GAP) technique. A comprehensive literature review is given for the main pillars of the framework with a typical output of GAP analysis. Findings – Implementation of LSS in the sustainable building maintenance context requires a pre-assessment of the organisation's capabilities. A conceptual framework with a design structure is proposed to tackle this issue, providing an enhanced strategic and operational decision-making hierarchy. Research limitations/implications – Future research work might consider validating this framework in other types of industries. Practical implications – Maintenance activities in environmentally sustainable buildings must take prodigious standards into consideration and, therefore, a robust quality assurance measure has to be integrated. Originality/value – The significance of this research is to present a novel use of hybrid KB/GAP methodologies to develop a Lean6-SBM system. The originality and novelty of this approach will assist in identifying quality perspectives while implementing different maintenance strategies in the sustainable building context.
    • Knowledge-Based Lean Six Sigma System for Enhancing Quality Management Performance in Healthcare Environment

      Al Khamisi, Yousuf N.; Khan, M. Khurshid; Munive-Hernandez, J. Eduardo (2018)
      This paper presents the development of a Knowledge-Based System (KBS) to support the implementation of Lean Six Sigma (L6σ) principles applied to enhance Quality Management (QM) performance within a healthcare environment. Building of the KBS started by acquiring knowledge from experts in the field of L6σ and QM in healthcare. The acquired knowledge has been represented in a rule-based approach for capturing L6σ practices. These rules are expressed in IF…THEN form, where IF is the premise and THEN is the action. The produced rules have been integrated with the Gauging Absence of Pre-requisites (GAP) technique to facilitate benchmarking of best practice in a healthcare environment. A comprehensive review of the structure of the system is given, detailing a typical output of the KBS. Implementation of L6σ principles to enhance QM performance in a healthcare environment requires a pre-assessment of the organisation's competences. The KBS provides an enhanced strategic and operational decision-making hierarchy for achieving a performance benchmark. This research presents a novel application of a hybrid KBS with GAP methodology to support the implementation of L6σ principles to enhance QM performance in a healthcare environment.
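      The IF…THEN/GAP combination can be pictured as below; the two rules and the questionnaire answers are invented for illustration and are not taken from the elicited rule base.

          def gap_report(answers, rules):
              """Gauging Absence of Pre-requisites: list actions whose premise is unmet."""
              gaps = []
              for premise, action in rules:
                  if not answers.get(premise, False):   # IF part not satisfied ...
                      gaps.append(action)               # ... THEN record the required action
              return gaps

          rules = [  # hypothetical rules
              ("defects_tracked_per_process", "Introduce SPC charts before Six Sigma projects"),
              ("staff_trained_in_lean_basics", "Run lean awareness training for clinical staff"),
          ]
          print(gap_report({"defects_tracked_per_process": True}, rules))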
    • Knowledge-Discovery Incorporated Evolutionary Search for Microcalcification Detection in Breast Cancer Diagnosis.

      Peng, Yonghong; Yao, Bin; Jiang, Jianmin (2006)
      Objectives: The presence of microcalcifications (MCs), clusters of tiny calcium deposits that appear as small bright spots in a mammogram, has been considered a very important indicator for breast cancer diagnosis. Much research has been performed on developing computer-aided systems for the accurate identification of MCs; however, automatic computer-based detection of MCs has proved difficult because of the complicated nature of the surrounding breast tissue and the variation of MCs in shape, orientation, brightness and size. Methods and materials: This paper presents a new approach for the effective detection of MCs by incorporating a knowledge-discovery mechanism in the genetic algorithm (GA). In the proposed approach, called the knowledge-discovery incorporated genetic algorithm (KD-GA), the genetic algorithm is used to search for bright spots in the mammogram and a knowledge-discovery mechanism is integrated to improve the performance of the GA. The knowledge-discovery mechanism evaluates the possibility of a bright spot being a true MC and adaptively adjusts the associated fitness values. The fitness adjustment indirectly guides the GA to extract the true MCs and eliminate the false MCs (FMCs) accordingly. Results and conclusions: The experimental results demonstrate that incorporating the knowledge-discovery mechanism into the genetic algorithm eliminates FMCs and produces improved performance compared with conventional GA methods. Furthermore, the experimental results show that the proposed KD-GA method provides a promising and generic approach for the development of computer-aided diagnosis for breast cancer.
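      The fitness-adjustment mechanism might be sketched as follows: a plausibility score in [0, 1] for how MC-like a detected bright spot is scales the raw GA fitness up for likely true MCs and down for likely false ones. The spot features and thresholds here are assumptions, not the paper's elicited knowledge.

          def mc_likelihood(spot):
              """Crude plausibility score from assumed spot features (size and contrast)."""
              size_ok = 1.0 if 1 <= spot["diameter_px"] <= 15 else 0.2
              contrast = min(spot["contrast"] / 50.0, 1.0)
              return size_ok * contrast

          def adjusted_fitness(raw_fitness, spot):
              """Damp the fitness of probable FMCs, boost probable true MCs."""
              return raw_fitness * (0.5 + mc_likelihood(spot))

          print(adjusted_fitness(10.0, {"diameter_px": 4, "contrast": 40.0}))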
    • kPWorkbench: a software framework for Kernel P systems

      Gheorghe, Marian; Ipate, F.; Mierla, L.M.; Konur, Savas (2015)
      P systems are computational models introduced in the context of membrane computing, a computational paradigm within the more general area of unconventional computing. Kernel P (kP) systems are defined to unify the specification of different variants of P systems, motivated by challenging theoretical aspects and the need to model different problems. In this paper, we present kPWorkbench, a software framework developed to support kP systems. kPWorkbench integrates several simulation and verification tools and methods, providing a software suite for the modelling and analysis of membrane systems.
    • kPWorkbench: A software suit for membrane systems

      Konur, Savas; Mierla, L.M.; Ipate, F.; Gheorghe, Marian (2020)
      Membrane computing is a new natural computing paradigm inspired by the functioning and structure of biological cells, and has been successfully applied to many different areas, from biology to engineering. In this paper, we present kPWorkbench, a software framework developed to support membrane computing and its applications. kPWorkbench offers unique features, including modelling, simulation, agent-based high performance simulation and verification, which allow modelling and computational analysis of membrane systems. The kPWorkbench formal verification component provides the opportunity to analyse the behaviour of a model and validate that important system requirements are met and certain behaviours are observed. The platform also features a property language based on natural language statements to facilitate property specification.
    • Laboratory experimental study of ocean waves propagating over a partially buried pipeline in a trench layer

      Sun, K.; Zhang, J.; Gao, Y.; Jeng, D.; Guo, Yakun; Liang, Z. (2019-02-01)
      Seabed instability around a pipeline is one of the primary concerns in offshore pipeline projects. To date, most studies focus on investigating the wave/current-induced response within a porous seabed around either a fully buried pipeline or a thoroughly exposed one. In this study, unlike previous investigations, a series of comprehensive laboratory experiments are carried out in a wave flume to investigate the wave-induced pore pressures around a partially embedded pipeline in a trench layer. Measurements show that the presence of the partially buried pipeline can significantly affect the excess pore pressure in a partially backfilled trench layer, which deviates considerably from that predicted by the theoretical approach. The morphology of the trench layer accompanied by the backfill sediments, especially a deeper trench and thicker backfill (i.e., b ≥ 1D, e ≥ 0.5D), provides a certain degree of resistance to seabed instability. The amplitude of excess pore pressure around the trench layer roughly exhibits a left-right asymmetric distribution along the periphery of the pipeline, and decays sharply from the upper layer of the trench to the lower region. A deeper trench and thicker buried layer significantly weaken the pore-water pressures in the whole trench area, thus sheltering and protecting the submarine pipeline against transient seabed liquefaction.
    • Laminar and turbulent analytical dam break wave modelling on dry-downstream open channel flow

      Taha, T.; Lateef, A.O.A.; Pu, Jaan H. (2018-09)
      A dam break wave, caused by a discontinuity in the depth and velocity of a flow, results from the instantaneous release of a body of water in a channel and is naturally classified as a rapidly varied unsteady flow. Due to its nature, it is hard to represent accurately with analytical models. The aim of this study is to establish the modelling differences and levels of complexity between analytically simulated explicit laminar and turbulent dry-bed dam break wave free-surface profiles. An in-depth solution for the free-surface profile has been provided and evaluated by reproducing reported dam break flow measurements at various locations. The methodology adopted utilizes the free-surface profile formulations presented by Chanson [1,2], which are developed using the method of characteristics. In order to validate the results of the presented analytical models in describing the dam break wave under dry-bed conditions, published experimental data provided by Schoklitsch [3], Debiane [4] and Dressler [5] are used to compare and analyze the performance of the dam break waves under laminar and turbulent flow conditions.
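      For orientation, the classical frictionless (Ritter) dry-bed solution, which the laminar and turbulent formulations of Chanson [1,2] refine with friction terms, can be written down directly; the sketch below omits those friction corrections.

          import numpy as np

          def ritter_profile(x, t, h0, g=9.81):
              """Free-surface height for an instantaneous dry-bed dam break at x = 0, t > 0."""
              c0 = np.sqrt(g * h0)                      # celerity of the initial depth h0
              wave = (2 * c0 - x / t) ** 2 / (9 * g)    # parabolic profile inside the wave
              return np.where((x >= -c0 * t) & (x <= 2 * c0 * t), wave,
                              np.where(x < -c0 * t, h0, 0.0))

          x = np.linspace(-50.0, 100.0, 7)
          print(ritter_profile(x, t=5.0, h0=1.0))       # undisturbed, wave region, dry bed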
    • A laminar flow model of aerosol survival of epidemic and non-epidemic strains of Pseudomonas aeruginosa isolated from people with cystic fibrosis

      Clifton, I.J.; Fletcher, L.A.; Beggs, Clive B.; Denton, M.; Peckham, D.G. (2008)
      Cystic fibrosis (CF) is an inherited multi-system disorder characterised by chronic airway infection with pathogens such as Pseudomonas aeruginosa. Acquisition of P. aeruginosa by patients with CF is usually from the environment, but recent studies have demonstrated patient-to-patient transmission of certain epidemic strains, possibly via an airborne route. This study was designed to examine the survival of P. aeruginosa within artificially generated aerosols. Survival was affected by the solution used for aerosol generation and, within the aerosols, was adversely affected by an increase in air temperature. Both epidemic and non-epidemic strains of P. aeruginosa were able to survive within the aerosols, but strains expressing a mucoid phenotype had a survival advantage. This suggests that segregating individuals free of P. aeruginosa from those with chronic P. aeruginosa infection, who are more likely to be infected with mucoid strains, may help reduce the risk of cross-infection. Environmental factors also appear to influence bacterial survival. Warming and drying the air within clinical areas and avoidance of humidification devices may also be beneficial in reducing the risk of cross-infection.
    • Landfill leachate treatment with ozone and ozone/hydrogen peroxide systems.

      Tizaoui, Chedly; Bouselmi, L.; Mansouri, L.; Ghrabi, A. (2007)
      In the search for an efficient and economical method to treat leachate generated from a controlled municipal solid waste landfill site (Jebel Chakir) in the region of greater Tunis in Tunisia, ozone alone and ozone combined with hydrogen peroxide were studied. The leachate was characterised by high COD, low biodegradability and an intense dark colour. A purpose-built reactor, designed to avoid foaming, was used for the study. It was found that ozone efficacy was almost doubled when combined with hydrogen peroxide at 2 g/L, but higher H2O2 concentrations gave lower performance. Enhancement of the leachate biodegradability from about 0.1 to about 0.7 was achieved by the O3/H2O2 system. Insignificant changes in pH, possibly due to the buffering effect of bicarbonate, were found. A small decrease in sulphate concentrations was also observed. In contrast, the chloride concentration declined at the beginning of the experiment and then increased to reach its initial value. Estimates of the operating costs were made for comparison purposes and it was found that the O3/H2O2 system at 2 g/L H2O2 gave the lowest cost, at about 3.1 TND (2.3 USD)/kg COD removed.
    • Large-scale data analysis using the Wigner function

      Earnshaw, Rae A.; Lei, Ci; Li, Jing; Mugassabi, Souad; Vourdas, Apostolos (2012)
      Large-scale data are analysed using the Wigner function. It is shown that the ‘frequency variable’ provides important information, which is lost with other techniques. The method is applied to ‘sentiment analysis’ in data from social networks and also to financial data.
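      A minimal discrete version of the Wigner function of a one-dimensional data series might look like the sketch below; edge handling is simplified and no smoothing kernel is applied, unlike production implementations.

          import numpy as np

          def wigner(s):
              """Discrete Wigner distribution: rows index time, columns discrete frequency."""
              s = np.asarray(s, dtype=complex)
              n = len(s)
              w = np.zeros((n, n))
              for t in range(n):
                  tau_max = min(t, n - 1 - t)           # largest symmetric lag available
                  kernel = np.zeros(n, dtype=complex)
                  for tau in range(-tau_max, tau_max + 1):
                      kernel[tau % n] = s[t + tau] * np.conj(s[t - tau])
                  w[t] = np.fft.fft(kernel).real        # conjugate-symmetric, so FFT is real
              return w

          x = np.sin(2 * np.pi * 0.1 * np.arange(64))   # single-tone test series
          W = wigner(x)                                 # W[t] gives frequency content at time t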
    • Lattice-Boltzmann coupled models for advection-diffusion flow on a wide range of Péclet numbers

      Dapelo, Davide; Simonis, S.; Krause, J.J.; Bridgeman, John (Elsevier, 2021-04)
      Traditional Lattice-Boltzmann modelling of advection–diffusion flow is affected by numerical instability if the advective term becomes dominant over the diffusive term (i.e., high-Péclet flow). To overcome the problem, two 3D one-way coupled models are proposed. In a traditional model, a Lattice-Boltzmann Navier–Stokes solver is coupled to a Lattice-Boltzmann advection–diffusion model. In a novel model, the Lattice-Boltzmann Navier–Stokes solver is coupled to an explicit finite-difference algorithm for advection–diffusion. The finite-difference algorithm also includes a novel approach to mitigate the numerical diffusivity connected with the upwind differentiation scheme.
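      The finite-difference half of the coupling can be pictured in one dimension as below; the velocity field would be supplied by the Lattice-Boltzmann Navier-Stokes solver (not shown), periodic boundaries are used for brevity, and the paper's mitigation of upwind numerical diffusivity is not reproduced. Parameters are illustrative.

          import numpy as np

          def step(c, u, d, dx, dt):
              """One explicit step: first-order upwind advection + central diffusion."""
              adv = np.where(u > 0,
                             u * (c - np.roll(c, 1)) / dx,     # backward difference, u > 0
                             u * (np.roll(c, -1) - c) / dx)    # forward difference, u < 0
              diff = d * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx ** 2
              return c + dt * (diff - adv)

          nx, dx, dt, d = 200, 0.01, 1e-4, 1e-4
          u = np.full(nx, 0.5)                          # stand-in for the flow solver output
          print("grid Peclet:", u[0] * dx / d)          # >> 2 marks advection dominance
          c = np.exp(-((np.arange(nx) * dx - 0.5) ** 2) / 1e-3)
          for _ in range(100):
              c = step(c, u, d, dx, dt)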