• Data mining in real-world traditional Chinese medicine clinical data warehouse

      Zhou, X.; Liu, B.; Zhang, X.; Xie, Q.; Zhang, R.; Wang, Y.; Peng, Yonghong (2014)
      The real-world clinical setting is the major arena of traditional Chinese medicine (TCM): through long-term practical clinical activity, TCM has developed established theoretical knowledge and clinical solutions suited to personalized treatment. Clinical phenotypes are the most important features captured by TCM for diagnosis and treatment, and they are diverse and dynamically changeable in real-world clinical settings. Together with clinical prescriptions containing multiple herbal ingredients, TCM clinical activities embody immensely valuable, high-dimensional data for knowledge distillation and hypothesis generation. In China, with the curation of large-scale real-world clinical data from routine clinical activities, transforming these data into clinically insightful knowledge has increasingly become a hot topic in the TCM field. This chapter introduces the application of data warehouse techniques and data mining approaches to real-world TCM clinical data, drawn mainly from electronic medical records. The main framework of clinical data mining applications in the TCM field is also introduced, with emphasis on related work in this field. Key points and issues for improving research quality are discussed, and future directions are proposed.
    • Data mining: concepts and techniques

      Han, J.; Kamber, M.; Pei, J. (2012-06-22)
    • Data quality and governance in a UK social housing initiative: Implications for smart sustainable cities

      Duvier, Caroline; Anand, Prathivadi B.; Oltean-Dumbrava, Crina (2018)
      Smart Sustainable Cities (SSC) consist of multiple stakeholders, who must cooperate for SSCs to be successful. Housing is an important challenge in many cities, and social housing organisations are therefore key stakeholders. This paper presents a qualitative case study of a social housing provider in the UK that implemented a business intelligence project (a method to assess data networks within an organisation) to increase data quality and data interoperability. Our analysis suggests that creating pathways for the different information systems within an organisation to ‘talk to’ each other is the first step. Issues during the project implementation included a lack of training and development, organisational reluctance to change, and the lack of a project plan. The challenges the organisation faced during this project can be instructive for those implementing SSCs. Currently, many SSC frameworks and models exist, yet most seem to neglect the localised challenges faced by the different stakeholders. This paper hopes to help bridge this gap in the SSC research agenda.
    • Data quality challenges in the UK social housing sector

      Duvier, Caroline; Neagu, Daniel; Oltean-Dumbrava, Crina; Dickens, D. (2018)
      The social housing sector has yet to realise the potential of high data quality. While other businesses, mainly in the private sector, reap the benefits of data quality, the social housing sector seems paralysed, as it is still struggling with recent government regulations and steep revenue reductions. This paper offers a succinct review of the relevant literature on data quality and how it relates to social housing. The Housing and Development Board in Singapore offers a good example of how to integrate data quality initiatives in the social housing sector. Taking this example, the research presented in this paper extrapolates cross-disciplinary recommendations on how to implement data quality initiatives in social housing providers in the UK.
    • Data-flow vs control-flow for extreme level computing

      Evripidou, P.; Kyriacou, Costas (2013)
      This paper challenges the current thinking for building High Performance Computing (HPC) systems, which is based on sequential computing, also known as the von Neumann model, by proposing novel systems based on the Dynamic Data-Flow model of computation. The switch to multi-core chips has brought parallel processing into the mainstream. The computing industry and research community were forced into this switch because they hit the Power and Memory walls. Will the same happen with HPC? The United States, through its DARPA agency, commissioned a study in 2007 to determine what kind of technologies would be needed to build an Exaflop computer. The head of the study was very pessimistic about the possibility of having an Exaflop computer in the foreseeable future. We believe that many of the findings that caused the pessimistic outlook were due to the limitations of the sequential model. A paradigm shift might be needed in order to achieve affordable Exascale-class supercomputers.
    • De-smokeGCN: Generative Cooperative Networks for joint surgical smoke detection and removal

      Chen, L.; Tang, W.; John, N.W.; Wan, Tao Ruan; Zhang, J.J. (IEEE, 2020-05)
      Surgical smoke removal algorithms can improve the quality of intra-operative imaging and reduce hazards in image-guided surgery, a highly desirable post-process for many clinical applications. These algorithms also enable effective computer vision tasks for future robotic surgery. In this paper, we present a new unsupervised learning framework for high-quality pixel-wise smoke detection and removal. One of the well-recognized grand challenges in using convolutional neural networks (CNNs) for medical image processing is obtaining intra-operative medical imaging datasets for network training and validation; such datasets are scarce and of variable quality. Our novel training framework does not require ground-truth image pairs. Instead, it learns purely from computer-generated simulation images. This approach opens up new avenues and bridges a substantial gap between conventional non-learning-based methods and those requiring prior knowledge gained from extensive training datasets. Inspired by the Generative Adversarial Network (GAN), we have developed a novel generative-collaborative learning scheme that decomposes the de-smoke process into two separate tasks: smoke detection and smoke removal. The detection network is used as prior knowledge, and also as a loss function to maximize its support for training of the smoke removal network. Quantitative and qualitative studies show that the proposed training framework outperforms state-of-the-art de-smoking approaches, including the latest GAN frameworks (such as PIX2PIX). Although trained on synthetic images, experimental results on clinical images prove the effectiveness of the proposed network for detecting and removing surgical smoke on both simulated and real-world laparoscopic images.
    • Decision support for coordinated road traffic control actions

      Dahal, Keshav P.; Almejalli, Khaled A.; Hossain, M. Alamgir (2013)
      Selection of the most appropriate traffic control actions to solve non-recurrent traffic congestion is a complex task, which requires significant expert knowledge and experience. Also, the application of a control action for solving a local traffic problem could create traffic congestion at different locations in the network because of the strong interrelations between traffic situations at different locations of a road network. Therefore, coordination of control strategies is required to make sure that all available control actions serve the same objective. In this paper, an Intelligent Traffic Control System (ITCS) based on a coordinated-agent approach is proposed to assist the human operator of a road traffic control centre to manage the current traffic state. In the proposed system, the network is divided into sub-networks, each of which has its own associated agent. The agent of the sub-network with an incident reacts with other affected agents in order to select the optimal traffic control action, so that a globally acceptable solution is found. The agent uses an effective way of calculating the control action fitness locally and globally. The capability of the proposed ITCS has been tested for a case study of a part of the traffic network in the Riyadh city of Saudi Arabia. The obtained results show its ability to identify the optimal global control action.
    • Decision-making model for supply chain risk management in the petroleum industry

      Aroge, Olatunde O.; Rahmanian, Nejat; Munive-Hernandez, J. Eduardo; Abdi, Reza (2020)
      The purpose of this paper is to develop a decision-making model for supporting the management of risks in supply chains. The proposed model is applied to the case of the oil industry in Nigeria. A Partial Least Squares Structural Equation Model (PLS-SEM) is developed to measure the significance of the influence of risk management strategy on mitigating disruption risks, and its correlations with the performance of activities in the supply chain and the relevance of key performance measures in the organisation. The model considers seven aspects: behavioural-based management strategy, buffer-based management strategy, exploration and production risks, environmental and regulatory compliance risks, geopolitical risks, supply chain performance, and organisational performance measures. A survey questionnaire was administered to 187 participants from the oil industry to collect data to populate the model. Based on the PLS-SEM methodology, an optimised risk management decision-making method was developed. The results show that the behavioural-based mechanism predicts the capacity of the organisation to manage risks successfully in its supply chain. The proposed approach provides a new and practical methodology to manage disruption risks in supply chains. Further, the behavioural-based mechanism can help to formulate risk management strategies in the oil industry.
    • Decomposition of manufacturing processes: a review

      Mohamed, N.M.Z.Nik; Khan, M. Khurshid (2012)
      Manufacturing is a global activity that started during the industrial revolution in the late 19th century to cater for the large-scale production of products. Since then, manufacturing has changed tremendously through the innovations of technology, processes, materials, communication and transportation. The major challenge facing manufacturing is to produce more products using less material, less energy and less involvement of labour. To face these challenges, manufacturing companies must have a strategy and competitive priority in order for them to compete in a dynamic market. A review of the literature on the decomposition of manufacturing processes outlines three main processes, namely: high volume, medium volume and low volume. The decomposition shows that each sub process has its own characteristics and depends on the nature of the firm’s business. Two extreme processes are continuous line production (fast extreme) and project shop (slow extreme). Other processes are in between these two extremes of the manufacturing spectrum. Process flow patterns become less complex with cellular, line and continuous flow compared with jobbing and project. The review also indicates that when the product is high variety and low volume, project or functional production is applied.
    • Decoupling control in statistical sense: minimised mutual information algorithm

      Zhang, Qichun; Wang, A. (2016-01)
      This paper presents a novel concept to describe the couplings among the outputs of stochastic systems represented by NARMA models. Compared with the traditional coupling description, the presented concept can be considered an extension using statistical independence theory. Based on this concept, decoupling control in the statistical sense is established, with necessary and sufficient conditions for complete decoupling. Since complete decoupling is difficult to achieve, a control algorithm has been developed using the Cauchy-Schwarz mutual information criterion. Without modifying the existing control loop, this algorithm supplies a compensative controller to minimise the statistical couplings of the system outputs, and its local stability has been analysed. In addition, a further discussion illustrates the combination of the presented control algorithm with data-based mutual information estimation. Finally, a numerical example is given to show the feasibility and efficiency of the proposed algorithm.
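For reference, the Cauchy-Schwarz mutual information invoked as the criterion can be written (in generic notation, not necessarily the paper's) as the Cauchy-Schwarz divergence between the joint output density and the product of its marginals:

```latex
I_{CS}(y_1, y_2) = -\log
\frac{\left( \int p(y_1, y_2)\, p(y_1)\, p(y_2)\, \mathrm{d}y_1\, \mathrm{d}y_2 \right)^{2}}
     {\int p^{2}(y_1, y_2)\, \mathrm{d}y_1\, \mathrm{d}y_2 \;\; \int p^{2}(y_1)\, p^{2}(y_2)\, \mathrm{d}y_1\, \mathrm{d}y_2}
```

This quantity is non-negative and vanishes exactly when the outputs are statistically independent, which is why minimising it drives the outputs towards decoupling; it is also convenient to estimate from data with kernel density methods.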
    • Decreased brain venous vasculature visibility on susceptibility-weighted imaging venography in patients with multiple sclerosis is related to chronic cerebrospinal venous insufficiency

      Zivadinov, R.; Poloni, G.U.; Marr, K.; Schirda, C.V.; Magnano, C.R.; Carl, E.; Bergsland, N.; Hojnacki, D.; Kennedy, C.; Beggs, Clive B.; et al. (2011)
      BACKGROUND: The potential pathogenesis between the presence and severity of chronic cerebrospinal venous insufficiency (CCSVI) and its relation to clinical and imaging outcomes in brain parenchyma of multiple sclerosis (MS) patients has not yet been elucidated. The aim of the study was to investigate the relationship between CCSVI, and altered brain parenchyma venous vasculature visibility (VVV) on susceptibility-weighted imaging (SWI) in patients with MS and in sex- and age-matched healthy controls (HC). METHODS: 59 MS patients, 41 relapsing-remitting and 18 secondary-progressive, and 33 HC were imaged on a 3T GE scanner using pre- and post-contrast SWI venography. The presence and severity of CCSVI was determined using extra-cranial and trans-cranial Doppler criteria. Apparent total venous volume (ATVV), venous intracranial fraction (VIF) and average distance-from-vein (DFV) were calculated for various vein mean diameter categories: < .3 mm, .3-.6 mm, .6-.9 mm and > .9 mm. RESULTS: CCSVI criteria were fulfilled in 79.7% of MS patients and 18.2% of HC (p < .0001). Patients with MS showed decreased overall ATVV, ATVV of veins with a diameter < .3 mm, and increased DFV compared to HC (all p < .0001). Subjects diagnosed with CCSVI had significantly increased DFV (p < .0001), decreased overall ATVV and ATVV of veins with a diameter < .3 mm (p < .003) compared to subjects without CCSVI. The severity of CCSVI was significantly related to decreased VVV in MS (p < .0001) on pre- and post-contrast SWI, but not in HC. CONCLUSIONS: MS patients with higher number of venous stenoses, indicative of CCSVI severity, showed significantly decreased venous vasculature in the brain parenchyma. The pathogenesis of these findings has to be further investigated, but they suggest that reduced metabolism and morphological changes of venous vasculature may be taking place in patients with MS.
    • Deep face recognition using imperfect facial data

      Elmahmudi, Ali A.M.; Ugail, Hassan (2019-10)
      Today, computer-based face recognition is a mature and reliable mechanism that is being practically utilised for many access control scenarios. As such, face recognition or authentication is predominantly performed using ‘perfect’ data of full frontal facial images. Though that may be the case, in reality there are numerous situations where full frontal faces may not be available — the imperfect face images that often come from CCTV cameras demonstrate the case in point. Hence, the problem of computer-based face recognition using partial facial data as probes is still largely an unexplored area of research. Given that humans and computers perform face recognition and authentication inherently differently, it is both interesting and intriguing to understand how a computer favours various parts of the face when presented with the challenges of face recognition. In this work, we explore the question that surrounds the idea of face recognition using partial facial data. We explore it by applying novel experiments to test the performance of machine learning using partial faces and other manipulations of face images, such as rotation and zooming, which we use as training and recognition cues. In particular, we study the rate of recognition subject to the various parts of the face, such as the eyes, mouth, nose and cheek. We also study the effect on face recognition of facial rotation, as well as the effect of zooming out of the facial images. Our experiments are based on the state-of-the-art convolutional neural network architecture, along with the pre-trained VGG-Face model, through which we extract features for machine learning. We then use two classifiers, namely cosine similarity and linear support vector machines, to test the recognition rates. We ran our experiments on two publicly available datasets: the controlled Brazilian FEI and the uncontrolled LFW dataset. Our results show that individual parts of the face, such as the eyes, nose and cheeks, have low recognition rates, though the rate of recognition quickly goes up when individual parts of the face are presented in combined form as probes.
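As a concrete illustration of the first classifier mentioned in this abstract, a minimal cosine-similarity matcher over extracted feature vectors might look as follows (the gallery layout and identity labels are illustrative assumptions, not the paper's code):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify(probe, gallery):
    """Return the identity whose gallery feature is most similar to the probe.

    gallery: dict mapping identity -> feature vector (e.g. features extracted
    from a pre-trained network such as VGG-Face).
    """
    return max(gallery, key=lambda ident: cosine_similarity(probe, gallery[ident]))
```

In practice the feature vectors would be high-dimensional network activations rather than the toy 2-D vectors used here; the decision rule is unchanged.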
    • Deep learning technology for predicting solar flares from (Geostationary Operational Environmental Satellite) data

      Nagem, Tarek A.M.; Qahwaji, Rami S.R.; Ipson, Stanley S.; Wang, Z.; Al-Waisy, Alaa S. (2018)
      Solar activity, particularly solar flares, can have significant detrimental effects on both space-borne and ground-based systems and industries, leading to subsequent impacts on our lives. As a consequence, there is much current interest in creating systems which can make accurate solar flare predictions. This paper aims to develop a novel framework to predict solar flares by making use of the Geostationary Operational Environmental Satellite (GOES) X-ray flux 1-minute time series data. These data are fed to three integrated neural networks to deliver the predictions. The first neural network (NN) is used to convert GOES X-ray flux 1-minute data to Markov Transition Field (MTF) images. The second neural network uses an unsupervised feature learning algorithm to learn the MTF image features. The third neural network uses both the learned features and the MTF images, which are then processed using a Deep Convolutional Neural Network to generate the flare predictions. To the best of our knowledge, this work is the first flare prediction system based entirely on the analysis of pre-flare GOES X-ray flux data. The results are evaluated using several performance measurement criteria that are presented in this paper.
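The time-series-to-MTF conversion step can be sketched with the standard Markov Transition Field construction shown below (a common formulation; the paper performs this conversion with a neural network, so treat this as an illustrative baseline rather than the authors' method):

```python
import numpy as np

def markov_transition_field(x, n_bins=8):
    """Markov Transition Field image of a 1-D series.

    Steps: quantile-bin the series, estimate a first-order Markov
    transition matrix between bins, then expand it so that entry (i, j)
    is the transition probability bin(x_i) -> bin(x_j).
    """
    x = np.asarray(x, dtype=float)
    # Interior quantile edges so each bin holds roughly equal mass.
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    q = np.digitize(x, edges)                      # bin index of each sample
    # First-order transition counts between consecutive samples.
    W = np.zeros((n_bins, n_bins))
    for i, j in zip(q[:-1], q[1:]):
        W[i, j] += 1
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1)  # row-normalise
    # MTF: one pixel per pair of time points.
    return W[np.ix_(q, q)]
```

The resulting len(x) x len(x) image can then be fed to a convolutional network, as the pipeline in the abstract describes.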
    • Defect recognition in concrete ultrasonic detection based on wavelet packet transform and stochastic configuration networks

      Zhao, J.; Hu, T.; Zheng, R.; Ba, P.; Mei, C.; Zhang, Qichun (2021-01)
      Aiming to detect concrete defects, we propose a new identification method based on stochastic configuration networks. The presented model is trained on time-domain and frequency-domain features extracted by filtering and decomposing ultrasonic detection signals. The method was applied to ultrasonic detection data collected from 5 mm, 7 mm, and 9 mm penetrating holes in C30-class concrete. In particular, the wavelet packet transform (WPT) was used to decompose the detected signals so that information in different frequency bands could be obtained. Based on the data from the fundamental frequency nodes of the detection signals, we calculated the means, standard deviations, kurtosis coefficients, skewness coefficients and energy ratios to characterize the detection signals. We also analyzed their typical statistical features to assess the complexity of identifying these signals. Finally, we used the stochastic configuration networks (SCNs) algorithm with embedded four-fold cross-validation to construct the recognition model. The performance of the presented model has been validated experimentally and compared with a genetic-algorithm-based BP neural network model; the comparison shows that the SCNs algorithm has superior generalization ability, better fitting ability, and higher recognition accuracy for defect signals. In addition, the test and analysis results show that the proposed method is feasible and effective in detecting concrete hole defects.
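The five per-band statistics named in this abstract can be computed as follows (illustrative only; the band signals themselves would come from a WPT decomposition, e.g. via a library such as PyWavelets, which is not shown here):

```python
import numpy as np

def band_features(band, total_energy):
    """Mean, standard deviation, kurtosis, skewness and energy ratio
    for one frequency-band signal from a wavelet packet decomposition."""
    band = np.asarray(band, dtype=float)
    mu, sigma = band.mean(), band.std()
    z = (band - mu) / sigma
    return {
        "mean": mu,
        "std": sigma,
        "kurtosis": float((z ** 4).mean()),   # non-excess kurtosis (normal ~ 3)
        "skewness": float((z ** 3).mean()),
        "energy_ratio": float((band ** 2).sum() / total_energy),
    }
```

Concatenating these statistics across bands yields the feature vector fed to the recognition model.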
    • Deflection of concrete structures reinforced with FRP bars.

      Kara, Ilker F.; Ashour, Ashraf F.; Dundar, C. (2013-01)
      This paper presents an analytical procedure based on the stiffness matrix method for deflection prediction of concrete structures reinforced with fiber reinforced polymer (FRP) bars. The variation of flexural stiffness of cracked FRP reinforced concrete members has been evaluated using various available models for the effective moment of inertia. A reduced shear stiffness model was also employed to account for the variation of shear stiffness in cracked regions. Comparisons between results obtained from the proposed analytical procedure and experiments of simply and continuously supported FRP reinforced concrete beams show good agreement. Bottom FRP reinforcement at midspan section has a significant effect on the reduction of FRP reinforced concrete beam deflections. The shear deformation effect was found to be more influential in continuous FRP reinforced concrete beams than simply supported beams. The proposed analytical procedure forms the basis for the analysis of concrete frames reinforced with FRP concrete members.
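For context, one widely used effective moment of inertia model for cracked members is Branson's equation, shown here in its generic form (the paper evaluates several FRP-specific variants of such models):

```latex
I_e = \left( \frac{M_{cr}}{M_a} \right)^{3} I_g
    + \left[ 1 - \left( \frac{M_{cr}}{M_a} \right)^{3} \right] I_{cr}
    \le I_g
```

where M_{cr} is the cracking moment, M_a the applied service moment, I_g the gross moment of inertia, and I_{cr} the cracked (transformed-section) moment of inertia.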
    • Demographically weighted traffic flow models for adaptive routing in packet-switched non-geostationary satellite meshed networks

      Mohorcic, M.; Svigelj, A.; Kandus, G.; Hu, Yim Fun; Sheriff, Ray E. (Elsevier, 2003)
      In this paper, a performance analysis of adaptive routing is presented for packet-switched inter-satellite link (ISL) networks, based on shortest path routing and two alternate link routing forwarding policies. The selected routing algorithm and link-cost function are evaluated for a low earth orbit satellite system, using a demographically weighted traffic flow model. Two distinct traffic flow patterns are modelled: hot spot and regional. Performance analysis, in terms of quality of service and quantity of service, is derived using specifically developed simulation software to model the ISL network, taking into account topology adaptive routing only, or topology and traffic adaptive routing.
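The shortest-path routing component referred to in this abstract is classically implemented with Dijkstra's algorithm over per-link costs; a minimal sketch follows (plain numeric costs here, whereas an ISL link-cost function would also weight queueing delay and topology changes):

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra's shortest-path routing over a link-cost map.

    graph: dict node -> list of (neighbour, link_cost) pairs.
    Returns (path, total_cost) from src to dst.
    """
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    # Walk predecessors back from dst to reconstruct the path.
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]
```

In an adaptive ISL routing scheme the graph and its costs would be recomputed as the constellation topology and traffic load change.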
    • Denial of service detection using dynamic time warping

      Diab, D.M.; AsSadhan, B.; Binsalleeh, H.; Lambotharan, S.; Kyriakopoulos, K.G.; Ghafir, Ibrahim (2021)
      With the rapid growth of security threats in computer networks, the need for developing efficient security‐warning systems is substantially increasing. Distributed denial‐of‐service (DDoS) and DoS attacks are still among the most effective and dreadful attacks that require robust detection. In this work, we propose a new method to detect TCP DoS/DDoS attacks. Since analyzing network traffic is a promising approach, our proposed method utilizes network traffic by decomposing the TCP traffic into control and data planes and exploiting the dynamic time warping (DTW) algorithm for aligning these two planes with respect to the minimum Euclidean distance. By demonstrating that the distance between the control and data planes is considerably small for benign traffic, we exploit this characteristic for detecting attacks as outliers. An adaptive thresholding scheme is implemented by adjusting the value of the threshold in accordance with the local statistics of the median absolute deviation (MAD) of the distances between the two planes. We demonstrate the efficacy of the proposed method for detecting DoS/DDoS attacks by analyzing traffic data obtained from publicly available datasets.