Recent Submissions

  • Peripheral Refractive Error and its Association with Myopia Development and Progression. An examination of the role that peripheral retinal defocus may play in the origin and progression of myopia

    Mallen, Edward A.H.; Barrett, Brendan T.; Jamal, Heshow (University of Bradford, Faculty of Life Sciences, 2019)
    Purpose: Currently there are attempts to slow myopia progression by manipulating peripheral refractive error. This study proposed to establish the distribution of peripheral refractive errors in hyperopic, emmetropic and myopic children and to test the hypothesis that relative peripheral hyperopia is a risk factor in the onset and progression of myopia. Methods: Refraction was measured under non-cycloplegic conditions, at 0°, 10° (superior, inferior, temporal and nasal retina) and 30° (temporal and nasal retina), at distance and near. Central spherical equivalent refractive error (SER) was used to classify the eyes as myopic (≤ −0.75 D), emmetropic (−0.75 < SER < +0.75 D) or hyperopic (≥ +0.75 D). Relative peripheral refraction was calculated as the difference between the central (i.e. foveal) and peripheral refractive measurements. At baseline, measurements were taken from 554 children and in a subset of 300 of these same children at the follow-up visit. The time interval between initial and follow-up measurement was 9.71 ± 0.87 months. Results: Results were analysed on 528 participants (10.21 ±0.94 years old) at baseline and 286 longitudinally. At baseline, myopic children (n=61) had relative peripheral hyperopia at all eccentricities at distance and near, except at 10°-superior retina where relative peripheral myopia was observed at near. Hyperopic eyes displayed relative peripheral myopia at all eccentricities, at distance and near. The emmetropes showed a shift from relative peripheral myopia at distance to relative peripheral hyperopia at near at all eccentricities, except at 10°-superior retina, where the relative peripheral myopia was maintained at near. In the longitudinal data analysis, myopes who became more myopic did not show greater relative peripheral hyperopia at baseline compared with myopic sub-groups whose central refraction remained stable. Conclusions: The peripheral refractive profile differences between different refractive groups that are reported in other studies have been confirmed in this study. Relative peripheral hyperopia is not found to be a significant risk factor in the onset or progression of myopia in children.
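    The classification thresholds and the relative peripheral refraction arithmetic described in this abstract translate directly into a few lines of code. Below is a minimal Python sketch assuming the usual sign convention (the function names and example values are mine, not the study's):

```python
def classify_refraction(ser_dioptres: float) -> str:
    """Classify central spherical equivalent refraction (SER, dioptres)
    using the thresholds quoted in the abstract."""
    if ser_dioptres <= -0.75:
        return "myopic"
    if ser_dioptres >= 0.75:
        return "hyperopic"
    return "emmetropic"


def relative_peripheral_refraction(central_ser: float, peripheral_ser: float) -> float:
    """Relative peripheral refraction, taken here as peripheral minus central SER
    (sign convention assumed: positive values indicate relative peripheral hyperopia)."""
    return peripheral_ser - central_ser


# Example: a myopic eye with a less myopic (relatively hyperopic) periphery
print(classify_refraction(-1.50))                    # myopic
print(relative_peripheral_refraction(-1.50, -0.75))  # 0.75 -> relative peripheral hyperopia
```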
  • In-line process measurements for injection moulding control. In-line rheology and primary injection phase process measurements for injection moulding of semi-crystalline thermoplastics, using instrumented computer monitored injection moulding machines, for potential use in closed loop process control

    Coates, Philip D.; Speight, Russell G. (University of Bradford, Department of Mechanical and Manufacturing Engineering, 1993)
    In-line rheological and process measurements are studied during the primary injection phase as a potential aid to closed loop process control for injection moulding. The feasibility of attaining rheological and process measurements of sufficient accuracy and precision for use in process control is investigated. The influence of rheological and process measurements on product quality is investigated for semi-crystalline thermoplastic materials. A computer-based process and machine parameter monitoring system is utilised to provide accurate and precise process data for analysis.
  • Education Reform in England and the Transformation of School Teachers’ Working Lives: A Labour Process Perspective

    Ford, Jackie M.; Smith, Andrew J.; Morrell, Sophie E. (University of Bradford, Faculty of Management, Law and Social Sciences, 2020)
    The academy school programme, OFSTED’s use of school performance data, and performance management and performance related pay reforms are dramatically transforming the work and employment landscape in teaching. Yet there is limited knowledge of teachers’ experiences of work in relation to this context. The purpose of this thesis is to explore the impact of these education reforms on school teachers’ working lives through a labour process perspective. A critical realist ethnography of an inner-city secondary academy school was conducted over four months. This comprised a six-week shadowing phase, document collection and 26 semi-structured interviews with Teachers, Managers, HR and Trade Union Representatives. Findings reveal that the removal of a contextual value added measure from school performance metrics leads to an increase in teachers’ workloads and an extension of their working hours. This is compounded by an unofficial erosion of teachers’ directed working time that infiltrates through the academy trust. Pressures on workload also stem from management-led initiatives generated by appraisals in leadership programmes. Furthermore, teachers’ work becomes standardised and re-organised through the heterarchical multi-academy trust model in an effort to improve the school’s OFSTED rating. Performance related pay reforms act as a parallel instigator to the standardisation of work, polarising the creative and mundane aspects of teaching across the workforce, whilst oppositional orientations to work form as the majority of teachers align with a shared sense of commitment to work. This thesis amalgamates labour process theory with the hollowing out thesis, making key theoretical, conceptual, empirical and methodological contributions, alongside practical recommendations.
  • Crystal and Particle Engineering: Pharmaceutical Cocrystals through Antisolvent and Liquid-Liquid Phase Separation Technologies

    Paradkar, Anant R.; Kelly, Adrian L.; Vangala, Venu R.; Sajid, Muhammad A. (University of Bradford, Faculty of Life Sciences, 2019)
    The effects of polymer concentration and solvents on the cocrystal morphology of low-solubility drugs were investigated; both had an impact. The melting temperatures also decreased with increasing polymer concentration. Placing the binding agent, benzene, at different interfaces induced morphological changes, such as the formation of porous cocrystals. Liquid-liquid phase separation (LLPS) has previously been reported as a hindrance to the crystallisation process, impeding further development. A phase diagram was constructed, and the different phases were categorised into four types. After separation, the highly concentrated amorphous Oil Phase II was prone to gradual crystallisation. Crystallisation took place over 30-60 minutes, which allowed in-situ monitoring. A novel cocrystallisation technique was developed from LLPS: cocrystals of indomethacin with saccharin and nicotinamide were obtained by mixing Oil Phase II with the coformers. In-situ spectroscopic monitoring showed gradual changes in the spectra; characteristic peaks increased in height and area as crystals formed, until the reaction was complete. With crystal formation, the XRD patterns gradually developed a sharper baseline due to the decrease in amorphous indomethacin. The photoluminescence (PL) spectra showed several peaks merging into one broad band, with increasing intensity as the sample crystallised. There was a shift in the peak absorbance between the pure drug crystals obtained from LLPS and the indomethacin:saccharin cocrystal obtained from LLPS. Amorphous stabilisation was achieved by mixing polymer (PVP) with Oil Phase II; there were no changes to the XRD diffractogram, as the sample did not undergo crystallisation.
  • Designing an Operations Performance Management System – A case-study of a leading global automotive parts supplier

    Hussain, Zahid I.; Gast, Carsten G. (University of Bradford, School of Management, 2019)
    This research focuses on a contemporary Operations Performance Management System (OPMS) designed for a leading global automotive parts supplier. It synthesises an integrated and holistic OPMS to increase the effectiveness and efficiency of the automotive parts supplier and ultimately improve financial margin. The study is motivated by the need of a process-oriented automotive parts supplier to excel in its operations management and ultimately secure a best-in-class cost basis in times of significant change in the automotive industry. The research design is based on a qualitative single case study and deploys semi-structured interviews with the management of the case-study organisation. In addition, hundreds of documents were analysed to evidence the creation of the OPMS. Finally, participant observation was used to allow for triangulation and contextualisation of findings. The findings reveal a contemporary OPMS. It presents an intelligent and integrated steering logic from corporate level down to single operational processes. It integrates performance measurement and management in acknowledgement of the specific needs of the case-study organisation. The overall aim of this thesis is to make a practical contribution to this area, achieved through the presented OPMS. This study extends the existing literature by contributing a customised, highly integrated OPMS for the process-oriented automotive parts supply industry. It embeds the ‘Target Costing Methodology’, as an example of a performance management tool, into the OPMS. Furthermore, the study explores the impact of digitalisation on the OPMS. This research has synthesised an OPMS that emphasises a shift towards intelligent performance measurement for achieving value in the chain, in areas such as procurement and manufacturing. This shift is strongly influenced by digital transformation, which is not yet holistically commanded by the case-study organisation. The research also sheds light on how to optimise resource utilisation based on increased operational focus and managerial accountability. This approach will lead to continual organisational learning as part of the ‘Plan-Do-Check-Action’ management process.
  • Ultra-Wideband Imaging System For Medical Applications. Simulation models and Experimental Investigations for Early Breast Cancer & Bone Fracture Detection Using UWB Microwave Sensors

    Abd-Alhameed, Raed A.; Noras, James M.; Mirza, Ahmed F. (University of Bradford, Faculty of Engineering and Informatics, 2019)
    Near-field imaging using microwaves in medical applications is of great current interest for its capability and accuracy in identifying features of interest, in comparison with other known screening tools. Many imaging methods have been developed over the past two decades, showing the potential of microwave imaging in medical applications such as early breast cancer detection and analysis of cardiac tissues, soft tissues and bones. Microwave imaging uses non-ionizing ultra wideband (UWB) electromagnetic signals and utilises tissue-dependent dielectric contrast to reconstruct signals and images using radar-based or tomographic imaging techniques. Microwave imaging offers low health risk, low operational cost, ease of use and user-friendliness. This study documents microwave imaging experiments for early breast cancer detection and bone fracture detection using a radar approach. An actively tuned UWB patch antenna and a UWB Vivaldi antenna are designed and utilised as sensing elements in the aforementioned applications. Both UWB antennas were developed over a wide frequency range, and their characteristics were tested for suitability in microwave imaging applications through image reconstruction with a 3D inversion algorithm. An experiment was conducted with the patch antenna to test the detection of cancer tissues of various sizes using a simple phantom, consisting of a plastic container holding a low-dielectric material emulating fatty tissue and a high-dielectric-constant object emulating a tumour, scanned between 4 and 8 GHz. A 2-D image of the tumour is constructed using the reflected signal response to visualize the location and size of the tumour. A Vivaldi antenna is designed covering 3.1 to 10.6 GHz. The antenna is tested via simulation for detecting bone fractures of various sizes, and 2-D images are generated using the reflected pulses to show the size of the fracture. The Vivaldi antenna is optimised for early breast cancer detection, and a detailed simulation study is carried out using different breast phantoms and tumour sizes. The simulations are backed by experimental investigation using the test setup developed for the patch antenna. The generated images from simulations and experiments show good agreement and indicate the presence of the tumour with good location accuracy. Measurements indicate that both prototype microwave sensors are good candidates for the tested imaging applications.
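    The abstract does not spell out the reconstruction step; a common radar-based formulation for phantom experiments of this kind is confocal delay-and-sum beamforming, sketched below. Everything here (function name, propagation speed, geometry) is a generic illustration of that idea rather than the thesis' own algorithm:

```python
import numpy as np

def delay_and_sum(signals, antennas, grid, fs, v=1.0e8):
    """Confocal delay-and-sum image formation (generic radar-imaging sketch).

    signals  : (n_antennas, n_samples) clutter-removed backscatter in the time domain
    antennas : (n_antennas, 2) antenna x, y positions in metres
    grid     : (n_pixels, 2) image pixel x, y positions in metres
    fs       : sampling frequency in Hz
    v        : assumed propagation speed inside the phantom (m/s)
    """
    image = np.zeros(len(grid))
    for p, pixel in enumerate(grid):
        focus = 0.0
        for a, ant in enumerate(antennas):
            delay = 2.0 * np.linalg.norm(pixel - ant) / v   # round-trip travel time
            idx = int(round(delay * fs))
            if idx < signals.shape[1]:
                focus += signals[a, idx]                    # coherent sum at this pixel
        image[p] = focus ** 2                               # backscattered energy estimate
    return image
```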
  • Effect of Process Parameters and Material Attributes on Crystallisation of Pharmaceutical Polymeric Systems in Injection Moulding Process. Thermal, rheological and morphological study of binary blends polyethylene oxide of three grades; 20K, 200K and 2M crystallised under various thermal and mechanical conditions using injection moulding

    Gough, Timothy D.; Isreb, Mohammad; Mkia, Abdul R. (University of Bradford, Faculty of Life Sciences, School of Pharmacy and Medical Sciences, 2019)
    Crystallisation is gaining a lot of interest in the pharmaceutical industry as a means of designing active ingredients with tailored physicochemical properties. Many factors have been found to affect the crystallisation process, including process parameters and material attributes, and several studies in the literature have discussed their roles. However, a comprehensive study is still missing in this field in which all the significant terms are taken into consideration, including square effects and interaction terms between different parameters. In this study, a thorough investigation into the main factors affecting crystallisation of a polymeric system processed via injection moulding (IM) is presented, together with an example of response optimisation that can be mimicked to suit a specific need. Three grades of pure polyethylene oxide (PEO); 20K, 200K and 2M, were first characterised using differential scanning calorimetry (DSC), thermogravimetric analysis (TGA), powder X-ray diffraction (PXRD) and shear rheometry. The onset of degradation and the degradation rate varied with the molecular weight of PEO. The peak melting temperature and the difference in enthalpy between melting and crystallisation were both directly proportional to PEO molecular weight. PEO200K and PEO2M struggle to recrystallise to the same extent as the original state at the tested cooling rates, while PEO20K can retain a similar degree of crystallinity when cooled at 1 °C/min. The onset of crystallisation temperature (Tc1) was high for PEO2M, and the difference between the 20K and 200K grades was pronounced at low cooling rate (20K higher than 200K). The rheometry study showed that PEO2M has a solid-like structure around the melting point, which explains the difficulty in processing this grade at low temperature via IM. PEO20K was almost stable within the strain values studied (Newtonian behaviour), while the higher grades showed shear-thinning behaviour. The complex viscosity of PEO2M is characterised by a steeper slope compared to PEO200K, indicating higher shear-thinning sensitivity due to greater entanglement of the longer chains. For binary blends of PEO, the enthalpy of crystallisation studied by DSC was directly proportional to the lowest-molecular-weight PEO content (PEOL%) in PEO20K/200K and PEO20K/2M blends. The effect of PEOL% on Tc1 became slightly pronounced for PEO20K-2M blends, where Tc1 exhibited a slight inverse proportionality to PEOL%, and it became more significant for PEO200K-2M blends. Interestingly, Tc1 for the blends did not necessarily lie between the values of the homopolymers. In all binary blends, Tc1 was inversely proportional to cooling rate for the set of cooling rates tested. Thermal analysis using hot-stage polarised light microscopy showed different behaviours of the various PEO grades with respect to the first detection of crystals; notably, the lowest grade showed the highest detection temperature. Visual observation of PEO binary-blend caplets processed under various conditions via IM showed low-quality caplets when processed at mould temperatures above the Tc1 of the sample. The factors affecting crystallisation of injection-moulded caplets were studied using response surface methodology (RSM) for two responses: peak melting temperature (Tm) and relative change in crystallinity (∆Xc%) compared to an unprocessed sample. Mould temperature (Tmould) was the most significant factor in all binary-blend models. The relationship between Tmould and the two responses was positive and non-linear for Tmould ˂ Tc1. Injection speed was also a significant factor for both responses in PEO20K-200K blends; for Tm the injection speed had a positive linear relationship, while the opposite trend was found for ∆Xc%. The only interaction term found in the RSM study for all models was between the injection speed and the PEOL%, showing the coupled effect of these two factors. The molecular weight effect was a significant factor in all ∆Xc% models across the three binary blends. The order of ∆Xc% sensitivity to the change in PEOL% was 3, 5 and 7% for 20K-200K, 200K-2M and 20K-2M, respectively.
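    For readers unfamiliar with response surface methodology, the models referred to above are second-order polynomials in the factors, including square and interaction terms. A minimal Python sketch with synthetic data (factor ranges, response values and coefficients are invented for illustration, not the thesis' results):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical factors: mould temperature (deg C), injection speed (mm/s), PEOL content (%)
X = np.column_stack([rng.uniform(30, 60, 20),
                     rng.uniform(10, 80, 20),
                     rng.uniform(5, 95, 20)])
# Synthetic response standing in for, e.g., relative change in crystallinity
y = 5 + 0.3 * X[:, 0] - 0.05 * X[:, 1] + 0.002 * X[:, 1] * X[:, 2] + rng.normal(0, 0.5, 20)

def quadratic_design(X):
    """Second-order RSM design: intercept, linear, squared and pairwise interaction terms."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

coeffs, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
print(coeffs)   # fitted response-surface coefficients (the interaction terms are the last three)
```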
  • Novel PLA-based materials with improved thermomechanical properties and processability through control of morphology and stereochemistry. A study in improving toughness and processability of PLA by blending with biodegradable polymers and the two PLA enantiomers PLLA and PDLA to accelerate crystallinity and heat resistance

    Kelly, Adrian L.; Gough, Timothy D.; Kassos, Nikolaos (University of Bradford, Faculty of Engineering and Informatics, 2019)
    Polylactic acid (PLA) is an aliphatic polyester, derived from sustainable natural sources, that is biodegradable and can be industrially composted. The material has been in the spotlight recently due to its sustainability and properties, although it was invented in 1932 by Carothers and then patented by DuPont in 1954 (Standau et al. 2019). Its properties, however, currently limit its use mainly to applications in the medical sector and, in some cases, single-use packaging. In this research, PLA-based blends with improved rheological and thermomechanical properties are investigated. The focus is on proposing strategies for improving these properties based on commercial methods and processing techniques. In this work, commercial-grade PLA has been blended with polycaprolactone (PCL) and polybutylene succinate (PBS) in binary and ternary formulations via twin screw extrusion. PCL is known to act as an impact modifier for PLA, but to cause a corresponding reduction in strength. Results showed that the binary PLA blends containing PBS and PCL had reduced viscosity, elastic modulus and strength, but increased strain at break and impact strength. Morphological and thermal analysis showed that the immiscibility of these additives with PLA caused these modifications. Incorporation of a small loading of PBS had a synergistic effect on the PLA-PCL blend properties: miscibility was improved and enhanced mechanical properties were observed for a ternary blend containing 5 wt% of both PBS and PCL compared to binary blends containing 10% of each additive. To increase the heat resistance of PLA, the material’s crystallinity has to be increased. However, PLA has a relatively slow crystallisation rate, making it difficult and expensive to use in commercial applications where heat resistance is needed. For this reason, the chiral nature of PLA has been used to investigate the effect of PLA stereochemistry on crystallisation. Optically pure PDLA was added to its enantiomer in small amounts (up to 15%), and the properties and crystallisation mechanism of these blends were investigated. Results showed that the addition of PDLA accelerated crystallisation and developed a structure that increased heat resistance, melt strength and stiffness. Finally, a processing model for developing a fully stereocomplex PLA part based on commercial techniques is proposed. Injection-moulded PLA showed even higher heat resistance without the need for further processing of the product to increase crystallinity.
  • The use of Silent Substitution in measuring isolated cone- and rod- Human ERGs

    McKeefry, Declan J.; Tripathy, Srimant P.; Barrett, Brendan T.; Kommanapalli, Deepika (University of Bradford, Faculty of Life Sciences, 2018)
    More than a century after its discovery, the electroretinogram (ERG) still remains the objective tool conventionally used to assess retinal function in health and disease. Although there is ongoing research into ERG recording techniques, interpretation and clinical applications, there is still limited understanding of how each photoreceptor class contributes to the ERG waveform, and their roles and/or susceptibilities in various retinal diseases remain unclear. Another limitation of the conventional testing protocols currently used in clinical settings is the requirement for a time-consuming adaptation period. Furthermore, the ERG responses derived in this manner are recorded under different stimulus conditions, making comparison of these signals difficult. To address these issues and develop a new testing method, we employed the silent substitution paradigm to obtain cone- and rod-isolating ERGs using sine- and square-wave temporal profiles. The ERGs achieved in this manner were shown to be photoreceptor-selective. Furthermore, these responses provided not only a functional index of the photoreceptors but also of their contributions to the successive post-receptoral pathways. We believe that the substitution stimuli used in this thesis could be a valuable tool in the functional assessment of individual photoreceptor classes in normal and pathological conditions. Furthermore, we speculate that this method of cone/rod activity isolation could be used to develop faster and more efficient photoreceptor-selective testing protocols without the need for adaptation.
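    Conceptually, silent substitution amounts to solving a small linear system: given how strongly each photoreceptor class responds to each stimulus primary, find a primary modulation that excites only the target class. A toy Python sketch (the 3x3 sensitivity matrix is made up for illustration and is not the calibration data used in the thesis):

```python
import numpy as np

# Rows: photoreceptor classes (e.g. L-cone, M-cone, rod); columns: display primaries.
# Entries are the excitation each primary produces per unit modulation (illustrative values).
S = np.array([[0.65, 0.30, 0.05],
              [0.35, 0.55, 0.10],
              [0.10, 0.40, 0.50]])

def silent_substitution_direction(S, target_row):
    """Primary modulation that drives only the target photoreceptor class.

    Solve S @ m = e_target, so the target class sees unit contrast while the
    remaining classes see zero change (they are 'silenced')."""
    e = np.zeros(S.shape[0])
    e[target_row] = 1.0
    return np.linalg.solve(S, e)

m_rod = silent_substitution_direction(S, target_row=2)
print(m_rod)      # primary modulation isolating the third (rod) row
print(S @ m_rod)  # ~[0, 0, 1]: rods modulated, cone rows silenced
```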
  • The Presence and Use of Interactive Features on Kurdish News Websites in the Iraqi Kurdistan region. A case study of interactivity of news Kurdish websites of the Iraqi Kurdistan Region

    Reeve, Carlton; Salih, Hunar R.S. (University of Bradford, Department of Media Design and Technology, Faculty of Engineering and Informatics, 2018)
    The Internet has emerged as an interactive platform. New communication technologies are thus challenging traditional media, with interactive devices turning online journalism into a rich media environment. While new information technologies have enabled media organisations to use interactive features in the constructed presentation of news websites, few news websites in the Iraqi Kurdistan region are maximising such features. This thesis also argues that, despite the lack of a good communication technology and Internet infrastructure in the Iraqi Kurdistan Region (IKR), online journalism has become a major part of Kurdish media outlets and is distinct from traditional media because of its interactive nature. The study presented in this thesis focuses on interactivity in online journalism by examining the interactive features of Kurdish news websites of the IKR, analysing how news is presented and the extent to which these websites apply interactive features on their homepages and inside the news pages. The level of interactivity of these Kurdish websites was also measured across several dimensions of interactivity by conducting a web-based content analysis. The qualitative part of the analysis is based on in-depth interviews with Kurdish reporters, editors, editors-in-chief, media experts and web developers. The findings show that the Kurdish news websites did not fully utilise and enhance interactive features in online journalism.
  • Design and Analysis of Anomaly Detection and Mitigation Schemes for Distributed Denial of Service Attacks in Software Defined Network. An Investigation into the Security Vulnerabilities of Software Defined Network and the Design of Efficient Detection and Mitigation Techniques for DDoS Attack using Machine Learning Techniques

    Awan, Irfan U.; Hu, Yim Fun; Pillai, Prashant; Sangodoyin, Abimbola O. (University of Bradford, Faculty of Engineering and Informatics, 2019)
    Software Defined Networking (SDN) has created great potential and hope for meeting the need for secure, reliable and well-managed next-generation networks, driving effective service delivery on the go and meeting the demand for high data rates and seamless connectivity expected by users. It is thus a network technology set to enhance our day-to-day activities. As network usage and reliance on computer technology increase, users with bad intentions exploit the inherent weaknesses of this technology to render targeted services unavailable to legitimate users. Among the security weaknesses of SDN are Distributed Denial of Service (DDoS) attacks. Even though the DDoS attack strategy is well known, the number of successful DDoS attacks launched has increased at an alarming rate over the last decade. Existing detection mechanisms depend on signatures of known attacks, which have not been successful in detecting unknown or variant DDoS attacks. Therefore, a novel detection mechanism is proposed that relies on deviation from a confidence interval obtained from the normal distribution of throughput polled from the server in the absence of attack. Furthermore, a sensitivity analysis is carried out to determine which of the network metrics (jitter, throughput and response time) is most sensitive to attack, by introducing white Gaussian noise and evaluating the local sensitivity using a feed-forward artificial neural network. All metrics are sensitive in detecting DDoS attacks; however, jitter appears to be the most sensitive. As a result, the developed framework provides an avenue to make the SDN technology more robust and secure against DDoS attacks.
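    A minimal sketch of the confidence-interval idea: estimate the attack-free throughput distribution, then flag samples that fall outside the resulting band. The sample values, interval width and thresholding below are illustrative assumptions, not the exact scheme developed in the thesis:

```python
import statistics

def throughput_interval(baseline, z=1.96):
    """95%-style band from attack-free throughput samples,
    assuming an approximately normal distribution."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mu - z * sigma, mu + z * sigma

def is_anomalous(sample, interval):
    """Flag a throughput sample that deviates from the attack-free band."""
    low, high = interval
    return not (low <= sample <= high)

baseline = [94.8, 95.2, 96.1, 94.5, 95.7, 95.0, 96.3, 94.9]   # Mbps, polled without attack
band = throughput_interval(baseline)
print(band, is_anomalous(61.2, band))   # a throughput collapse is flagged as anomalous
```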
  • International Joint Venture (IJV) Control Design: A Case Study of an Emerging Market IJV

    Wang, Chengang; Owens, Martin D.; Chu, Irene; Ekpo, Itoro U. (University of Bradford, Faculty of Management, Law and Social Science, 2019)
    This study aims to explore the various factors that influence international joint venture (IJV) parent firms to use a specific control mechanism in an emerging market (EM). The study adopted a single case study design involving an IJV between a Nigerian firm as the local partner and a Chinese firm as the foreign partner. Data were collected through twenty semi-structured interviews with both the parent firms and the IJV, complemented by observations of IJV activities and by relevant information from newspapers, magazines, company brochures and newsletters, and the websites of the parent firms, the IJV and the government regulatory body. The study revealed that the design of formal and social control is influenced by a range of factors identified in the literature, including resource contribution and bargaining power, previous experience of the IJV managers, knowledge transfer, trust-building, environmental uncertainty, and institutional forces. In contrast to findings from existing studies, this study also reveals that a combination of factors can influence the use of a particular control mechanism. By examining the types of control exercised by each partner and the antecedents of each control type, this study complements prior research by incorporating insights from transaction cost theory, resource dependency theory, social exchange theory, institutional theory and the organisational learning perspective to provide a more integrative explanation of IJV control design. Specifically, it explains how one partner develops certain types of formal and social control according to its individual resource contribution and dependency, and can adjust controls to achieve its various objectives.
  • Data driven agent-based micro-simulation in social complex systems

    Neagu, Daniel; Gheorghe, Marian; Makinde, Omololu A. (University of Bradford, Department of Computer Science, 2019)
    We have recently witnessed an increase in large-scale micro/individual/granular-level behavioural data. Such data has been proven to have the capacity to aid the development of more accurate simulations that will effectively predict the behaviours of complex systems. Despite this increase, the literature has failed to produce a structured modelling approach that effectively takes advantage of such granular data in modelling complex systems that involve social phenomena (i.e. social complex systems). In this thesis, we intend to bridge this gap by answering the question of how to create novel structural frameworks that systematically guide the use of micro-level behaviour and attribute data directly extracted from the basic entities within a social complex system. These frameworks should involve the systematic processes of using such data to directly model agent attributes and to create agent behaviour rules that directly represent the unique micro entities from which the data were extracted. The objective of the thesis is to define generic frameworks that create agent-based micro-simulations which directly reflect the target complex system, so that alternative scenarios that cannot be investigated in the real system, and social policies that need to be investigated before being applied to the social system, can be explored. In answering this question, we take advantage of the strengths of other modelling techniques, such as micro-simulation and agent-based techniques, in creating models that have a micro-macro link, such that the micro behaviour that causes the macro emergence at the simulation’s global level can be easily investigated, which is a huge advantage in policy testing. We also utilised machine learning in the creation of behavioural rules; this created agent behaviours that were empirically defined. Therefore, this thesis also answers the question of how such a structural framework can empirically create agent behaviour rules through machine learning algorithms. In this thesis we propose two novel frameworks for the creation of more accurate simulations. The concepts within these frameworks were proved using case studies drawn from different social complex systems, so as to demonstrate the generic nature of the proposed frameworks. In concluding this thesis, it was evident that the questions posed in the first chapter had been answered: the generic frameworks had been created, bridging the existing gap in the creation of accurate models from the presently available granular attribute and behavioural data, and allowing the simulations created from these models to accurately reflect the target social complex systems from which the data were extracted.
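    One concrete reading of the framework: learn a behaviour rule directly from micro-level data, then let each simulated agent apply that rule at every step. A toy scikit-learn sketch, where the attributes, the learned rule and the update step are all invented for illustration:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)

# Micro-level data: [age, income] -> observed behaviour (1 = adopts, 0 = does not)
X = rng.uniform([18, 10], [80, 100], size=(500, 2))
y = ((X[:, 0] < 40) & (X[:, 1] > 50)).astype(int)      # stand-in empirical pattern

rule = DecisionTreeClassifier(max_depth=3).fit(X, y)    # behaviour rule learned from the data

# Agent-based micro-simulation: each agent carries its own attributes and applies the rule
agents = rng.uniform([18, 10], [80, 100], size=(1000, 2))
for step in range(5):
    decisions = rule.predict(agents)
    agents[decisions == 1, 1] *= 1.02                   # e.g. adopters' income grows slightly
    print(f"step {step}: adopters = {decisions.sum()}")
```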
  • A Holistic Approach to Dynamic Modelling of Malaria Transmission. An Investigation of Climate-Based Models used for Predicting Malaria Transmission

    Konur, Savas; Peng, Yonghong; Asyhari, A.Taufiq; Modu, Babagana (University of Bradford, Faculty of Engineering and Informatics, 2020)
    The uninterrupted spread of malaria, besides its seasonal uncertainty, is due to the lack of suitable planning and intervention mechanisms and tools. Several studies have been carried out to understand the factors that affect the development and transmission of malaria, but these efforts have been largely limited to piecemeal specific methods, hence they do not offer comprehensive solutions to predict disease outbreaks. This thesis introduces a ‘holistic’ approach to understanding the relationship between climate parameters and the occurrence of malaria using both mathematical and computational methods. In this respect, we develop new climate-based models using mathematical, agent-based and data-driven modelling techniques. A malaria model is developed using mathematical modelling to investigate the impact of temperature-dependent delays. Although this method is widely applicable, it is limited to the study of homogeneous populations. An agent-based technique is employed to address this limitation, where the spatial and temporal variability of agents involved in the transmission of malaria are taken into account. Moreover, whilst the mathematical and agent-based approaches allow for temperature and precipitation in the modelling process, they do not capture other dynamics that might potentially affect malaria. Hence, to accommodate the climatic predictors of malaria, an intelligent predictive model is developed using machine-learning algorithms, which supports predictions of endemics in certain geographical areas by monitoring the risk factors, e.g., temperature and humidity. The thesis not only synthesises mathematical and computational methods to better understand the disease dynamics and its transmission, but also provides healthcare providers and policy makers with better planning and intervention tools.
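    As a toy illustration of the climate-driven modelling idea, the sketch below steps a minimal SIS-type transmission model whose transmission intensity depends on temperature. The functional form, parameter values and temperature series are placeholders, not the calibrated models developed in the thesis:

```python
import numpy as np

def beta_of_temperature(T):
    """Illustrative unimodal temperature response for transmission intensity,
    peaking near 27 degrees C (placeholder, not a calibrated curve)."""
    return max(0.0, 0.3 * np.exp(-((T - 27.0) / 6.0) ** 2))

def step(S, I, T, gamma=0.05, dt=1.0):
    """One explicit-Euler day of a minimal SIS-type transmission model."""
    beta = beta_of_temperature(T)
    new_inf = beta * S * I / (S + I)          # temperature-modulated new infections
    S_next = S + dt * (gamma * I - new_inf)   # recoveries return to susceptible
    I_next = I + dt * (new_inf - gamma * I)
    return S_next, I_next

S, I = 9900.0, 100.0
for day, T in enumerate([22, 25, 28, 31, 26] * 20):   # a synthetic daily temperature series
    S, I = step(S, I, T)
print(round(I))   # infected count after ~100 simulated days
```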
  • Intelligent based Packet Scheduling Scheme using Internet Protocol/Multi-Protocol Label Switching (IP/MPLS) Technology for 5G. Design and Investigation of Bandwidth Management Technique for Service-Aware Traffic Engineering using Internet Protocol/Multi-Protocol Label Switching (IP/MPLS) for 5G

    Hu, Yim Fun; Abd-Alhameed, Raed A.; Mustapha, Oba Z. (University of Bradford, Faculty of Engineering and Informatics, 2019)
    Multi-Protocol Label Switching (MPLS) makes use of traffic engineering (TE) techniques and a variety of protocols to establish pre-determined, highly efficient routes in a Wide Area Network (WAN). Unlike IP networks, in which a routing decision has to be made through header analysis on a hop-by-hop basis, MPLS makes use of a short bit sequence that indicates the forwarding equivalence class (FEC) of a packet and utilises a predefined routing table to handle packets of a specific FEC type. Header analysis of packets is therefore not required, resulting in lower latency. In addition, packets of similar characteristics can be routed in a consistent manner; for example, packets carrying real-time information can be routed over low-latency paths across the network. Thus the key to the success of MPLS is efficient control and distribution of the available bandwidth between applications across the network. Considerable research effort has already been devoted to bandwidth management in MPLS networks. However, with the imminent roll-out of 5G, MPLS is seen as a key technology for mobile backhaul, and to cope with the 5G demands of rich, context-aware and multimedia-based user applications, more efficient bandwidth management solutions need to be derived. This thesis focuses on the design of bandwidth management algorithms, more specifically QoS scheduling, in MPLS networks for 5G mobile backhaul. The aim is to ensure the reliability and speed of packet transfer across the network. As 5G is expected to greatly improve the user experience with innovative and high-quality services, users’ perceived quality of service (QoS) needs to be taken into account when deriving such bandwidth management solutions. QoS expectations from users are often subjective and vague. This thesis therefore proposes a fuzzy logic based solution to provide service-aware and user-centric bandwidth management in order to satisfy requirements imposed by the network and users. The disadvantage of fuzzy logic, however, is scalability, since the number of fuzzy rules and membership functions required grows as the complexity of the system being modelled increases. To resolve this issue, this thesis proposes the use of neuro-fuzzy techniques to solicit interpretable IF-THEN rules. The algorithms are implemented and tested through NS2 and Matlab simulations. The performance of the algorithms is evaluated and compared with other conventional algorithms in terms of average throughput, delay, reliability, cost, packet loss ratio, and utilisation rate. Simulation results show that the neuro-fuzzy based algorithm performs better than fuzzy and other conventional packet scheduling algorithms using IP and IP over MPLS technologies.
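    To make the fuzzy-logic scheduling idea concrete, here is a tiny zero-order Sugeno-style sketch that maps two normalised inputs, delay sensitivity and queue occupancy, onto a scheduling priority. The membership functions and rule base are invented for illustration and are not the rule base derived in the thesis:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def priority(delay_sensitivity, queue_occupancy):
    """Fuzzy priority in [0, 1] from two normalised inputs (illustrative rule base)."""
    low_d, high_d = tri(delay_sensitivity, -0.5, 0.0, 0.6), tri(delay_sensitivity, 0.4, 1.0, 1.5)
    low_q, high_q = tri(queue_occupancy, -0.5, 0.0, 0.6), tri(queue_occupancy, 0.4, 1.0, 1.5)
    # Rules: high sensitivity OR high occupancy -> high priority; both low -> low priority
    w_high = max(high_d, high_q)
    w_low = min(low_d, low_q)
    # Weighted-average defuzzification with crisp rule outputs 0.9 and 0.1
    return (w_high * 0.9 + w_low * 0.1) / (w_high + w_low + 1e-9)

print(priority(0.8, 0.3))   # a delay-sensitive flow gets an elevated scheduling priority
```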
  • The impact of the global financial crisis and institutional settings on corporate financial decisions.

    Özkan, Aydin; Tekin, Hasan (University of Bradford, Faculty of Management, Law and Social Sciences, 2019)
    Since theories of corporate finance are recognised to be conditional, this study explores the impact of the global financial crisis (GFC) of 2007-2009 and of institutional settings in determining corporate financial decisions. The contraction in the supply of and demand for credit affects corporate financial channels. The credit contraction raises agency costs, bankruptcy costs and information asymmetry, which adversely influence both borrowing and investment. Firms reduce debt financing, retain more cash and cut corporate payouts due to a sharp rise in uncertainty. Moreover, the role of institutional settings in corporate decisions differs following the GFC. Three empirical chapters contribute to the literature. First, Chapter 3 investigates the role of the GFC in the determinants and adjustment speed of leverage and debt maturity, and reveals that in the post-GFC period the effect of bankruptcy costs, agency costs and information asymmetry increases only for debt maturity, as opposed to leverage. The adjustment speed of leverage and debt maturity drops after the GFC due to the low supply of and demand for credit. Chapter 4 examines how cash holdings have been affected by the GFC across countries with different agency problems, and analyses how the rise of agency costs and information asymmetry can explain cash decisions before and after the GFC. Financially constrained firms adjust their cash holdings more quickly than unconstrained firms. However, while firms in low-governance countries have a slower adjustment speed of cash than those in high-governance countries pre-crisis, the reverse is found in the post-crisis period. Finally, Chapter 5 analyses the effect of agency problems and the GFC on dividend payouts. Contrary to firms in high-governance countries, those in common-law countries are less likely to pay out dividends after the GFC, as confirmed by the substitute and outcome models respectively. Also, dividends are used as a signalling device during the GFC. Overall, the GFC and institutional settings shape the corporate financial policies of firms, specifying where and when their shareholders invest.
  • Pharmaceutical supply chain resilience. An exploratory analysis of vulnerabilities and resilience strategies in the face of dynamic disruptions in the UK pharmaceutical supply chain

    Breen, Liz; Hou, Jiachen; Sowter, Julie; Yaroson, Emilia V. (University of Bradford, Faculty of Management, Law and Social Sciences, 2019)
    Pharmaceutical supply chains are susceptible to disruptions which impact the operational and financial performance of firms as well as patient safety. This study aimed to explore why the Pharmaceutical Supply Chain (PSC) in the UK is susceptible to the impact of dynamic disruptions and to examine how resilience strategies have been employed to reduce the effects of these disruptions. The Complex Adaptive System (CAS) theory was used as a framework in an exploratory, mixed-methods research design. The qualitative data were gathered through 23 semi-structured interviews with key supply chain actors across the PSC in the UK to explore their experiences. The findings from these semi-structured interviews were used to develop a survey which was distributed to a broader spectrum of supply chain actors, with a final sample of n=106. The data were triangulated to discuss the research findings. The initial results revealed power, conflict and complexities as drivers of vulnerabilities in the PSC. Antecedents for building resilience included visibility, flexibility and joint decision making as recovery strategies, and resource sharing as the resistance strategy. CAS provided a systemic approach to understanding PSC resilience rather than examining it in parts, taking into consideration the various elements that make up the entire system. Thus, vulnerabilities and resilience strategies were outcomes of the interactions between supply chain actors. The findings demonstrated that CAS, as a theory, provides a framework that is beneficial in exploring and gaining insights into PSC resilience. Also, by combining the two datasets (interviews and survey), an original output was proposed - the Pharmaceutical Supply Chain Resilience Framework (PSCRF) - which was used to recommend resilience strategies suitable for mitigating disruptions in the PSC.
  • An Investigation into the Performance of Ethnicity Verification Between Humans and Machine Learning Algorithms

    Ugail, Hassan; Logan, Andrew J.; Jilani, Shelina K. (University of Bradford, Faculty of Engineering and Informatics, School of Media, Design and Technology, 2020)
    There has been a significant increase in interest in the task of classifying demographic profiles, i.e. race and ethnicity. Ethnicity is a significant human characteristic, and applying facial image data to the discrimination of ethnicity is integral to face-related biometric systems. Given the diversity in the application of ethnicity-specific information, such as face recognition and iris recognition, and the availability of image datasets for the more commonly studied populations (Caucasian, African-American, Asian, and South-Asian Indian), a gap has been identified for the development of a system which analyses the full face and its individual feature components (eyes, nose and mouth) for the Pakistani ethnic group. An efficient system is proposed for the verification of Pakistani ethnicity, which incorporates a two-tier (computer vs human) approach. Firstly, hand-crafted features were used to ascertain the descriptive nature of a frontal image and facial profile for the Pakistani ethnicity. A total of 26 facial landmarks were selected (16 frontal and 10 for the profile), incorporating two models for redundant-information removal and a linear classifier for the binary task. The experimental results concluded that the facial profile image of a Pakistani face is distinct amongst other ethnicities. However, the methodology had limitations, for example low performance accuracy, the laborious nature of manual facial-landmark annotation, and the small facial image dataset. To make the system more accurate and robust, Deep Learning models are employed for ethnicity classification. Various state-of-the-art Deep models are trained on a range of facial image conditions, i.e. full-face and partial-face images, plus standalone feature components such as the nose and mouth. Since ethnicity is pertinent to the research, a novel facial image database entitled the Pakistani Face Database (PFDB) was created using a criterion-specific selection process, to ensure confidence in each of the assigned class memberships, i.e. Pakistani and Non-Pakistani. Comparative analysis between six Deep Learning models was carried out on augmented image datasets, and the analysis demonstrates that Deep Learning yields better performance accuracy compared to low-level features. The human phase of the ethnicity classification framework tested the discrimination ability of novice Pakistani and Non-Pakistani participants, using a computerised ethnicity task. The results suggest that humans are better at discriminating between Pakistani and Non-Pakistani full-face images, relative to individual face-feature components (eyes, nose, mouth), struggling the most with the nose when making judgements of ethnicity. To understand the effects of display conditions on ethnicity discrimination accuracy, two conditions were tested: (i) a Two-Alternative Forced Choice (2-AFC) procedure and (ii) a single-image procedure. The results concluded that participants perform significantly better in trials where the target (Pakistani) image is shown alongside a distractor (Non-Pakistani) image. To conclude the proposed framework, directions for future study are suggested to advance the current understanding of image-based ethnicity verification.
  • Three Essays in Financial Economics

    Sharma, Abhijit; Ozkan, Aydin; Grillini, Stefano (University of Bradford, Department of Accounting, Finance and Economics, School of Management, 2019)
    This thesis consists of three empirical essays in financial economics, with particular focus on the European Union and the Eurozone. The thesis investigates topics related to market liquidity and integration. In particular, it covers the transmission of liquidity shocks across Eurozone markets, the role of market liquidity in share repurchase programmes and the integration of Eurozone economies in terms of welfare gains from trade. Liquidity and integration have received considerable attention in recent years, particularly within the context of global financial and macroeconomic uncertainty over the last decade. In the first empirical essay, we investigate static and dynamic liquidity spillovers across Eurozone stock markets. Using a generalised vector autoregressive (VAR) model, we introduce a new measure of liquidity spillovers. We find strong evidence of interconnection across countries. We also test for the existence of liquidity contagion using a dynamic version of our static spillover index. Our results indicate that the transmission of shocks increases during periods of higher financial turbulence. Moreover, we find that core economies tend to be dominant transmitters of shocks, rather than absorbers. The second essay investigates the role played by market liquidity in the execution of open-market share repurchases in the UK, which is the most active market within the EU for this payout method. Using a unique hand-collected data set from Bloomberg Professional, we find that the execution of share repurchases does not depend on the long-term underlying motive, but rather relies on market liquidity and other macroeconomic variables. We also provide a methodological contribution using censored quantile regression (CQR), which overcomes most of the econometric limitations of the Tobit models widely employed previously within this literature. The third essay quantifies the welfare gains from trade for the Eurozone countries. We apply a trade model that allows us to estimate the increase in real consumption as a result of trade between countries. We estimate welfare gains using two sufficient aggregate statistics: the share of expenditure on domestic goods and the elasticity of exports with respect to trade costs. We offer a methodological contribution to the estimation of elasticities by applying Poisson pseudo-maximum likelihood (PPML) to a gravity model. PPML allows the estimation of gravity models in their exponential form, allowing the inclusion of zero trade flows and controlling for heteroskedasticity; previous studies present several econometric limitations as a result of estimating gravity models in their log-linearised form. Our results indicate that joining the euro did not significantly increase trade gains for member countries. Nevertheless, differences across countries are significant, and Northern economies experience a higher increase in welfare gains from trade compared with Southern economies.
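    The PPML estimator mentioned in the third essay is, in practice, a Poisson GLM fitted to the gravity equation in levels, which keeps zero trade flows in the sample. A sketch using statsmodels with synthetic data (variable names, coefficients and the robust-covariance choice are illustrative assumptions):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200
df = pd.DataFrame({
    "log_gdp_o": rng.normal(10, 1, n),        # log GDP of origin country
    "log_gdp_d": rng.normal(10, 1, n),        # log GDP of destination country
    "log_dist":  rng.normal(7, 0.5, n),       # log bilateral distance
})
# Synthetic trade flows drawn from an exponential gravity form (zeros allowed)
mu = np.exp(-8 + 0.9 * df.log_gdp_o + 0.8 * df.log_gdp_d - 1.1 * df.log_dist)
df["trade"] = rng.poisson(mu)

X = sm.add_constant(df[["log_gdp_o", "log_gdp_d", "log_dist"]])
ppml = sm.GLM(df.trade, X, family=sm.families.Poisson()).fit(cov_type="HC1")
print(ppml.params)   # elasticities estimated in levels, with zero flows retained
```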
  • Pharmacist educational interventions for patients with advanced cancer pain living in the community

    Breen, Liz; Fylan, Beth; Blenkinsopp, Alison; Edwards, Zoe (University of Bradford, Faculty of Life Sciences, 2019)
    Background: At the end of life, patients living in their own homes experience significantly more pain than those who die in either hospital or hospice care (Office for National Statistics, 2015). With the increasing prevalence of this problem, person-centred medicines optimisation is essential. Aim: To investigate the feasibility of community pharmacist medicines optimisation services for patients living with advanced cancer pain in community settings. Methods: Mixed methods were used, adopting a pragmatic stance and approach. Qualitative interviews, a systematic review and meta-analysis, and a proof-of-concept study were undertaken. Results: Patients with advanced cancer pain need support with their medicines, which could be provided by a pharmacist. Patients experienced a significant number of medicines-related problems, even those already receiving specialist palliative care. Most problems were addressed by pharmacist advice, with the remainder being referred for additional prescribing. Care for patients with cancer pain is currently not person-centred, and the current medicines optimisation model is unsuitable for this patient group. An enhanced model of medicines optimisation is therefore presented for patients with advanced cancer, and this model can be amended and adopted for other patient groups. Conclusions: An enhanced medicines optimisation model (MOCAP) has been created to inform person-centred medicines optimisation for patients with advanced cancer pain. Its feasibility and acceptability were confirmed, and it can be adapted for further clinical use. This model contributes to the goals of the NHS agenda of choice and control of care as proposed in the NHS Long Term Plan (NHS, 2019b).
