• Qualitative Adaptive Identification for Powertrain Systems. Powertrain Dynamic Modelling and Adaptive Identification Algorithms with Identifiability Analysis for Real-Time Monitoring and Detectability Assessment of Physical and Semi-Physical System Parameters

      Ebrahimi, Kambiz M.; Pezouvanis, Antonios; Souflas, Ioannis (University of Bradford, Faculty of Engineering and Informatics, 2015)
      A complete chain of analysis and synthesis system identification tools is presented for the detectability assessment and adaptive identification of parameters with physical interpretation that are commonly found in control-oriented powertrain models. This research is motivated by the fact that future powertrain control and monitoring systems will depend increasingly on physically oriented system models to reduce the complexity of existing control strategies and open the way to new environmentally friendly technologies. At the outset of this study a physics-based, control-oriented dynamic model of a complete transient engine testing facility, consisting of a single-cylinder engine, an alternating-current dynamometer and a coupling shaft unit, is developed to investigate the functional relationships between the inputs, outputs and parameters of the system. Having understood these, algorithms for identifiability analysis and adaptive identification of parameters with physical interpretation are proposed. The efficacy of the recommended algorithms is illustrated with three novel practical applications: the development of an on-line health monitoring system for engine-dynamometer coupling shafts based on recursive estimation of the shaft's physical parameters; the sensitivity analysis and adaptive identification of engine friction parameters; and the non-linear recursive parameter estimation, with parameter estimability analysis, of physical and semi-physical cyclic engine torque model parameters. The findings of this research suggest that the combination of physics-based, control-oriented models with adaptive identification algorithms can lead to the development of component-based diagnosis and control strategies. Ultimately, this work contributes to the area of on-line fault diagnosis, fault-tolerant control and adaptive control for vehicular systems.
    • Qualitative study exploring Maternity Ward Attendants’ perceptions of occupational (work related) stress and the coping methods they adopted within maternity care settings (hospital) in Nigeria

      Bradshaw, Gwendolen; Prowse, Julie M.; Kuforiji, Oluwatoyosi A. (University of Bradford, Faculty of Health Studies, 2017)
      Background: Occupational stress is a global and complex phenomenon, and workers in developing countries can be affected by it (International Labour Organisation, 2001). Staff within maternity settings have been identified as being at risk of suffering from stress, resulting in adverse health outcomes (Evenden and Sharpe, 2002). However, Maternity Ward Attendants' (MWAs') perceptions of stress have not been captured and are not reflected in the literature. Purpose: The aim of this study was to explore MWAs' perceptions of occupational stress, its possible cause(s), its impact, the support available and the coping methods they adopted within maternity care settings (hospital) in Nigeria. Methodology: This study adopted a qualitative methodology. Husserl's (1962) phenomenological approach was chosen as it enabled the researcher to collect rich, in-depth, descriptive accounts of the MWAs' perceptions of the phenomenon under study through the use of semi-structured interviews. Findings: The major sources of stress for MWAs included work overload, long working hours, staff shortages, work exploitation and intensification, and lack of support from senior staff. The stress levels MWAs experienced impacted on their health and well-being and resulted in related behavioural and physical reactions. Conclusion: This study confirmed that MWAs were exposed to stress factors similar to those experienced by other health workers and reported in the research literature. Additionally, it demonstrated the need for more qualitative studies to explore the perceptions of occupational stress among under-represented groups of healthcare workers. Importantly, this study created an opportunity to explore the experience of dedicated women facing challenging employment practices in hospital settings in Nigeria. Equally, it gave a voice to these unrecognised, almost invisible women, the MWAs, who played a key role within the maternity services.
    • Quantitative pharmacoproteomics investigation of anti-cancer drugs in mouse. Development and optimisation of proteomics workflows for evaluating the effect of anti-cancer drugs on mouse liver

      Sutton, Chris W.; Abumansour, Hamza M.A. (University of Bradford, Faculty of Life Sciences, 2016)
      Minimizing anti-cancer drug toxicity is a major challenge for the pharmaceutical industry. Toxicity is most frequently due to either the direct interaction of the drug with previously unidentified targets or its conversion to metabolites by drug-metabolizing enzymes (e.g. CYP450 enzymes) that cause cellular, tissue or organ damage. Pharmacoproteomics is beginning to take a central role in studying changes in protein expression corresponding to drug administration, the results of which inform about the mode of action, toxicity and resistance in pre-clinical and clinical stages of drug development. The main aim of this research is to apply comparative proteomics studies to livers from male and female mouse xenograft models treated with major anti-cancer drugs (5-fluorouracil, paclitaxel, cisplatin and doxorubicin) and the CYP inducer TCPOBOP, to investigate their effect on protein expression profiles (the proteome). Within this thesis, attention is paid to optimising a highly validated proteomics workflow for biomarker identification. Proteins were extracted from liver microsomes of mice treated in two separate sets, Set A – male (5-fluorouracil, doxorubicin, cisplatin and untreated) and Set B – female (5-fluorouracil, paclitaxel, TCPOBOP and untreated), using a cryo-pulverization and sonication method. The extracts were digested with trypsin and the resulting peptides labelled with 4-plex iTRAQ reagents. The labelled peptides were subjected to separation in two dimensions by iso-electric focusing (IEF) and RP-HPLC techniques before analysis by mass spectrometry and database searching for protein identification. Set A and Set B resulted in the identification and quantification of 1146 and 1743 proteins, respectively. Moreover, Set A and Set B recovered 26 and 34 cytochrome P450 isoforms, respectively. The microsomal changes after drug treatments were quite similar; however, more changes were observed in the male set. 
Up-regulation of major urinary proteins (MUPs) showed the greatest distinction in the protein expression patterns of the treated samples compared with the untreated controls. In Set A, 5-fluorouracil and cisplatin increased the expression of three isoforms (MUP1, 2 and 6), whereas doxorubicin increased the expression of four isoforms (MUP1, 2, 3 and 6). In Set B, only TCPOBOP increased expression, of two isoforms (MUP1 and 6). Our findings showed that the expression of MUPs, normally involved in the binding and excretion of pheromones, has drug- and sex-specific differences. The mechanism and significance of MUP up-regulation are ambiguous. Therefore, the impact of each therapeutic agent on MUPs and xenobiotic enzymes will be discussed.
    • A Quantitative Security Assessment of Modern Cyber Attacks. A Framework for Quantifying Enterprise Security Risk Level Through System's Vulnerability Analysis by Detecting Known and Unknown Threats

      Awan, Irfan U.; Pagna Disso, Jules F.; Munir, Rashid (University of Bradford, Faculty of Engineering and Informatics, 2014)
      The Cisco 2014 Annual Security Report clearly outlines the evolution of the threat landscape and the increase in the number of attacks. In 2012 the UK government recognised the cyber threat as a Tier-1 threat, since about 50 government departments had been either subjected to an attack or directly threatened by one. Cyberspace has become the platform of choice for businesses, schools, universities, colleges, hospitals and other sectors for business activities. One of the major problems identified by the Department of Homeland Security is the lack of clear security metrics. The recent cyber security breach of the US retail giant Target is a typical example that demonstrates the weaknesses of qualitative security, also described by some security experts as fuzzy security. High, medium or low as measures of security levels do not give a quantitative representation of the network security level of a company. In this thesis, a method is developed to quantify the security risk level of known and unknown attacks in an enterprise network in an effort to solve this problem. The vulnerabilities identified in a case study of a UK-based company are classified according to their severity risk levels using the Common Vulnerability Scoring System (CVSS) and the Open Web Application Security Project (OWASP) methodology. Probability theory is applied to known attacks to create the security metrics, and a detection and prevention method is suggested for the company network against unknown attacks. Our security metrics are clear, repeatable and scientifically verifiable.
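      The abstract does not state the aggregation formula itself, so the following is only a hedged sketch of how CVSS severity scores and elementary probability theory can be combined into one quantitative risk figure. The score-to-probability mapping and all function names here are illustrative assumptions, not the thesis's method.

      ```python
      # Illustrative sketch only: each CVSS base score (0-10) is mapped to a
      # notional exploitation probability, and basic probability theory then
      # gives the chance that at least one vulnerability is exploited,
      # assuming the vulnerabilities are independent.

      def exploitation_probability(cvss_base_score: float) -> float:
          """Map a CVSS base score (0-10) onto a notional probability in [0, 1]."""
          if not 0.0 <= cvss_base_score <= 10.0:
              raise ValueError("CVSS base scores lie in [0, 10]")
          return cvss_base_score / 10.0

      def network_risk_level(cvss_scores: list[float]) -> float:
          """P(at least one exploit) over all scanned vulnerabilities."""
          p_no_exploit = 1.0
          for score in cvss_scores:
              p_no_exploit *= 1.0 - exploitation_probability(score)
          return 1.0 - p_no_exploit

      # Example: three vulnerabilities found in a scan of the case-study network.
      risk = network_risk_level([7.5, 5.0, 9.8])  # close to 1: remediation urgent
      ```

      A single number like this replaces the fuzzy high/medium/low labels criticised above and is repeatable across scans.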
    • Quantum correlations and measurements in tri-partite quantum systems.

      Vourdas, Apostolos; Konstadopoulou, Anastasia; Idrus, Bahari bin (University of Bradford, Department of Computing, 2012-06-12)
      Correlations and entanglement in a chain of three oscillators A, B, C with nearest-neighbour coupling are studied. Oscillators A, B and B, C are coupled, but there is no direct coupling between oscillators A, C. Examples with initially factorizable states are considered, and the time evolution is calculated. It is shown that the dynamics of the tri-partite system creates correlations and entanglement among the three oscillators and, in particular, between oscillators A, C, which are not coupled directly. We have performed photon-number selective and non-selective measurements on oscillator A and investigated their effects on the correlations and entanglement. It is shown that, before the measurement, the correlations between oscillators A, C can be stronger than the correlations of oscillators A, B. Moreover, an entanglement witness shows that oscillators A, C are entangled, while oscillators A, B might or might not be entangled. By using quantum discord, which measures the quantumness of correlations, it is shown that there are quantum correlations between oscillators A, B and that after the measurements, in both the selective and non-selective cases, oscillators A, B and A, C become classically correlated.
    • Radio Resource Management for Satellite UMTS. Dynamic scheduling algorithm for a UMTS-compatible satellite network.

      Hu, Yim Fun; Chan, Pauline M.L.; Min, Geyong; Xu, Kai J. (University of Bradford, School of Engineering Design and Technology, 2013-11-20)
      The third generation of mobile communication systems introduces interactive multicast and unicast multimedia services at data rates of up to 2 Mbps and is expected to complete the globalization of mobile telecommunication systems. The implementation of these services on satellite systems, particularly for broadcast and multicast applications that complement terrestrial services, is ideal since satellite systems are capable of providing coverage in areas not served by terrestrial telecommunication services. However, the main bottleneck of such systems is the scarcity of radio resources for supporting multimedia applications, which has resulted in rapid growth in research efforts towards efficient radio resource management techniques. This issue is addressed in this thesis, where the main emphasis is to design a dynamic scheduling framework and algorithm that can improve the overall performance of the radio resource management strategy of a UMTS-compatible satellite network, taking into account the unique characteristics of wireless channel conditions. The thesis initially focuses on the design of the network and functional architecture of a UMTS-compatible satellite network. Based on this architecture, an effective scheduling framework is designed, which can provide different types of resource-assignment strategies. A functional model of the scheduler is defined to describe the behaviours and interactions between the different functional entities. An OPNET simulation model with a complete network protocol stack is developed to validate the performance of the scheduling algorithms implemented in the satellite network. Different types of traffic are considered for the OPNET simulation, such as the Poisson process, ON-OFF source and self-similar process, so that the performance of the scheduling algorithm can be analyzed for different types of services. 
A novel scheduling algorithm is proposed to optimise channel utilisation by considering the characteristics of the wireless channel, which are bursty and location dependent. In order to overcome channel errors, different code rates are applied for users under different channel conditions. The proposed scheduling algorithm is designed to give higher priority to users with higher code rates, so that the throughput of the network is optimized while maintaining the end users' service level agreements. The fairness of the proposed scheduling algorithm is validated using OPNET simulation. The simulation results show that the algorithm can fairly allocate resources to different connections, not only among different service classes but also within the same service class, depending on their QoS attributes.
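      The priority rule described above can be sketched in a few lines; the class ordering, the tie-breaking by waiting time and all identifiers below are illustrative assumptions rather than the thesis's actual implementation.

      ```python
      # Sketch of a code-rate-priority scheduler: connections are ordered by
      # service class first, then by code rate (higher code rate = better
      # channel = served earlier), with queueing delay as a fairness
      # tie-breaker within a class. All names here are assumptions.

      from dataclasses import dataclass

      @dataclass
      class Connection:
          user_id: str
          service_class: int   # lower value = higher-priority QoS class
          code_rate: float     # e.g. 1/2 or 3/4; higher means better channel
          waiting_time: float  # seconds spent queued so far

      def schedule(connections: list[Connection]) -> list[Connection]:
          """Order connections: service class, then code rate, then waiting time."""
          return sorted(
              connections,
              key=lambda c: (c.service_class, -c.code_rate, -c.waiting_time),
          )

      queue = [
          Connection("A", service_class=1, code_rate=0.50, waiting_time=2.0),
          Connection("B", service_class=1, code_rate=0.75, waiting_time=0.5),
          Connection("C", service_class=0, code_rate=0.50, waiting_time=1.0),
      ]
      order = [c.user_id for c in schedule(queue)]  # class 0 first, then by rate
      ```

      Favouring high-code-rate users maximises throughput on good channels, while the waiting-time tie-breaker prevents starvation within a service class.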
    • Raman spectroscopic application for the analysis of organic compounds and minerals of astrobiological significance. The detection and discrimination of organic compounds and mineral analogues in pure and mixed samples of astrobiological significance using Raman spectroscopy, XRD and scanning electron microscopy

      Edwards, Howell G.M.; Scowen, Ian J.; Alajtal, Adel I. (University of Bradford, Division of Chemical and Forensic Sciences, 2010-09-01)
      Raman spectroscopy has been used to characterise both organic and geological samples in order to build a database for the future characterisation of biomarker molecules of astrobiological relevance. Characteristic geological features and hydrated minerals recently found on the surface of Mars by the NASA planetary rovers Spirit and Opportunity suggest that a possible extinct biosphere, based on sources of energy similar to those that occurred on Earth, could once have existed there. Analytical instrumentation protocols for the unequivocal and non-destructive detection of biomarkers preserved in suitable geological matrices are therefore critical for future unmanned missions, including the forthcoming ESA ExoMars mission scheduled for 2018. Raman spectroscopy is currently part of the Pasteur instrumentation suite of the ExoMars mission for the remote detection of extant or extinct life signatures in the Martian surface and subsurface. Terrestrial analogues of Martian sites have been identified and the biogeological modifications resulting from extremophilic survival activity have been studied. Here we present the Raman spectral characterisation of several examples of organic compounds, recorded using 785 nm, 633 nm and 514 nm laser excitation: polycyclic aromatic hydrocarbons (PAHs), organic acids, chlorophyll and carotenoids. Experimental mixtures of β-carotene in usnic acid, PAHs in usnic acid and PAHs in mineral matrices have also been investigated. Organic compounds and PAHs located under crystalline mineral samples were identified using a 5x objective lens and 785 nm excitation. 
The pure compounds and compound mixtures were also analysed using X-ray powder diffraction (XRD) and scanning electron microscopy (SEM). The results of this study indicate that near-infrared laser excitation at 785 nm provided the clearest and most informative spectra, owing to the reduction of fluorescence emission; higher-energy lasers operating in the visible region resulted in the emission of significant background fluorescence. A few samples fluoresce even with 785 nm excitation, and for these FT-Raman spectroscopy remains the instrument of choice.
    • A Raman Spectroscopic Study of Solid Dispersions and Co-crystals During the Pharmaceutical Hot melt Extrusion Process

      Gough, Timothy D.; Paradkar, Anant R.; Banedar, Parineeta N. (University of Bradford, Centre for Pharmaceutical Engineering Sciences, Faculty of Life Sciences, 2015)
      Process Analytical Technology (PAT) is framed with the objective of designing and developing processes that ensure a predefined quality of the product at the end of manufacturing. PAT implementation includes better understanding of the process, reduction in production time through the use of in-line, at-line and on-line measurements, yield improvement, and energy and cost reductions. The hot melt extrusion (HME) process used in the present work is proving increasingly popular in industry for its continuous and green processing, which is beneficial over traditional batch processing. The present work focused on applications of Raman spectroscopy as an off-line and in-line monitoring technique, serving as a PAT tool for the production of pharmaceutical solid dispersions and co-crystals. Solid dispersions (SDs) of the anti-convulsant carbamazepine (CBZ) with two pharmaceutical-grade polymers have been produced using HME at a range of drug loadings, and their amorphous nature confirmed using a variety of analytical techniques. Off-line and in-line Raman spectroscopy have been shown to be suitable techniques for confirming the preparation of these SDs. Through calibration curves generated from chemometric analysis, in-line Raman spectroscopy was shown to be more accurate than off-line measurements, demonstrating the quantification capability of Raman spectroscopy as a PAT tool. Pure co-crystals of ibuprofen-nicotinamide and carbamazepine-nicotinamide have been produced using solvent evaporation and microwave radiation techniques. Raman spectroscopy proved its superiority over off-line analytical techniques such as DSC, FTIR and XRD for co-crystal purity determination, adding to its key advantage of being usable as an in-line, non-destructive technique.
    • The rational design of drug crystals to facilitate particle size reduction. Investigation of crystallisation conditions and crystal properties to enable optimised particle processing and comminution.

      York, Peter; de Matas, Marcel; Blagden, Nicholas; Leusen, Frank J.J.; Shariare, Mohammad H. (University of Bradford, School of Life Sciences, 2012-02-29)
      Micronisation of active pharmaceutical ingredients (APIs) to achieve desirable quality attributes for formulation preparation and drug delivery remains a major challenge in the pharmaceutical sciences. It is therefore important that the relationships between crystal structure, the mechanical properties of powders and their subsequent influence on processing behaviour are well understood. The aim of this project was therefore to determine the relative importance of particle attributes, including size, crystal quality and morphology, on processing behaviour and the characteristics of micronised materials. It was then intended to link this behaviour back to crystal structure and the nature of molecular packing and intermolecular interactions within the crystal lattice, enabling the identification of some generic rules which govern the quality of size-reduced powders. In this regard, different sieve fractions of lactose monohydrate and crystal variants of ibuprofen and salbutamol sulphate (size, morphology and crystal quality) were investigated in order to determine those factors with the greatest impact on post-micronisation measures of particle quality, including particle size, degree of crystallinity and surface energy. The results showed that smaller-sized feedstock should typically be used to achieve ultrafine powders with high crystallinity. This finding is attributed to the reduced number of fracture events necessary to reduce the size of the particles, leading to decreases in milling residence time. However, the frequency of crystal cracks is also important, with these imperfections being implicated in crack propagation and brittle fracture. Ibuprofen crystals with a greater number of cracks showed a greater propensity for comminution. 
Salbutamol sulphate with a high degree of crystal dislocations, however, gave highly energetic powders with a reduced degree of crystallinity, owing to the role dislocations play in facilitating plastic deformation, minimising fragmentation and extending the residence of particles in the microniser. Throughout these studies, morphology was also shown to be critical, with needle-like morphology giving an increased propensity for size reduction for both ibuprofen and salbutamol sulphate, which is related to the small crack propagation length of these crystals. This behaviour is also attributed to differences in the relative facet areas for the different particle morphologies, with the associated alternative deformation behaviour and slip direction influencing the size reduction process. Molecular modelling demonstrated a general relationship between low-energy slip planes, d-spacing and brittleness for a range of materials, with finer particle size distributions achieved for APIs with low values of the highest d-spacing for the identified slip planes. The highest d-spacing for any material can be readily determined by powder X-ray diffraction (PXRD), which can potentially be used to rank the milling behaviour of pharmaceutical materials and provides a rapid assessment tool to aid process and formulation design. These studies have shown that a range of crystal properties of the feedstock can be controlled in order to provide micronised powders with desirable attributes. These include the size, morphology and the density of defects and dislocations in the crystals of the feedstock. Further studies are, however, required to identify strategies to ensure inter-batch consistency in these attributes following crystallisation of organic molecules.
    • Rationalizing Ethically Questionable Intentions: An Investigation of Marketing Practices in the USA.

      Reast, Jon; Wallace, James; Overall, Jeffrey Scott (University of Bradford, School of Management, 2014-05-07)
      In this research, a model for ethically questionable decision-making is developed by amalgamating several decision-making theories. The variables of interest are the techniques of neutralization, perceived moral intensity, Machiavellianism, unethical intentions, and ethical judgment. Using a sample of 276 U.S. marketing professionals, partial least squares structural equation modelling was used to validate the model. Findings reveal that U.S. marketing professionals rationalize their ethically questionable intentions through their: (1) perception of moral intensity (i.e., minimizing the harms to others, perceiving their self-interest as most salient, and indifference to social consensus); (2) reliance on various neutralization techniques; and (3) judgment of their ethically questionable intentions as ethical. After controlling for the Machiavellian personality trait, Machiavellianism did not have a profound effect on the decision-making process, which implies that marketers in general are capable of the cognitive distortions found in this study. The main contribution to knowledge is the synthesis of the techniques of neutralization and the perceived moral intensity construct. Through this amalgamation, knowledge of the intermediary steps in the decision-making process has emerged. A further contribution to knowledge involves testing the relationship between Machiavellianism and unethical intentions through the mediating variable of the techniques of neutralization. Through this investigation, it was found that the Machiavellian personality is inconsequential to the decision-making process. As a contribution to managerial knowledge, it was found that, through cognitive distortions, marketers are capable of various illicit behaviours, which have been shown to be costly not only to stakeholders but also to the profitability and reputations of organisations.
    • (Re) Visiting Female Entrepreneurs: An Emancipatory Impulse

      Ford, Jackie M.; Larsen, Gretchen; Dean, Hannah (University of Bradford, School of Management, 2014-12-23)
      This thesis aims to emancipate female entrepreneurs from the metanarrative of economic growth which has created a false dichotomy of successful male entrepreneur versus an unsuccessful female entrepreneur. This aim is pursued through a multidisciplinary and critical inquiry that destabilises this metanarrative conceptually and empirically. A critical interrogation of economic studies reveals the embeddedness of the metanarrative in neo-classical economic growth theory. Far from being a true reflection of the entrepreneurial experience, the theory has silenced the innovator entrepreneur in economic theory and replaced him/her with an economic rational manager. Concurrently, a re-analysis of Schumpeter’s theorising suggests that his theories do not subordinate female entrepreneurs as claimed by a number of critical theorists. In contrast, his theorising is emancipatory and offers an alternative theoretical framework to the oppressive neo-classical economic growth theory. Oral history methods are used to capture the voices of female entrepreneurs which have largely been excluded from the literature. The oral history narratives challenge the oppressive homogeneity imposed by the metanarrative of economic growth and illustrate the negative influence of the theoretical foundation of neo-classical theory upon the entrepreneurial experience. The study offers theoretical, methodological and empirical contributions to female entrepreneurship studies by presenting a fresh interpretation of Schumpeter’s theorising; including the voices of the female entrepreneurs; and applying research approaches that break away from positivism which dominates entrepreneurial studies. The study has implications for policy makers and practitioners as it generates knowledge that takes account of the current social and economic changes.
    • Reaction calorimetry applied to kinetic problems. The design and construction of an isothermal calorimeter with heat compensation by the Peltier effect, and the application of the calorimeter in the study of reaction kinetics in solvent/water mixtures.

      Diaper, John; Canning, R.G. (University of Bradford, department not given, 2010-02-11)
      An isothermal calorimeter controlled by the Peltier effect has been designed and constructed in order to investigate reaction rates in solvent-water mixtures. Because a thermal method was used, a constant-temperature environment was essential; this was achieved by using a water bath controlled to ±0.001 °C. This calorimeter has been used to study the alkaline hydrolysis of methyl acetate in dimethylsulphoxide-water and tetrahydrofuran-water mixtures at 15, 25 and 35 °C. The results of other investigations on similar reactions have been reviewed, and an attempt has been made to correlate the electrostatic theories of Laidler and Eyring, and of Amis and Jaffé, with these results. Finally, because it appears that specific solvent interactions play a major part in the reaction rates, the role of water in the reaction mechanism has been examined. A mechanistic explanation has been proposed in order to correlate the rate of reaction with the composition of the water-solvent mixtures, which justifies the Laidler and Eyring treatment of solvent effects on ion-molecule reactions.
    • Real investment and dividend policy in a dynamic stochastic general equilibrium (DSGE) model. Corporate finance at an aggregate level through DSGE models.

      Freeman, Mark C.; Huang, Shih-Yun (University of Bradford, School of Management, 2012-06-15)
      In this thesis, I take a theoretical dynamic stochastic general equilibrium (DSGE) approach to investigate optimal aggregate dividend policy. I make the following contributions: 1. I extend the standard DSGE model to incorporate a residual dividend policy, external financing and default, and find that simulated optimal aggregate payouts are much more volatile than the observed data when other variables are close to the values observed in the data. 2. I examine the sensitivity of optimal aggregate dividend policy to the strength of the representative agent's habit motive. My results show that, when the habit motive gets stronger, the volatility of optimal aggregate payouts increases while the volatility of aggregate consumption decreases. This is consistent with the hypothesis that investors use cash payouts from well-diversified portfolios to help smooth consumption. 3. I demonstrate that the variability of optimal aggregate payouts is sensitive to capital adjustment costs. My simulated results show that costly frictions from changing the capital base of the firm cause optimal aggregate dividends and real investments to be smooth and share prices to be volatile. This finding is consistent with prior empirical observations. 4. I run simulations that support the hypothesis that optimal aggregate dividend policy is similar when the representative firm is risk averse to when it has capital adjustment costs. In both cases, optimal aggregate dividend volatility is very low. 5. In all calibrated DSGE models, apart from case 4, optimal aggregate payouts are found to be countercyclical. This supports the hypothesis that corporations prefer to hold more free cash flow for potential investment opportunities instead of paying dividends when the economy is booming, but is inconsistent with observed data. Keywords: Dynamic Stochastic General Equilibrium (DSGE), real business cycle, utility function, habits, dividends
    • A real time 3D surface measurement system using projected line patterns.

      Jiang, Ping; Baruch, John E.F.; Shen, Anqi (University of Bradford, Department of Computing, 2012-03-20)
      This thesis is based on a research project to evaluate a quality control system for car component stamping lines. The quality control system measures the abrasion of the stamping tools by measuring the surface of the products. A 3D vision system is developed for real-time online measurement of the product surface. In this thesis, there are three main research themes. The first is to produce an industrial application. All the components of this vision system are selected from industrial products, and user application software is developed. A rich human-machine interface for interaction with the vision system is developed, along with a link between the vision system and a control unit for interaction with a production line. The second research theme is to enhance the robustness of the 3D measurement. As an industrial product, this system will be deployed in different factories and should be robust against environmental uncertainties. For this purpose, a high signal-to-noise ratio is required, with the light pattern being produced by a laser projector. Additionally, multiple height calculation methods and a spatial Kalman filter are proposed for optimal height estimation. The final research theme is to achieve real-time 3D measurement. The vision system is expected to be installed on production lines for online quality inspection. A new 3D measurement method is developed which combines the spatial binary-coding method with phase-shift methods so that only a single image needs to be captured.
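      The idea of fusing several height estimates into one optimal value can be sketched with the scalar Kalman update, where each estimate carries its own variance. This is only an illustration of the general technique the abstract names; the measurement values, noise variances and function names below are assumptions, not the thesis's implementation.

      ```python
      # Minimal sketch: fuse several (height, variance) estimates of the same
      # surface point via successive scalar Kalman updates. Lower-variance
      # estimates pull the fused height more strongly, and the fused variance
      # is always smaller than any single input variance.

      def kalman_fuse(estimates: list[tuple[float, float]]) -> tuple[float, float]:
          """Fuse (height, variance) pairs; return the fused height and variance."""
          height, var = estimates[0]
          for z, r in estimates[1:]:
              k = var / (var + r)               # Kalman gain for a scalar state
              height = height + k * (z - height)
              var = (1.0 - k) * var
          return height, var

      # Three height estimates of one surface point (mm, variance in mm^2),
      # e.g. from the multiple height calculation methods mentioned above:
      fused, fused_var = kalman_fuse([(10.2, 0.04), (10.0, 0.01), (10.4, 0.09)])
      ```

      For static scalar states this sequence of updates is equivalent to inverse-variance weighting, so the fusion order does not affect the result.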
    • The realization of signal processing methods and their hardware implementation over multi-carrier modulation using FPGA technology. Validation and implementation of multi-carrier modulation on FPGA, and signal processing of the channel estimation techniques and filter bank architectures for DWT using HDL coding for mobile and wireless applications.

      Abd-Alhameed, Raed A.; Noras, James M.; Migdadi, Hassan S.O. (University of Bradford, Faculty of Engineering and Informatics, 2015)
      The first part of this thesis presents the design, validation, and implementation of an Orthogonal Frequency Division Multiplexing (OFDM) transmitter and receiver on a Cyclone II FPGA chip using the DSP Builder and Quartus II high-level design tools. The resources in terms of logic elements (LE), including combinational functions and logic registers, allocated by the model have been investigated and addressed. The results show that implementing the basic OFDM transceiver allocates about 14% (6% at the transmitter and 8% at the receiver) of the available LE resources on an Altera Cyclone II EP2C35F672C6 FPGA chip, largely taken up by the FFT, IFFT and soft-decision encoder. Secondly, a new wavelet-based OFDM system with FDPP-DA-based channel estimation is proposed as a reliable ECG Patient Monitoring System, a personal wireless telemedicine application. The system performance for different mother wavelets has been investigated. The effects of AWGN and multipath Rayleigh fading channels have also been studied in the analysis. The performances of FDPP-DA- and HDPP-DA-based channel estimation are compared for both DFT-based OFDM and wavelet-based OFDM systems. The system model was studied using MATLAB software, in which the average BER was evaluated for randomised data. The main error differences between the reconstructed and original ECG signals, which reflect the quality of the received ECG signals, are established. Finally, a DA-based architecture for 1-D iDWT/DWT based on an OFDM model is implemented for an ECG-PMS wireless telemedicine application. In the portable wireless body transmitter unit at the patient site, a fully serial DA-based scheme for the iDWT is realised to support higher hardware utilisation and lower power consumption, whereas a fully parallel DA-based scheme for the DWT is applied at the base unit of the hospital site to support a higher throughput.
It should be noted that behavioural-level HDL models of the proposed system were developed and simulated to confirm its correctness. After simulation, the design models were synthesised and implemented on the target FPGA to validate them.
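The thesis implements the transceiver in FPGA hardware via DSP Builder; purely as an illustration of the OFDM principle the abstract describes (symbol mapping, IFFT at the transmitter, FFT at the receiver, cyclic prefix), a minimal software sketch might look like the following. All function names and the tiny 8-subcarrier QPSK configuration are illustrative assumptions, not the thesis design:

```python
import cmath

# QPSK map: the first bit sets the sign of the real part, the second bit the imaginary part.
QPSK = {(0, 0): 1 + 1j, (0, 1): 1 - 1j, (1, 0): -1 + 1j, (1, 1): -1 - 1j}

def idft(symbols):
    """Naive inverse DFT: spreads frequency-domain symbols over orthogonal subcarriers."""
    n = len(symbols)
    return [sum(s * cmath.exp(2j * cmath.pi * k * t / n) for k, s in enumerate(symbols)) / n
            for t in range(n)]

def dft(samples):
    """Naive DFT: recovers the per-subcarrier symbols at the receiver."""
    n = len(samples)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * t / n) for t, x in enumerate(samples))
            for k in range(n)]

def ofdm_modulate(bits, n_sub=8, cp_len=2):
    """One OFDM symbol: QPSK-map bit pairs, IDFT, then prepend a cyclic prefix."""
    syms = [QPSK[(bits[i], bits[i + 1])] for i in range(0, 2 * n_sub, 2)]
    time = idft(syms)
    return time[-cp_len:] + time  # cyclic prefix guards against multipath ISI

def ofdm_demodulate(samples, n_sub=8, cp_len=2):
    """Strip the cyclic prefix, DFT back to subcarriers, hard-decide the QPSK bits."""
    freq = dft(samples[cp_len:])
    bits = []
    for s in freq:
        bits += [0 if s.real > 0 else 1, 0 if s.imag > 0 else 1]
    return bits
```

In a real design the naive DFT loops are replaced by FFT/IFFT cores, which is exactly why those blocks dominate the logic-element budget quoted above.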
    • A reappraisal of archaeological geophysical surveys on Irish road corridors 2001-2010. With particular reference to the influence of geological, seasonal and archaeological variables

      Gaffney, Christopher F.; Armit, Ian; Bonsall, James P.T. (University of Bradford, Archaeological and Environmental Science, 2015-07-15)
      Geophysical surveys in the Republic of Ireland and elsewhere rarely have the opportunity to receive direct, meaningful and quantitative feedback from ground-observed excavations, despite their frequent occurrence as a subsequent phase of development-led archaeological projects. This research critically reappraises the largest and most coherent geophysical archive maintained by a single end-user over a ten-year period. The geophysical archive has been collated from 170 reports on linear road schemes produced for commercially driven assessments in Ireland, facilitating the largest analysis to date of geophysical survey legacy data and subsequent detailed excavations. The analysis of the legacy data archive has reviewed and tested the influence of key variables that have, in some circumstances, affected the methods and outcomes of geophysical assessments in Ireland over the last ten years. By understanding the impact of those key variables upon the legacy data - which include archaeological feature type, geology, sampling strategy and seasonality - appropriate and new ways to research linear corridors have been suggested that should be employed in future geophysical survey assessments for a range of environments and archaeological site types. The comprehensive analysis of geophysical surveys from the legacy data archive has produced definitive statements regarding the validity of geophysical techniques in Ireland. Key failures that occurred in the past have been identified, and a thorough investigation of new and novel techniques or methods of survey will facilitate a more robust approach to geophysical survey strategies in the future. The outcomes of this research are likely to have ramifications beyond the Irish road corridors from which the legacy data derives.
    • Reconfigurable modelling of physically based systems: Dynamic modelling and optimisation for product design and development applied to the automotive drivetrain system.

      Ebrahimi, Kambiz M.; Mason, Byron A. (University of Bradford, School of Engineering, Design and Technology, 2009-08-25)
      The work of this thesis is concerned with the aggregation and advancement of modelling practice as used within modern-day product development and optimisation environments making use of Model Based Design ('MBD') and similar procedures. A review of model development and use forms the foundation of the work, with the findings being aggregated into two unique approaches for rapid model development and reconfiguration: the Plug-and-Simulate ('PaS') approach and the Paradigm for Large Model Creation ('PLMC'), each shown to possess its own advantages. To support the MBD process, a model optimisation algorithm is developed that seeks to eliminate parameters of little or no significance to a simulation. Eliminations are made on the basis of an energy analysis which determines the activity of a number of energy elements. Low-activity elements are said to be of less significance to the global dynamics of a model and thus become targets for elimination. A model configuration tool is presented that brings together the PLMC and the parameter elimination algorithm. The tool is shown to be useful for rapid configuration and reconfiguration of models and is capable of automatically running the optimisation algorithms, thus producing a simulation model that is parametrically and computationally optimised. The response of the plug-and-simulate drivetrain submodels, assembled to represent a front-wheel-drive drivetrain, is examined. The resulting model is subjected to a torque step input and an empirically obtained torque curve that characterises the input to a drivetrain undergoing steady acceleration. The model displays the expected response in both its full-parameter and parameter-reduced versions, with simulation efficiency gains observed in the parameter-reduced version.
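The abstract describes eliminating model parameters via an energy analysis that flags low-activity elements. A minimal sketch of that idea follows; the activity metric (time integral of absolute power) is a common choice in the energy-based model-reduction literature, but the specific metric, threshold rule and all names here are assumptions for illustration, not the thesis's exact algorithm:

```python
def element_activity(power_trace, dt):
    """Activity of an energy element: time integral of its absolute power
    over the simulation run, approximated by a rectangle rule."""
    return sum(abs(p) * dt for p in power_trace)

def eliminate_low_activity(power_traces, dt, rel_threshold=0.01):
    """Rank energy elements by activity and flag those contributing less than
    rel_threshold of the total activity as candidates for elimination."""
    activity = {name: element_activity(trace, dt)
                for name, trace in power_traces.items()}
    total = sum(activity.values()) or 1.0  # avoid division by zero
    keep = sorted(n for n, a in activity.items() if a / total >= rel_threshold)
    drop = sorted(n for n, a in activity.items() if a / total < rel_threshold)
    return keep, drop

# Hypothetical power traces from a drivetrain simulation (names invented):
traces = {"engine_inertia": [5.0] * 100,
          "shaft_stiffness": [4.0] * 100,
          "tiny_damper": [0.001] * 100}
keep, drop = eliminate_low_activity(traces, dt=0.01)
# the damper contributes a negligible share of total activity, so it is flagged
```

The appeal of such a rule is that it is cheap to evaluate from a single simulation run, which is what makes automatic reconfiguration of the kind described above practical.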
    • The reconfiguration of the state in an era of neoliberal globalism: State violence and indigenous responses in the Costa Chica-Montaña of Guerrero, Mexico.

      Pearce, Jenny V.; Parra-Rosales, L.P. (University of Bradford, Department of Peace Studies, 2009-07-31)
      The adoption of the neo-liberal model in the mid-1980s has forced the governing elites to reconfigure the Mexican State. However, the consolidation of a neoliberal State remains incomplete, and it has been problematic to fully integrate the Mexican economy into the global market due to increasing organised crime, the dismantling of previous post-revolutionary control mechanisms, and the growing mobilisation of organised indigenous opposition, ranging from the peaceful obstruction of hydroelectric mega-projects in their territories to armed struggle. In view of the State crisis, this thesis argues that there has been a shift in the system of control mechanisms of the State, leaning towards a more recurrent use of open violence to implement its neo-liberal State project. From a theoretical perspective, the research proposes an innovative approach to understanding the formation of the post-revolutionary State, which transcends the State violence dichotomy established between the 'corporatist' and the 'critical' approaches in the contemporary literature. The research highlights the wide spectrum of control mechanisms, from hegemonic domination to violence, used by the governing elites to compensate for the unfinished State formation process in order to maintain socio-political stability without profound structural changes. It explores the enhanced tendency of State violence to replace incorporation in State-society relations since the efforts to restructure the economy from the 1980s onwards. The thesis analyses how this tendency has grown particularly in response to indigenous movements in the South of Mexico. The argument is substantiated empirically with two case studies undertaken in the sub-region of Costa Chica-Montaña of Guerrero, with data from 79 semi-structured interviews with a wide range of social and political actors, and participant observation in ten indigenous communities.
The case studies explore the different State control mechanisms used to advance the State formation model in the post-revolutionary period; the impact of the crisis of those mechanisms in the sub-region; the violent resistance of local bosses to the loss of power; and the multiple indigenous responses to the implementation of neoliberal policies in their territories. This research also includes a comparative study to explain some factors that strengthen indigenous articulations, as well as their limits in an era of neoliberal globalisation. One of the most important research findings is that neoliberalism has further weakened the 'civilianisation' power of the State to deal peacefully with civil society sectors, particularly with indigenous peoples, while it has strengthened its 'centralised-coercive' power to carry out the imposed State model. Another finding is that the indigenous initiatives that have reinvented themselves through a new version of their practices and broader alliances have consolidated their alternative models. In contrast, the indigenous responses that have reproduced their traditions have failed.
    • Reconstruction of 3D scenes from pairs of uncalibrated images. Creation of an interactive system for extracting 3D data points and investigation of automatic techniques for generating dense 3D data maps from pairs of uncalibrated images for remote sensing applications.

      Ipson, Stanley S.; Qahwaji, Rami S.R.; Alkhadour, Wissam M. (University of Bradford, School of Computing, Informatics & Media, 2011-07-06)
      Much research effort has been devoted to producing algorithms that contribute directly or indirectly to the extraction of 3D information from a wide variety of types of scenes and conditions of image capture. The research work presented in this thesis is aimed at three distinct applications in this area: interactively extracting 3D points from a pair of uncalibrated images in a flexible way; finding corresponding points automatically in high-resolution images, particularly those of archaeological scenes captured from a freely moving light aircraft; and improving a correlation approach to dense disparity mapping leading to 3D surface reconstructions. The fundamental concepts required to describe the principles of stereo vision, the camera models, and the epipolar geometry described by the fundamental matrix are introduced, followed by a detailed literature review of existing methods. An interactive system for viewing a scene via a monochrome or colour anaglyph is presented which allows the user to choose the level of compromise between amount of colour and ghosting perceived by controlling colour saturation, and to choose the depth plane of interest. An improved method of extracting 3D coordinates from disparity values when there is significant error is presented. Interactive methods, while very flexible, require significant effort from the user in finding and fusing corresponding points, and the thesis continues by presenting several variants of existing scale-invariant feature transform methods to automatically find correspondences in uncalibrated high-resolution aerial images with improved speed and memory requirements. In addition, a contribution to estimating lens distortion correction by a Levenberg-Marquardt-based method is presented, generating the straight-line data strings that are essential input for the estimation.
The remainder of the thesis presents correlation based methods for generating dense disparity maps based on single and multiple image rectifications using sets of automatically found correspondences and demonstrates improvements obtained using the latter method. Some example views of point clouds for 3D surfaces produced from pairs of uncalibrated images using the methods presented in the thesis are included.
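As a toy illustration of correlation-based dense disparity mapping on rectified image pairs, the classic approach slides a window along the matching scanline (rectification makes the search one-dimensional) and keeps the shift with the lowest matching cost. The sum-of-absolute-differences cost and all names below are illustrative assumptions, not the thesis's exact correlation method:

```python
def sad(a, b):
    """Sum of absolute differences: a simple window-matching cost."""
    return sum(abs(x - y) for x, y in zip(a, b))

def disparity_row(left, right, window=3, max_disp=5):
    """For each pixel in the left scanline, find the horizontal shift of the
    best-matching window in the right scanline of a rectified pair."""
    half = window // 2
    disparities = []
    for x in range(half, len(left) - half):
        patch = left[x - half:x + half + 1]
        best_cost, best_d = None, 0
        for d in range(max_disp + 1):
            if x - d - half < 0:  # window would run off the image edge
                break
            cand = right[x - d - half:x - d + half + 1]
            cost = sad(patch, cand)
            if best_cost is None or cost < best_cost:
                best_cost, best_d = cost, d
        disparities.append(best_d)
    return disparities

# A bright feature shifted right by 2 pixels in the left image should yield disparity 2.
right_row = [0, 0, 0, 10, 0, 0, 0, 0, 0, 0]
left_row = [0, 0, 0, 0, 0, 10, 0, 0, 0, 0]
```

Disparity is then converted to depth via the calibrated (or estimated) baseline and focal length; textureless regions, where every window matches equally well, are exactly where such correlation methods degrade, which motivates the multiple-rectification improvements described above.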
    • Reducing Client-Server Communication for Efficient Real-Time Web Applications: The Use of Adaptive Polling as A Case Study for Multi-User Web Applications

      Ridley, Mick J.; Cullen, Andrea J.; Aziz, Hatem M. (University of Bradford)
      A key challenge of current multi-user web applications is to provide users with interesting events and information in real time. This research reviews the most common real-time web techniques to identify drawbacks while exploring solutions to improve simplicity, efficiency, and compatibility within a client-server environment. Two solutions are proposed for enhancing the efficiency of real-time web techniques by reducing client-server communication. First, a browser monitoring control model observes browser activity and decides whether to postpone client-server communication in the case of inactive tabs. This model was implemented and tested, with results demonstrating that a significant number of client-server connections can be avoided while the browser is in the background. These results suggest the solution can be adopted for any real-time technique, as it is a developer-side technique that works consistently on all browsers. Second, 'Adaptive Polling' is a pull-based real-time web technique that overcomes the bandwidth issues of the reverse-AJAX method of 'Polling' by controlling the frequency of requesting updates from the server based on the last server response. This approach was implemented and tested, with results showing that a significant number of redundant connections can be avoided while the server has no updates to return. This solution is a good alternative to other real-time web techniques, as it offers low latency, simplicity of implementation, and compatibility with all browsers and servers.
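The abstract states that Adaptive Polling adjusts the request frequency based on the last server response but does not spell out the rule. One plausible sketch, assuming a geometric back-off with a cap (the specific policy, parameters and names are assumptions, not the thesis's algorithm), is:

```python
def next_interval(current, had_update, base=1.0, factor=2.0, cap=30.0):
    """Adaptive polling: poll fast while updates arrive, back off geometrically
    (up to a cap) while the server returns nothing new."""
    return base if had_update else min(current * factor, cap)

# Simulated session: one update arrives, then the server goes quiet.
interval, schedule = 1.0, []
for had_update in [True, False, False, False, False, False, False]:
    interval = next_interval(interval, had_update)
    schedule.append(interval)
# schedule -> [1.0, 2.0, 4.0, 8.0, 16.0, 30.0, 30.0]
```

The trade-off is visible in the schedule: a quiet server is polled ever less often, saving redundant connections, at the cost of slightly higher latency for the first update after a quiet period.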