The importance of contextual factors on the accuracy of estimates in project management. An emergence of a framework for more realistic estimation process

Publication date: 2014
Author: Lazarski, Adam
Supervisor: Hussain, Zahid I.
Rights: The University of Bradford theses are licensed under a Creative Commons Licence.
Institution: University of Bradford
Department: School of Management
Awarded: 2014

Abstract
Successful projects are characterized by the quality of their planning. Good planning that better takes into account contextual factors allows more accurate estimates to be achieved. As an outcome of this research, a new framework composed of best practices has emerged. This comprises an open platform that project experts and practitioners can work with efficiently, and that researchers can develop further as required. The research investigation commenced in the autumn of 2008 with a pilot study and then proceeded through an inductive research process involving a series of eleven interviews: four with well-recognized experts in the field, four with different practitioners, and three group interviews. In addition, a long-running observation of forty-five days was conducted and combined with other data sources, culminating in the proposal of a new framework for improving the accuracy of estimates. Furthermore, the emerging framework, together with a description of the know-how required to apply it, was systematically reviewed over the course of four hundred and twenty-five days of meetings, dedicated for the most part to improving the use of a wide range of specific project management tools and techniques and to a better understanding of planning and the estimation process associated with it. This approach constituted an ongoing verification of the research’s findings against project management practice and also served as an invaluable resource for the researcher’s professional and practice-oriented development. The results obtained offered fresh insights into the importance of knowledge management in the estimation process, including the “value of not knowing”, the oft-overlooked phenomenon of underestimation and its potential to co-exist with overestimation, and the use of negative buffer management in the critical chain concept to secure project deadlines. The project also highlighted areas for improvement in future research practice that seeks to use an inductive approach to arrive at a socially agreed framework rather than a theory alone. In addition, improvements were suggested to the various qualitative tools employed in the customized data analysis process.
Type: Thesis
Qualification name: DBA
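As an aside before the related items: the abstract's mention of buffer management in the critical chain concept can be illustrated with a minimal, generic sketch of buffer sizing and consumption tracking. The root-square-error sizing rule, the task durations and the overrun figures below are hypothetical conventions chosen for the example, not the framework or data developed in the thesis; a negative remaining buffer is simply read as the deadline being at risk.

from math import sqrt

# Hypothetical critical-chain tasks: (aggressive estimate, safe estimate) in days.
critical_chain = [(4, 7), (6, 10), (3, 5), (8, 13)]

chain_length = sum(aggressive for aggressive, _ in critical_chain)

# Project buffer sized with the common root-square-error rule: the square root
# of the summed squared safety margins removed from the individual tasks.
project_buffer = sqrt(sum((safe - aggressive) ** 2 for aggressive, safe in critical_chain))

def remaining_buffer(days_elapsed, chain_days_completed):
    """Remaining project buffer; a negative value signals the deadline is at risk."""
    overrun = days_elapsed - chain_days_completed
    return project_buffer - overrun

print(f"chain length: {chain_length} d, project buffer: {project_buffer:.1f} d")
print(f"remaining buffer after a 4-day overrun: {remaining_buffer(14, 10):.1f} d")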
Related items
Showing items related by title, author, creator and subject.
-
Analogy-based software project effort estimation. Contributions to projects similarity measurement, attribute selection and attribute weighting algorithms for analogy-based effort estimation.
Neagu, Daniel; Cowling, Peter I.; Azzeh, Mohammad Y.A. (University of Bradford, Department of Computing, School of Computing, Informatics & Media, 2010-10-01)
Software effort estimation by analogy is a viable alternative to other estimation techniques, and in many cases researchers have found that it outperforms other estimation methods in terms of accuracy and practitioners' acceptance. However, the overall performance of analogy-based estimation depends on two major factors: the similarity measure and attribute selection and weighting. Current similarity measures, such as nearest-neighbourhood techniques, have been criticized for inadequacies related to attribute relevancy, noise and uncertainty, in addition to the problem of handling categorical attributes. This research focuses on improving the efficiency and flexibility of analogy-based estimation to overcome these inadequacies. In particular, this thesis proposes two new approaches to model and handle uncertainty in the similarity measurement method and, most importantly, to reflect the structure of the dataset in similarity measurement using fuzzy modelling based on the Fuzzy C-means algorithm. The first proposed approach, the Fuzzy Grey Relational Analysis method, combines fuzzy set theory and Grey Relational Analysis to improve local and global similarity measures and to tolerate the imprecision associated with using different data types (continuous and categorical). The second proposed approach uses fuzzy numbers and their concepts to develop a practical yet efficient approach to support analogy-based systems, especially at the early phases of software development. Specifically, we propose a new similarity measure and adaptation technique based on fuzzy numbers. We also propose a new attribute subset selection algorithm and attribute weighting technique based on the underlying hypothesis of analogy-based estimation, namely that projects that are similar in terms of attribute values are also similar in terms of effort values, using row-wise Kendall rank correlation between the similarity matrix based on project effort values and the similarity matrix based on project attribute values. A literature review of related software engineering studies revealed that existing attribute selection techniques (such as brute-force and heuristic algorithms) are restricted to particular choices of performance indicators (such as the Mean Magnitude of Relative Error and the Prediction Performance Indicator) and are computationally far more intensive. The proposed algorithms provide a sound statistical basis and justification for their procedures. The performance of the proposed approaches has been evaluated using real industrial datasets. Results and conclusions from a series of comparative studies with the conventional estimation by analogy approach using the available datasets are presented. The studies also statistically investigated the significance of differences between predictions generated by our approaches and those generated by the most popular techniques, such as conventional analogy estimation, neural networks and stepwise regression. The results and conclusions indicate that the two proposed approaches have the potential to deliver comparable, if not better, accuracy than the compared techniques.
The results also show that Grey Relational Analysis tolerates the uncertainty associated with using different data types. As well as the original contributions within the thesis, a number of directions for further research are presented. Most chapters of this thesis have been disseminated in international journals and highly refereed conference proceedings.
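For readers unfamiliar with estimation by analogy, the following minimal sketch shows the conventional baseline that the thesis improves upon: attributes are min-max normalised, the k nearest historical projects are found by Euclidean distance, the effort estimate is their mean, and accuracy is summarised with MMRE under leave-one-out. The project data, the value of k and the plain Euclidean similarity are assumptions made for illustration; the Fuzzy Grey Relational Analysis measures and Kendall-correlation-based weighting proposed in the thesis are not reproduced here.

import numpy as np

# Hypothetical historical projects: rows are projects, columns are attributes
# (e.g. size, team size, complexity); effort is in person-months.
X = np.array([[120.0, 5, 3], [300.0, 9, 4], [80.0, 3, 2], [200.0, 7, 3]])
effort = np.array([14.0, 40.0, 8.0, 25.0])

def estimate_by_analogy(x_new, X_hist, effort_hist, k=2):
    """Mean effort of the k nearest analogues under min-max normalised Euclidean distance."""
    lo, hi = X_hist.min(axis=0), X_hist.max(axis=0)
    scale = np.where(hi > lo, hi - lo, 1.0)          # avoid division by zero
    dist = np.linalg.norm((X_hist - lo) / scale - (x_new - lo) / scale, axis=1)
    nearest = np.argsort(dist)[:k]                   # indices of the k closest projects
    return effort_hist[nearest].mean()

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return np.mean(np.abs(actual - predicted) / actual)

# Leave-one-out evaluation over the toy dataset.
predictions = [estimate_by_analogy(x, np.delete(X, i, axis=0), np.delete(effort, i))
               for i, x in enumerate(X)]
print(f"predictions: {np.round(predictions, 1)}, MMRE: {mmre(effort, predictions):.2f}")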
-
Neural network based hybrid modelling and MINLP based optimisation of MSF desalination process within gPROMS: Development of neural network based correlations for estimating temperature elevation due to salinity, hybrid modelling and MINLP based optimisation of design and operation parameters of MSF desalination process within gPROMS.
Mujtaba, Iqbal; Sowgath, Md Tanvir (University of Bradford, School of Engineering Design and Technology, 2007)
Desalination technology provides fresh water to the arid regions around the world. The Multi-Stage Flash (MSF) distillation process has been used for many years and is now the largest sector in the desalination industry. The Top Brine Temperature (TBT), the boiling point temperature of the feed seawater in the first stage of the process, is one of the many important parameters that affect the optimal design and operation of MSF processes. For a given pressure, TBT is a function of the Boiling Point Temperature (BPT) at zero salinity and the Temperature Elevation (TE) due to salinity. Modelling plays an important role in the simulation, optimisation and control of MSF processes, and within the model the calculation of TE is therefore important for each stage (including the first stage, which determines the TBT).

Firstly, in this work, several Neural Network (NN) based correlations for predicting TE are developed. It is found that the NN based correlations can predict the experimental TE very closely. Predictions of TE by the NN based correlations also compared well with those obtained using existing correlations from the literature. Secondly, a hybrid steady state MSF process model is developed using the gPROMS modelling tool, embedding the NN based correlation. gPROMS provides an easy and flexible platform to build a process flowsheet graphically. Here a master model that automatically connects the individual unit model equations (brine heater, stages, etc.) is developed and used repeatedly during simulation and optimisation. The model is validated against published results. Seawater is the main raw material for MSF processes and is subject to seasonal temperature variation. With a fixed design, the model is then used to study the effect of a number of parameters (e.g. seawater and steam temperature) on the freshwater production rate; variation in these parameters is observed to affect the rate of fresh water production. How the design and operation are to be adjusted to maintain a fixed demand for fresh water throughout the year (with changing seawater temperature) is also investigated via repetitive simulation.

Thirdly, with a clear understanding of the interaction of design and operating parameters, simultaneous optimisation of the design and operating parameters of the MSF process is considered via the application of an MINLP technique within gPROMS. Two types of optimisation problems are considered: (a) for a fixed fresh water demand throughout the year, the external heat input (a measure of operating cost) to the process is minimised; (b) for different fresh water demands throughout the year and with seasonal variation of seawater temperature, the total annualised cost of desalination is minimised. It is found that seasonal variation in seawater temperature results in significant variation in design and in some of the operating parameters, but with minimal variation in process temperatures.
The results also reveal the possibility of designing stand-alone flash stages, which would offer flexible scheduling in terms of the connection of the various units (to build up the process) and efficient maintenance of the units throughout the year as weather conditions change. In addition, operation at low temperatures throughout the year will reduce design and operating costs in terms of low-temperature materials of construction and a reduced amount of anti-scaling and anti-corrosion agents. Finally, an attempt was made to develop a hybrid dynamic MSF process model incorporating the NN based correlation for TE. The model was validated at steady state conditions using data from the literature. Dynamic simulation with step changes in seawater and steam temperature was carried out to match the predictions of the steady state model. A dynamic optimisation problem is then formulated for the MSF process, subject to seawater temperature changes (up and down) over a period of six hours, to maximise the performance ratio by optimising the brine heater steam temperature while maintaining a fixed water demand.
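As a rough illustration of what an NN based correlation TE = f(temperature, salinity) looks like in code, the sketch below fits a small feed-forward network to synthetic data with scikit-learn and evaluates it at one operating point, as a stage model might when computing TBT = BPT + TE. The data-generating formula, the network size and the use of scikit-learn are assumptions made for the example; the thesis develops its correlations from experimental TE data and embeds them in gPROMS.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Synthetic (made-up) training data: brine temperature in deg C, salinity in g/kg,
# and a toy analytic TE surface with noise standing in for experimental values.
rng = np.random.default_rng(0)
T = rng.uniform(40.0, 110.0, 500)
S = rng.uniform(30.0, 70.0, 500)
TE = 0.002 * S + 0.0003 * S * T / 100.0 + rng.normal(0.0, 0.005, 500)

X = np.column_stack([T, S])
scaler = StandardScaler().fit(X)

# Small feed-forward network used as the TE correlation; the size is arbitrary.
net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(scaler.transform(X), TE)

# A stage model would then call the correlation, e.g. TBT = BPT(P) + TE(T, S).
te_pred = net.predict(scaler.transform([[90.0, 45.0]]))[0]
print(f"predicted TE at 90 deg C and 45 g/kg: {te_pred:.4f} deg C")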
-
Kinetic Modelling Simulation and Optimal Operation of Trickle Bed Reactor for Hydrotreating of Crude Oil. Kinetic Parameters Estimation of Hydrotreating Reactions in Trickle Bed Reactor (TBR) via Pilot Plant Experiments; Optimal Design and Operation of an Industrial TBR with Heat Integration and Economic Evaluation.
Mujtaba, Iqbal; Wood, Alastair S.; Jarullah, Aysar Talib (University of Bradford, School of Engineering, Design and Technology, 2012-01-30)
Catalytic hydrotreating (HDT) is a mature process technology practiced in the petroleum refining industries to treat oil fractions for the removal of impurities (such as sulfur, nitrogen, metals and asphaltenes). Hydrotreating of whole crude oil is a new technology, regarded as one of the more difficult tasks, and has not been reported widely in the literature. In order to obtain useful models for the HDT process that can be confidently applied to reactor design, operation and control, accurate estimation of the kinetic parameters of the relevant reaction scheme is required. This thesis aims to develop a crude oil hydrotreating process (based on hydrotreating of the whole crude oil followed by distillation) with high efficiency, selectivity and minimum energy consumption via pilot plant experiments, mathematical modelling and optimization.

To estimate the kinetic parameters and to validate the kinetic models under different operating conditions, a set of experiments was carried out in a continuous flow isothermal trickle bed reactor using crude oil as the feedstock and commercial cobalt-molybdenum on alumina (Co-Mo/γ-Al2O3) as the catalyst. The reactor temperature was varied from 335°C to 400°C, the hydrogen pressure from 4 to 10 MPa and the liquid hourly space velocity (LHSV) from 0.5 to 1.5 hr⁻¹, keeping the hydrogen to oil ratio (H2/Oil) constant at 250 L/L. The main hydrotreating reactions were hydrodesulfurization (HDS), hydrodenitrogenation (HDN), hydrodeasphaltenization (HDAs) and hydrodemetallization (HDM), which includes hydrodevanadization (HDV) and hydrodenickelation (HDNi). An optimization technique is used to evaluate the best kinetic models of a trickle-bed reactor (TBR) process utilized for HDS, HDAs, HDN, HDV and HDNi of crude oil based on the pilot plant experiments. The minimization of the sum of the squared errors (SSE) between the experimental and estimated concentrations of sulfur (S), nitrogen (N), asphaltene (Asph), vanadium (V) and nickel (Ni) compounds in the products is used as the objective function in the optimization problem, using two approaches (linear (LN) and non-linear (NLN) regression).

The demand for high-quality middle distillates is growing worldwide, whereas the demand for low-value oil products, such as heavy oils and residues, is decreasing. Thus, maximizing the production of more liquid distillates of very high quality is of immediate interest to refiners. At the same time, environmental legislation has led to stricter specifications for petroleum derivatives. Crude oil hydrotreatment enhances the productivity of distillate fractions due to chemical reactions. The hydrotreated crude oil was distilled into the following fractions (using a pilot plant distillation unit): light naphtha (L.N), heavy naphtha (H.N), heavy kerosene (H.K), light gas oil (L.G.O) and reduced crude residue (R.C.R), in order to compare the yield of these fractions produced by distillation after the HDT process with those produced by conventional methods (i.e. HDT of each fraction separately after distillation).
The middle distillate showed a greater yield compared with the middle distillate produced by conventional methods, in addition to improved properties of the R.C.R. Kinetic models that enhance the productivity of oil distillates are also proposed, based on the experimental data obtained in the pilot plant at different operating conditions, using the discrete kinetic lumping approach. The kinetic models of crude oil hydrotreating are assumed to include five lumps: gases (G), naphtha (N), heavy kerosene (H.K), light gas oil (L.G.O) and reduced crude residue (R.C.R). For all experiments, the sum of the squared errors (SSE) between the experimental product compositions and the predicted compositions is minimized using an optimization technique. The kinetic models developed are then used to describe and analyse the behaviour of an industrial trickle bed reactor (TBR) used for crude oil hydrotreating with an optimal quench system, based on the experiments, in order to evaluate the viability of large-scale crude oil hydrotreating. The optimal distribution of the catalyst bed (in terms of the optimal reactor length-to-diameter ratio) with the best quench position and quench rate is investigated, based upon the total annual cost.

Energy consumption is very important for reducing environmental impact and maximizing the profitability of operation. Since high temperatures are employed in hydrotreating (HDT) processes, hot effluents can be used to heat other cold process streams. It is noted that energy consumption and recovery issues may be ignored for pilot plant experiments but cannot be ignored for large-scale operations. Here, heat integration of the HDT process during hydrotreating of crude oil in the trickle bed reactor is addressed in order to recover most of the external energy. Experimental information obtained from the pilot scale, kinetic and reactor modelling tools, and commercial process data are employed in the heat integration process model. The optimization problem is formulated to optimize some of the design and operating parameters of the integrated process, with minimization of the overall annual cost as the objective function. An economic analysis of the whole continuous industrial refining process, involving the developed (integrated) hydrotreating unit together with the other complementary units (up to the units used to produce the middle distillate fractions), is also presented. In all cases considered in this study, the gPROMS (general PROcess Modelling System) package has been used for modelling, simulation and parameter estimation via optimization.
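The SSE-based kinetic parameter estimation described above can be indicated with a short sketch: a simplified nth-order power-law rate expression with Arrhenius temperature dependence is fitted to outlet sulfur concentrations by non-linear least squares. The rate form, the synthetic "measurements" and the SciPy fitting call are illustrative assumptions standing in for the full trickle bed reactor model, multi-reaction scheme and gPROMS-based estimation used in the thesis.

import numpy as np
from scipy.optimize import curve_fit

R = 8.314  # gas constant, J/(mol K)

def outlet_sulfur(X, lnA, Ea, n):
    """Integrated nth-order plug-flow expression for outlet sulfur (wt%)."""
    T, lhsv, c_in = X                      # temperature (K), LHSV (1/h), feed sulfur (wt%)
    k = np.exp(lnA - Ea / (R * T))         # Arrhenius rate constant
    return (c_in ** (1.0 - n) + (n - 1.0) * k / lhsv) ** (1.0 / (1.0 - n))

# Synthetic "experiments" generated from assumed parameters plus 2% noise.
T = np.array([608.0, 633.0, 658.0, 673.0, 633.0, 658.0])
lhsv = np.array([1.0, 1.0, 1.0, 1.0, 0.5, 1.5])
c_in = np.full_like(T, 2.0)
true_params = (np.log(4.0e6), 9.0e4, 1.65)           # lnA, Ea (J/mol), reaction order
noise = 1.0 + 0.02 * np.random.default_rng(1).normal(size=T.size)
c_out = outlet_sulfur((T, lhsv, c_in), *true_params) * noise

# Non-linear least squares: minimises the SSE between measured and predicted c_out.
params, _ = curve_fit(outlet_sulfur, (T, lhsv, c_in), c_out,
                      p0=(np.log(1.0e6), 8.0e4, 1.5), maxfev=20000)
lnA, Ea, n = params
print(f"A = {np.exp(lnA):.3g}, Ea = {Ea / 1000:.1f} kJ/mol, order n = {n:.2f}")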