Data management systems are increasingly used in industrial processes. However, data collected as part of industrial process operations, such as sensor or measurement instrument data, contain various sources of errors that can hamper process analysis and decision making. The authors propose an operating-regime-based data processing framework for industrial process decision making. The framework was designed to increase the quality of available process data and to take advantage of it to make informed offline strategic business operation decisions, i.e., environmental, cost and energy analysis, optimization, fault detection, debottlenecking, etc. The approach was synthesized from best practices derived from the available frameworks and improves upon its predecessors by putting forward the combination of process expertise and data-driven approaches. This systematic and structured approach includes the following stages: (1) scope of the analysis, (2) signal processing, (3) steady-state opera...
We propose a quantitative analysis of an enterprise-wide optimization for operations of crude-oil refineries considering the integration of planning and scheduling to close the decision-making gap between the procurement of raw materials or feedstocks and the operations of the production scheduling. From a month to an hour, re-planning and re-scheduling iterations can better predict the processed crude-oil basket, diet or final composition, reducing production costs and impacts on the process and product demands with respect to the quality of the raw materials. The goal is to interface planning and scheduling decisions within a time-window of a week with the support of re-optimization steps. The selection, delivery, storage and mixture of crude-oil feeds, from the tactical procurement planning up to the blend scheduling operations, are then made more appropriately. The top-down sequence of solutions is integrated in a feedback iteration, both to reduce the time-grids and to serve as a key performance indicator.
A novel approach to scheduling the startup of oil and gas wells in multiple fields over a decade-plus discrete-time horizon is presented. The major innovation of our formulation is to treat each well or well type as a batch process with time-varying yields or production rates that follow a declining, decaying or diminishing curve profile. Side or resource constraints such as process plant capacities, utilities and rigs to place the wells are included in the model. Current approaches to this long-term planning problem at a monthly time-step use manual decision-making with simulators, where many scenarios, samples or cases are required to facilitate the development of possible feasible solutions. Our solution to this problem uses mixed-integer linear programming (MILP), which automates the decision of which well to start up next to find optimized solutions. Plots of an illustrative example highlight the operation of the well startup system and the decaying production of wells.
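The decline-curve, batch-process view of a well described above can be sketched in a few lines. The exponential decline form, the rates and the two-well startup pattern below are hypothetical illustrations, not data from the paper:

```python
import numpy as np

def well_profile(q0, decline, horizon):
    """Monthly production of one well: exponential decline from initial rate q0."""
    t = np.arange(horizon)
    return q0 * np.exp(-decline * t)

def field_production(startups, horizon):
    """Sum shifted decline curves over the horizon. `startups` maps a start month
    to (q0, decline); each well contributes its time-varying yield from startup on,
    mirroring the batch-process treatment of wells in the MILP formulation."""
    total = np.zeros(horizon)
    for start, (q0, decline) in startups.items():
        total[start:] += well_profile(q0, decline, horizon - start)
    return total

# Two hypothetical wells started in months 0 and 6 of a 24-month horizon
prod = field_production({0: (100.0, 0.05), 6: (80.0, 0.08)}, horizon=24)
```

In the full model, the startup months would be binary decision variables, and profiles like `prod` would be checked against plant-capacity side constraints rather than just summed.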
Objectives: The PID controller is the most widely used basic regulatory control algorithm. PID control is important in chemical engineering processes as it forms the basis of advanced process control and optimization systems such as model predictive control (MPC) and real-time optimization (RTO). However, its performance can vary greatly depending on the tuning of its three parameters. Several different types of PID tuning rules or heuristics are reviewed in [1]. However, it can be demanding for regular operators, instrument technicians and process engineers, or for inexperienced process control engineers, to choose the most suitable rule and apply it properly in the actual operating physical system or environment.
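As a concrete reference point, a minimal discrete PID loop on a simulated first-order process looks like the following. The process model, the gains (`Kc`, `tauI`, `tauD`) and the sampling time are hypothetical choices for illustration; real tuning would follow one of the rules surveyed in [1]:

```python
def pid_step(error, state, Kc, tauI, tauD, dt):
    """One update of a discrete PID controller in standard (ISA) form.
    `state` carries the integral term and the previous error."""
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = Kc * (error + integral / tauI + tauD * derivative)
    return output, (integral, error)

# Closed loop with a hypothetical first-order process: dy/dt = (-y + u) / tau
tau, dt, setpoint = 5.0, 0.1, 1.0
y, u, state = 0.0, 0.0, (0.0, 0.0)
for _ in range(1000):
    u, state = pid_step(setpoint - y, state, Kc=2.0, tauI=5.0, tauD=0.5, dt=dt)
    y += dt * (-y + u) / tau  # explicit Euler step of the process
```

With integral action present, the loop settles at the setpoint with zero steady-state offset; the quality of the transient is what the choice of tuning rule governs.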
At the edge of the manufacturing of crude-oil distillates into refined final products, the gap between production scheduling and primary distribution can be reduced by optimizing production rundown switches of distillate dispositions in a mixed-integer linear programming (MILP) model, considering discrete time-steps of days, shifts or hours for a delivery horizon of weeks or months. From the process network down to the product distribution side, there are decisions on the assignments, allocations and amounts of distillates to be dispatched downstream. Other challenges involve logistics and quality aspects in further process-shops and blend-shops, considering diverse tank farms and various transport modes for the distribution. However, such integration of the refining process and tank storage systems can become intractable for industrial-sized problems with complex scheduling configurations considering time-varying rundown supply rates, product demands and pricing. For this, we propose to model the dispositions of distillates using unit-operations as modes of transportation from the distillation sources to the tanks of process-shops and blend-shops for downstream processing and blending before the primary distribution. Additionally, by solving with pooling (groups of tanks) first and then post-solving to depool by disaggregating the pooled solution, the determination of the distillate-disposition-to-tank assignments is facilitated, given that scaling to industrial-sized cases without tank aggregation is complicated, as highlighted in the examples. Better prediction of the operations scheduling by using small discrete time-steps within a planning horizon opens opportunities for exploring contract and spot market plays of the finished products.
Understanding the holistic relationship between refinery production scheduling (RPS) and the cyber-physical production environment with smart scheduling is a new question posed in the study of process systems engineering. Here, we discuss state-of-the-art RPS in the crude-oil refining field and present examples that illustrate how smart scheduling can impact operations in the high-performing chemical process industry. We conclude that, more than any traditional off-the-shelf RPS solution available today, flexible and integrative specialized modeling platforms will be increasingly necessary to perform decentralized and collaborative optimizations, since they are the technological alternatives closest to the advanced manufacturing philosophy.
Industrial & Engineering Chemistry Research, 2018
We develop a linear programming (LP) approach for nonlinear (NLP) blending of streams that approximates nonconvex quality constraints by considering property variables as constants, parameters or coefficients of qualities that we call factors. In a blend shop, these intensive properties of streams can be extended by multiplying them with the material flow carrying these amounts of qualities. Our proposition augments equality balance constraints that are essentially cuts of quality material flow for each property specification at a mixing point between feed sources and product sinks. In the LP factor formulation, the product blend quality is replaced by its property specification, and slack and/or surplus variables are included to close the balance; these are called factor flows and are well known in industry as product giveaways. Examples highlight the usefulness of factors in successive substitution by correcting nonlinear blending deltas in mixed-integer linear programming (MILP) models and in controlling product quality giveaw...
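The factor idea, replacing the product quality variable by its specification and closing the balance with a giveaway (factor) flow, can be illustrated with a tiny LP. The two feeds, sulfur qualities, costs, demand and specification below are hypothetical, and the formulation is a simplified sketch of the approach, not the paper's full model:

```python
from scipy.optimize import linprog

# Variables: [xA, xB, s] = flows of feeds A and B, plus the factor (giveaway) flow s.
# Hypothetical data: sulfur qualities 0.5 and 2.0 wt%, costs 50 and 30 $/unit,
# a 100-unit product demand and a 1.0 wt% maximum sulfur specification.
cost = [50.0, 30.0, 0.0]
A_eq = [
    [1.0, 1.0, 0.0],  # material balance: xA + xB = demand
    [0.5, 2.0, 1.0],  # quality cut: sulfur flow + giveaway s = spec * demand
]
b_eq = [100.0, 1.0 * 100.0]
res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 3, method="highs")
xA, xB, s = res.x
```

Note that the quality constraint stays linear because the spec is a constant; the optimizer here drives the giveaway flow `s` to zero, blending exactly to the specification with the cheapest feasible mix.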
We present an initiative for education in Process Systems Engineering (PSE) covering industrial applications in both prescriptive and predictive analytics. Prescriptive analytics, or decision automation, is the science of automating the decision-making of any physical system with respect to its design, planning, scheduling, control and operation, using any combination of optimization, heuristic, machine-learning and cyber-physical algorithms. Predictive analytics, or data analytics, is the science of examining raw data with the purpose of drawing conclusions on the behavior of systems, using data reconciliation and parameter estimation techniques within real-time optimization and control environments. Examples at beginner, intermediate and advanced levels guide the open users of this educational platform in PSE to evolve toward more complex problems for research, development and deployment of industrial applications in chemical engineering and related fields.
Scheduling operations in the crude-oil refining industry are commonly based on simulation of discrete production scenarios for the selection, sequencing or setup of tanks and unit-operations, considering a complex network of continuous processes within a time-horizon of a week. Although a series of works in academia considers continuous-time modeling for the optimization of this decision-making, practitioners in the production field rely on discrete time windows to coordinate their operational activities, until now conducted by human beings. However, there are still challenges to automating the solution of such a discrete-time problem in reasonable processing (CPU) time for time-steps within the shift of the operators (8 h) or even in smaller windows such as 1, 2 or 4 hours. In this direction, this paper introduces modeling, solving and heuristic strategies to handle complex industrial-sized refinery scheduling problems considering a discrete-time formulation. Examples highlight exclusions from several types of heuristic decompositions to reduce the optimization search space in constructive rolling horizon strategies. Additionally, relaxations of the mixed-integer linear programs construct the problem through an ad-hoc relax-and-fix iteration.
Advances in modeling and solving capabilities in crude-oil refinery scheduling have recently addressed its optimization more accurately by considering a wider scope, scale and complexity of the refining process network. In this work, we present examples of these enhancements, such as (a) a decomposition strategy to enable the optimization of large-scale problems; (b) a linearization procedure to include nonlinear quality information for the blending of streams in a mixed-integer linear programming (MILP) model; and (c) multiple distillation units in the form of cascaded towers. We highlight a case optimized in less than 6 minutes whereby the use of the linearization strategy for the nonlinear blending resulted in both an improvement of 2.82% in the objective function and a reduced gap between the decomposed solutions.
Detecting windows or intervals of when a continuous process is operating in a state of steadiness is useful, especially when steady-state models are being used to optimize the process or plant on-line or in real-time. The term steady-state implies that the process is operating around some stable point or within some stationary region, where it must be assumed that the accumulation or rate-of-change of material, energy and momentum is statistically insignificant or negligible. This new approach assumes the null hypothesis that the process is stationary about its mean, subject to independent and identically distributed random errors or shocks (white noise), with the alternative hypothesis that it is non-stationary with a detectable and deterministic slope, trend, bias or drift. The drift profile would be typical of a time-varying inventory or holdup of material with imbalanced flows, or even an unexpected leak, indicating that the process signal is not steady. A probability of being steady, or at least stationary, over the window is computed by performing a residual Student-t test using the estimated mean of the process signal without any drift and the estimated standard deviation of the underlying white-noise driving force. There are essentially two settings or options for the method, the window length and the Student-t critical value, which can be easily tuned for each process signal included in the multivariate detection strategy.
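One plausible reading of the stationary-versus-drift hypothesis test is a Student-t test on the slope fitted over the window. The statistic below is a simplified stand-in for the paper's residual-based test, and the signals (a flat level and a ramp, with a small deterministic alternating disturbance standing in for noise so the example is reproducible) are synthetic:

```python
import numpy as np
from scipy import stats

def is_steady(signal, alpha=0.05):
    """Window-based steadiness check: null hypothesis = the signal is stationary
    about its mean with white-noise shocks; alternative = a deterministic drift
    (nonzero slope). A t-test on the regression slope is used here as a simple
    proxy; the authors' exact residual-based statistic may differ in detail."""
    t_axis = np.arange(len(signal))
    result = stats.linregress(t_axis, signal)
    return result.pvalue > alpha  # True: no significant drift detected -> steady

noise = 0.1 * (-1.0) ** np.arange(120)                       # stand-in for white noise
steady = is_steady(10.0 + noise)                             # flat level -> steady
drifting = is_steady(10.0 + 0.05 * np.arange(120) + noise)   # ramp (e.g. a leak)
```

The two tunables mentioned in the abstract map directly onto the window length (here 120 samples) and the critical value implied by `alpha`.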
Industrial & Engineering Chemistry Research, 2004
Applications of nonlinear optimization problems with many degrees of freedom have become more common in the process industries, especially in the area of process operations. However, most widely used nonlinear programming (NLP) solvers are designed for the efficient solution of problems with few degrees of freedom. Here we consider a new NLP algorithm, IPOPT, designed for many degrees of freedom and many potentially active constraint sets. The IPOPT algorithm follows a primal-dual interior-point approach, and its robustness, improved convergence, and computational speed compared to those of other popular NLP algorithms are analyzed. To demonstrate its effectiveness on process applications, we consider large gasoline blending and data reconciliation problems, both of which contain nonlinear mass balance constraints and process properties. Results of this computational comparison show significant benefits from the IPOPT algorithm.
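The data reconciliation problem class mentioned above can be sketched in miniature. IPOPT itself is normally called through interfaces such as cyipopt or Pyomo; to keep the sketch dependency-light, SciPy's general-purpose constrained solver is used here on a toy three-flow balance with hypothetical measurements:

```python
import numpy as np
from scipy.optimize import minimize

# Toy data reconciliation: adjust flow measurements to satisfy the mass balance
# x1 + x2 = x3 while minimizing the squared adjustments. The measured values
# are hypothetical; an industrial instance would have thousands of variables,
# which is where an interior-point solver like IPOPT pays off.
measured = np.array([10.1, 5.2, 15.0])

objective = lambda x: np.sum((x - measured) ** 2)
balance = {"type": "eq", "fun": lambda x: x[0] + x[1] - x[2]}
res = minimize(objective, measured, constraints=[balance])
```

The least-squares adjustment spreads the 0.3-unit balance residual evenly across the three flows, reconciling them to 10.0, 5.1 and 15.1.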
Industrial & Engineering Chemistry Research, 2013
Nonlinear planning and scheduling models for crude-oil atmospheric and vacuum distillation units are essential to manage the increased complexities and narrow margins present in the petroleum industry. Traditionally, conventional swing-cut modeling is based on fixed yields with fixed properties for the hypothetical cuts that swing between adjacent light and heavy distillates, which can subsequently lead to inaccuracies in the predictions of both their quantity and quality. A new extension is proposed to better predict quantities and qualities for the distilled products by taking into consideration that we require corresponding light and heavy swing-cuts with appropriately varying qualities. By computing interpolated qualities relative to its light and
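The interpolated-quality idea can be sketched by averaging a linearly varying property over each swing portion, since the average of a linear function over an interval is its value at the interval midpoint. The property curve, temperatures and cut point below are hypothetical, and the paper's exact interpolation scheme may differ:

```python
def swing_cut_qualities(q, t_lo, t_hi, cutpoint):
    """Average qualities of the light and heavy swing-cut portions when the
    property q(T) varies linearly across the swing interval [t_lo, t_hi] and
    the cut point splits it at `cutpoint`. This is a hedged sketch of the
    interpolated-quality extension, in contrast to fixed-property swing-cuts."""
    q_light = q(0.5 * (t_lo + cutpoint))   # midpoint of [t_lo, cutpoint]
    q_heavy = q(0.5 * (cutpoint + t_hi))   # midpoint of [cutpoint, t_hi]
    return q_light, q_heavy

# Hypothetical sulfur content rising linearly from 0.2 to 0.6 wt%
# across a 250-300 degC swing interval, cut at 270 degC
sulfur = lambda T: 0.2 + (0.6 - 0.2) * (T - 250.0) / 50.0
q_light, q_heavy = swing_cut_qualities(sulfur, 250.0, 300.0, cutpoint=270.0)
```

A fixed-property swing-cut would assign both portions the same quality (here 0.4 wt%, the interval midpoint), whereas the interpolated version correctly makes the light portion cleaner and the heavy portion dirtier.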
This paper presents an optimization-based approach for determining how plant feedstocks should be allocated to storage when there are fewer storage vessels than feedstocks. It is assumed here that material from the storage vessels will be subsequently blended for processing in downstream processes. The objective of the feedstock allocation strategy is chosen to ensure maximum flexibility for downstream process operation. Given the stated objective for feedstock allocation and the physical constraints, the feedstock storage allocation problem is posed in optimization form. The solution of the resulting singular value optimization problem is discussed in terms of semidefinite programming techniques. The ideas presented in the paper are illustrated using a crude oil storage case study. The paper concludes with a number of observations regarding useful extensions to the proposed methods.
Standard benchmarks are important repositories to establish comparisons between competing model and control methods, especially when a new method is proposed. This paper presents details of an Arduino micro-controller temperature control lab as a benchmark for modeling and control methods. As opposed to simulation studies, a physical benchmark considers real process characteristics such as the requirement to meet a cycle time, discrete sampling intervals, communication overhead with the process, and model mismatch. An example case study of the benchmark quantifies an optimization approach for a PID controller, with 5.4% improved performance. A multivariate example shows the quantified performance improvement obtained by using model predictive control with a physics-based model, an autoregressive time series model, and a Hammerstein model with an artificial neural network to capture the static nonlinearity. These results demonstrate the potential of a hardware benchmark for transient modeling and regulatory or advanced control methods.
... Further and unexpected savings on corrosion control chemicals were also observed for the refinery ... similar to the supply chain logistics problem except that our logistics problem has less ... It considers only the crude oil blendshop (inside the production chain) and not the entire ...
This article describes an effective and simple primal heuristic to greedily encourage a reduction in the number of binary or 0-1 logic variables before an implicit enumerative-type search heuristic is deployed to find integer-feasible solutions to 'hard' production scheduling problems. The basis of the technique is to apply well-known smoothing functions, used to solve complementarity problems, to the local optimization problem of minimizing the weighted sum, over all binary variables, of each variable multiplied by its complement. The basic algorithm of the 'smooth-and-dive accelerator' (SDA) is to solve successive linear programming (LP) relaxations with the smoothing functions added to the existing problem's objective function and to use, if required, a sequence of binary variable fixings known as 'diving'. If the smoothing function term is not driven to zero as part of the recursion, then a branch-and-bound or branch-and-cut search heuristic is called to close the procedure, finding at least integer-feasible primal solutions. The heuristic's effectiveness is illustrated by its application to an oil refinery's crude-oil blendshop scheduling problem, which has commonality with many other production scheduling problems in the continuous and semicontinuous (CSC) process domains.
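A minimal sketch of the smoothing step, assuming a hypothetical 0-1 knapsack and a fixed penalty weight, linearizes the x(1-x) term at the incumbent and re-solves the LP relaxation; the diving step and the branch-and-bound fallback of the full SDA are omitted:

```python
import numpy as np
from scipy.optimize import linprog

def smooth_and_dive(c, A_ub, b_ub, weight=15.0, max_iter=20, tol=1e-6):
    """Simplified sketch of the smooth-and-dive accelerator: solve successive LP
    relaxations with the smoothing term w * x * (1 - x) linearized at the incumbent
    (gradient w * (1 - 2x)) added to the minimization objective, nudging the
    binaries toward 0 or 1. Diving (fixing near-integral variables) and the
    branch-and-bound fallback of the full method are omitted here."""
    n = len(c)
    x = np.full(n, 0.5)
    for _ in range(max_iter):
        obj = np.asarray(c) + weight * (1.0 - 2.0 * x)  # linearized penalty
        res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n, method="highs")
        x = res.x
        if np.all(np.minimum(x, 1.0 - x) < tol):  # all binaries (near-)integral
            return np.round(x)
    return None  # smoothing failed to drive integrality; call a B&B search

# Hypothetical 0-1 knapsack: maximize 10*x1 + 6*x2 + 4*x3 s.t. 5*x1 + 4*x2 + 3*x3 <= 10
x = smooth_and_dive(c=[-10.0, -6.0, -4.0], A_ub=[[5.0, 4.0, 3.0]], b_ub=[10.0])
```

On this toy instance, the first LP relaxation leaves one variable fractional, and a single penalized re-solve drives all three binaries integral, which is exactly the behavior the smoothing term is designed to encourage.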
Data management systems are increasingly used in industrial processes. However, data collected as... more Data management systems are increasingly used in industrial processes. However, data collected as part of industrial process operations, such as sensor or measurement instruments data, contain various sources of errors that can hamper process analysis and decision making. The authors propose an operating-regime-based data processing framework for industrial process decision making. The framework was designed to increase the quality and take advantage of available process data use to make informed offline strategic business operation decisions, i.e., environmental, cost and energy analysis, optimization, fault detection, debottlenecking, etc. The approach was synthesized from best practices derived from the available framework and improved upon its predecessor by putting forward the combination of process expertise and data-driven approaches. This systematic and structured approach includes the following stages: (1) scope of the analysis, (2) signal processing, (3) steady-state opera...
We propose a quantitative analysis of an enterprise-wide optimization for operations of crude-oil... more We propose a quantitative analysis of an enterprise-wide optimization for operations of crude-oil refineries considering the integration of planning and scheduling to close the decision-making gap between the procurement of raw materials or feedstocks and the operations of the production scheduling. From a month to an hour, re-planning and rescheduling iterations can better predict the processed crude-oil basket, diet or final composition, reducing the production costs and impacts in the process and product demands with respect to the quality of the raw materials. The goal is to interface planning and scheduling decisions within a time-window of a week with the support of reoptimization steps. Then, the selection, delivery, storage and mixture of crude-oil feeds from the tactical procurement planning up to the blend scheduling operations are made more appropriately. The up-to-down sequence of solutions are integrated in a feedback iteration to both reduce time-grids and as a key performance indicator.
A novel approach to scheduling the startup of oil and gas wells in multiple fields over a decade-... more A novel approach to scheduling the startup of oil and gas wells in multiple fields over a decade-plus discrete-time horizon is presented. The major innovation of our formulation is to treat each well or well type as a batch-process with time-varying yields or production rates that follow the declining, decaying or diminishing curve profile. Side or resource constraints such as process plant capacities, utilities and rigs to place the wells are included in the model. Current approaches to this long-term planning problem in a monthly time-step use manual decision-making with simulators where many scenarios, samples or cases are required to facilitate the development of possible feasible solutions. Our solution to this problem uses mixed-integer linear programming (MILP) which automates the decision-making of deciding on which well to startup next to find optimized solutions. Plots of an illustrative example highlight the operation of the well startup system and the decaying production of wells.
Text: Objectives The PID controller is the most widely used basic regulatory control algorithm. P... more Text: Objectives The PID controller is the most widely used basic regulatory control algorithm. PID control is important in chemical engineering processes as it plays a critical role to form the basis of advanced process control and optimization systems such as model predictive control (MPC) and real-time optimization (RTO). However, its performance can vary greatly on the tuning of its three (3) parameters. There are several different types of the PID tuning rules or heuristics reviewed in [1]. However, it can be demanding for the regular operators, instrument technicians and process engineers or inexperienced process control engineers to choose the most suitable rule and use the rule properly in the actual operating physical system or environment.
At the edge of the manufacturing of crude-oil distillates into refined final products, the produc... more At the edge of the manufacturing of crude-oil distillates into refined final products, the production scheduling and primary distribution gap can be reduced by optimizing production rundown switches of dispositions of distillates in a mixed-integer linear model (MILP) considering discrete time-steps of days, shifts or hours for a delivery horizon of weeks or months. From the process network down to the product distribution side, there are definitions on the assignments, allocations and amounts of distillates to be dispatched downstream. Other challenges involve logistics and quality aspects in further processshops and blend-shops considering diverse tank farms and various transport modes for the distribution. However, such integration of the refining process and tank storage systems can become intractable for industrial-sized problems with complex scheduling configurations considering time-varying rundown supply rates, product demands and pricing. For this, we propose to model the dispositions of distillates using unit-operations as modes of transportation from the distillation sources to the tanks of process-shops and blend-shops for downstream processing and blending before the primary distribution. Additionally, by solving with pooling (groups of tanks) first then post-solve to depool by disaggregating the pooled solution, the determination of the distillate dispositions to tank assignments is facilitated given that scaling to industrial-sized cases without tank aggregation is complicated as highlighted in the examples. Better prediction of the operations scheduling by using small discrete time-steps within a planning horizon allows opportunities for exploring the contract and spot market plays of the finished products.
Understanding the holistic relationship between refinery production scheduling (RPS) and the cybe... more Understanding the holistic relationship between refinery production scheduling (RPS) and the cyber-physical production environment with smart scheduling is a new question posed in the study of process systems engineering. Here, we discuss state-of-the-art RSPs in the crude-oil refining field and present examples that illustrate how smart scheduling can impact operations in the high-performing chemical process industry. We conclude that, more than any traditional off-the-shelf RPS solution available today, flexible and integrative specialized modeling platforms will be increasingly necessary to perform decentralized and collaborative optimizations, since they are the technological alternatives closer to the advanced manufacturing philosophy.
Industrial & Engineering Chemistry Research, 2018
We develop a linear programming (LP) approach for nonlinear (NLP) blending of streams to approxim... more We develop a linear programming (LP) approach for nonlinear (NLP) blending of streams to approximate nonconvex quality constraints by considering property variables as constants, parameters, or coefficients of qualities that we call factors. In a blend shop, these intensive properties of streams can be extended by multiplying the material flow carrying out these amounts of qualities. Our proposition augments equality balance constraints as essentially cuts of quality material flow for each property specification in a mixing point between feed sources and product sinks. In the LP factor formulation, the product blend quality is replaced by its property specification and variables of slacks and/or surpluses are included to close the balance; these are called factor flows and are well known in industry as product giveaways. Examples highlight the usefulness of factors in successive substitution by correcting nonlinear blending deltas in mixed-integer linear models (MILP) and to control product quality giveaw...
We present an initiative for education in Process System Engineering (PSE) covering industrial ap... more We present an initiative for education in Process System Engineering (PSE) covering industrial applications in both prescriptive and predictive analytics. Prescriptive analytics or decision-automation is the science of automating the decisionmaking of any physical system with respect to its design, planning, scheduling, control and operation using any combination of optimization, heuristic, machine-learning and cyber-physical algorithms. Predictive analytics or data-analytics is the science of examining raw data with the purpose of drawing conclusions on the behavior of the systems using data reconciliation and parameter estimation techniques within real-time optimization and control environments. Examples for beginner, intermediate and advanced levels guide the open-users of this educational platform in PSE to evolve toward more complex problems for research, development and deployment of industrial applications in the chemical engineering and multi-related fields.
The scheduling operations in crude-oil refinery industries are commonly based on simulation of di... more The scheduling operations in crude-oil refinery industries are commonly based on simulation of discrete production scenarios for selection, sequence or setups of tanks and unit-operations considering a complex network of continuous-processes within a time-horizon of a week. Although series of works in academia consider continuous-time modeling for optimization of this decision-making, practitioners in the production field rely on discrete time windows to coordinate their operational activities, until now conducted by human beings. However, there are still challenges to automate the solution of such discrete-time problem in reasonable processing time (CPU) for time-steps within the shift of the operators (8h) or even in smaller windows such as 1, 2 or 4 hours. In this direction, this paper introduces modeling, solving and heuristic strategies to handle the complex industrial-sized refinery scheduling problems considering discrete-time formulation. Examples highlights exclusions from several types of heuristic decompositions to reduce the optimization search space in constructive rolling horizon strategies. Additionally, relaxations on mixed-integer linear programs construct the problem by an ad-hoc relax-and-fix iteration.
Advances in modeling and solving capabilities in the crude-oil refinery scheduling has recently a... more Advances in modeling and solving capabilities in the crude-oil refinery scheduling has recently addressed its optimization more accurately by considering wider scope, scale and complexities of the refining process network. In this work, we present examples of these enhancements such as a) a decomposition strategy to enable the optimization of large scale problems; b) a linearization procedure to include nonlinear quality information for blending of streams in an mixed-integer linear programming (MILP) model; and c) multiple distillation units in the form of cascaded towers. We highlight a case optimized in less than 6 minutes whereby the use of the linearization strategy for the nonlinear blending resulted in both an improvement of 2.82% in the objective function and in a reduced gap of the decomposed solutions.
Detecting windows or intervals of when a continuous process is operating in a state of steadiness... more Detecting windows or intervals of when a continuous process is operating in a state of steadiness is useful especially when steady-state models are being used to optimize the process or plant on-line or in real-time. The term steady-state implies that the process is operating around some stable point or within some stationary region where it must be assumed that the accumulation or rate-of-change of material, energy and momentum is statistically insignificant or negligible. This new approach is to assume the null-hypothesis that the process is stationary about its mean subject to independent and identically distributed random error or shocks (white-noise) with the alternativehypothesis that it is non-stationary with a detectable and deterministic slope, trend, bias or drift. The drift profile would be typical of a time-varying inventory or holdup of material with imbalanced flows or even an unexpected leak indicating that the process signal is not steady. A probability of being steady or at least stationary over the window is computed by performing a residual Student-t test using the estimated mean of the process signal without any drift and the estimated standard-deviation of the underlying white-noise driving force. There are essentially two settings or options for the method which are the window-length and the Student-t critical value and can be easily tuned for each process signal that are included in the multivariate detection strategy.
Industrial & Engineering Chemistry Research, 2004
Applications of nonlinear optimization problems with many degrees of freedom have become more common in the process industries, especially in the area of process operations. However, most widely used nonlinear programming (NLP) solvers are designed for the efficient solution of problems with few degrees of freedom. Here we consider a new NLP algorithm, IPOPT, designed for many degrees of freedom and many potentially active constraint sets. The IPOPT algorithm follows a primal-dual interior-point approach, and its robustness, improved convergence, and computational speed compared to those of other popular NLP algorithms are analyzed. To demonstrate its effectiveness on process applications, we consider large gasoline blending and data reconciliation problems, both of which contain nonlinear mass balance constraints and process properties. Results of this computational comparison show significant benefits from the IPOPT algorithm.
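The interior-point idea behind IPOPT can be illustrated on a toy problem (this is the barrier concept only, not IPOPT's actual primal-dual algorithm): the inequality constraint is replaced by a log-barrier term whose weight mu is driven toward zero, so the barrier minimizers trace a central path converging to the constrained optimum. For min x^2 subject to x >= 1, the barrier subproblem has a closed-form minimizer.

```python
# Hedged illustration of the log-barrier idea behind interior-point NLP
# methods (not IPOPT's actual primal-dual iteration): minimize
# x**2 - mu*log(x - 1) for decreasing mu and watch the minimizers
# approach the constrained optimum x* = 1.
import math

def barrier_minimizer(mu):
    """Minimizer of x**2 - mu*log(x - 1), from the first-order condition
    2*x*(x - 1) = mu (closed form for this one-variable case)."""
    return (1.0 + math.sqrt(1.0 + 2.0 * mu)) / 2.0

# Central path: barrier minimizers for a decreasing sequence of mu.
path = [barrier_minimizer(mu) for mu in (1.0, 0.1, 0.01, 0.001)]
```

As mu shrinks, the iterates stay strictly interior (x > 1) while converging to the active constraint, which is the essential mechanism an interior-point solver exploits at scale.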
Industrial & Engineering Chemistry Research, 2013
Nonlinear planning and scheduling models for crude-oil atmospheric and vacuum distillation units are essential to manage the increased complexity and narrow margins present in the petroleum industry. Traditionally, conventional swing-cut modeling is based on fixed yields with fixed properties for the hypothetical cuts that swing between adjacent light and heavy distillates, which can subsequently lead to inaccuracies in the predictions of both their quantity and quality. A new extension is proposed to better predict quantities and qualities for the distilled products by taking into consideration that we require corresponding light and heavy swing-cuts with appropriately varying qualities. By computing interpolated qualities relative to its light and
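One simple version of the interpolated-quality idea can be sketched as follows: assume the quality varies linearly across the swing-cut, so the portion routed to the light distillate averages a lighter quality than the portion routed to the heavy distillate. This is an illustrative scheme under that linear-variation assumption, not the paper's exact formulation.

```python
# Hedged sketch: qualities for the light and heavy portions of a swing-cut,
# assuming quality varies linearly from q_light_end (top of the swing-cut)
# to q_heavy_end (bottom). Illustrative only.

def swing_cut_qualities(q_light_end, q_heavy_end, frac_to_light):
    """Average qualities of the swing-cut portions sent to the light and
    heavy distillates; frac_to_light is the volume fraction routed light."""
    span = q_heavy_end - q_light_end
    q_to_light = q_light_end + span * frac_to_light / 2.0
    q_to_heavy = q_light_end + span * (1.0 + frac_to_light) / 2.0
    return q_to_light, q_to_heavy

# Example: specific gravity varying from 0.80 to 0.90 across the swing-cut,
# with half the swing volume sent to the lighter distillate.
q_l, q_h = swing_cut_qualities(0.80, 0.90, 0.5)
```

In contrast, a conventional fixed-property swing-cut would assign the same quality to both portions regardless of how the cut is split.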
This paper presents an optimization-based approach for determining how plant feedstocks should be allocated to storage when there are fewer storage vessels than feedstocks. It is assumed here that material from the storage vessels will subsequently be blended for processing in downstream processes. The objective of the feedstock allocation strategy is chosen to ensure maximum flexibility for downstream process operation. Given the stated objective for feedstock allocation and the physical constraints, the feedstock storage allocation problem is posed in optimization form. The solution of the resulting singular value optimization problem is discussed in terms of semidefinite programming techniques. The ideas presented in the paper are illustrated using a crude oil storage case study. The paper concludes with a number of observations regarding useful extensions to the proposed methods.
Standard benchmarks are important repositories for establishing comparisons between competing modeling and control methods, especially when a new method is proposed. This paper presents details of an Arduino microcontroller temperature control lab as a benchmark for modeling and control methods. As opposed to simulation studies, a physical benchmark considers real process characteristics such as the requirement to meet a cycle time, discrete sampling intervals, communication overhead with the process, and model mismatch. An example case study of the benchmark quantifies an optimization approach for a PID controller with 5.4% improved performance. A multivariate example shows the quantified performance improvement of using model predictive control with a physics-based model, an autoregressive time-series model, and a Hammerstein model with an artificial neural network to capture the static nonlinearity. These results demonstrate the potential of a hardware benchmark for transient modeling and regulatory or advanced control methods.
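The kind of regulatory-control experiment the lab supports can be sketched as a discrete-time PID loop driving a first-order temperature model toward a setpoint. The gains and process parameters below are illustrative assumptions, not the lab's identified values, and the plant is simulated rather than the real Arduino hardware.

```python
# Hedged sketch: discrete PID control of an assumed first-order temperature
# process. Gains and process parameters are illustrative, not the TCLab's.

def simulate_pid(kp, ki, kd, setpoint=50.0, dt=1.0, steps=400):
    temp = 20.0                  # initial/ambient temperature, degC
    gain, tau = 0.8, 20.0        # assumed first-order gain and time constant
    integral, prev_err = 0.0, setpoint - temp
    for _ in range(steps):
        err = setpoint - temp
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        if 0.0 < u < 100.0:
            integral += err * dt          # conditional anti-windup
        u = max(0.0, min(100.0, u))       # heater limited to 0-100%
        # First-order process update at the discrete sampling interval.
        temp += dt * (gain * u - (temp - 20.0)) / tau
        prev_err = err
    return temp

pid_final = simulate_pid(kp=2.0, ki=0.1, kd=0.5)   # PID with integral action
p_final = simulate_pid(kp=2.0, ki=0.0, kd=0.0)     # proportional-only
```

The comparison shows why integral action matters on this class of process: the proportional-only loop settles with a steady-state offset below the setpoint, while the PID loop removes it.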
... Further and unexpected savings on corrosion control chemicals were also observed for the refinery ... similar to the supply chain logistics problem except that our logistics problem has less ... It considers only the crude oil blendshop (inside the production chain) and not the entire ...
This article describes an effective and simple primal heuristic to greedily encourage a reduction in the number of binary or 0-1 logic variables before an implicit enumerative-type search heuristic is deployed to find integer-feasible solutions to 'hard' production scheduling problems. The basis of the technique is to employ the well-known smoothing functions used to solve complementarity problems in a local optimization problem of minimizing the weighted sum, over all binary variables, of each variable multiplied by its complement. The basic algorithm of the 'smooth-and-dive accelerator' (SDA) is to solve successive linear programming (LP) relaxations with the smoothing functions added to the existing problem's objective function and to use, if required, a sequence of binary-variable fixings known as 'diving'. If the smoothing-function term is not driven to zero as part of the recursion, then a branch-and-bound or branch-and-cut search heuristic is called to close the procedure, finding at least integer-feasible primal solutions. The heuristic's effectiveness is illustrated by its application to an oil refinery's crude-oil blendshop scheduling problem, which has commonality with many other production scheduling problems in the continuous and semicontinuous (CSC) process domains.
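Two building blocks of the smooth-and-dive recursion can be sketched directly: the smoothing term summed over the binaries (each variable times its complement, zero exactly when the relaxation is 0-1 integral), and a single diving step that fixes the fractional variable closest to a bound. The successive LP re-solves between these steps are omitted here, so this is a component sketch rather than the full SDA algorithm.

```python
# Hedged sketch of two SDA building blocks: the smoothing term added to the
# LP objective and a single diving step. The LP re-solves are omitted.

def smoothing_term(x):
    """Sum over binaries of x*(1-x): zero exactly when x is 0-1 integral."""
    return sum(v * (1.0 - v) for v in x)

def dive_fix(x, tol=1e-6):
    """Return a copy of x with the fractional variable closest to a bound
    fixed at that bound (a single 'dive')."""
    frac = [(min(v, 1.0 - v), i) for i, v in enumerate(x)
            if tol < v < 1.0 - tol]
    if not frac:
        return list(x)             # already integer-feasible
    _, i = min(frac)
    fixed = list(x)
    fixed[i] = 0.0 if x[i] < 0.5 else 1.0
    return fixed

relaxed = [0.0, 1.0, 0.9, 0.4]     # a fractional LP-relaxation solution
penalty = smoothing_term(relaxed)  # 0.9*0.1 + 0.4*0.6
dived = dive_fix(relaxed)          # fixes the 0.9 entry to 1.0
```

In the full heuristic, the penalty is appended to the LP objective and the relaxation is re-solved after each dive; if the penalty is not driven to zero, the remaining fractional variables are handed to branch-and-bound.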
Papers by Jeffrey Kelly