
Available online at www.sciencedirect.com

ScienceDirect

Procedia CIRP 118 (2023) 342–347

16th CIRP Conference on Intelligent Computation in Manufacturing Engineering, CIRP ICME ‘22, Italy

A Machine Learning Approach for Revenue Management in Cloud Manufacturing

Vincent Adomat a,*, Jonas Ehrhardt b, Christian Kober a, Maryam Ahanpanjeh a, Jens P. Wulfsberg a

a Institute of Production Engineering, Helmut Schmidt University, Hamburg, Germany
b Institute of Computer Science in Mechanical Engineering, Helmut Schmidt University, Hamburg, Germany

*Corresponding author. Tel.: N/A; E-mail address: [email protected]

Abstract

Cloud Manufacturing matches supplier capacities and customer demands via an algorithmic agent. The agent aims to maximize its own and the
platform participants' revenue, and hence the platform's gross revenue. Volatilities in supply and demand, as well as individual pricing models and
willingness to pay, exacerbate the problem for deterministic solvers. In Revenue Management, comparable problems are approached with linear
programming and regression models. Within this paper, we introduce a new Machine Learning (ML) based approach: Machine Learning for
Cloud Manufacturing Cost Prediction (ML-ConCord). ML-ConCord aims to improve prediction quality for Revenue Management in Cloud
Manufacturing by combining a recurrent neural network (RNN) with a linear programming pricing algorithm. We evaluate our approach on a
self-generated dataset and compare its performance with standard regression-based approaches. Models and datasets are published on GitHub.
© 2023 The Authors. Published by Elsevier B.V.
This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0)
Peer-review under responsibility of the scientific committee of the 16th CIRP Conference on Intelligent Computation in Manufacturing Engineering
Keywords: Cloud manufacturing; Machine learning; Recurrent neural network; Prediction; Forecasting; Optimization; Revenue management; Industry 4.0

Within Cyber-Physical Systems (CPS), cloud-based applications play an important role [22, 23, 4]. Increasing competition forces organizations to improve their efficiency by applying Industry 4.0 digitalization concepts, like Cloud Manufacturing (CMfg), to their business models [25, 15]. CMfg allows for collaboration across enterprises via a single platform [33, 4]. It enables large-scale sharing of manufacturing resources and capabilities across the entire lifecycle without large upfront investments for the CMfg participants [33]. Participants on the CMfg platform are cloud operators, resource providers, and users [33, 17]. Users pay the operator for using the shared services offered on the platform [33]. The concept aims to maximize the profits of all participants within the CMfg platform [33]. Participants benefit from network effects with a growing number of providers and consumers. Moreover, two-sided markets, where organizations can simultaneously act as resource providers and consumers, can form.

The overall objective of Revenue Management is to increase revenue and consequently profits. The same services are sold at different prices at different points in time and to different consumers. Market segmentation is conducted to profit from consumers' individual willingness to pay (WTP). The concept originates in Revenue Management, e.g., in airlines, where similar seats are priced differently, depending on the time of booking and other factors [16].

Determining the optimal pricing and capacity allocation for CMfg platform participants is difficult as a consequence of fluctuating demand and supply [10]. Since the dynamic behavior of CMfg requires the consideration of time-varying attributes, forecasting becomes an important issue. So far, approaches from Revenue Management, like linear programming, have been employed to approximate optimal pricing in situ. However, those methods focus on a given point in time, neglecting dynamic effects like seasonal dependencies. As there are established methods in Machine Learning (ML) for forecasting the time-dependent behavior of systems, we propose an ML approach for predicting operation and logistic costs within a CMfg platform context. To include time-dependencies in the prediction of

2212-8271 © 2023 The Authors. Published by Elsevier B.V. doi:10.1016/j.procir.2023.06.059

Cloud Manufacturing contexts, we formulate the following research questions (RQ):

RQ1: Can historical data of Cloud Manufacturing platforms be used to enhance the prediction quality of volatile operation and logistic costs?

To answer this question, we formulate a simulation model of a CMfg platform. Based on the simulation, we generate datasets to train and evaluate different models for predicting operation and logistic costs.

RQ2: Can a recurrent Machine Learning model exceed regression methods in prediction quality for Revenue Management in CMfg, when using historical data?

To answer this research question, we create an ML model that is trained on historical data from the simulation model of RQ1. We evaluated our approach's performance against the prediction of a standard linear programming model and the ground truth of the generated datasets.

As a result, we present our Machine Learning approach for Cloud Manufacturing Cost Prediction¹ (ML-ConCord). ML-ConCord consists of a recurrent ML model, which is trained on historical data from a CMfg platform. The ML model's predictions are passed to a linear programming optimizer to calculate the optimal cost settings for future periods, including time dependencies of future events.

The remaining paper is structured as follows: In Section 1, we summarize related work. Section 2 introduces the optimization model. In Section 3, we introduce our ML approach, ML-ConCord. Section 4 presents the results, followed by the discussion in Section 5 and the conclusion in Section 6.

1. Related work

Within this section, we introduce related work regarding the domains of optimization in CMfg with exact algorithms, metaheuristics, and artificial neural networks.

The concept of CMfg is connected to various problems from the optimization domain. To distribute tasks and revenue equally across CMfg networks, supplier capacities as well as customer demands constantly need to be matched optimally. At the same time, the criteria for optimality can be manifold. While some authors only focus on production capacities, others also consider logistics time, cost [29, 6] and quality of service (QoS), creating a multi-objective optimization problem [5]. [28] aimed at minimizing the total cost and time while maximizing the total pass rate, anti-risk ability, and satisfaction degree.

Optimization algorithms can be classified into (i) exact algorithms and (ii) meta-heuristic algorithms. While exact algorithms comprise methods from linear programming (LP) and mixed integer linear programming (MILP) [2, 1, 6], meta-heuristics include genetic algorithms (GA), ant colony optimization (ACO), particle swarm optimization (PSO) and bee colony optimization (BCO) [29, 18].

Although meta-heuristic algorithms are applied most often in the literature, they do not offer an appropriate solution for dynamic CMfg environments [18]. Some authors consider this unsupported aspect, take the dynamic behavior of CMfg into account in their scheduling models, and develop pricing and resourcing mechanisms. [31] have proposed an ACO algorithm for time-varying workflow scheduling problems to minimize the total cost in a particular period. [32] proposed an optimization method based on the analytic hierarchy process (AHP) and the dynamic programming of business decisions computing model. [19] proposed a four-layer framework for scheduling based on deep reinforcement learning (DRL), which includes a deep neural network to cope with time-varying QoS attributes. [20] developed an ACO-based Support Vector Regression ensemble to forecast prices in CMfg. [24] presented a cloud service selection framework, which takes the QoS history of cloud services over different time periods and ranks all cloud services in each time period.

While [27] present an extensive MILP formulation for distributing tasks in a generic CMfg network, their model is at the same time considered too complex for an application in the context of CMfg forecasting, due to its large set of parameters. [6] take a different approach and reduce the number of parameters. Though the resulting model lacks some of the CMfg context's real-world restrictions, such as tardiness cost or overtime hours, the model formulation presented is perceived to be a valid foundation for generating data and comparing forecasting results of Recurrent Neural Networks (RNN) to statistical methods. Applying ML to the various aspects of CMfg has become popular in the community over recent years [9]. However, many authors aim at substituting distribution optimization with deep or reinforcement learning techniques, such as [7]. Only [21] use an RNN to predict future scenarios and derive schedules based on those forecasts.

2. Optimization Model

Within this section, we introduce the optimization and simulation model we used to simulate a CMfg network.

As there are only few large-scale CMfg networks implemented in reality so far [3], training ML algorithms on real-world data is currently still infeasible. Therefore, we built a data generation model based on existing optimization model formulations. We employ a MILP formulation, as it ensures optimality when feasible. This enables us to evaluate other approaches based on the objective function values of the MILP model.

As a baseline, we adopt the model from [6] as a minimal approach to the optimization of a CMfg network. We set the focus on manufacturing and logistic activities only. We assume that a CMfg network consists of a multitude of locations m ∈ M, where one manufacturer is located at each location. The locations are connected by a defined number of logistic services l ∈ L. Products k ∈ K are composed of a limited set of operations i ∈ I_k. Furthermore, we assume that only a subset of operations can be accomplished at each location, introducing a need for logistic activities to finalize a product. As a simplification, we neglect capacity restrictions per location and define that all products must be

¹ GitHub repository will be added once accepted.

manufactured. Neglecting the capacity notably reduces model complexity and removes the requirement of pricing products as a means of weighting each product's contribution to the objective function value.

By building on existing modeling approaches, we ensure model consistency, while removing the necessity to develop our own abstractions of CMfg networks. Different from [6], we handle the attribution of manufacturing competencies per location by introducing an additional constraint and an additional binary parameter C_im ∈ {0; 1}, since infinite costs are difficult to implement in software. C_im becomes 1 for every location where a certain operation can be accomplished, and 0 otherwise. The additional constraint (1) requires every decision variable γ_ikm to be smaller than or equal to the respective C_im, to ensure the feasibility of every operation being assigned to a location:

γ_ikm ≤ C_im   ∀ i ∈ I_k, k ∈ K, m ∈ M   (1)

The simulation model has to incorporate fluctuations and trends into the various sets and parameters to create a relevant abstraction of reality. Besides our coherence with [6], we grounded our data generation model on four additional assumptions about how parameters develop over time. These dynamic parameters include (i) costs for performing an operation, (ii) logistic costs, (iii) the time of performing an operation and (iv) the productivity of an enterprise. We define the dynamic parameters as follows:

(i) Costs for performing an operation: The price to perform an operation at a specific location shows a trend to increase over time. We link the exact change to Italian inflation rates [12].

(ii) Logistic costs: Unlike the general price trend, logistic costs per kilometer tend to decrease over time. This is justified by the increasing operational efficiency of the logistic providers caused by competition.

(iii) Time of performing an operation: At certain periods with respective utilization peaks, the processing time increases. During the course of the year, however, it also decreases again. Therefore, the time for performing an operation is dependent on seasonal fluctuations.

(iv) Productivity of an enterprise at a certain location: The more often an individual operation is performed over time, the higher the productivity regarding this operation becomes in the CMfg network.

We generated datasets with three complexity scenarios: small, medium, and large (cf. Tab. 1). We understand a scenario as one single object of study with a defined size resulting from a constant number of parameters across all periods of sampled data. Every scenario is represented by one dataset. The scenarios vary in the number of logistic services, products, and manufacturing operators, respectively locations. The number of logistic services j per scenario is given by the magnitude of |J_period|. The numbers of products k and locations m are given by |K_period| and |M_period|.

Table 1. Different complexity scenario setups of our dataset generation, with |J_period| describing the number of logistic services per period, |K_period| the number of products per period, and |M_period| the number of locations per period.

Scenario      small   medium   large
|J_period|    2       4        6
|K_period|    10      50       100
|M_period|    5       10       15

We set a focus on two parameters:

• Logistic costs (LC): price per transport operation between locations for all logistic services, LC_{l,(m,m')}^period ∈ ℚ ∀ period ∈ Periods

• Operating costs (OC): price per manufacturing operation and location, OC_{im}^period ∈ ℚ ∀ period ∈ Periods

The resulting datasets are a series of vectors (2). Each vector represents a single period. We define operating costs OC_im as independent of products k ∈ K, favoring the reduction of vector size, especially in large scenarios.

data_period = ( … , LC_{l,(m,m')}^period , … , OC_{im}^period , … )ᵀ   ∀ period ∈ Periods   (2)

Prices for logistic services per route LC_{l,(m,m')}^period are calculated by multiplying the price per distance unit lc_l by the distance between two locations d_{(m,m')} (3). lc_l thereby results from a base value per logistic service base_lc_provider_l, multiplied by a period-dependent seasonal factor base_lc_seasonal_period and a period-dependent competition factor base_lc_competition_{l,period}.

LC_{l,(m,m')}^period = lc_l · d_{(m,m')}
  = base_lc_provider_l · base_lc_seasonal_period · base_lc_competition_{l,period} · d_{(m,m')}   (3)

Prices for operations per location OC_{im}^period are calculated by multiplying the machine hour rate per operation and location oc_im by the required manufacturing time per operation and location t_im (4). The product is then divided by the productivity factor, which depends on the sum of each operation's executions over all past periods, p_i^period. The machine hour rate is composed of an operation-dependent base value base_oc_operations_i, multiplied by a factor incorporating differences over the various production locations, base_oc_locations_m, and a seasonal factor depicting inflation during the course of periods, base_oc_seasonal_period. Likewise, the manufacturing time required per operation and location is calculated by multiplying a base value per operation base_t_operations_i by a location factor base_t_locations_m and a seasonal factor base_t_seasonal_period derived from a compressed sine function. The data sources for the various parameters and sets mentioned above are depicted in Table 2.
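To make the cost construction of (3) and (4) concrete, the following sketch generates a single logistic and operating cost under the stated parameter distributions. This is an illustrative reimplementation under our own naming, not the authors' published generator; the truncated-normal helper and the fixed seasonal values are assumptions for the sketch:

```python
import math
import random

def truncated_normal(mu, sigma, a, b):
    """Rejection-sample from N(mu, sigma) truncated to [a, b] (cf. Table 2)."""
    while True:
        x = random.gauss(mu, sigma)
        if a <= x <= b:
            return x

# --- Eq. (3): logistic cost = provider base * seasonal * competition * distance
base_lc_provider = truncated_normal(0.7, 0.2, 0.5, 1.0)       # f(x; 0.7, 0.2, 0.5, 1)
base_lc_seasonal = 1.05                                       # inflation-normalized factor (assumed value)
base_lc_competition = truncated_normal(0.1, 0.04, 0.01, 0.2)  # f(x; 0.1, 0.04, 0.01, 0.2)

def logistic_cost(distance):
    lc_per_unit = base_lc_provider * base_lc_seasonal * base_lc_competition
    return lc_per_unit * distance

# --- Eq. (4): operating cost = machine hour rate * time / productivity factor
base_oc_op = truncated_normal(70, 10, 50, 100)
base_oc_loc = truncated_normal(1, 0.1, 0, 2)
base_oc_seasonal = 1.05                                       # assumed value
base_t_op = truncated_normal(4.5, 2, 3, 8)
base_t_loc = truncated_normal(1, 0.1, 0, 2)

def seasonal_t(period):
    """Compressed sine factor for processing time (cf. Table 2)."""
    return 1 - abs(math.sin(period * math.pi / 48)) * 0.1

def productivity(executions):
    """Compressed square root of accumulated executions of an operation."""
    return 1 + math.sqrt(executions * 0.001)

def operating_cost(period, executions):
    hour_rate = base_oc_op * base_oc_loc * base_oc_seasonal
    time_needed = base_t_op * base_t_loc * seasonal_t(period)
    return hour_rate * time_needed / productivity(executions)

print(logistic_cost(570.0), operating_cost(period=5, executions=200))
```

Note how the productivity factor in the denominator lets repeated executions of an operation lower its price over time, which is the learning effect postulated in assumption (iv).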

OC_{im}^period = (oc_im · t_im) / p_i^period
  = base_oc_operations_i · base_oc_locations_m · base_oc_seasonal_period · base_t_operations_i · base_t_locations_m · base_t_seasonal_period · (1 / p_i^period)   (4)

Table 2. Parameters applied in data generation and their respective data sources.

Parameter                        Source
i                                List of 10 consecutive items [1, 2, 3, …, 10]
j                                List of 25 consecutive items [1, 2, 3, …, 25]
k                                All forward combinations in I [[1], …, [1, 2], …, [1, …, 9, 10]]
m                                100 largest Italian cities [13] [1 = Roma, 2 = Milano, 3 = Napoli, …]
d_{(m,m')}                       Shortest truck routes between cities, calculated with [8]
p_i^period                       Compressed square root function 1 + √(Σ_periods k · 0.001) ∀ k ∈ {k | i ∈ I_k}
base_lc_provider_l               Truncated normal distribution f(x; µ, σ, a, b) = f(x; 0.7, 0.2, 0.5, 1)
base_lc_seasonal_period          Normalized Italian inflation data [12], year 2011 is 100%
base_lc_competition_{l,period}   Truncated normal distribution f(x; 0.1, 0.04, 0.01, 0.2)
base_oc_operations_i             Truncated normal distribution f(x; 70, 10, 50, 100)
base_oc_locations_m              Truncated normal distribution f(x; 1, 0.1, 0, 2)
base_oc_seasonal_period          Normalized Italian inflation data [12], year 2011 is 100%
base_t_operations_i              Truncated normal distribution f(x; 4.5, 2, 3, 8)
base_t_locations_m               Truncated normal distribution f(x; 1, 0.1, 0, 2)
base_t_seasonal_period           Compressed sine function 1 − |sin(period · π/48)| · 0.1

3. Solution

Within this section, we introduce our approach ML-ConCord for the prediction of logistic and operating costs in CMfg networks. The approach consists of the combination of a recurrent neural network with a MILP model to predict future operating costs based on historical data from a CMfg network.

The problem of predicting the optimal pricing in a CMfg network is twofold: (i) predicting future supplies and demands, and (ii) calculating the optimal pricing of operations based on that. To tackle this problem, we created an approach that combines two models with high performance in each domain (cf. Fig. 1). We combined a recurrent neural network with a MILP model to first predict future trends from time-series and subsequently pass those to a MILP model for the calculation of the optimal pricing.

Fig. 1. Our ML approach: A prediction model is trained on historical time-series data. The predictions for distinct time periods are passed to a MILP optimization model, where the optimal pricing is subsequently calculated.

The prediction of future supplies and demands is heavily dependent on historical data. Therefore, we employed our datasets as multivariate time-series. Prediction models aim to reconstruct features and interdependencies between and within samples of time-series. Thus, dataset complexity and the prominence of features and interdependencies heavily influence the prediction quality of a model. For this task, recurrent machine learning models like the Long Short-Term Memory (LSTM) [11] have been proven to perform well on the reconstruction and prediction of time-series data [14].

For our approach, we employed a standard LSTM and added a final linear layer to its output. The LSTM is trained on historical, multivariate time-series data from the CMfg network. As the scenarios vary in complexity and hence in the size of their parameter spaces (cf. Tab. 1), we train a separate LSTM for each scenario. In each prediction step, the LSTM predicts the parameters of OC and LC for one period of the scenario it was trained on. The LSTM's predictions are then passed into a MILP model, which calculates the optimal pricing for each predicted period. As MILP model, we applied a version of the model introduced in Section 2.

As a result, ML-ConCord predicts the optimal future pricing for the CMfg network based on historical data.

4. Results

Within this section, we present the results of our ML-ConCord approach. We evaluated the prediction model's performance against two standard regression methods: linear regression [30] and an autoregressive model [26].

To answer RQ1, we applied the generated historic time-series data to our prediction model and the two benchmark methods. We applied a 70/30 train/test split and trained the methods individually with the same seeds for reproducibility. We evaluated the performance of our prediction model and the benchmarks individually for their capability of predicting OC and LC. While being trained on a dataset including both parameters, we evaluated the models' performance individually per parameter, as the parameters show fundamentally different features (cf. Fig. 2). As metrics, we employed the mean absolute error (MAE) and the root mean squared error (RMSE). The results for our prediction models are compiled in Table 3. In all scenarios and metrics, the LSTM shows better results than the benchmark models, with the exception of the RMSE for LC in the small scenario.

Table 3. Results of the prediction models. We evaluated the performance with the mean absolute error (MAE) and the root mean squared error (RMSE) for the prediction of operating costs OC and logistic costs LC.

            Linear Reg.                    Autoreg.                       LSTM
            MAE           RMSE            MAE           RMSE            MAE           RMSE
Scenario    OC     LC     OC     LC      OC     LC     OC     LC      OC     LC     OC     LC
small       10.67  19.93  13.16  24.83   6.84   22.99  83.00  27.98   5.58   10.98  3.06   39.91
medium      10.00  16.08  12.63  20.07   4.45   18.83  5.92   23.35   3.52   2.15   3.52   1.99
large       9.86   14.88  12.74  18.55   2.36   17.87  3.60   22.06   1.14   0.90   1.17   1.90
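For reference, the two error metrics used throughout the evaluation can be computed as follows; this is a routine sketch with made-up numbers, not the authors' evaluation code:

```python
import math

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

truth = [10.0, 12.0, 11.0, 13.0]   # illustrative ground-truth costs
pred = [9.0, 12.5, 10.0, 15.0]     # illustrative model predictions

print(mae(truth, pred), rmse(truth, pred))  # prints 1.125 1.25
```

Because RMSE squares the deviations, it weighs large outliers more strongly than MAE, which is why the two metrics can rank models differently on the same predictions.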

To answer RQ2, we then calculated the optimal pricing for each of the predicted periods with our MILP model, based on the predictions of our prediction models. We compared the calculated pricing with the ground truth of the test data. We evaluated the performance based on MAE and RMSE. The results are compiled in Table 4. The LSTM-based predictions show better results in the optimal pricing calculations in the small and medium scenarios. For the large scenario, the autoregressive model's predictions lead to better results in the pricing calculations.

Table 4. Results for the optimization model based on the different predictions. The optimizer was applied on small sub-problems of the predicted time-series. We evaluated the performance for the ML and statistical approaches based on the root mean squared error (RMSE) and the mean absolute error (MAE), in comparison to the ground truth over the different complexity scenarios.

            Linear Reg.        Autoreg.           LSTM
Scenario    MAE      RMSE      MAE      RMSE      MAE      RMSE
small       48.41    60.05     32.13    38.62     14.65    19.82
medium      100.03   128.10    58.25    73.51     42.02    51.20
large       181.32   234.40    52.19    76.00     71.87    94.92

Fig. 2. Predictions of two operating costs by our LSTM. The orange line marks the point from which the model predicted independently.

5. Discussion

Within this section, we discuss the limitations and restrictions of our modeling and ML approach. There are multiple restrictions we integrated into the modeling of our datasets: The number of simulated scenarios is limited to three. As this is only a small number of scenarios, we systematically increased the dynamic parameters within the different scenarios. Hence, we could test our approach's performance on different complexity levels.

The complexity within the scenarios is limited. In our largest scenario, we only included 15 CMfg platform participants, 100 products, and six logistic providers. Although this poses concerns about the realism of our scenarios, we were able to conclusively determine the models' performance. Larger scenarios might have included more participants and products, but the workflow and information flow of our method would still have remained the same.

The performance of the pricing evaluation was only calculated on sub-problems of the model predictions. The performance of our approach is unaffected by the size of the pricing problem, and calculating sub-problems for evaluation purposes is feasible. An application on real data, however, requires the calculation of the complete problem.

The LSTM approach shows the least deviation from the ground truth in most cases. In the 'large' scenario, the autoregressive model outperforms the LSTM. This might be a result of the exponentially increasing problem complexity that has to be covered by the model. With an increasing number of heavily noised channels, like the LC channels, the LSTM performs increasingly worse.

6. Conclusion

Within this paper, we introduce ML-ConCord, a Machine Learning approach for cost prediction in cloud manufacturing networks. Our approach consists of an LSTM in combination with a MILP model and takes historical time-series data from CMfg networks as input. We evaluated our approach on generated datasets. Besides a comprehensive data generation model, we showed that historical data can be used to enhance the prediction quality of operational and logistic costs in a CMfg scenario (RQ1). Additionally, we showed that an ML model can outperform standard statistical approaches in prediction quality for Revenue Management in CMfg (RQ2).

Future work comprises the evaluation and testing of further ML models for time-series predictions, especially with a focus on larger feature spaces.

Acknowledgments

This research has been funded by dtec.bw, a scientific center supported by both universities of the German Federal Armed Forces. The funds with which dtec.bw has been endowed by the German Ministry of Defense (BMVg) are used to finance research projects and projects for knowledge and technology transfer.

References

[1] Alrifai, M., Skoutas, D., Risse, T., 2010. Selecting skyline services for QoS-based web service composition, in: Proceedings of the 19th International Conference on World Wide Web - WWW '10, ACM Press, Raleigh, North Carolina, USA. p. 11. doi:10.1145/1772690.1772693.
[2] Ardagna, D., Pernici, B., 2007. Adaptive Service Composition in Flexible Processes. IEEE Transactions on Software Engineering 33, 369–384. doi:10.1109/TSE.2007.1011.
[3] Aziz, M.H., Qamar, S., Khasawneh, M.T., Saha, C., 2020. Cloud manufacturing: a myth or future of global manufacturing? Journal of Manufacturing Technology Management 31, 1325–1350. doi:10.1108/JMTM-10-2019-0379.
[4] Bauernhansl, T., Krüger, J., Reinhart, G., Schuh, G., 2016. WGP-Standpunkt Industrie 4.0. URL: https://wgp.de/wp-content/uploads/WGP-Standpunkt_Industrie_4-0.pdf.
[5] Chen, F., Dou, R., Li, M., Wu, H., 2016. A flexible QoS-aware Web service composition method by multi-objective optimization in cloud manufacturing. Computers & Industrial Engineering 99, 423–431. doi:10.1016/j.cie.2015.12.018.
[6] Delaram, J., Valilai, O.F., 2018. A Mathematical Model for Task Scheduling in Cloud Manufacturing Systems focusing on Global Logistics. Procedia Manufacturing 17, 387–394. doi:10.1016/j.promfg.2018.10.061.
[7] Dong, T., Xue, F., Xiao, C., Li, J., 2020. Task scheduling based on deep reinforcement learning in a cloud manufacturing environment. Concurrency and Computation: Practice and Experience 32. doi:10.1002/cpe.5654.
[8] GIScience Research Group, 2022. Openrouteservice - the open source route planner API. URL: https://github.com/GIScience/openrouteservice.
[9] Halty, A., Sánchez, R., Vázquez, V., Viana, V., Piñeyro, P., Rossit, D.A., 2020. Scheduling in cloud manufacturing systems: Recent systematic literature review. Mathematical Biosciences and Engineering 17, 7378–7397. doi:10.3934/mbe.2020377.
[10] Hintsches, A., Spengler, T.S., Volling, T., Wittek, K., Priegnitz, G., 2010. Revenue Management in Make-To-Order Manufacturing: Case Study of Capacity Control at ThyssenKrupp VDM. Business Research 3, 173–190. doi:10.1007/BF03342721.
[11] Hochreiter, S., Schmidhuber, J., 1997. Long short-term memory. Neural Computation 9, 1735–1780.
[12] Italian National Institute of Statistics, 2022a. Harmonized consumer prices for the countries of the European Union (HICP). URL: http://dati.istat.it/Index.aspx?QueryId=24868&lang=en#.
[13] Italian National Institute of Statistics, 2022b. Resident population by age, sex and marital status. URL: https://demo.istat.it/popres/download.php?anno=2022&lingua=eng.
[14] Karevan, Z., Suykens, J.A., 2020. Transductive LSTM for time-series prediction: An application to weather forecasting. Neural Networks 125, 1–9.
[15] Kober, C., Adomat, V., Ahanpanjeh, M., Fette, M., Wulfsberg, J.P., 2022. Digital Twin Fidelity Requirements Model for Manufacturing. Proceedings of the Conference on Production Systems and Logistics (CPSL 2022). doi:tbd.
[16] Kubickova, M., 2022. Revenue management in manufacturing: systematic review of literature. Journal of Revenue and Pricing Management 21, 147–152. doi:10.1057/s41272-020-00274-y.
[17] Li, B., Zhang, L., Wang, S.L., Tao, F., Cao, J., Jiang, X.D., Song, X., Chai, X.D., 2010. Cloud manufacturing: A new service-oriented networked manufacturing model. Jisuanji Jicheng Zhizao Xitong/Computer Integrated Manufacturing Systems, CIMS 16, 1–7+16.
[18] Liu, Y., Wang, L., Wang, X.V., Xu, X., Zhang, L., 2019a. Scheduling in cloud manufacturing: state-of-the-art and research challenges. International Journal of Production Research 57, 4854–4879. doi:10.1080/00207543.2018.1449978.
[19] Liu, Y., Zhang, L., Wang, L., Xiao, Y., Xu, X., Wang, M., 2019b. A framework for scheduling in cloud manufacturing with deep reinforcement learning, in: 2019 IEEE 17th International Conference on Industrial Informatics (INDIN), IEEE, Helsinki, Finland. pp. 1775–1780. doi:10.1109/INDIN41052.2019.8972157.
[20] Meng, Q., Xu, X., 2018. Price forecasting using an ACO-based support vector regression ensemble in cloud manufacturing. Computers & Industrial Engineering 125, 171–177. doi:10.1016/j.cie.2018.08.026.
[21] Morariu, C., Morariu, O., Răileanu, S., Borangiu, T., 2020. Machine learning for predictive scheduling and resource allocation in large scale manufacturing systems. Computers in Industry 120, 103244. doi:10.1016/j.compind.2020.103244.
[22] Osterrieder, P., Budde, L., Friedli, T., 2019. The smart factory as a key construct of industry 4.0: A systematic literature review. International Journal of Production Economics 221. doi:10.1016/j.ijpe.2019.08.011.
[23] Oztemel, E., Gursev, S., 2020. Literature review of industry 4.0 and related technologies. Journal of Intelligent Manufacturing 31, 127–182.
[24] Rehman, Z.u., Hussain, O.K., Hussain, F.K., 2014. Parallel Cloud Service Selection and Ranking Based on QoS History. International Journal of Parallel Programming 42, 820–852. doi:10.1007/s10766-013-0276-3.
[25] Sanders, A., Elangeswaran, C., Wulfsberg, J., 2016. Industry 4.0 implies lean manufacturing: Research activities in industry 4.0 function as enablers for lean manufacturing. Journal of Industrial Engineering and Management 9, 811–833.
[26] Ullrich, T., 2021. On the Autoregressive Time Series Model Using Real and Complex Analysis. Forecasting 3, 716–728. doi:10.3390/forecast3040044.
[27] Vahedi-Nouri, B., Tavakkoli-Moghaddam, R., Rohaninejad, M., 2019. A Multi-Objective Scheduling Model for a Cloud Manufacturing System with Pricing, Equity, and Order Rejection. IFAC-PapersOnLine 52, 2177–2182. doi:10.1016/j.ifacol.2019.11.528.
[28] Wang, L., Guo, S., Li, X., Du, B., Xu, W., 2018. Distributed manufacturing resource selection strategy in cloud manufacturing. The International Journal of Advanced Manufacturing Technology 94, 3375–3388. doi:10.1007/s00170-016-9866-8.
[29] Wang, S.l., Guo, L., Kang, L., Li, C.s., Li, X.y., Stephane, Y.M., 2014. Research on selection strategy of machining equipment in cloud manufacturing. The International Journal of Advanced Manufacturing Technology 71, 1549–1563. doi:10.1007/s00170-013-5578-5.
[30] Wei, W.W.S., 2011. Time Series Regression, in: Lovric, M. (Ed.), International Encyclopedia of Statistical Science. Springer Berlin Heidelberg, Berlin, Heidelberg, pp. 1607–1609. doi:10.1007/978-3-642-04898-2_596.
[31] Chen, W.n., Shi, Y., Zhang, J., 2009. An ant colony optimization algorithm for the time-varying workflow scheduling problem in grids, in: 2009 IEEE Congress on Evolutionary Computation, IEEE, Trondheim, Norway. pp. 875–880. doi:10.1109/CEC.2009.4983037.
[32] Wu, X., Li, R., Cao, Y., Ni, Y., Xu, X., Qian, X., 2016. The value network optimization research based on the Analytic Hierarchy Process method and the dynamic programming of cloud manufacturing. The International Journal of Advanced Manufacturing Technology 84, 425–433. doi:10.1007/s00170-015-8198-4.
[33] Zhang, L., Luo, Y., Tao, F., Li, B.H., Ren, L., Zhang, X., Guo, H., Cheng, Y., Hu, A., Liu, Y., 2014. Cloud manufacturing: a new manufacturing paradigm. Enterprise Information Systems 8.
