Development Guide
2020-07-01
Quickly get started with a trial account in Cloud Foundry (CF) Environment.
This quick start guide provides all the information you need to quickly onboard after registering for a free trial
account with SAP Cloud Platform Integration.
Here you find an overview of the tasks that you perform while creating a subaccount in the Cloud Foundry
environment.
Trial accounts are intended for personal exploration, and not for production use or team development. The
features included in a trial account are limited, compared to an enterprise account. Consider the following
before using a trial account:
For more information about the regions that are available for trial accounts, see Regions and API Endpoints
Available for the Cloud Foundry Environment.
Related Information
Subscribe to the Process Integration application from the Subscriptions page in the SAP Cloud Platform
cockpit.
Prerequisites
● You have signed up for a free trial account with SAP Cloud Platform Integration in the Cloud Foundry
environment.
● You have navigated to the subaccount in the Cloud Foundry environment.
Procedure
The following information is displayed for the business applications to which your global account is entitled
in the Cloud Foundry environment:
Note
To log in to the Cloud Integration application, you first have to assign the relevant roles.
4. Choose Go to Application to open the provisioning application. For more information, see Provisioning the Tenant
[page 7].
Choose Unsubscribe on the Overview page to decommission the tenant. Before you unsubscribe from the
Process Integration service, make sure you have deleted the Process Integration runtime service instances.
During this process you will notice that the Subscribe button is available, which might encourage you to choose
it. Refrain from choosing this button until the tenant is successfully decommissioned.
Create and modify application roles and assign users to these roles.
Prerequisites
You are subscribed to the Process Integration SaaS application in the Cloud Foundry environment.
Context
As an administrator of the Cloud Foundry environment of SAP Cloud Platform Integration, you can group
application roles in role collections. Typically, these role collections provide authorizations for certain types of
users.
Once you have created a role collection, you can pick the roles that apply to the typical job of an integration
developer. Since the roles are application-based, you must select the application to see which roles come with
the role template of this application. You are free to add roles from multiple applications to your role collection.
Finally, you assign the role collection to the users provided by the SAP ID service.
Procedure
7. To assign the role collections to the user (e-mail address) go to your subaccount, and choose Security
Trust Configuration SAP ID Service .
8. Choose Role Collection Assignment, and enter the user's e-mail address.
9. Choose Show Assignments to see the role collections that are currently assigned to this user.
Note
For first-time users, choose Show Assignments and add the user to the SAP ID Service provider.
Provision a Cloud Integration tenant and receive a consumer-specific URL to access the application.
Prerequisites
You have created the role collection and have assigned it to the users provided by the SAP ID service.
Procedure
1. In the navigation area of the subaccount, choose Subscriptions, go to the Process Integration tile, and
choose Go to Application.
2. The provisioning application opens in a new browser instance.
For more information on subscribing to Process Integration, see Subscribing to Process Integration [page 5].
3. To log on to the application, enter your credentials (use the e-mail address assigned while configuring the role
collection).
4. Choose Provision. Once provisioning is done, use the Tenant URL to access the SAP Cloud Platform
Integration Web UI.
After successful provisioning of the tenant, you can create and deploy integration flows.
Note
A Cloud Integration tenant is provisioned only if the subdomain name is 16 characters or fewer in
length.
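The length constraint above can be expressed as a simple check before you attempt provisioning. The subdomain values below are made-up examples:

```python
# Provisioning succeeds only when the subdomain name has at most 16 characters.
def can_provision(subdomain: str) -> bool:
    return len(subdomain) <= 16

print(can_provision("mytrialaccount"))          # 14 characters -> True
print(can_provision("averylongsubdomainname"))  # 22 characters -> False
```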
Use services in the Cloud Cockpit to create service plans, service instances, and service keys.
Context
Create Process Integration runtime service instances to access the endpoints after deploying the integration
flows.
1. Assign a service plan to the specific subaccounts associated with the Process Integration service.
1. In your Global Account, choose Entitlements to assign a service plan to specific subaccounts.
2. To add a service plan to a subaccount, choose Edit, and under Process Integration Runtime for the
relevant subaccount, select integration-flow as the service plan.
Note
If your subaccount is not visible here, you haven't created a Cloud Foundry organization yet.
To create one, choose Enable Cloud Foundry from the subaccount Overview menu.
Note
For a trial account, a space named dev is created by default. Perform the procedure below to create a
new space.
1. Choose the subaccount, in which you would like to create a new space.
2. Choose Spaces New Space .
3. Enter a space name and select the permissions you would like to assign to your ID.
4. Save the changes.
Note
Allocating space quota plans helps you better manage the resources of a subaccount under a Cloud
Foundry organization. For more information, see Change Space Quota Plans.
3. Use spaces that are available to the subaccount, and access them using the cockpit.
1. Navigate to the newly created space, in which you want to create a service instance.
2. Choose the space and navigate to Services Service Marketplace Process Integration Runtime .
3. In Process Integration Runtime service instance, choose Instances New Instance .
4. Choose a Service Plan from the dropdown list, then choose Next.
5. In the Specify Parameter menu, enter the JSON below in the text area to assign roles, then
choose Next. This authorizes the sender system to call a tenant and allows it to process the
messages on the tenant.
Source Code
{
  "roles": [
    "ESBMessaging.send"
  ]
}
Note
The role name is case-sensitive and the authentication mode used is basic authentication.
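If you want to verify that the parameter JSON is well formed before pasting it into the text area, a quick check (shown here in Python, purely as an illustration) could look like this:

```python
import json

# The service-instance parameter JSON that grants the sender system
# the ESBMessaging.send role (the role name is case-sensitive).
params = '''
{
  "roles": [
    "ESBMessaging.send"
  ]
}
'''

parsed = json.loads(params)  # raises an error if the JSON is malformed
print(parsed["roles"])
```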
6. In the Assign Application menu, keep the default None selected (the new service instance is not bound to an application), then
choose Next.
7. Enter a name for your instance and choose Finish.
4. Create service keys to generate credentials to communicate directly with the Process Integration Runtime
service instance. When configuring the service key, you use a client certificate (exported from the sender
keystore).
1. Choose Instances, then from the list select the instance you are creating a key for.
2. In the navigation area, choose Service Keys and then choose Create Service Key.
3. Enter a name for the service key.
Note
○ As user credentials, for basic authentication mode, use the values of clientid and
clientsecret.
○ To use principal propagation as an authentication mode for an on-premise service for a given
user:
○ Fire the authentication call with grant type password from the 'Process Integration
Runtime' service key:
POST <tokenurl from service key from Process Integration Runtime>?
grant_type=password&username=<email address of the
user>&password=<password of the username>
Use basic authentication with UserName/Password: <clientid from service
key from Process Integration Runtime>/<clientsecret from service
key from Process Integration Runtime>
○ Use the access token obtained from the response above to trigger the integration process,
propagating the user's e-mail identity.
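The token call above can be sketched as follows. This is only an illustration of how the request is assembled; the host, user, and credential values are placeholders, and the real tokenurl, clientid, and clientsecret come from your Process Integration Runtime service key:

```python
import base64
import urllib.parse

# Placeholder values -- in practice these come from the service key.
token_url = "https://example.authentication.eu10.hana.ondemand.com/oauth/token"
client_id, client_secret = "my-clientid", "my-clientsecret"
username, password = "user@example.com", "user-password"

# The password grant parameters are passed as URL query parameters,
# as in the POST call above.
query = urllib.parse.urlencode({
    "grant_type": "password",
    "username": username,
    "password": password,
})
request_url = f"{token_url}?{query}"

# Basic authentication header built from clientid/clientsecret.
credentials = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
auth_header = f"Basic {credentials}"

print(request_url)
print(auth_header)
```

Sending this request (for example with any HTTP client) returns a response containing the access token used in the next step.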
Note
These exercises apply both when you use SAP Cloud Platform Integration in the Neo environment and when
you use it in the Cloud Foundry environment.
However, note that at certain steps there are specific things to consider depending on the environment.
Whenever this is the case, it is indicated in this documentation.
A key part of an SAP Cloud Platform Integration project is to develop integration flows. An integration flow
allows you to specify how a message is processed on a tenant. The SAP Cloud Platform Integration Web UI
provides a modeling environment that allows you to design the details of message processing (its senders and
receivers as well as the individual processing steps) with a graphical user interface.
This section shows you step by step how to develop and run your first simple integration flows. In other words,
it gives you an introduction to the tasks of an integration developer. We show you the design of four integration
flows of increasing complexity.
Note
The first three integration flows are initiated by a timer and don't have a sender. This means that all tasks
related to setting up a sender system to SAP Cloud Platform Integration can be omitted.
The fourth integration flow is initiated by a request from a sender system which is simulated by an HTTP
client.
To complete the tasks, you use the SAP Cloud Platform Integration Web UI.
Before designing any integration flow of this section, you need to create an integration package first and, within
this integration package, create an integration flow. When you have created the integration flow, you add the
steps as described for the specific integration flow exercise.
● The first exercise shows you how to perform a simple smoke test to check whether your tenant cluster is
working correctly and that it processes messages in the expected way. A simple message is created with
the text Hello World! in the message body. The integration flow has no receiver. To check if the message
has been processed successfully, you can go to the monitoring application and check for the message
content there.
More information: Smoke Test Scenario [page 18]
● The second exercise shows you how to extend the smoke test scenario by adding an outbound call to an
external data source. The integration flow requests data exposed by the external component through an
OData application programming interface (API). The message body is created based on that data and, like
in the first exercise, can be displayed by the monitoring application.
More information: Smoke Test Scenario with External Data Source [page 27]
The exercises are designed so that you can do all four of them independently. All steps are described one by
one. But you can also start with the first one and successively enhance it to derive the second and the third
scenario from the first one.
Note
Prerequisites:
● You have been given access to an SAP Cloud Platform Integration tenant and have integration
developer permissions assigned to your user (authorization group
AuthGroup_IntegrationDeveloper).
● Authorization group AuthGroup_BusinessExpert has been assigned to your user (to allow you to access
message processing log attachments).
● You have set up an e-mail account that you can use as the receiver system for the integration flow (only
required for third exercise with the Mail adapter).
● You have opened the SAP Cloud Platform Integration Web UI (the Web UI URL ends with /itspaces).
Related Information
The SAP Cloud Platform Integration Web UI is your one-stop shop for integration development.
Note that the URL to access the Web UI ends with /itspaces.
When you open the Web UI, the following page is displayed.
● Discover
Here, you can find predefined integration content provided by SAP that you can use out of the box and
adapt to your requirements. As the Getting Started documentation focuses on how to design your own
integration content, we do not go into any more detail on this section.
● Design
This is where you design your integration content. As you progress through the exercise in the Getting
Started documentation, you will spend most of your time in this section. It contains the graphical
integration flow modeling environment.
● Monitor
This is where you can monitor your integration flow. You also use this section to manage additional artifacts
that you need to deploy on your tenant to complement your integration flows (for example, User Credential
artifacts to configure connections using basic authentication).
Design Section
When you go to the Design section, you find a list of integration packages defined for the tenant.
When you select an integration package, you can find the integration flows (and other artifacts) defined for the
package (on the Artifacts tab).
In this Getting Started documentation, we assume that you have not yet defined an integration package for
your integration content. Therefore, the first step is to define an integration package.
Monitor Section
The Monitor section (also referred to as Operations view) has several subsections, each one containing several
tiles. These subsections allow you to perform various tasks that are required for an integration project in
addition to integration content design.
There are other sections and tiles that are required for additional tasks, but these are not required in the
Getting Started exercise, so we will not look at them in any more detail here.
An integration package is used like a folder for your integration content (integration flows, value mappings, and
OData services). You can transport an integration package, for example, if you want to design your integration
content on a test tenant first and then transport it to a production tenant.
1. Open the Web UI using the hyperlink provided to you in the mail from SAP (the link ends with
/itspaces).
2. Go to the Design section of the Web UI.
3. Choose Create.
4. Enter a name and description for your integration package and choose Save.
The integration flow is added to the list of artifacts for the selected integration package.
5. Select the integration flow from the list.
An integration flow template opens that contains the following shapes: Sender (this represents your
sender system), Receiver (this represents a receiver system), Integration Process (this will later contain all
the processing steps that define how a message is processed on the tenant). The Integration Process
shape contains a Start and an End event.
To start modeling, choose Edit. Notice that a palette appears to the left of the integration flow model. This
palette provides access to all integration flow step shapes that you can add to the model.
This is a very simple test to verify that your SAP Cloud Platform Integration is working as expected. You do not
need any receiver system to perform this test.
In this scenario, you create a Hello World text and write it into the message body (scheduled on deployment of
the integration flow). The result is written into the message processing log which you can directly inspect with
the message monitoring application.
Caution
This integration scenario is designed to show how to quickly (without much effort) set up and run an
integration flow without the need to configure and connect to any receiver system. It uses a Script step to
store the message payload in the message processing log (to enable you to easily check in the message
monitoring application if the message was processed without any errors).
Note that this is not according to standard best practice. When designing productive scenarios, don't store
the message payload in the message processing log. This can cause severe issues with memory
consumption. The reason is that tasks such as message processing and message monitoring share the
same memory and CPU which are available on your tenant.
In the course of this exercise, you develop the following integration flow.
To make it as easy as possible for you to develop this first integration flow, you don't need to configure any
sender system. That saves the effort for you to set up a dedicated sender system and to connect it to SAP
Cloud Platform Integration. Instead of this, message processing is triggered by a Timer event, and the inbound
message payload is created within the integration flow, in a dedicated Content Modifier step.
Furthermore, it is also not required that you set up any receiver system. To enable you to check if the message
has been processed correctly, you will configure the integration so that the message payload is written into the
message processing log (where you can easily inspect it using the Monitor application of the Web UI).
1. The Timer event triggers the processing of the message (according to the settings of the Timer's
scheduler).
2. The Content Modifier step creates a message with a simple text content (Hello World!).
3. The Groovy Script step logs the payload of the message (that means, it writes the message content into
the message processing log).
When you have finished the integration flow design, you save and deploy the integration flow.
Related Information
1. Open the integration flow model (Edit mode), select the Sender shape, and choose the recycle bin icon (to
remove the Sender shape).
Note
You can, of course, try out the other settings, which enable the Timer to start message processing
periodically. However, take care when selecting these options if you have added a receiver to your
scenario. For example, in another demo scenario provided in this documentation, the receiver of the
message is an e-mail account, and you don't want your e-mail account to be inundated with
periodically generated e-mails.
As the integration flow has no sender, we use a Content Modifier to create a message from scratch.
1. To add a Content Modifier, go to the palette, choose the Message Transformers icon, and select the Content
Modifier icon.
2. Place the Content Modifier in the model after the Timer Start event.
3. In the Content Modifier properties section, go to the Message Body tab and enter the following string
sequence in the entry field:
Hello World!
With a Groovy Script step, you can configure the integration in such a way that the payload of the message is
written to the message processing log as an attachment.
1. To add a Script step (containing a Groovy script), go to the palette and choose the Message Transformers
icon and select the Script icon.
3. Place the Script step shape after the Content Modifier step and connect both shapes.
4. Select the Script step.
The context icons are displayed.
7. Replace this content by the script provided in the coding example below.
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
def Message processData(Message message)
{
    def body = message.getBody(java.lang.String) as String;
    def messageLog = messageLogFactory.getMessageLog(message);
    if (messageLog != null)
    {
        messageLog.addAttachmentAsString("Log current Payload:", body, "text/plain");
    }
    return message;
}
8. Choose OK.
The integration flow model is again displayed.
Save and deploy the integration flow on the tenant to be able to process it.
2. Click Deploy.
A message is displayed that asks you to confirm this action.
Another message is displayed when the validation has been performed and the integration flow
deployment has been triggered.
Note
In case of a modeling error, instead of this message a Validation Failed message is displayed.
Only when you have fixed the error, deployment of the integration flow is triggered.
Open the integration package that contains the integration flow to deploy. Go to the Artifacts tab, click the
Actions button (next to the name of the integration flow that you want to deploy) and select Deploy.
Run the integration flow and check the result of message processing.
When the integration flow has been deployed successfully, the message is processed without any further
trigger (based on the settings of the timer).
1. Go to the Operations view and select a tile under Monitor Message Processing.
If your integration flow has been processed successfully, the status Completed should be shown.
2. Select the integration flow and analyze the details area to the right of the integration flow list.
This shows you that the message has been processed correctly.
This is a very simple test to verify that your SAP Cloud Platform Integration is working as expected. You do not
need any receiver system to perform this test.
In this scenario, you access an OData service and get information about a product (for a specific product ID).
The result is written into the message processing log which you can directly inspect with the message
monitoring application.
Caution
This integration scenario is designed to show how to quickly (without much effort) set up and run an
integration flow without the need to configure and connect to any receiver system. It uses a Script step to
store the message payload in the message processing log (to enable you to easily check in the message
monitoring application if the message was processed without any errors).
Note that this is not according to standard best practice. When designing productive scenarios, don't store
the message payload in the message processing log. This can cause severe issues with memory
consumption. The reason is that tasks such as message processing and message monitoring share the
same memory and CPU which are available on your tenant.
In the course of this exercise, you develop the following integration flow.
Furthermore, it is also not required that you set up any receiver system. To enable you to check if the message
has been processed correctly, you will configure the integration so that the message payload is written into the
message processing log (where you can easily inspect it using the Monitor application of the Web UI).
This is how the integration flow will process the message at runtime:
1. The Timer event triggers the processing of the message (according to the settings of the Timer's
scheduler).
2. The first Content Modifier step creates a message with only one element, a productIdentifier (to
identify a product from the product catalog).
The actual value of the productIdentifier is hard-coded in this step. If you want to process the
integration flow with another product identifier, you need to change the value in this step and redeploy the
integration flow. This is the drawback of not having a dedicated sender system.
3. The second Content Modifier creates a message header (which we also call productIdentifier) and
writes the actual value of the productIdentifier element into it. This header will be used in the
subsequent step.
4. The Request Reply step passes over the message to an external data source and retrieves data (about
products) from there.
The external data source is represented by the lower WebShop shape.
The external data source supports the Open Data Protocol (OData). For our scenario, we use the ESPM
WebShop, which is based on the Enterprise Sales and Procurement Model (ESPM) provided by SAP. The
demo application can be accessed at the following address:
https://refapp-espm-ui-cf.cfapps.eu10.hana.ondemand.com/webshop/index.html
For the connection to the WebShop, an OData receiver channel is used. To query for exactly one product
(for the product identifier provided with the inbound message), the header that has been created in the
preceding Content Modifier is used.
5. The OData service provides the details of one specific product (according to the product identifier provided
with the inbound message).
6. The Groovy Script step logs the payload of the message (that means, it writes the message content into
the message processing log).
You can then run the integration flow and monitor message processing as described under: Run the Integration
Flow and Monitor the Message Processing [page 36].
Related Information
As the integration flow has no sender, we use a Content Modifier to create a message from scratch.
1. To add a Content Modifier, go to the palette, choose the Message Transformers icon, and select the Content
Modifier icon.
2. Place the Content Modifier in the model after the Timer Start event.
3. In the Content Modifier properties section, go to the Message Body tab and enter the following string
sequence in the entry field:
Sample Code
<root>
<productIdentifier>HT-1080</productIdentifier>
</root>
4. Connect the Timer event with the Content Modifier. To do this, select the Timer event, click the arrow icon,
and drag and drop the cursor to the Content Modifier.
Add a Content Modifier to your model to define a header, which will be used in a later step to filter data from the
external source.
If you remember, our input message has only one field: productIdentifier. This field will contain a product
identifier that we want to use to filter the results from the WebShop application.
To make this number available to the integration framework during message processing, SAP Cloud Platform
Integration provides the option to store the value of productIdentifier from the incoming message either
in the message header or in a data container referred to as an exchange property.
We use the first option, and to prepare the message accordingly we use a Content Modifier.
1. Add a second Content Modifier (after the first one) to the integration flow model.
2. In the properties section of the second Content Modifier, go to the Message Header tab and choose Add.
3. Specify the following parameters:
○ Name: Enter any name, for example, productIdentifier. This is the name of the header that will be
created by the Content Modifier step.
○ Type: Select XPath.
Tip
In this example, you use an XML Path Language (XPath) expression to address a dedicated
element of your inbound message. XPath allows you to address any element in an XML structure
by using a well-defined syntax. The expression //<element name> addresses all elements with
name <element name> in the XML document.
4. Connect the first Content Modifier (which defines the message body) with the second one.
In other words, the Content Modifier creates a header with the name productIdentifier, which will contain
the value of the productIdentifier field of the incoming message.
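To see what the XPath expression //productIdentifier extracts from the sample message body, here is a small illustration using Python's standard library (the integration runtime evaluates the XPath for you at runtime; this sketch is only to visualize the result):

```python
import xml.etree.ElementTree as ET

# The sample message body created by the first Content Modifier.
body = """
<root>
  <productIdentifier>HT-1080</productIdentifier>
</root>
"""

# ElementTree supports a subset of XPath; './/productIdentifier' matches
# elements named productIdentifier anywhere in the document, like the
# //productIdentifier expression used in the Content Modifier.
root = ET.fromstring(body)
value = root.find(".//productIdentifier").text
print(value)
```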
To call the external data source, add a Request Reply step to the integration flow model and connect this step
with the external system using an OData channel.
Note
Follow this procedure in case you use SAP Cloud Platform Integration in the Neo environment.
Remember
There are currently certain limitations when working in the Cloud Foundry environment. For more
information on the limitations, see SAP Note 2752867 .
3. Place the Request Reply shape between the second Content Modifier and the End event in the model.
Furthermore, connect the second Content Modifier with the Request Reply step and the Request Reply
step with the End event.
4. Move the Receiver shape closer to the Request Reply shape (below the Request Reply shape but outside
the Integration Process shape, as shown in the overall integration flow model under Smoke Test Scenario
with External Data Source [page 27]).
5. Connect the Request Reply shape to the Receiver shape (by selecting the Request Reply shape, clicking
the arrow icon, and dragging and dropping the cursor on the Receiver shape).
6. In the next dialog, choose adapter type OData.
Note
This adapter supports different versions of the OData protocol. We select version 2.0.
8. Go to the Connection tab of the OData adapter and enter the following as the Address:
https://refapp-espm-ui-cf.cfapps.eu10.hana.ondemand.com/espm-cloud-web/espm.svc
Tip
This is the endpoint address of the ESPM WebShop's OData application programming interface.
The Query Editor opens, where you can conveniently define the OData query.
The Address field is already populated with the value you just entered.
11. Make sure that Remote is selected as the Connection Source, and choose Step 2.
The system connects to the WebShop service and retrieves the metadata from its OData API.
12. Choose the Search icon in the Select Entity field.
Tip
The dollar sign and the curly brackets indicate that we are dealing with Apache's Simple Expression
Language, which is often used in SAP Cloud Platform Integration. In particular, here you see a dynamic
parameter, which has the following effect: the value of the header productIdentifier (which is
identical to the value of the productIdentifier field of the incoming message) is used dynamically
at runtime to define the OData query.
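To illustrate how such a dynamic parameter is resolved at runtime, the sketch below substitutes the header value into a query template. The filter expression itself is a made-up example; the point is only the substitution of ${header.productIdentifier}:

```python
# Simple-Expression-style template as used in the OData adapter
# (the filter expression is a hypothetical example).
template = "$filter=ProductId eq '${header.productIdentifier}'"

# Message headers available at runtime.
headers = {"productIdentifier": "HT-1080"}

# Resolve the dynamic parameter by substituting the header value.
resolved = template.replace("${header.productIdentifier}",
                            headers["productIdentifier"])
print(resolved)
```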
With a Groovy Script step, you can configure the integration in such a way that the payload of the message is
written to the message processing log.
1. To add a Script step (containing a Groovy script), go to the palette and choose the Message Transformers
icon and select the Script icon.
3. Place the Script Step shape after the Request Reply step and connect both shapes.
4. Select the Script step.
The context icons are displayed.
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
def Message processData(Message message)
{
    def body = message.getBody(java.lang.String) as String;
    def messageLog = messageLogFactory.getMessageLog(message);
    if (messageLog != null)
    {
        messageLog.addAttachmentAsString("Log current Payload:", body, "text/plain");
    }
    return message;
}
8. Choose OK.
The integration flow model is again displayed.
9. Save and deploy the integration flow.
Run the integration flow and check the result of message processing.
When the integration flow has been deployed successfully, the message is processed without any further
trigger (based on the settings of the timer).
1. When you have saved and deployed your integration flow, check the deployment status. Go to the Monitor
section of the Web UI and select a tile under Manage Integration Content.
3. Go back to the overview page of the Web UI Monitor section and select a tile under Monitor Message
Processing.
If your integration flow has been processed successfully, the status Completed should be shown.
Create a simple integration scenario that is initiated by a timer, retrieves data from an external source, and
sends the result to an e-mail account (as the receiver system).
A typical challenge addressed by an integration scenario is to retrieve data from a certain source (for example,
product details from a product catalog on a vendor's site) using certain filter criteria. We use the integration
flow described in this section to address such a use case.
In the course of this exercise, you develop the following integration flow:
To make it as easy as possible for you to develop this integration flow, you don't need to configure a sender
system. This saves you the effort of setting up a dedicated sender system and connecting it to SAP Cloud
Platform Integration. Instead, message processing is triggered by a Timer event, and the inbound message is
created within the integration flow, in a dedicated Content Modifier step.
● Update the tenant keystore with the certificates required by the mail server.
● Create and deploy a User Credentials artifact that contains the credentials of the mail account.
1. The Timer event triggers the processing of the message (according to the settings of the Timer's
scheduler).
2. The first Content Modifier step creates a message with only one element: a productIdentifier (to
identify a product from the product catalog).
The actual value of the productIdentifier is hard-coded in this step. If you want to process the
integration flow with another product identifier, you need to change the value in this step and redeploy the
integration flow. This is the drawback of not having a dedicated sender system.
3. The second Content Modifier step creates a message header (which we also call productIdentifier)
and writes the actual value of the productIdentifier element into it. This header is used in the
subsequent step.
4. The Request Reply step passes the message to an external data source from which it retrieves data (about
products).
The external data source is represented by the lower WebShop shape.
The external data source supports the Open Data Protocol (OData). For our scenario, we use the ESPM
WebShop, which is based on the Enterprise Sales and Procurement Model (ESPM) provided by SAP. The
demo application can be accessed at the following address:
https://refapp-espm-ui-cf.cfapps.eu10.hana.ondemand.com/webshop/index.html
An OData receiver channel is used to connect to the WebShop. The header that was created in the
preceding Content Modifier is used to query exactly one product (using the product identifier provided with
the inbound message).
5. The OData service provides the details of this product.
6. Finally, the result of the request is forwarded to an e-mail account using the Mail receiver adapter (the e-
mail server is represented by the Mail_Ser … shape on the right in the integration flow model).
When you have finished integration flow design, you can monitor message processing.
This integration flow introduces you to a number of important aspects of integration development, such as
defining an OData query and using a message header to dynamically query an OData source.
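The two aspects mentioned above, writing the header and building the product query, can be sketched in Python. This is only an illustration of what the Content Modifier and the OData receiver channel do for you; the service base path used here is a placeholder, not the documented ESPM endpoint.

```python
import xml.etree.ElementTree as ET
from urllib.parse import quote

# Inbound message body as created by the first Content Modifier.
body = "<root><productIdentifier>HT-1080</productIdentifier></root>"

# Second Content Modifier: read the element value and store it as a message header.
headers = {"productIdentifier": ET.fromstring(body).findtext("productIdentifier")}

# OData receiver channel: use the header to query exactly one product.
# The service base path is a placeholder for illustration only.
base = "https://refapp-espm-ui-cf.cfapps.eu10.hana.ondemand.com/espm.svc"
url = f"{base}/Products('{quote(headers['productIdentifier'])}')?$format=json"
print(url)
```

In the actual integration flow you configure the resource path in the OData receiver channel and reference the header there; no code is written by hand.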
Related Information
Update the Tenant Keystore with the Certificates Required by the Mail Server [page 41]
Create and Deploy a User Credentials Artifact for the E-Mail Account [page 43]
Create the Mail Receiver Channel [page 44]
Monitor Message Processing [page 47]
The tenant keystore contains the key pairs and certificates that are required (on the tenant side) to establish
trusted communication with the connected systems.
When establishing the connection to the SAP Cloud Platform Integration tenant, the e-mail server needs to
authenticate itself against SAP Cloud Platform Integration using a digital server certificate. For this purpose,
the tenant keystore must contain a root certificate that is also trusted by the e-mail server.
You can usually download the required certificates from a dedicated section of the e-mail provider's website. You
might search for server certificate to get more information. Note, however, that the procedure can differ
depending on the e-mail provider.
To get the e-mail server's root certificate, you can do the following:
1. Open the website that hosts the mail account you'd like to address with the Mail adapter.
2. In the browser address field, click the lock icon and select Certificate (Valid) (example for using Google
Chrome).
3. On the Certification Path tab, double-click the uppermost node (which is the root certificate).
The root certificate is stored as a file with the extension .cer on your computer.
Finally, you need to import the downloaded certificates to the tenant keystore. To do this, open the Keystore
monitor.
1. Go to the Monitor section of the Web UI and select the Keystore tile under Manage Security.
All certificates that are already included in the keystore are displayed. If you have only recently started
working with SAP Cloud Platform Integration, these are the certificates initially provided by SAP when
provisioning the tenant for you.
2. Choose Add Certificate.
Note
You might also need to change the settings of your e-mail account so that the mail server accepts
connections from remote applications with a lower security level (for example, for Yahoo mail, this is the Less
Secure Apps setting). If you don't do this, the integration flow might raise an error during processing.
Note
If you don't upload the required root certificate to the tenant keystore and try to execute the integration
flow (when you have finished its design), message processing will fail with an error message
starting with:
Sample Code
Deploy a User Credentials artifact that contains the user name and password for your receiver mail account.
1. Go to the Monitor section of the Web UI and select the Security Material tile under Manage Security.
3. As Name, enter the User Credentials name that you specified in the Mail receiver adapter, and as User
enter the e-mail account user name (also specified in the fields From and To in the Mail receiver adapter).
Also provide the password of the mail account.
Note
Storing the user name and password in a separate artifact increases the security level of integration
development.
4. Choose Deploy.
Add a Mail receiver channel to enable the integration flow to send messages to an e-mail account.
1. First, add a second receiver to represent the e-mail account. In the integration flow model (in Edit mode),
select the Participants entry from the palette and select Receiver.
2. Place the Receiver shape on the right side of the model, outside the Integration Process shape.
You can rename the shape to Mail_Receiver (for example).
5. In the Mail adapter properties section (below the model), go to the Connection tab and specify the
following Mail adapter parameters.
The figure shows example settings, which are explained further below.
As you use a Timer event to trigger the message processing, the integration flow is processed as soon as it is
deployed.
1. To check whether the processing has been executed correctly, go to your e-mail account. You should find a
mail with the following content:
2. Finally, check how the message was processed by opening the Monitor section of the Web UI.
3. Choose a tile under Monitor Message Processing and you should find your message with the integration
flow name.
4. Open the integration flow in Edit mode, click the first Content Modifier, and on the Message Body tab
change the value of the productIdentifier to HT-2001 and redeploy the integration flow.
5. Once the integration flow has been deployed successfully, you should receive an e-mail with details about
another product.
Create a simple integration scenario that is initiated by a sender (using the HTTPS sender adapter).
With the following steps, you can easily modify and extend the previously built integration flow with the e-mail
receiver (Timer-Initiated Scenario with a Mail Receiver).
The figure shows the integration flow model that you get as a result of this exercise.
In the modified integration flow, an HTTP client instead of a Timer event triggers message processing.
Furthermore (to simplify the design), we have merged the steps processed by two different Content Modifier
steps in the previously built integration flow into one Content Modifier step.
Note
As a prerequisite to execute this integration flow in the Cloud Foundry environment, you need to authorize
the sender system (HTTP client) to call the integration flow endpoint. For that purpose, you create a service
instance on SAP Cloud Platform and generate service key credentials (which can then be used by the HTTP
client to call the integration flow endpoint).
1. The HTTP client (represented by the Sender shape) sends an HTTP request to SAP Cloud Platform
Integration through an HTTPS sender channel. The HTTPS request body, which is in JavaScript Object
Notation (JSON) format, contains a product identifier.
2. The JSON-to-XML converter transforms the request body into XML format (which can be processed in the
following step, the Content Modifier).
3. The Content Modifier creates a message header (which we also call productIdentifier) and writes the
actual value of the productIdentifier element into it. This header is used in the subsequent step.
In this exercise, you use one Content Modifier to create the header and to write the message body.
4. The Request Reply step passes the message to an external data source and retrieves data (about
products) from there.
The external data source supports the Open Data Protocol (OData). For our scenario, we use the ESPM
WebShop, which is based on the Enterprise Sales and Procurement Model (ESPM) provided by SAP.
When you have finished the integration flow design, you can send the message through the HTTP client.
Related Information
You perform these steps to authorize the sender (HTTP client) to call the SAP Cloud Platform Integration
integration flow endpoint.
Note
You need to perform these steps only if you use SAP Cloud Platform Integration in the Cloud Foundry
environment.
These steps imply that you create a service instance on SAP Cloud Platform and generate a service key for it.
The credentials that you get as a result can be used by the HTTP client to call the integration flow endpoint.
In the context of this scenario, you can think of the service instance as a technical user that can be associated
with the sending system's (HTTP client's) request.
You perform the following steps using SAP Cloud Platform Cockpit.
3. Choose Next.
4. Enter the following JSON in the entry field.
Sample Code
{
"roles":[
"ESBMessaging.send"
]
}
With this configuration, you associate the service instance with the role ESBMessaging.send, which is
required to call an integration flow endpoint.
With this step, you generate credentials to communicate with a service instance.
The sender application (HTTP client) uses these credentials (clientid and clientsecret) to access the
SAP Cloud Platform Integration integration flow endpoint.
You need to copy the values of clientid and clientsecret to your clipboard or to a text editor for later
reference.
These values specify the credentials of the user associated with the sending application.
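The clientid and clientsecret are typically sent as an HTTP Basic Authorization header when calling the endpoint. A minimal sketch of how a client would encode them (the values shown are placeholders, not real credentials):

```python
import base64

def basic_auth_header(client_id: str, client_secret: str) -> str:
    """Encode service-key credentials for the HTTP Authorization header."""
    token = base64.b64encode(f"{client_id}:{client_secret}".encode("utf-8"))
    return "Basic " + token.decode("ascii")

# Placeholder values; use the clientid/clientsecret from your service key.
print(basic_auth_header("sb-my-client", "my-secret"))
```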
Add an HTTPS sender channel to enable the integration flow to receive HTTP requests.
Note
If you choose the information icon, the version of the integration flow component is displayed.
Do not confuse the version of an individual integration flow component with the software version of
SAP Cloud Platform Integration. An integration flow component gets a new version each time a new
feature is added to it by SAP. Let's imagine a situation where you started modeling an integration flow
some time ago and now want to continue working on it. Let's assume that SAP has updated the
software in the meantime. A new version of an integration flow step or shape that you have used is now
available, containing a new feature. You can continue to use the old component version, but if you want
to use the new feature you need to update to the new version.
3. Click the arrow icon and drag the cursor onto the Start event.
The list of available adapter types is displayed in a dialog.
Add a JSON-to-XML converter to convert the HTTP request, which is in JavaScript Object Notation (JSON)
format, to XML for further processing.
With the HTTP client, we send a POST request with a request body in JSON format. To enable the subsequent
steps to process the message, it needs to be converted to XML first. To perform the required conversion, you
can use the JSON-to-XML converter.
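The effect of the converter can be sketched for a flat JSON object. The root element name used here is an assumption; the converter's actual output format depends on its settings.

```python
import json
import xml.etree.ElementTree as ET

def json_to_xml(payload: str, root_tag: str = "root") -> str:
    """Convert a flat JSON object into a simple XML document."""
    root = ET.Element(root_tag)
    for key, value in json.loads(payload).items():
        child = ET.SubElement(root, key)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

print(json_to_xml('{"productIdentifier": "HT-1080"}'))
# → <root><productIdentifier>HT-1080</productIdentifier></root>
```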
1. In the palette, select the Message Transformers entry and then choose Converter.
Set up an HTTP client using Postman and send the HTTP request.
You can now send the POST request to the integration flow.
1. Specify the same authentication settings as for the GET request above.
Sample Code
{
"productIdentifier": "HT-1080"
}
3. Copy the value of the CSRF token (obtained from the GET request above) to the clipboard.
4. Add a header to the request.
In the Key field, enter X-CSRF-Token and in the Value field, enter the value of the CSRF token from your
clipboard.
5. Send the request.
You should get the details of the product with productIdentifier HT-1080.
6. Go to the e-mail account specified in the Mail adapter. You should have received an e-mail like this:
7. Send another POST request with a body containing productIdentifier HT-2001, and you receive the details of
another product.
8. Finally, check how the message was processed by opening the Monitor section of the Web UI.
Choose a tile under Monitor Message Processing and you should find your message with the integration
flow name.
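The request sequence above can be sketched by constructing the two calls a client makes: a GET carrying the header X-CSRF-Token: Fetch, then a POST that replays the issued token. The endpoint URL below is a placeholder; the requests are only built here, not sent.

```python
from urllib.request import Request

# Placeholder endpoint; use your integration flow's HTTPS endpoint address.
ENDPOINT = "https://example.cfapps.eu10.hana.ondemand.com/http/products"

def build_token_fetch(auth: str) -> Request:
    """GET request asking the server to issue a CSRF token."""
    return Request(ENDPOINT, headers={"Authorization": auth, "X-CSRF-Token": "Fetch"})

def build_post(auth: str, token: str, body: bytes) -> Request:
    """POST request replaying the issued token alongside the JSON body."""
    return Request(ENDPOINT, data=body, method="POST", headers={
        "Authorization": auth,
        "X-CSRF-Token": token,
        "Content-Type": "application/json",
    })

post = build_post("Basic ...", "token-from-get", b'{"productIdentifier": "HT-1080"}')
print(post.get_method())
```

Note that urllib normalizes header capitalization internally; Postman handles this transparently.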
Note
These instructions are relevant only when you use SAP Cloud Platform Integration in the Cloud Foundry
environment.
This section describes the security-related aspects of the integration platform and shows which measures you
can take to protect customer data that is passed through the platform during the execution of an integration
scenario.
Customers who use SAP Cloud Platform Integration agree that a significant part of their (and their customers')
sensitive data is processed by and stored within an infrastructure not owned by themselves.
The core task of an integration platform is to serve as the transit place for messages, which may contain
sensitive customer data. First and foremost, these messages must be protected against eavesdropping and
unauthorized access.
Therefore, the integration platform must fulfill the following main requirements:
● The integration infrastructure is already designed and built in such a way that it meets the highest security
standards.
In particular, it must be guaranteed that the technical system landscape, the communication between the
components of the integration platform, and the storage locations of messages are secure.
● The processes related to the usage of Cloud Integration meet the highest security standards.
This relates to the processes at SAP that are related to the development and upgrade of the Cloud
Integration software, the processes related to the provisioning and operation of the customers' virtual
environment by the infrastructure provider, and the customer onboarding process during which customers
set up secure connections between their infrastructure and SAP's integration platform.
● Customers have several options to configure how messages are exchanged within an integration scenario
so that the involved data is protected at the highest level.
In particular, when designing integration flows, customers can choose between several options to protect
messages by establishing secure communication channels (transport-level security) and by configuring
digital encryption and digital signing of messages (message-level security).
This documentation summarizes the measures that are taken by SAP to fulfill these requirements.
Related Information
Note
These instructions are relevant only when you use SAP Cloud Platform Integration in the Cloud Foundry
environment.
In technical terms, the integration platform is designed as a containerized and clustered integration platform in
the cloud. Messages processed by integration flows from different customers are handled on different parts of
the platform (referred to as tenants).
Tenants processing integration flows from different customers are strictly separated from each other in terms
of CPU, data storage and user access.
The following figure shows a bird's-eye view of the technical architecture.
● A multitenant-capable application comprises a set of microservices (not depicted in the figure) that
accomplish tasks related to the management of a tenant and the preparation of monitoring data. It takes
requests from the dialog users (for example, when an integration developer deploys an integration flow
using the Web user interface).
These microservices run on an application that can be shared across multiple customer tenants.
● A worker (runtime container) processes messages that are exchanged with external systems. Therefore,
the worker is connected to the external systems. In other words, workers process customer data that
might be confidential and has to be protected.
Workers are operated within customer-specific tenants. These tenants are strictly separated from each
other.
As a consequence of this cluster design, the following main communication paths are active during the
operation of an integration scenario:
Various secure technical protocols can be used for these communication paths. Depending on the adapter
type, the following protocols are available:
● Hyper Text Transfer Protocol (HTTP) over Transport Layer Security (TLS), which is referred to as HTTPS
● SSH File Transfer Protocol (SFTP) for the exchange of data with an SFTP server
● Simple Mail Transfer Protocol (SMTP), Post Office Protocol 3 (POP3), and Internet Message Access Protocol
(IMAP) for the exchange of data with mail servers
User Access
In addition to the above-mentioned components that interact with each other when messages are being
processed and exchanged between the involved systems, additional components come into play when a dialog
user accesses the infrastructure (for example, when an administrator accesses monitoring data or when an
integration developer deploys an integration artifact).
People with different roles can access the infrastructure – both on the side of the infrastructure provider and
on the customer side. Human access points (for dialog users) are:
● Dedicated experts at the side of the infrastructure provider access the infrastructure to provide a tenant
for the customer.
● Experts on the customer side access the infrastructure to design and deploy integration content and to
monitor an integration scenario at runtime (integration developers and tenant administrators).
Processes that are related to the provisioning, update, and usage of the cloud-based integration platform meet
the highest security standards.
Cloud Integration is compliant with various SAP-internal technical policies, procedures, directives, guidelines,
and product standards.
For example, SAP software is developed in compliance with the SAP Secure Development Lifecycle
(SDLC), which helps to implement measures such as test-driven development and threat modeling.
SAP certifies that the development, maintenance, and operations of Cloud Integration comply with the
requirements of the following standards:
All data in transit, either exchanged with remote components or internal, can be protected by methods such as
encryption.
Note
These instructions are relevant only when you use SAP Cloud Platform Integration in the Cloud Foundry
environment.
During a scenario, the connected remote systems exchange data with each other based on the configured
transport protocol. These protocols support different options to protect the exchanged data against
unauthorized access. In addition to security at the transport level, the content of the exchanged messages can
also be protected by means of digital encryption and signature.
Transport-Level Security
Each adapter allows you to set up a specific security level based on the underlying transport protocol.
SFTP (Secure Shell File Transfer Protocol): This protocol is supported by the SFTP sender and receiver adapter.
Secure Shell (SSH) is used to securely transfer files in an open network.
SSH uses a symmetric key length of at least 128 bits to protect FTP communication. The
default length of asymmetric keys provided by SAP is 2048 bits.
● User name/password authentication (where the SFTP server authenticates the calling
component based on the user name and password)
● Public key authentication (where the SFTP server authenticates the calling component
based on a public key)
Secure data transfer with SFTP is based on a combination of symmetric and asymmetric
keys. Symmetric (session) keys are used to encrypt and decrypt data within a session.
Asymmetric key pairs are used to encrypt and decrypt the session keys.
When asymmetric key pairs are used, SFTP also ensures that only authorized public keys
are used by the involved participants.
Supported versions:
HTTP(S) (Hypertext Transfer Protocol Secure): This protocol is supported by all adapters that allow communication
over HTTPS (for example, the IDoc adapter, the SOAP adapters, and the HTTP adapter).
You can protect communication using Transport Layer Security (TLS). In this case, a
symmetric key length of at least 128 bits is used (which is technically enforced). The default
length of asymmetric keys provided by SAP is 2048 bits.
Note
SAP Cloud Platform Integration supports TLS 1.1 and 1.2 for inbound and outbound
communication for all HTTP(S)-based channels.
Note
The HTTP receiver adapter also allows you to use HTTP URLs. However, we do not
recommend using this option when transferring confidential data (including the
password for basic authentication).
Also, if the network is not entirely trusted, there is no way to verify whether the result
of an HTTP request originates from a trustworthy source. Therefore, we do not
recommend using this option for productive scenarios over the Internet.
Receiver adapters also support principal propagation via SAP Cloud Platform Connector.
Various authentication options (basic authentication using user credentials, client certificates,
or OAuth) are supported depending on the selected sender or receiver adapter.
Caution
Consider that we do not recommend using basic authentication in productive scenarios,
because of the following security aspects:
Basic authentication carries the risk that authentication credentials, for example passwords,
are sent in clear text. Using TLS (transport-layer security, also referred to as
Secure Sockets Layer) as the transport-level encryption method (when using HTTPS as the
protocol) makes sure that this information is nevertheless encrypted on the transport
path. However, the authentication credentials might become visible to SAP-internal
administrators at points in the network where the TLS connection is terminated,
for example, load balancers. If logging is not done properly at such devices,
the authentication credentials might become part of log files. Network monitoring
tools used at such devices might also expose the authentication information to
administrators. Furthermore, the person to whom the authentication credentials belong
(in the example above, the password owner) needs to maintain the password in
a secure place.
SMTP (Simple Mail Transfer Protocol), IMAP (Internet Message Access Protocol): These protocols are
supported for the exchange of e-mails (in combination with the Mail adapter).
To authenticate against the e-mail server, you can send the user name and password in plain
text or encrypted (the latter only if the e-mail server supports this option).
Note
The (optional) password-based authentication only applies to communication between
the Cloud Integration system and the mail server. Communication between
mail servers is usually not authenticated. Therefore, you must not assume that data
received by mail comes from a trustworthy source, unless other security measures
(such as digital signatures at message level) are applied.
Message-Level Security
On top of the transport-level security options, you can also secure the communication at message level, where
the content of the exchanged messages can also be protected by means of digital encryption and signatures.
Various security standards are available to do this, as summarized in the table below.
To configure message-level security options, you use dedicated integration flow steps (for example, the
Encryptor and Signer step types).
The standards cover the encryption/decryption of message content and the signing/verification of the payload.
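Signing lets the receiver detect whether a payload was altered in transit. Cloud Integration implements this with its Signer/Verifier steps and established standards; as a minimal, stdlib-only illustration of the sign-then-verify idea (not the mechanism the platform uses), an HMAC over the payload:

```python
import hashlib
import hmac

def sign(payload: bytes, key: bytes) -> str:
    """Produce a detached signature over the message payload."""
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, key: bytes, signature: str) -> bool:
    """Accept the payload only if its signature matches."""
    return hmac.compare_digest(sign(payload, key), signature)

sig = sign(b"<order/>", b"shared-secret")
print(verify(b"<order/>", b"shared-secret", sig))           # unmodified payload
print(verify(b"<order/> tampered", b"shared-secret", sig))  # tampered payload
```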
Identity and access management features of SAP Cloud Platform are used during the lifecycle of an integration
scenario.
Note
These instructions are relevant only when you use SAP Cloud Platform Integration in the Cloud Foundry
environment.
Access Management
Dialog users who access the platform are authenticated against an identity provider. SAP Identity Service (ID
Service) is used by default. SAP ID Service is the central service for the process of managing identities and
their lifecycles.
Access to dedicated functions of the platform is controlled and protected by authorization checks. A number of
authorization groups are available to manage the authorizations of dialog users. An authorization group is
based on a persona and defines a set of dedicated permissions relating to the tasks that come into play during
the lifecycle of an integration project.
Note
Example:
If the logged-in user has to perform tasks such as designing and deploying integration flows, the user must
be assigned the authorization group AuthGroup.IntegrationDeveloper.
The tasks of persons with integration developer permissions (short: integration developers) constitute a key
part of the SAP Cloud Platform Integration lifecycle. Permissions for the integration developer (who is in charge
of modeling integration flows) are contained in the authorization group
AuthGroup.IntegrationDeveloper.
Note that the roles contained in this authorization group give an integration developer full control over message
processing during runtime.
During integration flow modeling, the integration developer defines how messages are mapped, which
credentials are used, and to which recipients messages are sent. The set of roles provides very powerful
permissions and in some cases allows the integration developer to access sensitive data.
The integration developer can control which credentials are to be used in connections with basic
authentication by deploying the associated User Credentials artifacts on the tenant. These artifacts contain
user names and passwords. Note, however, that a password specified in a User Credentials artifact is never
displayed. Furthermore, passwords cannot be downloaded (either via the user interface or the
application programming interface). The integration developer, although having full control over the
integration flow, does not have access to credentials of another tenant of the same customer.
Therefore, apply the following measures when designing integration flows for security-sensitive areas:
Tip
Instead of using the predefined authorization groups, you can tailor the permissions to your own
requirements by applying elementary roles that are defined for individual tasks.
More information:
When a sender system calls the integration platform using HTTPS-based (inbound) requests, there are
different ways for the calling sender to authenticate itself against the integration platform. The options are
basic authentication, OAuth, and SAML.
Note
● Authentication
Verifies the identity of the calling entity.
● Authorization
Checks what a user or other entity is authorized to do (for example, as defined by roles assigned to it).
In other words, the authorization check evaluates the access rights of a user or other entity.
3.4.1 Persona
When you perform user management tasks using SAP Cloud Platform Cockpit, you find a set of predefined
roles that you can assign to users of the account. According to the main tasks associated with integration
projects, these roles are associated with certain personas relevant for an integration project.
Note
● In the Neo environment, a persona is realized by an authorization group (beginning with the string
AuthGroup).
● In the Cloud Foundry environment, a persona is realized by a role collection.
Authorization Groups
The table of personas lists, for each persona, the corresponding authorization group (Neo) or role collection
(Cloud Foundry) and its typical tasks, which include the following:
● Monitoring integration flows and the status of integration artifacts
● Reading the message payload and attachments
● Deploying security content
● Deploying integration content (such as integration flows)
● Deleting messages from the transient data store
● Restarting subsystems of the tenant cluster
● Software development tasks on VMs of the tenant cluster
Note
System developer tasks are typically required in support cases by SAP experts who are supposed
to perform tasks like debugging on the tenant cluster.
Note
In order to enable a sender system to process messages on a tenant using HTTPS/basic authentication,
you need to assign the role ESBMessaging.send to the associated user. This role needs to be assigned to
each (technical) user that is supposed to connect to Cloud Integration.
The following table provides an overview of which roles are required to accomplish the various tasks
related to SAP Cloud Platform Integration. It also indicates to what extent the tasks and roles are relevant for the
main personas defined for Cloud Integration.
● In the Neo environment, a persona is realized by an authorization group (beginning with the string
AuthGroup).
● In the Cloud Foundry environment, a persona is realized by a role collection.
The mapping of the persona to the authorization groups (Neo) or role collections (Cloud Foundry) is described
in the Persona section.
In the different environments, the permissions to execute certain tasks are given by different objects.
The personas involved in these tasks are the tenant administrator, the business expert (for example, for
enabling and disabling trace), and the supporter/system developer. The roles that come into play include
the following:
● WebToolingWorkspace.Write
● GenerationAndBuild.generationandbuildcontent
● NodeManager.deploycontent, NodeManager.deploycredentials, NodeManager.deploysecuritycontent
● NodeManager.read, NodeManager.readcredentials, NodeManager.readsecuritycontent
● TransportModule.read, TransportModule.write
● AccessPolicies.Write
● ESBDataStore.read, ESBDataStore.delete, ESBDataStore.retry
● MessageProcessingLocks.Delete
● AuditLog.Read
● ConfigurationService.RuntimeBusinessParameterRead, ConfigurationService.RuntimeBusinessParameterWrite
Note
The role IntegrationContent.Transport is deprecated.
Note
These instructions are relevant only when you use SAP Cloud Platform Integration in the Cloud Foundry
environment.
Customer data stored at rest is strictly separated and isolated for each tenant. Although different tenants
might share a common physical infrastructure, each tenant stores its data in a separate schema.
For certain use cases, the customer can configure whether the data at rest is encrypted.
Message content can be stored encrypted. If this security measure is configured, the encryption key that is
generated automatically is unique for each tenant and is renewed periodically.
Data storage encryption uses AES and a key length of 256 bits. The encryption key is not stored in the same
location as the encrypted data.
The following kinds of data can be stored during the execution of an integration scenario:
● Message content
The runtime node writes message content data to the database in dedicated steps of an integration flow.
There is the option to either store message content for a longer time period (the default is 30 days) or
temporarily. Temporarily stored message content can be used for subsequent message processing steps.
Such steps can then also read message content from the database.
There is the option to configure the retention period of the message content.
● Monitoring data
During message processing, the runtime node also writes monitoring data to the database (which is stored
by default for 30 days). Monitoring data comprises the message processing log (MPL), which records the
executed processing steps.
Various types of customer data are processed by and stored on the integration platform at different times. This
data gets the highest level of protection, and SAP takes dedicated measures to guarantee this security level.
Note
These instructions are relevant only when you use SAP Cloud Platform Integration in the Cloud Foundry
environment.
General Information
Governments place legal requirements on industry to protect data and privacy. We provide features and
functions to help you meet these requirements.
Caution
SAP does not provide legal advice in any form. SAP software supports data protection compliance by
providing security features and data protection-relevant functions, such as blocking and deletion of
personal data. In many cases, compliance with applicable data protection and privacy laws is not covered
by a product feature. Furthermore, this information should not be taken as advice or a recommendation
regarding additional features that would be required in specific IT environments. Decisions related to data
protection must be made on a case-by-case basis, taking into consideration the given system landscape
and the applicable legal requirements. Definitions and other terms used in this documentation are not
taken from a specific legal source.
Caution
We assume that you have not maintained any data related to an individual in the tools provided by SAP
Cloud Platform Integration (for example, when using the Web UI to design integration content).
The knowledge of sensitive personal data lies exclusively with you and remains your responsibility.
The tools of SAP Cloud Platform Integration only use technical users or data without any references to
individuals.
User Consent
We assume that software operators, such as SAP customers, collect and store the consent of data subjects,
before collecting their personal data. A data privacy specialist can later determine whether data subjects have
granted, withdrawn, or denied consent.
Information Report
An information report is a collection of data relating to a data subject. A data privacy specialist may be required
to provide such a report or an application may offer a self-service. SAP Cloud Platform Integration assumes
that software operators, such as SAP customers, can provide such information.
When handling personal data, consider the legislation in the different countries where your organization
operates. Once the data has reached its end of purpose, regulations may require you to delete it. However,
other regulations may require you to keep the data longer. In that case, you must block access to the data
for unauthorized persons until the end of the retention period, when the data is finally deleted.
Data stored on the SAP Cloud Platform Integration platform is only stored for a limited time period (referred to
as retention time).
For more information on the retention times for the various kinds of data stored by SAP Cloud Platform
Integration, see Specific Data Assets [page 80].
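The block-then-delete lifecycle described above can be expressed as a small state function. This is a conceptual sketch under assumed names; it is not a platform API, and the dates would in practice come from the applicable retention policy.

```python
from datetime import datetime

def access_state(end_of_purpose, end_of_retention, now=None):
    """Classify a data record according to the block-then-delete lifecycle.

    Before the end of purpose, the data is in regular use; afterwards,
    access for unauthorized persons must be blocked; once the retention
    period ends, the data is deleted. State names are illustrative.
    """
    now = now or datetime.utcnow()
    if now < end_of_purpose:
        return "active"
    if now < end_of_retention:
        return "blocked"
    return "deleted"
```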
Different kinds of data, such as message content or monitoring data, can be stored during the operation of an
integration scenario.
Such data must be considered sensitive, as it can contain personal information. The following list
provides examples:
● Message content
Messages processed on a runtime node typically contain business data of an integration scenario and
therefore can contain sensitive customer data such as addresses, names, or financial information.
When this data is at rest, it can be stored encrypted. Note, however, that in some use cases the customer
can configure the data to be stored unencrypted.
When this data is in transit, several protective measures can be applied, such as digital message signing or
message content encryption.
● Monitoring data
The message processing log records the processing steps of an integration flow. Only users assigned to
this tenant and with dedicated permissions can access this data.
● Other data, such as the content of log files
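One of the in-transit measures mentioned above, digital message signing, lets the receiver detect tampering with the message content. The sketch below uses an HMAC over SHA-256 as a stand-in for the general idea; the actual signature standards supported by the platform (for example, asymmetric schemes) work differently, and the function names here are assumptions.

```python
import hashlib
import hmac

def sign_message(payload: bytes, key: bytes) -> bytes:
    """Compute an integrity tag over the message payload.

    Illustrative only: an HMAC stands in for a real signature scheme so
    that a modified payload no longer matches its tag.
    """
    return hmac.new(key, payload, hashlib.sha256).digest()

def verify_message(payload: bytes, key: bytes, tag: bytes) -> bool:
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(sign_message(payload, key), tag)
```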
Note
Personal data processed by and stored on the integration platform is handled according to the Data
Processing Agreement, which you can find at http://www.sap.com/about/agreements.html under SAP
Cloud Services Customers.
Due to the tenant isolation concept, data from different customers (stored in different tenants) is strictly
isolated. Additionally, SAP has no access to data stored in customer tenants.
The customer can grant people outside its organization permissions to execute specific tasks on its cluster (for
example, to SAP employees to execute error analysis tasks in support cases).
For more information, see the document SAP Cloud Platform Security: Trust Matters under Data
Governance and Legal Compliance.
Different kinds of data are stored in the SAP Cloud Platform Integration infrastructure during the lifecycle of an
integration project.
Data Assets

● Integration flow tracing data
Content: Information on the message flow (including the message payload) and on errors that occurred during message processing
Storage location: Trace store
Type of data: Log data, business data
Retention time: 60 minutes

● Data stored by the Data Store Operations step
Content: Message content stored in dedicated steps in an integration flow (contains information such as message GUID, message processing log GUID, tenant ID, time stamp, and payload)
Storage location: Data store
Type of data: Business data
Retention time: Can be defined by the integration developer (default value: 90 days)

● Data stored by the Persist step
Content: Message content stored in dedicated steps in an integration flow (contains information such as message GUID, message processing log GUID, tenant ID, time stamp, and payload)
Storage location: Message store
Type of data: Business data
Retention time: 90 days
Cloud Integration provides user interfaces for designing and deploying message flows, and monitoring them at
runtime.
A Web tool (Web UI) is available to accomplish these tasks. The Web UI is implemented using JavaScript and
HTML (UI5).
This user interface is built to prevent vulnerabilities such as cross-site scripting (XSS) and cross-site request
forgery (XSRF). The built-in security capabilities of these technologies are used together with secure design
and coding principles.
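The XSRF protection mentioned above commonly works by binding a secret token to the user's session and requiring it on state-changing requests. The sketch below shows that pattern in a minimal form; it is not the UI5 implementation, and the secret handling and function names are assumptions.

```python
import hashlib
import hmac
import secrets

# Per-deployment server secret; illustrative. In practice this would be
# managed by the platform, not a module-level variable.
SERVER_SECRET = secrets.token_bytes(32)

def issue_xsrf_token(session_id: str) -> str:
    """Derive a token bound to the session, so the server need not store it."""
    return hmac.new(SERVER_SECRET, session_id.encode(), hashlib.sha256).hexdigest()

def check_xsrf_token(session_id: str, token: str) -> bool:
    # A forged cross-site request fails here because the attacker cannot
    # read the token issued to the victim's session.
    return hmac.compare_digest(issue_xsrf_token(session_id), token)
```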
You cannot use application programming interfaces (APIs) in the Cloud Foundry environment to access
certain functions of Cloud Integration.