Best Practice - Data Consistency Check For Logistics PDF
Contents
1. Introduction .............................................................................................................................3
1.1 Applicability, Goals, and Usage of this Document.................................................................3
1.1.1 Goal of this Best Practice Document and Assessment Guide....................................3
1.1.2 How to use this Best Practice ...................................................................................3
1.1.3 Legend.....................................................................................................................4
2. General Best Practice Procedure for Inconsistencies...............................................................5
2.1 Background Information.......................................................................................................5
2.1.1 Definition of Terms and Types of Inconsistencies ......................................................5
2.1.2 Typical Root Causes for Inconsistencies...................................................................6
2.2 Overall Procedure................................................................................................................9
2.3 Assessment Guide.............................................................................................................13
2.3.1 Understanding the Situation and Inconsistency.......................................................13
2.3.2 Process Understanding ..........................................................................................15
3. Tools and Procedures within an SAP Environment.................................................................18
3.1 Tools to Check Consistency Between ECC/CRM and Within CRM .....................................18
3.1.1 Data Exchange for the Most Common CRM Scenario - CRM Sales........................18
3.1.2 Data Exchange for the CRM Scenario CRM Mobile ................................................19
3.1.3 DIMa – The Tool to Detect and Repair Inconsistencies Between SAP ECC and SAP CRM .........19
3.1.4 Analysis and Correction Reports in Data Exchange of One Order Documents Between SAP CRM and R/3 .........20
3.1.5 Miscellaneous Check, Correction and Repair Reports in CRM ................................22
3.2 Tools to Check Consistency Between ECC/SCM and Within SCM......................................23
3.2.1 Introduction ............................................................................................................23
3.2.2 Product Allocation (PAL): Internal Inconsistencies Within SCM or Between ECC/SCM .........23
3.2.3 Time Series of DP and SNP ...................................................................................25
3.2.4 Shipping and Vehicle Scheduling............................................................................25
3.2.5 Integration Models (External Consistency)..............................................................25
3.2.6 Internal Consistency Check for SCM regarding LC <-> DB .....................................25
3.2.7 Consistency at Interface: Transaction Data (External Consistency) .........................26
3.2.8 Temporary Quantity Assignments (TQAs) ...............................................................27
3.2.9 Transaction Data: All Order Types and Stock (External Consistency) ......................33
3.2.10 Sales order requirements and SD mapping tables (external consistency)................35
3.3 Tools to Check Consistency within ECC.............................................................................37
3.3.1 Tools for Processes Involving SD and LE ...............................................................37
3.3.2 Tools for Processes Involving MM...........................................................................47
3.3.3 Inconsistencies between MM and FI.......................................................................49
3.3.4 Tools for Processes Involving WM ..........................................................................51
3.3.5 Tools for Processes Involving PP............................................................................52
3.3.6 Tools for Processes Involving PS............................................................................53
3.3.7 Tools to Check the Logistics Information System ....................................................54
3.4 Tools not Depending on a Particular System ......................................................................55
3.4.1 Tools for ALE Consistency Check ...........................................................................55
3.4.2 Tools for ALE Recovery ..........................................................................................55
3.4.3 Generic Consistency Check for Two Linked Tables in One System .........................56
3.4.4 The Generic External Compare ..............................................................................60
4. Appendix...............................................................................................................................61
4.1 General Roadmap for Analysis...........................................................................................61
4.2 Dos and don’ts for Data Consistency .................................................................................64
4.3 Advantages and Disadvantages of Different Recovery Methods .........................................64
4.4 Further Information ............................................................................................................67
4.4.1 Related information ................................................................................................67
4.4.2 Troubleshooting .....................................................................................................67
© 2008 SAP AG
1. Introduction
1.1 Applicability, Goals, and Usage of this Document
To ensure that this Best Practice is the one you need, consider the following goals and
requirements.
1.1.1 Goal of this Best Practice Document and Assessment Guide
This Best Practice provides information on how to monitor data consistency and on what you
should do once an inconsistency is reported. The document provides the general monitoring
procedure as well as details for several areas where potentially business critical
inconsistencies could occur. The more detailed sections should facilitate the use of available
consistency check tools by providing the general description and logical design of such
reports including relevant business data needed for some frequent instances of
inconsistencies. The goal is to enable you to use them and to understand the guiding
principles behind these tools, so that they may be available as templates for specific cases
involving non-SAP systems.
The Best Practice document is divided into two major areas: Section 2: General Best
Practice Procedure for Inconsistencies provides background information about typical root
causes of inconsistencies, as well as the basic questions you need to ask to identify the most
likely root causes for the specific case you encounter. Section 3: Tools and Procedures
within an SAP Environment describes the most important tools available from SAP and how
they can be used to monitor data consistency on a regular basis and correct possible
inconsistencies. Section 4: Appendix provides additional information like analysis roadmaps
and a comparison of different recovery methods.
In addition to the consistency check methods, the Best Practice document provides basic
deciding factors for recovery methods to determine which is preferable in case of data loss
and how to evaluate them for a given case. Please check the Best Practice Document
“Business Continuity Management for SAP System Landscapes” for a more specific and
detailed view on the area of data recovery and business continuity.
1.1.3 Legend
This symbol indicates a paragraph from where you can navigate to another section of this
document for more detailed information on a topic.
This symbol indicates a paragraph from where you can navigate to another document within
the SAP Service Marketplace for more detailed information on a topic.
This symbol indicates an important paragraph where you find special hints or best practices.
4
© 2008 SAP AG
5
Best Practice: Data Consistency Check
Any good consistency check tool should compare current business object instances between
the two systems, A and B, display all differences categorized into the three difference types
and – if possible – provide drilldown functionality into the exact different fields for difference
type 1.
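The logic of such a generic compare can be sketched as follows. This is a minimal illustration, not an SAP tool: it assumes each system can expose its business objects as a dictionary keyed by object ID, and all names in the example are invented for illustration.

```python
def compare_systems(objects_a, objects_b):
    """Compare business object snapshots from systems A and B.

    objects_a / objects_b: dict mapping object ID -> dict of field values.
    Returns the three difference types: objects missing in B, objects
    missing in A, and objects present in both whose field values differ
    (with a drilldown into the exact differing fields).
    """
    ids_a, ids_b = set(objects_a), set(objects_b)
    missing_in_b = sorted(ids_a - ids_b)
    missing_in_a = sorted(ids_b - ids_a)
    field_diffs = {}  # drilldown: object ID -> list of (field, value_a, value_b)
    for oid in ids_a & ids_b:
        a, b = objects_a[oid], objects_b[oid]
        diffs = [(f, a.get(f), b.get(f))
                 for f in sorted(set(a) | set(b))
                 if a.get(f) != b.get(f)]
        if diffs:
            field_diffs[oid] = diffs
    return missing_in_b, missing_in_a, field_diffs

# Hypothetical example: order 4711 differs in one field, 4712 exists only in A
sys_a = {"4711": {"qty": 10, "plant": "0001"}, "4712": {"qty": 5, "plant": "0001"}}
sys_b = {"4711": {"qty": 12, "plant": "0001"}}
only_a, only_b, diffs = compare_systems(sys_a, sys_b)
```

The real check tools described later in this document follow the same pattern, but extract the snapshots via the respective interfaces (middleware, plug-ins) rather than from in-memory dictionaries.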
Three different cases are usually distinguished when investigating inconsistencies:
1. Inconsistencies between the real world and a system
2. Inconsistencies between two systems
3. Inconsistencies within one system
Since the data must be compared between the two environments regardless of their nature,
and since the investigation needs to be based on tools for easy handling, the general
procedure to identify the inconsistencies and the data correction will always take place in a
computer system (in case 1, the real world will always be the “leading system”). Thus, a
unified overall procedure can be applied, and we will not distinguish between the three cases
unless it is essential for a specific procedure or for understanding how to apply a procedure
in the given environment.
[Figure: Example process flow between ECC and a WMS — Create Sales Order and Create Delivery in ECC; Perform Picking, Print Labels, and Perform Packing in the WMS; the delivery is updated in ECC via interface IF1; Create Invoice in ECC.]
[Figure: Replication error example — Object A (Value1, Value2) is replicated to System B; a replication with error produces an incorrect Value2 on System B, which a subsequent error correction fixes.]
example, from MD1) is overwritten by outdated data from the other system during an
assessment. This situation is especially common for master data distribution
[Figure: Master data distribution example — material data is created/changed in MD1 and MD2, extended data is created in PD1 and PD2, and the extended data is exchanged between the systems.]
[Figure 2.4: Logical mapping between two processes (Process1 and Process2) — a difference is reported between process steps n and m; corresponding process steps (for example, 2a and 2b, 4a and 4b) are connected by logical mappings, while the lowest (underlying) data levels are connected by a technical mapping.]
Best Practice would be to document all this data already during the implementation
project of the core business processes and to think about possible impacts.
Once the first step of the inconsistency investigation - the understanding of the involved
business processes and data derivation processes - has been finished, the root cause of the
inconsistency and whether a real inconsistency exists have to be identified. The next step is
therefore to check for the common root causes in the steps involved directly with the
inconsistent data. The general rule here is not to dig straight into a technical analysis but first
to rule out the more operational root causes unless the inconsistency is reproducible.
A common root cause for inconsistencies is inadequate application and interface monitoring
and error handling (see chapter 2.1.2). This means that investigation of interface monitoring and error
handling should be an important part of the assessment, especially if different systems are
involved. Once the relevant interfaces and business process steps have been identified by
the recording of the involved processes, we need to determine how these are monitored
(application monitoring and interface monitoring) and whether erroneous data is contained in
the system. Besides additional, detailed questions described in the next section in table 5,
the system should be checked for old erroneous data using available standard transactions
(for example, within an SAP system ST22 for short dumps, WE02 for IDocs in error state,
and so on). The exact transactions available to check for this data depend on the interface
technique, the business process steps, and the system itself. If data in error state is found,
the handling of such incorrect data should be discussed within your monitoring team. If
incorrect handling has been identified (for example, reprocessing of BDocs or Updates
without checking that this data is still valid), appropriate error handling procedures need to be
derived.
If the reported differences cannot be explained by error handling procedures and are not
temporary differences, the logical mapping of corresponding steps including an
understanding of the underlying technical data as reached during the assessment should
now be used to verify the data consistency on the lowest technical level available (in our
example: comparison of S-table versus InfoCube). The investigation needs to be started on
the lowest level as higher level, derived data will always show inconsistencies if the lower
level data is already corrupted, even if the involved intermediate processing steps are
correct. Check reports are needed to compare this data. It is important that temporary
differences are filtered out during the use of check reports by using a time window where no
system activities are executed affecting the involved data. Filtering out differences means
that all update tasks or interface processing for the relevant data should be finished and that
no new data is created during the time of the analysis. If such a time window is not available,
the comparison has to be repeated and results need to be compared between the different
runs. Temporary differences will disappear between different runs of the check reports/tools
while real inconsistencies will remain in the result for all runs of the same report. An example
would be to compare the information between the S-Table and the InfoCube at a time where
both systems are in sync and no additional data is written to the S-table during comparison. If
this is not taken into account, the S-table may contain newer data missing in the InfoCube
which would be corrected with the next extraction.
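The repeated-run approach can be sketched as follows: only differences that persist across every run of the check report are treated as real inconsistencies, while temporary differences (data still in update tasks or queues during one run) drop out of the intersection. A minimal sketch under the assumption that each run yields a set of keys of the objects reported as different; the key names are hypothetical.

```python
def persistent_differences(runs):
    """Intersect the difference sets of several check-report runs.

    runs: list of sets, each containing the keys of objects reported as
    different in one run. A key that appears in every run is considered a
    real inconsistency; keys that disappear between runs were temporary
    differences (data still in flight during one of the runs).
    """
    if not runs:
        return set()
    result = set(runs[0])
    for run in runs[1:]:
        result &= run
    return result

# Run 1 catches order 2 while its update task is still open; by run 2 the
# queues have been processed, so only order 1 remains a real inconsistency.
run1 = {"order-1", "order-2"}
run2 = {"order-1"}
real = persistent_differences([run1, run2])
```

Note that this filters only differences that are transient between runs; if new postings keep arriving on the compared data, the runs themselves must still be scheduled in quiet periods as described above.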
Sometimes, the situation arises that inconsistencies exist in very old data. In these cases, the
old data should be corrected, by reloading or by correction reports whenever possible, before
attempting to identify the root cause: old inconsistencies, whose root cause has probably
been fixed already, may obscure the root cause of new inconsistencies, or even whether
inconsistencies exist at all.
Possible reports and methods to correct the outdated data can be found in section 3:
Tools and Procedures within an SAP Environment.
When temporary differences have been filtered out and inconsistencies have been identified
in the low-level data, the business process steps leading to that data need to be traced and
debugged, as a program error is most likely. In our given example of S-table versus
InfoCube, both the extractor filling the InfoCube and the update routines for the S-table need
to be investigated. Programming errors leading to a mismatch
between both data sets could be purely technical (for example, an incorrect sign in some
formula), or logical, if the data calculation rules are incorrect (for example, using the
document date instead of a posting date). If temporary differences are ruled out and no
inconsistencies are found in the underlying data, the dependent derived data needs to be
investigated from bottom level to top level. It is important that only data mapped logically is
compared at this stage and no intermediate steps are taken into account. Using the example
of a logical mapping displayed in Figure 2.4: once the processes involved in the technical
mapping of the underlying data have been ruled out as the root cause, steps 2a-2c, 3c-3b, and
4a-4b should be compared. If step 2a were compared to step 2b simply by following the order
of steps, without taking into account whether a logical connection exists between the two
different data sets, the result of the consistency verification between process1 and process2
would be meaningless at this stage, as data would be compared that is not comparable by
definition.
The process chain should be investigated from bottom to top using appropriate means like
check reports or queries until the first real occurrence of inconsistencies has been identified.
Once an inconsistency between two logically connected steps has been found, the technical
process steps involved between these steps need to be debugged and investigated to
identify the technical root cause, such as transactionally incorrect interfaces, a technical
programming error, or a logical error.
The root cause for the inconsistency needs to be corrected once it has been identified. The
appropriate measure to do so depends on the nature of the root cause. If a coding error was
identified as the root cause, the coding needs to be corrected. On the other hand, if the issue
was caused by the operation of the solution, appropriate measures would be, for example, to
train your operations team (for example, in a system administration workshop) and to
adapt your operational procedures.
Details regarding typical root causes, the distinguishing factors between these, and
appropriate solutions can be found in section 2.1.2: Typical Root Causes for
Inconsistencies.
How dependent data is affected needs to be investigated after the root cause has been
identified and corrected. This could be either consolidated reporting data or follow-up errors
like incorrectly created documents. The business processes using the original data identified
as inconsistent have to be followed to understand the impact of an inconsistency in one
business object on dependent business objects. If follow-up documents are created (for
example, creation of controlling or financial data from logistics data), these need to be
corrected as well using appropriate measures.
To recover or correct lost and incorrect data sets, several possibilities exist:
Restore into a parallel system and reload of data from the parallel system by RFC,
LSMW and so on
Reload of data from a leading system
Use correction tools to recover data by relationships to redundant stored data
Manual correction
Complete restore of a backup / point-in-time recovery
Very often, a combination of data recovery methods and tools is required, for
example, individual incorrect sales documents could be corrected manually on the
database, and dependent data could be corrected afterwards by correction reports.
Each of the different methods has certain advantages and disadvantages leading to
different use cases.
You can find a more detailed overview about these methods in Appendix 4.3:
Advantages and Disadvantages of Different Recovery Methods.
Important questions that influence the choice of recovery method and therefore need
to be discussed are:
Does dependent data exist for the inconsistent/lost data?
Could these be used for a reconstruction of data?
How many Business Objects and how many instances are affected?
What quantity/complexity of objects is affected?
Is a backup available for a point-in-time recovery?
How much time would the different methods require?
Questions Answers
Which business objects are affected?
Describe the differences and your
expectations.
Background: Sometimes a difference
may be reported due to misinterpretation
of the data’s meaning.
Which business processes are affected?
Which business process steps are
affected?
Describe the business impact.
Background: The business impact is an
important factor to decide whether you
should continue working in the system
until the differences are resolved.
Table 2.1: Questions to Understand the Business Impact and Severity
The results of these questions determine the business processes and the business process
steps which have to be investigated in more detail. Furthermore, the determined impact is
one of the most important factors in deciding how business operations should proceed.
Once the severity of the inconsistency has been established and it is known which business
processes are affected, the next step should be to investigate the inconsistency detection
process and the inconsistency in more detail. Questions to help with this task are collected in
the following table.
Question Answer
How did you notice and identify the
inconsistent behavior?
If observed by consistency check tools:
Which transactions/reports did you use?
Is the difference reproducible or did it
occur only once?
If it is reproducible:
Are the same inconsistencies reported?
How did you see that they are the same?
Did you observe any rules between
inconsistencies?
Background: If rules are observed, they
provide important information about the
nature and possible root cause of the
inconsistency. If it is observed at certain
times only, it could indicate differences
due to performance bottlenecks; if it is
only observed for certain subsets of one
business object (for example, only
certain storage locations), this could
indicate a mapping error, and so on.
Do the inconsistencies disappear and
appear again after some time?
Table 2.2: Questions for a First Understanding of the Inconsistency
The results of these questions establish whether we have an inconsistency or only a
temporary difference between two data sets. Any rules/connections identified between the
observed inconsistencies need to be considered in the detailed business process root cause
investigations. The next questions will collect more details on the process and provide the
technical data background to verify the correctness of the data mapping and inconsistency
checking process. At this stage, data is collected to function as starting point for the technical
business process and inconsistency investigation as well as for a later correction.
Question Answer
What are the input parameters and what
did you expect?
If observed by a custom made
report/transaction:
How does the report/transaction work
technically?
Which tables and table fields are
compared?
If between different systems:
How is the data mapped?
location or mapping document date to
posting date. You should evaluate the
meaning of table names and fields as
well as a description of the data the user wants
to use for the SAP systems.
Table 2.3: Detailed Questions to Improve the Understanding of the Inconsistency
The description of data mapping needs to be used for new check tools or may be entered
directly into the generic report given in section 3.4.3 if two linked tables in one system are
involved.
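The idea behind such a generic check of two linked tables (for example, document headers and their items) can be illustrated as follows. This is a sketch under the assumption that the two tables are linked via a simple key field; the table and field names are invented for illustration and this is not the actual report referred to in section 3.4.3.

```python
def check_linked_tables(headers, items, key="doc_id"):
    """Generic consistency check for two linked tables in one system.

    headers / items: lists of rows (dicts), linked via the `key` field.
    Reports item rows whose header row is missing (orphans) and header
    rows that have no item rows at all.
    """
    header_keys = {h[key] for h in headers}
    item_keys = {i[key] for i in items}
    orphan_items = [i for i in items if i[key] not in header_keys]
    childless_headers = sorted(header_keys - item_keys)
    return orphan_items, childless_headers

# Hypothetical data: item for document "C" has no header, header "B" has no items
headers = [{"doc_id": "A"}, {"doc_id": "B"}]
items = [{"doc_id": "A", "item": 10}, {"doc_id": "C", "item": 10}]
orphans, childless = check_linked_tables(headers, items)
```

Whether a childless header actually constitutes an inconsistency depends on the business object; the data mapping description collected in Table 2.3 determines which of the reported cases are relevant.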
2.3.2 Process Understanding
Sometimes an error in the program/interface logic or a mismatch between process design
and system usage leads to inconsistencies, especially when inconsistencies between
different systems or between the system and the physical world are observed. Thus, the
understanding of the design of the core business processes which could lead to the
observed inconsistency forms an essential part of finding the root cause, as you have to map
the process design to the system’s use in the real physical world. A detailed description of
the end-to-end process across all systems and applications has to be recorded during this
business process analysis if not documented in your company already. It is important to get
an overview of the business in general and of why the chosen process has its particular priority.
For each process and process step the following information must be known:
Has a process owner been defined for this process?
Are there any other known critical problems/issues?
Short description of the process step
Important technical objects used in the process step (transactions, programs, tables)
Online/background execution of transactions
In addition to these general questions, the following questions, which are especially relevant
to inconsistencies, have to be answered. They are used to cross-check whether an
operational issue (error handling, monitoring) or process design issue could cause the
detected inconsistency.
Question Answer
Has a new system or process been
introduced recently?
Did the end user training contain
instructions how to handle exceptions?
If yes: What exceptions in this area have
been included?
Were the users accustomed to another
system/transaction/report for similar
tasks in the past?
What testing activities (user acceptance /
integration / volume) have been
performed?
Table 2.4: Questions to Assess the System Usage
The next block of questions should be used if more than one system is involved. They are
intended to determine whether the inconsistency could be caused by several leading
systems, monitoring and error handling in the area of interface monitoring, or whether an
investigation toward transactional correctness should be started.
Question Answer
Is more than one system involved in the
business process steps?
If yes:
What are the roles of the involved
systems? Which one is the leading
system?
Background: Sometimes no clear leading
system is defined. In these cases, a high
effort is needed to keep data consistent
between the systems as changes in one
system could be overwritten by changes
in another system. Common cases are
master data development systems.
Describe the relevant business process
steps affected by the observed
inconsistency
Are interfaces to other systems involved
in these steps or in prior steps regarding
the used business data?
If yes, how do you perform reconciliation
between the involved systems?
Background: Sometimes the temporal
order of reconciliation reports does not fit
the timing of the business process steps.
For example, when running an MRP in
an external legacy planning system, the
current requirements and stock need to
be synchronized with the SAP ERP
backend.
What interface technology do you use
between the two systems?
Is custom-made coding used?
Do you trigger more than one step by
one call of the interface?
What measures have been taken to
ensure that the interfaces are
transactionally correct?
Please describe your monitoring and
error handling concept.
Background: Especially if more than one
step is triggered by an interface using
custom code, the transactional
correctness could be compromised.
Even when the transactional correctness
is ensured, incomplete monitoring and
error handling could lead to data loss.
Using the example of running MRP in an
external legacy planning system,
unobserved errors in the transfer of
requirements and stock will lead to
incorrect planning results due to
inconsistencies between the systems.
Hint: If one of the first two questions is
answered yes, you should involve an
interface or ABAP expert skilled with
transactional correctness investigations
in the next steps or order an Interface
Management Service.
Table 2.5: Questions for Processes Involving Several Systems
Also, errors in middleware like items deleted in queues can result in inconsistencies between
CRM and R/3. To deal with such inconsistencies proactively, interface monitoring is
advisable.
As CRM often uses filters between R/3 and CRM, special care must be taken, as changing
these filter conditions can result in inconsistencies. For instance, if the filter conditions on the
R/3 side are widened, business objects may be sent as delta messages that do not have a
corresponding object on the CRM side; in this case, an initial load of the respective business
objects must be performed.
3.1.2 Data Exchange for the CRM Scenario CRM Mobile
In the CRM scenario CRM Mobile, laptop users (mobile clients) exchange data (business
objects) with a central CRM system via a complex middleware. In this scenario, the CRM
system has a special database for mobile client data processing: the CDB (consolidated
database). Inconsistencies can occur between the CDB and the central database of the CRM
system as well as between the CDB and the mobile clients. The DIMa can be used to
analyze inconsistencies between the central database of the CRM system and the CDB.
Be aware that inconsistencies in a CRM environment can be temporary. Business
objects are exchanged asynchronously in queued RFCs of the middleware. If you
detect a difference for a business object, the changes that rectify the difference might
already be in the queues but not yet processed by CRM. On the next run of DIMa, this
temporary difference would not occur anymore as the changes would be processed from the
queues. Only persisting differences after several runs of DIMa can be considered as real
inconsistencies.
Two common cases of inconsistencies exist between the mobile clients and the CRM server.
1. Data is missing on mobile clients but is present in the CDB. In this case, the
administrator corrects the inconsistency by performing an extract of the business
object in question using transaction SMOEAC for the respective mobile client.
2. A more serious inconsistency is that data changed on the client is not processed
properly by the CRM server because the server rejects the change. These
inconsistencies are only detected in rare cases as the respective BDocs have the
status finished (F02). To detect such inconsistencies, the system administrators
should monitor for BDocs with state F02 in SMW01/SMW02 transaction. As well on
the mobile clients, rejected objects can be found by navigating to the CRM inbox
tileset. If a rejected change is detected, it can be further analyzed why the rejection
occurred. Then, either correction at CRM server level or corrections on the mobile
client have to be performed in order to reprocess the BDoc successfully.
As the outbound queues can be huge, especially after a rollout of mobile clients or
mass updates triggered in R/3, outbound queues are sometimes deleted, which results in
inconsistencies. To correct these inconsistencies, an extract of the business objects for the
respective clients has to be performed.
3.1.3 DIMa – The Tool to Detect and Repair Inconsistencies Between
SAP ECC and SAP CRM
The Data Integrity Manager (DIMa) helps to detect and repair inconsistencies between
databases in the SAP CRM system, the R/3 backend and SAP CRM’s mobile part. It
supports consistency checks for the most commonly used business objects: business
partners, products (materials), sales documents and pricing information. The DIMa uses the
SAP CRM middleware and the CRM R/3 plug-in to carry out the extraction and comparison
of business objects. In addition to detecting inconsistencies, DIMa also provides the
possibility to correct them.
For further details about the use of DIMa, refer to the DIMa documentation and the online
help.
You should execute DIMa for the most important business objects on a regular
basis. The monitoring object “DCCRMDIM” may be used to monitor these regular
runs within the SAP Solution Manager.
For the CRM Mobile scenario as well, internal inconsistencies in lookup tables or extract
queues for intelligent and interlinkage objects can be corrected by using the report described
in SAP note 622693. The corresponding correction report is
Z_30/40_EXTRACT_LUTAB_EXTRACTED_F. Refer to SAP note 622693 for further
details.
Note that this report performs a neutral PAL check, so an overconfirmation might occur;
this should be taken care of by a BOP run as a second step. For further information,
see the F1 help on the PAL checks in report /SAPAPO/SDRQCR21.
3.2.2.4 Overview of Notes for PAL Consistency

Component          | SAP Note | Description
SCM-APO-ATP-BF-PAL | 676128   | Product allocations: control of product allocation assignment
SCM-APO-ATP-BF-PAL | *        | All *PAL notes from year 2004: note search with ERP and SCM patch level for "product allocation"

Table 3.4: Notes for PAL Consistency
SAP recommends that you perform consistency checks during a posting-free period
of time. If you cannot do this, it is possible that inconsistencies between liveCache
and the SAP APO database that existed only briefly will be displayed.
In this case, you should re-check the displayed inconsistencies (Ctrl+F2 or the "Check
again" button). If the inconsistencies are still displayed after the re-check, you can assume
that they did not just exist temporarily, and use the appropriate transaction to correct them.
It is recommended to execute the program in the background (F9, or menu path
Program -> Execute in Background), as the functionality "Evaluate Last Background Job"
can then be used afterwards. See SAP note 425825 for further information on this report.
As of APO release 3.1, several consistency check reports, such as transaction
/SAPAPO/REST02 for resources, were integrated into this transaction. During this
reimplementation the scope was enhanced, so it is recommended to use the functionality
within /SAPAPO/OM17.
The monitoring object “DCAPOM17” exists within the SAP Solution Manager’s
Business Process Monitoring functionality to facilitate the regular monitoring with
OM17.
Besides the reactive CIF-specific monitoring, you can also utilize proactive online
monitoring and alerting within the SAP Solution Manager’s Business Process
Monitoring framework. The appropriate monitoring object for qRFC monitoring is
IMQRFCMO.
3.2.7.1 Example
CIF post processing displays SCM inbound queues that cannot update SCM. The queues
relate to VMI Sales Order 4185622/10, 4185622/10 and 4186738/20. The delta report
detects this error:
When the queues are reactivated, the following error message is found in the SCM
application log in transaction /SAPAPO/C3: “Location 0001065469 does not exist”:
The main issue that makes it very complicated to find the root cause of non-cleared TQAs is
that the originating transaction cannot be seen in monitoring transaction /SAPAPO/AC06;
only the transaction GUID is visible. The transaction GUID (TRGUID) is a key for temporary
objects that have been created during an availability check. All these objects carry the
corresponding transaction GUID and can therefore be identified by it. However, the
transaction GUID is neither saved in R/3 nor linked to the transaction in a special table.
The regular monitoring of this report may be facilitated with the monitoring object
“DCAPOCCR” in the SAP Solution Manager’s Business Process Monitoring
Framework.
It is strongly recommended to apply SAP notes 496779 and 488747, which introduce
the iteration functions for the CIF_DELTAREPORT3.
Because of long delta report runtimes, objects may have been changed in the meantime and
may no longer be inconsistent. Temporary data inconsistencies occur during the transfer of
data between SAP APO and SAP R/3 and disappear again after the transfer. Note, however,
that in the case of lengthy transfers, such inconsistencies may still be displayed as errors.
You should therefore use the iteration for the online comparison, for the comparison in the
background, and when saving and loading results.
When using the online comparison, the user should interactively compare the displayed
incorrect objects from the result list again (iteratively) after DELTAREPORT3 has performed
a comparison and displayed the result. Executing the iteration several times in a row further
minimizes the number of probable errors caused by temporary data inconsistencies.
You should set the indicator Iteration for the comparison in the background, where
intermediate results are saved and loaded for comparison. Temporary errors and already-
solved errors are then cleansed and no longer appear in the result list. The update of the
errors is triggered manually as the next step.
You can also use the iteration after the reconciliation to check whether the correction was
successful. In this way, you can determine whether an error still exists after the
reconciliation or not.
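The effect of the iteration can be illustrated with a small sketch (plain Python, not SAP code; the order numbers are invented): only objects reported as inconsistent in every pass are treated as real errors, while objects that vanish between passes were only temporarily inconsistent.

```python
# Illustrative sketch of the iteration idea: re-run a consistency check and
# keep only the errors that persist across every pass.

def iterate_check(run_check, passes=3):
    """run_check() returns the set of object keys currently inconsistent.
    Only keys reported in *every* pass survive the iteration."""
    errors = run_check()
    for _ in range(passes - 1):
        errors &= run_check()  # keep only errors still present
    return errors

# Hypothetical check results: order "4711" is genuinely inconsistent,
# order "4712" was only caught mid-transfer in the first pass.
results = iter([{"4711", "4712"}, {"4711"}, {"4711"}])
persistent = iterate_check(lambda: next(results))
```

After three passes, only the genuinely inconsistent object remains in the result set, mirroring how the iteration cleanses temporary errors from the result list.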
3.2.9.2 Recommendation
When using /SAPAPO/CIF_DELTAREPORT3 for sales orders, setting the flag “Use
Table VBBE for Sales Order Comparison” is recommended for much better performance.
If this variant is selected, you have to run report SDRQCR21 on the SAP R/3 / ERP side in
simulation mode first (see also paragraph 3.3.1.5) to ensure that the requirements situation
is correct in the primary system.
If incorrect requirements exist, run SDRQCR21 again with data transfer before executing
/SAPAPO/CIF_DELTAREPORT3 on the APO/SCM side. If you skip this step, inconsistencies
from ERP are likely to be transferred to SCM and to generate a business impact: BOP runs
could be executed on a defective data basis, resulting in ATP overconfirmations (= negative
ATP) or ATP underconfirmations.
Figure 3.19: Result of /SAPAPO/CIF_DELTAREPORT3
You can always run report /SAPAPO/SDRQCR21 on demand, selecting as little data as
required, for example only the affected sales order numbers.
Example: The document flow indicates a missing delivery. Before attempting to correct the
situation, the question is whether the delivery itself is missing or whether the information in
the document flow is incorrect. In the latter case, the document flow needs to be corrected;
in the former case, a delivery needs to be recreated. Attempting to correct the document
flow when the delivery itself is missing will cause inconsistencies with potentially disastrous
side effects.
NONE of these correction reports should be scheduled on a regular basis in correction
mode! They should only be used to correct single faulty documents after a root cause
analysis, and they may destroy data if used carelessly.
Example: Running a report that corrects the document flow, followed by SDVBUK00, after
billing documents and deliveries have been archived will open ALL sales documents again,
thus causing a production-down situation.
Do not transfer billing documents to accounting automatically; use the two-step
procedure instead. In this procedure, you set a posting block in the customizing of
the billing document type and use transaction VFX3 to post the billing documents
into accounting after they have been created.
By separating the two steps, it is possible to determine whether the root cause lies within
the SD or the FI coding. As a side effect, it is ensured that billing documents exist for all
created accounting documents, as the existence of a billing document is a precondition for
the creation of the accounting document in this procedure. In addition, you should document
missing document numbers in FI using report RFBNUM00 (SAP note 148757) and document
update terminations (transaction SM13).
3.3.1.5 Inconsistent SD Requirements
3.3.1.5.1 Introduction
Sales documents (quotations, sales orders, scheduling agreements, …) and deliveries have
requirements as long as they are not completed (completely delivered, cancelled, or
rejected). These requirements are written, updated, and deleted accordingly in table VBBE.
A detailed explanation of requirements handling can be found in SAP note 547277. Too
many, too few, or simply incorrect sales document or delivery requirements may trigger
follow-on errors in planning, procurement (production, purchase orders), or document
processing (availability check).
It is recommended to use the compare mode if you do not want to lock many
documents; the compare mode should be the normal usage of SDRQCR21.
Exceptional case: Let X be the percentage of items with requirement errors in your system
(number of items with requirement errors / total number of items selected). When X is very
high, it becomes worthwhile to switch from the compare mode to the non-compare mode,
because you skip the pre-selection phase and the compare routines and directly regenerate
the item requirements. This non-compare mode should be used only exceptionally (only if
the whole VBBE is corrupted for one or several material/plant combinations).
Regarding the runtime of the locked and non-locked modes: The non-locked mode is always
faster than the locked mode. The difference really becomes visible when X is high; if X is
small, the difference should not be very significant. The non-locked mode is not 100%
secure, but the probability of errors is much smaller than with the old version of SDRQCR21,
and it can be used without problems in a system with low activity.
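The decision rule described above can be sketched as follows (an illustrative Python fragment, not SAP logic; the 50% threshold is an assumption chosen only for the sketch, since the text merely says "very high"):

```python
# Hypothetical sketch of choosing between SDRQCR21's compare mode (the
# normal case) and the non-compare mode (regeneration), based on the error
# ratio X = items with requirement errors / total items selected.

def choose_mode(items_with_errors, items_selected, threshold=0.5):
    """Return 'compare' for the normal case; 'regenerate' (non-compare)
    only when most selected items carry requirement errors."""
    x = items_with_errors / items_selected
    return "regenerate" if x > threshold else "compare"
```

With only a handful of faulty items the compare mode is the right choice; only a largely corrupted VBBE for the selected material/plant combinations would justify regeneration.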
Regarding the impact of other jobs on the runtime: The runtime of SDRQCR21 might
increase due to the locking process if many items cannot be locked, because of the retry
logic. The probability of an item failing to get a lock increases with the ATP activity of the
system. Therefore, a slightly longer runtime might occur if transactions V_V2 or VL10 run in
parallel to SDRQCR21. Because SDRQCR21 locks the documents, it can also influence the
runtime of other jobs like VL10 or V_V2: they wait for, or retry, the documents that are
locked by SDRQCR21. Experience shows that the runtime degradation should not be very
significant.
The new version of SDRQCR21 can be used both during the week and on the weekend; the
results no longer depend on the system activities.
It makes sense to use the compare mode and to test both the locked (flag "enqueue") and
the non-locked mode in the productive system. For simulation runs, the result list is the
same with or without simulation mode (see SAP note 998102 for additional information).
Background: In this case, the old version of SDRQCR21 was running daily in update mode,
with a large number of errors in the spool lists, until the middle of month 10. Then the
execution of SDRQCR21 was switched to daily runs in simulation mode, with update mode
only on Sundays. As of month 12, the new version of report SDRQCR21 was scheduled
daily in simulation mode at the same time as the old version, in order to compare the spool
lists of both report versions. In the future, only the new version of SDRQCR21 will be
scheduled.
The graph indicates the number of changes of sales orders and deliveries measured during
5 weeks, confirming that Sundays are the days with the lowest business activity.
[Figure: "# SO and LF Changes: Minimum on Sundays" – daily number of sales order and delivery changes, 26.08.2006 to 30.09.2006, y-axis up to 60,000.]
The next graphs display the number of changes during a Monday and a Sunday. Due to the
business requirements, it was not possible to identify or set up time windows with zero
activity on Sundays, but Sunday was chosen as the best day to schedule SDRQCR21 in
update mode.
The next comparison chart shows the number of errors found in the spool lists for the old and
new version of report SDRQCR21. Evaluating the spool lists, it was found that about 97% of
the errors detected by the old version of SDRQCR21 were temporary errors (due to
production operation peaks on sales orders and deliveries while the report was running).
Figure 3.23: Comparison Between Old (sdrqcr21) and New (z_sdrqcr21e) Version of SDRQCR21 – number of errors per day, 15.09.2006 to 15.12.2006, y-axis up to 30,000.
Note that from the middle of month 10 on, the old version of report SDRQCR21 was running
in simulation mode only, except on Sundays. Had it kept running daily in update mode, the
number of errors in the spool lists would have risen even further than in month 9 and the
beginning of month 10.
When report SDRQCR21 runs in update mode during peak system activity on sales orders
and deliveries, the temporary errors it finds are updated on the database and thereby
become real errors; those errors would then be detected in the next execution of the report.
Therefore, to analyze requirement errors, it is very helpful to schedule report SDRQCR21 on
a regular basis; otherwise, the real errors are very difficult to find because they get lost in
long spool lists.
Only the new version of SDRQCR21 is able to filter out the temporary errors, avoiding
the creation of fake errors.
These reports also cover the special stocks used by the Industry Solution “Aerospace &
Defense” in the MRO process. For stock inconsistencies in the Industry Solution “Oil &
Gas”, special reports are provided in SAP notes 67261, 212707, 378731, and 447714.
SAP note 32236 contains the analysis report MBFIRST, which gives a first overview of the
system with important details, for example, valuation level, negative stocks allowed,
Material Ledger usage, serial number usage, WM usage, archived material and FI
documents, and so on. This information is very important when starting the correction
process.
The monitoring objects DCMMMB5K and DCMLCKMC exist to facilitate the regular
monitoring of transaction MB5K and CKMC using the Business Process Monitoring
functionality of the SAP Solution Manager.
As the report reads table MSEG, the run can be very performance-intensive if the
selection is not specific enough.
3.3.2.5 Inconsistencies (Quantity) Between Valuation Tables and Stock Tables
The report MBQUANT compares the stock tables against the valuation tables; the list of the
valuation tables can be found later in this document. Performance is not an issue in most
cases. The report assumes that the stock tables, not the valuation tables, contain the correct
information, and therefore suggests corrections on the valuation side.
Archived documents have no influence on this report. For very big selections, MBMSSQUA
has to be used.
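Conceptually, such a quantity comparison works like the following sketch (illustrative Python only, not the actual MBQUANT implementation; the table layouts are simplified to (material, valuation area, quantity) tuples):

```python
# Conceptual sketch: sum the stock-table quantities per material/valuation
# area and report every key whose valuation-table quantity differs. The
# stock side is assumed to be correct, as the report does.

from collections import defaultdict

def quantity_diffs(stock_rows, valuation_rows):
    """Each row: (material, valuation_area, quantity). Returns a dict of
    keys where the summed stock quantity differs from the valuation
    quantity, mapped to (stock_total, valuation_qty)."""
    stock = defaultdict(float)
    for mat, area, qty in stock_rows:
        stock[(mat, area)] += qty
    diffs = {}
    for mat, area, qty in valuation_rows:
        total = stock.get((mat, area), 0.0)
        if abs(total - qty) > 1e-6:  # tolerance for float rounding
            diffs[(mat, area)] = (total, qty)
    return diffs
```

A correction along the lines of the report would then adjust the valuation-side quantity to the stock total for each reported key.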
3.3.2.7 Purchasing
The report ZCORREKESBYGR corrects the quantities reduced (MRP) (EKES-DABMG) of
confirmations with GR assignment. Further information can be found in SAP note 215072.
The report RM06C020 finds and corrects inconsistencies with dependent requirements for
subcontracting purchase order proposals. Further information can be found in SAP note
115899.
With the correction report ZKORVETV, the table VETVG (delivery due index) can be set up
again for an individual purchase order. Further information can be found in SAP note 61148.
SAP note 100690 provides correction reports ZKO* for inconsistencies of stock transport
orders and stock transport scheduling agreements regarding quantities at schedule line level
(EKET-GLMNG, EKET-WAMNG, EKET-WEMNG).
SAP note 202875 provides the correction report ZCORRDABMG regarding inconsistencies
of EKET-DABMG and EKES-DABMG.
The report ZKOREKET checks and corrects cases where EKPO-MENGE of a scheduling
agreement does not correspond to EKET-MENGE. Further information can be found in SAP
note 67278.
The reports provided by SAP note 32236 should only be carried out by experienced SAP
consultants as the results of these reports need some interpretation. SAP note 34440
describes the correction procedure applied by the SAP consultants for customers.
Important stock tables available only for Discrete Industries
MCSD Customer Stock
MCSDH
MCSS Total Customer Stock
MCSSH
MSCD Customer stock with vendor
MSCDH
MSCS Customer stock with vendor - Total
MSFD Sales Order Stock with Vendor
MSFDH
MSFS Sales Order Stock with Vendor - Total
MSID Vendor Stock with Vendor
MSIDH
MSIS Vendor Stock with Vendor – Total
MSRD Project Stock with Vendor
MSRDH
MSRS Project Stock with Vendor – Total
3.3.4 Tools for Processes Involving WM
3.3.4.1 Stock Comparison Between WM and IM
Transaction LX23 investigates and corrects stock differences between Inventory
Management (IM) and Warehouse Management (WM). LX23 can be used both in a
centralized and in a decentralized scenario; in the decentralized scenario, LX23 runs on the
warehouse management system, and the inventory management stocks are determined by
RFC from the central ERP system. The actual differences between the two inventory levels
can be compared via an automatic physical inventory: the stock comparison report first
reads all IM stocks including special stocks, then reads and summarizes the WM stocks in a
second step. The individual stocks are listed and the difference is calculated.
An automated correction can be executed if you select the 'Clear differences' check box, but
bear in mind that the system always assumes that the WM system is the leading system.
Correspondingly, in the event of differences, the IM stock is adjusted to the WM stock and
NOT vice versa. This correction is performed by a generated batch input session for
transaction MI10, which performs an automatic physical inventory that adjusts the IM stocks
to the WM stocks. In the decentralized warehouse scenario, no batch input session is
generated; instead, IDocs of message type MBGMCR are created and sent to the central
ERP system.
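The comparison and correction logic described above can be sketched as follows (a simplified, illustrative Python fragment; it only mirrors the principle that WM is the leading system, not the real LX23 implementation):

```python
# Simplified sketch of the LX23 idea: compare IM and WM stock per material
# and, when clear_differences is set, adjust the IM figure to the WM
# figure -- WM is always assumed to be the leading system.

def compare_stocks(im_stock, wm_stock, clear_differences=False):
    """im_stock / wm_stock: dicts mapping material -> quantity.
    Returns a list of (material, im_qty, wm_qty, difference)."""
    differences = []
    for material in sorted(set(im_stock) | set(wm_stock)):
        im_qty = im_stock.get(material, 0)
        wm_qty = wm_stock.get(material, 0)
        if im_qty != wm_qty:
            differences.append((material, im_qty, wm_qty, wm_qty - im_qty))
            if clear_differences:
                im_stock[material] = wm_qty  # IM adjusted to WM, never vice versa
    return differences
```

In the real transaction this adjustment is not a direct table update but happens via a batch input session for MI10 (or MBGMCR IDocs in the decentralized case).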
The regular monitoring of failed goods movements can be facilitated with the
Business Process Monitoring in SAP Solution Manager using monitoring object
KPCONF01.
3.3.5.2.2 Failed Cost Postings in PP-SFC
Report CORUCOFC (transaction COFC) is used for identifying and correcting the cause of
failed cost postings and for subsequent post-processing. See SAP note 565370 “FAQ:
Reprocessing incorrect actual costs (COFC)” for details. Such costs are only of interest for
the R/3 system; they are not relevant for SCM, nor do they have an impact on APO PP/DS.
The regular monitoring of failed cost postings can be facilitated with the
Business Process Monitoring in SAP Solution Manager using monitoring object
KPCONF01.
3.3.6.2.2 Failed Postings of Costs
Impact of erroneous cost postings: Erroneous cost postings lead to missing costing records,
for instance for resource-related billing; the end customer would then be billed less. You
should run transaction COFC (report CORUCOFC) on a regular basis for post-processing of
erroneous cost calculations to decrease the impact of such situations. This also applies to
PP and PM.
3.4.3 Generic Consistency Check for Two Linked Tables in One System
3.4.3.1 Introduction
Very often, you have to verify data between two different tables which are linked. For
example, you could be interested in identifying line items without corresponding header
information (for example, sales order items in VBAP without corresponding sales order
header data in VBAK). Or, during data load or creation in a new system, all materials of a
certain material type (for example, HAWA) should have a specific entry in one of the
material master views (for example, MARC or MVKE). Usually, you would have to write a
program or a query to find this kind of data. Within the service software portal (transaction
ST13), a new report /SSA/Z>CD1 has been delivered that can answer all these questions.
The report starts with a selection screen where you enter the table names you want to
investigate as well as field names and restrictions. You can also choose what kind of check
to execute between the linked tables (for example, whether you would like to see entries
where a field has an undesired content, or whether you would like to identify missing
entries).
The report creates an appropriate dynamic SQL statement for the entered table names, use
case, and field restrictions. Once the selection has finished, the resulting list is displayed in
the ALV together with the SQL statement that was used. As the created SELECT statement
may not be supported by an appropriate index, the runtime of the report may be very long in
certain cases.
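The first check type, finding items without header records, boils down to an anti-join. The following sketch shows the kind of SELECT statement the report generates, using sqlite3 and heavily simplified table layouts (the table and field names only echo the VBAK/VBAP example from the text; they are not the real SAP schemas):

```python
# Sketch of the dynamically generated check, shown with sqlite3:
# find item rows whose header row does not exist (anti-join).

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE vbak (vbeln TEXT PRIMARY KEY);   -- order headers (simplified)
    CREATE TABLE vbap (vbeln TEXT, posnr TEXT);   -- order items (simplified)
    INSERT INTO vbak VALUES ('0001');
    INSERT INTO vbap VALUES ('0001', '10'), ('0002', '10');
""")

# Items without a corresponding header record:
orphans = con.execute("""
    SELECT i.vbeln, i.posnr FROM vbap AS i
    LEFT JOIN vbak AS h ON h.vbeln = i.vbeln
    WHERE h.vbeln IS NULL
""").fetchall()
```

As the document warns, such a generated SELECT may not be backed by a suitable index on the join fields, which is exactly why the report's runtime can become very long on large tables.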
The results of regular executions of this report may be monitored from SAP Solution
Manager with the monitoring object “DCGEN001”.
Figure 3.29: Example to Find Table Entries not Having the Desired Field Content
So far, we have checked individual field content against a specified value; very often, you
may also be interested in a consistency check between the field contents of two linked
tables, independent of the actual value. An example would be a verification between
customizing settings and the associated values stored in the movement data. In this case,
the only required information is whether the values correspond, not what the field content is
supposed to be. This type of check can be performed with the report as well. An example of
a selection variant is displayed in Figure 3.30: Compare Two Linked Tables without Knowing
the Intended Field Content.
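This second check type can be sketched as follows (illustrative Python; the key and field names are made up and only demonstrate the value-independent comparison):

```python
# Sketch of a value-independent consistency check between two linked
# tables: report keys whose field values differ, without knowing what
# the "correct" value is supposed to be.

def mismatched_rows(left, right, key, field):
    """left/right: lists of dicts. Returns the keys whose 'field' values
    differ between the two tables (only keys present in both are compared)."""
    rindex = {row[key]: row[field] for row in right}
    return [row[key] for row in left
            if row[key] in rindex and row[field] != rindex[row[key]]]
```

The point of this check form is that a mismatch alone marks the pair as inconsistent; deciding which side holds the intended value is a separate, manual step.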
Figure 3.30: Compare Two Linked Tables without Knowing the Intended Field Content
4. Appendix
4.1 General Roadmap for Analysis
[Flowchart: General roadmap for analysis – a customer reports a difference; perform the inconsistency assessment. Decision nodes "Inconsistencies in low-level data?", "Inconsistency with new data?", "Inconsistency in step data?", and "Intermediate technical steps?" lead to the actions "Compare connected steps using check reports/queries starting from the bottom up", "Trace and debug underlying programs to find the root cause", "Trace and debug the step and investigate logical correctness", and "Trace and debug the intermediate steps".]
[Flowchart (continued): decision nodes "Several systems involved?" (leading to "Verify leading systems" and "Questions for interface monitoring") and "New system used?" (leading to "Questions for transactional correctness", "Questions regarding training", and "Interface Management / SMO SA"), followed by the detailed analysis.]
[Flowchart (continued): decision nodes "Does dependent data exist?" and "Is a working backup available?" lead either to performing a manual correction or to correcting the dependent data.]
[Table: Recovery methods – columns "No.", "Recovery Method", "Advantages", "Disadvantages", "Requirements".]
[Figure: Recovery methods positioned by quantity versus complexity.]
© Copyright 2008 SAP AG. All rights reserved.
No part of this publication may be reproduced or transmitted in any form or for any purpose without the express
permission of SAP AG. The information contained herein may be changed without prior notice.
Some software products marketed by SAP AG and its distributors contain proprietary software components of
other software vendors.
Microsoft®, WINDOWS®, NT®, EXCEL®, Word®, PowerPoint® and SQL Server® are registered trademarks of
Microsoft Corporation.
IBM®, DB2®, OS/2®, DB2/6000®, Parallel Sysplex®, MVS/ESA®, RS/6000®, AIX®, S/390®, AS/400®, OS/390®, and
OS/400® are registered trademarks of IBM Corporation.
ORACLE® is a registered trademark of ORACLE Corporation.
INFORMIX®-OnLine for SAP and Informix® Dynamic Server™ are registered trademarks of Informix Software
Incorporated.
UNIX®, X/Open®, OSF/1®, and Motif® are registered trademarks of the Open Group.
HTML, DHTML, XML, XHTML are trademarks or registered trademarks of W3C®, World Wide Web Consortium,
Massachusetts Institute of Technology.
JAVA® is a registered trademark of Sun Microsystems, Inc. JAVASCRIPT® is a registered trademark of Sun
Microsystems, Inc., used under license for technology invented and implemented by Netscape.
SAP, SAP Logo, R/2, RIVA, R/3, ABAP, SAP ArchiveLink, SAP Business Workflow, WebFlow, SAP EarlyWatch,
BAPI, SAPPHIRE, Management Cockpit, mySAP.com Logo and mySAP.com are trademarks or registered
trademarks of SAP AG in Germany and in several other countries all over the world. All other products mentioned
are trademarks or registered trademarks of their respective companies.
Disclaimer: SAP AG assumes no responsibility for errors or omissions in these materials. These materials are
provided “as is” without a warranty of any kind, either express or implied, including but not limited to, the implied
warranties of merchantability, fitness for a particular purpose, or non-infringement.
SAP shall not be liable for damages of any kind including without limitation direct, special, indirect, or
consequential damages that may result from the use of these materials. SAP does not warrant the accuracy or
completeness of the information, text, graphics, links or other items contained within these materials. SAP has no
control over the information that you may access through the use of hot links contained in these materials and
does not endorse your use of third party Web pages nor provide any warranty whatsoever relating to third party
Web pages.