
Solution Operation Guide

SAP Test Data Migration Server

Operations Guide

Release 3.0

Document Version 1.5, July 2010


© Copyright 2009 SAP AG. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice.

Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors.

Microsoft, Windows, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation.

IBM, DB2, DB2 Universal Database, OS/2, Parallel Sysplex, MVS/ESA, AIX, S/390, AS/400, OS/390, OS/400, iSeries, pSeries, xSeries, zSeries, z/OS, AFP, Intelligent Miner, WebSphere, Netfinity, Tivoli, and Informix are trademarks or registered trademarks of IBM Corporation in the United States and/or other countries.

Oracle is a registered trademark of Oracle Corporation.

UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group.

Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame, and MultiWin are trademarks or registered trademarks of Citrix Systems, Inc.

HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C®, World Wide Web Consortium, Massachusetts Institute of Technology.

Java is a registered trademark of Sun Microsystems, Inc.

JavaScript is a registered trademark of Sun Microsystems, Inc., used under license for technology invented and implemented by Netscape.

MaxDB is a trademark of MySQL AB, Sweden.

SAP, R/3, mySAP, mySAP.com, xApps, xApp, SAP NetWeaver, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and in several other countries all over the world. All other product and service names mentioned are the trademarks of their respective companies. Data contained in this document serves informational purposes only. National product specifications may vary.

These materials are subject to change without notice. These materials are provided by SAP AG and its affiliated companies ("SAP Group") for informational purposes only, without representation or warranty of any kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP Group products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.

Disclaimer

Some components of this product are based on Java™. Any code change in these components may cause unpredictable and severe malfunctions and is therefore expressly prohibited, as is any decompilation of these components.

Any Java™ Source Code delivered with this product is only to be used by SAP's Support Services and may not be modified or altered in any way.

Documentation in the SAP Service Marketplace

You can find this documentation at the following Internet address:

service.sap.com/instguides

In order to make your document available under this alias, contact GBU AI.

SAP AG
Dietmar-Hopp-Allee 16
69190 Walldorf, Germany
T +49/18 05/34 34 24
F +49/18 05/34 34 20
www.sap.com


Typographic Conventions

Type Style Represents

Example Text Words or characters that appear on the screen. These include field names, screen titles, pushbuttons as well as menu names, paths and options.

Cross-references to other documentation

Example text Emphasized words or phrases in body text, titles of graphics and tables

EXAMPLE TEXT Names of elements in the system. These include report names, program names, transaction codes, table names, and individual key words of a programming language, when surrounded by body text, for example, SELECT and INCLUDE.

Example text Screen output. This includes file and directory names and their paths, messages, names of variables and parameters, source code as well as names of installation, upgrade and database tools.

Example text Exact user entry. These are words or characters that you enter in the system exactly as they appear in the documentation.

<Example text> Variable user entry. Pointed brackets indicate that you replace these words and characters with appropriate entries.

EXAMPLE TEXT Keys on the keyboard, for example, function keys (such as F2) or the ENTER key.

Icons

Icon Meaning

Caution

Example

Note

Recommendation

Syntax


Contents

1 Getting Started ............................................................................. 7

1.1 Global Definitions .......................................................................... 7

1.2 Important SAP Notes ..................................................................... 8

1.3 History of Changes ...................................................................... 11

2 Technical System Landscape ................................................... 13

2.1 TDMS Implementation Project .................................................... 13

2.2 Embedding TDMS into Current Processes and System Landscape .............................................................................................. 13

2.2.1 Creation of Test Data As-Is Analysis ............................................. 14

2.2.2 Definition of 'To Be' Situation ........................................................ 18

2.2.3 Transition plan / change management .......................................... 20

2.3 General Remarks .......................................................................... 20

2.4 Details About System Roles ....................................................... 21

2.5 System Settings ........................................................................... 21

2.6 Working With System Shells (General) ...................................... 21

2.7 Shadowing in the Context of SAP TDMS ................................... 22

2.7.1 Shadowing Technology (General) ................................................. 22

2.7.2 Shadowing in a TDMS Process ...................................................... 23

2.8 Related Documentation ............................................................... 25

3 Installation and Upgrade ........................................................... 26

3.1 First-Time Installation .................................................................. 26

3.2 Upgrading from SAP TDMS 2005 to SAP TDMS 3.0 ................. 26

3.3 Installing Support Packages ....................................................... 26

4 Preparations ............................................................................... 27

4.1 Preliminary System Analysis ...................................................... 27

4.2 Choosing a Process Type (Reduction Scenario) ...................... 27

4.2.1 Time-based Scenario: Choosing option for transfer of open items 27

4.3 Choosing and Preparing the Sender System ............................ 28

4.4 Preparing the TDMS Server ......................................................... 28

4.5 Preparing the Receiver System .................................................. 28

4.6 Prerequisites and Preparations for System Shell Creation ..... 29

4.6.1 Staff and Skill Requirements .......................................................... 29

4.6.2 Duration and Timing........................................................................ 29

4.6.3 Preparations .................................................................................... 29

4.7 Options to Prepare Receiver for Transfer of HCM Data ........... 30

4.8 Working With RFC Connections ................................................. 31


4.8.1 Users and User Roles ..................................................................... 31

4.8.2 Destination Statuses ....................................................................... 31

4.8.3 Creating RFC Destinations ............................................................. 31

4.8.4 Reusing Existing RFC Destinations............................................... 32

4.8.5 Passwords ....................................................................................... 32

4.8.6 Synchronization .............................................................................. 32

4.8.7 Locking and Unlocking RFC Destinations .................................... 33

4.8.8 Changing Existing RFC Destinations ............................................ 33

4.8.9 Deleting RFC Destinations ............................................................. 33

4.8.10 Helpful Transactions for Troubleshooting RFC Destinations ..... 33

5 Recommended Procedures and Troubleshooting .................. 34

5.1 General Troubleshooting Information........................................ 34

5.2 Working With Projects, Subprojects and Packages ................. 35

5.3 Handling of Users and User Addresses .................................... 35

5.4 Impact of the Data Transfer on the Sender System .................. 36

5.5 Deletion of Data from Receiver System ..................................... 36

5.5.1 Deletion Scenarios .......................................................................... 36

5.5.2 Locking Users During Deletion ...................................................... 37

5.6 Define Logical System Name in Receiver System .................... 38

5.6.1 Background Information ................................................................. 38

5.6.2 Recommended Procedure to Avoid Inconsistencies ................... 38

5.6.3 Alternative Procedure ..................................................................... 39

6 Performance Issues ................................................................... 40

7 How to Meet Specific Requirements ........................................ 42

7.1 Reorganization of Transfer Packages in Migration Server Overview ................................................................................................. 42

7.2 Deletion of Obsolete Function Groups ...................................... 43

7.3 Data Protection and Privacy ....................................................... 44

7.3.1 Legal Requirements with regards to Data Protection and Privacy 44

7.3.2 TDMS Functionality to comply with Data Protection and Privacy Requirements ............................................................................................... 44

7.4 Data Scrambling (General) .......................................................... 46

7.4.1 Background ..................................................................................... 46

7.4.2 Creating a Rules Package ............................................................... 47

7.4.3 Working With Scrambling Rules .................................................... 47

7.5 Data Scrambling for HCM Data ................................................... 49

7.5.1 Background ..................................................................................... 49

7.5.2 General Concepts ............................................................................ 49

7.5.3 How to Create Customizing for Data Scrambling ......................... 50


7.5.4 Example ............................................................................................ 50

7.6 TDMS and SAP APO .................................................................... 54

7.7 Data That Is Not Transferred ....................................................... 55

7.8 How to Transfer Individual Tables .............................................. 55

7.8.1 Prerequisites .................................................................................... 55

7.8.2 Data Transfer ................................................................................... 56

7.8.3 Example: Transfer the Data of Tables CMFK and CMFP .............. 56

8 Additional Information for Business Process Library ............ 57

8.1 Introduction .................................................................................. 57

8.2 Change Technical Settings ......................................................... 57

8.3 Select Objects to Be Copied ....................................................... 61

8.3.1 Areas of the TDMS BPL Scenario Designer View ......................... 61

8.3.2 Scenario View and Usage View ...................................................... 63

8.3.3 Upload a Selection Set .................................................................... 63

8.3.4 How to Assign the Selection Set to the Starting Table ................ 64

8.3.5 Enhance a Predefined Business Context by Adding Further Tables Detected During Crawler Run ......................................................... 65

8.3.6 How to Create Table Relationships ............................................... 65

8.3.7 Copy Control Settings..................................................................... 67

8.3.8 Copy Control Settings for Entities ................................................. 68

8.3.9 Copy Control Settings for Tables .................................................. 68

8.3.10 How to Copy One or More Specified Tables ................................. 73

8.4 Activity “Define Selection Criteria” ............................................ 73

8.4.1 Selections Tab Page ....................................................................... 73

8.4.2 Tab Page “Selection Criteria” ........................................................ 75

8.4.3 Selection Set .................................................................................... 75

8.4.4 Show Contained Tables in the Tables Tab .................................... 75

8.5 Repeatable execution of BPL packages .................................... 75

9 Consistency in Test Landscapes Built With SAP TDMS ......... 77

9.1 Business Suite Components supported by TDMS ................... 77

9.2 Assumed Customer Scenario ..................................................... 77

9.3 Consistency .................................................................................. 78

9.3.1 Consistency Within Components .................................................. 78

9.3.2 Consistency among Different Components .................................. 82

10 Appendix .................................................................................. 85

10.1 Related Guides ........................................................................... 85


1 Getting Started

This guide does not replace the daily operations handbook that we recommend customers create for their specific production operations.

About this Guide

This guide provides a starting point for managing SAP Test Data Migration Server (SAP TDMS) and maintaining and running it optimally. The guide contains additional information for various tasks as well as tips and tricks for using SAP TDMS in connection with other applications. This guide also provides references to the required documentation, so you will sometimes also need other guides such as the master guide.

As of February 1, 2008, the name of the current release of SAP Test Data Migration Server (SAP TDMS) has been changed from SAP TDMS 2006 to SAP TDMS 3.0 to comply with the current naming conventions at SAP. This change concerns only the name of the product. It does not imply any changes in the functionality for SAP TDMS.

Target Groups

Technical Consultants

System Administrators

Solution Consultants

Business Process Owners

Support Specialists

1.1 Global Definitions

Activity: Step in the process monitor that represents a task in a package. For each activity in the process monitor, there is a documentation text that tells you why this particular step is necessary and what you should do. To read the documentation for an activity, select the activity in the process monitor and choose Display text. For most of the activities, there is also an executable function. Depending on the nature of the related step, this function will either start a program (or sequence of programs) in the background or display a selection screen in which you make the required entries. To execute the function for an activity, select the activity and choose Execute.

Migration server overview: Central entry screen for SAP TDMS. The migration server overview lists the existing projects, subprojects and packages and provides the functions for creating new projects, subprojects and packages as well as some other generic functions for SAP TDMS.

Package: Instance of a data transfer with SAP TDMS. The existing packages are listed in the migration server overview. When you select a package from the migration server overview, the process monitor for this package is displayed.


Process monitor: List of activities to be executed in a TDMS project and related information. The activities are listed in chronological order and grouped in a hierarchy of phases and subphases. Note that you cannot execute an activity in a later phase until all activities from the preceding phases have been completed successfully. For each activity that is currently being executed or has already been executed, the process monitor shows detailed status information.

Project: Compilation of one or more subprojects (which are related from a technical or content point of view)

Process type: Defines the scope and procedure for the data transfer. For example, there is a process type for transferring only master data and Customizing data and another process type for transferring data for a specific business object. When you create a package for SAP TDMS, you select the required process type. The process type determines what activities are shown in the process monitor for a package. A list of the currently available process types is included in the master guide for SAP TDMS.

Subproject: Combination of a sender system, a receiver system, and the RFC connection between them. If it makes sense from an organizational perspective, more than one subproject can be created for a given combination.

System shell: SAP system that contains all table structures (table definitions), but no data (content) for most application tables. Due to this setup, a system shell can be used as the basis for building up a new system (by using SAP TDMS) and possibly as a development system, but not as a test system or QA system, because it lacks the required application data for these purposes.

1.2 Important SAP Notes

The list provided below includes only the most frequently used SAP Notes related to SAP Test Data Migration Server. If you are looking for the answer to a more specific question, make sure to also consider the other available SAP Notes for application area XX-PROJ-DMS and its subnodes (for example by doing a search for XX-PROJ-DMS* or for “TDMS” and a key word related to your question).

SAP Note Number Title Comment

SAP Note 970531 Installation and delta upgrade of DMIS 2006_1

Installation of the DMIS add-on is a requirement for installation and use of SAP TDMS. This SAP Note describes the installation procedure

SAP Note 970532 Installation of DMIS Content 2006_1

SAP TDMS is shipped in the DMIS_CNT add-on. This SAP Note describes the installation procedure for DMIS_CNT

SAP Note 1094395 Installation of DMIS_EXT 2007_1 The add-on called DMIS_EXT contains additional process types (reduction scenarios) for SAP TDMS. This SAP Note describes the installation procedure for DMIS_EXT


SAP Note 1152897 Installation of DMIS_BSC 2008_1 The add-on called DMIS_BSC contains the available process types (reduction scenarios) for SAP BW and CRM systems. This SAP Note describes the installation procedure for DMIS_BSC

SAP Note 970534 Upgrade to R/3 470,ECC 500,ECC 600 with DMIS 2006_1

This SAP Note explains what you should keep in mind regarding the DMIS add-on when doing a system upgrade

SAP Note 970539 Upgrade to R/3 470,ECC 500,ECC 600 with DMIS_CNT 2006_1

This SAP Note explains what you should keep in mind regarding the DMIS_CNT add-on when doing a system upgrade

SAP Note 1152898 Upgrade to R/3 470,ECC 500,ECC 600 with DMIS_BSC

This SAP Note explains what you should keep in mind regarding the DMIS_BSC add-on when doing a system upgrade

SAP Note 890797 SAP TDMS - required and recommended system settings

This SAP Note contains information about suitable system settings for SAP TDMS in different constellations (for example, for different database types)

SAP Note 894307 TDMS: Tips, tricks, general problems, error tracing

This SAP Note contains a list of known issues and solutions as well as tips and tricks

SAP Note 1003051 Collective Note for SAP TDMS 3.0

This SAP Note lists the most important SAP Notes for SAP TDMS; it does not claim to be comprehensive. If you are looking for the answer to a more specific question, do an additional Notes search for “SAP TDMS” and a key word related to your question

SAP Note 916763 TDMS performance “composite SAP note”

This SAP Note lists known performance issues for SAP TDMS and provides information about how to analyze and resolve these issues.

SAP Note 1405597 TDMS composite note for Support Package 12

This composite note contains all notes for TDMS Support Package 12 and above

SAP Note 1402704 TDMS Composite Note : Support Package Independent

This composite note contains very important notes that are valid for all Support Packages

SAP Note 1003051 TDMS composite note : Support Package Dependent

This composite note mainly contains showstoppers with Support Package SP7 and above


SAP Note 994106 Release limitations TDMS 3.0 This SAP Note informs you about existing limitations on the usage of certain parts of TDMS in certain system environments.


SAP Note 1228855 Installation DMIS_HR 2008_1 The add-on called DMIS_HR contains the available process types (reduction scenarios) for SAP HR systems. This SAP Note describes the installation procedure for DMIS_HR

SAP Note 1159279 Objects that are transferred with TDMS

This SAP Note gives some background information about the application areas whose data is not (or only partially) transferred by SAP TDMS.

1.3 History of Changes

Make sure you use the current version of this Operations Guide.

You can always find the current version of this guide at service.sap.com/instguides on SAP Service Marketplace.

The following table provides an overview of the most important changes in prior versions.

Version Important Changes

1.0 First version of the SAP TDMS Operations Guide

1.1 Chapter on system shell creation enhanced; various minor additions and corrections

1.2 Additional chapters for Business Process Library and HCM (activity groups, data scrambling); various minor additions and corrections

1.3 Additional chapter on consistency in test landscapes built with TDMS

1.4 New sections:

2.1 TDMS Implementation Project

4.7 Options to Prepare Receiver for Transfer of HCM Data

5.6 Define Logical System Name in Receiver System

7.4.4.5 Define Customer-specific Scrambling Type using TDMS Workbench

Changes in existing chapters:

2.2 General Remarks

3.1 First Time Installation

5.1 General Troubleshooting Information

5.6.1 Description of deletion scenarios

6 Performance Issues

9.3.2 Consistency among Different components


1.5 New sections:

2.2 Embedding TDMS into current processes and system landscape

4.2.1 Time-based scenario: Choosing option for transfer of open items

7.3 Data Protection and Privacy

Changes in existing chapters:

1.2 Important SAP notes

2.1 TDMS Implementation Project

3.3 Installing Support Packages

5.4 Impact of the Data Transfer on the Sender System

8.3.9.4 How to Scramble Data in a BPL context

9. Consistency in Test Landscapes Built With SAP TDMS

9.1 Business Suite Components Supported by TDMS

9.3.1.1 TDMS for ERP

9.3.2.1 Approach to Keep Test Landscapes Consistent

The following table provides an overview of the most important changes in the product versions.

Versions Important Changes

SAP Test Data Migration Server 2005 ( April 2006 )

First version

SAP Test Data Migration Server 3.0 ( March 2007 )

SAP TDMS 3.0

SAP Test Data Migration Server 3.0 SP06 ( October 2007 )

SAP TDMS for Business Process Library released

SAP Test Data Migration Server 3.0 SP08 ( December 2008 )

SAP TDMS solutions for BI and CRM released; inclusion of the activity to drop SAP Office tables

SAP Test Data Migration Server 3.0 SP09 ( December 2008 )

SAP TDMS solution for HCM released

SAP Test Data Migration Server 3.0 SP13 ( December 2009 )

Improved troubleshooting, repeatable execution of BPL packages



2 Technical System Landscape

2.1 TDMS Implementation Project

We recommend starting with a small implementation project for TDMS. You should allow for some time to familiarize yourself with the software before using TDMS to build up non-productive clients in your business operations. The duration of such an implementation project is customer-specific and depends on the technical preconditions and project requirements.

Usually, such a project involves the following steps:

1. Draw up a concept for the usage of TDMS. Refer to chapter ‘2.2 Embedding TDMS into Current Processes and System Landscape’ for more detailed information.

2. Create the test system landscape: sender, central, and receiver system; check whether hardware enhancements are necessary. Define the TDMS scenario and dependent parameters, such as the from-date. Define refresh cycles (monthly, quarterly, and so on).

3. Install TDMS on each of the participating systems

4. Carry out a technical TDMS (dry) run to get familiar with the software and with the procedure of testing the results. For this purpose, a non-production system might be used as the source system. If you use the time-based scenario, you should start with a small time frame, for example one month, and then increase it successively depending on the result of the transfer.

5. If necessary, carry out application tests in order to find out if the data extract meets the requirements of the tester.

6. Evaluate the technical run and derive necessary measures to optimize the use of TDMS.

7. TDMS 'productive' run (with a production system as the sender system) to create a client for use in a non-production system

After the first technical (dry) run, you should estimate the data volume of your biggest applications that are to be transferred. Usually, the biggest applications are CO-PA, SD, FI and other applications for which programs for filling internal header tables are available. According to the distribution of application data, you can parameterize the programs for filling internal header tables to avoid storage problems during the transfer. For further information about the parameterization, please refer to the troubleshooting documentation of the programs for filling internal header tables under 'Adjust parameter for execution'.

2.2 Embedding TDMS into Current Processes and System Landscape

Before you start using TDMS, you should draw up a concept for its usage. Consider that there are different business cases for TDMS:


1) TDMS replaces full copies

Replacing numerous full copies of productive clients with reduced clients created by TDMS significantly saves disk space and administrative costs (which are proportional to your system size). Yet, you will have additional effort and costs caused by

- the TDMS implementation

- the TDMS execution (for an experienced user, the effort will be slightly higher than the effort for a system copy)

- the reduced set of transferred data, which might lead to some additional test effort at the beginning of the project

In this business case, the net benefit is clearly visible and easy to calculate.

2) Creation of new clients with production-like data

When you create new clients with production-like data, the benefit will be an improved quality of test data because target-oriented and reduced test data will be provided. This will lead to cost savings due to more efficient and quicker tests as well as a better quality of developments. Yet, additional effort and costs are caused by

- newly required hardware

- TDMS implementation

- TDMS execution

In these cases, it is hard to estimate or calculate the benefits, whereas the costs can be clearly determined.

2.2.1 Creation of Test Data As-Is Analysis

As a first step, we recommend that you create an as-is analysis of how test data is currently being created in your system landscape. The following sections are intended to provide you with some aspects that should help you in your analysis.

2.2.1.1 Typical Situations

The following situations can often be found in current landscapes:

a) Developers create unit test data in development systems. These systems are typically not provided with production-like data.

b) Test systems are created as recent copies of production systems, usually with the help of system or client copies.

c) Elaborate test data planning and data management: This requires more than copying productive data. It involves drawing up a concept about which data from the production system is really needed and where it is needed. In brief, a representative set of 'production-like' data has to be created. This might also include a representative set of erroneous test data, for example, when you want to test functionality that is designed to avoid errors.

Check how your test data in your various systems is being created and what kind of data you need.

2.2.1.2 Landscape

As a next step, we recommend that you analyze your system landscape and identify where you want to implement TDMS. The following guiding questions might help you to do so:

1. What does your system landscape look like? How is it designed?


2. What kinds of systems do you use? For example, do you use QA and test systems? Do you use HCM, and is it on a separate installation? Are these systems linked via ALE?

3. What are the transport paths (Customizing, Support Packages etc.) between these systems?

4. What kind of refresh mode do you currently use for which systems (for example, client copy or system copy)?

5. How often do you refresh your systems?

6. What is your main aim for using TDMS? This depends on your as-is situation. If you currently use system or client copies to copy productive data, your main motivation might be to reduce data and to save hardware costs (see section 2.2). If you mainly work with development data (transports from the development system to the QA system), your main motivation will probably be to improve the quality of test data by providing production data.

The answers to all of these questions have an impact on the concept of the TDMS usage.

2.2.1.3 Characteristics / KPIs

This section describes some basic considerations and KPIs for the usage and implementation of TDMS.

a) Comparison of TDMS with client and system copy: The following table compares TDMS with other scenarios for creating test clients from a production system and helps you evaluate where TDMS would be of benefit for you:

Ratings are given in the order: TDMS in implementation phase / TDMS optimally used / Client copy / System copy or restore.

1. Refresh of individual clients in a multi-client system: + / + / + / Impossible

2. Runtime of a single refresh per client: - / o / - / ++

3. Impact on the operation of other clients during refresh: - / o / o / Not relevant

4. Data reduction: + / ++ / o / Impossible

5. Parallelization and resource control of data transfer: + / ++ / o / Not relevant

b) Target client size vs. data quality for several TDMS scenarios: See section 2.2 for details.

c) Data volume / data reduction: Regarding the data volume to be transferred and the corresponding data reduction,


we can provide some guiding values for your orientation. Based on our experience, the following data reduction might be achieved:

- Business Process Library: When you use the BPL, you transfer single business contexts like sales orders, purchase orders, and so on. This scenario is especially suitable for unit tests, and the data volume that is transferred usually amounts to about 1% of the source client size.

- Master Data and Customizing scenario: Using this scenario, you transfer all Customizing and master data, but no transaction data. This is suitable for the enhancement of Customizing and master data as well as subsequent unit tests. Typically, the amount of data that is transferred in this scenario is about 10% of the source client. This, however, depends on how long the source system has been live. If it has gone live recently and contains little transaction data, there will be less data reduction.

- TDMS for HCM: There are three scenarios for TDMS for HCM, ranging from the transfer of selected data up to a mass data transfer. Depending on the scenario you choose, the data reduction will be between about 1% and 99% of the source client data.
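To get a first feeling for the resulting target client size, you can apply these rough percentages to your measured source client size. The following Python sketch is purely illustrative; it is not part of TDMS, and the reduction factors are only the rough guiding values quoted above:

# Illustrative only: rough target-size estimate based on the guiding values above.
# The percentages are approximations from this guide, not guaranteed results.
REDUCTION_FACTORS = {
    "business_process_library": 0.01,   # ~1% of source client size
    "master_data_customizing": 0.10,    # ~10% of source client size
    "time_based": None,                 # depends strongly on the chosen from-date
}

def estimated_target_size_gb(source_size_gb: float, scenario: str) -> float:
    """Return a rough target client size in GB for the given reduction scenario."""
    factor = REDUCTION_FACTORS.get(scenario)
    if factor is None:
        raise ValueError(f"No generic guiding value available for scenario '{scenario}'")
    return source_size_gb * factor

# Example: a 2,000 GB source client with the master data and Customizing scenario
print(estimated_target_size_gb(2000, "master_data_customizing"))  # -> 200.0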

d) Considerations regarding the total transfer runtime

Basically, there are three areas that have an influence on the runtime of TDMS:

Hardware-related factors: Based on our experience, the following hardware-related bottlenecks are typical:

- I/O: This means the physical reading and/or writing of the data. An indicator of a bottleneck is long I/O wait times.

- Computing power: This limits the degree of parallelization in general and in particular the parallelization and speed of the reading and writing of the data in cluster tables. An indicator of a bottleneck is low idle CPU values.

- Main storage: This can limit the degree of parallelization, in particular in the case of complex processing such as the selection of table VBUK. An indicator of a bottleneck is high values for data paging/swapping.

- Network capacity (actual data transfer rate): We have seen this only in rare cases, so there are no further comments on this from our side.
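These indicators can be checked mechanically against monitoring data. The following Python sketch is only an illustrative mapping of the indicators listed above to a likely bottleneck; the threshold values are invented examples, not SAP recommendations:

# Illustrative mapping of the bottleneck indicators above to a likely cause.
# The thresholds are invented example values, not SAP recommendations.
def likely_bottlenecks(io_wait_pct: float, idle_cpu_pct: float, paging_per_h: int) -> list:
    findings = []
    if io_wait_pct > 20:          # long I/O wait times -> I/O bottleneck
        findings.append("I/O (physical reading/writing of data)")
    if idle_cpu_pct < 10:         # low idle CPU values -> computing power bottleneck
        findings.append("computing power (limits parallelization)")
    if paging_per_h > 100_000:    # high paging/swapping -> main storage bottleneck
        findings.append("main storage (limits parallelization for complex processing)")
    return findings or ["no obvious hardware bottleneck"]

print(likely_bottlenecks(io_wait_pct=35.0, idle_cpu_pct=5.0, paging_per_h=2_000))
# -> ['I/O (physical reading/writing of data)', 'computing power (limits parallelization)']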

Runtime-critical activities in the process tree: These depend on the hardware-related factors above and typically include:

- Data deletion (receiver client): The impact of the data deletion on the overall runtime is particularly high when the receiver system is a multi-client system and the non-target clients cannot be locked. This requires the 'Array Delete' deletion technique, which is


suboptimal for the runtime. For a description of the various deletion scenarios and their usage, refer to section 5.5.1 Deletion Scenarios.

- Data selection (sender client): This is particularly runtime-critical when the overall data volume of the sender system is high and the portion of the data to be transferred is high as well. This is especially true when the portion of Logistics data, and in particular the number of sales documents, is high. Another critical factor is when the daily operations in the sender system require most of the system resources.

- Data transfer: The runtime of the data transfer depends on the data volume to be transferred. In most cases, there is a bottleneck of resource capacities in the receiver system.

Other influencing factors on runtime:

- Data distribution: A large portion of SD and CO-PA data usually results in long runtimes.

- Customer-specific tables (Y* and Z*): When your customer-specific tables contain a large amount of data, you should define suitable selection criteria for them.

- System settings / system parameters: Settings that are suitable for intensive dialog operation are usually not suitable for mass data processing that mainly runs in the background. Consider the recommendations in SAP Note 890797.

- Idle times during the TDMS run due to:
  - unnoticed terminations of activities, for example, terminations due to a tablespace overflow
  - time needed for the analysis and solution of issues
  - non-working hours (nights, weekends, and so on)
  The idle time can vary between only a few hours, when the system landscape settings are optimal and the user is experienced, and several weeks, when the system landscape is not optimally set up and the user is inexperienced.

- The overall time needed to work on those process tree activities that are not runtime-critical depends on the experience of the TDMS user. A user with a moderate knowledge of TDMS might need a net time of one day to process these activities.

e) Refresh runtime: After the implementation phase of TDMS, the effort and the runtime for the TDMS runs decrease significantly.

f) Special considerations for multi-client sender and/or receiver systems: In multi-client systems, there is always some impact on the performance and response time of the system, even in the non-source and non-target clients. TDMS offers options to control the resource consumption in a target-oriented


way:

- Determination of the degree of parallelization for process tree activities: The higher the degree of parallelization, the higher the resource consumption. Conversely, the lower the parallelization, the longer the runtime will be.

- Parallelization of resource-intensive and runtime-critical activities:

1. Parallelization of data deletion and data selection

2. Partial parallelization of data selection and data transfer (only when sender system has sufficient capacities)

3. Limitation of the processes available for TDMS (ensures a minimum capacity for other non-TDMS activities in the sender and receiver system)

2.2.2 Definition of 'To Be' Situation

Based on the analysis of your current creation of test data and the evaluation of your KPIs, you can derive your individual 'to be' situation and define where and when TDMS shall be used.

In the following, we give some examples of what typical landscapes may look like.


Scenario A: Simple System Landscape

In a simple system landscape, TDMS typically replaces a full system copy of the production system to the quality assurance system and provides the development system with production data.

Scenario B: More complex system landscape

In more complex system landscapes, a master system, created via a snapshot/mirror technique, system copy or restore from the production system, can serve as a common source for multiple TDMS runs. For example, it can be the source for training systems, quality assurance systems, or development systems.


2.2.3 Transition plan / change management

Based on your 'to be' situation, you can start the TDMS implementation and draw up a detailed project plan, assign resources, and so on.

2.3 General Remarks

For a general overview of the required system landscape, see the master guide for SAP TDMS. For information about security-related aspects, such as user roles and authorizations, see the security guide for SAP TDMS.

The system infrastructure for a data transfer using SAP TDMS requires the following system roles:

A sender system (client) from which the data supply for the non-production system is obtained (see also chapter 4.3)

The TDMS server, which includes:

o A central system (client) on which the settings and customizing for the setup of the non-production system are stored

o A control system (client) from which almost all activities for SAP TDMS are triggered and monitored

A receiver system (client), that is, the non-production system to be filled (see also chapter 4.5)


The sender and the receiver must have the same release and codepage. For further information regarding TDMS and codepages, please have a look at SAP Note 1249737.

2.4 Details About System Roles

A given system landscape should contain exactly one TDMS server.

The sender system may also serve as the central system if it is on basis release 620, 640 or 700. The receiver system must not be used as the central system.

Any of the systems except the receiver system may also be used as the control system. The reason why the receiver system should not be used as the central system or control system is that the historical data for SAP TDMS needs to be stored permanently, while the receiver system is meant to be refreshed at regular intervals.

The recommended configuration (with regard to data security and performance) is to implement the TDMS server separately. Typically, this would mean that the TDMS server is installed on a machine of its own. However it is also possible to operate it, for example, in an SAP Solution Manager system.

2.5 System Settings

General recommendations:

Provide at least eight batch processes and eight dialog processes for each system (or system role) in a transfer. That is, if the sender and the central system are in the same SAP system, you need to double the minimum number of available processes.

In the receiver system, DB archive logs and SAP system parameter REC/CLIENT should be disabled to improve writing performance.

For more detailed and up-to-date information about required and recommended system settings, see SAP Note 890797. This SAP Note also contains detailed information about recommended settings for the most common database types.
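The sizing rule above can be expressed as a simple calculation: count how many of the participating system roles run on the same SAP system and multiply the per-role minimum accordingly. The following Python sketch is only an illustration of that rule; the numbers are the minimums recommended above, and the sketch is not a substitute for SAP Note 890797:

# Illustrative sketch of the work-process sizing rule described above:
# at least eight batch and eight dialog processes per system role, added up
# for all roles that share one SAP system.
from collections import Counter

MIN_BATCH_PER_ROLE = 8
MIN_DIALOG_PER_ROLE = 8

def minimum_processes(role_to_system: dict) -> dict:
    """Map each SAP system to the minimum number of batch/dialog processes it needs.

    role_to_system maps a TDMS role ('sender', 'central', 'control', 'receiver')
    to the SAP system ID it runs on.
    """
    roles_per_system = Counter(role_to_system.values())
    return {
        system: {
            "batch": count * MIN_BATCH_PER_ROLE,
            "dialog": count * MIN_DIALOG_PER_ROLE,
        }
        for system, count in roles_per_system.items()
    }

# Example: the sender doubles as the central system, so that system needs 16 + 16 processes.
print(minimum_processes({
    "sender": "PRD", "central": "PRD", "control": "TDM", "receiver": "QAS"
}))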

2.6 Working With System Shells (General)

Technically speaking, a system shell is an SAP system that contains only the basis tables and the table structures of the other tables. Due to this setup, a system shell can be used as the basis for building up a new system and possibly as a development system, but not as a test system or QA system, because it lacks the required application data for these purposes.

With SAP TDMS, you can create a system shell and use it as the receiver system in a subsequent SAP TDMS run. (See also chapter 4.6.)

If you need to create non-production systems from scratch on a regular basis, you may want to keep a system shell so that you simply have to copy it every time you need a new receiver system. Depending on your requirements, you can use one of the following types of shells:

Save shell: A save shell is a simple backup of the system shell created in preparation for an SAP TDMS implementation. When a new system is built up using a save shell, all


transports that have gone into the related sender system after creation of the backup must be imported to the shell.

Master shell: A master shell is also created as a copy of the original system shell. However it is included in the transport chain and receives all transports that go into the production system. As a consequence, it always reflects the current status of the DDIC and cross-client information in the production system, and no additional transports are required when a new system is to be created based on the master shell.

2.7 Shadowing in the Context of SAP TDMS

Consistency of data that is extracted by SAP TDMS can only be ensured if no changes are made in the sender system while the data extraction is taking place. For this purpose, the sender system must be locked while the data extraction is running. However, the substantial downtime this implies may not be acceptable if the production system is used as the sender system.

To avoid this issue, you can replicate the production system to a shadow system from which SAP TDMS can then extract the required data.

2.7.1 Shadowing Technology (General)

Storage systems commonly offer possibilities of creating a quick copy of selected storage volumes with almost no impact on production. Generally speaking, there are two different technologies: full copy and snapshot.

A full copy provides a full physical copy of selected storage volumes onto an additional set of storage volumes (disks).

A full copy requires the same amount of storage space for the mirror volumes as for the source volumes. To create the copy, the data is synchronized between the volumes (which is done in the background at storage system level). When the synchronization is finished, the mirror relation can be split so that the mirror volumes contain an image reflecting the state at the time of the split. This full image can be mounted to a backup server or can be used to start a cloned system.

[Figure: Full copy and snapshot shadowing. A full copy replicates the production volumes to a second set of volumes within the storage system; a snapshot provides a logical backup image of the production volumes within the same storage system.]


A snapshot provides a logical copy of selected storage volumes by maintaining pointers to the respective data blocks.

A snapshot does not require double storage space. When creating a snapshot, the system just maintains a list of pointers to the currently used data blocks on the disk. At the time of the snapshot, these pointers point to exactly the same blocks as used by production. When production continues and new data is stored, the modified blocks are written to a different location in the storage system, leaving the previous blocks (which are still referenced by the snapshot) intact. The pointer to the productive block is updated and now points to the modified block, while the pointer belonging to the snapshot still points to the old data block. When some production data is deleted, only the “productive” pointer to this block is deleted; the “snapshot” pointer and the block itself are kept as they are. These policies ensure that the snapshot image remains unchanged by productive operation. When a snapshot image is no longer needed, the corresponding pointers are released and all storage blocks which are no longer referenced (the blocks that had been modified or deleted by production) are released as well.

Since free storage blocks are required for each modification, a sufficient amount of free storage space is required when snapshots are to be used. The required storage space depends on the intensity of modifications and deletions in the production system and on the length of the time period during which a snapshot exists.

Just like a full copy, a snapshot image can be mounted by a backup server and backed up to tape. It can also be used to start a system clone. Starting a system clone requires that the snapshot is writeable, which means that the snapshot can be modified by the system clone. This is possible with most of today's snapshot implementations. Modifications to the snapshot are handled in a similar way as modifications in the production system after a snapshot was created. Modified blocks are written to a new location and the pointers of the snapshot are updated. It is important to understand that writes done to the snapshot image have no impact on the original production data blocks; the integrity of the production system is not affected by writes to a snapshot.
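The pointer mechanism described above is essentially copy-on-write. The following Python sketch is a simplified, storage-vendor-independent illustration of that idea; the block contents, class, and method names are invented for the example, and real storage systems of course work at the volume and block level rather than in application code:

# Simplified copy-on-write illustration of the snapshot behavior described above.
# Block contents and method names are invented for this example only.
class Volume:
    def __init__(self, blocks: dict):
        self.blocks = dict(blocks)          # logical block -> data
        self.snapshots = []                 # frozen views of earlier states

    def create_snapshot(self) -> dict:
        """A snapshot is just a frozen set of pointers to the current blocks."""
        snap = dict(self.blocks)
        self.snapshots.append(snap)
        return snap

    def write(self, block_id, data):
        """Production writes go to the live view; snapshot pointers keep the old data."""
        self.blocks[block_id] = data

    def delete(self, block_id):
        """Deleting production data removes only the live pointer."""
        self.blocks.pop(block_id, None)

vol = Volume({0: "BKPF rows", 1: "VBUK rows"})
snap = vol.create_snapshot()
vol.write(1, "VBUK rows (changed)")    # production continues to change data
vol.delete(0)                          # ... and deletes some of it
print(snap)        # snapshot still sees {0: 'BKPF rows', 1: 'VBUK rows'}
print(vol.blocks)  # production sees only the changed block 1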

2.7.2 Shadowing in a TDMS Process

A TDMS process is typically divided into phases, which have more or less impact on the sender or receiver system. The most critical system is usually the sender, because it is the production system, which is heavily used by many users who cannot accept bad response times or even a system downtime. Hence, the use of snapshots should minimize the impact of SAP TDMS runs on the sender system.

2.7.2.1 Package Settings

In this phase, you make some technical settings, such as definition of RFC destinations of sender, central and receiver system. In addition, you find customer-specific tables in the sender system, and decide if you want to convert client IDs and logical system names, and if you want to apply scrambling rules.

From the perspective of RFC settings and TDMS impact on the sender system, it would be best to create a shadow system even before you start with the package settings phase. However the disadvantage of this approach is that inconsistencies between the sender system (production system) and the receiver system are more likely to occur if the time between the shadow system creation and the data transfer to the receiver system is longer.

2.7.2.2 System Analysis

In this phase, you do some analysis and preparation work for the subsequent data transfer in all participating SAP systems. These tasks have an impact on the workload and performance of these systems.


The performance impact on the productive sender system might be unacceptable for customers. Therefore, we recommend that you create the shadow system before this phase. As mentioned for the package settings phase, however, the disadvantage is that inconsistencies between the productive sender and the receiver system are more likely to occur if the time between the shadow system creation and the data transfer to the receiver system is longer.

If the snapshot is created after the RFC settings in the package settings phase were made, the destination of the sender system has to be changed accordingly right after the snapshot has been created.

2.7.2.3 Preparations in the Receiver System

To make the receiver system ready for the data transfer, you need to remove all client-dependent data. In this phase, you make the necessary preparations and start the functions that delete the client-specific data from the receiver system.

In standard scenarios like TDTIM or TDTCC, this phase has no impact on the sender system. Hence, it is not recommended to create the shadow system during this phase. If you have not created the shadow system already, it is better to wait until this phase has been completed, because this helps to further reduce the risk of data inconsistencies.

2.7.2.4 Data Transfer

In this phase, you make a final size check for the tables to ensure that no significant changes have taken place while your SAP TDMS project was going on, and then start the actual data transfer.

If you want to minimize the risk of data inconsistencies between sender and receiver, we recommend that you create the shadow system directly before you start executing the activities in this phase.

If you do so, you only need to adjust RFC settings of the sender system accordingly. Then the activities of the data transfer refer to the new sender system, that is, the shadow system.

2.7.2.5 Postprocessing Tasks

Finally, some postprocessing is required. For example, number ranges and table buffers in the receiver system must be reset.

Creating a shadow system right before this phase obviously does not make sense.


2.8 Related Documentation

The following table shows where you can find more information about the technical system landscape.

Topic: General overview of the typical system landscape and technical configuration for SAP TDMS
Guide/Tool: Master Guide
Quick link on SAP Service Marketplace (service.sap.com): instguides

Topic: Security aspects
Guide/Tool: Security Guide
Quick link on SAP Service Marketplace (service.sap.com): security


3 Installation and Upgrade

3.1 First-Time Installation

SAP TDMS is shipped in add-ons called DMIS, DMIS_CNT, DMIS_EXT, DMIS_BSC and DMIS_HR. You can find all required information with regard to the installation of these add-ons (and consequently SAP TDMS 3.0) in SAP Notes 970531, 970532, 1094395, 1152897 and 1228855.

When you install the add-on called DMIS, specific user roles are created in all clients of the relevant systems. Having completed the installation, you must generate the relevant authorization profiles for these roles. For more information, see SAP Note 897100.

3.2 Upgrading from SAP TDMS 2005 to SAP TDMS 3.0

If you have the 2005 version of SAP TDMS installed and want to upgrade to SAP TDMS 3.0, keep in mind that old packages can still be displayed, but not used actively.

When starting to work with SAP TDMS 3.0, create new subprojects and transfer packages rather than using the existing one(s).

For more information, see SAP Note 1054047.

3.3 Installing Support Packages

Support packages for SAP TDMS are made available at regular intervals. Make sure that you always work with the latest support package.

Installation of Support Packages can, in some cases, have an impact on your existing TDMS packages. In particular, refresh runs for existing TDMS packages may not be possible.


4 Preparations

4.1 Preliminary System Analysis

To be able to choose the optimal project setup and settings for SAP TDMS in your specific system environment, you need to collect some information about the participating SAP systems, particularly:

Data volume of most critical tables in the sender system; tables to watch out for are, for example, BKPF, VBUK, EQUI and CE* tables. If you are planning to use a time-based process type, consider also the planned from-date to get a more precise estimate of the expected data volume for transfer and the tables that are most critical in your specific transfer constellation.

Current total data volume in the sender system

Volume increase per week (average and peaks)
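If you track the total data volume of the sender system over a few weeks, the average and peak weekly increase can be derived with a few lines of arithmetic. The following Python sketch is only an illustration; the sample figures are invented:

# Illustrative calculation of average and peak weekly volume growth from
# a series of weekly size measurements (figures below are invented examples).
def weekly_growth(sizes_gb: list) -> tuple:
    """Return (average weekly increase, peak weekly increase) in GB."""
    deltas = [later - earlier for earlier, later in zip(sizes_gb, sizes_gb[1:])]
    return sum(deltas) / len(deltas), max(deltas)

# Total sender-system data volume measured once per week, in GB
measurements = [1850, 1872, 1901, 1921, 1958]
avg, peak = weekly_growth(measurements)
print(f"average: {avg:.1f} GB/week, peak: {peak} GB/week")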

4.2 Choosing a Process Type (Reduction Scenario)

The available process types (reduction scenarios) for SAP TDMS and their typical use cases are described in the current version of the master guide.

An important point to watch out for when using a time-based reduction scenario is the archiving strategy for the main applications in the participating systems, because you may run into problems if you choose a from-date that lies so far in the past that parts of the relevant data have already been archived. On the other hand, you can obtain valuable information about the data volume associated with a given from-date by analyzing the archiving strategy.

4.2.1 Time-based Scenario: Choosing option for transfer of open items

When you have chosen the time-based scenario, you have an option regarding the transfer of open items in Financials.

In Financials, TDMS selects all cleared items for which the posting date (field BKPF-BUDAT) or change date (field BKPF-AEDAT) is equal to or later than the chosen from-date. For example, when the fiscal year is equal to the calendar year and you select 1 January of the current year as the selection date, all documents whose posting or change date is on or after the beginning of the current year will be selected.

For open items, you have the following options:

- Option 'all open items' (flag P_OP = X; see SAP Note 1044518):
  o Open items for vendors and customers (tables BSID and BSIK) will be taken completely (independent of the selection date).
  o Open items for G/L accounts (table BSIS): all entries marked as 'open item relevant' (flag BSIS-XOPVW = X) will be taken (independent of the selection date).

- Option 'open items from selection date on' (flag P_OP = blank):
  o Open items for vendors and customers (tables BSIK and BSID) will be taken if their posting date or change date is equal to or later than the selection date. In our example, all

Page 28: TDMS Operation guide

4 Preparations

4.3 Choosing and Preparing the Sender System

28 July 2010

customer / vendor open items with posting or change date higher than 1 January will be selected.

o Open items for G/L-account (table BSIS) will be taken if their posting date or change date is equal or higher than the selection date (dependent on the selection date, but independent of identification for „open item relevant‟ (flag BSIS-XOPVW). In our example, all open G/L-account open items with posting date or change date equal or higher 1 January would be transferred.)
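The following Python sketch illustrates the selection logic described above. It is a conceptual illustration only, not the actual TDMS selection code; the field names are taken from the description above, and the document structure is simplified to a dictionary.

from datetime import date

def select_fi_documents(documents, from_date, all_open_items):
    """Sketch of the time-based selection logic described above (not TDMS code).

    Each document is a simplified dictionary:
      'budat' - posting date (BKPF-BUDAT)
      'aedat' - change date (BKPF-AEDAT)
      'open'  - True for open items (BSID/BSIK entries, or BSIS with XOPVW = 'X')
    """
    selected = []
    for doc in documents:
        in_time_window = doc["budat"] >= from_date or doc["aedat"] >= from_date
        if doc["open"] and all_open_items:
            # Option "all open items" (P_OP = X): open items are taken
            # independently of the selection date.
            selected.append(doc)
        elif in_time_window:
            # Cleared items - and open items with P_OP = blank - are taken only
            # if the posting date or change date is on or after the from-date.
            selected.append(doc)
    return selected

# Example: from-date 1 January 2010
docs = [
    {"budat": date(2009, 11, 3), "aedat": date(2009, 11, 3), "open": True},
    {"budat": date(2010, 2, 15), "aedat": date(2010, 2, 15), "open": False},
]
print(len(select_fi_documents(docs, date(2010, 1, 1), all_open_items=True)))    # 2
print(len(select_fi_documents(docs, date(2010, 1, 1), all_open_items=False)))   # 1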

4.3 Choosing and Preparing the Sender System

The obvious choice for the sender system is the production system from which you want to copy the data for the non-production system. However a quality assurance system can also be used as the sender system if it was set up as a full copy of the production system.

Alternatively, you may want to use a snapshot or mirror or an existing 1:1 copy of the production system (for example a QA system). The options you have here depend on the hardware you use and can consequently not be covered here. To learn more, contact your hardware provider. For more information on snapshots, see chapter 2.5.

From a technical perspective, SAP TDMS does not require a downtime of the sender system. However, note that if you allow activities in the sender client during the data extraction, data that is changed in the sender while the extraction is running may be transferred inconsistently.

The tablespace in the sender system that holds the cluster should provide enough free space for about ten percent of the expected data volume to be transferred. The table containing the cluster is called DMC_INDXCL in the sender system and CNVMBTCLU in the receiver system.

If possible, place the cluster in a separate tablespace.

Consider also that you need some disk space for programs that are generated in the course of an SAP TDMS project (approximately 3.5 GB per initial transfer package in the control system and 0.5 GB in the sender and receiver system, respectively).
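As a rough orientation, the figures above can be combined into a simple back-of-the-envelope estimate. The following sketch is purely illustrative; the function and its parameters are not part of TDMS.

def estimate_space_gb(expected_transfer_volume_gb, initial_packages=1):
    """Rough estimate based on the rules of thumb given above (sketch only)."""
    cluster_space = 0.10 * expected_transfer_volume_gb       # DMC_INDXCL / CNVMBTCLU
    control_system = 3.5 * initial_packages                  # generated programs
    sender_plus_receiver = 0.5 * initial_packages * 2        # generated programs
    return {
        "sender cluster tablespace (GB)": cluster_space,
        "control system (GB)": control_system,
        "sender + receiver generated programs (GB)": sender_plus_receiver,
    }

# Example: 400 GB expected transfer volume, one initial transfer package
print(estimate_space_gb(400))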

4.4 Preparing the TDMS Server

When setting up the TDMS server (control system and central system), make sure to consider the recommendations for system settings given in SAP Note 890797.

4.5 Preparing the Receiver System

With regard to the system to be used as the receiver system, you have the following options:

Use an existing non-production system that is based on a recent copy of the sender system

Use a 1:1 copy of an existing system (typically the related production system)

Use a system shell (see chapter 4.6)


4.6 Prerequisites and Preparations for System Shell Creation

4.6.1 Staff and Skill Requirements

SAP TDMS provides a process for creating a system shell. The process combines the standard system copy functionality (homogeneous system copy with R3Load) with the reduction logic included in SAP TDMS. To create a system shell using this procedure, you must have experience in copying systems and be familiar with the administration of the operating system, the database, the ABAP Dictionary and the SAP middleware components used by your SAP solutions.

A technical consultant who is specifically certified for system copy should carry out the system copy procedure onsite. This applies irrespective of whether the system to be migrated is a development system, a test system, or a production system. This ensures that sufficient know-how is available to handle the complexity of the procedure.

4.6.2 Duration and Timing

When copying a system containing production data, choose a well-defined starting time for executing the copy. The following points should be considered:

Depending on the copy strategy, a downtime of the production (sender) system may be required.

The system shell will not be available until the complete shell creation process is finished.

When you use an existing system as the receiver system (system shell to be set up), existing data in the receiver system will be lost.

The shell creation must be aligned with any other activities that may take place in parallel in the participating systems to avoid loss of data and performance issues.

The duration of a shell creation process is essentially determined by the system copy process plus a client copy. The export and import steps will take less time than in a full copy. Most of the application data is not copied. Consequently, the size of exported data should be about 100 to 200 GB.

4.6.3 Preparations

Carry out all preparations as described in the system copy guide for the appropriate database system and SAP release. For shell creation, the homogeneous system copy method with R3Load must be used. You can find the current versions of all available system copy guides in SAP Service Marketplace under the quick link /systemcopy (http://service.sap.com/systemcopy). Some important steps are:

o Prepare hardware of target system (e.g. sufficient disk space)

o Prepare software of target system (e.g. DB instance and appropriate SAP kernel)

o Make sure that all required CDs/ DVDs for the system copy are available in the appropriate version.

Provide the system details for the sender, control and receiver systems (R3 logins, OS logins for <sid>adm and root).

Create statistics in your source database. This helps to reduce the time required for the export, because you can skip the step in which r3setup/sapinst creates statistics in the source database in preparation for the export if you have up-to-date statistics available.


Provide all media that are necessary for the installation of a database instance in the receiver system.

Provide a local r3setup/sapinst directory in the sender and receiver system respectively. (This is a directory from where the setup tools are started.) The size of each directory should be 300MB.

Provide up-to-date versions of R3Load, R3Ldctl and R3szchk in your kernel directory.

You can find details about R3szchk in SAP Note 1047369 - Faster DB accesses for R3szchk using Oracle DBSL.

Provide a directory/share via NFS export in the sender and the receiver system in which the export data can be stored. For a normal system copy, 10% of the database size is sufficient, but it might be necessary to extend this space. For a shell system copy, even less space may be required due to the reduced tables. Since an import via NFS is not officially recommended by the system copy guide, it would be best to also provide local storage on the receiver system as a target for the database import.

Provide contact persons for the areas of SAP basis administration, OS admin and DB admin who can take care of any issues in their respective areas in a timely manner.

The receiver system should already exist (central instance installed) and must have the same release level as the sender system. Note that any existing database content will be deleted. If the receiver system does not yet exist, an installed and patched database parameterized in accordance with SAP recommendations should be pre-installed on the target server. The export directory is either copied to the system (which requires sufficient local disk space) or can be mounted via NFS in advance (use of NFS is not recommended).

Ensure the consistency of the ABAP Dictionary and database of the source system. In particular, make sure that no objects (tables, indexes) exist in the database, but are missing from the ABAP Dictionary and vice versa. If there are differences you can tolerate, you can edit DBDIFF accordingly (transaction DB02 -> Checks -> Database -> ABAP Dictionary). Create any missing objects in the database. For more information, see SAP Note 33814 - Warnings of inconsistencies between database & R/3 DDIC. All tables that have been created manually at database level within the SAP schema user will lead to errors. If an object is not needed any more, delete it from the database or enter it in table DBDIFF. Also, create objects that are missing at database level and only exist in the dictionary in transaction DB02.

4.7 Options to Prepare Receiver for Transfer of HCM Data

There are several options to prepare the receiver for a transfer of HCM data:

Transfer of HCM data into an existing client (full client copy)

Client copy (customizing only) plus time-based scenario (TDTIM) plus HCM scenario

Shell creation plus time-based scenario (TDTIM) plus HCM scenario

Client copy (customizing only) plus time-based reduction and reduction by company code (TDTCC) plus HCM scenario

Shell creation plus time-based reduction and reduction by company code (TDTCC) plus HCM scenario

Client copy (customizing only) plus master data and customizing scenario (TDMDC) plus HCM scenario

Shell creation plus master data and customizing scenario (TDMDC) plus HCM scenario


4.8 Working With RFC Connections

All systems that participate in an SAP TDMS project must be linked by remote function calls (RFC). You can access the maintenance view for the RFC connections either from the migration server overview (by choosing RFC Information) or from the process monitor for a package (by executing the function for activity Define Destinations).

To get a more detailed description of the individual functions and tasks in this area, choose the information button on the RFC maintenance view. Consider also the related information in the security guide for SAP TDMS, for example regarding user roles and authorizations.

4.8.1 Users and User Roles

To be able to work with the RFC destinations for SAP TDMS, you need a communication user (CPIC user) with user role SAP_TDMS_USER (SAP_TDMS_USER_EXT for business process library).

If you are not sure whether your user has the role SAP_TDMS_USER assigned, go to the user maintenance (transaction SU01), tab page Roles, and check the settings for your user. On this tab page, you can also see if the user role SAP_TDMS_USER exists in the system, and generate it if this is not yet the case.

4.8.2 Destination Statuses

Note that the overall statuses for definition, connection and synchronization of RFC destinations in the maintenance view for RFC destinations are green only if all destinations have been defined and synchronized correctly and if all connections work. The respective status is red if one or more destinations have not yet been defined or synchronized at all, if one or more destinations were not defined or synchronized correctly, or if one or more connections do not work.

If a status is red, open the detailed status information to find out exactly which destination is not yet OK, and resolve the issue.
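The aggregation logic can be pictured as in the following sketch. The function name and status values are illustrative only and are not part of the TDMS user interface.

def overall_status(destination_statuses):
    """Sketch of the aggregation rule described above: the overall traffic light
    is green only if every single destination is defined, synchronized and working."""
    return "green" if all(s == "green" for s in destination_statuses) else "red"

print(overall_status(["green", "green", "green"]))   # green
print(overall_status(["green", "red", "green"]))     # red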

4.8.3 Creating RFC Destinations

To create an RFC destination in the context of SAP TDMS, select the relevant system in the maintenance view for RFC destinations, enter the required information, and choose Apply. The system checks if the data you entered is correct and if the connection can be established. If this is the case, the status traffic light for this connection changes to green.

In some cases, particularly if there is a firewall between the participating systems, you may have to maintain the destinations using transaction SM59 rather than the related TDMS function.


Make sure that all RFC users in all destinations for a TDMS transfer have the same time zone assigned. Otherwise, you might get inconsistent values for start and end times as well as for the duration of activities.

4.8.4 Reusing Existing RFC Destinations

You can reuse the data from an existing RFC destination if you have a user and password for this destination. To do so, proceed as follows:

1. In the maintenance view for RFC destinations, select the system for which you want to reuse a destination.

2. Choose Get Destination.

3. Enter the ID of the required destination, or use the search function.

4. Choose Show to display the technical data for this destination.

5. Enter the user ID you want to use, and choose Accept.

6. Enter the password for the user you specified in the previous step, and choose OK to confirm your selection.

4.8.5 Passwords

The passwords for the connections can have a maximum of eight characters. Only capital letters and numbers are allowed.
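The password rule can be expressed as a simple check, as in the following sketch (illustrative only; the function is not part of TDMS).

import re

def is_valid_rfc_password(password):
    """Sketch of the rule stated above: at most eight characters,
    capital letters and digits only."""
    return bool(re.fullmatch(r"[A-Z0-9]{1,8}", password))

print(is_valid_rfc_password("TDMS2010"))    # True
print(is_valid_rfc_password("tdms2010"))    # False - lowercase letters
print(is_valid_rfc_password("TDMS20100"))   # False - nine characters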

Note that the handling of passwords in RFC management has changed with SAP NetWeaver 2004s: The passwords for RFC destinations are no longer stored directly in the table for the RFC connection data, but are administered by Secure Storage. To ensure that this does not lead to problems during the synchronization of RFC destinations, an additional step is required.

Secure Storage is an ABAP kernel function for encrypted data storage. It acts as a protected storage area for access data for external systems. For more information, see SAP Note 816861.

When you start the synchronization for RFC connections, the synchronization program checks if the release level of one or more of the participating SAP systems is SAP NetWeaver 2004s or higher. If this is the case, choose one of the following options:

You maintain the destinations in your central system. If you choose this option, a maintenance view is displayed, and you are prompted to enter the destination passwords. The passwords are then transferred once in unencrypted form.

You maintain the destinations directly in each of the relevant systems by logging on to the system, running transaction CNV_MBT_REP_PWD and entering the passwords. The passwords are then transferred in encrypted form.

4.8.6 Synchronization

Once you have created all required destinations and the overall status traffic light is green, you can start the synchronization. During the synchronization, the administrative data (such as information about the project, subproject and package and about the RFC destinations) is distributed to all participating systems. Completion of the synchronization (green traffic light) is a prerequisite for all subsequent steps in an SAP TDMS project.

Subprojects and packages must be unique across all participating systems. To ensure this, the synchronization includes a check for the existence of the current subproject and package in every participating system. If the current subproject and/or package number has already been used for a TDMS run in the remote system, an error message is output. You can resolve this error by creating a new package (and, if necessary, a new subproject) for the current TDMS transfer.

4.8.7 Locking and Unlocking RFC Destinations

To find out if an RFC destination is locked against changes, choose Set Lock Status.

If the profile parameter PCL_RFC_EXPERT is set for your user, you are authorized to lock and unlock destinations.

Once a destination has been unlocked, any user can change the RFC settings for this destination.

4.8.8 Changing Existing RFC Destinations

You can change unlocked destinations using transaction SM59 – either from the central system or (if synchronization has already taken place) from any of the participating systems.

If you change RFC destination settings (especially the participating systems) when you have already started with a TDMS run, the transfer is invalidated!

4.8.9 Deleting RFC Destinations

If you do not need a destination any more, you can delete it. During the deletion, the technical information for the destination is removed from all participating systems, so that it is no longer possible to use the RFC connection once the deletion is completed. The deletion result can have the following statuses:

Deletion was completed successfully: Traffic light is green

Destination did not exist in the first place or has already been deleted: Traffic light is yellow

Error during deletion: Traffic light is red.

4.8.10 Helpful Transactions for Troubleshooting RFC Destinations

SM59 Maintain RFC Destinations

SMGW Gateway Monitor

ST01 System Trace

ST05 RFC Trace

5 Recommended Procedures and Troubleshooting

5.1 General Troubleshooting Information

If you encounter a problem with a specific SAP TDMS activity, proceed as follows:

Check the activity documentation and the log for information about errors and error resolution.

Select the relevant activity in the process tree and choose Troubleshooting. If specific troubleshooting information is available for this activity, it is displayed. The troubleshooting information can consist of special troubleshooting activities and/or documentation.

Example:

If the data transfer aborts due to duplicate key errors, you can use the troubleshooting activity to change the write behavior of the aborted conversion object to “modify”.


Further examples for which troubleshooting functionality is available:

- Processing of conversion objects

- Maintain parameters for programs for filling internal header tables

- Maintain selection parameter / customizable selects

To find the activity that caused the issue and get troubleshooting information for this specific activity, you may have to activate the extended view of the process monitor first. To do so, choose Settings -> Change process tree view.

Check collective SAP Note 1003051 for SAP Notes that refer to the problem you encountered.

Search for other relevant SAP Notes in components XX-PROJ-DMS-*.

If all this does not help, open a message in component XX-PROJ-DMS-TDM and describe the problem. You can also download the questionnaire provided as an attachment to SAP Note 939823, fill it in and attach it to your message.

If you encounter a problem that is not directly related to a specific activity, check SAP Note 894307 for possible solutions. If this does not help, open a message in component XX-PROJ-DMS-TDM and describe the problem.

5.2 Working With Projects, Subprojects and Packages

All packages in a subproject should be based on the same process type.

Typically, the relationship between a subproject and a given combination of a sender client, a receiver client, and the RFC connection between them is 1:1. However you can create more than one subproject for a given combination if it makes sense in your specific project constellation. The opposite – combining more than one sender, receiver or RFC respectively in one subproject – is not possible. This also means that you cannot have multiple senders or receivers for one transfer.

5.3 Handling of Users and User Addresses

You need a CPIC user in all participating systems of your SAP TDMS project to maintain the RFC connections. These CPIC users must be created as communication users (not system users) and have the role SAP_TDMS_USER (SAP_TDMS_USER_EXT for business process library) assigned.

The dialog user in the central system must have at least the role SAP_TDMS_PROJECT_LEAD_USER (SAP_TDMS_PROJECT_LEAD_USER_EXT for business process library) assigned, because this role is necessary to create projects, subprojects, and packages.

For more information about users, authorizations, profiles and related topics, see the security guide for SAP TDMS and the information provided in SAP Note 897100.

Do not create new users or change user data in the time interval between saving user addresses and adjusting user addresses. The reason for this is as follows: Addresses of business objects (such as customers, vendors, users or clients) are stored in central address tables. Each address has a 10-digit ID. These IDs are assigned to new addresses in ascending order.


For the purposes of TDMS, user-related data in the receiver system must be kept intact, while customizing and master data (and possibly application data) is transferred from the sender system. However address numbers assigned to user addresses in the receiver system may be assigned to other objects (that are to be transferred) in the sender system. During the transfer, this would lead to duplicate table keys, and the data transfer would fail.

To avoid this, the user address information in the receiver system is stored in temporary data containers (shadow tables) and then deleted from its original places. After the addresses from the sender have been inserted, the users are assigned new addresses based on the information in the temporary data containers. So if user data was added or changed after creation of the shadow tables, inconsistencies in the user data might occur.
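The following simplified sketch illustrates the save-and-restore mechanism described above. Dictionaries stand in for the central address tables and the shadow tables, and the function names are illustrative placeholders, not the actual TDMS programs.

def preserve_user_addresses(receiver_addresses, user_address_ids):
    """Step 1 (before the transfer): copy the user addresses into a shadow
    container and remove them from the original address table."""
    return {aid: receiver_addresses.pop(aid) for aid in list(user_address_ids)}

def restore_user_addresses(receiver_addresses, shadow, next_free_id):
    """Step 2 (after the transfer): re-insert the preserved user addresses under
    new, unused address IDs so they cannot collide with addresses that were
    transferred from the sender system."""
    new_ids = {}
    for old_id, address in shadow.items():
        receiver_addresses[next_free_id] = address
        new_ids[old_id] = next_free_id
        next_free_id += 1
    return new_ids

addresses = {1001: "user address A", 1002: "customer address X"}
shadow = preserve_user_addresses(addresses, user_address_ids=[1001])
# ... data transfer from the sender system takes place here ...
restore_user_addresses(addresses, shadow, next_free_id=2000)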

5.4 Impact of the Data Transfer on the Sender System

Generally speaking, a downtime of the sender system during a data transfer with SAP TDMS is not mandatory. However we recommend that no changes should be made in the system while the data for transfer is read and stored in a temporary table (cluster). This “cluster technique” makes it possible to separate the reading process from the actual data transfer, which means that the data transfer itself does not have an impact on the sender system.

The reading procedure typically takes about 50% of the time required for the actual transfer. If, for example, the transfer takes ten hours, you should allow another 5 hours for reading the relevant data and storing it in the cluster.

Because the data extraction creates a substantial load on the system, it should take place during a relatively "quiet" time to ensure that the available hardware resources are sufficient.

Some companies use snapshot or clone technologies for their storage systems or temporary system copies to minimize the impact on the production system.

When the Material Ledger is active in the sender system, no database changes are allowed while the programs for filling internal header tables are running or during data selection. Please check your sender client and lock it if necessary.

5.5 Deletion of Data from Receiver System

5.5.1 Deletion Scenarios

The default method for deleting data from the receiver system is “drop-insert”, because this method is fastest in most cases. However you may want to change the deletion method to “array-delete”, for example for the following reasons:

The size of the client data to be deleted is 30 per cent or less of the total size of the receiver system.

The receiver system is a multi-client system, and the users in other clients cannot be locked while data is being deleted from the receiver client. Consequently, you cannot use a deletion method that involves dropping tables. Therefore, we recommend that you use the activity 'Change Deletion Scenario at Package Level – Optional' in phase 'System Analysis' to switch the deletion scenario to

o O (Overall „Array Delete‟ technique)

o F (Full table scan by checking out non-relevant entries)


Please note that this usually results in longer runtimes.

The deletion method should only be changed by expert users who have a clear understanding of the possible consequences and the required precautions for avoiding loss of data and other issues.

Note that the deletion scenario can only be changed in the time interval directly before the deletion programs are generated. You can change the deletion scenario only if the parameter PCL_EXPERT is set in your user profile. For a description of how to proceed, see SAP Notes 894307 and 1068059.

Deletion of receiver client data may also affect other clients in the receiver system (depending on deletion technique).
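The conceptual difference between the two deletion techniques can be sketched as follows. The database object and its methods are placeholders and the SQL statements are illustrative only; the actual TDMS deletion programs are generated programs in the receiver system.

def delete_client_data(db, table, client, scenario):
    """Conceptual contrast of the two techniques described above (sketch only)."""
    if scenario == "drop-insert":
        # Fast, but the table is unavailable for ALL clients while it runs:
        keep = db.select(f"SELECT * FROM {table} WHERE MANDT <> '{client}'")
        db.execute(f"DROP TABLE {table}")
        db.recreate_from_dictionary(table)   # re-create from the ABAP Dictionary definition
        db.insert_array(table, keep)         # re-insert the data of the other clients
    elif scenario == "array-delete":
        # Slower, but other clients remain untouched and usable:
        db.execute(f"DELETE FROM {table} WHERE MANDT = '{client}'")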

If the receiver system is a multi-client system and the users in all clients can be locked during receiver deletion, we recommend the following:

Use the deletion settings in activity Define Deletion Scenario Automatically. No optional settings in activities Change Deletion Scenario at Package Level – Optional and Change Settings for Tables for Deletion – optional are required.

Please note that the data deletion in a multi-client receiver takes longer than in a single client system. It might take many hours or even more than a day.

If the receiver system is a single-client system, and this client is to be refreshed by TDMS, we recommend using the deletion settings in activity Define Deletion Scenario Automatically. No optional settings in the activities Change Deletion Scenario at Package Level – Optional and Change Settings for Tables for Deletion – Optional are required.

During the receiver deletion:

o do not cancel running deletion jobs

o do not restart the receiver

5.5.2 Locking Users During Deletion

If you use the „drop-insert‟ solution (default solution) for deleting data from the receiver system, not only the actual receiver client, but also all other clients in the system must be locked.

If you do not lock the other clients, the users in the other clients will not be able to work properly during the data deletion, and you even run the risk of losing data and getting inconsistencies in the other clients.

If you do not want to lock the other clients, you need to change the deletion scenario to overall array-delete (O) at package level.


5.6 Define Logical System Name in Receiver System

5.6.1 Background Information

The logical system number (LOGSYS) is used in SAP systems to distribute data between various systems and clients. In order to be able to create a functioning distribution model, you have to make sure that each combination of SID and client is uniquely assigned to a logical system number (LOGSYS).

Since distribution models are subject to changes, you need to be able to convert logical system numbers. In the SAP standard, you can use transaction BDLS to do this. Since the logical system number is stored in more than 1000 tables in an SAP system, transaction BDLS usually has a long runtime.

To avoid the long runtime of BDLS (if necessary) when setting up non-production systems, TDMS offers the possibility to convert the logical system number during the data transfer. However, TDMS only transfers the content of client-dependent tables. To make sure that the target client will be able to run, the data in the cross-client tables T000 and TBDLS referring to the current target client is adjusted after the data transfer.

5.6.2 Recommended Procedure to Avoid Inconsistencies

Before you can transfer data into a client using TDMS, you have to build up the target client first. There are the following options for this:

Create a new target client in an existing non-production system using SAP Client Copy

Create a new target system using SAP System Copy

Create a new target system using TDMS shell creation (see chapter 4.6)

In each of these cases, you should take care that the logical system number of the new client gets a value that has not yet been used anywhere in your system landscape. In option 1, you have to specify an ID when you define the new client (transaction SCC4) before starting the client copy. In the second and third options, you have to carry out transaction BDLS once after the system copy – as described in the system copy guide.

After this, you can start using TDMS. Enter the logical system number of the new client in the TDMS activity Define Logical System Name in Receiver in phase Package Settings. As of SP12, this will be supported by a default value suggested by TDMS and an additional check. It is not necessary to adjust tables T000 and TBDLS (activity Change Logical System Name in phase Postprocessing) if you use this procedure, but you can do it if you want.

By using this procedure, you can make sure that TDMS will not produce any inconsistencies in the logical system numbers.

The runtime of transaction BDLS is considerably shorter if you start it during a TDMS run. You must start the transaction after the activity Delete Data in Receiver System and wait until it has finished before you start the activity Start Data Transfer. This has to be done once during the first TDMS run and is no longer necessary in later TDMS runs that use the same target client.


5.6.3 Alternative Procedure

We explicitly do not recommend or support the procedure described in the following paragraphs.

If the target client is a stand-alone client and not part of a logical system landscape (ALE, systems with other Business Suite components), you will usually be able to live with some inconsistencies in the logical system numbers.

In this case, you would not have to carry out transaction BDLS and would not have to see to it that the logical system number of the target client has not yet been used in your system landscape. If it should become necessary to integrate the client into the system landscape at a later point in time, you have to check all cross-client tables that contain a logical system number for relevant entries, and adjust them if necessary. Such tables are, for example, TKEBWTSN, TBTCO, ROIDOCPRMS, ROOSGEN, ROOSPRMS, RSBASIDOC, TBDBANK, TBDBANKC, TBDLST, TKEBWLGN, TKEBWLOGN. Please note that these are only examples and that this list of tables does not claim to be complete. Customer-specific cross-client tables may have to be added.

If it becomes necessary to change the logical system number, you have to do so using transaction BDLS.


6 Performance Issues

The programs listed below are known to run for a very long time (or even to terminate) under certain circumstances. If you experience performance problems with any of these programs, refer to SAP Note 916763 for guidance on how to proceed.

o Delete data from receiver system

o Analyze table sizes

o Start data selection (fill cluster)

o Start data transfer

In the context of time-based reduction, internal header tables need to be filled with information that is not directly available, but needed for the reduction. If the data volume is very big, you may experience performance issues with the programs for filling some of these header tables, or the programs may even terminate. To avoid this, you can set certain parameters. The affected tables as well as the corresponding SAP Notes containing information about the relevant parameters are listed below:

o TD05X_FILL_VBUK_1 Note 1058864

o TD05X_FILL_VBUK_2 Note 1054584

o TD05X_FILL_BKPF Note 1044518

o TD05X_FILL_EBAN Note 1054583

o TD05X_FILL_EQUI Note 1037712

To find out in advance if the above mentioned performance issues are likely to occur in your TDMS project, you should estimate the relevant data volume beforehand, considering the number of data records in the system as well as the number of data records to be transferred for the selected from-date, and read the SAP Notes mentioned above to learn more about critical values.

Standard transactions for performance-related tasks that are also relevant in the context of SAP TDMS:

o SM50 (Process Overview)

o SM51 (SAP Servers): Overview of existing SAP servers and related information

o ST06 (Operating System Monitor): Overview of server load and available resources

o ST04 (Database Performance Analysis)

o ST05 (Trace Requests): Particularly important for SQL trace

o SE30 (ABAP Runtime Analysis)

o SM37 (Simple Job Selection): Overview of programs run in the background and the related logs

o ST03 (Workload): Information about system load and analysis options

o STAD (Select Statistical Records): Information about users, programs etc.


Consider also the related information provided in the process monitor for your SAP TDMS package (for example about program runtimes) and the general tips and tricks for handling performance issues that are listed in SAP Note 916763.

Recommended initial settings for large databases

o Make sure to switch off the size prediction using activity De- / Activate Size Prediction for Receiver System to avoid execution of programs for filling internal header tables during the system analysis phase.

o When you start working in the phase Data Transfer (which means that this phase is the technically active phase), set parameter „P_CLU‟ to „Y‟ in the following activities before you start the activities for filling internal header tables:

TD05X_FILL_BKPF

TD05X_FILL_CE

TD05X_FILL_EKKO

TD05X_FILL_VBUK

TD05X_FILL_VBUK_1

TD05X_FILL_VBUK_2

TD05X_FILL_VSRESB

TD05X_FILL_WBRK_1

You can do so by placing the cursor on each of these activities, choosing the Troubleshooting button, and executing the activity Adjust parameter for execution in the troubleshooting tree. You then get a list of possible parameters for the activity. Set P_CLU to the selection value low = Y.

o Make sure that the index for VBFA is created.

o Set the number of batch processes for the sender, central and receiver system as high as possible (menu - Process Settings). This can be done at activity level. You should consider in which of the systems a long-running activity can be executed.

o Choose the most recent from-date possible - at least in a first test package - to get a reliable estimate about expected runtimes.

7 How to Meet Specific Requirements

7.1 Reorganization of Transfer Packages in Migration Server Overview

Reorganization of transfer packages as it is currently designed always means a loss of information about created test data transfer packages, even if a test data transfer has been performed. The selection of packages for reorganization is the responsibility of the executing user – but it is, of course, protected by standard TDMS authorization and registration checks.

The minimum authorization required for reorganization is included in the TDMS user role SAP_TDMS_SUBPROJECT_LEAD. Users with the corresponding authorization additionally need to be registered for the corresponding subproject of a package that is to be reorganized. User role SAP_TDMS_MASTER also includes the required authorization and - because of the special meaning of the role - does not require an explicit registration of the corresponding user for the subproject.

Reorganization of active packages is not possible. To delete a package that is currently active, deactivate it in the migration server overview first (transaction CNV_MBT_TDMS; menu path: Package -> Deactivate).

The reorganization procedure performs the following actions:

determination of information assigned to a given transfer package in the local system in corresponding TDMS control tables

determination of application logs assigned to a given transfer package

deletion of corresponding entries in control tables and in application log

deletion of transfer package from TDMS migration server overview, and deletion of reference to the subproject

list output: entries to be deleted in each control table (in Count mode)

list output: deleted entries in each control table (in Deletion mode)

Reorganize Package – step by step description

1. Make sure that you have the required authorization (recommended user role: SAP_TDMS_MASTER).

2. Make sure that your user is registered for the subproject of the package to be deleted (not required if your user has the role SAP_TDMS_MASTER).

3. Start report CNV_MBT_PACKAGE_REORG using transaction SE38 – or choose the menu path Package -> Delete Package in the migration server overview.

4. The default setting of the selection screen is meant to perform just a determination of control table entries and application log entries related to a given package.

In the package number field, enter the number of the package you want to delete. (Option Write Log is not yet implemented as an alternative to list output). Execution in dialog or background is possible.

If you start the program with this option, the result list shows the number of entries to be deleted in all assigned TDMS control tables.

5. To actually execute the reorganization of a package, select the options Delete Objects and List Results in the selection screen. Execution in dialog may take a few minutes.

The result list shows the number of deleted entries in all assigned TDMS control tables as well as the number of deleted application logs.


6. As a result of the deletion process, the package is no longer shown in the TDMS overview transaction.

7.2 Deletion of Obsolete Function Groups

The largest portion of space on TDMS central systems is taken up by generated function groups and related function modules that are needed for the data transfer. In particular, the standard reference tables of the ABAP Workbench consume around 3 to 4 GB for an initial setup package (after the generation of modules has been done).

The deletion program for generated function groups CNV_MBT_DTL_FUGR_DELETE therefore identifies function groups and function modules that are no longer required in the current data transfer. The prerequisite for this decision is that corresponding transfer packages have been deleted. If a subproject that used to have packages assigned does not have any assigned packages any longer due to deletion of packages, the corresponding function groups are identified as obsolete and therefore selected for deletion.

Below the subproject level, the corresponding function groups are organized in mass transfer IDs (MTIDs). A mass transfer ID contains all function groups that are used for a transfer package (initial setup) and all its copies („copy for refresh‟ packages). This means that an MTID (and therefore the assigned function groups) is released for deletion only if the corresponding initial setup package with all its copies is deleted. Obsolete MTIDs are listed in table CNVMBTUSEDMTIDS.

Even after execution of function group deletion, do not delete the corresponding entries in table CNVMBTUSEDMTIDS, because otherwise inconsistencies would occur during future data transfers.
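The determination rule can be sketched as follows. This is an illustration only; the function and the data structures are placeholders, not the actual logic of the report.

def obsolete_mtids(mtid_to_packages, deleted_packages):
    """Sketch of the rule described above: a mass transfer ID (and its function
    groups) becomes obsolete only when the initial setup package and all of its
    'copy for refresh' packages have been deleted."""
    deleted = set(deleted_packages)
    return [mtid for mtid, packages in mtid_to_packages.items()
            if set(packages) <= deleted]

# Example: MTID 100 owns initial package 900 and its refresh copy 901
print(obsolete_mtids({100: [900, 901], 200: [902]}, deleted_packages=[900, 901]))   # [100]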

To perform the deletion of function groups, the following steps are required:

1. Start report CNV_MBT_DTL_FUGR_DELETE in transaction SE38.

2. The default selection screen should be used for execution, as all obsolete function groups are selected for deletion automatically. Execute the report in the background, because determination of relevant function groups and deletion of function modules and function groups usually take several hours.

As a result of the deletion, the spool (list output) of the report shows the number of deleted function groups.


7.3 Data Protection and Privacy

The processing of personal data and the free movement of such data are particularly protected by law. There are data protection and privacy provisions that protect personal data such as first name, last name, date of birth, social security number, payroll data, customer relationship information, account numbers and so on. Therefore, an important aspect in the context of building up non-productive systems is compliance with the requirements regarding data protection and privacy.

This is particularly important given that non-productive environments – especially test or development landscapes – are less protected than productive systems. Authorizations are granted more generously, and sometimes even external users have access to these environments. In addition, errors may occur or even be caused deliberately, which might lead to a loss of data confidentiality.

7.3.1 Legal Requirements with Regard to Data Protection and Privacy

As mentioned above, there are legal requirements regarding data protection and privacy.

These include for example:

The Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data. This is a European Union directive which regulates the processing of personal data within the European Union.

Federal Data Protection Act (Bundesdatenschutzgesetz = BDSG)

This is the German implementation of the above-mentioned EU directive and thus, it protects the processing of personal data (§1). In addition, it stipulates that data processing systems should aim at data reduction and data economy (§3a). This means that they should use as little personal data as possible and make use of possibilities to anonymize them. The limitation of use to specific purposes (Zweckbindungsgebot) allows companies to use customer data only for the purposes that have been stipulated in their underlying contracts. A usage in test and development environments is not covered by this and thus not allowed. Therefore, companies have to either create test data or anonymize productive data, if the creation of test data is not feasible.

ISO/IEC 27001 is an Information Security Management System (ISMS) standard published by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). Its full name is ISO/IEC 27001:2005 - Information technology -- Security techniques -- Information security management systems -- Requirements but it is commonly known as "ISO 27001".

7.3.2 TDMS Functionality to comply with Data Protection and Privacy Requirements

Please note that TDMS itself does not process any personal data, but helps you to operate your data processing systems in a way that is in line with data protection and privacy requirements. In principle, it offers the following functionalities that are relevant for data protection:

Anonymization of data

Reduction of data

Transfer of data


The scrambling functionality includes specific functionality for HCM data and templates with predefined country-specific content. For other (non-HCM) data, you can use the TDMS workbench to create your own scrambling rules and the business process library to scramble BPL business contexts.

Principal strategies to anonymize HCM data

Basically, there are the following strategies to anonymize HCM data (a conceptual sketch follows the list):

Data values can be

o replaced randomly by values from a predefined value table

o replaced by a fixed target value

o deleted

o translated, for example A becomes B, B becomes C etc.

o randomized in time period, for example to scramble only the day and month in a date field

o randomized in value range, for example to take one value from a pre-defined value range

o changed according to customer-specific scrambling rules created using the TDMS Workbench

The anonymized content of a field can be passed on to other data fields to keep the anonymized data consistent

There are user exits for customer-specific anonymization requirements

There are templates with pre-defined country-specific scrambling content
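The following sketch illustrates some of the strategies listed above in a simplified way. It is conceptual only: actual TDMS scrambling rules are defined in the scrambling customizing, and the function names below are placeholders rather than TDMS objects.

import random
from datetime import date

def scramble_fixed(value, target="TEST"):
    """Replace the value by a fixed target value."""
    return target

def scramble_from_value_table(value, value_table):
    """Replace the value randomly by a value from a predefined value table."""
    return random.choice(value_table)

def scramble_translate(value):
    """Translate letters, for example A becomes B, B becomes C, ..., Z becomes A."""
    return "".join(chr((ord(c) - 65 + 1) % 26 + 65) if c.isalpha() else c
                   for c in value.upper())

def scramble_date_keep_year(d):
    """Randomize day and month within the time period, keep the year itself."""
    return date(d.year, random.randint(1, 12), random.randint(1, 28))

print(scramble_translate("MEYER"))                       # NFZFS
print(scramble_date_keep_year(date(1975, 4, 23)).year)   # 1975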

You will find a detailed description of the scrambling concept in the section „Data Scrambling for HCM Data‟

Principal strategies to anonymize other ERP data

Using the TDMS workbench, you can create your own package-specific and customer-specific scrambling rules. The workbench includes some example rules that demonstrate what rules may look like and can serve as templates. In principle, you can create field-related or event-related rules. For further details, refer to the section 'Data Scrambling (General)'.

For information about scrambling using the business process library, refer to section „How to Scramble Data in a BPL context‟.

For a description of the reduction scenarios and transfer of data, refer to the TDMS Master Guide.


7.4 Data Scrambling (General)

7.4.1 Background

In its default setup, SAP TDMS transfers data from the sender to the receiver system without changing it. However you may want to scramble certain sensitive data before transferring it to ensure data security. SAP TDMS provides the environment for developing scrambling rules and assigning them to migration packages.

SAP TDMS does not come with any ready-made scrambling rules that could be used directly in SAP TDMS packages. It includes only example rules that demonstrate what rules may look like and can serve as templates.

Developing rules for consistent data scrambling across multiple applications requires in-depth technical and content knowledge of the relevant applications and the relations between them. Consequently, SAP recommends that scrambling rules should only be developed by suitably qualified developers or consultants. If you are interested in a specific training for data scrambling with SAP TDMS, open a message under component XX-PROJ-DMS-TDM.

The scrambling functions provided with SAP TDMS work only for transparent fields.

There are the following types of scrambling rules (see the sketch after the list):

field-related rules for scrambling the content of a single field

event-related rules that apply at record level and change the contents of a complete record using a single rule. This type of rule is used for scrambling multiple fields at a time (e.g. address scrambling).
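Conceptually, the two rule types can be pictured as in the following sketch. The function and field names are illustrative placeholders only; actual rules are created in the TDMS workbench as described below.

def field_rule_scramble_vorna(vorna):
    """Field-related rule (sketch): import parameter(s) in, exactly one export
    parameter out - the scrambled value of a single field."""
    return "FIRSTNAME"   # e.g. replace the first name by a fixed value

def event_rule_scramble_address(record):
    """Event-related rule (sketch): operates at record level and may change
    several fields of the record at once (e.g. address scrambling)."""
    record["STRAS"] = "TEST STREET 1"   # street (illustrative field names)
    record["ORT01"] = "TESTCITY"        # city
    record["PSTLZ"] = "00000"           # postal code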

The TDMS workbench (transaction CNVMBTTWB) that comes with SAP TDMS provides an environment for developing conversion rules, that is, rules for changing data before it is transferred. As scrambling rules are a specific type of conversion rule, you can use this environment to create scrambling rules.

Scrambling rules are always created in the context of a rule package (see section 7.4.2 for details). However you can assign existing scrambling rules to a migration package without having to assign the complete rule package.

There are the following options for applying the rules in SAP TDMS transfers:

The rule package containing the rules is assigned to a standard TDMS package that represents a process type. Whenever this standard package is loaded to create a new migration package, the rules of the rule package are included automatically and the rules are applied to tables or domains as defined within the rules workbench.

The process monitor for the migration package contains an optional activity for maintaining conversion and scrambling rules where you can select rules and assign them to the current transfer package.


To apply scrambling rules, you must import them into client 000 or into the migration client of the TDMS central system. If you import the rules into client 000, they can be assigned to TDMS standard packages (for process types) and loaded into migration packages automatically. If you import the rules into the migration client, they can only be assigned to each migration package individually.

7.4.2 Creating a Rules Package

1. In the TDMS workbench (transaction CNVMBTTWB), select the option Create package, enter a five-character package ID and a description, and save your entries.

2. The package overview screen is displayed. Choose Package Definition.

3. Choose the package type Package module containing conversion or scrambling and the scenario Test data migration server transfer. Enter the minimum and maximum SAP releases to which the rules contained in the package apply. Save your settings.

If you created the rule package in client 000, you now need to assign it to a standard package (process type) to make it available. To do so, proceed as follows:

1. In the TDMS workbench (transaction CNVMBTTWB), select the option Maintain package, enter the ID of the standard package (process type) to which you want to assign your rule package, and save your entries.

2. The package overview screen is displayed. Choose Package Definition.

3. On tab page Dependent Packages, enter your rule package as a dependent package. Save your entries.

7.4.3 Working With Scrambling Rules

To create a new scrambling rule, proceed as follows:

1. In the package overview screen for the relevant rule package, choose Conversion rules -> Definition.

2. Choose the Create icon.

3. Enter a name and a description for the scrambling rule and save your settings.

4. Enter the rule attributes:

Under Classification, choose Scrambling rule.

Under Rule Type, choose either field-related or event-related. Note that a field related rule can have only one export parameter (the field to be changed) and more than one import parameter. An event-related rule cannot have any parameters, since it does not involve any specific field, but operates at record level.

For field-related scrambling rules, you can additionally maintain consistent scrambling values. If you select this option, the system ensures that the data is scrambled consistently across all tables covered by the scrambling rule. This means that if value A has been scrambled to value B, then in all tables involving the scrambling rule, A is always scrambled to B and not to any other new value.

If you want to use consistent scrambling values, enter an identifier for consistent scrambling. The string entered here is used as the unique identifier for the mapping between values in the sender system and the scrambled values. If there is more than one scrambling rule for a particular field, make sure that you enter the same identifier for all these rules. (A conceptual sketch of consistent scrambling follows this procedure.)

5. If you want to create a field-related rule, enter the import and export parameters for the rule on tab page Parameters.

The import parameters represent the input to the rule from the sender system. The export parameters represent the output of the rule that is sent to the receiver system. A field-related rule should have at least one import parameter and one export parameter. Sometimes a rule may need more than one field from the sender system to scramble the data (e.g. for scrambling KUNNR in table AUSP). In such cases, you can define more than one import parameter.

When defining a parameter, enter also the reference domain name and roll name. This information is needed to define the parameters in the includes (source codes) during data conversion (or scrambling).

6. Enter the source code for the rule.

7. Enter the table assignments for the rule.

For an event-related rule, enter the rule name and the step for each table. The step defines the exact point in time during the data transfer at which the rule is applied. For scrambling, this is usually EOR (end of record). Data transfer is carried out record by record within a loop; EOR means that the conversion takes place inside the loop at the end of every record (see the sketch after this procedure).

Possible steps are:

BOL – Begin of Loop

EOL – End of Loop

BOT – Begin of Transaction

EOT – End of Transaction

BOR – Begin of Record

EOR – End of Record

BOP – Begin of Process

EOP – End of Process

For a field-related rule, you can make the assignment either at domain level or at field level.

Domain level assignment is the simplest possible way to assign rules to a set of tables. As the name indicates, the assignment works at domain level, that is, the rule is applied to all the tables involving a field that uses the mentioned domain. Choose Domain Level Assignment and enter the required domain(s) and corresponding roll name(s).

Field level assignment is a more detailed way of assigning a rule to a table. Here the relevant table names and field names in the sender and the receiver system must be specified explicitly.

Enter the relevant receiver tables and fields to which the export parameter refers. Then specify the sender parameter(s) that correspond to the import parameters of the rule. To do so, choose a receiver table field by double-clicking. The import parameters are then automatically displayed on the right side of the screen. Enter the field in the table from which the import parameters are fetched.

If you would like to get a suggestion for the sender parameters, select the receiver table field and choose Suggest Sender Parameters. The system then automatically proposes the sender fields by scanning all fields in the table and matching them against the import parameter definitions.
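The following sketch ties together two concepts from the procedure above: the consistent-scrambling identifier from step 4 and the EOR step from step 7. It is a simplified illustration, not the generated TDMS transfer program; all names are placeholders.

import random

consistent_values = {}   # identifier -> {sender value: scrambled value}

def scramble_consistently(identifier, value, scramble_fn):
    """Consistent scrambling (step 4): once a value has been scrambled under a
    given identifier, every table using that identifier gets the same result."""
    mapping = consistent_values.setdefault(identifier, {})
    if value not in mapping:
        mapping[value] = scramble_fn(value)   # scramble only the first time
    return mapping[value]

def transfer_table(records, eor_rules):
    """Transfer loop (step 7): event-related rules registered for the EOR step
    run at the end of every record, inside the record-by-record loop."""
    transferred = []
    for record in records:          # BOR - begin of record
        for rule in eor_rules:
            rule(record)            # EOR - end of record (typical scrambling step)
        transferred.append(record)
    return transferred              # EOL - end of loop

def scramble_vorna(record):
    """Event-related rule that scrambles the first name consistently."""
    record["VORNA"] = scramble_consistently(
        "FIRST_NAME", record["VORNA"],
        lambda v: "NAME{:04d}".format(random.randint(0, 9999)))

rows = [{"PERNR": "00000001", "VORNA": "ANNA"},
        {"PERNR": "00000002", "VORNA": "ANNA"}]
result = transfer_table(rows, [scramble_vorna])
assert result[0]["VORNA"] == result[1]["VORNA"]   # same sender value, same scrambled value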


From the rule maintenance view, you can not only create scrambling rules, but also do the following:

Edit or delete an existing rule

Copy all the contents of a rule into a new rule

Do a consistency check for a rule

Do a domain-based search of tables to get a list of table fields that were defined using the specified domain.

7.5 Data Scrambling for HCM Data

HCM data is particularly sensitive. To preserve the integrity of this data, you will probably want to make it unrecognizable before you transfer it to a non-production system. This is why SAP TDMS provides a special user interface with which you can define customizing for scrambling sensitive HCM data.

7.5.1 Background

Some sensitive fields are obvious: names, addresses, and bank details must be altered before the respective data records can be used for testing purposes. Some are less obvious, though, and sometimes there are dependencies. For example, the German social insurance number depends on the date of birth and the sex of the individual, and this information (infotype 13) is also connected to the RD (German payroll) cluster. The Swiss AV and American social security numbers pose similar problems.

Scrambling requires complex customizing because you need to decide how the data should be altered, and determine the dependencies between the fields involved.

7.5.2 General Concepts

7.5.2.1 Inheritance

The customizing for scrambling is possible on three levels. The highest level is the project level, followed by the subproject level and the package level. Customizing at project level is valid for all subprojects and packages of a project. Customizing at subproject level is valid for all packages within a subproject, and customizing at package level affects only that package.

Lower levels inherit the customizing settings from higher levels, but you can redefine customizing for lower levels.

7.5.2.2 Transport requests

All changing activities in scramble customizing are traced and stored in a customizing transport request.

7.5.2.3 Template

SAP delivers a scrambling customizing template for TDMS HCM.

Client: 000, project: *, subproject: *, process type: R, pack-id: TDHSC


7.5.3 How to Create Customizing for Data Scrambling

There is a specific UI where you implement customizing for scrambling. You can implement customizing for scrambling in the following different ways:

Define scrambling at project or subproject level

You can use transaction CNV_TDMS_HCM_SCRAM to prepare and maintain customizing for scrambling at project or subproject level.

The first step is to decide if you want to define all customizing for scrambling by yourself, or if you want to take over the delivered SAP template as the starting point for your work. Choose Project or Subproject to customize scrambling at a project or subproject level.

To start without the template, choose Create only Standard Rules.

To work with an existing project or subproject, choose Maintenance. If you are working in a distributed landscape, you need an RFC destination to the sender system to connect to the correct DDIC.

To use the SAP template, choose Copy from other Project. You can now copy a scramble definition. To copy the SAP template, just mark the checkbox Copy from Template and the required settings for the source are automatically defaulted.

Pack ID: TDHSC

Project: *

Subproject: *

You can use a description prefix, for example SAP_, to easily identify elements. For example, if you use this prefix and the name of a scramble group in the template is HCM_DE_01, then after the copy the name is SAP_HCM_DE_01. This is useful for copying in add-on mode into an existing project, because it helps you to identify what was added by the copy.

There are the following copy modes:

o Overwriting: Customizing is deleted and replaced by source values.

o Add-on: All incoming values are copied as additional scramble elements into the relevant level.

o Fill gaps: Scramble elements that were deleted from the existing version are restored. Existing scramble elements are not overwritten with the incoming values; instead, the incoming values are inserted as in add-on mode.

Define scrambling at package level (activity Define Scramble Rules under Configuration and Selection in the package process tree)

Here, the customizing is done at package level. All settings from project and/or subproject level are visible and selectable. In most cases, it is sufficient to switch actions on or off, but you can also define package-specific rules.

7.5.4 Example

The following example describes how to change the name in HR master data and payroll cluster.

This is only an example, not a complete overview of customizing for scrambling.


7.5.4.1 Define Field Sets

Naming information is stored in infotype 0002:

VORNA - first name

NACHN - last name

Navigate to the second-level screen for field set development. (If the customizing for scrambling is completely empty, first create a dummy scramble group and a dummy scramble set. Assign the set to the group and select it; the field set toolbar is then available.)

Create a new field set and name it “First Name”. The primary field line is now selected. Choose the Maintain result parameter button (on the right-hand side, directly under the tree area). An input window appears. Fill in the DDIC name of the field; for the first name in infotype 0002, it is PA0002-VORNA. If you enter an incorrect value, you get a message and can repeat the action. If the input is correct, the selected line is filled out. On the right side (target side), the field name is marked red because no key set has yet been maintained for table PA0002 (left-hand side grid area). If a key set is already maintained in customizing, it is found and displayed in the left grid area.

Next, you maintain the key set. The normal way of doing this is to use the proposal (choose the tool button in key set grid area) and delete all entries that are not needed, with the exception of PA0002-PERNR. The primary field line is finished.

Creating additional result lines:

In the right-hand side grid area, you find a list of all DDIC fields with the same data type as PA0002-VORNA; this helps you to define additional result lines. For example, create a new result line (choose the button on the left-hand side of the tree area toolbar), select the field PA0021-FAVOR in the field list, and choose Move in from list on the right side of the tree area toolbar. If it is marked red now, maintain the key set.

To really find all fields in which the first name is stored by using DDIC, you need to be very familiar with the database. However the template already includes a large part of the relevant information.

In infotype 0001, there are two search help fields called ENAME and SNAME. These fields contain a concatenation of the first and last name in capital letters.

Create a new result line, choose “Maintain result parameter”, and fill in PA0001-ENAME. Maintain the key set if necessary. Inline coding is also required.

To create the inline coding, choose “Inline coding after transfer” and enter the following statement in the popup window:

TRANSLATE & TO UPPER CASE.

This converts the field content to capital letters as the last action of the scrambling.
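(Assuming that the placeholder & stands for the field currently being scrambled, the statement executed for PA0001-ENAME would correspond to: TRANSLATE PA0001-ENAME TO UPPER CASE.)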

In the payroll cluster, the first name is stored in the structure NAT-NAME. Create a new result line with NAT-NAME-VORNA. As the key set, you can take the standard TDMS key structure for HCM cluster TDMS_CL_KEY with the field PERNR.

Finally, bring the fields from your own infotypes into the field set.

You then create the next field set for the last name. Do this in the same way as for the first name.

For the fields ENAME and SNAME, you also need inline coding, in this case before the transfer (left side):

CONCATENATE & OUT INTO & SEPARATED BY SPACE. and

CONCATENATE OUT & INTO & SEPARATED BY SPACE.


You can define an unrestricted number of field sets. As the example of the fields ENAME and SNAME shows, fields can be members of different field sets at the same time. The use of condition sets in field sets is possible, but makes sense only in very special cases.

7.5.4.2 Define Scramble Set

In this example, you define a scramble set by using two new field sets.

Return to the primary level screen.

Create a scramble set by choosing Create in the scramble set grid area. Fill in the name of the new scramble set. For this example, use “Person Name”.

To assign field sets to a scramble set, you must first assign the scramble set to a scramble group. If you are unsure about how you want to organize your customizing for scrambling at this stage, you can use a dummy scramble group for the purposes of the definition phase. You can use scramble sets in multiple groups. You can later organize all your scramble sets by different categories and switch them on and off at package level.

Select the target scramble set in the tree. In the middle grid, choose “Field-Set into Scramble-Set” on the left tree toolbar. The system creates an additional line at scramble set level for this field set. In our example, start with the field set First name. Then choose the new line in the tree, and the grid areas change. In the right grid, you now find scramble (rule) types. Select one (the best method in this case is Random by value table), and choose Scramb-rule to Field-Set (toolbar of right tree area) to assign this scramble (rule) type.

To maintain values in the random table, choose “edit” in the right tree area toolbar. The system launches a new window. You can enter values into the random table manually or by CSV file upload. The number of values is unrestricted.

The next step is to do the same for the field set Last name.

Note that the sequence of field sets in a scramble set influences the way the information is processed. In our example for the fields ENAME and SNAME, it is important to start with the first name.

In most languages, there are different names for men and women. Consequently, a single random table with first names is not enough.

7.5.4.3 Define Condition Set

Navigate to the second level screen for condition set development by choosing Create in the condition set grid area.

Enter “Women” as description text for the condition set. The system displays an empty condition line. Select the third column in the tree at the new condition line and choose Switch field after Scrambling in the middle tree toolbar. Fill the input line with PA0002-GESCH. This is the gender key of infotype 0002. Select the fifth column, choose Create Fixed value, and then enter '2'. Finally, double-click in the fourth column and choose the equal operator in the popup window. Now the condition set “Women” is ready.

Do the same with fixed value '1' and define the condition set “Man”.

Return to the primary level screen.


Now assign the condition set “Women” to the field set line “First name”. Select this line in the tree and the corresponding line in the condition set grid, and choose Move Cond-Set to Field-Set. Maintain the random table only with names for women.

Add the field set “First name” once more to the scramble set “Person Name”. Move the new line with the Drill up and Drill down buttons directly to the first entry of the field set. Move the condition set “Man” into this line, set the scramble (rule) type to Random by value table, and maintain the value table only with names for men.

7.5.4.4 Define Scramble Group

The last step involves organizing the scramble sets into logical groups. Create a new group by providing a description, and move in some scramble sets. Switch all inserted scramble sets to the active state (use the node button in the tree to switch to a green light, which means active). To include the active scramble sets of a scramble group in the scramble process during data transfer, select the node checkbox for this scramble group.

7.5.4.5 Define Customer-Specific Scrambling Type Using TDMS Workbench

Below you will find an example of how to create a customer-specific scrambling type with the help of the TDMS workbench.

1. Start the TDMS workbench using transaction CNVMBTTWB.

2. Create a rules package and choose a package ID that is within the customer namespace 9nnnn.

3. Choose the tab page Package Definition and confirm the popup to create a package.

4. Choose the tab page Package Attributes and enter the following information:

Description: Enter a description
Pack Type: R (Package module containing conversion or scrambling rules)
Scenario: TDMS (Test data migration server transfer package)

5. Choose the tab page Package Definition.

6. Choose the button Conversion rules Definition and confirm the popup again.

7. On the next popup, Determine scrambling methods, choose Manual Creation.

8. Choose the button Create and create a new rule with a rule name and description.

9. Choose the tab page Attributes and enter:

Classification: S (Scrambling Rule)
Rule Type: 2 (field-related)

10. Choose the tab page Parameters to create two new parameters. Naming conventions:

Incoming value: OLD_XXXXX, type import (XXXXX = field name to be scrambled; this must be identical to the field name in the related table.)
Outgoing value: NEW_XXXXX, type export

For both parameters, enter in the field Data Type the name of the domain you would like to scramble; as typing method, use TYPE.

11. Write the source code (see the sketch after this procedure). The code is the body of a form routine without the FORM and ENDFORM statements. Do not change OLD_XXXXX; store the scrambled content in NEW_XXXXX. The runtime environment for your coding is the sender system. The following global variables from the HCM environment are available:

GV_CALL_REPID: Main program that is calling the scrambling engine
GV_PACKID: Package ID
GV_CNVMBTDETS_SND: RFC destination of the sender system


GV_CNVMBTDETS_RCV: RFC destination of the receiver system
GV_CNVMBTDETS_CNT: RFC destination of the control system

When global variables are used, the consistency check returns an error (at that time there is no communication between the development platform and TDMS4HCM). You can ignore this error and activate your rule anyway.

12. Under Table Assignment, choose the button Domain Level Assignment and create a new entry with the domain name you would like to scramble.

13. Save and activate your work. Make sure that the Active indicator is set for the rule name on the left side of the screen.
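The following is a minimal sketch of such a rule body. It assumes a rule created for a field called VORNA, so the generated parameters are OLD_VORNA (import) and NEW_VORNA (export); adapt the parameter names and the scrambling logic to your own field.

* Example rule body (no FORM/ENDFORM statements).
* Keeps the first letter of the original value and replaces the rest
* with a neutral string; OLD_VORNA itself is not changed.
DATA lv_first(1) TYPE c.
lv_first = old_vorna(1).
CONCATENATE lv_first 'XXXXXXXXX' INTO new_vorna.
TRANSLATE new_vorna TO UPPER CASE.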

A rule package can contain several rules and is independent of the TDMS project structure. The rules can be used in all projects and packages.

Publish your rule in scrambling UI

1. Start the scrambling maintenance UI via transaction CNV_TDMS_HCM_SCRAM at project/subproject level or via Configuration and Selection at package level.

2. Select a Field Set entry in the column tree at the top-level screen.

3. Push the button Create in the scrambling type grid.

4. Select your rule(s) from package 9nnnn (only your descriptions are listed and the checkbox is located right behind them) and press Enter.

5. Now you can use your rule in the same way as standard Scrambling Types.

Release the related customizing and development transport requests together.

7.6 TDMS and SAP APO

With regard to the use of SAP TDMS in connection with SAP Advanced Planner and Optimizer (SAP APO), there are the following options:

Option A:

1. Synchronous (complete) copy of R/3 PROD --> R/3 sandbox (alternatively by means of shell creation) and APO PROD --> APO sandbox.

2. TDMS copy from R/3 SBX --> R/3 QAS plus deletion of application data of non-APO-specific master and transactional data in APO QAS (if necessary)

3. Initial transfer of planning-related master and transaction data from R/3 QAS to APO QAS (if necessary due to TDMS).

Advantage: No liveCache initialization necessary; APO-specific master data (resources etc) do not have to be handled separately

Disadvantage: Additional APO system copy necessary

Option B:

1. Complete copy of R/3 PROD --> R/3 SBX while keeping APO QAS unchanged.

2. TDMS-copy of R/3 SBX --> R/3 QAS and application-specific deletion of non-APO-specific master and transactional data in APO QAS.


3. Special handling of forecast data (InfoCubes) and master data (APO Loader, master data BAPIs).

4. Initialize liveCache in APO QAS.

5. Initial transfer of planning-relevant master and transactional data from R/3 QAS to APO QAS.

Advantage: No additional APO system copy needed

Disadvantage: Relatively high effort on the application side, heavily dependent on the actual customer scenario

We recommend option A.

7.7 Data That Is Not Transferred

By default, the content for the following areas is not transferred:

Areas for which the existing information in the receiver system should remain intact (rather than being replaced with the corresponding information from the sender system):

Users

Authorizations

Roles

Variants

Business Workplace (formerly SAPoffice)

Content of certain tables from the following areas that often contain huge amounts of data, but are usually not needed in the non-production system:

Change documents

ArchiveLinks

SAP Business Workflow

It is possible to include tables from these areas in the transfer if this makes sense from a business perspective or for consistency reasons. However, this requires customizing changes and should consequently be done only by expert users. If you have requirements in this area, but do not have the necessary expertise, consider involving consulting specialists for SAP TDMS.

7.8 How to Transfer Individual Tables

If you want to transfer the complete data of a single table, you can do so using the TDMS process type “TDMS: Initial Package for Business Process Library”.

7.8.1 Prerequisites

SAP Test Data Migration Server 3.0 is installed and has at least the following support package levels: DMIS SP7, DMIS_CNT SP7 and DMIS_EXT SP1.

Communication users with the required authorizations (e.g. SAP_TDMS_USER_EXT) have been created in the SAP TDMS server client and both the sender and receiver client.

A dialog user with the required authorizations (e.g. SAP_TDMS_MASTER_EXT) is available in the SAP TDMS server client.


We recommend that you create a new subproject for BPL packages that are only used to transfer single tables without data reduction. Note: One subproject should only be used for one sender/receiver client combination. Do not change RFC information once it has been set up.

7.8.2 Data Transfer

Create a new package of type “TDMS: Initial Package for Business Process Library”.

Run activity Define RFC Destination and enter the information of your central, sender and receiver clients.

Run activity Create BPL Scenario below node Setup Scenario in phase Setup Phase for BPL.

o Choose context “/SNP/CUSTOMER | Empty context for customer” on the popup.

o On the scenario screen, press Enter to activate the save button in the top button row.

o Optionally adjust the number of batch processes and decide whether to use queued RFC communication for the data transfer.

o Save the scenario and return to the process tree.

Run activity Start Crawler below node Setup Scenario. In subsequent packages in the same subproject, you can skip this activity. You need to run Start Crawler again only if you want to transfer newly added tables and need to make these tables visible to the BPL.

Switch to the extended view of the process tree and run activity Select Objects to be Copied (Optional) below node Setup Scenario.

1. Select the root node and choose Object -> New from the menu.

2. On the popup, choose Entity and enter a name for the entity, e.g. DUMMY. Select Copy containing tables COMPLETELY in the box Copy control and save the entity. Confirm the popup to link the entity to the root node of your scenario.

3. Select the newly created entity and choose Object -> New from the menu.

4. On the popup, choose Table and enter the name of the table to be transferred. Confirm the popup to link the table to your entity.

5. Optionally repeat the last two steps to add further tables to be transferred.

Note: Alternatively, you can use the function for mass linking (tables) to add the required tables to your scenario, which is usually faster than creating the tables individually.

6. Optionally choose Scenario Designer -> Show contained tables from the menu to display information about the data to be transferred in your scenario.

7. When you have finished your scenario configuration, return to the process tree.

7.8.3 Example: Transfer the Data of Tables CMFK and CMFP

1. Run activity Define Selection Criteria below node Setup Scenario and return to the process tree without changing any selection. (The scenario has been set up to transfer all client-specific data of the selected tables.)

2. Run activity Verify Scenario.

3. Optionally run activity Start Deletion (Optional) in phase Deletion Phase for BPL to delete all data of the selected tables in the receiver client.

4. Run activity Start Data Extraction in phase Extraction Phase for BPL to start the data transfer of the selected tables.

5. Return to the migration server overview and deactivate your package.


8 Additional Information for Business Process Library

8.1 Introduction

You use TDMS BPL (TDMS Business Process Library) to extract business objects, together with their specific data environment, from the sender system and transfer them to a receiver system.

TDMS BPL uses “business contexts” to extract objects. For a business object, a business context contains the knowledge about the related objects, tables and table dependencies. There are business contexts for master data (for example customer or material), transactional data (for example accounting document or purchase order) and business processes (for example purchasing process, contract (FS-CD)).

The process tree for a BPL package consists of different phases. Each phase includes several activities.

Refer to the activity documentation for a detailed description of each activity.

This document provides additional information for some of the activities.

8.2 Change Technical Settings

In the activity Change Technical Settings, you can make various technical settings for your scenario.

SAFETY NOTE: These settings directly influence the features and stability of TDMS BPL. Do not change any settings if you do not know exactly how they affect your BPL processes. If in doubt, contact SAP.

Parameter: CutOff.Value_MB
Description: Cutoff Value (MB) (smaller tables are completely copied)
Default value: 0

Use: Tables with an amount of data smaller than this value are copied completely, that is, these tables are not treated as dependent tables. The extraction process is faster for these tables.

Parameter: DataDir.Source
Description: Directory: Data output (Src. System)
Default value: {SRCSYS.DataDir}

Use:

You can replace the placeholder {SRCSYS.DataDir} by a directory in which flat files are to be stored when exporting data.

Notes:

If using multiple application servers, ensure that all of them can access the specified directory.

Ensure that the directory ends with '/' (Linux) or '\' (Windows).


Parameter: DataDir.Target
Description: Directory: Data input (Tgt. System)
Default value: {TGTSYS.DataDir}

Use: You can replace the placeholder {TGTSYS.DataDir} by a directory from which flat files are to be loaded when importing data.

Notes:

If using multiple application servers, ensure that all of them can access the specified directory.

Ensure that the directory ends with '/' (Linux) or '\' (Windows).

Parameter: Delete.SafeModeBlockSize
Description: Block size at delete
Default value: 10000

Use: Controls the maximum number of data records that are deleted during a deletion run in the target system for each table access.

Parameter: Extract.ParallelProcs.perTable
Description: Extraction: Number of parallel jobs per table
Default value: 1

Use: You specify the number of batch processes for each starting table, for example five parallel extraction processes for the starting table MARA.

Note:

Ensure that the number of batch processes for extraction in the scenario master data settings is greater than or equal to the parameter value of Extract.ParallelProcs.perTable.

Parameter: Extract.Uncompressed
Description: Extraction: No table compression
Default value: - (not set)

Use: This setting controls whether the extracted data is transmitted, and stored in flat files, in compressed or uncompressed form.

Extract.Uncompressed = -: Data is transmitted in compressed form.
Extract.Uncompressed = X: Data is transmitted in uncompressed form.

Parameter: Import.BlockMode
Description: Import: Updating in block mode (faster)
Default value: X

Use: Data records are stored in the target table by array insert.

Import.BlockMode = -: Parameter is inactive.
Import.BlockMode = X: Parameter is active.

Parameter: Import.Override
Description: Import: Overwrite existing data sets
Default value: X

Use: When this parameter is active, existing data records in the receiver system are overwritten during the import by data records with the same primary key.

Import.Override = -: Parameter is inactive.
Import.Override = X: Parameter is active.


Parameter: Output.BufferSize
Description: Buffer size at output (datasets)
Default value: 1000

Use: Controls the number of data records that are collected from import processes in the target system before they are written to the database.

Note: Values that are too large can lead to termination of the extraction processes because of a lack of storage space.

Parameter: Process.EmptyTables
Description: Extraction: Process empty tables
Default value: - (not set)

Use: Tables that are part of the selected business context or were added by the customer to the data model in the Select Objects to be Copied activity, and that are empty in the sender client, are not considered in the extraction. The older the last crawler result is, the higher the probability that such tables have been filled in the meantime. If you activate the parameter Process.EmptyTables (= X), tables classified as empty in the last crawler result are also considered during the extraction, export, or import run.

Parameter: Read.PackageSize
Description: Package size for input (datasets)
Default value: 1000

Use: This parameter controls the maximum number of data records that can be read for each access to a dependent table in the source system.

Parameter: ReadAll.PackageSize
Description: Max. package size
Default value: 5000

Use: This parameter controls the maximum number of data records that can be read for each access to a table which is to be copied completely in the source system.

Parameter: Recursion.Maxdepth
Description: Maximum depth with recursion
Default value: 999

Use:

This parameter controls the maximum number of table relationships that can be processed in a row (relevant in the intelligent copy mode only). The parameter counts the “jumps” included in a chain of table relationships. One table relationship corresponds to one “jump”.

For recursions, you can use the parameter Recursion.Maxdepth to indirectly control how many times a recursion can be passed through.

Example:

Recursion.Maxdepth = 15, Recursion.Maxdistance = 4

With parameter Recursion.Maxdistance = 4, a recursion includes 5 table relationships at most. Parameter Recursion.Maxdistance counts intermediary tables in a recursion, whereas parameter Recursion.Maxdepth counts table relationships. With Recursion.Maxdepth = 15, a recursion of maximum distance (that is, 4 intermediary tables and 5 table relationships in the example) can be passed through three times (15 / 5 = 3).


Parameter: Recursion.Maxdistance
Description: Maximum distance with recursion
Default value: 3

Use: This parameter defines the maximum number of intermediary steps (tables) permitted for recursion.

Example:

Table ABC is a source table. Table dependencies are tracked by table relationships:

(Table ABC <--> Table 1), (Table1 <--> Table 2), (Table 2 <--> Table 3), (Table 3 <--> Table ABC)

=> With default parameter value = 3, table ABC is considered in the fourth table relationship. There are 3 other intermediary tables in the recursion before “jumping” to table ABC again. With Recursion.Maxdistance = 2, table ABC would not be considered anymore.

Parameter: Tempdir.DD
Description: Temporary directory on DD System
Default value: {DDSYS.TempDir}

Use: You can replace the placeholder {DDSYS.TempDir} by a directory that is accessed from the application server of the central system and that is used (package-dependent) to store temporary data processed by a BPL run. Otherwise, temporary data is stored in the directory defined for DIR_TEMP in transaction AL11.

Notes:

These files are relevant only during the current process, and you can delete them regularly.

If using multiple application servers, ensure that all of them can access the specified directory.

Parameter: Tempdir.Source
Description: Temporary directory on src. System
Default value: {SRCSYS.TempDir}

Use: You can replace the placeholder {SRCSYS.TempDir} by a directory that is accessed from the application server of the sender system and that is used (package-dependent) to store temporary data processed by a BPL run. Otherwise, temporary data is stored in the directory defined for DIR_TEMP in transaction AL11.

Notes:

These files are relevant only during the current process, and you can delete them regularly.

If using multiple application servers, ensure that all of them can access the specified directory.

Parameter: Tempdir.Target
Description: Temporary directory on tgt. System
Default value: {TRGSYS.TempDir}

Use: You can replace the placeholder {TRGSYS.TempDir} by a directory that is accessed from the application server of the receiver system and that is used (package-dependent) to store temporary data processed by a BPL run. Otherwise, temporary data is stored in the directory defined for DIR_TEMP in transaction AL11.


Notes:

These files are relevant only during the current process, and you can delete them regularly.

If using multiple application servers, ensure that all of them can access the specified directory.

8.3 Select Objects to Be Copied

In this activity, you can view the predefined business objects together with their dependent objects and underlying tables that are contained in the selected business context. Furthermore, you can enhance the predefined context by creating your own objects, such as customer table relationships, and by adding customer tables. You can add any tables detected by the crawler.

8.3.1 Areas of the TDMS BPL Scenario Designer View

This view consists of three different areas:

Overview area (right area)

Search area (upper left area)

Search result area (lower left area)

8.3.1.1 Overview Area

The predefined objects and customer objects constitute a data model, which is shown in a hierarchical tree structure in the overview area. At the top of the overview area, the scenario symbol and package description represent the root of the model's tree structure. The branches of the tree represent objects of different object types.

8.3.1.1.1 Object types in the data model displayed in overview area

The following object types exist in the data model:

Entities: An entity serves as a table container. You can assign tables (objects of type table), and/or further entities directly to an entity. The name of an entity describes the function of the assigned tables. Entities can be either predefined or created by customer.

Tables: An object of the object type table represents a table in the sender/receiver system. A predefined business context contains predefined tables referring to the business object. You can add all tables detected by the crawler to a predefined business context by means of the mass linking (tables) feature. Tables can be either predefined or added by customer.

Selections: A selection filters data records from a table based on certain selection criteria. Selections can be either predefined or created by customer. Selections defined in the Selections tab page within the Define Selection Criteria activity are displayed in Scenario Designer with the description Field selections (generated).


Table relationships: The data environment of a business object is extracted by tracking table dependencies. These dependencies are specified in table relationships. A table relationship describes specific properties which the relationship between a dependent table ( = target table) and a table above it ( = source table) must fulfill. The system selects data records from the target table based on these properties.

Exits: An exit modifies extracted table data (for example, by scrambling the data). Exits can be either predefined or created by customer.

8.3.1.1.2 Dependencies between the different objects in the tree structure of the data model

Starting Entity

A starting entity is assigned directly to the scenario. To view the properties of the entity, double-click it. The name of the entity describes the business object whose data environment is extracted. To display the objects attached directly to the starting entity in the tree structure, click the node to the left of the starting entity. Tables and further entities are assigned directly to the starting entity. One of those tables is the starting table.

Starting Table

The process of tracking table dependencies starts with the starting table.

A table is a starting table if it fulfills one of the following conditions:

The table is entered in the properties of an entity as starting table and this entity is assigned directly to the scenario symbol.

The table is to be copied completely because the copy control setting for complete copy has been set (either in the copy control settings for the table itself or in the copy control settings for the parent entity, for more details see Copy Control Settings).

To display objects attached directly to the starting table in the tree structure, click the node to the left of the starting table. A predefined selection and table relationships are assigned directly to the starting table.

Predefined selection

A predefined selection is assigned to the starting table. Double-click the selection to view the properties of the selection. The popup provides information about the table field(s) most commonly used for selecting for the starting table (for example field PARTNER in the business partner context) and placeholders for selection parameter values (for example {Parm.BusinessPartner_FROM} and {Parm.BusinessPartner_TO} in the business partner context).

On the Selection criteria tab page of the Define Selection Criteria activity, you can activate these selection parameters and define parameter values.

Note: Activating these parameters is not mandatory. You can also define the required selections in the Selections tab page within the Define Selection Criteria activity.


Table relationships

When a selection for the starting table is available, the processing of table dependencies by table relationships can start.

Further entities and tables

If you open the nodes to the left of further entities, you can view predefined tables attached directly to these entities.

A table must fulfill the following conditions to be included in the scenario:

The table must be assigned directly to an entity AND

The table must be classified as starting table OR must be involved in a table relationship as source table or target table AND

The crawler detected that the table contains data records for the sender client (that is, records with the sender client in the client key field)

8.3.1.2 The Search Area

In search area, you can search for:

Predefined entities included in the selected business context. Click the Entities icon to search for these objects.

Entities created by the customer and included in the data model of the scenario. Click the Entities (Customer) icon to search for these objects.

Predefined tables included in the selected business context. Click the Tables icon to search for these objects.

Tables added by the customer to the data model of the scenario by the functionality for mass linking (tables). Click on the Tables (Customer) icon to search for these objects.

8.3.2 Scenario View and Usage View

You can use the usage view to search for all object relationships that contain a given entity or table as a dependent object, that is, as an object attached to a parent object.

Search for the respective entity or table in the search area and double-click the search result in the search result area. The system displays the object, together with the objects directly attached to it, in the scenario view. To switch to usage view, mark the object and proceed as follows:

Click the Goto icon, and choose Usage View

OR

Right-click the marked object. Choose Goto in the menu, and then choose Usage View.

8.3.3 Upload a Selection Set

You define selections in the activity Define Selection Criteria. On the Selections tab page of this activity, you can enter or upload single selection values and selection ranges for each table and table field. For the starting table, if you want to select for several thousand values, you can use a selection set.


Example:

Assumptions:

You selected the business context Material. MARA is the starting table in the Material business context.

You want to select for several thousand material numbers in table field MATNR.

First, create a CSV file, for example using Microsoft Excel. In the spreadsheet, enter table field MATNR in cell A1. Enter the material numbers in the A column starting with cell A2.

A B C

1 MATNR

2 000000000000000498

3 000000000000000570

4 000000000000000571

5 100-100

Note:

MATNR has the alphanumeric domain CHAR with length 18. In the CSV file, ensure that entries containing only numeric characters are saved with their leading zeros, for example cell A2. Otherwise, TDMS-BPL cannot interpret the value correctly during the upload.

Save the file in CSV format.
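Saved as CSV, the example file would then contain one value per line, with the leading zeros preserved (a sketch assuming a single-column selection set as above):

MATNR
000000000000000498
000000000000000570
000000000000000571
100-100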

Execute the activity Define Selection criteria. In the menu bar, navigate to Extras -> Selection sets.

Click the Add icon. Provide a selection set ID (for example MARA_LIST), a description and the path of the CSV file. Confirm your input. The system displays the new selection set in an ALV grid.

8.3.4 How to Assign the Selection Set to the Starting Table

Click the starting table to select it. Click the Create icon. Click Selection (Customer), and confirm. You can then assign the selection set to the starting table. To do this, input the corresponding selection condition in the lower table area of the popup as follows (The following table refers to the example in Upload a Selection Set).

Seq.No. FieldID Sign Rel.Op. Low value

1 MATNR I IN [MARA_LIST-MATNR]

The selection condition involves the following elements:

Enter the sequential number in the first column.

Enter the field in the second column.

Enter I in the Sign column (I = Include).

Select IN as the Relational Operator in the fourth column.

In the Low value column, enter an expression in square brackets that consists of the selection set ID, then a minus sign, followed by the name of the table field used in the selection.


8.3.5 Enhance a Predefined Business Context by Adding Further Tables Detected During Crawler Run

Within a scenario, you can extend a predefined business context and adapt it for your purposes by adding further tables using the mass linking (tables) feature. You can add any tables detected during the crawler run. As a result of mass linking (tables), the added tables are attached to entities.

The direct assignment to an entity is only a necessary but not a sufficient condition for a table to be included in the scenario. In a second step, an added table must fulfill one of these conditions to be considered during extraction:

The table is involved in a table relationship (as source or target table).

In the properties of the table or the parent entity, the copy control setting for a complete copy has been chosen. Then, the table is classified as a starting table. (For the conditions a starting table must fulfill, refer to the Dependencies between the different objects in the tree structure of the data model section. For copy control settings see Copy Control Settings).

Before adding tables by mass linking (tables), mark an entity to which you want to assign the added tables. You could assign an added table to a predefined entity, but you should create a new entity as table container. To do this, mark the object the new entity will be assigned to (either the scenario symbol or another entity). Choose Create, then Entity (Customer), and confirm. Provide an entity ID and a description, and confirm. The new entity is then automatically attached to the object marked previously.

Mark the new entity and choose the Mass linking (tables) icon. Enter a search criterion in the field Table ID (for example, Z* for all Z-tables).

According to your search criteria, the search result displays in an ALV grid. Mark the lines of all tables you want to add, and confirm. The system assigns the selected tables to the marked entity.

To view information about a table you added by mass linking, double-click it. You can view the context ID, the table ID, the table description, the delivery class and the table type. Choose the information icon to view additional information, such as the table's total size in MB, the index size in MB, primary keys, and secondary indexes.

8.3.6 How to Create Table Relationships

There are different reasons for you to create table relationships:

To be included in the scenario, an added table must be either involved in table relationships or classified as a starting table. (For the conditions a starting table must fulfill, refer to the Dependencies between the different objects in the tree structure of the data model section). By executing the functionality for mass linking (tables), you only fulfilled the necessary condition for a table to be extracted, that is to say the direct assignment to an entity, but this is not yet sufficient.

You cannot edit the table relation statements in a predefined table relationship. To add conditions, for example, set the predefined table relationship to inactive and create a new one.

The system extracts the data environment of a certain business object by tracking table dependencies. Table relationships between one source table and one target table, respectively, are defined to track table dependencies. Tracking begins with a starting table to which some dependent tables are assigned by means of table relationships. The starting table is the source table in such a relationship. The dependent table is the target table. In a chain of table relationships, the target table of a table relationship can be the source table in the following table relationship.


A Create table relationship popup window opens when you drag a table from the search result area and drop it onto a table in the overview area, or when you drag a table from the overview area and drop it onto another table in the overview area.

The moved table becomes the target table in the new table relationship.

The system displays the context ID and the participating tables (source and target table) of the table relationship in the upper area of the window. You can use the Information icon to view detailed information about the table (for example, the primary keys).

To display the table fields of each table, choose Show table structure.

In the lower table area of the table relationship popup, you can view the Field relations tab page, the Conditions tab page and the Extended conditions tab page.

In the Field relations tab page, you define table relations with specific properties between fields of the source table (source field column in the popup window) and fields of the target table (target field column in the popup window). You specify the properties in the Relational Operator column, for example, equivalence or non-equivalence. The system selects data sets of the dependent table (target table) for which the table relation to the source table has the required properties.

You can use the Propose field description wizard to get suggestions for useful possible table relations between source and target table based on information about data elements, domains and key fields.

On the Conditions tab page, you can define conditions that data sets from the source table must fulfill in order to be processed. In this way, you can restrict the amount of data to be processed.

Example

Assumptions:

Chosen business context: Material

Table relationship of interest: MARC <-> MVER

Source table MARC (plant data for material): Table MARC shows in which plants a certain material is used.

Target table MVER (material consumption): Table MVER shows how much of a certain material was consumed in a certain plant for a certain period.

You are only interested in the material consumptions for plants 1000 and 3000.

1) Set the predefined MARC <-> MVER relationship to inactive:

Execute the Select Objects to be Copied activity. The Scenario Designer view appears.

Open the node next to entity Material, if applicable.

Click the Column Configuration icon, if applicable. Set an indicator for Code and ID and confirm. The system displays the technical table names.

Open the node next to table MARC. Double-click the MARC <-> MVER table relationship. Set an indicator at Inactive, and confirm. The predefined MARC <-> MVER relationship is now set to inactive.

2) Create a new MARC <-> MVER table relationship:

Table MVER is part of the Material business context. Therefore, you can search for MVER in the search area.

Table MVER is displayed in the search result area. Click the search result line and drag and drop table MVER onto table MARC in the overview area.


A Create table relationship popup opens. MARC is the source table, MVER the target table. In the lower table area of the popup, the Field relations tab page is active.

Click the Propose field description icon to get suggestions for useful possible table relations between source and target table based on information about data elements, domains and key fields. The wizard suggests the following relations:

Seq.No. Src field Src expression AliasFieldName Rel.Op. Tgt field

1 MATNR = MATNR

2 WERKS = WERKS

3 LFGJA = GJAHR

4 PERKZ = PERKZ

5 SHZET = ZAHLR

You are only interested in MATNR and WERKS. You can delete the last three lines.

To check the consistency of entered table relations in the popup window, click the Check relations icon.

To view information about the SQL source text for the table relationship, use the SQL Preview icon.

3) To consider only material consumptions of plants 1000 and 3000, define a condition. Click the Conditions tab page. Enter the condition, as follows:

Seq.No. Link of cond. Src field Src expression Rel.Op. Tgt field Tgt expression

1 WERKS = '1000'

2 OR WERKS = '3000'

8.3.7 Copy Control Settings

The copy control settings allow you to control the amount of data copied. You can edit copy control settings for entities and tables. You cannot delete predefined objects from the data model in the overview area, but you can use the copy control settings to exclude them from the extraction.

For example, you can exclude from a scenario a branch of the tree structure consisting of an entity and all objects directly and indirectly attached to it. If you want to extract a Z table that is not involved in a table relationship, you can choose Copy table completely in the copy control settings. Based on this setting, the system classifies the Z table as a starting table and includes it in the scenario.


8.3.8 Copy Control Settings for Entities

Double-click an entity that you added to the scenario. You can select one of the following settings for the copy control for the entity:

Criterion: Standard - no effect on containing / subordinated objects
Explanation: Normal situation: the tables that belong to the entity are copied based on the extraction procedure and the selections made.

Criterion: Copy containing tables COMPLETELY
Explanation: Only relevant in the "intelligent" mode. All DIRECTLY assigned tables are copied completely (and not treated as dependent tables).

Criterion: Do NOT copy containing tabs; trace dependency
Explanation: All DIRECTLY assigned tables are not copied, but dependent objects are copied.

Criterion: Do NOT copy containing tabs; no dependency tracing
Explanation: Assigned tables are not copied and dependencies are not tracked further. The branch is excluded from extraction.

Criterion: NEVER copy containing tabs; no dependency tracing
Explanation: DIRECTLY assigned tables are never copied (even if they are assigned to more entities); dependencies of these tables are also no longer tracked.

The setting "Do NOT copy containing tabs; no dependency tracing" works as if the branch extending from the given entity was cut off in the scenario tree. The tables contained in this branch are not included in the extraction wherever they are attached, directly or indirectly, to the given entity in the scenario tree. However, this does not exclude the possibility that such tables in the scenario tree are attached, directly or indirectly, to further entities within other branches. Such tables can still be part of the extraction due to this assignment to further entities.

If you select the setting "NEVER copy containing tabs; no dependency tracing", only the tables directly attached to the respective entity will be excluded from the extraction everywhere in the scenario tree.

Example:

If the tables 1, 2 and 3 are assigned directly to entity XXX that has been given the copy control setting “NEVER copy….”, these tables are generally excluded from the extraction in the scenario. However, if another entity YYY with the tables 4 and 5 belongs to entity XXX, the tables 4 and 5 will then be included in the extraction, because these tables are not directly assigned to entity XXX.

It is therefore not a contradiction that the setting "Do NOT copy…" excludes many more tables than the setting "NEVER copy…".

To edit and maintain the entities of a scenario, choose Edit containing entities.

8.3.9 Copy Control Settings for Tables

Double-click a table that is included in the scenario to edit the properties of this table.


You can select from the following copy control settings for the table:

Criterion: Dependent on extraction mode
Explanation: Normal situation.

Criterion: Copy table COMPLETELY, track ALL dependencies
Explanation: Only relevant in the "intelligent" mode. The table is not treated as a dependent table; all data sets are copied and all dependencies are tracked.

Criterion: DO NOT copy table, trace dependencies
Explanation: The table is not copied, but dependent objects are copied.

Criterion: DO NOT copy table, NO dependency trace
Explanation: The table and all dependent objects are not copied.

Criterion: Never copy completely - overwrites cutoff and entity settings
Explanation: Overrides a complete-copy setting that would otherwise result from the cutoff value or from the entity settings.

Criterion: Never delete table
Explanation: The table is never deleted during the deletion run in the target system.

8.3.9.1 Cut Entities and Tables Using Functionality “Disable/Enable”

To make exclusion of entities and/or tables easier, you can use the Disable/Enable feature. For all entities marked, you can set the copy control setting “Do NOT copy containing tabs; no dependency tracing”. For all tables marked, “DO NOT copy table, NO dependency trace” can be set.

To do this, mark the entities and/or tables you want to exclude and choose Disable/Enable. The copy control setting changes.

8.3.9.2 View Copy Control Settings Using Functionality “Show Contained Tables”

You can view the list of participating tables for the scenario for control purposes by using the “Show contained tables” feature. This list represents the work list with table-related control parameters (table type, table function indicator, completely copy indicator, selection condition available) for the later extraction procedure. You can see which tables the system copies completely, and which tables are treated as dependent tables.

Alternatively, you can view this list of tables included in the scenario in activity View Involved Tables or in activity Define Selection Criteria within the Tables tab.


8.3.9.3 Delete Objects Created or Added by Customer From the Scenario

You can delete objects created or added by customers from the data model. (For predefined objects and object relationships, the only way of excluding them is by means of the copy control settings.) There are a number of ways to delete an object in the overview area:

Completely deleting a (marked) object from the tree structure at all locations with all attached links. To do this, choose the Delete object with all relationships icon.

Deleting a single assignment of a (marked) object to a parent object within a scenario. To do this, choose Delete assignment.

You can delete the customer's selections and table relationships by using Delete object with all relationships.

8.3.9.4 How to Scramble Data in a BPL Context

For reasons of data protection, a requirement can be that sensitive data (e.g., salary data, business partner data) is not visible in the receiver system. TDMS-BPL can modify the data to make it anonymous before it leaves the sender system (“scrambling”). By default, a specific exit is offered to scramble data at table level. The sensitive data in one or more table columns is overwritten by a neutral value (for example, 9999) dependent on field name, data element, and/or domain.

NOTE: Avoid scrambling key fields; there is a danger of duplicate keys, for example.

The procedure of how to scramble a pre-defined table differs slightly from the procedure for customer tables.

How to scramble table columns in a pre-defined table:

In the overview area, mark the table whose data you want to scramble. Choose Assign and select the Exit object type in the Choose Relationship popup. You can choose an exit from a list of exits.

Select the exit with object abbreviation DEFAULT_AN, and confirm. The exit is attached to the marked table (as an alternative, you can also create a new customer exit).

Double-click the exit. The Exit popup opens. Give an exit description.

The default function module that contains the scrambling logic is /SNP/DW03_EXIT_AN_DEFAULT.

In the lower table area within the Exit popup, you can define data elements, domains and/or field names as scrambling criteria and a neutral value used to overwrite the relevant data.

Scrambling dependent on data elements: If you define one or multiple data elements as scrambling criterion, all table fields with these data elements will be scrambled. Example: In table BSEG, the table fields with assigned data elements KOART and DMBTR are to be scrambled. Edit the Exit popup as follows:


Element name Value Description

DTELS KOART,DMBTR

NEWVALUE 9999

Scrambling dependent on domains: If you define one or multiple domains as scrambling criterion, all table fields with these domains will be scrambled. Example: In table BSEG, the table fields with assigned domain WERT7 are to be scrambled. Edit the Exit popup as follows:

Element name Value Description

DOMAINS WERT7

NEWVALUE 9999

Scrambling dependent on field names: If you define one or multiple field names as scrambling criterion, all table fields with these field names will be scrambled. Example: In table BSEG, the table fields SGTXT and PRCTR are to be scrambled. Edit the Exit popup as follows:

Element name Value Description

FIELDS SGTXT,PRCTR

NEWVALUE 9999

Scrambling dependent on field names, data elements, and domains using one exit: Example: Edit the Exit popup as follows:

Element name Value Description

FIELDS SGTXT,PRCTR

DTELS KOART,DMBTR

DOMAINS WERT7

NEWVALUE 9999

In the overview area, you can reuse an already defined scrambling exit for other tables if the scrambling criteria fit. To do so, assign the existing exit to the other tables via drag and drop. Alternatively, you can proceed as follows:

1. Mark the table to be scrambled.

2. Click Assign.

3. Choose Exit.

4. The already defined scrambling exit (DEFAULT_AN) is displayed. Choose this exit.


As an alternative, you can also create a new exit:

1. Mark the table to be scrambled.

2. Click New.

3. Choose Exit (Customer).

4. Set an appropriate function module for scrambling (in this example /SNP/DW03_EXIT_AN_DEFAULT).

Example:

A scrambling exit for table BSEG is defined as follows:

Element name Value

FIELDS SGTXT

NEWVALUE 9999

Tables BSID, BSIS, BSAS, and BSAK also contain the SGTXT field. You can assign the exit for BSEG to BSID, BSIS, BSAS, and BSAK.

How to scramble table columns in a customer table:

In the overview area, mark the customer table whose data you want to scramble. Choose Create, and select Exit (Customer). An Exit popup opens. Give an exit description.

As function module, enter /SNP/DW03_EXIT_AN_DEFAULT.

Now you can edit the lower area of the exit popup and reuse the defined exit as described for pre-defined tables.

How to create custom scrambling function modules:

Creating custom scrambling rules requires ABAP knowledge. Copy the function module /SNP/DW03_EXIT_AN_DEFAULT to your customer namespace as a template. After you have implemented your custom scrambling function module in the sender system, you can enter it as the function module in the scrambling exit configuration of the Scenario Designer.

How to use TDMS field-based scrambling rules in BPL:

After you have defined your field-based scrambling rules in the TDMS scrambling platform, you can also use them in BPL (as of DMIS_EXT SP level 08). Please specify /SNP/DW03_EXIT_AN_TDMS as scrambling function module in your scrambling exit configuration and enter the following parameters:

Element name    Value
SCR_PACKID      <your scrambling package>
SCR_RULEID      <ID of the scrambling rule>
SCR_CLIENT      <client of the central system>

8.3.10 How to Copy One or More Specified Tables

Instead of extracting a certain business object together with its data environment, you can also copy one or more specified tables within a package. This makes sense, for example, if you want to supply an already filled receiver system with only a few additional tables.

For a detailed description of how to proceed, see section 7.7.

8.4 Activity “Define Selection Criteria”

This activity provides the features needed to define selections. When you execute the activity, you can view the three tab pages Selections, Selection criteria and Tables. The Selections tab page is active.

In the Selections tab page, you can define your own selections. In the Selection criteria tab page, you can activate predefined selection parameters and enter concrete parameter values.

8.4.1 Selections Tab Page

The upper area of the Selections tab page contains a display filter. Depending on this filter, the upper area displays the starting tables, the dependent tables, or all tables contained in the scenario.

If you double-click a table in the upper area, all table fields of this table are displayed in the lower area on the Field selections tab page. If you click the Multiple selections button in the line of a certain table field, a Multiple selection popup opens in which you can define (or exclude) single selection values and/or selection ranges for this table field.

When you have defined a selection, a second filter Scope of selection is displayed in the Field selections tab page. By default, selections only affect starting tables (directly). In this case, these selections affect dependent tables indirectly by tracking table dependencies. As an extended feature, you can specify that selections should also affect dependent tables directly. Then, a selection for a dependent table restricts the number of data sets extracted from this table in addition to the selections made for the starting table. The Scope of selection filter is important with regard to dependent tables. Here, you can control for each dependent table whether the defined selection takes effect or not. Choose between the two settings Only starting tables – standard and Starting and dependent tables (extends relation).

Next to the Field selections tab page, you can see the Further selections tab page. Here, you can view further selections that affect the table in question in addition to the selections from the Field selections tab page. If such a selection exists, it has been assigned to the table in the scenario designer within the Select Objects to Be Copied activity.

Examples for further selections:

To assign a selection set to the starting table of the chosen business context, you must create a special selection statement in the scenario designer.

In each business context, a predefined selection has been assigned to the starting table in the scenario designer. This selection contains placeholders for the most important selection parameters with regard to the chosen business context. You can activate these parameters in the Selection criteria tab page (located next to the Selections tab page) and define concrete parameter values.


Double-click a selection in the Further selections tab page to view its properties. You can set the selection to inactive, and you can control whether or not the selection applies to dependent tables. Confirm your settings; they are then displayed in the Further selections tab page.

To view an SQL preview of the generated selection statement, mark the table in the upper area of the Selections tab page and choose the Result preview of the selection icon. The generated selection statement and a forecast of the number of selected records are displayed.

Example:

Assumptions:

Chosen business context: Material
Starting table of the Material business context: MARA
Primary key of MARA: MATNR (material number)
Dependent table of interest in this example: MARC (plant data for material); MARC shows in which plants a certain material is used
Primary key of MARC: MATNR, WERKS

You want to select for material numbers 11001200 to 11001500.

With regard to plant data for material, you are only interested in plants 1000 and 3000.

1) Define a selection on MARA (starting table) for the material number range 11001200 to 11001500:

Execute the activity Define Selection Criteria. The Selections tab page is active.

The display filter shows ST starting tables. The system lists MARA as the starting table in the upper area of the Selections tab page. In the lower area, the Field selections tab page is active and shows the fields of the starting table MARA.

Click the multiple selections icon for table field MATNR. In the selections popup, enter the range of material numbers 11001200 to 11001500, and confirm. There are several hints that a selection for MATNR in table MARA has been defined:
- the Scope of selection filter that has appeared in the Field selections tab page
- the green rectangle in the multiple selections button for table field MATNR
- the little checkmark in the Field selections tab
- an indicator for MARA in the Selection condition available column in the upper area

Mark table MARA and click the result preview of the selection icon. A popup opens that shows the SQL selection statement and the number of records found (see the sketch after this example).

2) Define a selection on MARC for the plants 1000 and 3000:

In the display filter in the upper area, choose Dep.obj.Tables. The system lists the dependent tables contained in the scenario in the upper area. Double-click table MARC. In the Field selections tab page, the system displays the fields of table MARC.

Click the multiple selections icon for table field WERKS. In the selections popup, enter the selection values 1000 and 3000. The Scope of selection filter appears in the Field selections tab page.

3) Save your settings.
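As a rough illustration of what the selections in this example amount to, the following Open SQL sketch contrasts the two Scope of selection settings for the dependent table MARC. It is a hypothetical sketch, not the statement actually generated by the result preview.

REPORT zbpl_selection_sketch.

" Selection on the starting table MARA: material number range
SELECT matnr
  FROM mara
  WHERE matnr BETWEEN '000000000011001200' AND '000000000011001500'
  INTO TABLE @DATA(lt_matnr).

IF lt_matnr IS NOT INITIAL.   " FOR ALL ENTRIES must not run on an empty table

  " Scope of selection "Only starting tables - standard":
  " MARC is restricted only indirectly, via the selected materials.
  SELECT matnr, werks
    FROM marc
    FOR ALL ENTRIES IN @lt_matnr
    WHERE matnr = @lt_matnr-matnr
    INTO TABLE @DATA(lt_marc_indirect).

  " Scope of selection "Starting and dependent tables (extends relation)":
  " the selection on WERKS additionally restricts MARC directly.
  SELECT matnr, werks
    FROM marc
    FOR ALL ENTRIES IN @lt_matnr
    WHERE matnr = @lt_matnr-matnr
      AND werks IN ( '1000', '3000' )
    INTO TABLE @DATA(lt_marc_direct).

ENDIF.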


8.4.2 Tab Page “Selection Criteria”

In each business context, a predefined selection is assigned to the starting table. This selection contains placeholders for the most important selection parameters with regard to the selected business context. In the Selection Criteria tab page, you can activate these parameters and define concrete parameter values. To activate the parameters, set an indicator in the Active column. Enter the required parameter values in the Parameter value column. Save your settings.

Activating selection parameters in the Selection criteria tab page is optional. Alternatively, you can define the required selections in the Selections tab page.

Example:

Assumptions:

Selected business context: Material

The most commonly used selection parameter in the Material business context is the material number (table field MATNR). You want to select for the range of material numbers 11001200 to 11001500.

You can define the required selection in the Selections tab page. As an alternative, you can use the predefined selection assigned to the Material business context, because it contains placeholders for the most commonly used selection parameters Material number (From) and Material number (to). Activate these parameters and enter the required selection values:

Execute the Define Selection Criteria activity, if applicable.

Click the Selection criteria tab page.

To activate the selection parameters Material number (From) and Material number (to), set the indicators in the Active column.

Enter the required range of material numbers in the Parameter value column. According to the domain, enter 18-digit values, that is, 000000000011001200 to 000000000011001500 (see the conversion sketch after this example).

Save your settings.
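The leading zeros come from the conversion routine of the material number domain. The following sketch only illustrates that conversion (assuming the standard, non-lexicographic material number settings); it is not part of the TDMS activity itself.

REPORT zmatnr_conversion_sketch.

DATA lv_matnr TYPE matnr.   " 18 characters, stored with leading zeros

" The material number conversion exit pads purely numeric values
" with leading zeros.
CALL FUNCTION 'CONVERSION_EXIT_MATN1_INPUT'
  EXPORTING
    input        = '11001200'
  IMPORTING
    output       = lv_matnr
  EXCEPTIONS
    length_error = 1
    OTHERS       = 2.

IF sy-subrc = 0.
  WRITE / lv_matnr.   " displays 000000000011001200
ENDIF.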

8.4.3 Selection Set

If you want to select a large number of values for the starting table, you can use a selection set. In the Define Selection Criteria activity, you can upload a selection set. A selection set is a CSV file containing the specified parameter values. The maximum size of a selection set is 64 KB. After you have uploaded the selection set, you must assign it to the starting table. To do so, you must create a special selection statement in the scenario designer within the activity Select Objects to Be Copied.

For detailed information about selection sets, refer to Upload a Selection Set.

8.4.4 Show Contained Tables in the Tables Tab

In the Tables tab, you can view the tables contained in the scenario. Alternatively, you can view these tables in the View Involved Tables activity and, by using the Show contained tables feature, in the activity Select objects to be copied.

8.5 Repeatable execution of BPL packages

This new BPL process tree consists of only two phases:


Package Settings

Setup and Execution

During the Package Settings phase, you define the RFC destinations and create a BPL scenario based on the predefined BPL business contexts. The Setup and Execution phase can be used repeatedly. For example, you can re-execute the data transfer of a scenario within the same package instead of creating a new package for each transfer. The settings of the scenario can be readjusted before the execution.


9 Consistency in Test Landscapes Built With SAP TDMS

9.1 Business Suite Components supported by TDMS

As of Release TDMS 3.0 SP13, SAP TDMS supports the following components:

SAP ERP (Basis Releases: 4.6C, 6.20, 6.40, 7.00)

SAP BI (BI 3.5 – Basis Release 6.40; BI 7.0 part of NetWeaver 7.0)

SAP CRM 4.0 (Basis Release 6.20); CRM 5.0 (Basis Release 6.40); CRM 6.0 (Basis Release 7.00); CRM 7.0 (Basis Release 7.00)

9.2 Assumed Customer Scenario

If you create a test landscape that includes each of the above-mentioned components with the help of SAP TDMS, you need to create an individual TDMS package for each component, for example:

Time-based package (TDTIM) for ERP

Time-based package (TDBTM) for BI

Time-based package (TDCTM) for CRM

The following picture shows the scenario:

Picture 1

[Picture 1 shows the productive ERP, CRM, BI, and legacy systems, the data exchange and extraction between them, the TDMS central system, and the test ERP, CRM, and BI systems.]


The productive landscape on the left assumes that a productive ERP system is connected to a productive CRM system and exchanges master data and sales orders with this system using CRM middleware. The ERP system is also connected to a productive BI system. ERP data is extracted and loaded to the BI system at regular intervals (e.g. twice a day). The CRM system is also connected to the same BI system, and data is extracted and loaded to the BI system. Additionally, data from a customer legacy system is loaded into the productive BI system. All data that is loaded into the BI system is collected and merged in different InfoCubes.

The test landscape that is built with the help of TDMS is shown on the right side of the picture. We assume that you have only one central system to run all 3 packages. Processes in the central system cannot be explicitly split and assigned to different packages.

This scenario has the following implications for the test landscape to be built up:

Connections between the ERP, CRM, BI and legacy systems that are set up in the productive landscape are not automatically available in the test landscape. It requires some effort to set up connections in the test landscape between ERP, CRM and BI.

The productive legacy system cannot be copied with TDMS to build a test legacy system. Hence there is no test legacy system which can be connected to the test BI system.

There is a considerable time difference between the three TDMS packages, which has an impact on the data consistency between ERP, CRM, and BI. The following chapter describes this impact in more detail.

9.3 Consistency

Generally speaking, the degree of consistency in clients built up by SAP TDMS is quite high. Yet, due to the reduction of data in the time-based scenario, there is some risk that inconsistent test data might occur. Before landscape consistency issues are described in more detail, the following chapter explains the situation within one SAP component-specific package.

9.3.1 Consistency Within Components

9.3.1.1 TDMS for ERP

Time-based scenarios

Time-based TDMS scenarios use a cut-off date ('from-date') for the data reduction in large tables. This kind of reduction can lead to data inconsistencies in the reduced test system, because business processes might include data that covers a long period. For that reason, TDMS includes special programs (sometimes referred to as 'fill header programs') for the most important business applications to keep data along business processes as consistent as possible. For performance reasons, however, the consistency checks have to be limited, and in some cases this might lead to inconsistent data in the test system.

Generally, the closer the creation date of a business document is to the cut-off date of the TDMS scenario, the higher the probability that the document is inconsistent.


The following picture illustrates the consistency problem:

Picture 2

The area 'Partially transferred with TDMS' represents the fill header programs, which are used to include business-process-specific historical data to keep business processes consistent.

Potential consistency risks exist in the following areas:

Business processes for which no fill header programs exist - if historical data is needed to keep the business process consistent.

Business processes for which fill header programs exist, but the history is too long (For performance reasons, fill header programs do not capture all historical data.)

During the time when TDMS selects data for transfer (this is the time when data is loaded into the cluster table), users may continue with their normal work in the productive system. This may lead to inconsistencies, because some tables which belong together from a business process perspective are loaded earlier than others. This is why we recommend a downtime of the productive system during the cluster load.


Business Process Library (BPL)

Business Process Library scenarios of TDMS focus on single instances of business process-related data. Tables that belong to a business process as well as selection criteria for those tables are explicitly defined. Data consistency along the whole business process depends on the definition of the scenario and can be controlled by the user.

The following picture gives an overview of this scenario:

Picture 3

Inconsistencies are more likely to occur in areas outside the defined business process if impacts on and interrelationships with other business applications are omitted from the definition of a business scenario.

Please be aware of the risk of data inconsistencies if data is transferred by BPL into an already filled receiver client, for example because of differences in internal number ranges between sender and receiver. We recommend using a new, empty client as the receiver client (customizing only). Alternatively, you can use a receiver client that has been built up directly before the BPL transfer using a classic TDMS scenario (e.g. master data and customizing).

9.3.1.2 TDMS for BI

The time-based TDMS BI scenario transfers the following:

1) Full transfer: BI master data, BI metadata, other non-BI-related data
2) Optional transfer: PSA (persistent staging area)
3) Reduced transfer: DSOs (DataStore objects; formerly ODS)
4) Reduced transfer: InfoCubes


DSO and InfoCube objects are reduced using a time-based approach.

To ensure that all relevant data from the productive system is transferred to the test system, all delta queues in the info sources should be empty, and updates of DSOs and InfoCubes from PSAs should be finished. If this is ensured, data in the test BI system should be consistent and in sync with the productive BI system.

9.3.1.3 TDMS for CRM

In general, everything that is mentioned above for TDMS for ERP also applies to TDMS for CRM.

As in TDMS for ERP, the application tables that usually contain the largest data volume are reduced based on a cut-off date. To keep business process data as consistent as possible, One Order objects in CRM are reduced with special fill header programs. The One Order object is the only business object in CRM that is reduced with fill header programs, because it is a generic object that is used in various business processes in CRM.

CRM connects to external components (ERP, BI, and others) as well as to CRM-internal subcomponents (mobile clients, billing engine, entitlements manager, and so on) asynchronously through middleware using BDOC technology. BDOCs are queued and processed asynchronously. To keep the test CRM system in sync with the productive CRM system, it is important that the BDOC queues in the sender system are empty before the data selection starts.

Potential consistency risks exist in the following areas:

Business processes for which no fill header programs exist - if historical data is needed to keep the business process consistent.

Business processes for which fill header programs exist, but the history is too long (For performance reasons, fill header programs do not capture all historical data.)

During the time when TDMS selects data for transfer (this is the time when data is loaded into the cluster table), users may continue with their normal work in the productive system. This may lead to inconsistencies, because some tables that belong together from a business process perspective are loaded earlier than others. This is why we recommend a downtime of the productive system during the cluster load.

If BDOC queues are not empty and middleware is active during data selection, this may result in data inconsistencies in the receiver system.


9.3.2 Consistency among Different Components

ERP, BI and CRM exchange data asynchronously, but usually very frequently. When test systems are built for each of the components with the help of TDMS, it is challenging to keep data in the test landscape as consistent as in the productive landscape.

The following picture shows a possible scenario of how a test landscape, as shown in picture 1, can be built with the help of TDMS:

Picture 4

When using one central system for all packages in parallel, the number of batch processes in the central system has to be split among the three packages. This is critical from an overall performance perspective and makes it difficult to keep the ERP, CRM, and BI test systems consistent. Thus, the most realistic assumption is that the three packages are executed sequentially, as depicted in picture 4.

Depending on the data volume and the power of the participating systems, a package may run from several hours to several days. It is unrealistic to have a downtime in all sender systems that spans the relevant time frame for all three packages. Hence the receiver of the first package (in picture 4, this is BI) will not contain all the data of the ERP and CRM receivers, because those packages are started at a later point in time and their sender systems have been changed and enhanced in the meantime.

Data differences between the test BI, ERP and CRM systems are visualized through time frames T1, T2 and T3.

To minimize these differences, TDMS 3.0 SP13 delivers a solution for the time-based scenario (TDTIM) which enables you to keep the test landscape consistent to the highest possible degree.

Please note in addition that the TDMS functionality for BI selects data as of the from-date and does not use any programs for filling internal header tables. This means that there are some restrictions in comparison to TDMS for ERP, where such programs are used to transfer relevant data that was created prior to the from-date. This applies, for example, to document flows in Materials Management and Sales and Distribution or to open items in Financials.

9.3.2.1 Approach to Keep Test Landscape Consistent

To make use of the new functionality delivered with SP13 that enables you to keep your test landscape consistent, proceed in the following sequence:

1. Execute a TDMS time-based scenario for ERP to build a test ERP system.

2. Process a TDMS time-based scenario for CRM up to the post-processing phase.

3. In the post-processing phase of the TDMS time-based scenario for CRM (TDCTM), additional steps have been included to load CRM-relevant ERP data (master data and sales orders) from the productive ERP to the test ERP, considering only delta objects from the start of the selection in ERP to the end of the selection in the CRM package (interval T1). The following steps are executed:

a. The system calculates the time frame for which data consistency has to be improved. To do so, it considers the first execution of the first fill header program in the productive ERP system and the last execution of the data selection in the productive CRM system to obtain the maximum data consistency between CRM and ERP. The time difference between the execution of TDMS CRM and TDMS ERP packages is obtained from the control tables. The missing business objects are then determined based on these dates from the BDOC store in the CRM system.

b. From the BDOC store, the data to be synchronized is obtained and stored in a DB table. The data in this table serves as input for TDMS BPL. It is also displayed to the user, who can choose not to execute the synchronization steps if the data volume is acceptably low.

c. The business object data found above serves as input for TDMS BPL, and TDMS BPL transfers this subset of data. However, as the package being executed is a TDMS CRM package, the RFCs connect two CRM systems. This connection has to be changed so that the RFCs connect the two ERP systems in the landscape under consideration.

d. The modified TDMS BPL tree integrated in TDMS CRM is used to transfer delta data of business objects such as sales documents, business partner master data, and material master data from the productive ERP system to the non-productive ERP system. This data is transferred with TDMS standard scenarios.

When these steps have been completed successfully, the ERP and CRM test systems should be consistent.

4. Process the TDMS time-based scenario for BI (TDBTM) up to the post-processing phase.

5. In the post-processing phase for the TDMS time-based scenario for BI (TDBTM), additional steps have been included to delete delta data of interval T2 (picture above) from the test BI system. The following steps are processed:

e. A new activity determines the end date of the data selection in the TDMS CRM package and the end date of the data selection in the TDMS BI package. It also obtains the requests and the corresponding objects and stores them in a DB table.

f. Another new activity fetches the requests generated in the BI system between the two dates, that is, between the data selection of the CRM package and the data selection of the BI package, from the DB table of the previous step. Standard SAP BI functions can be called to delete the requests. The standard report (function module) deletes the data associated with each request and also removes it from the corresponding control tables.

When you follow this approach, the three test systems for ERP, CRM and BI should be consistent.


10 Appendix

10.1 Related Guides

The master guide gives an overview of the features of SAP TDMS and ways of working with this software. The security guide provides additional information about security-related aspects to be considered in the context of SAP TDMS, such as user roles and authorizations.

Data management guide for SAP Business Suite - This document provides customers and consultants with information about tables that show the strongest growth in data volume. Additional table information is also provided, for example, how to deactivate updating, how to summarize (aggregate) data, and how data can be removed from the tables. The document also provides hints and recommendations on performance-critical processes and how they can be improved by decreasing data volumes. The overall performance of SAP TDMS can be improved by following some of the guidelines in the document.

Data management guide: http://service.sap.com/~sapidb/011000358700005044382000E

Master guide: http://service.sap.com/~sapidb/011000358700006332942006E

Security guide: http://service.sap.com/~sapidb/011000358700000118162008E

BPL Scenario guide: http://service.sap.com/~sapidb/012002523100007398392010E

