What is SAP ALE – Details

ALE (Application Link Enabling) is a proprietary SAP technology that enables data communication between two or more SAP R/3 systems, or between R/3 and external systems.

When a new enterprise resource planning (ERP) solution such as R/3 is implemented, companies have to interface the ERP system with legacy systems or other ERP systems. ALE provides intelligent mechanisms whereby clients can achieve integration as well as distribution of applications and data.

SAP ALE technology facilitates rapid application prototyping and application interface development, thus reducing implementation time. The ALE components are inherently integrated with SAP applications and are robust, leading to a highly reliable system. ALE comes with application distribution/integration scenarios as well as a set of tools, programs, data definitions, and methodologies that you can easily configure to get an interface up and running.

The message-based architecture of ALE comprises three layers:

Application layer. This layer provides ALE with an interface to R/3 to originate or receive messages containing data to or from external (or other R/3) systems.

Distribution layer. The distribution layer filters and converts messages containing data based on predefined or custom-defined rule sets. These conversions may occur to ensure compatibility between different releases of R/3 and R/2.

Communications layer. ALE communications are carried out both synchronously and asynchronously. Synchronous message transmissions are typically used for the direct reading of control data, while asynchronous message transmissions are used for transmitting or receiving application data. It is also possible to achieve a pseudo-real-time exchange of application data using transactional Remote Function Calls (tRFC), which I’ll detail later in this article series.

ALE scenarios fall into three categories: master data, transactional data, and control data distribution. Although the underlying principles are the same for the different categories, there are differences in their functions and configurations. SAP delivers over 200 ALE scenarios, which means that approximately 200 application areas can leverage ALE technology for data distribution or communication. A subset of these scenarios is supported by R/3 for Electronic Data Interchange (EDI).

There are several advantages to using ALE technology:

  • SAP ensures release independence.
  • Robust mechanisms capture changes to master data or transactional data.
  • ALE offers better inbound interface performance compared to traditional techniques such as Batch Data Communications (BDC) or Call Transactions. ALE does not use screen-based batch input.
  • ALE is black-box technology: the developer works at a higher level of abstraction and is shielded from low-level communication details.
  • Most ALE interfaces can be prototyped in a couple of days, resulting in smaller implementation timelines.
  • There is little or no ABAP program development. In most cases, the SAP-delivered ALE functionality meets the requirements.
  • ALE offers a systematic and organized approach to custom enhancements and extensions.
  • An ALE interface is easy to maintain due to the structured approach and minimal number of development objects.
  • ALE is the strategic architecture for R/3 “loose coupling” with legacy and third-party applications and is a Business Framework key element. It provides a message-based architecture for asynchronous integration of Business Framework components, including Business Components, Business Objects, and BAPIs.

ALE Building Blocks and Concepts

The following building blocks are fundamental to ALE functionality:

Logical System. A Logical System (LS) is the representation of an R/3 or external system in SAP R/3 for the distribution of data to and from the R/3 System. Every R/3 client used for ALE or EDI has to have a base LS associated with the client. This LS becomes the “sender” for outbound messages and a “receiver” for inbound messages. In addition to the base LS, a second LS should be created within that R/3 system for each R/3 or external system used for ALE interfaces. In an inbound ALE interface, this second LS represents the sender (another R/3 or external system) with respect to the base LS (receiver). In an outbound ALE interface, this second LS is the receiver on behalf of the R/3 or external system with respect to the base LS (sender).

Message type. A message type represents the application message exchanged between R/3 systems and R/3 and an external system. A message type characterizes the data sent across systems and relates to the structure of the data called an IDOC type (see below). For example, MATMAS is a message type for Material Master, and INVOIC is a message type for an Invoice (Billing Document). ALE supports over 200 message types in R/3 and about 200 application areas.

IDOC type and IDOC. An Intermediate Document (IDOC) type represents the structure of the data associated with a message type (DEBMAS02 for message type DEBMAS — Customer Master, and WMMBID02 for message type WMMBXY — Goods Movements), while an IDOC is an object containing the data of a particular message type. IDOCs are data containers with intelligence built in. Each IDOC contains one and only one business object. For example, an IDOC of type SHPMNT01, message type SHPMNT, will contain the data of only one Shipment Document. Generally, the architecture of an IDOC type is independent of the message type, since ALE allows an IDOC type to be assigned to (or redefined for) any message type.

Customer Distribution Model. In an R/3 system, the Customer Distribution Model is a tool that stores information about the flow of messages across various systems. The Customer Distribution Model uses the SAP-delivered Distribution Reference Model as its basis (although it can also contain distribution scenarios other than those stored in the Distribution Reference Model). The Customer Distribution Model stores data that dictates which messages (message types) flow to which Logical Systems. Many messages can flow to one Logical System, and one message can flow to several systems. With the use of filter objects and listings (which I’ll describe shortly), it is also possible to specify in a model the criteria for filtering information for a specific system. A Customer Distribution Model can be created in an R/3 system with that client’s base Logical System as the “sender” Logical System.

Use transaction BD64 or the following menu path to maintain the model: From the IMG (Implementation Guide), Cross-Application Components -> Distribution (ALE) (*) -> Distribution Customer Model -> Maintain Distribution Customer Model Directly -> Maintain Distribution Customer Model Directly.

The IMG for ALE, Distribution (ALE) (*), can also be directly invoked by using transaction SALE. This transaction is used very frequently in ALE. (I’ll discuss the process of creating, maintaining, and distributing a Customer Distribution Model later in this article.)

Filter object type and filter objects. A filter object type is used in the Customer Distribution Model to impose a selection criterion on the message (type) flowing to a Logical System. A filter object type with a value associated with it is called a filter object. For example, BUKRS (Company Code) is a filter object type available for message type DEBMAS (Customer Master). To distribute Customer master data of only Company Code “1001” to a particular Logical System, you would use filter object type BUKRS to create a filter object with value BUKRS = 1001. You can have multiple filter objects with different values for the same message type associated with that LS. While determining the receiver(s) of a particular message based on the Distribution Model, ALE performs object filtering. As with the Customer Distribution Model, filter objects are relevant only to ALE.

(I’ll explain the steps to create a filter object, as well as how to create a new filter object type, later in this article.)

Listings. Listings are a special type of filter object and are also used to specify a selection criterion for distributing master data. Listings are based on the SAP Classification system (classes and characteristics) and are applicable only to Material, Customer, and Vendor master data. Once a list has been created, based on certain classification information, using the ALE customizing menu, it is associated with an LS. The listing is then used to create a filter object of type LISTING for a message type associated with that LS.

Lists are maintained and allocated to an LS from the ALE customizing guide (transaction SALE) via Distribution Scenarios -> Master Data Distribution -> Distribution via Listings.

Change pointers. Change pointers are R/3 objects that mark changes to SAP master data. They are managed by the Shared Master Data (SMD) tool and are based on Change Document (CD) objects. CD objects record the changes occurring to master data at the field level; these changes are stored in tables CDHDR (header table) and CDPOS (detail table). ALE configuration provides a link between CD objects and change pointers, and internal mechanisms update tables BDCP and BDCPS, which hold the change pointers. While CD objects are application-data-specific, the processing status of change pointers is message-type-specific. ALE change pointers are activated first at a general level and then at the message-type level.
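
For illustration, the following sketch reads the change pointers recorded for a message type with the standard function module CHANGE_POINTERS_READ; the parameter and field names shown are assumptions based on common usage and should be verified in transaction SE37 for your release:

  DATA: lt_pointers TYPE STANDARD TABLE OF bdcp,
        ls_pointer  TYPE bdcp.

  " Read change pointers written for Material Master changes (MATMAS).
  CALL FUNCTION 'CHANGE_POINTERS_READ'
    EXPORTING
      message_type    = 'MATMAS'
    TABLES
      change_pointers = lt_pointers.

  LOOP AT lt_pointers INTO ls_pointer.
    " Each entry points to the changed table, key, and field.
    WRITE: / ls_pointer-tabname, ls_pointer-tabkey, ls_pointer-fldname.
  ENDLOOP.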

ALE provides powerful capabilities to capture changes occurring to master data and to distribute them via the IDOC interface. This feature can be used to keep two or more systems synchronized with respect to master data.

Ports. A port is a logical representation of a communication channel in SAP, with the data communicated being IDOCs. Four types of ports can be defined in R/3: tRFC, File, R/2, and Internet. ALE can use all port types to distribute IDOCs, while EDI typically uses a file-based port. tRFC and File ports can be linked to RFC destinations of type R/3 connection or TCP/IP. Through these RFC destinations, a port can also trigger scripts that invoke EDI subsystems, IDOC mapping software, or FTP.

You can maintain ports by executing transaction WE21 or WEDI, or selecting IDOC -> Port Definition. RFC destinations can be maintained using transaction SM59.

Process codes. Process codes are used in ALE and EDI to identify the function module or API to be invoked for subsequent processing. An inbound interface uses a process code to determine the application module that will process the inbound IDOC to an SAP application object such as a sales (Customer) order (process code — ORDE), Material Master record (MATM), or a shipment (SHIP). An outbound interface uses process codes only in the case of applications that use message control (which I’ll get to shortly). In this case, the process code identifies the application module that populates the IDOC with application data. Each process code is associated with a message type. Outbound process codes are stored in table TEDE1, while inbound process codes are stored in TEDE2.

Use transaction WE41 to display outbound process codes and WE42 to display inbound codes, or from WEDI select Control -> Outbound Process Codes/Inbound Process Codes, or from ALE customizing SALE select Extensions -> Outbound -> Maintain Process Code, or Extensions -> Inbound -> Maintain Process Code.

Message control and output type. In R/3, message control is a mechanism by which documents are output based on certain selection criteria, requirements, and sequences. Message control determines the type of document, its timing, number, and medium (print, fax, ALE, or EDI). Outbound messages in SD (Sales and Distribution) and MM (Materials Management, Purchasing) are created and processed by message control records. The output records are stored in the NAST table.

Message control uses the condition technique. The conditions for creating an output message are stored in condition tables that have selection fields picked from a catalog of application fields/tables. To determine if an application document qualifies for output, search strategies are used through access sequences, output procedures, and requirements. Once a message qualifies for output, message control modules use the parameters set in the condition type or output type to determine the timing of transmission and the medium of the message. The output type also specifies the program or module to be invoked to create the output.

Message/output determinations are concepts applicable not only to EDI and ALE, but also to other output mediums.

Partner profile. A partner profile is an identifier for a system used for communicating messages. There are four types of partner profiles: KU for Customer, LI for Vendor, B for Bank, and LS for Logical System. KU, LI, and B are used for EDI partners, while LS is used for ALE communications. Every partner profile used for ALE must be based on an existing LS.

A partner profile brings together several ALE and EDI elements to define the parameters of communication between two or more systems. Other than general information, you have to maintain inbound parameters, outbound parameters, and message control. The main parameters are message types, IDOC types, process codes, partner functions, application identifiers, message functions, output types, and ports. Other parameters also determine the mode of processing and error handling.

A partner profile plays a major role and can be viewed as a gateway for ALE and EDI communications. It routes the specified messages through defined IDOC types to a given port after invoking the appropriate function modules for outbound processing. It receives IDOCs of a specific type, and it identifies modules to post data to the application databases in the case of inbound interfaces.

Use transaction WE20 to maintain partner profiles, or from WEDI select IDOC -> Partner Profile, or from SALE (ALE Customizing guide) -> Communication -> Manual maintenance of partner profiles -> Maintain partner profiles.

On both the outbound and inbound sides, processing takes place in the application layer and the ALE layer; the communication layer then transfers the data by transactional Remote Function Call (tRFC) or by the EDI file interface.

The process can be divided into the following sub-processes:

Outbound Processing

  • Receiver determination
  • Calling the generated outbound function module
  • Conversion of BAPI call into IDoc
  • Segment filtering
  • Field conversion
  • IDoc version change
  • Dispatch control

IDoc dispatch

IDocs are sent in the communication layer by transactional Remote Function Call (tRFC) or by a file interface (for example, for EDI).

tRFC guarantees that the data is transferred once only.

Inbound Processing

  • Segment filtering
  • Field conversion
  • Transfer control
  • Conversion of IDoc into BAPI call
  • BAPI function module call
  • Determination of IDoc status
  • Posting of application data and IDoc status

Error Control

Interrogating the Distribution Model

You do not have to interrogate the distribution model; it is optional.

There are two function modules that can interrogate the ALE distribution model: ALE_MODEL_DETERMINE_IF_TO_SEND and ALE_MODEL_INFO_GET. ALE_MODEL_DETERMINE_IF_TO_SEND is called with the message type and, if it is already known in the application, with the logical receiving system. A check is made in the ALE distribution model as to whether a message flow has been maintained for the input parameters. If not, the export parameter IDOC_MUST_BE_SEND is returned initial; otherwise, it is returned as “X”. Any filter objects in the distribution model that control this message flow are not evaluated. An IDoc needs to be created only if ALE_MODEL_DETERMINE_IF_TO_SEND returns an “X”.
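
As an illustration, a call might look like the following sketch. The receiving-system parameter name and the sample message type are assumptions, so verify the module’s interface in transaction SE37 before using it:

  DATA: lv_send(1) TYPE c.

  " Ask the distribution model whether an IDoc needs to be created at all.
  CALL FUNCTION 'ALE_MODEL_DETERMINE_IF_TO_SEND'
    EXPORTING
      message_type      = 'ZEXAMPLE'      " hypothetical message type
      receiving_system  = 'CHRCLSR301'    " assumed parameter name; optional
    IMPORTING
      idoc_must_be_send = lv_send.

  IF lv_send = 'X'.
    " Build the IDoc and hand it over to MASTER_IDOC_DISTRIBUTE (see below).
  ENDIF.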

Module ALE_MODEL_INFO_GET is used for more complex queries of the ALE distribution model. It is called with the message type to be dispatched and returns a table containing all the potential recipients of that message type, together with the associated filter objects. Note that there may be several entries in the returned table for one receiver. If there are no entries in the distribution model, the exception NO_MODEL_INFO_FOUND is raised and no IDoc has to be created; otherwise an IDoc does have to be created. You will find the receiving logical system in the field RCVSYSTEM of each table entry.
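
A sketch of the corresponding call is shown below; the table parameter name (MODEL_INFO) and its line structure (BDI_MODEL, which carries the RCVSYSTEM field) are assumptions to be checked in SE37:

  DATA: lt_model TYPE STANDARD TABLE OF bdi_model,
        ls_model TYPE bdi_model.

  CALL FUNCTION 'ALE_MODEL_INFO_GET'
    EXPORTING
      message_type        = 'ZEXAMPLE'    " hypothetical message type
    TABLES
      model_info          = lt_model
    EXCEPTIONS
      no_model_info_found = 1
      OTHERS              = 2.

  IF sy-subrc = 0.
    LOOP AT lt_model INTO ls_model.
      " ls_model-rcvsystem holds the receiving logical system.
    ENDLOOP.
  ENDIF.
  " If NO_MODEL_INFO_FOUND was raised, no IDoc has to be created.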

The end result, that is, whether a receiver actually receives an IDoc and what that IDoc looks like, is determined only after all the filter objects for the message flow in the distribution model have been evaluated. This evaluation is carried out in the ALE layer.

Structure of Control Records

The control record is a field string with the structure EDIDC. The relevant fields are listed below; all other fields should be left with their initial values.

  • MESTYP: logical message type; conveys the business meaning of the message. Mandatory field.
  • IDOCTP: basic IDoc type; identifies the structure (layout) used by this message. Mandatory field.
  • CIMTYP: structure of the customer extension; if an SAP basic type has been extended, the name of the extension structure goes here. Mandatory field if an enhancement has been made, otherwise initial.
  • RCVPRT: partner type of the receiver; “LS” (logical system) for ALE. Optional field; see below.
  • RCVPRN: partner number of the receiver, i.e., the receiving logical system for ALE. Optional field; see below.
  • RCVPFC: partner function of the receiver; normally initial for ALE. Optional field; see below.

When the receiving system has been determined from the distribution model, it can be written to field RCVPRN; field RCVPRT must then be filled with “LS” (for logical system). If necessary, a partner function can be written to field RCVPFC, but the partner function is not normally used in ALE. What is important is that either both RCVPRT and RCVPRN are left empty or both are filled. If RCVPRT and RCVPRN are passed with their initial values, the receivers are determined entirely in the ALE layer.
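
For illustration, filling the relevant control record fields might look like this minimal sketch; the message type, IDoc type, and receiver shown are hypothetical values:

  DATA: ls_control TYPE edidc.

  CLEAR ls_control.
  ls_control-mestyp = 'ZEXAMPLE'.     " hypothetical logical message type
  ls_control-idoctp = 'ZEXAMPLE01'.   " hypothetical basic IDoc type
  " CIMTYP stays initial because no customer extension is assumed here.
  " Fill both receiver fields or leave both initial:
  ls_control-rcvprt = 'LS'.           " partner type: logical system
  ls_control-rcvprn = 'CHRCLSR301'.   " receiving logical system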

Structure of the Data Records

The data records of an IDoc are created in an internal table with the structure EDIDD. The relevant fields are:

  • SEGNAM: segment type of the IDoc data record.
  • SDATA: 1000-byte character field for the data carried by the IDoc.

The remaining fields in EDIDD should be left initial. All the segment types and their sequence are specified in the IDoc structure; the data records are built in this sequence and added to the internal table. For each segment type of the IDoc structure, there is a DDIC structure with the same name. A field string with this structure is used to create a data record: the application data is mapped to the field string, the segment type is written to the field SEGNAM, and the field string is written to the field SDATA. The data record is then appended to the internal table with the structure EDIDD.
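
Based on this description, building one data record for a hypothetical segment Z1EXAMPLE (with an assumed DDIC structure of the same name) might look as follows:

  DATA: lt_data    TYPE STANDARD TABLE OF edidd,
        ls_data    TYPE edidd,
        ls_segment TYPE z1example.    " hypothetical segment structure

  " Map the application data to the segment field string.
  ls_segment-matnr = 'MAT-0001'.      " illustrative fields of the segment
  ls_segment-menge = '10'.

  " Wrap the field string into an IDoc data record.
  CLEAR ls_data.
  ls_data-segnam = 'Z1EXAMPLE'.       " segment type
  ls_data-sdata  = ls_segment.        " field string moved into the 1000-byte data field
  APPEND ls_data TO lt_data.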

Converting Currency Amounts

Currency amounts have to be converted from the SAP system format to a format that can be understood externally. In the SAP system, all currency amounts are stored with two decimal places; if a currency has a different number of decimal places, the currency amount has to be converted. You can use function module CURRENCY_AMOUNT_SAP_TO_IDOC for this conversion; it performs a suitable currency amount conversion for IDocs. We recommend that you encapsulate the code in a subroutine <SEGMENT-TYP>_CURRENCY_SAP_TO_IDOC.
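
A minimal sketch of such a subroutine is shown below, assuming the commonly documented parameter names of CURRENCY_AMOUNT_SAP_TO_IDOC (CURRENCY, SAP_AMOUNT, IDOC_AMOUNT); verify them in SE37:

  " Hypothetical subroutine following the recommended naming pattern.
  FORM z1example_currency_sap_to_idoc USING    p_currency
                                               p_sap_amount
                                      CHANGING p_idoc_amount.
    CALL FUNCTION 'CURRENCY_AMOUNT_SAP_TO_IDOC'
      EXPORTING
        currency    = p_currency
        sap_amount  = p_sap_amount
      IMPORTING
        idoc_amount = p_idoc_amount.
  ENDFORM.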

Replacing SAP Codes With ISO Codes

There are ISO codes for country keys, currency keys, units of measure, and shipping instructions. According to SAP design guidelines, you should use ISO codes in an IDoc if they are available; when you set up the IDoc, the SAP codes therefore have to be replaced by ISO codes. You can use the following function modules for the conversion:

  • Currency keys: CURRENCY_CODE_SAP_TO_ISO
  • Country keys: COUNTRY_CODE_SAP_TO_ISO
  • Units of measure: UNIT_OF_MEASURE_SAP_TO_ISO
  • Shipping instructions: SAP_TO_ISO_PACKAGE_TYPE_CODE

We recommend that you encapsulate the code in a subroutine <SEGMENT-TYP>_CODES_SAP_TO_ISO.
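
As a sketch, the currency-key conversion could be wrapped as follows; the parameter and exception names (SAP_CODE, ISO_CODE, NOT_FOUND) are the commonly documented ones and should be verified in SE37:

  " Hypothetical subroutine following the recommended naming pattern.
  FORM z1example_codes_sap_to_iso USING    p_sap_currency
                                  CHANGING p_iso_currency.
    CALL FUNCTION 'CURRENCY_CODE_SAP_TO_ISO'
      EXPORTING
        sap_code  = p_sap_currency
      IMPORTING
        iso_code  = p_iso_currency
      EXCEPTIONS
        not_found = 1
        OTHERS    = 2.
    IF sy-subrc <> 0.
      " Fall back to the SAP code if no ISO code is maintained.
      p_iso_currency = p_sap_currency.
    ENDIF.
  ENDFORM.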

Left-justified Filling of IDoc Fields

All fields must be filled left-justified. This happens automatically for character fields. If the original field in the application is a non-character field, you must execute a CONDENSE on the corresponding field in the IDoc segment. To find out which fields require a CONDENSE, see the documentation structure for the segment type: its name begins with “E3” or “Z3” (instead of “E1” or “Z1”) but is otherwise identical. It contains the same fields as the “E1” or “Z1” structure, but with the original data elements and domains of the application. All fields with a data type other than CHAR, CUKY, CLNT, ACCP, NUMC, DATS, TIMS, or UNIT require a CONDENSE. We recommend that you encapsulate the code in a subroutine <SEGMENT-TYP>_CONDENSE.
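
For example, a condense subroutine for the hypothetical segment Z1EXAMPLE (with MENGE assumed to originate from a non-character quantity field) might look like this:

  " Hypothetical subroutine that left-justifies fields whose original
  " application fields are non-character (see the E3/Z3 documentation structure).
  FORM z1example_condense CHANGING p_segment TYPE z1example.
    CONDENSE p_segment-menge.
  ENDFORM.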

Calling MASTER_IDOC_DISTRIBUTE

After MASTER_IDOC_DISTRIBUTE has been called, you must issue a COMMIT WORK; the standard database commit at the end of the transaction is not sufficient. The COMMIT WORK does not have to follow the call directly; it can be issued at a higher call level or after multiple calls of MASTER_IDOC_DISTRIBUTE. Note that the IDocs created remain locked until the calling transaction has been completed. If you want to unlock them earlier, you can call one of the following function modules:

  • DEQUEUE_ALL releases all locked objects.
  • EDI_DOCUMENT_DEQUEUE_LATER releases the IDocs passed to it as parameters.

If the application document is created via the update program, the call of MASTER_IDOC_DISTRIBUTE must also be performed in the update task (unless an update call has already been made at a higher level).
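
Putting the pieces together, a call might look like the sketch below; the parameter names MASTER_IDOC_CONTROL and MASTER_IDOC_DATA are the commonly documented ones, and only the exception named in the text is handled explicitly:

  DATA: lt_comm_idocs TYPE STANDARD TABLE OF edidc.

  " ls_control and lt_data were built as described above.
  CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE'
    EXPORTING
      master_idoc_control        = ls_control
    TABLES
      communication_idoc_control = lt_comm_idocs
      master_idoc_data           = lt_data
    EXCEPTIONS
      error_in_idoc_control      = 1
      OTHERS                     = 2.

  IF sy-subrc = 0.
    COMMIT WORK.   " mandatory; the implicit commit at transaction end is not enough
  ENDIF.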

Exceptions and Export Parameters of MASTER_IDOC_DISTRIBUTE

The module uses the table parameter COMMUNICATION_IDOC_CONTROL to return the control records of the IDocs that were created in the database; to find out, for example, the IDoc number and the current status, look at the fields DOCNUM and STATUS. In general, this table is not relevant to the calling application. If the IDoc recipient was passed in the control record when MASTER_IDOC_DISTRIBUTE was called, but the distribution model does not allow that recipient to receive this IDoc, the exception ERROR_IN_IDOC_CONTROL is raised with an appropriate error message. If no receiver was given in the control record and ALE does not find a recipient in the distribution model, no exception is raised; if you want to react to this case, you must check the returned table COMMUNICATION_IDOC_CONTROL. If this table is empty, no IDoc was created. This different behavior for the initial and non-initial receiver has historical reasons: an initial recipient is the standard case for master data replication, where it is of no further interest whether an IDoc was actually created, while a preset receiver is the standard case for dispatching transactional data, where a missing IDoc is interpreted as an error.
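
Continuing the sketch above, the calling program can evaluate the returned table to detect the case where no receiver was found:

  DATA: ls_comm_idoc TYPE edidc.

  IF lt_comm_idocs IS INITIAL.
    " No receiver was determined and no communication IDoc was created.
  ELSE.
    LOOP AT lt_comm_idocs INTO ls_comm_idoc.
      WRITE: / ls_comm_idoc-docnum, ls_comm_idoc-status.   " IDoc number and current status
    ENDLOOP.
  ENDIF.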

Sample ALE scenarios

Now let’s explore a couple of sample ALE scenarios. The first illustrates a few interfaces with an external warehouse management system using ALE technology. The second depicts the distribution of master data between two or more R/3 systems. These scenarios are a small sample of the multitude of possible ALE interfaces.

Example 1: Consider a business scenario in which R/3 needs to be interfaced with an external warehouse management system (WMS) (see Figure 3). This scenario assumes that the Inventory Management module is being implemented. In an outbound interface, the SAP application communicates to the WMS picking requests — Materials in the warehouse that need to be picked for packing and shipping. Message type PICKSD, whose corresponding IDOC type is SDPIOD01, is used. This IDOC consists of a header with fields for delivery number, shipping point, total weight of delivery, units of measurement, and the name and address of the ship-to party. The header is followed by one or more detail segments that contain the delivery items with fields for item number, Material number, quantity, and units of measure.

Figure 3 Sample ALE scenario: interface with a Warehouse Management System.

After the receipt of the picking request and completion of the operation, the WMS sends a pick confirmation back to SAP. This is an inbound interface to SAP from the external system where message type SDPICK is used. Its corresponding IDOC type is SDPIID01. This IDOC type also has a header segment followed by one or more detail segments. The IDOC communicates the Material quantities picked by the warehouse based on deliveries sent earlier. It can handle batch splits, movement type splits, and also invoke a “post goods issue” process.

As seen in Figure 3, there are several inbound inventory interfaces that can be handled by one single message type WMMBXY. These inbound interfaces are typically goods movement transactions, including inventory receipts (with or without a purchase order), inventory status change, goods receipts against production orders, and inventory reconciliation. Most goods movement types are supported by this message type. The corresponding IDOC type is WMMBID01 (or WMMBID02 in 4.x releases), which can handle multiple line items for a single header. In the case of inventory reconciliation, ALE function modules need to be enhanced to modify the data contained in the inbound IDOC for inventory adjustments based on comparing the stock in WMS versus SAP. This can be achieved easily with a few lines of code in a Customer function (user exit) provided by SAP in the ALE function module.

Example 2: Let’s look at another simple ALE scenario distributing master data across multiple R/3 systems (see Figure 4, page 86). In large companies, there are advantages to distributing applications and databases, especially if the differentiating parameters can be used to segment the data discretely, such as plants, lines of business, geographic locations, and departments. In this example, the company headquarters is responsible for maintaining master data such as Customer Master and Material Master. This is loosely coupled with two different plants/companies, 1001/US01 and 2001/EU01, to which master data is distributed. ALE provides the capability to filter data and distribute it only to relevant systems. We can distribute the master data pertaining to that particular plant/company code. The filter object type used for message type MATMAS (Material Master) is WERKS (plant), and for DEBMAS (Customer Master), it is BUKRS (company code). Initially, after the Customer Master and Material Master are loaded during conversion at the headquarters, we can transmit the relevant data to each plant or company. Then, on an ongoing basis, we can capture the changes occurring to the master data at headquarters and communicate them to the corresponding plant/company.

Figure 4 Sample ALE scenario: distributing master data.

R/3-to-R/3 Interfaces

Now let’s talk about how to build interfaces between two or more R/3 systems. While the underlying concepts are almost the same for either R/3-to-R/3 or R/3-to-external system interfaces, there are important differences in configurations and in the mode of communications. We will work with an example of distributing characteristics and classes from one R/3 instance to another. While using objects such as Materials, Customers, and Vendors, it is often necessary to classify these objects to further describe their nature and to distinguish them from other objects. In SAP, we use characteristics and classes to classify objects. Characteristics are attributes that further describe an object. For example, a chemical’s temperature sensitivity and a customer’s store shelf square-footage are characteristics of objects that can be maintained in the R/3 classification system. Classes are groups of characteristics that conform to a class type — such as Material or Vendor. While the term classification data refers to the actual values of the characteristics, classes and characteristics can be considered configuration data. In SAP, it is not possible to transport class and characteristic data using the Correction and Transport System (CTS) across systems (such as development, QA, or production). ALE provides message types, IDOC types, and function modules to distribute class, characteristic, and classification data to other systems. Let’s walk through the process of building an interface to distribute characteristics and classes from one R/3 instance to another.

The message types available for this purpose are:

CHRMAS — Characteristics master
CLSMAS — Class master
CLFMAS — Classification data.

If you are using the Classification system for master data, such as Materials, Customers, and Vendors, in addition to distributing the master data, you will need to distribute the classification data and use the message type CLFMAS. In this example, we’ll focus on distributing the characteristics and class master using message types CHRMAS and CLSMAS, respectively. We’ll communicate these messages to another R/3 system using tRFC, and we’ll learn to configure RFC destinations and R/3 connections. We’ll also discuss monitoring aspects of tRFC and get to know programs that will confirm the status of communications. While configuring the Distribution Model, we need to create new filter objects to distribute only the configuration data created in the Classification system, because SAP delivers certain characteristics with the system that we do not need to transport to other systems.

Step 1: Maintaining the Logical System and the Distribution Model. Let’s create a new LS called CHRCLSR301 that represents the receiving R/3 system.

Here are the steps for configuring the Distribution Model:

• Execute transaction BD64.
• Enter the base LS defined for your client (say, BK1CLNT010). This LS should have been created and should be allocated to the client using transaction SCC4.
• Enter CHRCLSMODL (as an example) for the name of the Distribution Model.
• Click on Create. You will see a hierarchical listing with BK1CLNT010 as the parent and all other LSs, including CHRCLSR301, under it.
• After placing the cursor on CHRCLSR301, click on create message type.
• Enter CHRMAS.
• Repeat this operation for CLSMAS.
• If you want to distribute classes pertaining to, for example, Material and Customers only, then you have the option of specifying a filter object for message type CLSMAS. One of the object types available is KLART — Class Type. To specify the filter, place cursor on message type CLSMAS under LS CHRCLSR301. Click on create filter object. You will see a pop-up screen with open fields Object Type and Object. Pull down the list of object types (F4), and select KLART. Enter value “001” for class type Materials in the field Object. Repeat operation for object value “011” for class type Customers.

• SAP delivers certain characteristics that are used throughout the system. We do not need to transport (distribute) these to the other R/3 system. We need to use a filter object to restrict the characteristics to perhaps those pertaining to Materials and Customers. Follow the instructions described in the next section to create new filter object types.

Step 2: Creating new filter object types. Filter objects are criteria used for selecting data of a particular message type in order to create the required IDOCs. A filter object type is basically a field on one of the IDOC segments of the IDOC type corresponding to that message type. We first need to identify the field on the IDOC that can be used for filtering data. For example, if we use the field ATKLA (Characteristics Group) to group similar characteristics that we create, then we can use the field to create the filter object type. Upon scrutinizing the IDOC type CHRMAS01, we find that ATKLA is a field on the segment E1CABNM (see Figure 5). Further:

Figure 5 Defining a new filter object type.

• From transaction SALE, choose Extensions -> ALE Object Maintenance -> Maintain object types (for separate message types), and select Execute.
• You will see a pop-up screen for message type. Enter CHRMAS.
• Click on new entries.
• Enter ZATKLA (for example) for Object Type, E1CABNM for Segment Type, 1 for Sequential Number, ATKLA for Field Name, 86 for Byte Offset (from documentation on IDOC type CHRMAS01), and 10 for Internal Length.
• Save.

Now that we have created a filter object type for use with message type CHRMAS, let’s complete the configuration of our Customer distribution CHRCLSMODL by executing transaction BD64:

• Expand the tree for the CHRCLSR301 Logical System.
• Place cursor on message type CHRMAS.
• Click on create filter object.
• Pull down the menu (F4) on field Filter Object Type, and select the object type that we created—ZATKLA.
• Enter value CUSTOMER in the field Object.
• Repeat this operation and enter MATERIAL in the field Object (see Figure 6).
• Save.

Figure 6 Customer Distribution Model for characteristics and classes.

Note: Menu paths may vary slightly depending on your version of R/3.

Step 3: Creating a CPIC user on target system. To communicate and process messages in the remote system, SAP uses a user ID on the target system. This user ID needs to be of type CPIC. Though the user could be a normal dialog user, a user of type CPIC should be used to preclude performance problems such as “maximum number of logons exceeded.” Ask your Basis administrator to set up this user ID. Ensure that the ID has all the authorizations required to update that system’s databases for characteristics and classes.

Step 4: Maintaining the RFC destination. R/3-to-R/3 communication uses tRFC. RFCs are Remote Function Calls used to invoke function modules, typically on remote systems, for transactional or asynchronous activities. The word transactional prefixed to RFC indicates that the function is invoked once per logical unit of work, which could be one Material Master, one delivery, or one invoice. Both tRFC and aRFC (asynchronous Remote Function Call) have advanced mechanisms to track data packet communications and to maintain status. For example, to ensure delivery of data, tRFC calls are retried until they complete successfully.

To set up an RFC destination for our interface:

  • Execute transaction SM59.
  • Place the cursor on R/3 Connections. Click on create.
  • Enter the name of the RFC destination, say CHRCLSR301.
  • Enter “3” for connection type—this is an R/3 connection.
  • Enter a description of the RFC destination.
  • Press Enter. You will see a few additional fields appear on the screen.
  • Enter the name and system number of the other R/3 server in the field Target Machine.
  • Enter logon information such as Client, Language, User ID (the CPIC user ID defined earlier), and password.
  • Save.
  • Click on test connection. You will get a list of connection and communication timings for logon and transfer of a certain number of bytes. This does not verify the password entered earlier. If the password is incorrect, you will notice system problems on the target instance.
  • Click on remote logon. If you are using a CPIC user ID, there will be no action taken. If it is a dialog user, you will be logged on to the target system. This is only a test (see Figure 7, page 90).

    Figure 7 Creating an RFC Destination.

It is important to ensure that the name of the LS and the RFC destination are the same. You can also access this function using SALE -> Communications -> Define RFC destination.

Step 5: Generating RFC port and partner profile. Here we learn how to generate RFC ports and partner profiles for an R/3-R/3 interface using SAP functionality. The port definition is generated based on the RFC destination that we created in the previous step, while the partner profile is generated based on the Customer Distribution Model we created, along with the port generated.

Follow these steps to generate these objects:

• Go to SALE (ALE Customizing) -> Communications -> Generate Partner Profiles.
• On the screen that appears, enter the Customer Model CHRCLSMODL as defined earlier.
• Enter details for receiver of notifications. This is to identify the recipient of workflow messages in case of errors.
• Switch the Outbound Parameters to Collect IDOCs.
• Switch the Inbound Parameters to Background, so there is no overriding by express flags.
• Execute.
• You will see a list of messages confirming the generation of a port and partner profile.

The tRFC port has an internally assigned number. If you browse the partner profile CHRCLSR301 generated in this step, you will notice there are three entries generated for Outbound Parameters for message types CHRMAS, CLSMAS, and SYNCH. The message type SYNCH is for synchronous communications between the two R/3 systems and is used for validation of ALE functions. The port associated with the three outbound parameters’ entries is the port generated in this step.

The objects created in this process must be generated in the client/instance from which the communications are originating.

Step 6: Creating a receiving partner profile on the target system. We must now create an LS and a partner profile for receiving messages from the sending system. This is a mirror image of the sending LS on the target system.

Here are the steps of the process:

• Create a Logical System BK1CLNT010 on the target system.
• Create a partner profile with partner number BK1CLNT010 on the target system.
• Maintain its Inbound Parameters. Create a new entry for message type CHRMAS, with process code CHRM, and processing mode Background, with no express override flag. Create a similar entry for message type CLSMAS, with process code CLSM, and processing mode Background, with no express override flag.
• Save.

Step 7: Distributing the Customer model. The Customer Distribution Model CHRCLSMODL was created on the “sender” system. This determines and dictates the flow of certain message types — CHRMAS and CLSMAS in this case — to other systems. This information has to be communicated to the recipient system as well, so that it can accept and process the inbound IDOCs. ALE provides tools to “distribute” Customer models.

Here are the steps of the process:

• Go to SALE -> Distribution Customer Model -> Distribute Customer Model -> Distribute Customer Model.
• Specify the Customer Model to be distributed — CHRCLSMODL.
• Specify the Receiving Logical System — CHRCLSR301.
• Execute.

You should receive a message confirming the success of the action. Browse the Customer Distribution Model on the target system to see that it has been created correctly.

Working the Interface

Now that we have configured the system for an R/3-to-R/3 interface, let’s examine the methods for executing this interface and for understanding its results. This section will also go over techniques for monitoring the communications and will discuss performance issues related to R/3-R/3 ALE communications.

Sending data. SAP provides standard ALE programs for sending and processing IDOCs. The two programs we are going to use to send data to the target system are RBDSECHR for sending Characteristics Master and RBDSECLS for sending Class Master. Note that characteristics data has to be sent before the Class Master, since characteristics belong to classes — classes are like envelopes for characteristics. As a first step, let’s create the communication IDOCs on the sending system. To do this:

• Go to BALE -> Master Data -> Classification System -> Characteristics -> Send. This is the same as executing program RBDSECHR or transaction BD91.
• Enter the name of the Logical System — CHRCLSR301 in this case.
• Execute.

If the number of characteristics is large, then you should schedule RBDSECHR as a background job after having defined an appropriate variant. Use transaction WE05 to view the created IDOCs. They should be in status “30” — IDOC ready for dispatch (ALE service). Browse the IDOCs to understand and verify the data.
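
One way to schedule RBDSECHR in the background with a predefined variant is via the standard background job interface, as in the hypothetical sketch below (the job name and variant are assumptions):

  DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_SEND_CHRMAS',
        lv_jobcount TYPE tbtcjob-jobcount.

  " Open a background job, submit RBDSECHR with a predefined variant, and release the job.
  CALL FUNCTION 'JOB_OPEN'
    EXPORTING
      jobname  = lv_jobname
    IMPORTING
      jobcount = lv_jobcount.

  SUBMIT rbdsechr USING SELECTION-SET 'CHR_TO_R301'    " hypothetical variant
         VIA JOB lv_jobname NUMBER lv_jobcount
         AND RETURN.

  CALL FUNCTION 'JOB_CLOSE'
    EXPORTING
      jobcount  = lv_jobcount
      jobname   = lv_jobname
      strtimmed = 'X'.   " start the job immediately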

For the Class Master:

• Go to BALE -> Master Data -> Classification System -> Class -> Send. This is the same as program RBDSECLS or transaction BD92.
• Enter the class types — 001 for Material and 011 for Customer in this case. Enter the names of classes you want to distribute. Enter the name of the Logical System — CHRCLSR301 in this case.
• Execute.

If the number of classes is large, you should schedule program RBDSECLS as a background job with an appropriate variant. Display the created IDOCs, and browse them to understand and verify the data.

Dispatching IDOCs to the target system. Once you have created the communication IDOCs, the next step is to dispatch them to the target system. This is when the tRFC calls are invoked to connect and communicate to the remote system. Using transaction SM59, test the connection for RFC destination CHRCLSR301 to ensure that the communication channels are open and working.

• Go to WEDI -> Test -> Outbound from IDOCs. This is the same as program RSEOUT00 or transaction WE14.
• Enter the parameters, such as message types (CHRMAS, CLSMAS), partner number of receiver, and date last created.
• Execute.

If there is a large number of IDOCs, schedule program RSEOUT00 as a background job with an appropriate variant. After processing, you should find all IDOCs to be in a status of “03”—Data passed to port OK. Bear in mind that a status of “03” does not necessarily imply that the tRFC communication was successful. I’ll discuss a method of updating this status with the results of the final processing later in this section.

Monitoring transactional RFC. While dispatching IDOCs from one R/3 system to another using tRFC, it is possible to monitor the communications and take appropriate actions to ensure their success. The main tool is the tRFC monitor, which can be accessed via BALE -> Monitoring -> Transactional RFC; this is the same as executing transaction SM58 or program RSARFCRD. Enter the period and the user who initiated the RFC. This log displays only RFC calls that ended in error. If there are entries in the log, you can analyze them by reading the system log and the dump analysis for that period, using transactions SM21 and ST22 respectively. Analyze these errors carefully to determine the correct action: the R/3 Connection may not be active, or the user may lack the authorization needed to create entries in the target system. Unless a mandatory setting is missing, most RFC transactions should be processed within a short period of time. In case of communication errors, you may find several jobs in the Job Overview (transaction SM37) with the prefix ARFC. These are normal, since the system schedules jobs to reprocess the failed RFC transactions. However, an excessive number of such jobs can bog down the system: all batch processors get flooded, resulting in a repetitive loop that creates still more jobs. The status records of RFC calls sent from the system are stored in table ARFCSSTATE, while those of RFC calls received on the target system are stored in table ARFCRSTATE.

Processing IDOCs on the target system. When the IDOCs arrive on the target system from the host machine, they are created with a status of “64” — IDOC ready to be passed to application. This is because we chose the option of “background” on the target system, rather than processing them immediately. We now need to run a program that will process these IDOCs and post the data to the application. To do this:

• Go to BALE -> Periodic Work -> ALE Inbound IDOCs. Choose the radio button for “64” — IDOC ready to be passed to application. Execute. This is the same as executing program RBDAPP01.
• In the panel displayed, enter parameters such as message type (CHRMAS and CLSMAS in this case), creation date, or IDOC numbers. Execute.
• A list will be displayed indicating the status of the processing.

Also check the status of the IDOCs, using transaction WE02 or WE05. All IDOCs must have a status of “53”—Application document posted.

If workflow has been set up, you will receive work items in your inbox in case of errors. There you can edit the IDOCs if the errors are related to data and then reprocess them. However, in case of application errors, you can check the logs to determine the cause of these errors and take remedial action.

The necessary steps include:

• Execute transaction SLG1.

• Enter CAPI for Object (Classification system) and CAPI_LOG for Subobject. If necessary, enter time restrictions and user information. Execute.
• You will see a display of errors pertaining to characteristics, and you will also see log messages for all successful class (CLSMAS) transactions.

In case of errors due to system availability, deadlocks, or temporary data problems, it is possible to schedule program RBDMANIN in the background to reprocess the IDOCs in a status of “51” — Error: Application document not posted.

How does the sending system know that the tRFC calls to the remote system were successful? There is a program you can execute that collects information about the result of the tRFC calls on the remote system and reports the information to the host client.

To do this:

• Go to BALE -> Periodic Work -> Check IDOC Dispatch. This is the same as executing program RBDMOIND or transaction BD75.
  • Enter the IDOC creation date and the number of IDOCs after which the process can be committed, say 100. This means the program commits the status update after every 100 IDOCs it checks.

If the tRFC calls were successful, the aforementioned process should update the status of the IDOCs dispatched to “12” — Dispatch OK.