Channel: Process Integration (PI) & SOA Middleware

Rolling Software Updates of HCI-PI


Hello Colleagues!

 

In this blog, I have explained the Rolling Software Upgrade procedure of SAP HANA Cloud Integration. This blog is part of the architectural overview of SAP HANA Cloud Integration. You can find all the blogs over here.


Rolling software upgrades allow HANA Cloud Integration to upgrade the version of a worker node without interrupting the message processing service. New traffic is routed to the upgraded worker node, and existing connections are not interrupted. The cloud managed services team triggers the rolling upgrade of HANA Cloud Integration.

 

The following diagram helps explain the rolling upgrade procedure -

Rolling_Update.png

Currently, upgrades happen once a month. Every month, HANA Cloud Integration releases a new version with new capabilities, and this is available to all customers. Backward compatibility of the integration content is assured: you do not have to re-check integration flows that you developed on an older version against the latest version of the runtime.

 

A note on the tool-set when an upgrade is done: during every upgrade, the Web-based IDE is also upgraded. If you are using the Eclipse-based toolset, you must manually upgrade it from the HCI Eclipse update site. Eclipse notifies you if you are connecting to a higher version of the runtime.

 

So, the nodes are upgraded without cutting off existing message connections. You can send messages while the upgrade is going on. There is no interruption of message processing services and no observable downtime for end-users.

However, we still have data maintenance windows for HANA Cloud Integration, for two reasons: (1) the underlying HANA Cloud Platform is also upgraded, and (2) during an upgrade, you cannot deploy new integration artifacts to the tenant. Existing integration artifacts continue processing messages.

 

All tenant owners of HANA Cloud Integration are notified when an upgrade is done. Information on the maintenance windows depends on the location of the data centre you have chosen and is available in the contracts you sign when purchasing HANA Cloud Integration.

 

So, that is how we release new versions of HANA Cloud Integration to all customers and partners every month!

 

Best Regards,

Sujit


Architectural Overview of SAP HANA Cloud Integration


Introduction

 

SAP HANA Cloud Integration allows you to quickly and seamlessly connect your cloud applications to other SAP and non-SAP enterprise software - without extensive coding. This integration as a service solution from SAP can help you harmonise business processes and data in a secure and reliable environment.

 

SAP HANA Cloud Integration offers many capabilities, including:

  1. Bi-directional integration to connect your cloud and enterprise applications
  2. A unified view of all business data, eliminating manual data entry
  3. Centralized monitoring and management of integrations
  4. Faster implementation using pre-packaged integration content
  5. Lower TCO with an affordable, pay-as-you-go subscription model and minimal up-front investment


Architectural Overview of HCI-PI

 

SAP HANA Cloud Integration is a cloud-based solution. Tenants are provisioned for you on request. For all matters of operation it functions as a black box; however, it is interesting to see how HCI is powered under the hood. Developers build integrations in virtualized environments, focus on enterprise integration patterns and implementation challenges, and leave the maintenance effort to the platform. The platform has many advantages that might be unfamiliar to integration developers on SAP Process Orchestration.

 

The following sections highlight the key differentiators of HANA Cloud Integration.

 

Blog 1: Landscape components of HCI-PI

Blog 2: Failover and Scalability of HCI-PI

Blog 3: Rolling Software Updates of HCI-PI

 

Read through them and let us know if you need more information to be included.

 

Best Regards,

Sujit

Displaying Flat File Content in PI Monitoring


I recently had the opportunity to try out a long-awaited feature of the File adapter - displaying FCC content in the PI message monitor. This arrived with the enhancements in PI/PO 7.31 SP13 / 7.4 SP8. There are not many details on the online SAP Help portal, as the page that contains information about this has a dead link.

 

Hence the purpose of this short blog - to highlight the availability of this feature and its usage. Although it is just a simple configuration, it is a value-added feature during both development and support phases.

 

 

Configuration

It is a pretty simple configuration which just requires the addition of a new parameter in Advanced Mode.

Parameter Name | Parameter Value
messageLog | true

 

config.png

 

 

Testing

 

Below is an example of the receiver channel FCC configuration for a simple CSV conversion with a header line.

fcc.png

 

The target message payload that will go through FCC.

payload.png

 

Message log displaying the payload prior to FCC.

am.png

 

With the new parameter in place, there is a new log version "FCC Version" in the message log, which shows the converted content that will be placed in the target location.

fcclog.png
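To make the before/after in the message log concrete, here is a rough sketch in Python of what the FCC does for a simple CSV conversion with a header line. The order structure and field names are invented for illustration; they are not taken from the screenshots above.

```python
# Hypothetical stand-in for the File adapter's content conversion:
# flatten an XML payload into CSV with a header line.
import csv
import io
import xml.etree.ElementTree as ET

payload = """<Orders>
  <Order><ID>1001</ID><Product>Monitor</Product><Qty>2</Qty></Order>
  <Order><ID>1002</ID><Product>Keyboard</Product><Qty>5</Qty></Order>
</Orders>"""

fields = ["ID", "Product", "Qty"]   # column order of the flat file
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(fields)             # the header line
for order in ET.fromstring(payload):
    writer.writerow(order.findtext(f) for f in fields)
print(out.getvalue())
```

The regular payload entry in the log corresponds to the XML before conversion; the "FCC Version" entry corresponds to the flat output.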

 

 

Further info

The feature shown above is for the receiver channel. An enhancement for the sender channel was released with the latest 7.31 SP14 / 7.4 SP9, but the system I'm working on has not been upgraded yet, so I haven't had the chance to try that out.

Flat File Content for Sender File Adapter (New) - What's New in SAP NetWeaver 7.4 (Release Notes) - SAP Library

 

For displaying flat file content on other adapters, I use MessageLoggerBean (before or after FCC on MessageTransformBean) in the module chain. More details on that standard module in the blog below.

Message Staging and Logging Options in Advanced Adapter Engine of PI 7.3x

HANA Cloud Integration for Application-Development Partners


The Cloud is big, and growing (check out IDC research). Cloud opportunities are up for grabs by everyone: by SAP, by SAP customers, and by SAP's ecosystem...

 

SAP customers are already simplifying their cloud journey by using the different flavours of SAP HANA Cloud Integration - Application, Standard, and Professional Editions - which I blogged about recently. But that's not all there is to HCI…

Now, SAP HCI is also available for the SAP partners who want to innovate in the cloud and grow both their customers' business and their own.

 

As we wrote here earlier, we are releasing SAP HANA Cloud Integration for application development partners who are looking to commercialize innovative cloud solutions in order to capture cloud opportunities. SAP HCI is becoming part of the SAP PartnerEdge for Application Development program. Partners can build and monetize pre-packaged integration content, such as integration scenarios and connectivity adapters, to jumpstart projects and simplify the integration experience for their customers.

 

Looking for examples? Applexus is a partner who adopted HCI early and shared some experiences here; go have a look if you are curious. If you would like to try HCI out first, check out Udo Paltzer's blog on getting an HCI trial.

 

Stay tuned for more…

Understanding Authentication & Testing Connectivity in SAP HANA Cloud Integration (PI)


Hello Colleagues!

 

SAP HANA Cloud Integration enables you to connect cloud-based applications to on-premise applications or other cloud-based applications. This blog series explores the different options to connect systems to HANA Cloud Integration. In the first two blogs, we learn about the types of authentication supported in HANA Cloud Integration; in the later two blogs, we learn about the steps for testing connectivity.

 

Connectivity tests are the first steps you carry out in an integration project. In the blogs covering the connectivity tests, I have taken the example of an SAP ERP system connecting to HANA Cloud Integration (mainly because many customers want to connect their on-premise systems to the cloud). However, the technical know-how and procedure remain valid for any system, so that should not stop you from reading the blogs.

 

Also, this series of blogs is written to help the integration team with the first steps of connecting their on-premise or cloud systems to SAP HANA Cloud Integration.

 

Blog 1: Authenticating to HANA Cloud Integration

Blog 2: Authenticating from HANA Cloud Integration

 

Blog 3: Connectivity tests to HANA Cloud Integration

Blog 4: Connectivity tests from HANA Cloud Integration

 

 

Hope the information provided helps you in the integration projects!

 

Best Regards,

Sujit

Authenticating to HANA Cloud Integration (PI)


Hello Colleagues!

 

In this blog, we shall see how you can authenticate applications communicating to SAP HANA Cloud Integration. This blog is part of the series on Understanding Authentication & Testing Connectivity in HANA Cloud Integration. You can access all the blogs here.

 

The message-sending application can use the following authentication types to communicate with HANA Cloud Integration: basic authentication and certificate-based authentication.

 

The type of authentication is chosen per integration flow. You configure the option in the sender channel of an integration flow. See the diagram below:

 

Sender_config_basic_auth.JPG

 

Basic Authentication

 

To communicate to HANA Cloud Integration using basic authentication, you have to meet two requirements:

 

  1. An SCN-based user
  2. HANA Cloud Integration role assigned to the user (role name: ESBMessaging.Send).

 

HANA Cloud Integration authenticates based on the SCN credentials. The identity of the back-end system is checked by SAP by evaluating the credentials against the user stored in the SCN database.

 

Note: Every customer is provisioned two tenants - a test tenant and a productive tenant. It is highly recommended that you restrict the use of basic authentication to your test tenant only.
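As an illustration of what a basic-authentication request looks like on the wire, here is a small Python sketch that builds the Authorization header a sending application would present. The endpoint URL and credentials are placeholders, not a real tenant.

```python
# Construct (but do not send) a request with HTTP Basic Authentication.
import base64
import urllib.request

user, password = "p1234567", "secret"   # placeholder SCN user with ESBMessaging.Send
token = base64.b64encode(f"{user}:{password}".encode()).decode()

req = urllib.request.Request(
    "https://mytenant.hana.ondemand.com/cxf/demo",  # hypothetical endpoint URL
    data=b"<ping/>",
    method="POST",
)
req.add_header("Authorization", f"Basic {token}")
print(req.get_header("Authorization"))
# Actually sending it would be: urllib.request.urlopen(req)
```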

 


Certificate-based Authentication

 

Let us take an example of a simplified landscape to understand how the certificate-based authentication works:

Simplified_Connectivity_Diagram.JPG

The ERP system acts as the client, and the BigIP load balancer acts as the server: it authenticates itself against the ERP system (as a trusted server) when the connection is set up, and the authentication is based on certificates. The identity of the customer system is checked by HANA Cloud Integration evaluating the customer's client certificate chain. This means you have to get the ERP certificates signed by a certificate authority (CA) recognized by SAP.

The list of certifying authorities currently recognized by HANA Cloud Integration is provided in the documentation. (Documentation link: https://cloudintegration.hana.ondemand.com/PI/help -> Connecting a Customer System to SAP HCI -> Concepts of Secure Communication -> HTTPS-Based Communication -> Load Balancer Root Certificates Supported by SAP)

 

An integration flow must authenticate the user making the request. As a prerequisite for this authentication process, the client root certificate has to be made available to SAP before the connection is set up. You have to import the certificate into the integration flow's sender component -

Sender_certificate_configuration.JPG

 

Conclusion

 

When you want to authenticate to HANA Cloud Integration, you can do so using basic authentication or certificate-based authentication. The authentication of the customer system happens at the BigIP server. After a system is authenticated, the authorization of the message happens at the integration flow.

 

Best Regards,

Sujit

Authenticating from HANA Cloud Integration


Hello Colleagues!

 

When SAP HANA Cloud Integration sends messages to systems, HANA Cloud Integration must be authenticated by the receiving systems. In this blog, we shall see the types of authentication and how to do it. This blog is a part of the series on Understanding Authentication & Testing Connectivity in HANA Cloud Integration. You can access all the blogs here.

 

You can authenticate from HANA Cloud Integration to other systems using certificate-based authentication, basic authentication, or OAuth 2.0. HANA Cloud Integration has a uniform way of configuring authentication - through the use of credential artifacts.

 

(A small note: The complete OAuth 2.0 specification is not yet supported. We currently support only the client credentials grant type.)
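For context, the client credentials grant boils down to a single form-encoded token request. The sketch below shows its shape; the token URL and client values are placeholders, and this is not HCI's internal code.

```python
# Shape of an OAuth 2.0 client-credentials token request.
import urllib.parse
import urllib.request

token_url = "https://auth.example.com/oauth2/token"   # placeholder URL
form = urllib.parse.urlencode({
    "grant_type": "client_credentials",   # the only grant type supported here
    "client_id": "my-client",             # placeholder client
    "client_secret": "my-secret",
})
req = urllib.request.Request(
    token_url,
    data=form.encode(),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
print(form)
# urllib.request.urlopen(req) would return a JSON body containing access_token
```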

 

How to configure Authentication Credentials?

 

In the Eclipse tool-set, navigate to the Deployed Artifacts tab of the tenant. Click on the Deploy button and select the type of authentication from the wizard. The procedure is illustrated below:

 

Credential_Deployment.png

 

For example, let's say you want to send messages to an SAP Cloud for Customer application and want to authenticate using the credentials of the Cloud for Customer application. In the Deploy Artifacts wizard, you select the User Credentials artifact and specify a username and password. This credential artifact is then referenced in the receiver channel of the integration flow. A sample configuration is provided below:


sample_configuration.png

 

Conclusion

 

When you want to authenticate from HANA Cloud Integration, always maintain the authentication information in credential artifacts. An inherent advantage of this mechanism is that you maintain the credentials of a receiving system in only one artifact; you can reference the same artifact in multiple integration flows. Further, if you have to change the authentication details, you do it only once and need not reconfigure the integration flows.

 

Best Regards,

Sujit

Connectivity tests to HANA Cloud Integration (PI)


Hello Colleagues!

 

In this blog, we shall see the steps to test the connectivity when communicating to SAP HANA Cloud Integration. This blog is part of the series on Understanding Authentication & Testing Connectivity in HANA Cloud Integration. You can access all the blogs here.

 

Let us assume the following scenario:

 

Connectivity_Simplified_Diagram.JPG

You have an SAP ERP system and want to test whether HANA Cloud Integration is reachable from SAP ERP. Before we start on the SAP ERP system, let us test from a Web browser.

 

Testing from a Web Browser

 

I propose testing from a browser first because it is the best way to understand what to expect when connecting to HANA Cloud Integration. These are the steps:

 

Prerequisite: You have an SCN user that has the role to communicate with HANA Cloud Integration.

 

Step 1: Create a simple integration flow (SOAP-to-SOAP), deploy it, and obtain its endpoint URL


You need not provide any WSDL for the sender SOAP endpoint. Ensure that you have selected Basic Authentication in the sender channel.

dummy_iflow.png

Note the endpoint that has been created for the integration flow. We need this in our next step.


Step 2: Open a Web browser and enter the endpoint URL

 

When prompted for authentication, provide the SCN username and password that have been authorized against this tenant. The role to access via basic authentication must also be granted to this user.

browser_entry.JPG.png

Important point to note: only HTTPS-based communication is possible with HANA Cloud Integration. This means that when I send a request from the Web browser, HANA Cloud Integration presents itself with its certificates. It is important for the client (the Web browser) to recognize these certificates. Therefore, the certificate store of the Web browser must contain the certificate chain of HANA Cloud Integration.

 

Let's take an example to understand this better. Say I am using the Google Chrome Web browser to connect to HANA Cloud Integration. When you enter the URL in the browser, you can navigate to and check the certificate chain of the HANA Cloud Integration instance. The Web browser must contain the certificate chain of HANA Cloud Integration; otherwise, it cannot establish a trusted connection. See the screenshots below:

 

google_chrome_example.png
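The trust check the browser performs can be sketched with Python's standard library: a TLS client context verifies the server certificate against its own trust store, so the HCI chain has to be in that store. The CA file name below is hypothetical.

```python
# A TLS client verifies the server's certificate chain against its trust store.
import ssl

ctx = ssl.create_default_context()          # verification is on by default
# If the CA that signed HCI's certificate is not in the default store,
# it would be added explicitly, e.g.:
# ctx.load_verify_locations("hci_root_ca.cer")   # hypothetical file name
print(ctx.verify_mode == ssl.CERT_REQUIRED)      # the server cert must verify
```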

 

 

Step 3: Check the Message Monitoring Log

 

If the connectivity is fine, a message will be sent to the integration flow, and it will be visible in the message monitoring log.

Dummy_Channel_Message_Monitoring.JPG

 

Testing from an SAP ERP system

 

In principle, testing from an SAP ERP system is similar to testing from a browser.

 

It is easy to test via an HTTP destination created using transaction SM59.

 

  1. Create an HTTP destination in SM59.
  2. Enter the endpoint URL of the integration flow.
  3. Since ERP sends the data to HCI, the ERP system acts as a client. So the ERP system must recognize the certificate chain of HANA Cloud Integration (import it, if the CA is not already included).
  4. In the SM59 destination, you can provide the credentials to connect to HANA Cloud Integration. Try with basic authentication first, and then with certificates.
  5. Remember to configure the sender components in the integration flows accordingly. Export the ERP client certificate in transaction STRUST: select the SSL Client (Standard) certificate, export it, and save the file locally as .CER. Import this into the sender component of HANA Cloud Integration.

 

A few screenshots of the procedure described are shown below:

erp_certificate_config.png

 

Conclusion


We have taken a simplified setup to explain the connectivity tests. Our experience with customer implementations shows that most landscapes look like the one below. Nevertheless, the concepts explained remain the same. In such a landscape, you must configure the Web dispatcher to speak to HANA Cloud Integration. Further, keep a lookout for network filters and firewalls; they could block the calls to the integration instance.

typical_landscape.JPG

Best Regards,

Sujit


Connectivity tests from HANA Cloud Integration (PI)


Hello Colleagues!

 

In this blog, we shall see the steps to test the connectivity when communicating from SAP HANA Cloud Integration. This blog is part of the series on Understanding Authentication & Testing Connectivity in HANA Cloud Integration. You can access all the blogs here.

 

On the HANA Cloud Integration side, it is quite simple to configure the settings. You have to configure the credential artifact (refer to the blog here). For certificate-based communication, you have to upload the receiving system's public certificate into the HANA Cloud Integration keystore. One way to send messages from HANA Cloud Integration (without a sender component) is to configure a timer and a content modifier step in the integration flow. You can create payload messages by specifying them manually in the body of the content modifier. The timer can be configured to run once or at regular intervals.

 

Let's say you have an SAP ERP system; then create an HTTP destination that can receive the messages. You can check the incoming logs at the SAP ERP system. The same procedure applies to any receiving system.

 

Note: We are currently developing a feature that allows you to ping destinations from HANA Cloud Integration. Once that is available, I shall update this blog.

 

sample_ERP_System.JPG

 

Some recommendations:

 

  1. Ensure that the firewall in the customer landscape can accept messages originating from HANA Cloud Integration. The IP address ranges for the different data centres of SAP HANA Cloud Integration are maintained in the documentation. (Documentation link: https://cloudintegration.hana.ondemand.com/PI/help -> Operating and Monitoring SAP HCI -> Understanding the Basic Concepts -> Virtual System Landscapes)
  2. For outbound HTTP/HTTPS connections, always use port 443.

 

Conclusion

 

Connectivity tests from HANA Cloud Integration are easier to execute. Check the firewall settings on the on-premise systems. You can test using basic authentication first and then move to certificate-based authentication.

 

Best Regards,

Sujit

Blog 5: Content Enricher Pattern in Integration Flows


Hello Integration Community!

 

In this blog, I shall explain the content-enricher pattern of SAP HANA Cloud Integration (HCI-PI) and how you can use it in your integration project.

 

What is the Content Enricher Pattern?

 

From the definition of Enterprise Integration Patterns, a content enricher accesses an external data source in order to augment a message with missing information.

 

Let's take an example from the HR domain. From the SuccessFactors (SFSF) suite, we can obtain information on the employee object entity. I get the job location code from the results. But I want to send the complete job location, not just the location code, and the job location is stored in a different entity. So, I enrich my data with the job location using a content enricher step.

 

Likewise, one system may provide us with an order ID, but the receiving system actually requires the customer ID associated with that order.

Content enricher is a very common integration pattern.

 

How to Use the Content Enricher?

 

In HANA Cloud Integration, content enricher is available as a Service Task.

ServiceCall.JPG

SwitchtoContentEnricher.JPG

 

We shall take the following integration flow as an example:

Integration_flow_pattern.JPG

The first call to SFSF will return the compound employee data. This is how the data looks:

 

 

 

CompoundEmployee.JPG

 

We are interested in enriching the job information, so let us take an expanded look at it. In the expanded view, take a closer look at the location field: that is the field we want to enrich with the exact address details.

 

jobinfo.png

So, to achieve our purpose we shall use the Content Enricher step with the following configuration.

enrich_property.JPG

The lookup message depends on the address location entity - the entity you are querying. The final, enriched output looks like the one below.

FO_Location.JPG

 

Note: The Content Enricher also has another option - Combine. That applies a very simple logic of combining the two elements: the employee information and job information query results are combined into one entity. For the same objects we saw above, the result looks like this:

Combine.JPG
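As a toy illustration of the two options, the sketch below enriches an employee payload with a looked-up address (Enrich) and simply places the two payloads under one root (Combine). The element names are invented, not the actual SFSF entities.

```python
# Enrich vs. Combine on two small XML payloads.
import xml.etree.ElementTree as ET

original = ET.fromstring(
    "<Employee><Name>Ada</Name><LocationCode>L1</LocationCode></Employee>")
lookup = ET.fromstring(
    "<Location><Code>L1</Code><Address>10 Main St</Address></Location>")

# Enrich: merge the looked-up field into the original message
enriched = ET.fromstring(ET.tostring(original))
enriched.append(lookup.find("Address"))
print(ET.tostring(enriched, encoding="unicode"))

# Combine: keep both payloads side by side under a new root
combined = ET.Element("Combined")
combined.extend([original, lookup])
print(ET.tostring(combined, encoding="unicode"))
```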

 

Conclusion

 

In the current version, the Content Enricher step works well with the SuccessFactors system. For SOAP-based scenarios, the entire payload is sent to the look-up system. If you do not want the entire data to go, you would have to employ the data store step and the content modifier step.

 

Best Regards,

Sujit

Blog 6: Splitting messages in Integration Flows


Hello Integrators!

 

In this blog, we shall look into the splitter pattern provided in SAP HANA Cloud Integration. The use of a splitter is quite clear from its name - to break out a composite message into a series of individual messages, each containing data related to one item.

 

How is Splitter supported in Integration Flows?

 

In HANA Cloud Integration, a splitter is available as a Message Routing flowstep. When you configure a splitter flowstep, it can appear confusing at first - lots of options!

 

Splitter_Options.JPG

 

Nevertheless, let us try to understand the Iterating and General splitters first.

A very cool way to understand the workings of the splitter is by configuring an integration flow like the one below (idea courtesy: HCI development team).

Integration_Flow.JPG

In the Content Modifier step, I insert the following payload in the Body. It is a group of orders:

Orders_full.JPG

Each order has details in the following format.

Order_expanded.JPG

 

Now, let us see how the splitter works. The first splitter, after the content modifier, is configured as a General splitter with the following properties.

SP1.png

 

output1.png

 

The exclusive gateway step routes the message according to the ordernumber.

gateway.JPG

 

Let us go to the second splitter, which has been configured as an Iterating splitter with expression type Token.

config_2.png

This is how the output of the second splitter looks. Notice that the root tags of the incoming payload are not retained and, further, the items have been grouped as per the number provided.

 

output2.png

Note: The Token splitter is mainly used for non-XML payloads. I have illustrated this example just so that you can understand how it works. Here, "<item>" (including the angle brackets) forms the "token".


And if we configure the Iterating splitter with the expression type XPath and the properties below, we get the same output as the previous one. The only difference is that the output message starts with the XML declaration <?xml version="1.0" encoding="UTF-8"?>.

SP_3.JPG

 

The fourth splitter is again a General splitter. I recommend that you configure it so that you can see the difference from the Iterating splitter on the same message. The difference lies in the enveloping signature of the XML. Compare this output with the previous one.

GeneralSplitterOutput.JPG
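The contrast between the two splitter types can be reduced to a toy sketch: both emit one message per item, but only the General splitter keeps the enveloping root tag. The tag names are invented, and grouping and the other configuration options are ignored here.

```python
# General vs. Iterating split over the same composite payload.
import xml.etree.ElementTree as ET

payload = ("<orders><order><id>10</id></order>"
           "<order><id>20</id></order></orders>")
items = [ET.tostring(o, encoding="unicode") for o in ET.fromstring(payload)]

# Iterating splitter: the enveloping <orders> tag is not retained
iterating = items

# General splitter: each split message keeps the envelope
general = [f"<orders>{item}</orders>" for item in items]

print(iterating[0])   # <order><id>10</id></order>
print(general[0])     # <orders><order><id>10</id></order></orders>
```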

 

Conclusion

 

You use the splitter flow step in HANA Cloud Integration to break down a composite message. For XML payloads, you would use the General and Iterating (XPath) splitters. The Iterating (Token) type is used for non-XML payloads. The other variants - IDoc and PKCS - are used in specific scenarios. I shall cover those in another blog.

 

Best Regards,

Sujit

Blog 7: Message Events in Integration Flows


Hello Integrators!

 

In this blog, we shall cover an easy topic: message events. The topic itself is easy to understand. However, I want to provide more contextual information on where you would use the different message events.

 

What are Message events?

 

Message events point to changes in the state of message processing of an integration flow. To make it clearer: when a message arrives at an integration flow, the pipeline starts processing the message; when the integration pipeline has to send it to the receiver, it will have completed processing the message. Such changes in state are explicitly modelled as events.

 

Which Message events are supported, and where to use them?

 

You can configure all the message events from the Integration palette.

palette.JPG

 

The Start Message and the End Message events are used when HCI receives a message from a Sender and when HCI sends a message to a Receiver. By default, when you create an integration flow, the start and end message events are made available.

start_end.png

The other message events - Error Start and Error End - can be used only within an exception sub-process. This has been explained in detail in the following blog: Blog 4: Modelling Exceptions in Integration Flows (HCI-PI).

ErrorStart_End.JPG

Timer Start is especially useful in scenarios where you have to go and pull data from systems or have to trigger Web services at specified times/intervals. In terms of polling (pulling data from systems), currently you can use it only with a SuccessFactors adapter, as it is a pull-based adapter.

 

The usual pattern of using a timer is a timer followed by a content modifier. That is because a timer does not create a payload in the pipeline; with a content modifier, you can create the request payload that is sent to the system.

 

timer_pattern1.JPG

 

 

And finally, the Terminate End: this is useful if you want to stop further processing of a message. For example, you can use it in a content router where you have processing defined for specific values in the payload. If the payload does not match those values, you terminate the processing.

Note: Message monitoring will show a Successful Message and not a Failed Message, because the processing terminated successfully. Terminated messages do not mean failed messages.

 

TerminatePattern.JPG

Use all these events in your integration projects and let us know your feedback!

 

Best Regards,

Sujit

Blog 8: Message Aggregation Pattern in Integration Flows


Hello Integrators!

 

In this blog, we shall look into the Aggregation Pattern supported in SAP HANA Cloud Integration. Aggregation is the first stateful pattern supported by HANA Cloud Integration. Stateful means it keeps track of the "state" of the messages and clears the status only when a condition is met. This shall become clearer as you read through the blog.

 

What is the Aggregation Pattern?

 

An aggregator flow step collects and stores individual messages until a complete set of related messages has been received. On receipt of all related messages, the aggregator publishes a single message (aggregated on some principle). We shall look into the details using an example.

 

Let's say we want to aggregate incoming messages of the same format together into a new message. The messages have the following format:

XMLFormat.JPG

 

Quite specifically, let's say we receive three separate messages into the Integration Flow.

 

Aggregator_Iflow.JPG

InputMessages.png

 

Now, we want to aggregate the messages according to a predefined condition, so let us define the conditions. Each aggregator is configured with a correlation expression and an aggregation strategy. They are represented in the properties of the Aggregator flow step.


Aggregation_properties.png

 

The correlation expression defines which messages should be aggregated together. For every incoming message, depending on the correlation expression, a correlation ID is created. All messages with the same ID are aggregated together.

 

correlation_explanation.png

The aggregation strategy provides the logic for combining all messages into a single aggregated message. We want to combine all the messages strictly in sequence; in this case, the sequence number must be provided in the incoming message. In our example, it is provided in the field /Mobile/MCode. Further, we denote the last message by the field value /Mobile/LastItem = true. As soon as this message is received by the aggregator, the messages are grouped and sent as a single message.

 

Further, I shall give a timeout period of one minute. That means the maximum waiting time between two messages is one minute. If that time period has elapsed, the aggregator combines all the messages received thus far and sends a single message. Check our settings below.

 

aggregation_explanation.png

So, finally, when the three messages arrive, they are aggregated into one message and sent to the receiver. This is how our final output looks:

FinalMessage.JPG

Two points to note:

 

1. In the message monitoring tool, the logs appear in pairs: one for receiving the message into the aggregator and the other for confirming it into the data store.

MessageMonitoring.JPG

 

2. After the aggregation step, you may want to know whether the aggregation completed successfully based on the expression or whether a timeout happened. This information can be obtained from a header parameter in the integration flow: ${header.CamelAggregatedCompletedBy}.

The values would be timeout or predicate. You can use this in an exclusive gateway step.
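The aggregator's bookkeeping can be sketched as follows: collect messages per correlation ID, order them by sequence number, and release the group when the completion predicate fires. The field names mirror the /Mobile/MCode and /Mobile/LastItem example above; the one-minute timeout path is only noted in a comment.

```python
# Group messages by correlation key and release on the completion predicate.
msgs = [
    {"corr": "A", "MCode": 2, "LastItem": False, "body": "m2"},
    {"corr": "A", "MCode": 1, "LastItem": False, "body": "m1"},
    {"corr": "A", "MCode": 3, "LastItem": True,  "body": "m3"},
]

store = {}        # the stateful part: messages waiting per correlation ID
completed = {}
for m in msgs:
    store.setdefault(m["corr"], []).append(m)
    if m["LastItem"]:   # completion predicate (/Mobile/LastItem = true)
        group = sorted(store.pop(m["corr"]), key=lambda x: x["MCode"])
        completed[m["corr"]] = [x["body"] for x in group]
        # CamelAggregatedCompletedBy would be "predicate" here; if the
        # one-minute wait expired instead, it would be "timeout"

print(completed["A"])   # ['m1', 'm2', 'm3']
```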

 

Further enhancements to the aggregation step are planned, so keep a lookout for them!

 

Best Regards,

Sujit

Blog 9: Scripting in Integration Flows


Hello Integrators!

 

In this blog, we shall explore a slightly advanced feature of SAP HANA Cloud Integration - scripting.

 

What is Scripting and When to use scripting?

 

By scripting, I am referring to the use of a scripting language (like Groovy or JavaScript) in your integration projects. HANA Cloud Integration provides a rich set of functionality to transform your data - mapping, content modifier, converters, and so on. However, at times you want to perform a more complex task that lies outside the native functionality provided. For example, say you receive the following incoming payload:

Sample_In.JPG

The payload comes from an HR system that tracks work based on activities recorded each day. You want to integrate with a time-recording system that expects the total number of work hours for each person. The expected payload is in this format:

Samplt_out.JPG

So, we have to parse the entire incoming payload, calculate the number of hours for each person, and map the result to the final payload format. Achieving this with the native functionality of HANA Cloud Integration would be tedious; it is easily accomplished with a custom transformation function. That is where scripting comes in.
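The core of that transformation can be sketched as follows. The exact payload structure is only shown in the screenshots above, so the record shape here is an assumption; in HCI this logic would sit inside a script step, but it is shown as plain Java so the example is self-contained:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sums the recorded hours per person, the core of the custom transformation
// described above. Input records are {personName, hours} pairs, assumed to be
// parsed from the incoming XML payload beforehand.
public class WorkHoursAggregator {
    public static Map<String, Double> totalHours(List<String[]> activities) {
        Map<String, Double> totals = new LinkedHashMap<>(); // keep first-seen order
        for (String[] activity : activities) {
            totals.merge(activity[0], Double.parseDouble(activity[1]), Double::sum);
        }
        return totals;
    }

    public static void main(String[] args) {
        List<String[]> activities = Arrays.asList(
                new String[]{"Anna", "4"},
                new String[]{"Ben", "8"},
                new String[]{"Anna", "3.5"});
        System.out.println(totalHours(activities)); // {Anna=7.5, Ben=8.0}
    }
}
```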

 

Which Scripting Languages are supported?

 

HANA Cloud Integration supports two scripting languages: Groovy and JavaScript.


  • Groovy is an object-oriented scripting language with dynamic, easy-to-use capabilities. It absorbs most of its syntax from Java. Learn more about Groovy on its site here.
  • JavaScript is the dynamic programming language of the Web. Most HTML pages are scripted using JavaScript.

 

Both scripting languages are easy to learn and come with a host of resources that you can use in your integration project.


How to use Scripting?

 

In your integration project, create the following folders for scripting:

 

script_folder.JPG

 

The src.main.resources.script folder should contain all the scripts. In an integration flow, the script step is available as part of the Message Transformer step. In the context menu of the script step, you can create new scripts or assign existing ones from the folder.

Script_Palette.JPG

 

When you create a new script, you get the default code editor in Eclipse with the following view. Functions to access the message are provided by default.

groovy_explained.png

Using External Libraries in Scripting

 

One more cool feature you can use in the script step is external libraries. There are many open-source libraries available to you.

 

Let's say you want to read an XML file in Groovy using an XPath library, and you have settled on Jaxen. Here is how you proceed in an integration flow project.


Step 1: Import the Jaxen libraries in the src.main.resources.lib folder

jaxen.JPG

 

Step 2: Include the import definitions in the script file of your integration flow

AddingJARs.JPG

Step 3: Modify the processData function with your function logic

codesnippet.JPG
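The kind of XPath lookup such a script performs can be sketched as follows. To keep the example self-contained it uses the JDK's built-in javax.xml.xpath API as a stand-in for Jaxen, and the element names are invented for the example:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

// Reads an XML payload and evaluates an XPath expression against it.
public class XPathSketch {
    public static String firstMCode(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        XPath xpath = XPathFactory.newInstance().newXPath();
        // select the MCode of the first Mobile element
        return xpath.evaluate("/Mobiles/Mobile[1]/MCode", doc);
    }

    public static void main(String[] args) throws Exception {
        String xml = "<Mobiles><Mobile><MCode>1</MCode></Mobile>"
                   + "<Mobile><MCode>2</MCode></Mobile></Mobiles>";
        System.out.println(firstMCode(xml)); // prints 1
    }
}
```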

 

That is it! The procedure to use scripts in integration flows is simple.

 

Conclusion

 

You can utilise a scripting language in your HANA Cloud Integration project for complex transformations. As the language of choice, you have Groovy or JavaScript, and in addition you can augment either with external libraries. We strongly recommend that you first look at the native functionality supported by the toolset before writing scripts. By their inherent nature, scripts can make integration projects harder to maintain.

 

Best Regards,

Sujit

Blog 10: Importing PI content in Integration Flows


Hello Integrators!

 

In this blog, we shall look at a connecting link between on-premise Process Orchestration and HANA Cloud Integration. Although it is a goal to run on-premise integrations on HANA Cloud Integration, we started with the first step: mappings. In our discussions with customers and partners, we realised that mappings are among the most important business assets that consultants want to reuse.

 

We support downloading the artifacts from Process Orchestration version 7.10 and above.

 

How to import artifacts from Process Orchestration ?

 

Downloading Message Mappings

 

Step 1: Configure the settings to the Enterprise Service Repository (ESR) from the Eclipse tool-set

connection.png

 

Step 2: From the integration project, click on Import PI content. You can import Message mappings, Operation mappings, and WSDLs.

Let's take the example of downloading a message mapping that we want to reuse in our HANA Cloud Integration integration flow project. You select it from the wizard, and the mapping is then downloaded in a HANA Cloud Integration-native format.

mapdownload.png

Check the file format: it has been downloaded as an .mmap file. You can then edit the downloaded mapping using the mapping editors.

 

  • WSDLs/XSDs corresponding to Message Types and Fault Message Types are placed under src.main.resources.mapping folder
  • Other interfaces get placed under src.main.resources.wsdl

 

Downloading Operation Mappings


Operation mappings behave slightly differently. They can currently be downloaded only as a .jar file: the functionality of the operation mapping can be used, but the mapping cannot be modified in HANA Cloud Integration.

 

The imported operation mapping has the following features:

 

  1. If operation mapping contains message mapping, then the message mapping is downloaded as a jar under src.main.resources.mapping package
  2. If the operation mapping contains XSLTs, then the files are downloaded as .xsl under src.main.resources.mapping
  3. Imported source or target WSDLs are not supported in integration flows

 

Note: There are certain restrictions on downloading the mappings, which we remove over time. The limitations are documented in the guide, so do check the documentation from time to time.

 

Conclusion

 

Process Orchestration customers who are adopting HANA Cloud Integration could do well to reuse the mapping components. Before the start of a project, check if the interfaces and mappings have already been defined in Process Orchestration. Message mappings are more easily imported into HANA Cloud Integration. You can also modify them and adapt them in the project.

 

Best Regards,

Sujit


SAP HANA Cloud Integration (HCI) - A complementary offering to SAP Process Orchestration (PO)


There is a lot of ambiguity about the usage of SAP Process Orchestration versus SAP HANA Cloud Integration. We have even heard that HCI is a replacement for PI/PO, which is not true. SAP HANA Cloud Integration (HCI) has been publicly available since 2013, and from time to time we get questions on the difference between SAP HANA Cloud Integration and SAP Process Integration/Orchestration. Let us look at both offerings:

 

SAP Process Orchestration provides an on-premise middleware platform to design, model, execute, and monitor business processes, ensuring connectivity to many different business and technical systems and applications (SAP and non-SAP). Along with options for creating human- and/or system-centric processes, it offers the following products under one umbrella:

 

  • SAP Process Integration (including B2B Add-On and Connectivity Add-On)
  • SAP Business Process Management
  • SAP Business Rules Management

 

SAP HANA Cloud Integration (HCI) is SAP's strategic, secure, cloud-based integration platform with a strong focus on process and data integration across domains (SAP Financial Services Network (FSN), SuccessFactors, Cloud for Customers, Ariba, Business ByDesign, Travel OnDemand, etc.). It provides Process Integration (HCI-PI) and Data Integration (HCI-DS) capabilities.

 

HCI enables you to connect your cloud applications quickly and seamlessly to other SAP and non-SAP applications (on-cloud or on-premise) without extensive coding. This integration-as-a-service solution from SAP can help you integrate your business processes and data in a secure and reliable environment. Another important point to understand is that HCI is not SAP Process Integration on the cloud: it is a new product that runs on SAP HANA Cloud Platform. HCI is designated as iPaaS, an Integration Platform as a Service.


Picture2.jpg

 

 

 

As both products are from SAP, SAP provides a way to reuse your existing investment in SAP Process Orchestration: message mappings can be readily reused in SAP HANA Cloud Integration. The two solutions are complementary, and several factors decide which one to use:


1.    Cloud-to-cloud integration: There are many use cases that require integration from one cloud solution to another, e.g. SuccessFactors to SHL, PeopleAnswers, or Workforce. These cloud solutions can be SAP or non-SAP. The right choice for this use case is SAP HANA Cloud Integration, as there is no on-premise involvement; more importantly, the customer has invested in the cloud to get everything, including integration, as a hosted/subscription model and to avoid capital expenditure and building up a technology skill set.


2.    On-premise integration: There are use cases where a customer wants to integrate mainly on-premise systems and applications, which can be SAP or non-SAP. As all systems and applications to be connected reside in the customer's on-premise landscape, the right technology is on-premise middleware, i.e. SAP Process Orchestration.


3.     Cloud to on-premise and vice versa: We also call this the hybrid integration use case, and it causes most of the confusion. Let us look at the different factors that need to be considered to decide which solution fits best:


    1. If a customer already has PI/PO and wants to leverage it, SAP has introduced the required technical adapters, e.g. the SuccessFactors adapter, the Ariba cXML adapter, etc., to connect to the respective cloud applications. So SAP Process Orchestration can continue as the single middleware in the customer's landscape, covering both integration needs.
    2. If the customer does not have PI/PO, the immediate right choice is SAP HANA Cloud Integration, with minimal up-front investment. HCI is a multi-tenant solution specially built for cloud integration use cases.
    3. There are also use cases where the customer is on PI but HCI can still be considered for cloud integration. A few examples:
      • The customer is moving into the cloud and requires fast integration of new cloud applications for business innovation, and the pre-packaged content is available only on HCI; then HCI is the right choice.
      • Different lines of business want integration bundled with the cloud application to achieve faster results and to keep the different integration use cases separated. Customers can then run one middleware for cloud integration and one for on-premise integration use cases.
      • PI is on an older release that does not have all the technical adapters available with the latest release, and the customer does not want to invest in an upgrade.


Though HCI is already capable of integrating cloud applications via custom integration, in the cloud era much of the focus is on simplicity, quick configuration, and deployment, so pre-packaged content is of utmost importance. As of today (Jan. 2015), a lot of pre-packaged integration content is already available on SAP HANA Cloud Integration:

 

  • SAP Cloud for Customer (C4C) with SAP ERP
  • SAP Cloud for Customer (C4C) with SAP CRM
  • SAP SuccessFactors LMS Curricula with SAP HCM Qualification
  • SAP SuccessFactors HCM Suite Competency with SAP HCM Qualification
  • SAP SuccessFactors HCM Suite Talent Management with SAP HCM
  • SAP SuccessFactors Recruitment Management (RCM) with 3rd party assessment vendor PeopleAnswers
  • SAP SuccessFactors Recruitment Management (RCM) with 3rd party assessment vendor SHL
  • SAP SuccessFactors Employee Central (EC) with 3rd party benefits vendor Benefitfocus
  • eDocument (Electronic Invoicing) solution with government solution in Peru (SUNAT) and Chile (SII)


A lot of other pre-packaged content is under development for Ariba, SAP SuccessFactors (e.g. Employee Central Payroll, Cost Centre, and Org integration), and other cloud applications, and is planned for release during the next release cycles.


Also, as SAP HANA Cloud Integration has monthly release cycles, it pays to check continuously and keep yourself up to date on newly released features and upcoming pre-packaged content. I am sure we have enough information available for PO and PI on this SCN space. For HCI, you can refer to the following for quick information:


Some tips for High Availability setup on PI dual-stack system


1. Naming schema changes for some XI/PI components in SLD

 

For High Availability (HA) setup, the SLD naming schema for Integration Server, Domain and Adapter Engine are a little bit different.

 

  • The SLD names of the Integration Server and the Domain are built as is.<cisysnr>.<cihost> and domain.<cisysnr>.<cihost>, respectively. The Central Instance host name is taken from the ABAP profile parameters.

 

  • The SLD name of the Adapter Engine contains the host name of the J2EE database. It is built as af.<system_id>.<j2ee_dbhost>. Because several Adapter Engines can be part of the same domain (e.g. the central Adapter Engine plus non-central Adapter Engines), the Adapter Engine name cannot contain the host name of the central instance, but needs a unique name for each Adapter Engine. The value for j2ee_dbhost is taken from the Java profile parameters.

 

  • You can check the profile parameters in the folder /usr/sap/<SID>/SYS/profile.
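The naming schema above can be summarized in a couple of formatting helpers (purely illustrative; these are not an SAP API, and the sample values in main are made up):

```java
// Mirrors the SLD naming schema described above for HA setups.
public class SldNames {
    public static String integrationServer(String ciSysNr, String ciHost) {
        return "is." + ciSysNr + "." + ciHost;      // host/number from ABAP profile parameters
    }

    public static String domain(String ciSysNr, String ciHost) {
        return "domain." + ciSysNr + "." + ciHost;  // host/number from ABAP profile parameters
    }

    public static String adapterEngine(String systemId, String j2eeDbHost) {
        return "af." + systemId + "." + j2eeDbHost; // j2ee_dbhost from Java profile parameters
    }

    public static void main(String[] args) {
        System.out.println(integrationServer("00", "pihost")); // is.00.pihost
        System.out.println(adapterEngine("PI1", "dbhost"));    // af.PI1.dbhost
    }
}
```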

 

2. Parameter "com.sap.aii.connect.integrationserver.sld.name"

 

The value of parameter "com.sap.aii.connect.integrationserver.sld.name" mentioned in HA notes is as below:

 

        <is sld name>           Default: is.<cisysnr>.<cihost>

 

  • You can configure it under http://<host>:<hport>/dir/start/index.jsp -> Administration -> Exchange Profile -> Parameters -> Connections. You might have some doubts whether this parameter will work if you switch from CI to DI.

 

  • In fact, the value of this parameter is just an identifier that has no particular meaning for the Java stack; it is only used to avoid many SLD reads. So you should put the host name and instance number of the CI there. The term CI is somewhat outdated because, when the ASCS is stripped from the CI, it becomes a plain DI. But one of the DIs is denoted central, and this is defined during the system installation. So the CI "knows" it is the central one, and this exact instance is used to define the name of the Integration Server.

 

  • So, if installed according to the HA notes, there should be no problems if the CI is offline: the DI knows which CI it belongs to and uses it to recreate SLD content if necessary.

 

Related Notes:

SAP Note 951910 - NW2004s High Availability Usage Type PI

SAP Note 1052984 - Process Integration >=7.1 - High Availability

 

Related Docs:

Steps for running SAP Netweaver PI on high availability (HA)

New Functionality for Table Switch Procedure - Table Switch Control


In this blog I'd like to give a general introduction to the new functionality for the table switch procedure: Table Switch Control.

 

Firstly, please allow me to explain some details about the switch deletion procedure:

All the messages in your system can be divided into three parts:
Part 1 - messages that are outside the retention period and in a final status, so they can be deleted.
Part 2 - messages that are outside the retention period but not yet in a final status, so they cannot be deleted.
Part 3 - messages that are still in the retention period, so they cannot be deleted.

 

When the switch procedure starts, there are three steps to deal with these messages:
Step 1 - Messages in Part 1 are deleted logically. The table entries are not physically deleted from the database tables; instead, the flag "Deleted" is set in the master entry.
Step 2 - Messages in Parts 2 and 3, which do not have the delete flag set, are copied to the new tables.
Step 3 - The original tables in the database are dropped and then immediately recreated. With that, the Part 1 messages that have the delete flag set are physically deleted.
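The three-part classification and the copy/delete split can be modelled in a few lines (a simplified illustration, not the actual report logic):

```java
import java.util.ArrayList;
import java.util.List;

// Models the classification used by the switch procedure: a message may be
// deleted only if it is outside the retention period AND in a final status
// (Part 1); everything else (Parts 2 and 3) must be copied to the new tables.
public class SwitchClassifier {
    public static class Msg {
        final String id;
        final boolean inRetention;
        final boolean finalStatus;

        public Msg(String id, boolean inRetention, boolean finalStatus) {
            this.id = id;
            this.inRetention = inRetention;
            this.finalStatus = finalStatus;
        }
    }

    // Part 1: logically deleted during the switch, physically dropped with the old tables.
    public static List<String> toDelete(List<Msg> msgs) {
        List<String> out = new ArrayList<>();
        for (Msg m : msgs)
            if (!m.inRetention && m.finalStatus) out.add(m.id);
        return out;
    }

    // Parts 2 and 3: copied to the new master table before the old tables are dropped.
    public static List<String> toCopy(List<Msg> msgs) {
        List<String> out = new ArrayList<>();
        for (Msg m : msgs)
            if (m.inRetention || !m.finalStatus) out.add(m.id);
        return out;
    }
}
```

Every message falls into exactly one of the two lists, which is why disk space demand peaks while both the old and the new tables hold the copied messages.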


Sometimes you might face a critical situation while the table switch procedure is in use. This is most likely because of the following reasons:

• During the execution of the switch, additional table space is required to copy the valid messages to the new master table. Only after the copy job finishes can the original tables be dropped, so disk space consumption increases considerably after you activate the switch procedure.

• The copy process can take a long time.

• While a switch is pending, no other reorganization job is allowed to start; in particular, neither the deletion job nor the archiving job can run, and therefore the space issue becomes even more critical.

 

Now the Table Switch Control (report RSXMB_TABLE_SWITCH_CONTROL) is available with SAP Note 2106462. With it, you can get out of this awkward position on your own and complete a pending switch.

 

Upon start, the report provides detailed information on the current status of a pending switch. By pressing the push buttons with the recycle-bin icon, you can perform a physical deletion of inactive messages (either logically deleted messages or messages that have already been copied).

11.PNG

Additionally, this report provides the option to revert the direction of the switch. Reverting the direction is a critical operation, and multiple preconditions must be fulfilled. All of these conditions are checked automatically, and changing the direction is enabled if and only if all conditions are fulfilled:

12.PNG

Just by clicking the execution button you can revert the direction of the switch:

13.PNG

Pre-requisites:

The harmonized persistence layer is an indispensable prerequisite for the Table Switch Control report. Harmonization is delivered by SAP Notes 2038403 & 2095113.

 

Related Notes:

SAP Note 872388 - Troubleshooting archiving and deletion in PI
SAP Note 2038403 - Harmonization of PI persistence layer
SAP Note 2095113 - Harmonization of PI persistence layer II
SAP Note 2039256 - PI: How to calculate the remaining messages to be copied during the Switch Procedure
SAP Note 2041299 - PI: How to calculate the time remaining before the Switch procedure completes
SAP Note 2106462 - Control of the Switch Procedure

 

Related Docs:

Troubleshooting for Archive and Delete on Integration Engine

Overview of the Switch Deletion Procedure

Storing password in SAP PI modules



 

Setting a user and password inside a module is slightly different from normal adapter module parameters, as the values cannot be kept in clear text in the module parameters.

 

There are three strategies we can use:

 

1) Hard-code the user ID and password in the module. Not a great approach, but sometimes this is the only feasible option. The advantage, of course, is that there is no risk of locking the user.

 

2) Set the values in the communication channel as secure parameters (displayed as asterisks).

The user can be set as a normal string parameter. For the password, we do not want the value to show up in clear text, hence the password parameter can be named as follows:

  • If the password parameter name starts with pwd, it is displayed as asterisks when entered and displayed; however, the database folder is unencrypted.
  • If the password parameter name starts with cryptedpassword, the database folder is encrypted, which is more secure.

The advantage is that the values can be configured for each system. The drawback is that an incorrectly entered password can lock the user, and finding the communication channel that is locking the user can be time-consuming.

 

3) Set the values in Application Properties. This combines the best of both worlds: we can configure values in each environment, and as we configure them in only one location, the chance of accidentally locking the user due to incorrect values is reduced.


The values can be modified from NWA. The path is:


NWA_1.png


Configuration Management->Infrastructure->Java System Properties




Steps required to add the configuration capability:

 


a)  Add sap.com~tc~je~configuration~impl.jar to the module EJB project.

Path to get the client library: /usr/sap/<SID>/<instance>/j2ee/cluster/bin/services/configuration/lib/private/sap.com~tc~je~configuration~impl.jar

 

b) Create the sap.application.global.properties file under META-INF. It is essentially a .properties file.

 

EAR_1.png

 

 

Sample content to make User modifiable and appear as clear text

 

## Comment for user

#? secure = false; onlinemodifiable = true

#% type = STRING;

User =

 

Sample content to make User modifiable and appear as asterisk when entering in NWA.

## Comment for password

#? secure = true; onlinemodifiable = true

#% type = STRING;

Password =
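A side note on the file format: the "#?" and "#%" directive lines begin with '#', so a plain java.util.Properties load treats them as comments, and only the User / Password keys remain visible. The sketch below (the sample value pi_comm_user is made up) demonstrates this outside the server:

```java
import java.io.StringReader;
import java.util.Properties;

// Demonstrates that the directive lines of sap.application.global.properties
// are plain comments from the point of view of the .properties syntax.
public class PropsFormatDemo {
    public static Properties parse(String content) throws Exception {
        Properties props = new Properties();
        props.load(new StringReader(content));
        return props;
    }

    public static void main(String[] args) throws Exception {
        String file = "## Comment for user\n"
                    + "#? secure = false; onlinemodifiable = true\n"
                    + "#% type = STRING;\n"
                    + "User = pi_comm_user\n"; // sample value, made up for this demo
        Properties props = parse(file);
        System.out.println(props.getProperty("User"));   // pi_comm_user
        System.out.println(props.stringPropertyNames()); // [User]
    }
}
```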

 

c) Update module code to read the property


The sample code to be added to the module looks something like this:

 

// Obtain the JNDI context
InitialContext ctx = new InitialContext();

// access the Application-Configuration-Façade service
ApplicationPropertiesAccess appCfgProps = (ApplicationPropertiesAccess) ctx.lookup("ApplicationConfiguration");

// read the application properties
java.util.Properties appProps = appCfgProps.getApplicationProperties();

if (appProps == null) {
    // perform error handling
} else {
    userID = appProps.getProperty("User");
    password = appProps.getProperty("Password");
}

 

d) Update application deployment descriptor to indicate the library being used. Add this to application-j2ee-engine.xml .

 

<reference reference-type="hard">
  <reference-target provider-name="sap.com" target-type="service">
    tc~je~appconfiguration~api
  </reference-target>
</reference>

Reading Messages from PI System


This blog describes how to retrieve information about messages from a PI system. PI has several interfaces which can be used to retrieve PI messages and other data from the system.

 

To read messages from a PI system, there are different solutions for Java and ABAP. They are not released as stable APIs, and we give no assurance that they will never change. However, the APIs are usually stable within a release (and available since NetWeaver 7.0) and can be used to retrieve the data. Depending on whether the messages are retrieved from the Java (Adapter Engine) or ABAP (Integration Engine) stack, different technologies apply: a web service for the Java stack and ABAP function modules for the ABAP stack.

 

Retrieving Messages from Java

We offer a web service to retrieve messages from Java stack. The web service offers similar functionality to what can be done in the RWB Message Monitoring tool:

  • Select messages which match a filter. You can filter by message header attributes like Time, Sender Component, Receiver Component, etc.
  • Resend or Cancel Messages
  • Retrieve the payload of a Message

 

This can be done with the web service AdapterMessageMonitoringVi which is delivered with a standard PI installation and is available with all releases. In 7.10, please check note 1373289 for availability limitations. The web service can be investigated and tested with the Web Services Navigator tool of the WebAS Java.

 

 

The web service WSDL is available at:
http://<host>:<port>/AdapterMessageMonitoring/basic?wsdl&mode=ws_policy&style=document

The web service offers three different bindings: basic HTTP authentication, SSL over HTTPS, and HTTPS with client-certificate authentication.

The actual web service URL depends on which binding is used and can be one of the following:
http://<host>:<port>/AdapterMessageMonitoring/basic?style=document
https://<host>:<httpsport>/AdapterMessageMonitoring/ssl?style=document
https://<host>:<httpsport>/AdapterMessageMonitoring/clientCert?style=document
In most cases the basic binding can be used if no special security requirements must be met.

 

The web service offers a set of operations/methods for different purposes. Some interesting methods for message monitoring are:

  • getMessageList: Get a list of messages. The input is a search filter similar to the filter in RWB Message Monitoring, and the result contains the messages with their header data (without the message payload) and status.
  • getMessagesByKeys: Similar to getMessageList, but the search can be done only by a list of message keys.
  • getMessageBytesJavaLangStringBoolean: Retrieve the payload of a message (latest version).
  • getMessageBytesJavaLangStringIntBoolean: Retrieve the payload of a specific version of a message.
  • getLogEntries: Read the audit log of a message (see note 1814549 for required SPs and patch levels).

 

 

The web service also contains other methods, e.g. cancelMessage and resendMessage, that can be used for message manipulation but are not required for monitoring purposes. In newer releases like 7.30 and 7.31, the web service AdapterMessageMonitoringVi contains even more methods for advanced monitoring (e.g. for User-Defined Message Search), which are out of scope for this document.

 

Method getMessageList

The method getMessageList can be used to search for messages which match a given filter. This corresponds to the search functionality in RWB Message Monitoring.

 

The method has two input parameters:

  • filter: A structured data type to give the search filter
  • maxMessages: Integer value to limit the number of search results. It should always be provided to protect against high memory consumption and out-of-memory situations.

 

The input structure for parameter filter contains many attributes of PI messages which are used to search for specific messages.

 

Important filter fields are:

  • archive: search in the Message Archive or in the Database. Use false to search in the Database
  • direction: Sender or Receiver direction. Valid values are “INBOUND” or “OUTBOUND”
  • fromTime: The start time for the date/time selection
  • toTime: The end time for the date/time selection; finds messages which were processed between fromTime and toTime.
  • interface: The message interface name and namespace
  • messageIDs: A message ID for searching for a specific message. If a message ID is given in this field, all other filter attributes are ignored. Must be in the format of a 36 character length guid like “5ba9192c-fa6c-11e0-ca61-00001001a6a3”
  • onlyFaultyMessages: If set to true, then only messages which had an error in the processing are in the result. This doesn’t mean that the current message status is “Error”. It can be a “Successful” message which at some point had an error.
  • protocol: The message protocol. Should be “XI”
  • qualityOfService: The message Quality of Service. Valid values are “EO”, “BE” and “EOIO”
  • receiverInterface: Receiver interface name and namespace
  • receiverName: Receiver Component name
  • receiverParty: Receiver Party name
  • senderInterface: Message sender Interface name and namespace
  • senderName: Sender Component name
  • senderParty: Sender Party name
  • status: The message status. Valid values are “success”, “toBeDelivered”, “waiting”, “holding”, “delivering”, “systemError”, “canceled”. Only one status is possible per web service call; it is not possible to send a combination of two or more statuses at a time.

 

Unused filter fields can be left empty so that they are not considered during the search. An example of a valid filter which searches for messages with error status in a certain time interval can look like the following screenshot:

 

 

Only fromTime, toTime and status are provided here. All other filter attributes are left empty and thus ignored.
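In a raw SOAP request, such a filter could look roughly like the following fragment (element names are taken from the filter fields listed above; namespaces and the exact element order of the generated WSDL are omitted, so treat this as a sketch rather than a paste-ready request):

```xml
<getMessageList>
  <filter>
    <archive>false</archive>
    <fromTime>2015-01-01T00:00:00.000</fromTime>
    <toTime>2015-01-02T00:00:00.000</toTime>
    <status>systemError</status>
  </filter>
  <maxMessages>100</maxMessages>
</getMessageList>
```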

 

The result of method getMessageList contains a structure with all search results, with the message header information for each result. An important field in the result structure is messageKey, because it contains the value required as input for the other methods getMessagesByKeys, getMessageBytesJavaLangStringBoolean, and getMessageBytesJavaLangStringIntBoolean.

 

The following picture shows a part of the result:

 

The result contains an array of AdapterFrameworkData with one entry for each message that matched the filter.

 

Method getMessagesByKeys

This method works similarly to getMessageList: it returns a list of messages and their header attributes, but without the message payload. The only difference between the two methods is that getMessagesByKeys takes only a list of message keys as input.

 

These message keys have to be in the following format:

<guid>\<direction>\<node>\<QoS>\<seqNr>\
(Please note the backslash “\” at the end. It is important to add it!)

 

  • guid: the message id
  • direction: the message direction. Can be “INBOUND” or “OUTBOUND” (without quotation marks)
  • node: the server node id
  • QoS: the quality of service. Can be “EO”, “BE” or “EOIO” (without quotation marks)
  • seqNr: the sequence number for EOIO messages

 

An example for a valid message key is like this:

5ba9192c-fa6c-11e0-ca61-00001001a6a3\INBOUND\268543650\EO\0\

 

An invalid message key would be:

5ba9192c-fa6c-11e0-ca61-00001001a6a3\\\EO\\

(all 5 parameters needs to be filled in the message key)


Please note: Only the fields guid and direction really matter here. All other fields can have “random” values, but the values must stay within the same parameter type; e.g. the server node has to be an integer, and QoS has to be one of “EO”, “BE”, or “EOIO”.
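A small helper that mirrors the key format described above (illustrative only, not part of the delivered web service API):

```java
// Builds a message key in the format <guid>\<direction>\<node>\<QoS>\<seqNr>\
public class MessageKey {
    public static String build(String guid, String direction, int node, String qos, int seqNr) {
        // note the trailing backslash: it is part of the key format
        return guid + "\\" + direction + "\\" + node + "\\" + qos + "\\" + seqNr + "\\";
    }

    public static void main(String[] args) {
        System.out.println(build("5ba9192c-fa6c-11e0-ca61-00001001a6a3", "INBOUND", 268543650, "EO", 0));
        // 5ba9192c-fa6c-11e0-ca61-00001001a6a3\INBOUND\268543650\EO\0\
    }
}
```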

Methods getMessageBytesJavaLangStringBoolean and getMessageBytesJavaLangStringIntBoolean

The two methods getMessageBytesJavaLangStringBoolean and getMessageBytesJavaLangStringIntBoolean can be used to retrieve the payload of a message. Both methods work very similarly; the second method only has an additional parameter for the message version (the first always returns the latest message version).

 

 

The other input parameters are

  • archive: search in the Message Archive or in the Database. Use false to search in the Database
  • messageKey: The message key to identify the message. The message key format is described above under method getMessagesByKeys. The message key can be obtained from the result of method getMessageList from result field messageKey
  • version: The number of the message version to read. Use -1 to read the newest version, or a number greater than or equal to 0 to read older versions. E.g. if there are 4 message versions, valid values are 0, 1, 2, and -1; note that 3 is not a valid value in this example, since -1 has to be used to read the newest message version.

 

The result of this method contains a byte array with the serialized payload of the PI message. It can be deserialized and then processed; for example, depending on the payload type, it can be parsed as an XML document to extract several elements. In code this can look similar to the following (MySAXHandler stands for the application's own handler class):
// call the web service to get the data
byte[] msgBytes = adapterMessageMonitoringWS
    .getMessageBytesJavaLangStringBoolean(messageKey, false);
// parse the result
SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
parser.parse(new ByteArrayInputStream(msgBytes), new MySAXHandler());

Method getLogEntries

This method can be used to retrieve the audit log entries of a PI message. It has 5 input parameters:

  • messageKey: The message key to identify the message (format as described above)
  • archive: Search in the message archive or in the database; use false to search in the database
  • maxResults: Maximum number of results. The results are sorted by time/date in descending order (newest first)
  • locale: The locale for the log entry texts
  • olderThan: Only return audit log entries that are older than this timestamp. Can be used together with parameter maxResults to browse through a long list of audit log entries without overloading the web service client. If this parameter is not provided, the result starts with the newest log entries.

 

The return value contains a list with the Audit Log entries for the message with the timestamp, severity, text and some other data.

 

 

 

Retrieving Messages from ABAP

There are two important function modules to read the messages from the ABAP stack:

  • SXMB_GET_MESSAGE_LIST: search for messages which match a given filter
  • SXMB_GET_XI_MESSAGE: retrieve the payload of a PI message

Both function modules are remote-enabled and can be called externally, e.g. via JCo from a Java application. Their functionality is similar to the Java web service methods.

 

Function module SXMB_GET_MESSAGE_LIST

The function module can be used to search for messages that match a given filter. This corresponds to the search functionality in RWB Message Monitoring or transaction SXMB_MONI.

 

Input and output parameters of the function module:

 

Input parameters:

  • IM_FILTER: Type SXI_MESSAGE_FILTER; a filter to search for messages that fulfill certain criteria
  • IM_MESSAGE_COUNT: Type INT4; the maximum number of results. It should always be provided to protect against high memory consumption and out-of-memory situations.

 

The data type SXI_MESSAGE_FILTER for the filter has the following important fields:

  • FROM_TIME: TIMESTAMPL; UTC Time Stamp in Long Form (YYYYMMDDhhmmssmmmuuun) The start time for the selection interval.
  • TO_TIME: TIMESTAMPL; UTC Time Stamp in Long Form (YYYYMMDDhhmmssmmmuuun) The end time for the selection interval.
  • OB_PARTY: SXI_PARTY; XI: Communication Party; Sender Party
  • OB_PARTY_TYPE: SXI_PARTY_TYPE; XI Partner: Identification Schema
  • OB_PARTY_AGENCY: SXI_PARTY_AGENCY; XI Partner: Agency
  • OB_SYSTEM: AIT_SNDR; Sending System
  • OB_NS: RM_OIFNS; Outbound/Sender Interface Namespace
  • OB_NAME: RM_OIFNAME; Outbound/Sender Interface Name
  • IB_PARTY: SXI_PARTY; XI: Communication Party; Receiver Party
  • IB_PARTY_TYPE: SXI_PARTY_TYPE; XI Partner: Identification Schema
  • IB_PARTY_AGENCY: SXI_PARTY_AGENCY; XI Partner: Agency
  • IB_SYSTEM: AIT_RCVR; Receiving System
  • IB_NS: RM_IIFNS; Inbound/Receiver Interface Namespace
  • IB_NAME: RM_IIFNAME; Inbound/Receiver Interface Name
  • MESSAGE_IDS: SXMSCGUID_T; Character Format Message GUID Table. A message ID for searching for a specific message. If a message ID is given in this field, all other filter attributes are ignored. It must be a 32-character GUID such as “5ba9192cfa6c11e0ca6100001001a6a3”
  • QUALITY_OF_SERVICE: SXMSQOS; Integration Engine: Quality of Service. Valid values are “BE”, “EO” and “EOIO”
  • CLIENT: SYMANDT; Client ID of Current User
  • STATUS_TYPE: SXI_STAT_TYPE; XI: Type of a Status. A number identifying a status group. Status groups pool individual message status values of the Integration Engine into groups with semantically equal meaning. Valid numbers are:

01           Successful
03           Scheduled
05           Application Error
06           System Error
10           Branched
12           Waiting
16           Retry
19           Manually Modified
21           Canceled with Errors
30           Waiting for Confirmation
50           Log Version

  • STATUS: SXMSPMSTAT; Integration Engine: Message Status. Instead of a status group (parameter STATUS_TYPE), a single status can be given for the message search. For a list of all available statuses, see the contents of table SXMSMSTAT via transaction SE16.
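Since the MESSAGE_IDS field expects the GUID without hyphens, a message ID that is available in the common hyphenated UUID notation must be normalized first. A small helper (plain Java; the class and method names are illustrative):

```java
public class MessageGuid {

    // Converts a hyphenated UUID such as
    // "5ba9192c-fa6c-11e0-ca61-00001001a6a3" into the 32-character
    // lowercase format expected by the filter.
    static String toPiGuid(String uuid) {
        String guid = uuid.replace("-", "").toLowerCase();
        if (!guid.matches("[0-9a-f]{32}")) {
            throw new IllegalArgumentException("Not a valid GUID: " + uuid);
        }
        return guid;
    }

    public static void main(String[] args) {
        System.out.println(toPiGuid("5BA9192C-FA6C-11E0-CA61-00001001A6A3"));
    }
}
```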

 

Output parameter:

The function module returns the search result in the output parameter EX_MESSAGE_DATA_LIST. This structure contains a list of entries of type SXI_MESSAGE_DATA with the header information of all PI messages found.

 

Function module SXMB_GET_XI_MESSAGE

This function module is called with a message ID (taken from the result of SXMB_GET_MESSAGE_LIST) and returns the payload of the message. The result can be deserialized and processed further (e.g. parsed as an XML document).

 

Input and output parameters of the function module:

 

 

Input parameter:

  • IM_MSGKEY: SXMSMKEY; XI: message ID, as returned in the result of SXMB_GET_MESSAGE_LIST
  • IM_ARCHIVE: SXMSFLAG; 1 = read from archive, 0 = read from database. Use 0 to read from the database.
  • IM_VERSION: SXMSLSQNBR; message version number. Leave empty to get the latest version.

 

Output parameter:

  • EX_MSG_BYTES: XSTRING; the bytes of the message
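As with the Java web service, the returned bytes can then be parsed, e.g. as an XML document. A minimal, self-contained sketch (the sample payload is made up; the class and method names are illustrative):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class PayloadParsing {

    // Parses the message bytes (e.g. the EX_MSG_BYTES result) as XML
    // and returns the name of the document's root element.
    static String rootElementName(byte[] msgBytes) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(msgBytes));
        return doc.getDocumentElement().getTagName();
    }

    public static void main(String[] args) throws Exception {
        byte[] payload = "<Order><Id>42</Id></Order>".getBytes(StandardCharsets.UTF_8);
        System.out.println(rootElementName(payload));
    }
}
```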