Channel: Process Integration (PI) & SOA Middleware

HCI - Payload Logging using Groovy Scripts - Part 2 - Use Externalized Parameters


Background

As discussed in my previous blog, HCI - Payload Logging using Groovy Scripts, logging of payloads should be used prudently. Typically you would want logging to be disabled, with an option to enable it only on a need basis.

 

If you are from a PI background, you will be well aware of parameters like TRACE_LEVEL, LOGGING_SYNC and the Integrated Configuration Logging Options that enable you to turn logging and staging on or off when required. So how can this be done in HCI using the Groovy scripts we described in the previous blogs?

 

Externalized Parameters in HCI

Some background first - HCI enables you to define parameters of your Integration Flow that can be controlled externally. Typical examples are SOAP/HTTP URLs, FTP server hosts, etc., which usually vary per environment. There is already content on Externalized Parameters for HCI on SCN, and I would recommend reading that or the corresponding documentation from SAP - Externalizing Parameters of Integration Flow.

 

To define logging dynamically we will use a custom Externalized Parameter, so you can enable and disable logging without having to change your Integration Flow or the Groovy script.

 

Integration Flow Changes

We will continue to use the sample Integration Flow described in my previous blog with an additional Content Modifier Step and some changes in the Groovy script as described below.

 

19.png

 

Step: Content Modifier
Delta Changes:
  • The new Content Modifier step is used to read the External Parameter into a local property.
  • Name = logger (message property to be used in the Groovy script)
  • Type = External Parameter
  • Data Type = java.lang.String
  • Value = externalParamLogger (name of the External Parameter whose value will be set externally)

10.png

Step: Groovy Script Change
Delta Changes:
  • Read the message properties.
  • Read the value of the property logger as set in the previous Content Modifier step.
  • If logger is set to 1, log the payload; otherwise, do not perform any logging.
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;

def Message processData(Message message) {
    def body = message.getBody(java.lang.String) as String;
    def messageLog = messageLogFactory.getMessageLog(message);
    def propertyMap = message.getProperties()
    //Read logger from the Message Properties
    String logger = propertyMap.get("logger");
    //Constant-first comparison avoids a NullPointerException if the property is missing
    if("1".equals(logger)){
        if(messageLog != null){
            messageLog.setStringProperty("Logging#1", "Printing Payload As Attachment")
            messageLog.addAttachmentAsString("ResponsePayload:", body, "text/plain");
        }
    }
    return message;
}
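Note that the script compares the property as a string: since the externalized parameter is declared as java.lang.String in the Content Modifier, the values 1 and 0 act as a simple on/off switch, and no script or flow change is needed to toggle logging.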

 

Setting of Externalized Parameters

Once the Content Modifier step is defined, the Externalized Parameters tab should contain the parameter with the name as defined in our Content Modifier step, in this case externalParamLogger. Set the default value of the parameter to 0 (zero). Save and deploy your Integration Flow.


11.png

Test your flow with External Parameter Value set to 0

Trigger your Integration Flow with the External Parameter value set to 0. The Integration Flow executes, and the Groovy script does not log the payload as the value is set to 0. As seen in the image below, no attachment is available.

 

12.png

 

 

Test your flow with External Parameter Value set to 1

To update your external parameters, navigate to Window --> Show View --> Other. In the Show View dialog box, navigate to Other --> Configurations.

13.png

 

In the Configurations pane that opens, navigate to your project and double-click it. In my case the name of my project is Prj_PayloadAsAttachment.
14.png

Go to the Externalized Parameters tab, update the value of externalParamLogger to 1 and click on Save Parameters.

15.png

 

Right-click on your project in Configurations --> Deploy Integration Content. Your Integration Flow is now updated with logging enabled!

16.png

Re-trigger the interface and you should see the logs with the attachment, as the logger is now set to the value 1.

17.png


Final Note

As seen in this blog, Externalized Parameters give you options to supply parameters to your Integration Flow dynamically. This, along with some smart Groovy scripting, enables you to dynamically control payload logging.



Advantco MQTT adapter for SAP PI/PO


Introduction

MQTT (MQ Telemetry Transport) is a publish/subscribe, extremely simple and lightweight messaging protocol, designed for constrained devices and low-bandwidth, high-latency or unreliable networks. Using MQTT brokers to buffer the data from hundreds of users or devices allows PI/PO to process these data in a more controlled manner.
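To make the publish/subscribe model concrete, here is a minimal Groovy sketch using the open-source Eclipse Paho Java client against a local Mosquitto broker. The broker URL, client ID, topic and payload are illustrative and are not part of the Advantco product.

import org.eclipse.paho.client.mqttv3.MqttClient
import org.eclipse.paho.client.mqttv3.MqttMessage

// Connect to a local broker (illustrative URL); any MQTT v3.1/v3.1.1 broker works.
def client = new MqttClient('tcp://localhost:1883', 'demo-client')
client.connect()

// Subscribe: the closure is invoked for every message arriving on the topic.
client.subscribe('sensors/temperature') { String topic, MqttMessage m ->
    println "Received on ${topic}: ${new String(m.payload, 'UTF-8')}"
}

// Publish a small JSON payload with QoS 1 (at-least-once delivery).
def msg = new MqttMessage('{"deviceId":"42","temp":21.5}'.bytes)
msg.qos = 1
client.publish('sensors/temperature', msg)

sleep 1000          // give the broker time to deliver before disconnecting
client.disconnect()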

 

The Advantco MQTT adapter for SAP PI/PO enables you to publish data to any MQTT broker and to subscribe to any MQTT topics. As an MQTT client, the Advantco MQTT adapter initiates the connection to the MQTT broker, which eases the network configuration. TLS support guarantees that your data is well protected when sending data to or receiving data from the MQTT brokers.

 

pic1.jpg

A native JSON conversion engine is a standard feature of the MQTT adapter.

pic2.jpg

 

Key Features

• Full integration with SAP PI Adapter Framework, Alert Framework and Monitoring.

• Supports publish and subscribe operations to MQTT brokers.

• Supports MQTT v3.1 and MQTT v3.1.1 protocol versions.

• Supports MQTT and MQTTS.

• Supports TLS 1.1 and TLS 1.2 for SAP PI/PO 7.1, 7.3 and 7.4.

• Supports both synchronous and asynchronous modes of communication.

• Supports various content conversions including XML / JSON / Flat (CSV) data.

• Available for SAP PI 7.1, 7.1 EHP1, 7.3, 7.3 EHP 1, 7.4.

 

 

Key Benefits:

• Minimal effort to install, configure and use the MQTT adapter.

• Maintained and supported by a team with profound knowledge of SAP PI/PO adapter development.

 

MQTT brokers that have been tested with the Advantco MQTT adapter:

• Mosquitto (http://mosquitto.org/)

• RabbitMQ with MQTT plugin (http://www.rabbitmq.com/)

• HiveMQ (http://www.hivemq.com/)

• Apache ActiveMQ (http://activemq.apache.org/index.html)


For more information about the Advantco MQTT adapter, please visit: https://www.advantco.com/products


Sources: MQTT

Twitter [Receiver] Adapter for HCI


In this blog I am going to focus on how to access real-time social media information using the Twitter [Receiver] adapter, which was released on 2015-10-24 as part of SAP HANA Cloud Integration [HCI].


Introduction


Twitter is a social networking and micro-blogging service that enables its users to send and read messages known as tweets. Tweets are text-based posts of up to 140 characters displayed on the author’s profile page and delivered to the author’s subscribers who are known as followers.


The Twitter adapter provides an easy way of integrating a flow of tweets. Once configured correctly with Twitter credentials, all that is necessary is to implement the Twitter listener, instantiate the adapter and handle the objects received.


We can use the Twitter receiver adapter to extract information from the Twitter platform (which is the receiver platform) based on certain criteria such as keywords or user data. We can also perform a Twitter search based on a schedule and publish the search results within messages.


To communicate with Twitter only as the currently authenticated user, you can obtain the OAuthAccessToken and OAuthAccessTokenSecret directly from this page on Twitter. The OAuthAccessToken and OAuthAccessTokenSecret are listed under the OAuth Settings in the Your Access Token section.

 

For authenticated operations, Twitter uses OAuth - an authentication protocol that allows users to approve an application to act on their behalf without sharing their password. The connection works as follows: the tenant logs on to Twitter based on an OAuth authentication mechanism and searches for information based on the criteria configured in the adapter at design time. OAuth allows the tenant to access someone else's resources (those of a specific Twitter user) on the tenant's behalf.

 

In order to use OAuth authentication/authorization with Twitter you must create a new Application on the Twitter Developers site. Follow the directions below to create a new application and obtain consumer keys and an access token.

 

  • Click on the Register an app link and fill out all required fields on the form provided; set Application Type to Client and, depending on the nature of your application, select Default Access Type as Read & Write or Read-only, and submit the form. If everything is successful you'll be presented with the Consumer Key and Consumer Secret. Copy both values to a safe place.
  • On the same page you should see a My Access Token button in the sidebar (right). Click on it and you'll be presented with two more values: Access Token and Access Token Secret. Copy these values to a safe place as well.


Twitter receiver adapter features

 

Adapter Type:         Twitter

Message Protocol: HTTPS


Operations


The Twitter adapter provides Send Tweet, Search and Send Direct Message operations. To access a Twitter account you can choose among the following operations and settings.


  • Send Tweet - Allows you to send content to a specific user timeline.
  • Search - Allows you to search Twitter content by specifying keywords under the filter settings.
  • Send Direct Message - Allows you to send messages to Twitter (write access, direct message).
  • Page Size - Specifies the maximum number of tweets per page.
  • Number of Pages - Number of pages to consume.


Authentication methods


The Twitter adapter supports basic authentication and the OAuth mechanism based on shared-secret technology. Make a note of the parameters below while doing the Twitter configuration; the same values need to be specified in the receiver Twitter adapter.

 

We must specify both the OAuthClientId [Consumer Key] and OAuthClientSecret [Consumer Secret] to connect to an OAuth server.

 

  • Consumer Key

       OAuth requires you to register your application. As part of the registration, you will receive a client Id, sometimes also called a consumer key, and a client secret. The OAuthClientId / consumer key is an alias by which the consumer (tenant) that requests Twitter resources is identified.

  • Consumer Secret

       OAuth requires you to register your application. As part of the registration you will receive a client Id and a client secret, sometimes also called a consumer secret. It is an alias by which the shared secret is identified (the secret used to define the token of the consumer (tenant)).

  • Access token

       The OAuthAccessToken property is used to connect using OAuth. The OAuthAccessToken is retrieved from the OAuth server as part of the authentication process. It has a server-dependent timeout and can be reused between requests.


       The access token is used in place of your username and password. It also protects your credentials by keeping them on the server. In order to make authorized calls to the Twitter API, your application must first obtain an OAuth access token on behalf of a Twitter user.

  • Access Token secret

       The OAuthAccessTokenSecret property is used to connect and authenticate using OAuth. The OAuthAccessTokenSecret is retrieved from the OAuth server as part of the authentication process. It is used with the OAuthAccessToken and can be used for multiple requests until it times out.
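To illustrate how these four values fit together, here is a minimal Groovy sketch using the open-source Twitter4J library directly (the HCI adapter's internals are not exposed; the placeholder credentials and query are mine):

import twitter4j.Query
import twitter4j.TwitterFactory
import twitter4j.conf.ConfigurationBuilder

// The four values below are exactly the ones collected during app registration.
def conf = new ConfigurationBuilder()
        .setOAuthConsumerKey('<consumer key>')
        .setOAuthConsumerSecret('<consumer secret>')
        .setOAuthAccessToken('<access token>')
        .setOAuthAccessTokenSecret('<access token secret>')
        .build()
def twitter = new TwitterFactory(conf).instance

// Equivalent of the adapter's Search operation: keyword query, page size 10.
def result = twitter.search(new Query('sap hci').count(10))
result.tweets.each { println "${it.user.screenName}: ${it.text}" }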

 

Twitter Adapter Documentation & Download Links

 

https://proddps.hana.ondemand.com/dps/d/preview/93810d568bee49c6b3d7b5065a30b0ff/2015.10/en-US/frameset.html?6dc953d97322434e9f2a5acdc216844d.html

 

http://scn.sap.com/community/pi-and-soa-middleware/blog/2016/03/02/integrating-hci-with-twitter--part-1

 

  http://scn.sap.com/community/pi-and-soa-middleware/blog/2016/03/02/integrating-hci-with-twitter--part-2

Advantco Salesforce adapter for SAP HCI


Overview

 

This blog provides a quick overview of the new Advantco Salesforce (SFDC) adapter for SAP HCI. The adapter enables Salesforce integration with SAP and non-SAP systems through SAP HCI. Both inbound scenarios to Salesforce and outbound scenarios from Salesforce are supported by the adapter. The adapter also comes with an Eclipse Workbench to ease development, with features to generate schemas and to test SOQL queries.

 

In follow-up blogs, we will provide more details on how to use the adapter and will provide concrete scenarios.

 

Supports both inbound scenarios to Salesforce and outbound scenarios from Salesforce.

 

pic1. Outbound.jpg

pic3. Inbound.jpg

1. Supports both SOAP and BULK APIs.

pic2. Receiver channel.jpg

 

2. Simple channel configuration to query data from Salesforce.

pic4. Query.jpg

 

The SFDC adapter for SAP HCI comes with a standard Eclipse Workbench.

 

1. Schema generation for single as well as multiple objects.

pic5. Schema.jpg

 

2. Query Builder with schema generation of the selected fields.

pic6.Query.jpg

 

For more information about the Advantco Salesforce adapter for SAP HCI, please visit: https://www.advantco.com/products

Interfacing with SuccessFactors using SFSF adapter in SAP PI dual stack environment - Part I


Preface: SAP SuccessFactors Employee Central is a complete, cloud technology-based core HR system of record that combines HR transactions, processes and data with social collaboration features and mobile functionality. In today's always-on, mobile, social and data-driven world, it is imperative to enable integration of real-time applications such as SuccessFactors with other on-premise / cloud applications using a robust middleware layer.

With SuccessFactors having established itself as one of the leading cloud-based HR solutions in the market, there has been a surge in SuccessFactors Employee Central integrations with on-premise and cloud-based HR / non-HR applications.


This blog is the first in a series on understanding SuccessFactors Employee Central integration with such applications in detail.


Objective: The objective of this guide is to illustrate the steps required to create and test a basic interface using the SFSF adapter in an SAP PI dual-stack environment.

 

 

Prerequisites:

  1. Connectivity Add-On is installed in SAP PI.
  2. Eclipse IDE for Java EE developers (With HCI Tools) is installed in the machine.
  3. SAP PI instance is up and running.
  4. User has necessary authorization to build, test and monitor an interface in SAP PI.
  5. SuccessFactors instance is up and running.
  6. SuccessFactors instance connectivity details are available.
  7. Entity “PerPersonal” contains some data.
  8. Third party system instance such as SAP HR is up and running.
  9. SAP PI connectivity with third party system instance such as SAP HR is setup.

 

 

Assumptions:

  1. User is familiar with the necessary development tools such as SAP PI and Eclipse IDE.
  2. Interface is built in a dual stack SAP PI 7.11 environment.

 

 

SuccessFactors basic concepts:

 

 

SuccessFactors Employee Central: SuccessFactors Employee Central (SFEC) is a next-generation cloud-based core HR system.

 

Objects: The SuccessFactors application broadly maintains two types of information at the object level.

 

  • Foundation Objects: Contain organization, pay and job structure details.

        Some examples are Company / Legal Entity, Business Unit, Location Group, etc.

 

 

  • Personal and Employment Objects: Contain personal and employment details for employees.

        Some examples are User, PerAddress, PerDirectDeposit, PerEmail, etc.

 

 

Entity: Entities are table structures used to store the data within the SF application.

 

SFAPI: The SFAPI is a library of SF entities, exposed to the outside world for integration purposes.

 

Operations: Each entity supports certain operations through the API. Examples of such operations are:

  • Insert
  • Update
  • Upsert
  • Query
  • QueryMore
  • Delete

 

 

 

Interface Scenario: The interface fetches basic user information such as first name, last name, gender, marital status, etc. from the SFEC entity PerPersonal and puts it in a file (XML format) on the application server of the SAP HR system.

 

The interface uses SAP PI as middleware.

 

Screen Shot 2016-04-07 at 7.20.58 PM.png

 

 

Implementation Details:


1. Test the SF connectivity.

 

a. Get the below connectivity details from SF team.

 

  • SuccessFactors URL or endpoint
  • User name
  • Password
  • Company ID

 

b. Open the SF URL in the browser and download the WSDL

 


Screen Shot 2016-04-07 at 7.22.59 PM.png

 

c. Import the WSDL into an external tool such as SOAP UI and provide the connectivity details such as user name, password and the company ID in the Login operation of the request payload.

 

d. Trigger the request. A Session Id is received in the response.

 

e. This confirms successful connectivity with the SF application.

Screen Shot 2016-04-11 at 6.18.26 PM.png

 

f. Connectivity can further be confirmed by running other operations such as Query.

 

g. To run the Query operation, specify the Session Id in the header parameter, received through the login operation test.

 

h. Now specify the query using SFQL (SuccessFactors Query Language) and also specify the name and value parameters as shown in the screenshot.
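For illustration, the SFQL for such a test could look like the following one-liner; the field names are assumptions based on the standard User entity, and a name/value parameter (for example maxRows = 5, also an assumption) limits the result size:

SELECT username, firstName, lastName FROM User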

 

i. Trigger the request.

 

j. Response is received with the first 5 users from the User entity.

 

Screen Shot 2016-04-11 at 6.41.35 PM.png

 

2. Generate the XSD for SF entity PerPersonal.

 

The SFSF adapter generates an XSD file every time you perform an operation. You can use this XSD file for mapping purposes.

 

a. Launch the Eclipse IDE. During the launch, ensure that you specify the correct workspace. If multiple Eclipse tools are installed on the machine, it is advisable to specify a separate workspace for every release of Eclipse. This is to avoid any confusion with other releases or with other tools such as NWDS.

 

Screen Shot 2016-04-11 at 6.47.49 PM.png

 

b. Install the Operation Modeler. To install it, open the Eclipse IDE and click on Install New Software.

 

Screen Shot 2016-04-11 at 6.51.22 PM.png

 

 

 

 

 

 

c. Click on Add and specify the repository name and location. Please specify the relevant Eclipse release at the end of the repository location. In this scenario it was luna.

 

Screen Shot 2016-04-07 at 7.26.06 PM.png

 

d. Tick SAP HANA Cloud Integration Tools and click on Next.

 

 

Screen Shot 2016-04-07 at 7.28.10 PM.png

 

e. Review the items to be installed and click on next.

 

Screen Shot 2016-04-07 at 7.30.09 PM.png

 

f. Accept the license agreement and click on finish.

 

Screen Shot 2016-04-11 at 6.23.01 PM.png

Screen Shot 2016-04-07 at 7.34.12 PM.png

 

g. Click on Yes once you get the prompt for Eclipse restart.

 

    Please refer to the SAP Help in the reference section for installing the operation modeler.

 

Screen Shot 2016-04-07 at 7.37.13 PM.png

 

h. Create a new project. To create a new project, open the Integration Designer perspective in the Eclipse IDE. To do so, click on Open Other Perspective under the Window menu, and then click on Integration Designer.

 

Screen Shot 2016-04-11 at 3.25.40 PM.png

 

i. Click on New and then Integration project under File menu option.

 

Screen Shot 2016-04-11 at 3.28.22 PM.png

 

j. Specify the project name and keep the project type as integration flow.

 

Screen Shot 2016-04-11 at 3.30.19 PM.png

 

k. Specify the integration flow name and keep the pattern as Point to Point. Click on finish.

 

Screen Shot 2016-04-11 at 3.32.13 PM.png

Screen Shot 2016-04-11 at 3.33.32 PM.png

 

l. Depending upon whether the SFSF adapter is to be used on the sender or receiver side, click on the corresponding communication channel. In this scenario, the SFSF adapter will be used as the sender adapter, hence click on the sender communication channel.

Screen Shot 2016-04-11 at 6.35.26 PM.png

 

 

 

 

m. Click on Browse in the Adapter Type input help and select the SuccessFactors adapter.

 

Screen Shot 2016-04-11 at 3.37.24 PM.png

n. Click on Adapter Specific tab and then on Model operation.

 

Screen Shot 2016-04-11 at 3.39.26 PM.png

 

o. Specify the SF URL, Company name, user id and password and then click on next.

 

Screen Shot 2016-04-11 at 3.41.17 PM.png

 

p. Select the relevant entity and then click on Next. In this interface the "PerPersonal" entity is used, which contains user details such as first name, last name, gender, marital status, etc.

 

Screen Shot 2016-04-11 at 3.42.55 PM.png

 

q. Select the operation "Query" and the fields to be used in the interface. The Operation Modeler automatically creates the SFQL query. Please take note of this SFQL query for future reference.

 

Screen Shot 2016-04-11 at 3.44.23 PM.png

 

r. The next two steps can be used to configure the filters and the sorting criteria for the SFQL query. We are not considering these two steps, for the sake of ease of understanding with respect to our scenario. Click on Finish.

 

Screen Shot 2016-04-11 at 3.46.26 PM.png

Screen Shot 2016-04-11 at 3.47.30 PM.png

 

s. The Operation Modeler generates the XSD.

 

Screen Shot 2016-04-11 at 3.49.44 PM.png

 

t. Refer to the XSD file, which is now available at the specified location. Right-click on the XSD name and download it to a location on your machine.

 

Screen Shot 2016-04-11 at 3.51.01 PM.png

3. Configure design time objects in ESR.

 

a. Create a namespace.

 

b. Create an external definition and import the xsd, as created in step 2 above.

 

c. Observe the field names in the external definition after import.

 

Note: Steps such as creation of a new Product, SWC, installation of SWC on the SAP HR business system, creation of a new namespace are not covered in this guide.

 

Screen Shot 2016-04-11 at 3.56.07 PM.png

 

d. Create an Outbound Asynchronous interface and assign the relevant request structure.

 

Screen Shot 2016-04-11 at 3.57.25 PM.png

 

e. Create an Inbound Asynchronous interface and assign the same request structure. For ease of understanding, the structure on the receiver side is kept the same as that on the sender side.

 

f. Save and activate the ESR objects.

 

Screen Shot 2016-04-11 at 3.59.32 PM.png

Interfacing with SuccessFactors using SFSF adapter in SAP PI dual stack environment - Part II


4. Download the SuccessFactors certificates and upload them into the SAP PI NetWeaver keystore.


a. To download the SuccessFactors certificate, open the target URL in a browser such as Google Chrome. First click on the lock button, then on the Connection tab and finally on the Certificate Information hyperlink.

Screen Shot 2016-04-11 at 4.10.10 PM.png

b. There are 3 certificates which need to be downloaded:

(i) SuccessFactors certificate

(ii) Symantec certificate

(iii) Verisign certificate

 

c. Open the SuccessFactors certificate first.

 

Screen Shot 2016-04-11 at 5.15.22 PM.png

 

d. Go to the details tab and click on the Copy to File.

 

Screen Shot 2016-04-11 at 5.16.49 PM.png

 

e. Click on Next and select the format as DER encoded binary X.509, which is compatible with SAP PI.

 

Screen Shot 2016-04-11 at 5.20.00 PM.png

Screen Shot 2016-04-11 at 5.21.13 PM.png

 

f. Specify the directory and the certificate name and finish the download.

 

Screen Shot 2016-04-11 at 5.23.09 PM.png

Screen Shot 2016-04-11 at 5.27.29 PM.png

 

g. Go to the Certification Path tab and double-click the first two lines to access the next two certificates one by one.

 

h. Repeat the steps from d to f for these two certificates.

 

Screen Shot 2016-04-11 at 5.29.38 PM.png

 

i. Upload the 3 certificates in NWA keystore in SAP PI. Keep the entry type as X.509

 

Note: Uploading the certificates in NWA keystore is not covered in this guide.

 

Screen Shot 2016-04-11 at 5.31.42 PM.png

 

5. Configure the runtime objects in ID.

 

a. Create a communication component for SuccessFactors, i.e. BC_Successfactor, and assign the outbound interface created in step 3d.

 

Screen Shot 2016-04-11 at 5.34.44 PM.png

 

b. Create a sender communication channel using SFSF adapter. Specify the message protocol as SOAP.

 

Screen Shot 2016-04-11 at 5.36.28 PM.png

 

c. Select the endpoint URL for SuccessFactors from the list. If not already available, then specify it as others and then mention it in the Alternate Endpoint URL.

 

d. Specify the SFAPI URL suffix as /sfapi/v1/soap
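The resulting full endpoint is the company-specific API host plus this suffix, for example https://api4.successfactors.com/sfapi/v1/soap (hostname illustrative; use the endpoint provided by your SF team).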

 

e. Specify company name, user id and password.

 

Screen Shot 2016-04-11 at 5.38.05 PM.png

 

f. Go to the Processing tab and specify the SFQL query as obtained in step 2q above.

 

g. Specify the poll interval accordingly. This is the interval at which the SF adapter will poll the SFEC entity.

 

Screen Shot 2016-04-11 at 5.40.03 PM.png

 

h. Specify the SSL certificate. Tick the Specify SSL Certificate option and specify the certificate key for the Verisign certificate.

 

i. Create a standard receiver file communication channel on SAP HR system with NFS protocol. Specify the target directory and file name.

 

j. Create a standard ICO with the design and run time objects created in Step 3 and Step 5.

 

k. Save and activate the ID objects along with ICO.

 

Screen Shot 2016-04-11 at 5.42.56 PM.png

 

6. Test the interface.

 

To test the interface, follow the below steps.

 

a. Stop and then start the Sender SFSF communication channel.

 

b. A message can be seen entering the Adapter engine in Sender communication channel monitoring.

 

c. A success message can be seen in message monitoring.

 

Screen Shot 2016-04-11 at 6.00.26 PM.png

 

d. Go through the Message log to understand the message processing better.

 

 

Screen Shot 2016-04-11 at 5.49.16 PM.png

 

e. The request payload will look like this.

 

Screen Shot 2016-04-11 at 5.51.48 PM.png

7. Troubleshooting tips.

 

A few common errors related to the SF adapter and their resolutions.

 

a. Error: A previous lock still persists. Aborting poll operation.

 

    Resolution: Increase the polling period in the sender SF communication channel. In the example shown here, the polling period was increased from 5 mins to 1 day to correct the issue.

 

Screen Shot 2016-04-11 at 6.00.26 PM.png

 

b. Error: INVALID_SFQL. Invalid SFQL! Error: Query “-----“ is not properly formed.

 

    Resolution: Ensure that the SFQL query is maintained correctly in the sender SF channel. In the example shown here, commas (",") were placed between the fields in the SFQL query to correct the issue.

 

Screen Shot 2016-04-11 at 6.03.22 PM.png

 

 

References:

 

1. Install Operation Modeler:

 

    http://help.sap.com/saphelp_nw-connectivity-addon100/helpdata/en/a5/940acc528043ccb90d9267e8e13f1e/frameset.htm

 

2. Create a new project:

 

    http://help.sap.com/saphelp_nw-connectivity-addon100/helpdata/en/25/86a07e53614dbe9ac26c15f7702639/content.htm

 

3. Modelling Operations:

 

    http://help.sap.com/saphelp_nw-connectivity-addon100/helpdata/en/89/8362021924433d8d7f2f231c436ee5/content.htm

 

 

Abbreviations:

 

SFEC: SuccessFactors Employee Central

SFQL: SuccessFactors Query Language

Video Tutorials on SAP HANA Cloud Integration


Hello Integration Community,

276822_l_srgb_s_gl.jpg

 

We want to bring easy and fun ways for you to learn about SAP HANA Cloud Integration (SAP HCI). So, we are happy to announce short video demonstrations of SAP HCI in our YouTube playlist. The videos are aimed at helping you get up and running on SAP HANA Cloud Integration!



Part 1: Ab initio/ Understanding SAP HANA Cloud Integration



Part 2: Getting your Hands Dirty


Part 3: Dig into the Details

 

This is the first set of videos published; more shall follow. Check them out! Let us know your thoughts, feedback and wish lists for further videos.

Happy Integrating!

 

Best Regards,

Sujit

SAP Info Days on SAP HANA Cloud Integration


Interested in learning more about SAP HANA Cloud Integration?

 

If yes, then we are very happy to inform you that we are continuing our series of info days about "SAP HANA Cloud Integration (SAP HCI) - everything that you want to learn about our cloud-based integration platform and how you can leverage SAP HCI today in integration scenarios!"

 

During these info days the development organization would like to take the opportunity to share some exciting news around new capabilities, supported scenarios and new use cases.

 

As you may know, SAP HCI is SAP’s strategic cloud integration platform to integrate SAP Cloud applications with other systems, SAP and non-SAP, on premise or on the cloud. SAP HCI builds upon the SAP HANA Cloud Platform and is leveraged for integration of SAP Cloud solutions, such as SAP SuccessFactors, SAP Cloud for Customer, S/4HANA, SAP Ariba, SAP Hybris, SAP Financial Services Network etc.

 

The SAP HCI Standard and Professional Editions allow you to leverage SAP HCI in any system to any system integration scenarios. And the SAP HCI Developer Edition is targeted towards partners to build integration content or connectivity options for SAP HCI.


The next SAP HCI info day will take place on Tuesday, June 21st 2016, in 69190 Walldorf, Germany. Start and end time of the info day: 10:00 am – 4:00 pm CET (please consult www.timeanddate.com if you are in another time zone). Some of the key benefits of this event are networking and discussions with the participants; therefore the SAP HCI info day is an onsite event without the possibility of remote dial-in.


The SAP HCI info day is free of charge, but space is limited; therefore registration at udo.paltzer@sap.com is mandatory.

 

The agenda of the info day is as follows:

  • Cloud Integration (incl. key scenarios, use cases, demo etc.)
  • Certification options for SAP HANA Cloud Integration
  • Customer and partner presentations on SAP HANA Cloud Integration - part 1
  • Lunch
  • Fill out Integration Survey 2016
  • Customer and partner presentations on SAP HANA Cloud Integration - part 2
  • Break out into working groups
  • Presentation of group results

 

The info day will be held in the English language.

 

If you have any questions about SAP HANA Cloud Integration or regarding the registration to the info day, please contact me at udo.paltzer@sap.com.

 

 

Request of SAP HCI info days in other locations

 

As mentioned above, these info days are part of a series of info days on SAP HCI. In case you would like to request SAP HCI info days in other countries around the globe, kindly feel free to reach out to me at udo.paltzer@sap.com.

 

We would be also very happy to offer SAP HCI info days directly at a partner site via the well established form of SAP CodeJam events; further information about SAP CodeJam sessions can be found at http://scn.sap.com/community/events/codejam and http://scn.sap.com/docs/DOC-37775.

 

Last but not least ...

 

We promote the SAP HCI info days via the International Focus Group for Integration (IFG for Integration). Further information about the IFG for Integration can be found at International Focus Group for Integration. Kindly also feel free to participate in our annual integration survey at https://nl.surveymonkey.com/r/2016_Global_Integration_Survey. The survey, which offers you a great opportunity to give feedback to SAP, closes on August 21st, 2016.


HCI - Using a Twitter use case to understand Apache Camel - Part 1


Background

 

I wanted to build an Integration Flow on HCI that queries the timeline of a Twitter handle, and to make sure tweets that have been read are not read again. Well, this should be easy, I thought. There was already a blog series on SCN describing how to set up the Twitter adapter to post a tweet and a direct message: Integrating HCI with Twitter - Part 1 and Integrating HCI with Twitter - Part 2.

 

So, I went about building my Integration Flow as below,

  • Start Event: Time to start the Integration Flow
  • Request-Reply: Make the call to the Twitter adapter with the search keyword set as: from:<twitteraccount>
  • Response sent back to SFTP Adapter.

 

Integration Flow

21.png

 

Twitter Adapter

 

22.png

 

I was skeptical because this configuration did not make sense for multiple reasons,

  • What is the response that the Twitter Adapter would return back? Is it XML? What is the format of this data?
  • How do I make sure only "delta tweets" are provided back to me, i.e., my Integration Flow should only return the tweets I have not read previously? There did not seem to be any option in the adapter to give me such a handle.

 

I had nothing to lose and hence triggered my Integration Flow. As expected I did get an error, stating: "No type convertor available to convert from type: java.util.ArrayList to the required type java.io.InputStream".

23.png

So, Twitter did respond to me, but it sent back a Java ArrayList, which meant I had to understand this a little more and then convert this ArrayList to an XML string. I looked up the documentation from SAP on this, but unfortunately there wasn't much to go on here. So, how do we proceed from here? What could be the content of this ArrayList, I wondered.

 

That HCI uses the Apache Camel framework is something all of us are aware of. When trying to figure out the answer to this question on the Twitter adapter, I decided to check Apache Camel and see if there was a means to reverse engineer the workings of the HCI Twitter adapter.

 

Setting up the Built in Examples of Apache Camel

 

Apache Camel: Examples provides a comprehensive list of examples that can be used to trigger an Apache Camel flow and understand its underlying components. One of the examples listed on this page was the Twitter Websocket Example.

The example is demonstrating how to poll a constant feed of twitter searches and publish results in real time using web socket to a web page

Exactly what I want, was my first reaction. So how do I run this example locally? How do I test and play around with this example to understand the HCI Twitter adapter?

 

In part 1 of this blog series, we will set up this example locally and test the same.

In part 2, we will extend the learning from this standalone run to HCI and use the same in HCI Twitter Adapter.

 

High Level Steps

  1. Download and Install Apache Camel -  Apache Camel: Download
  2. Download and Install Apache Maven - Download Apache Maven
  3. Set Up your Environment Variables
  4. Download project using Maven
  5. Import project into Eclipse
  6. Run & Understand the Project

 

1. Download and Install Apache Camel

Step Details
  • Download Apache Camel Distribution from link Apache Camel: Download
  • In this case we will download the Windows Distribution for Camel 2.17.x
24.png
  • Unzip the Contents of the zip file to any directory.
  • In this case I extract the same to my E:\
  • Note: Make sure your directory names do not have any spaces in them if you plan to extract this into any other location. There are some known issues when you try to start Camel in a directory with spaces.
25.png

 

2. Download and Install Apache Maven

 

Step Details
26.png
  • Unzip the Contents of the zip file to any directory.
  • In this case I extracted the same to my E:
27.png

 

3.Set Up your Environment Variables

 

Step Details
  • Set up JAVA_HOME by pointing to your JRE Directory.
  • Right Click on Computer --> Properties --> Advanced System Settings -> Environment Variables
  • JAVA_HOME = C:\Program Files\Java\jre7 ( In my case )
  • Make sure you have Java Version >= 7.0
28.png
  • Set up your Path Variable to point to the Maven Bin Directory
  • Maven Bin Directory = E:\apache-maven-3.3.9-bin\apache-maven-3.3.9\bin
29.png
  • Verify Maven is Ok by launching Command Prompt and executing command - mvn -v
30.png
  • Verify JAVA_HOME by launching command prompt and executing command echo %JAVA_HOME%
31.png

 

4. Download Project Using Maven

 

Step Details
  • Navigate to your Camel Directory --> examples --> camel-example-twitter-websocket
  • In my case : E:\apache-camel-2.17.0\examples\camel-example-twitter-websocket
  • Note the path down as this will be used in your Command Line Prompt
32.png
  • Launch Command Prompt
  • Navigate to the Camel Directory
33.png
  • Execute Command - mvn eclipse:eclipse
34.png
  • Your Project will now be downloaded.
  • You may get some warnings - they can be ignored
  • You should have the message "BUILD SUCCESS"
35.png
  • Navigate to your directory : E:\apache-camel-2.17.0\examples\camel-example-twitter-websocket
  • You should now see the Java Project with the source code, classpath etc.
36.png

 

5. Import Project into Eclipse

Step Details
  • Open Eclipse, right click on Package Explorer, import --> Existing Projects into Workspace
37.png
  • Select the directory where the maven project was downloaded. In this case, E:\apache-camel-2.17.0\examples\camel-example-twitter-websocket
  • Click Finish
38.png
  • Your project should be listed as below
39.png

 

6. Run  and Understand the Project

 

Step Details
  • Navigate to CamelTwitterWebSocketMain.java under src/main/java
  • The Java Class uses CamelTwitter Account as its twitter account
  • This example publishes the tweets containing the word "gaga" to a web socket served at the URL http://localhost:9090/index.html. As we are interested in the workings of the Apache Camel Twitter component, we will not focus on viewing the tweets in the browser.
  • Run this as-is without any changes ( Run -> RunAs-> Java Application )
  • You should be able to view the resulting tweets in the Console
41.png
  • Twitter limits the number of calls to its API, hence after a point this will start producing errors as shown.
  • Stop your Java Program
40.png
  • Currently the code searches for the term "gaga".
  • For the current requirement of reading the tweets of a particular user, this can be changed to "from:<username>". A simplified sketch of the underlying route is shown after this table.

43.png

44.png

  • Update your Twitter credentials if required
42.png
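For reference, the heart of the example is a single Camel route. A simplified Groovy sketch of what CamelTwitterWebSocketMain sets up (the credential placeholders are mine) looks like this:

import org.apache.camel.builder.RouteBuilder

// Poll the Twitter search API for the keyword and push every
// twitter4j.Status received to the embedded websocket endpoint.
class TwitterWebsocketRoute extends RouteBuilder {
    void configure() {
        from('twitter://search?type=polling&keywords=gaga' +
             '&consumerKey=XXX&consumerSecret=XXX' +
             '&accessToken=XXX&accessTokenSecret=XXX')
            .to('websocket:camel-tweet?sendToAll=true')
    }
}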

 

So Far So Good But..

So far so good, I thought! From the looks of it, it was clear that HCI was using the Twitter component of Apache Camel as the underlying framework for its Twitter adapter. But I still didn't have an answer as to how to restrict the Twitter feed to only new tweets, and I still did not understand what the Java ArrayList was all about.

 

I then went about digging into the Apache Camel Twitter component documentation - Apache Camel: Twitter. A couple of things were of interest here:

 

Apache Camel Documentation Excerpt
Inference
The Twitter component enables the most useful features of the Twitter API by encapsulating Twitter4J.
  • The Twitter component of Apache Camel in turn uses Twitter4J.
  • I would recommend reading through, at a high level, what Twitter4J does on the link provided.

sinceId - Camel 2.11.0: The last tweet id which will be used for pulling the tweets. It is useful when the camel route is restarted after a long running.

  • sinceId was the answer to keeping track of the tweets that have already been read. What does it actually do?
  • Another round of reading led me to Twitter's official documentation on this: https://dev.twitter.com/rest/public/timelines. I would recommend reading this link to understand the working of sinceId.
45.png
  • The search function returned a List of type twitter4j.Status.
  • This meant I could now understand what the search function returns and read the data I need by looking up the Twitter4J Javadoc of Status.

 

So, in summary, it was clear to me now that I had to use a combination of twitter4j.Status and sinceId to build what was needed to meet my requirement.

 

I had been able to reach this point by running a built-in Apache Camel Twitter example and understanding the underlying working of HCI's Twitter adapter. In the next part of this blog, we will look at how the information deciphered above is used in our Integration Flow.

HCI - Using a Twitter use case to understand Apache Camel - Part 2


Background

In part 1 of this blog series, we saw how -

  • HCI uses the Twitter component of Apache Camel
  • Apache Camel examples can be utilized to understand the inner workings of HCI
  • sinceId will enable us to capture only new tweets from a Twitter handle
  • twitter4j.Status is the format of the search results of the Twitter adapter

 

In this blog we will now use the learnings above to update our Integration Flow, enabling us to query the timeline of a Twitter user for any delta tweets.

 

Integration Flow

46.png

High Level Process

  • As it is important to persist the since_id, we will use Global Variables of HCI to retrieve and update it.
  • The keywords CamelTwitterKeywords and CamelTwitterSinceId are used to set the search keywords and sinceId (refer to the Camel documentation for Twitter).
  • The conversion of the Java ArrayList of twitter4j.Status into XML is a custom process that uses the twitter4j.Status class and its various methods.
  • The code is just a snippet showing how to use twitter4j.Status. Not all methods have been used, and it can be modified to suit your needs.

 

Detailed Steps

Step
Configuration and Usage
Timer Start
  • Trigger the Integration Flow with a Timer Start Event
Content Modifier
  • Read the Global Variable since_id persisted in the HCI Data store

47.png

Script
  • Custom Groovy Script
  • Sets sinceID into the messageHeader using key - CamelTwitterSinceId
  • Sets keywords into the messageHeader using key - CamelTwitterKeywords
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;

def Message processData(Message message) {
    def propertyMap = message.getProperties()
    def messageLog = messageLogFactory.getMessageLog(message);
    //Read since_id from the Properties
    String strsince_id = propertyMap.get("since_id")
    messageLog.setStringProperty("strsince_id", strsince_id);
    Long sinceid;
    if (strsince_id == null || strsince_id == "") {
        sinceid = 1;
    } else {
        sinceid = Long.parseLong(strsince_id);
    }
    def map = message.getHeaders()
    message.setHeader("CamelTwitterKeywords", "from:saphci_integ");
    message.setHeader("CamelTwitterSinceId", sinceid);
    def keywords = map.get("CamelTwitterKeywords");
    return message;
}
Request-Reply
  • Makes call to the Twitter Adapter
  • The Keywords and sinceId will be read from the Message Header set in the  Groovy Script

48.png

Script
  • Custom Script to convert twitter4j.status to a Custom XML Message
  • Loops through each Tweet and then reads corresponding properties to form XML
  • Persists latest twitter ID as since_id into the message property
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
import twitter4j.StatusJSONImpl;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.List;
import java.util.Date;
import twitter4j.Status;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.apache.commons.lang.StringEscapeUtils;

def Message processData(Message message) {
    List<Status> list = message.getBody();
    def messageLog = messageLogFactory.getMessageLog(message);
    DateFormat df = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss");
    df.setTimeZone(TimeZone.getTimeZone("UTC"));
    def propertyMap = message.getProperties()
    String xmlHeader = "<?xml version=\"1.0\" encoding=\"UTF-8\"?><TwitterSearchResult><Resultset>";
    String listOfTweets = "";
    //Read each Tweet and form XML
    for (body in list) {
        String street = " ";
        String countrycode = " ";
        String country = " ";
        String latitude = " ";
        String longitude = " ";
        if (body.getPlace() != null) {
            street = body.getPlace().getFullName();
            countrycode = body.getPlace().getCountryCode();
            country = body.getPlace().getCountry();
        }
        if (body.getGeoLocation() != null) {
            latitude = body.getGeoLocation().getLatitude();
            longitude = body.getGeoLocation().getLongitude();
        }
        String tweetText = (body.getText());
        String userName = (body.getUser().getName());
        String postText = (body.getText());
        String tweet = "<result><tweetText>" + postText + "</tweetText>" +
            "<tweetID>" + body.getId() + "</tweetID>" +
            "<TweetHandle>" + body.getUser().getScreenName() + "</TweetHandle>" +
            "<userID>" + body.getUser().getId() + "</userID>" +
            "<tweetUser>" + userName + "</tweetUser>" +
            "<tweetDate>" + df.format(body.getCreatedAt()) + "</tweetDate>" +
            "<tweetStreet>" + street + "</tweetStreet>" +
            "<tweetCountryCode>" + countrycode + "</tweetCountryCode>" +
            "<tweetCountry>" + country + "</tweetCountry>" +
            "<Longitude>" + longitude + "</Longitude>" +
            "<Latitude>" + latitude + "</Latitude></result>"
        listOfTweets = listOfTweets + tweet;
    }
    String xmlFooter = "</Resultset></TwitterSearchResult>";
    String outputMessage = xmlHeader + listOfTweets + xmlFooter;
    String id = "";
    //Read tweetID for since_id
    for (body in list) {
        id = body.getId();
        break;
    }
    //Set tweetID into message property
    if (id == null || id == "") {
        String strsince_id = propertyMap.get("since_id")
        messageLog.setStringProperty("strsince_id", strsince_id);
        message.setProperty("since_id", strsince_id);
    } else {
        messageLog.setStringProperty("sinceID", id);
        message.setProperty("since_id", id);
    }
    message.setBody(outputMessage);
    return message;
}
Write Variables
  • Updates the Global Variable since_id with value from the Message Property as set in the Groovy script

49.png

Receiver
  • Pushes Content into a SFTP Channel

 

Execution of Flow

Scenario Results

Run#1

  • sinceId not set in Global variable.
  • Defaulted to "1" and all tweets are read as per page size set

50.png

51.png

52.png

Run#2

  • sinceId is read from Global Store
  • No new tweets since last read and hence no tweet in output
  • sinceId is updated back into the Global Store with the same value as there is no new Tweet and hence no new ID.

54.png

Run#3

  • User has a new Tweet
  • Tweet is read
  • SinceID is updated into Global Store

55.png

56.png

Run#4

  • No new Tweets
  • sinceID points to the TweetID of previous run Tweet
57.png

 

Final Note

Through this use case, we had the opportunity to explore Apache Camel and get an insight into the workings of HANA Cloud Integration. While this blog focuses on the Twitter example from Apache Camel, there are multiple other examples that Apache Camel provides which are equally helpful and can help you on your HCI journey as you start building complex integration patterns.

 

It is always good to understand how the underlying execution platform works, and I hope this blog inspires you to explore the Apache Camel framework and use it within HCI.

How to integrate SAP PI with Teradata (Sync Macros without mapping - a pass-through trick with just 1 outbound service interface!)


Nowadays there are a lot of requirements around Big Data, ERP interoperability and Data Warehousing needs, and some companies use SAP PI for them, independent of whether or not it is a best practice. For this reason I share some interesting topics:

  1. One trick: implement just one outbound service interface, without inbound interfaces, in a synchronous way.
  2. Share just one data type and message type.
  3. Implement an ICO including just your outbound interface, forcing the configuration without requiring the inbound interface.
  4. Request information from Teradata via a macro; from this you can pick up tips for implementing other SQL logic, as if you were working with SQL Server, MySQL, etc.
  5. Here you can find interesting information about Teradata and an SAP architecture proposal: ftp://ftp.ucg.com/boston/2008%20BI%20&%20Netweaver%20Portal__Dynamic%20Items%20Report/Teradata_partner-SAP%20NetWeaver-EB4699.pdf


 

 

A. What is Teradata?


B. After that, you will be able to implement different SAP PI Integrations using SQL Best Practices:


C. SAP PI Configuration:


1. Enterprise Service Repository:


You can implement just ONE outbound service interface, simulating the inbound service interface, as long as the metadata (data types) are equal…


Service.PNG

 

We implement the structure to request the information via the macro; from the test client (sender) we will add the required data, and the same structure will be delivered to the receiver:

 

DT_send.PNG

 

For the response we create the following structure; note that we need to append "_response" to the name of the above data type, now called DT_name_response.

 

DT_resp.PNG

 

2. Integration Directory:


Now we can implement a classic interface or an ICO (Integrated Configuration) as below. The important logic here is that the outbound service interface must be configured in the Receiver Interfaces, following the "trick". This works because we have the same structure in the sender and the receiver metadata.

 

ico.PNG

 

 

We can implement a SOAP sender communication channel, to play with SOAPUI and confirm the functionality.

 

soap.png

 

 

 

And the important configuration is the JDBC receiver, to connect to Teradata via the JDBC driver with the important parameters.

cc rec.png

connect.png

 

    • JDBC Driver: com.teradata.jdbc.TeraDriver
    • Connection: jdbc:teradata://IP_NUMBER/DATABASE=NAME,PORT=1025,CHARSET=UTF8
    • Username & Password
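Outside of PI, the same connection data can be smoke-tested with a few lines of Groovy. A minimal sketch, assuming the Teradata JDBC driver (terajdbc4.jar) is on the classpath and using the placeholder values from the channel configuration above:

import groovy.sql.Sql

// Same driver class and URL pattern as in the JDBC receiver channel.
def sql = Sql.newInstance(
        'jdbc:teradata://IP_NUMBER/DATABASE=NAME,PORT=1025,CHARSET=UTF8',
        'username', 'password',
        'com.teradata.jdbc.TeraDriver')

// Execute the macro; each returned row becomes one <row> element in the response.
sql.eachRow("EXEC MACRO_TERADATA_NAME (3, '2017-03-03')") { row ->
    println "${row.Id} | ${row.Date} | ${row.Amount}"
}
sql.close()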

 

processing.png

 

    • Maximum Concurrency: 15

 

 

jdbc.png

 

    • Advanced Mode: you can set the number of retries of the database transaction on SQL error, a default transaction isolation level, and Disconnect from Database After Processing Each Message.

 

 

D. Teradata Structures:


The SQL statement that executes the macro will be:


EXEC MACRO_TERADATA_NAME (3, '2017-03-03');

 

 

And the macro may call a SELECT statement with a structure like the one below, which in our example will return the rows that match Id=3 and Date=2017-03-03:

 

Table: Structure to be called by MACRO_TERADATA_NAME

Id | Date       | Amount
---+------------+------------
 3 | 2017-03-03 | 100000.0000
 3 | 2017-03-03 | 200000.0000
 3 | 2017-03-03 | 300000.0000
 2 | 2017-03-03 | 300000.0000
 1 | 2017-03-03 | 300000.0000

 

E. Test via SOAPUI through SAP PI SOAP to the SAP PI JDBC Teradata driver in synchronous mode.


Now we can request the following information:


<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:urn="HERE YOUR NAMESPACE">
   <soapenv:Header/>
   <soapenv:Body>
      <urn:MT_Teradata>
         <SAPStandard4SQL>
            <Macro action="SQL_QUERY">
               <access>EXEC MACRO_TERADATA_NAME (3, '2017-03-03');</access>
            </Macro>
         </SAPStandard4SQL>
      </urn:MT_Teradata>
   </soapenv:Body>
</soapenv:Envelope>

In SOAPUI we confirmed that we consumed the web service, sending the EXEC statement in the request, and we received the response with the 3 rows that matched:

soapui.png

 

And we will receive something like this:


<SOAP:Envelope xmlns:SOAP="http://schemas.xmlsoap.org/soap/envelope/">
   <SOAP:Header/>
   <SOAP:Body>
      <urn:MT_Teradata_response xmlns:urn="USE HERE YOUR NAMESPACE">
         <SAPStandard4SQL_response>
            <row>
               <Id>3</Id>
               <Date>2017-03-03</Date>
               <Amount>100000.0000</Amount>
            </row>
            <row>
               <Id>3</Id>
               <Date>2017-03-03</Date>
               <Amount>200000.0000</Amount>
            </row>
            <row>
               <Id>3</Id>
               <Date>2017-03-03</Date>
               <Amount>300000.0000</Amount>
            </row>
         </SAPStandard4SQL_response>
      </urn:MT_Teradata_response>
   </SOAP:Body>
</SOAP:Envelope>

 

With this we can confirm how easy it is to integrate Teradata and SAP PI; there are other ways as well...


Thank you!,

Azael

Auto Client - Tool from SWIFT for your Payment File Integration with Bank


Hello Everyone,

 

This blog is about Auto-client, a software tool from SWIFT that connects to the Alliance Lite2 server. This tool is installed on one of the servers present on the client premises for routing payment transactions from the client's SAP system to various banks.

 

Auto-client is an optional tool used to connect to the SWIFT Alliance Lite2 server. Alliance Lite2 will in turn route the files to the Banks through the most secure SWIFTNet network.

 

Auto-client can be installed on the C drive of any Windows-based PC; the default folder for a 64-bit Windows PC is %Program Files (x86)%\SWIFT\Alliance Lite2. Inside the Alliance Lite2 folder there is a folder called files, which in turn consists of 4 sub-folders named Emission, Reception, Archive and Error.
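The resulting folder layout on the Auto-client PC therefore looks like this (the roles of the sub-folders are described in the following paragraphs):

%Program Files (x86)%\SWIFT\Alliance Lite2\files
    Emission     - outgoing payment files dropped by SAP PI
    Reception    - ACK/NAK messages and bank responses downloaded from Alliance Lite2
    Archive      - files successfully uploaded to Alliance Lite2
    Error        - files that failed to upload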

 

The payment files such as MT100, SEPA, MT103, etc. generated by the SAP BCM/ECC system must be dropped into the Emission folder. In the SAP ECC/BCM system the files are dropped into an AL11 folder; the files can be picked up over an NFS connection by an SAP PI File sender channel, or an SFTP connection can be enabled for a more secure way of communication. SFTP is highly recommended. In SAP PI a normal pick-and-drop interface can be created. The NFS connection needs to be enabled between the SAP PI server and the files folder of Auto-client.

 

Auto-client continuously polls the Emission folder (just like our good old File sender channel), and just as we can set the polling interval in our File sender channel, we can set the same for Auto-client too. This timer can be set by editing the AutoClient.properties file present under the following path: %Program Files (x86)%\SWIFT\Alliance Lite2\config. EmissionTimerInMillis is the parameter for setting the polling interval at which files are sent from the Auto-client directories to Alliance Lite2.
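For example, a 30-second polling interval would be configured like this in AutoClient.properties (the values are illustrative; ReceptionTimerInMillis, described below, follows the same pattern):

EmissionTimerInMillis=30000
ReceptionTimerInMillis=30000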

 

Once the files are successfully uploaded to the Alliance Lite2 server, they are archived to the Archive folder under the files directory. What if a file was not uploaded successfully to the Alliance Lite2 server; what would happen to it? Does it stay in the Emission folder? If it stays in the Emission folder, then how would I know what is happening and why it was not picked up? A good series of questions! The files which resulted in an error are moved to the Error folder under the main files directory.

 

Once the file reaches the Alliance Lite2 server, and if the file is sent onto the SWIFTNet network successfully, you will receive a positive acknowledgement from the Alliance Lite2 server. If there was any error in uploading the file onto the SWIFTNet network, a negative acknowledgement is sent from Alliance Lite2. These messages are popularly known as ACK/NAK messages. These files are dropped into the Reception folder present under the main files directory. Similar to the polling interval we have for the Emission folder to upload files to the Alliance Lite2 server, there is a parameter called ReceptionTimerInMillis which can be set to download the files from the Alliance Lite2 server into the Auto-client Reception folder.

 

ACK/NAK messages tell us the status of the payment sent over the SWIFTNet network to the bank. But the actual status of the payment, processed or not processed, can only be given by the bank. The payment response is received from the bank over the SWIFTNet network by the Alliance Lite2 server, and from the Alliance Lite2 server the response files are downloaded by Auto-client into the Reception folder based on the timer set for the parameter ReceptionTimerInMillis.

 

If you have made it successfully till here, reading everything mentioned above, congratulations and thank you! All through the blog you might be wondering how the files dropped into the Emission folder of Auto-client are sent to the Alliance Lite2 server. Auto-client is like normal software installed under the Program Files of a Windows PC. What is the magic by which the files are sent to Alliance Lite2, and from Alliance Lite2 to the banks on the SWIFTNet network? How does it work? Please do not break your head; the answer is very simple.

 

Auto-client uses a secure USB token, which is shipped by the SWIFT team to the client who procures the license for SWIFT. This USB token must be plugged into one of the USB ports, and the unique password set for it must be provided to get your Auto-client running. Unless the correct credentials are provided, your Auto-client does not start and hence the exchange of files won't happen. Please note, this password needs to be provided only one time to get Auto-client running, and not for each and every payment file to be sent to the bank.

 

Okay, the most curious question about the Auto-client and Alliance Lite2 is finally answered. But now a question arises regarding the security of the payment files exchanged between the Auto-client and Alliance Lite2: how is this communication secured? Again, the answer is simple. The Auto-client USB token is a tamper-proof hardware security module. This module digitally signs and authenticates every communication with the Alliance Lite2 server using a strong 2048-bit PKI certificate that resides on the token.

 

The Auto-client offers a 2-system landscape with a Live and a Test system. It is highly recommended to keep two separate servers, one for the Live system and another for the Test system; you will have two different USB tokens, one per system. The Test system can be used to send files to the bank's test system.

 

I hope you have made it through to the end of this blog and found the information useful.

 

Regards,

Nitin Deshpande

FILE LOOKUP IN SAP PI USING UDF


This is my first post on SCN. There are a few threads on File Lookup, but I faced a few challenges when going through them, so I thought I would create a simple step-by-step document for the same.

 

PI has two standard types of lookup available in ESR (JDBC and RFC lookup), but we can achieve two more types of lookup in PI to fulfil our needs: File and SOAP lookup.

 

A File Lookup is done when we want to map a value against an incoming payload value. In this scenario the file may be maintained on some other server. We do a lookup to read the value for the incoming field and map that value to the target payload.

 

To achieve this functionality we need to write a small UDF that fulfils our requirement.

 

Step 1: Create a UDF and import the two packages mentioned in the screenshot:

           java.net.URL

           java.net.URLConnection

imp1.PNG

 

 

 

Step 2: Copy and paste the below code into your UDF.

(You need to replace the username, password, server, directory path and filename in the code accordingly.)

 

String custId = "";
try {
    // Open an FTP connection to the lookup file on the external server
    URL url = new URL("ftp://<UserName>:<Password>@<IPAddress>/<pathtodirectory>/<filename.txt>");
    URLConnection con = url.openConnection();
    InputStream lookupStream = con.getInputStream();

    InputStreamReader reader = new InputStreamReader(lookupStream);
    BufferedReader buffer = new BufferedReader(reader);

    String read;
    while ((read = buffer.readLine()) != null) {
        // compare the key against the first part of each line
        String temp = read.substring(0, key.length());
        if (key.equals(temp)) {
            custId = read.substring(key.length() + 1, read.length());
            // use equals() for string comparison, not != (which compares references)
            if (!custId.equals("00")) {
                int num = Integer.parseInt(custId);
                num = num + 2;
                custId = Integer.toString(num);
            }
        }
    }
    buffer.close();
} catch (Exception e) {
    return e.toString();
}

return custId;

 

Step 3: Create a file on the external server which has input and output separated by "=".

flkp1.JPG
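For illustration, such a lookup file might contain entries like the following (keys and values are hypothetical):

CUST1=05
CUST2=10
CUST3=00

With the UDF above, a lookup for CUST1 would return 7 (the value 05 parsed as an integer plus 2), while CUST3 would return 00 unchanged.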

 

 

Step 4: We can verify our code through the Display Queue by giving an input in the source field.

The screenshot below shows that we are getting the output as expected.

flkp2.JPG

 

 

This is the simplest way of doing a file lookup in SAP PI through a UDF.


Note: File Lookup is not encouraged by SAP.


I hope you all like this post!

HCI Content Monitoring Tools


Before you can process messages with an integration flow (iflow), you need to deploy this iflow to the HCI runtime node. When you complete the iflow modelling and invoke the ‘deploy’ command, the following steps are performed:

  • The iflow is checked for correctness
  • The iflow is sent to the tenant management node (tmn node)
  • The iflow is converted into an executable program (iflow bundle) on tmn node
  • The iflow bundle is distributed from tmn to runtime nodes (iflmap node)
  • The iflow bundle is started on iflmap nodes

 

The following tools can be used for tracking the iflow deployment process:

  • Console – shows client logs
  • Deployed Artifacts – provides a list of deployed artifacts on the HCI tenant
  • Tasks View – provides the status of deployment tasks
  • Component Status View – displays the various statuses of the deployed iflows over time
  • Tail Log – provides access to the tail end of the server logs

 

Most of these tools are currently available in Eclipse only.

 

 

Console

When you deploy an iflow, it is first checked for correctness. Results of the checks can be found in the console; make sure that there are no validation errors.

 

How to access the console

Window -> Show View -> Console

 

mt-console.png

 

 

Deployed Artifacts

The Deployed Artifacts view shows a list of iflows and other artifacts deployed on a tenant. When you deploy an iflow, it should appear in the list of deployed artifacts in the DEPLOYED state. If this does not happen after a couple of minutes, it indicates a possible deployment error.

 

How to access the Deployed Artifacts view

  1. Open Node Explorer view
  2. Double click on tenant
  3. Switch to Deployed Artifacts

 

mt-deployed-artifacts.png

 

 

Tasks View

Tasks View shows the status of various tasks running on the server including deployment tasks.

When you deploy an iflow, the following task is executed: "Build and deploy 'YOUR_IFLOW' Project". For a successful deployment this task must complete with status "SUCCESS". If this is not the case, select the failed task and check the task trace for more details.

 

How to access the Tasks View:

Window -> Show view -> Tasks View

 

mt-tasks-view.png

 

 

Component Status View

In the Component Status view you can check the current status of a node's components. After a successful deployment, an iflow should appear in the component status view of a runtime node with runtime status "started".

If the status is not "started", you can invoke the context menu command "Show Error Details" for additional info. You can also continue the error analysis by looking into the tail log.

 

If you want to restart an iflow, you can do so by clicking the "Restart" button in the Component Status view. This is the easiest way to restart iflows triggered by a Timer with the "Run Once" option; the other way would be to redeploy the iflow.

 

How to access Component Status view:

  1. Select IFLMAP node in the Node Explorer
  2. Open Component Status View
  3. Use a filter to narrow down list of components
  4. Check your component’s status

 

mt-csv.png

 

 

Tail Log

Each HCI node has its own server log, which can be used for error analysis. Using the Tail Log view, you can download the most recent part of the log and specify how much of it should be downloaded (in KB).

 

In case of deployment issues you should check the tail logs of both the TMN and IFLMAP nodes.

 

How to access Tail Log:

  1. Switch to Node Explorer
  2. Select IFLMAP node
  3. Switch to or open the Tail Log view (Window -> Show view -> Tail Log)
  4. Specify the size of the log to be downloaded
  5. Click on Refresh

 

mt-log.png

 

 

Summary

Iflow deployment consists of several steps performed on the tenant management and runtime nodes. Using tools described above, you can monitor these steps and search for a root cause in case of deployment issues.

 

You can use the following checklist to make sure your iflow is deployed successfully

  • In Console: there should be no validation errors
  • In Deployed Artifacts: your iflow should have deploy status “DEPLOYED”
  • In Tasks View: a task “Build and deploy ‘YOUR_IFLOW’” should have status “SUCCESS”
  • In Component Status View: your iflow should have the runtime status “started”
  • In Tail Log: there should be no errors and no stack-traces, neither in the TMN nor in the IFLMAP tail log

Monitoring your integration flows


HCI offers several possibilities to monitor message processing. Here we will give an overview of the available tools.

 

  • Message Monitoring – provides an overview of processed messages
  • Tail Log – provides access to the server log file
  • Message Tracing – provides access to the message payload

 

In addition to the monitoring tools, you can enhance your iflow to persist additional information for future analysis. You can achieve this using the following HCI features:

  • MPL Attachments – provides an API to store data in the message processing log
  • Data Store – an iflow component to persist arbitrary data in the HCI database

 

Most of the monitoring tools are currently available in Eclipse only.

 

 

Message Monitoring

Use the Message Monitoring view to check the status of recently processed messages. In case you have a lot of messages, you should specify a time period and/or integration flow in order to narrow down the search results.

 

How to Access Message Monitoring

  1. Open Node Explorer
  2. Double-click on your tenant (root node)
  3. Switch to the Message Monitoring view
  4. Specify a search filter
  5. Click on the ‘Search’ button
  6. Select a message of interest
  7. Check the Log for more details

 

mm1.png

 

 

 

 

Tail Log

When you search for the root cause of a message processing issue, you can check the server log for more details. Each HCI node has its own server log; since message processing only happens on IFLMAP nodes, we are only interested in the IFLMAP node logs. Using the Tail Log view you can download the most recent part of the log and specify how big this part should be in kilobytes.

 

How to Access Tail Log

  1. Switch to Node Explorer
  2. Select IFLMAP node
  3. Switch to or open Tail Log view (Window -> Show view -> Tail Log)
  4. Specify size of the log to be downloaded
  5. Click on Refresh

 

mm2.png

Note: By default, only errors are written to the server log. However, some components log useful information at a lower severity level. For example, you can dump SOAP request and response messages to the server log by increasing the log level of the org.apache.cxf.* packages to INFO.

 

How to dump SOAP envelopes to the log

1. In the main iflow properties, enable the 'Enable Debug Trace' option

 

mm3.png

2. Open a ticket for the Cloud Operations Team on component LOD-HCI-PI-OPS to change the log level. Example:

 

Dear Colleagues,

 

Please set the log level for the following loggers:-

 

TMN url: https://v0XXX-tmn.avt.us1.hana.ondemand.com

Application: IFLMAP

Tenant Name: v0XXX

Logger name: org.apache.cxf.*

Set Logger Level to DEBUG

 

Thanks and regards,

 

3. Execute the scenario and check the log for SOAP envelopes

 

mm4.png

 

 

 

Message Tracing

Use message tracing when you need to access the payload of processed messages. When activated, message tracing gathers the payload and headers at each step during iflow execution. You can access this information later from the message monitoring.

 

When you want to use message tracing, you first need to activate it on your tenant. The following steps are required:

  1. Activate message tracing on your tenant via ticket to cloud ops team
  2. Enable tracing in your iflow
  3. Provide your user with authorizations to access traces

 

More details can be found in HCI documentation:

https://cloudintegration.hana.ondemand.com/PI/help

  • Designing and Managing Integration Content

    • Activating Tenant and Integration Flow Tracing

 

How to access the message payload

  1. Open Message Monitoring
  2. Select a message
  3. Click on ‘View Trace’ button
  4. Click on a yellow envelope once the iflow gets opened
  5. Check message payload

 

mm5.png

 

 

mm6.png

 

 

MPL Attachments

HCI keeps a message processing log for every processed message. The log contains basic information: status, error message, and start/end timestamps for iflow components.

Using a custom script (Groovy or JavaScript) you can put additional entries into the log.

 

There are two possibilities:

  • properties
  • attachments

 

Properties are simple name/value pairs written into the log.

In the following example, the “payload” property contains a JSON structure:

 

mm7.png
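As an illustration of the idea (the screenshot shows the real value), such a JSON property value might look like this hypothetical weather payload:

{ "city": "Walldorf", "condition": "Sunny", "temp": 21 }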

 

Attachments are complete documents which are 'attached' to the message processing log. You can store the entire payload or another document as an attachment.

 

In the following example the same JSON payload is stored as a “weather” attachment.

mm8.png

Clicking on the "weather" link opens an attachment viewer

mm9.png

 

Example Iflow:

mm10.png

 

write_mpl.gsh

 

import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;

def Message processData(Message message) {
    def payload = message.getBody(String.class);
    def messageLog = messageLogFactory.getMessageLog(message);
    messageLog.setStringProperty("payload", payload);
    messageLog.addAttachmentAsString("weather", payload, "application/json");
    return message;
}

 

More details on scripting in HCI documentation:

https://cloudintegration.hana.ondemand.com/PI/help

  • Designing and Managing Integration Content
    • Designing Integration Content With the SAP HCI Web Application
      • Configure Integration Flow Components
        • Define Message Transformers
          • Define Script

 

 

 

Data Store

Another way to persist the message payload for further analysis is to use the data store. The primary goal of the data store is to enable asynchronous message processing. You can write messages into the data store from one iflow and read them back from another one.

With some limitations you can also use the data store to persist messages for monitoring purposes.

The main limitation: if a message goes into the FAILED state, no data is committed to the data store for it. For example, if you make a SOAP call in your iflow and this call fails, the iflow normally goes into the FAILED state, and in that case no data is written to the data store. Only if the iflow finishes in the COMPLETED state is the data committed to the database, where it can be accessed afterwards.

 

Example iflow:

mm11.png

 

You can access the data in the data store in two ways:

  1. Using another iflow with SELECT or GET operation
  2. Using eclipse data store viewer

 

How to access eclipse data store viewer

  1. Double-click on your tenant in Node Explorer
  2. Switch to “Data Store Viewer” tab
  3. Select the data store
  4. Inspect entries, use “Download Entry” context menu command to download the content

 

 

mm12.png

Note: your user needs the AuthGroup.BusinessExpert role on the TMN application of your tenant in order to access data store entries using the Eclipse data store viewer. Raise a ticket on LOD-HCI-PI-OPS to get this authorization.

 

More details on the Data Store in HCI documentation:

https://cloudintegration.hana.ondemand.com/PI/help

  • Designing and Managing Integration Content
    • Defining Data Store Operations

 

 

Summary

In this blog we walked through the available monitoring features of HCI. Using the out-of-the-box message monitoring you can search for processed messages and check their status. In the tail log you can find additional low-level information, e.g. error messages or SOAP envelopes. Using message tracing you can access the payload of processed messages. Using the MPL attachments API you can enhance the message log with additional information. Using the Data Store is another way to persist message payloads for monitoring purposes.


How to skip header record from simple Plain2XML (file type is .csv,SFTP Sender adapter) without using any UDF at message mapping level ?


Hello All,

 

In this blog I am going to explain how to skip the header record in a simple Plain2XML scenario (file type .csv, SFTP Sender adapter).

 

Input file structure on the Sender adapter side:

 

SFTP Input file.png

 

I did not use any UDFs; I am just using standard node functions to remove the first line from the file.

 

 

Mapping doc.png

 

The first record is suppressed at message mapping level:

message mapping1.png

Hope this will be useful.

 

For the above requirement I referred to the below thread:

How to skip header record from simple Plain2XML (file type is .csv,SFTP Sender adapter)

 

Thank you

Handling Mass Data Upload for Value Mapping which can't be easily handled with NWDS and VMR interface


I had a scenario where I needed to handle more than 10,000 values in Value Mapping, which was a very tedious task. Entering such a large number of values manually in the Integration Directory was not feasible; it would have taken months. I tried the Value Mapping Replication (VMR) interface available in the Basis component, but it was not that efficient either. I also tried uploading directly with NWDS by creating a CSV file for the Value Mapping, but that fails when there is a "," (comma) in a key or value.

So this option was not helpful for me.

 

On the other hand, we knew that doing lookups against a database or file server hits the interface's execution time when the number of lookups is high.

Then I thought: why can't we maintain all these values in the ESR itself, in a file, and read the values directly from that file, which would be much quicker than any other option? There is exactly one place where any kind of file can be uploaded into SAP PI: the Imported Archive. Imported archives are normally used for Java or XSLT mappings, but they allow uploading files in .zip format, which gives us a loophole through which we can upload any kind of file into the ESR after zipping the files together.

 

It was amazing when I managed to do a lookup among more than 50K values in milliseconds. So I thought I would share this approach with all of you, because I searched the whole of SCN and the SAP documentation for a solution to this problem and returned empty-handed.

 

I will explain the step-by-step procedure to handle any number of values in key-value pair format in the ESR, with the lookup done through a small UDF.

 

Step 1: Create text files containing key-value pairs separated by a space or "=", as shown in the example below.
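A hypothetical excerpt of such a file (the codes and descriptions are invented for illustration):

AA=Advice of Arrival
AB=Advice of Backorder
ZZ=Mutually Defined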


 

Step 2: Now create all the text files that you need for the lookup and zip them together.

(I had a requirement to transform the incoming input values of 20 EDI segments into their standard actual values, as shown in the figure above, so I created a different file for each segment. If you want, you can merge all files together and upload a single text file.)

I created 21 files as per the requirement and zipped them together as below:

 

 

Now I have a zipped file that contains all the key-value pairs. We can upload this file into an Imported Archive without any issue.

 

Step 3: Create an Imported Archive named ValueMapping and import the .zip file into it.


 

 

Now you can see all the text files in the Imported Archive, as shown in the screenshot below.

 

 

Click on any file to see its content:

 

 

Now save and activate your Imported Archive.

 

Step 4: Now assign your Imported Archive to your message mapping on the Functions tab under Archives Used, as shown below:


 

Step 5: Now we will create a simple UDF which takes two inputs: the first is the key for which we want the description, and the second is the file in which to look up the value.

(If you create only one file, you can pass a single input to the UDF and write the file name directly in the UDF.)

 

Step 6: Copy and paste the below code into your UDF:

// public String FileLookup(String key, String filename, Container container) throws StreamTransformationException {

String returnString = "";
try {
    // read the lookup file from the imported archive via the classpath
    InputStream lookupStream = this.getClass().getClassLoader().getResourceAsStream(filename);
    InputStreamReader reader = new InputStreamReader(lookupStream);
    BufferedReader buffer = new BufferedReader(reader);

    String read;
    while ((read = buffer.readLine()) != null) {
        // compare the key against the first part of each line
        String temp = read.substring(0, key.length());
        if (key.equals(temp)) {
            returnString = read.substring(key.length() + 1, read.length());
            // use equals() for string comparison, not != (which compares references)
            if (!returnString.equals("00")) {
                int num = Integer.parseInt(returnString);
                num = num + 2;
                returnString = Integer.toString(num);
            }
        }
    }
    buffer.close();
} catch (Exception e) {
    returnString = e.getMessage();
}

return returnString;
// }

 

Step 7: Now we will create one more UDF for trimming the fixed extra text that always comes with the lookup output.

This UDF takes only one input: we pass the output of the lookup UDF into it, and it returns the actual output. If you are a bit confused, you will get a clear picture once you run Display Queue on these UDFs.

 

Below is the code for the trimValue UDF (input parameter of the UDF: value):

 

if (value.length() > 0) {
    // strip the fixed 19-character prefix and the trailing character
    String str = value.substring(19, value.length() - 1);
    return str;
} else {
    return "";
}

 

Now our UDFs are ready for testing.

 

Step 8: Now I will pass a key that is available in the DE_365.txt file, and the output will be the actual value for this key.

I have shown every input and output using Display Queue, which should make everything clear; you can now also see why I wrote the trimValue function.


 

 

Now we can compare with the key-value pair available in our text file:



Since the lookup files are read from the ESR (the mapping's own classpath) rather than from an external system, this approach adds hardly any overhead to the performance and execution time of the message mapping.

 

Value Mapping also has a constraint on the length of the target field (it cannot be more than 300 characters), but here you can go beyond that, as the values are maintained in a text file.


Hopefully this solves most of the problems related to maintaining large Value Mapping datasets. You can upload millions of entries without much effort in ZIP format.

Rapid Static Java Source Code Analysis with JLin in NWDS


Intro

This blog contains a brief outlook on a tool named JLin, which is embedded in SAP NetWeaver Developer Studio and used for static Java source code analysis. Some customers actively use JLin during development and the early quality assurance phase, but others don't have JLin in their development toolbox at all. Together with the lack of SCN discussions regarding the usage of JLin for source code quality analysis, this makes me think that either the tool is obvious and straightforward for Java developers in the SAP world, or it is not very common in the SAP developer community and is undeservedly neglected and underestimated. If you are part of the first group and already an active user of JLin, further reading may not be exciting for you; but if you are interested in how you can leverage NWDS built-in features to improve Java source code quality, let's move forward...

 

JLin is an integral part of the NWDS distribution: it comes with a pre-configured default set of checks, it doesn't require any extra infrastructure components for assessing source code, and it is highly integrated with other NWDS/Eclipse tools for results processing and presentation (all tasks are performed by JLin purely within NWDS, which also means the tool can be used for offline standalone analysis). Altogether, these features make the tool a great choice for rapid analysis of Java source code, conducted right from NWDS in very few clicks.

 

It shall be noted that JLin is in no way a replacement for well-known, commonly used and mature static source code analysis products like SonarQube, FindBugs, PMD, Checkstyle and others, and shall not be matched against them. SonarQube and its equivalents provide a complete solution and infrastructure for static source code analysis, focused on centralized management and governance of this process and on sophisticated features which include (but are not limited to) central storage of source code check templates and inspection results, capabilities for historic analysis and visualization, extensibility of existing patterns and development of custom checks, and support for a variety of programming languages. This also means such tools must be installed and properly configured before they can be effectively utilized, which in turn requires some effort. In contrast, JLin is a lightweight tool that comes with default check patterns (which can be enabled/disabled, but not extended), employs a basic Eclipse-based user interface, does not provide rich aggregated analysis and visualization capabilities, and only supports analysis of code written in Java. But in many general cases it can be used either completely out of the box or with minimum adjustment - as a consequence, its initial setup becomes a matter of a few minutes.

 

Having written that, I encourage developers to differentiate the use cases for JLin and the dedicated general-purpose static source code analysis tools mentioned earlier. JLin can be considered a complementary tool for fast preliminary analysis of source code quality; it can be included as an optional step in the quality assurance process for developed Java applications, followed by (preferably mandatory) extensive analysis of the submitted development by means of the static source code analysis infrastructure used across the organization.

 

JLin is described in detail in SAP Help - refer to Testing Java Applications with JLin - SAP Composition Environment - SAP Library. Even though the help materials for JLin are placed in the SAP Composition Environment space, the tool is generic and can be used for the vast majority of Java developments in other areas, PI/PO being one of them.

 

On a high level, the process of using JLin consists of the following principal steps:

  1. (Optional) Define JLin initial (general) preferences and properties like optional export of results to an external file, priority threshold, etc.;
  2. (Optional) Define custom JLin variant containing selected checks from a list of checks shipped with NWDS and their priorities;
  3. Create JLin launch configuration for an assessed Java project using default or custom JLin variant;
  4. Run created JLin configuration;
  5. Review and analyze JLin tests results.

 

 

JLin initial preferences and variant configuration

A default JLin configuration already contains the commonly used major checks, but if you need to enable or disable some checks or change their priority, then in NWDS go to menu Window > Preferences. In the Preferences window, select Java > JLin:

Preferences - JLin.png

Here, it is possible to check default variant configuration or create a custom one.

 

For a created variant, it is possible to customize general properties such as exporting results to an XML file (helpful for further analysis of JLin results outside of NWDS, especially for machine parsing and processing), the priority threshold, etc.

 

The central aspect of JLin variant configuration is definitely the selection of tests that form the basis of the JLin variant and the scope of checks that will be applied to the examined source code:

JLin variant.png

The set of JLin checks is shipped by SAP as part of NWDS and cannot be extended. If the scope of a specific check is unclear, a description of the test scope and useful explanatory notes are available from the context menu of the check:

JLin check - description.png

For selected checks, it is possible to adjust their priority based on an individual estimation of the severity and impact of the issues they detect in the source code:

JLin check - options.png

 

 

JLin tests launch configuration

If a JLin launch configuration for the analyzed project hasn't been created yet, then in NWDS go to menu Run > Run Configurations. In the Run Configurations window, select JLin and create a new launch configuration (either from the context menu or using the corresponding button). In the newly created JLin launch configuration, specify the examined source code base (for example, a Java project) and the JLin variant (default or the one created in the previous step) to be used for the analysis:

Launch configuration - JLin.png

 

 

JLin tests execution and results analysis

After the initial preferences are configured and the launch configuration is prepared, we are ready to execute the JLin checks, which can be done from the same window that was used to configure the launch configuration in the previous step.

 

JLin outputs check results and general statistics about the executed JLin test to the NWDS Problems view:

JLin results - overview.png

For every issue found, JLin creates a problem marker that can be explored in more detail: the corresponding Java resource can be opened and analyzed, an Eclipse task can be created for the marker, and the description of an issue (retrieved from the definition of the check that was not passed) can be viewed by selecting JLin test description from the context menu of the respective marker:

JLin results - marker in source code.png

Similarly to other problem markers in Eclipse, it is possible to use filters in the Problems view to focus the analysis on specific Java resources or issue types, by selecting View menu > Configure Contents in the Problems view and specifying the required filter options:

JLin results - filter.png

 

As can be seen above, in just a few steps we accomplished the JLin configuration and conducted a basic static source code analysis of a selected Java project - with no need for additional environment setup, external servers or an Internet connection. Even this preliminary code check highlighted several issues and produced generic recommendations for their elimination, which can be addressed before the developed source code reaches further quality assurance phases.

Creating a custom HCI Adapter - Part 1


Introduction

Recently I did some work in SAP HANA Cloud Integration (HCI) and started to fiddle with the available adapters. Although they are capable and cater for most needs, you might find yourself in a situation where you need to create your own adapter.

 

For those wondering: adapters are used to connect external systems to an HCI tenant. An adapter encapsulates the details of connectivity and communication. An integration designer uses the design tool and chooses an appropriate adapter from the available options. The HCI documentation lists all the available adapters.

 

This blog is the result of my experience, which I would like to share. It is a step-by-step guide with screenshots and examples. Also read the SAP documentation on creating HCI adapters for more information.



 

What we'll do

I will show you how to create a new adapter in HCI, deploy the adapter and test that it is working. The adapter will not be based on an existing Camel component; that is the topic of a future blog. We'll create the "echo" adapter that mocks the endpoints by inserting a dummy message.

 

Part 1 - Create a Camel component

Part 2 - Create the HCI adapter

Part 3 - Create integration flow using new adapter and see it working

What you'll need

 

All code used for this blog is available here: GitHub - nicbotha/hci-echo-adapter

 

Part 1 - Create a Camel component

 

Camel components are extension points used for integration. A Camel component creates endpoints that understand the many protocols, data formats, APIs etc. Remember, at runtime HCI uses Camel for the mediation and routing of messages, so we can make use of this extension point and add our own component. The outcome of this part is a Camel component.

 

Step - Create a Camel component using Maven

 

Open a command prompt and run the following maven command:

 

mvn archetype:generate \
  -DarchetypeGroupId=org.apache.camel.archetypes \
  -DarchetypeArtifactId=camel-archetype-component \
  -DarchetypeVersion=2.12.3 \
  -DgroupId=my.domain \
  -DartifactId=echo

-DarchetypeVersion is the most interesting parameter because it determines the camel-core version in the pom file (which you can change later). Make sure it is not greater than the Camel version on your tenant.
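For reference, the camel-core version lives in the generated pom.xml as a normal Maven dependency; the relevant excerpt, using the archetype version chosen above, would look roughly like this:

<dependency>
  <groupId>org.apache.camel</groupId>
  <artifactId>camel-core</artifactId>
  <version>2.12.3</version>
</dependency>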

 

You will be prompted for a name and a scheme. Your component has to have a unique URI scheme that does not match any existing component.

 

The output of this step should be a success message looking like:

mvn-arch-success.png

Step - Create an Eclipse project

 

Staying in the same command prompt navigate to where your new pom file is and run the following maven command:

 

mvn eclipse:eclipse \
-DdownloadSources \
-DdownloadJavadocs

Lines 2 and 3 are optional but I always like to pull the docs and source into my projects

 

The output of this step should be a success message same as previous step.

Step - Import project into Eclipse

 

I'm not going into detail here as you surely know how to import an existing project into Eclipse.  What I'd like to point out is the generated code and project structure.  After importing you should have a project as below.

  1. A unit test is already created and you can run it. It creates a Camel route and asserts that at least one message is present.
  2. The main code contains the component, endpoint and consumer/producer (a sketch of the generated component class follows below the screenshot).
  3. This file is used to map the URI scheme to the component.

 

eclipse-project.png
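To give an idea of what the archetype generates, below is a rough sketch of the component class. The class name follows the scheme chosen during generation, and the body approximates the archetype output rather than copying it exactly:

import java.util.Map;

import org.apache.camel.Endpoint;
import org.apache.camel.impl.DefaultComponent;

// Represents the component that creates and manages echo endpoints
public class echoComponent extends DefaultComponent {

    protected Endpoint createEndpoint(String uri, String remaining, Map<String, Object> parameters) throws Exception {
        // create the endpoint and bind any URI parameters (e.g. delay) onto it
        Endpoint endpoint = new echoEndpoint(uri, this);
        setProperties(endpoint, parameters);
        return endpoint;
    }
}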

Step - Define a URI parameter

 

You can specify URI parameters that will be passed to the component at runtime using the @UriParam annotation. We will add a parameter that allows a user to specify the delay of the consumer. Later you will see how this parameter is used at design time in the editor, allowing the designer to configure a delay time.

 

Edit echoEndpoint.java and add:

 

public class echoEndpoint extends DefaultEndpoint {

    @UriParam
    private long delay;

    // ...

    public long getDelay() {
        return delay;
    }

    public void setDelay(long delay) {
        this.delay = delay;
    }
}
  • the @UriParam annotation marks the field as a URI parameter
  • the delay variable receives the value passed in the URI

 

Edit echoConsumer.java and add:

 

public class echoConsumer extends ScheduledPollConsumer {

    private final echoEndpoint endpoint;

    public echoConsumer(echoEndpoint endpoint, Processor processor) {
        super(endpoint, processor);
        this.endpoint = endpoint;
        setDelay(this.endpoint.getDelay());
    }
  • setDelay(...) applies the delay value that was passed in the URI to the endpoint
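For context, the dummy message mentioned in the introduction comes from the consumer's poll() method. The archetype generates something roughly like the sketch below (an approximation, not an exact copy of the generated file):

@Override
protected int poll() throws Exception {
    Exchange exchange = endpoint.createExchange();

    // create the dummy message body that the echo endpoint will route
    exchange.getIn().setBody("Hello World! The time is " + new java.util.Date());

    try {
        // send the message to the next processor in the route
        getProcessor().process(exchange);
        return 1; // number of messages polled
    } finally {
        // log the exception if one occurred and was not handled
        if (exchange.getException() != null) {
            getExceptionHandler().handleException("Error processing exchange", exchange, exchange.getException());
        }
    }
}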

 

Step - update the test

 

The unit test will not pass at this stage, as the endpoint now expects a parameter in the URI. Let's modify echoComponentTest.java by adding this expected parameter:

 

protected RouteBuilder createRouteBuilder() throws Exception {
    return new RouteBuilder() {
        public void configure() {
            from("echo://foo?delay=1")
                .to("echo://bar")
                .to("mock:result");
        }
    };
}
  • the delay=1 parameter was added to the consumer URI

Step - build and test

 

All that remains is to build the project and ensure the test passes. Open a command prompt, navigate to where the project pom is, and run the following Maven command:

 

mvn clean install

The output of this step should be a success message looking like:


part1-end.png

Conclusion

 

In Part 1 we created a working Camel component. At this point the consumer expects a parameter and will add a dummy message to be routed. In the next part we will create the HCI adapter project.

 

Creating a custom HCI Adapter - Part 2


Overview

 

Part 2 - Create the HCI adapter

 

In Part 1 of this series we created the Camel component. Now we need to wrap that component in an HCI adapter project. We will create the echo adapter using the echo component from Part 1. The outcome of this part is an HCI adapter deployed on a tenant.

 

Step - Create the adapter project

The first thing to do is to create the adapter project. For this you will need to open Eclipse and select File > New > Other... Expand the SAP HANA Cloud Integration section and select Adapter Project.

eclipse-wiz1.png

 

click Next.

eclipse-wiz2.png

fill in the required fields and click Finish

 

Step - Generate metadata

At this step you should have a project looking similar to:

adap-1.png

The project itself contains a few empty folders and a metadata.txt file. Next we want to copy the echo-*.jar from the echo Camel component project into the component folder of the echo adapter project. Once done, your adapter project should look like:

adap-2.png

The next action is to generate the component metadata. You can read all about what component metadata is in the online help, but it is basically the place where you set up the information required by the editor at design time.


Right click on the echo-adapter project and select Generate Component Metadata

 

adap-3.png

Next, right click on the echo-adapter project and select Execute Checks.  The output of this step should be:

 

adap-4.png

Step - Modify metadata.xml

 

Expand the metadata folder and you will see a newly generated file named metadata.xml. Open this file. Notice that the generation tool inspected our echo component and generated appropriate XML content from it. What we will do now is modify it a bit by removing unnecessary elements.

 

There are only two parameters we want users of our adapter to specify: the first URI part and the delay. We can remove the others, as the adapter will use default values. Once done, your metadata.xml should look like:

 

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ComponentMetadata ComponentId="ctype::Adapter/cname::me:echo/version::1.0.0" ComponentName="me:echo" UIElementType="Adapter" IsExtension="false" IsFinal="true" IsPreserves="true" IsDefaultGenerator="true" MetadataVersion="2.0" xmlns:gen="http://www.sap.hci.adk.com/gen">
    <Variant VariantName="Echo Component Sender" gen:RuntimeComponentBaseUri="echo" VariantId="ctype::AdapterVariant/cname::me:echo/tp::echo/mp::echo/direction::Sender" MetadataVersion="2.0" AttachmentBehavior="Preserve">
        <InputContent Cardinality="1" Scope="outsidepool" MessageCardinality="1" isStreaming="false">
            <Content>
                <ContentType>Any</ContentType>
                <Schema xsi:type="xs:string" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"></Schema>
            </Content>
        </InputContent>
        <OutputContent Cardinality="1" Scope="outsidepool" MessageCardinality="1" isStreaming="false">
            <Content>
                <ContentType>Any</ContentType>
                <Schema xsi:type="xs:string" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"></Schema>
            </Content>
        </OutputContent>
        <Tab id="connection">
            <GuiLabels guid="f20e0d5e-6204-42b8-80db-7dc179641528">
                <Label language="EN">Connection</Label>
                <Label language="DE">Connection</Label>
            </GuiLabels>
            <AttributeGroup id="defaultUriParameter">
                <Name xsi:type="xs:string" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">URI Setting</Name>
                <GuiLabels guid="fef82a37-793e-45d2-9c62-261a5a2987fa">
                    <Label language="EN">URI Setting</Label>
                    <Label language="DE">URI Setting</Label>
                </GuiLabels>
                <AttributeReference>
                    <ReferenceName>firstUriPart</ReferenceName>
                    <description>Configure First URI Part</description>
                </AttributeReference>
            </AttributeGroup>
            <AttributeGroup id="echoEndpoint">
                <Name xsi:type="xs:string" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">Echo Endpoint</Name>
                <GuiLabels guid="7a914dcf-ab5e-4dfc-a859-d4709bd7402d">
                    <Label language="EN">Echo Endpoint</Label>
                    <Label language="DE">Echo Endpoint</Label>
                </GuiLabels>
                <AttributeReference>
                    <ReferenceName>delay</ReferenceName>
                    <description>Configure Delay</description>
                </AttributeReference>
            </AttributeGroup>
        </Tab>
    </Variant>
    <Variant VariantName="Echo Component Receiver" gen:RuntimeComponentBaseUri="echo" VariantId="ctype::AdapterVariant/cname::me:echo/tp::echo/mp::echo/direction::Receiver" IsRequestResponse="true" MetadataVersion="2.0" AttachmentBehavior="Preserve">
        <InputContent Cardinality="1" Scope="outsidepool" MessageCardinality="1" isStreaming="false">
            <Content>
                <ContentType>Any</ContentType>
                <Schema xsi:type="xs:string" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"></Schema>
            </Content>
        </InputContent>
        <OutputContent Cardinality="1" Scope="outsidepool" MessageCardinality="1" isStreaming="false">
            <Content>
                <ContentType>Any</ContentType>
                <Schema xsi:type="xs:string" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"></Schema>
            </Content>
        </OutputContent>
        <Tab id="connection">
            <GuiLabels guid="8c1e4365-b434-486a-8ec8-2a8fd370f77f">
                <Label language="EN">Connection</Label>
                <Label language="DE">Connection</Label>
            </GuiLabels>
            <AttributeGroup id="defaultUriParameter">
                <Name xsi:type="xs:string" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">URI Setting</Name>
                <GuiLabels guid="d1004d88-8e54-4f46-a5e7-e6e4d0dca053">
                    <Label language="EN">URI Setting</Label>
                    <Label language="DE">URI Setting</Label>
                </GuiLabels>
                <AttributeReference>
                    <ReferenceName>firstUriPart</ReferenceName>
                    <description>Configure First URI Part</description>
                </AttributeReference>
            </AttributeGroup>
        </Tab>
    </Variant>
    <AttributeMetadata>
        <Name>firstUriPart</Name>
        <Usage>false</Usage>
        <DataType>xsd:string</DataType>
        <Default></Default>
        <Length></Length>
        <IsParameterized>true</IsParameterized>
        <GuiLabels guid="eaa8b525-3e51-4d18-81ca-16ec828567dc">
            <Label language="EN">First URI Part</Label>
            <Label language="DE">First URI Part</Label>
        </GuiLabels>
    </AttributeMetadata>
    <AttributeMetadata>
        <Name>delay</Name>
        <Usage>false</Usage>
        <DataType>xsd:long</DataType>
        <Default></Default>
        <Length></Length>
        <IsParameterized>true</IsParameterized>
        <GuiLabels guid="7630e90d-5982-403b-a217-6a9be329ee04">
            <Label language="EN">Delay</Label>
            <Label language="DE">Delay</Label>
        </GuiLabels>
    </AttributeMetadata>
</ComponentMetadata>

To ensure all is good, run Execute Checks again.


Step - Deploy


So at last we can deploy our adapter to an HCI tenant. Right click on the echo-adapter project and select Deploy Adapter Project.


adap-5.png

In the Deploy Integration Content dialog, select the tenant you want to deploy to and click OK.


adap-6.png


After which you should see the confirmation below:


adap-7.png


Test - Validate successful deployment


Double click on the tenant in the Node Explorer view and then open the Deployed Artifacts view. You might need to refresh the view by clicking on the yellow arrows; if the deployment was a success, the adapter will display as follows.


adap-8.png

Also, let's ensure it started on the worker node by first selecting the node in the Node Explorer and then opening the Component Status View.

 

adap-9.png

 

Conclusion

 

In Part 2 we created an HCI adapter and deployed it to an HCI tenant. At this point the echo adapter is available for use. In the final part we will create an integration flow that uses the adapter and ensure the endpoints are working.
