
source link: https://blogs.sap.com/2021/10/06/hands-on-tutorial-infuse-your-processes-with-machine-learning-through-the-btp-kyma-runtime-sap-integration-suite/
October 6, 2021 8 minute read

Hands-On Tutorial: Infuse your Processes with Machine Learning through the BTP Kyma Runtime & SAP Integration Suite

When I was a student, life was simpler. Most of the data was sent to me in an Excel or CSV file by a PhD student. Of course, we then explored the data, did a lot of statistics, and created a machine learning model. To show the results, the standard plot function was enough, and I didn’t have to think about deployment or process integration. If that sounds familiar to you, let me know in the comments 🙂 Now, to deploy my work into the business I face a broad range of challenges. First, the results need to be consumable for different users and fit into the business process. Further, based on the results, additional processes need to be triggered or even changed. In this blog post I want to show you how life can still be simple when we have the right tools at hand.

Therefore, we will use two very powerful solutions available in the SAP Business Technology Platform: the Kyma Runtime and the SAP Integration Suite. The Kyma Runtime gives us a lot of flexibility to create extensions or deploy Docker images containing, e.g., machine learning models or custom Python and R functions. Through the SAP Integration Suite, specific processes can be created which make use of the functions developed in the Kyma Runtime. Sounds complex? Let me guide you through it.

Imagine a use case in which we want to find fraudulent transactions through machine learning. The standalone machine learning model is useless if it is not integrated into business processes. Our goal is to embed this machine learning logic directly into a process, so that the execution depends on the prediction. For instance, we might want to stop the transaction if the probability of a fraudulent case is above a certain threshold. Further, an employee should get a notification with a collection of the predicted fraud cases. Of course, we want to automate this as much as possible so that false positives don’t blow up the mailbox. In short, we need to orchestrate the usage of this machine learning model. Hence, let’s take a look at a simple example of how a data scientist can hand over a machine learning model deployed in the Kyma Runtime to a developer in the SAP Integration Suite.
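The decision logic described above can be sketched in a few lines of Python. Note that the threshold value and the function and step names here are illustrative assumptions, not part of the tutorial:

```python
FRAUD_THRESHOLD = 0.8  # illustrative threshold, not prescribed by the tutorial

def route_transaction(fraud_probability: float) -> str:
    """Decide the follow-up process step based on the model's prediction."""
    if fraud_probability >= FRAUD_THRESHOLD:
        # Block the transaction and add it to the fraud notification.
        return "stop_and_notify"
    # Let the transaction pass without bothering anyone's mailbox.
    return "approve"
```

In the remainder of the tutorial, exactly this kind of routing is what the Integration Flow takes over from the standalone model.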

What will you learn in this Hands-On tutorial?

  1. Set up the REST API and Authentication in the Kyma Runtime
  2. Set up the Authentication in the SAP Integration Suite
  3. Create a first process in the SAP Integration Suite

What are the requirements?

  1. Set up the REST API and Authentication in the Kyma Runtime

Let’s start in the Kyma Runtime, where the deployed machine learning model can be consumed through a REST API. In the following, you will set up the REST API, protected by OAuth2 authentication.

First, move to the namespace in which you deployed your Docker image or function and set up the authentication for the API rule.

1-34.png

On the left choose “OAuth Clients” and create a new one.

2-8.png

Give it a name and choose “ID token” as well as “Client credentials”. Further, enter “read” as value into the Scope. Then click “Create”.

Please take note of the decoded secret and Client ID. You will need these credentials in the second step 🙂

3-6.png

Now, move to “API Rules” and create a new one. Through the API rule, we will be able to generate predictions from the deployed machine learning model on the fly.

4-6.png

Provide a name and a hostname for your API rule. In addition, choose OAuth2 as the access strategy and find your service. Then choose “Create”.

5-7.png

As the last step in the Kyma Runtime, save the certificate of the website as a file. On Windows, using Google Chrome, click the lock icon next to the URL, choose “Certificate” and then “Copy to File”.
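If you prefer a script over the browser export, the certificate can also be fetched with Python’s standard library. This is just a sketch; the host and file names are placeholders, not values from the tutorial:

```python
import ssl

def fetch_kyma_certificate(host, port=443, out_path="kyma.pem"):
    """Save the server certificate of the API rule host in PEM format."""
    pem = ssl.get_server_certificate((host, port))
    with open(out_path, "w") as f:
        f.write(pem)
    return out_path

# Example (placeholder host):
# fetch_kyma_certificate("my-api-rule.my-cluster.kyma.ondemand.com")
```

The resulting PEM file is what you will import into the Integration Suite keystore in the next step.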

6-5.png

Perfect! You already finished the first step of this Hands-On tutorial in the Kyma Runtime.

  2. Set up the Authentication in the SAP Integration Suite

In the second step you will set up the authentication in the SAP Integration Suite, such that you can use the API rule created in the Kyma Runtime in an Integration Flow. Hence, move to your SAP Integration Suite and choose the Cloud Integration Scenario.

7-5.png

8-6.png

First, move to the Monitor area where you can add the certificate and authentication. Under Manage Security go to “Keystore”.

9-6.png

Choose “Certificate” under “Add” on the right.

10-5.png

Provide a name and browse for the Kyma certificate. Then press “Add”.

11-3.png

Next, add the OAuth Credentials. Move to “Manage Security Material”.

12-4.png

Click “Create” and choose “OAuth 2 Client Credentials”.

13-5.png

14-3.png

Provide a name and a description. Then, add the Token Service URL, which consists of:

https://oauth2.<Domain>/oauth2/token

Further, copy the decoded Client ID and Secret from the Kyma Runtime into the corresponding fields. Change the Client Authentication to “Send as Request Header”. Check “Include Scope” and enter “read” as the Scope. Set the Content Type to “application/x-www-form-urlencoded”. Then click “Deploy”.
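Under the hood, these settings describe a standard OAuth2 client-credentials token request. As a sketch, assuming a placeholder domain and credentials, the request that gets sent to the token endpoint could be reproduced in Python like this:

```python
import base64
from urllib import parse, request

def build_token_request(domain, client_id, client_secret, scope="read"):
    """Build the client-credentials token request against the Kyma OAuth2 server."""
    url = f"https://oauth2.{domain}/oauth2/token"
    body = parse.urlencode({"grant_type": "client_credentials",
                            "scope": scope}).encode()
    req = request.Request(url, data=body, method="POST")
    # "Send as Request Header": client id and secret go into an HTTP Basic header.
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    req.add_header("Authorization", f"Basic {creds}")
    req.add_header("Content-Type", "application/x-www-form-urlencoded")
    return req

# urllib.request.urlopen(build_token_request(...)) would return a JSON
# response containing the access token.
```

The Integration Suite handles all of this for you once the security material is deployed; the sketch only shows what the configured fields map to.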

15-2.png

Great! You finished the second step successfully 😃

  3. Create a first process in the SAP Integration Suite

Now, let’s tackle the final step and bring everything together in a first Integration Flow. Therefore, move to the Design area and create a new package.

16-3.png

17-3.png

In the package move to Artifacts and add a new Integration Flow.

18-3.png

Give the Integration Flow a name & ID and click “OK”.

19-2.png

Click on your Integration Flow.

20-1.png

Remove the start message in your Integration Flow.

21-1.png

22-1.png

Further, add a Timer as the start for the Integration Flow.

23.png

Join the “Start Timer” and “End” through a connection.

24.png

Under External Call choose the “Request Reply” operator and add it to the Integration Flow.

25.png

Further, drag & drop a “Receiver” under the Integration Flow.

26.png

Then connect the Request Reply with the Receiver.

27.png

Choose “HTTP” as the Adapter Type.

28.png

Pull the HTTP configuration up to add the connection.

29.png

Add the “Address” as well as the “Query” separately into the corresponding fields.

Address: <your REST API>/predict
Query: c=0&a=4900&obo=10000&nbo=5100&obd=1000&nbd=5900&dl=1

The query encodes a new observation, which is sent to the machine learning model through the GET request. The result is a prediction for this new observation, which is incorporated in the message payload as a string. You can change the values of the transaction in the query to get different predictions.
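The call configured above can be sketched in Python. The API host is a placeholder; the field names `c`, `a`, `obo`, `nbo`, `obd`, `nbd`, and `dl` are taken directly from the query:

```python
from urllib.parse import urlencode

# Transaction fields of the new observation, taken from the tutorial's query.
observation = {"c": 0, "a": 4900, "obo": 10000, "nbo": 5100,
               "obd": 1000, "nbd": 5900, "dl": 1}

def build_predict_url(api_host, params):
    """Build the GET request URL against the /predict endpoint of the API rule."""
    return f"https://{api_host}/predict?{urlencode(params)}"

# Example (placeholder host):
# build_predict_url("my-api-rule.example.com", observation)
```

Changing a value in `observation` corresponds to editing the Query field of the HTTP adapter.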

To finish the configuration please mark “Send Body” and choose “OAuth2 Client Credentials” as Authentication. Provide your “Credential Name”.

30.png

Now, add a “Groovy Script” after the “Request Reply” into your Integration Flow.

31.png

32.png

33.png

Click on the “Groovy Script” and choose “Create”.

34.png

By default, a comprehensive script is provided to work with the process data.

35.png

Of course, you now have a lot of flexibility in working with the message. Recall that the application returns the following string after creating a prediction:

'The predicted result for the observation ' + str(observation) + ' is: ' + str(prediction)

In the following script, the prediction is extracted from the string by reducing the payload to the substring between index positions 93 and 95.
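Fixed index positions break as soon as the observation changes its length. A more robust alternative is to split on the “is: ” marker instead. Here is a Python sketch of that extraction logic; in the Integration Flow itself it would of course have to be written as a Groovy script:

```python
def extract_prediction(payload: str) -> str:
    """Extract the prediction from the application's response string."""
    # The application returns e.g.
    # "The predicted result for the observation [...] is: 0"
    # Splitting on the final "is: " marker works for any observation length.
    return payload.rsplit("is: ", 1)[-1].strip()
```

The same one-liner translates directly to Groovy via `substring` after `lastIndexOf`.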

36.png

Save the script and then deploy the Integration Flow.

37.png

Move to the Monitoring area to enable the trace of the Integration Flow to get more information after the execution.

38.png

Choose “All” under “Manage Integration Content”. Click on your Integration Flow.

39.png

Change the Log Configuration to “Trace”.

40.png

Please, deploy your Integration Flow again.

41.png

Then choose “All Integration Flows” under Monitor Message Processing.

42-1.png

Click on the newest deployment of your Integration Flow and choose “Trace”.

43-1.png

44-1.png

Choose the “End” of your Integration Flow and click on “Message Content”. Under “Payload” we find our prediction for the observation. A value of zero means that the machine learning model predicts the transaction is not fraudulent.

45-1.png

Congratulations! You successfully extracted the prediction incorporated in the message body. Hence, you now have the basis to extend the Integration Flow as needed for your business. For example, you could persist the prediction in SAP HANA Cloud through a JDBC connection, create custom notifications, or make the whole Integration Flow more dynamic. In addition, there are many prepared content packages available in the Discovery Center. If you want to explore more, the following blog posts and tutorials really helped me to get started:

I want to thank Gaurav Abbi, Sven Huberti and Sarah Detzler for their support while writing this Hands-On tutorial.

Cheers!

Yannick Schaper
