This is a guest post by Siva Manickam and Prahalathan M from Vyaire Medical Inc.
Vyaire Medical Inc. is a global company, headquartered in suburban Chicago, focused exclusively on supporting breathing through every stage of life. Established from legacy brands with a 65-year history of pioneering breathing technology, the company's portfolio of integrated solutions is designed to enable, enhance, and extend lives.

At Vyaire, our team of 4,000 pledges to advance innovation and evolve what's possible to ensure every breath is taken to its fullest. Vyaire's products are available in more than 100 countries and are recognized, trusted, and preferred by specialists throughout the respiratory community worldwide. Vyaire has a 65-year history of clinical experience and leadership, with over 27,000 unique products and 370,000 customers worldwide.
Vyaire Medical's application landscape includes multiple ERPs, such as SAP ECC, JD Edwards, Microsoft Dynamics AX, SAP Business One, Pointman, and Made2Manage. Vyaire uses Salesforce as its CRM platform and the ServiceMax CRM add-on for managing field service capabilities. Vyaire developed a custom data integration platform, iDataHub, powered by AWS services such as AWS Glue, AWS Lambda, and Amazon API Gateway.
In this post, we share how we extracted data from SAP ERP using AWS Glue and the SAP SDK.
Business and technical challenges
Vyaire is working on deploying the field service management solution ServiceMax (SMAX, built natively on the SFDC ecosystem), offering features and services that help Vyaire's Field Services team improve asset uptime with optimized in-person and remote service, boost technician productivity with the latest mobile tools, and deliver metrics for confident decision-making.
A major challenge in the ServiceMax implementation is building a data pipeline between the ERP systems and the ServiceMax application, specifically integrating pricing, orders, and primary data (product, customer) from SAP ERP into ServiceMax using Vyaire's custom-built integration platform, iDataHub.
Solution overview
Vyaire's iDataHub, powered by AWS Glue, has been used effectively for data movement between SAP ERP and ServiceMax.
AWS Glue is a serverless data integration service that makes it easy to discover, prepare, and combine data for analytics, machine learning (ML), and application development. It's used in Vyaire's enterprise iDataHub platform to facilitate data movement across different systems; however, the focus of this post is the integration between SAP ERP and Salesforce SMAX.
The following diagram illustrates the integration architecture between Vyaire's Salesforce ServiceMax and SAP ERP system.

In the following sections, we walk through setting up a connection to SAP ERP using AWS Glue and the SAP SDK via remote function calls. The high-level steps are as follows:
- Clone the PyRFC module from GitHub.
- Set up the SAP SDK on an Amazon Elastic Compute Cloud (Amazon EC2) machine.
- Create the PyRFC wheel file.
- Merge the SAP SDK files into the PyRFC wheel file.
- Test the connection with SAP using the wheel file.
Prerequisites
For this walkthrough, you should have the following:
Clone the PyRFC module from GitHub
For instructions on creating and connecting to an Amazon Linux 2 AMI EC2 instance, refer to Tutorial: Get started with Amazon EC2 Linux instances.
The reason we choose an Amazon Linux EC2 instance is to compile the SDK and PyRFC in a Linux environment, which is compatible with AWS Glue.
At the time of writing this post, AWS Glue's latest supported Python version is 3.7. Make sure that the Amazon EC2 Linux Python version and the AWS Glue Python version are the same. In the following instructions, we install Python 3.7 on Amazon EC2; you can follow the same instructions to install future versions of Python.
- In the bash terminal of the EC2 instance, run the following command:
- Log in to the Linux terminal, install git, and clone the PyRFC module using the following commands:
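For example, on Amazon Linux 2 the stock python3 package is Python 3.7.x, so a minimal install could look like the following (package names are assumptions for that AMI; adjust for yours):

```shell
# Update packages and install Python 3.7 (the default python3 on Amazon Linux 2)
sudo yum update -y
sudo yum install -y python3
python3 --version
```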
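A minimal sketch of these commands follows; the aws_to_sap working folder name matches the later steps, and the repository URL is SAP's public PyRFC repository:

```shell
# Install git, create a working folder, and clone the PyRFC repository
sudo yum install -y git
mkdir -p ~/aws_to_sap && cd ~/aws_to_sap
git clone https://github.com/SAP/PyRFC.git
```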
Set up the SAP SDK on an Amazon EC2 machine
To set up the SAP SDK, complete the following steps:
- Download the nwrfcsdk.zip file from a licensed SAP source to your local machine.
- In a new terminal, run the following command on the EC2 instance to copy the nwrfcsdk.zip file from your local machine to the aws_to_sap folder:
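For example, using scp from your local machine (the key file and hostname are placeholders):

```shell
# Copy the SAP SDK archive from your local machine to the EC2 instance
scp -i <your-key.pem> nwrfcsdk.zip ec2-user@<ec2-public-dns>:~/aws_to_sap/
```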
- Unzip the nwrfcsdk.zip file in the current EC2 working directory and verify the contents:
unzip nwrfcsdk.zip

- Configure the SAP SDK environment variable SAPNWRFC_HOME and verify the contents:
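For example (the SDK path assumes you unzipped the archive into the aws_to_sap folder):

```shell
# Point SAPNWRFC_HOME at the unzipped SDK and verify its libraries
export SAPNWRFC_HOME=/home/ec2-user/aws_to_sap/nwrfcsdk
echo $SAPNWRFC_HOME
ls $SAPNWRFC_HOME/lib
```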

Create the PyRFC wheel file
Complete the following steps to create your wheel file:
- On the EC2 instance, install the Python modules cython and wheel for generating wheel files using the following command:
- Navigate to the PyRFC directory you created and run the following command to generate the wheel file:
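For example:

```shell
# Install the build dependencies needed to generate the wheel file
python3 -m pip install --user cython wheel
```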
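A sketch of this step, assuming the clone location from earlier:

```shell
# Build the PyRFC wheel; the output lands in the dist/ folder
cd ~/aws_to_sap/PyRFC
python3 setup.py bdist_wheel
ls dist/
```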

Verify that the pyrfc-2.5.0-cp37-cp37m-linux_x86_64.whl wheel file is created in the PyRFC/dist folder, as in the following screenshot. Note that you may see a different wheel file name based on the latest PyRFC version on GitHub.


Merge SAP SDK files into the PyRFC wheel file
To merge the SAP SDK files, complete the following steps:
- Unzip the wheel file you created:
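For example:

```shell
# Unpack the wheel in place so its contents can be modified
cd ~/aws_to_sap/PyRFC/dist
unzip pyrfc-2.5.0-cp37-cp37m-linux_x86_64.whl
```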

- Copy the contents of lib (the SAP SDK files) to the pyrfc folder:
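For example:

```shell
# Bundle the SAP SDK shared libraries alongside the pyrfc extension
cp $SAPNWRFC_HOME/lib/* pyrfc/
```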

Now you can update the rpath of the SAP SDK binaries using the PatchELF utility, a simple utility for modifying existing ELF executables and libraries.
- Install the supporting dependencies (gcc, gcc-c++, python3-devel) for the Linux utility PatchELF:
- Download and install PatchELF:
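For example:

```shell
# Install the toolchain needed to build PatchELF from source
sudo yum install -y gcc gcc-c++ python3-devel
```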
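A sketch of this step; 0.12 is an assumed version, and any recent PatchELF release should work:

```shell
# Build and install PatchELF from source
wget https://github.com/NixOS/patchelf/releases/download/0.12/patchelf-0.12.tar.bz2
tar -xjf patchelf-0.12.tar.bz2
cd patchelf-0.12
./configure && make && sudo make install
```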
- Run patchelf:
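For example, setting each bundled library's rpath to $ORIGIN so the SDK libraries resolve one another from inside the wheel at runtime:

```shell
# Patch the rpath of every shared library copied into the pyrfc folder
cd ~/aws_to_sap/PyRFC/dist/pyrfc
for f in *.so*; do patchelf --set-rpath '$ORIGIN' "$f"; done
```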
- Update the wheel file with the modified pyrfc and dist-info folders:
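For example (the dist-info folder name follows the wheel version shown earlier):

```shell
# Repackage the wheel with the patched libraries included
cd ~/aws_to_sap/PyRFC/dist
zip -r pyrfc-2.5.0-cp37-cp37m-linux_x86_64.whl pyrfc pyrfc-2.5.0.dist-info
```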

- Copy the wheel file pyrfc-2.5.0-cp37-cp37m-linux_x86_64.whl from Amazon EC2 to Amazon Simple Storage Service (Amazon S3):
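For example (the bucket name and ec2-dump prefix match the job parameter used later):

```shell
# Upload the wheel so AWS Glue can install it via --additional-python-modules
aws s3 cp pyrfc-2.5.0-cp37-cp37m-linux_x86_64.whl s3://<bucket_name>/ec2-dump/
```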


Test the connection with SAP using the wheel file
The following is working sample code to test the connectivity between the SAP system and AWS Glue using the wheel file.
- On the AWS Glue Studio console, choose Jobs in the navigation pane.
- Select Spark script editor and choose Create.

- Overwrite the boilerplate code with the following code on the Script tab:
- On the Job details tab, fill in the mandatory fields.
- In the Advanced properties section, provide the S3 URI of the wheel file in the Job parameters section as a key-value pair:
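A minimal sketch of such a script follows; the connection parameters are placeholders, and the standard RFC_READ_TABLE function module reading the MARA table stands in for the custom RFC the job actually calls:

```python
from pyrfc import Connection

# SAP connection parameters -- placeholder values, replace with your system's
conn = Connection(
    ashost="<sap_host>",
    sysnr="00",
    client="100",
    user="<sap_user>",
    passwd="<sap_password>",
)
print(f"SAP Connection successful - connection object: {conn}")

# Read a few rows via the standard RFC_READ_TABLE function module
# (the actual job calls a custom RFC instead)
result = conn.call("RFC_READ_TABLE", QUERY_TABLE="MARA", ROWCOUNT=5)
print("Successfully extracted data from SAP - printing the top 5 rows")
for row in result["DATA"]:
    print(row["WA"])
```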
  - Key – --additional-python-modules
  - Value – s3://<bucket_name>/ec2-dump/pyrfc-2.5.0-cp37-cp37m-linux_x86_64.whl (provide your S3 bucket name)
- Save the job and choose Run.
Verify SAP connectivity
Complete the following steps to verify SAP connectivity:
- When the job run is complete, navigate to the Runs tab on the Jobs page and choose Output logs in the logs section.
- Choose the job_id and open the detailed logs.
- Note the message SAP Connection successful – connection object: <connection object>, which confirms a successful connection with the SAP system.
- Note the message Successfully extracted data from SAP using custom RFC – Printing the top 5 rows, which confirms successful retrieval of data from the SAP system.

Conclusion
AWS Glue facilitated the data extraction, transformation, and loading process from different ERPs into Salesforce SMAX, improving the visibility of Vyaire's products and related information to service technicians and tech support users.
In this post, you learned how you can use AWS Glue to connect to SAP ERP utilizing SAP SDK remote functions. To learn more about AWS Glue, check out the AWS Glue Documentation.
About the Authors
Siva Manickam is the Director of Enterprise Architecture, Integrations, Digital Research & Development at Vyaire Medical Inc. In this role, Mr. Manickam is responsible for the company's corporate functions (Enterprise Architecture, Enterprise Integrations, Data Engineering) and product function (Digital Innovation Research and Development).
Prahalathan M is the Data Integration Architect at Vyaire Medical Inc. In this role, he's responsible for end-to-end enterprise solutions design, architecture, and modernization of integrations and data platforms using AWS cloud-native services.
Deenbandhu Prasad is a Senior Analytics Specialist at AWS, specializing in big data services. He's passionate about helping customers build modern data architectures on the AWS Cloud. He has helped customers of all sizes implement data management, data warehouse, and data lake solutions.
