SAP Accelerator Business Hub
Interview question: Can we connect a dev S/4HANA and a QA S/4HANA to a single CPI tenant?
I need to check whether we can point multiple ERP systems to the same CPI tenant. E.g., can the SBX and HAQ systems of S/4HANA be connected to the Dev CPI tenant?
You need to publish events from S/4HANA to third-party systems:
Events: create/update/delete –
S/4HANA -> Event Mesh -> (via AMQP adapter) -> CPI (transformations) -> third-party system
S/4HANA to Event Mesh: an event can be triggered in two ways, manually or automatically.
SAP S/4HANA offers different types of APIs: OData, SOAP, and REST.
We cannot wait until the ABAPer triggers the IDoc messages, because we need to start development. Instead, we add a Start Timer, paste a sample payload into the body section of a Content Modifier, and start developing the iFlow.
The ABAPer will then write the code for the IDoc to be triggered in S/4HANA.
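The sample-payload trick above can be sketched: the snippet below holds a hypothetical, heavily trimmed ORDERS05-style IDoc XML (the kind of payload you might paste into the Content Modifier body) and reads the control-record fields the way a first mapping step would. Segment names follow the standard ORDERS05 layout, but all values are made up.

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily trimmed ORDERS05-style payload -- the kind of sample
# you would paste into a Content Modifier body to develop the iFlow before
# the ABAP side can trigger real IDocs.
SAMPLE_IDOC = """<?xml version="1.0" encoding="UTF-8"?>
<ORDERS05>
  <IDOC BEGIN="1">
    <EDI_DC40 SEGMENT="1">
      <IDOCTYP>ORDERS05</IDOCTYP>
      <MESTYP>ORDERS</MESTYP>
      <SNDPRT>LS</SNDPRT>
    </EDI_DC40>
    <E1EDK01 SEGMENT="1">
      <CURCY>EUR</CURCY>
      <BELNR>4500000123</BELNR>
    </E1EDK01>
  </IDOC>
</ORDERS05>"""

def idoc_summary(payload: str) -> dict:
    """Extract basic control-record fields, as a first mapping step might."""
    root = ET.fromstring(payload)
    ctrl = root.find("./IDOC/EDI_DC40")
    return {
        "idoc_type": ctrl.findtext("IDOCTYP"),
        "message_type": ctrl.findtext("MESTYP"),
        "document": root.findtext("./IDOC/E1EDK01/BELNR"),
    }

print(idoc_summary(SAMPLE_IDOC))
```

Once the ABAP side is ready, the Timer + Content Modifier pair is simply replaced by the real IDoc sender channel; the rest of the iFlow stays unchanged.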
--------------------------------------------------------------------------------------------------------------------------------------
SAP S/4HANA exposes OData services (Open Data Protocol) that allow external systems to interact with
the data in a standardized way. You can use OData services to read and write data to/from S/4HANA.
Connectivity:
You can use CPI to call these OData services using HTTP-based communication.
S/4HANA exposes OData services using SAP Gateway, and CPI can consume these services.
S/4HANA provides SOAP-based web services for integration, exposed via SAP Gateway or through predefined service interfaces. You can consume these web services from CPI using the SOAP adapter.
Connectivity:
CPI can connect to S/4HANA's IDoc interface using the IDoc adapter in CPI.
This allows for both inbound and outbound IDocs to/from S/4HANA.
Connectivity:
CPI uses the RFC adapter to call RFC-enabled function modules directly in S/4HANA.
Connectivity:
CPI can read/write files from/to S/4HANA's directories using the File, FTP, or SFTP adapters.
1. OData Services:
Connectivity:
S/4HANA exposes OData endpoints, and CPI can consume these services using the OData
adapter.
CPI can send and receive data from these services via HTTP(S).
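As a concrete illustration of calling such a service over HTTP(S), the sketch below builds an OData V2 query URL with $filter/$select/$top options. The host is hypothetical; API_SALES_ORDER_SRV / A_SalesOrder is the standard sales-order service, used here only as an example.

```python
from urllib.parse import quote

def odata_query_url(host: str, service: str, entity: str, **options) -> str:
    """Build an OData V2 query URL with $-options such as $filter and $top."""
    base = f"https://{host}/sap/opu/odata/sap/{service}/{entity}"
    query = "&".join(f"${name}={quote(str(value))}" for name, value in options.items())
    return f"{base}?{query}" if query else base

url = odata_query_url(
    "s4hana.example.com",     # hypothetical host
    "API_SALES_ORDER_SRV",    # standard sales-order service, as an example
    "A_SalesOrder",
    filter="SalesOrganization eq '1000'",
    select="SalesOrder,TotalNetAmount",
    top=5,
)
print(url)
```

This is exactly the URL shape you would also paste into a browser or the Gateway Client when verifying the service manually.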
2. IDoc (Intermediate Document): IDocs are commonly used for asynchronous communication in SAP environments. S/4HANA can send IDocs to CPI for processing.
3. SOAP Web Services: S/4HANA provides SOAP-based web services to expose functionality, which can be consumed by external systems like CPI. These services provide a synchronous way of exchanging data.
Connectivity:
S/4HANA exposes SOAP web services that CPI can call using the SOAP adapter.
CPI can consume these services over HTTP/HTTPS and provide real-time responses.
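To make the SOAP exchange concrete, here is a minimal sketch that wraps a request fragment in a SOAP 1.1 envelope, which is essentially what the SOAP adapter does before posting to the service endpoint. The BusinessPartnerRequest element is hypothetical; real element names come from the service's WSDL.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_envelope(body_xml: str) -> bytes:
    """Wrap an XML fragment in a minimal SOAP 1.1 envelope."""
    ET.register_namespace("soapenv", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    ET.SubElement(env, f"{{{SOAP_NS}}}Header")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    body.append(ET.fromstring(body_xml))
    return ET.tostring(env, xml_declaration=True, encoding="utf-8")

# Hypothetical request fragment; real element names come from the WSDL.
payload = soap_envelope("<BusinessPartnerRequest><Id>1000001</Id></BusinessPartnerRequest>")
print(payload.decode())
```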
4. RFC (Remote Function Call): S/4HANA supports RFC-based communication, allowing external systems like CPI to invoke remote function modules (RFMs) in S/4HANA.
Connectivity:
S/4HANA can expose remote function calls, and CPI can invoke these using the RFC adapter.
This method is common for ERP-to-ERP communication and integration with SAP back-end
processes.
5. REST APIs: S/4HANA supports RESTful APIs, which are more lightweight and flexible than SOAP-based services.
Connectivity:
S/4HANA exposes REST APIs, and CPI can interact with these using the REST adapter.
This enables faster and more flexible communication, especially for scenarios requiring JSON
payloads.
6. File-based Integration (FTP, SFTP, or File Adapter): For scenarios where batch processing or file-based transfers are required, S/4HANA can push files (like CSV, XML) to CPI via FTP or SFTP.
Connectivity:
CPI can listen for files in FTP/SFTP locations, and S/4HANA can upload files to these locations for
further processing in CPI.
Common for integration of transactional data like invoices, orders, etc.
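A minimal sketch of the file-based pattern: the producing side writes transactional rows as CSV (here into an in-memory buffer standing in for the SFTP drop folder), and the consuming side parses them back for mapping. The field layout is illustrative, not a standard format.

```python
import csv
import io

# Hypothetical invoice rows such as S/4HANA might drop on an SFTP server
# for CPI to pick up; field names are illustrative, not a standard layout.
rows = [
    {"invoice": "90000001", "partner": "1000001", "amount": "150.00", "currency": "EUR"},
    {"invoice": "90000002", "partner": "1000002", "amount": "75.50", "currency": "EUR"},
]

# Producer side: write the CSV file (StringIO stands in for the drop folder).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["invoice", "partner", "amount", "currency"])
writer.writeheader()
writer.writerows(rows)

# Consumer (CPI) side: parse the file content back into records for mapping.
parsed = list(csv.DictReader(io.StringIO(buf.getvalue())))
total = sum(float(r["amount"]) for r in parsed)
print(len(parsed), total)
```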
Before testing, ensure that the web service is available and correctly exposed in your S/4HANA system.
To find and identify the web service:
1. Transaction Code: SOAMANAGER – This is the main tool for managing and testing web services
in SAP.
o Go to SOAMANAGER (Transaction SOAMANAGER).
o Find the service definition by searching for the web service (SOAP or REST) using the
available options in the interface.
o You can either check for standard SAP services or custom services that are developed.
2. SOAMANAGER: After identifying the service, you can check if the web service is configured and activated.
o In SOAMANAGER, go to the Service Administration section.
o Check if the web service you want to test is active.
You can use a WSDL (Web Service Description Language) to test the web service:
Alternatively, you can use Postman (usually for RESTful services but can be used for SOAP as well).
If you are testing a RESTful web service (which is common for newer S/4HANA applications), the process
is simpler as REST services use HTTP methods like GET, POST, PUT, DELETE.
1. Identify the Endpoint: In S/4HANA, REST services are typically exposed through OData services,
which can be tested using Postman.
o In SOAMANAGER, you can locate the RESTful API or OData service that you want to test.
Alternatively, if it's a custom REST API, the URL endpoint will be available in the service
definition.
For OData services or RESTful services exposed via the SAP Gateway, you can test them directly within
S/4HANA using the SAP Gateway Client:
1. Go to Transaction /IWFND/GW_CLIENT.
2. Enter the Service URL of the OData or REST service you want to test.
3. Select HTTP Method (GET, POST, PUT, DELETE).
4. Send Request: If the service requires parameters, you can pass them in the request body or URL.
5. Review the response for correctness, and check for any issues or error messages.
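The same test can be prepared outside the GUI; the sketch below constructs (but does not send) the HTTP request you would otherwise enter in /IWFND/GW_CLIENT. The host and service are hypothetical; the X-CSRF-Token: Fetch header is the standard SAP Gateway way to obtain a CSRF token before modifying calls.

```python
import urllib.request

# Hypothetical endpoint mirroring what you would paste into /IWFND/GW_CLIENT.
url = ("https://s4hana.example.com/sap/opu/odata/sap/"
       "API_BUSINESS_PARTNER/A_BusinessPartner?$top=1&sap-client=100")

req = urllib.request.Request(url, method="GET")
req.add_header("Accept", "application/json")
# For modifying calls (POST/PUT/DELETE) a CSRF token is required first:
# fetch it with a GET carrying the header below, then echo it back.
req.add_header("X-CSRF-Token", "Fetch")

print(req.get_method(), req.get_header("Accept"))
# The request is only constructed here, not sent -- sending it would need a
# reachable host and credentials: urllib.request.urlopen(req)
```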
SOAP UI: For SOAP web services, you can import the WSDL and test operations easily.
Postman: For both SOAP and RESTful web services, Postman is an easy tool to simulate HTTP
requests and analyze responses.
SAP Gateway Client (/IWFND/GW_CLIENT): Useful for testing OData and RESTful APIs directly in
the S/4HANA system.
SOAMANAGER: For configuring and testing SOAP-based web services, and for monitoring web
service usage.
In SOAMANAGER, you can also use the Testing and Monitoring tab to monitor web service
operations and troubleshoot any issues.
WSDL Testing: You can test the operation by invoking the service directly through the Test
Service functionality in SOAMANAGER.
1. SM21 (System Log) – Check for any errors related to web service calls.
2. ST22 (ABAP Dump Analysis) – If there are ABAP dumps during the execution of the web service.
3. SOST (SAPconnect Send Requests) – For issues related to email or communication with external
systems.
4. Transaction: /IWFND/ERROR_LOG – Check this log for any OData service errors or issues during
request processing.
-------------------------------------------------------------------------------------------------------------------------------------------
When connecting SAP Cloud Platform Integration (CPI) to SAP S/4HANA, there are several adapters
that can be used depending on the type of communication required (e.g., SOAP, REST, IDocs, OData,
etc.). These adapters enable CPI to communicate seamlessly with S/4HANA for various integration
scenarios.
1. SOAP Adapter: It allows CPI to consume or expose SOAP web services. This is commonly used
for synchronous communication where real-time interaction between CPI and S/4HANA is
needed.
Typical Scenarios: Calling SOAP-based APIs in S/4HANA for synchronous operations such as
creating or updating records in real-time.
2. REST Adapter
Use Case: The REST adapter is used to exchange data between CPI and RESTful APIs exposed by S/4HANA (e.g., OData services).
Configuration: The REST adapter in CPI will interact with REST-based services exposed by S/4HANA,
which can include OData and custom REST APIs.
3. IDoc Adapter
Use Case: The IDoc adapter is used for asynchronous communication in scenarios where S/4HANA needs to send or receive IDocs (Intermediate Documents).
Configuration: You can configure the IDoc sender or receiver in CPI to connect with S/4HANA for IDoc-
based integration.
Typical Scenarios: Batch processing or asynchronous integration where business documents such as
purchase orders, invoices, or inventory updates need to be sent/received between CPI and S/4HANA.
4. OData Adapter
Use Case: The OData adapter is used to integrate with OData services in S/4HANA. OData
(Open Data Protocol) is a web protocol for querying and updating data, which is widely used in
SAP S/4HANA and other SAP cloud applications.
Functionality: This adapter allows CPI to expose and consume OData services, enabling
seamless communication for both synchronous and asynchronous operations.
Typical Scenarios: Used for real-time interaction with OData-based APIs in S/4HANA, including
CRUD operations (Create, Read, Update, Delete) on data such as sales orders, material master,
etc.
Configuration: In CPI, the OData adapter is used to interact with OData services exposed by
S/4HANA, typically for data queries and updates.
5. RFC Adapter
Use Case: The RFC (Remote Function Call) adapter is used to call RFC-enabled function
modules in SAP S/4HANA from CPI.
Functionality: It allows CPI to invoke remote function modules (RFMs) directly in S/4HANA.
RFCs are typically used for synchronous interactions between systems.
Typical Scenarios: Synchronous communication where CPI needs to invoke SAP function
modules (e.g., for creating a material, processing orders, or updating data in S/4HANA).
Configuration: The RFC adapter is configured in CPI to connect with RFC-enabled function
modules in S/4HANA. This typically requires setting up RFC destinations on the SAP S/4HANA
system.
6. File Adapter
Use Case: The File adapter is used for integrating CPI with systems via file-based exchange. This is commonly used for transferring files (such as CSV, XML, or flat files) between S/4HANA and external systems.
Functionality: It allows CPI to read/write files from/to an FTP or SFTP server and process the
data as needed. This is often used in batch processing scenarios.
Typical Scenarios: File-based integration, such as transferring invoices, orders, or other data
from S/4HANA to external systems, or importing data into S/4HANA via file transfers.
Configuration: The File adapter in CPI is configured to interact with FTP or SFTP servers where
files are transferred between S/4HANA and other systems.
7. Mail Adapter
Use Case: The Mail adapter is used to send and receive emails between CPI and S/4HANA or external systems.
Functionality: This adapter supports email-based integration, allowing messages to be
exchanged as part of business processes (e.g., sending notifications or processing incoming
emails).
Typical Scenarios: Sending email notifications from S/4HANA to customers, or receiving email-
based requests from external parties.
Configuration: The mail adapter can be configured to work with SMTP, POP3, or IMAP protocols
to connect CPI with email servers.
8. SuccessFactors Adapter (If integrating with SuccessFactors)
Use Case: If integrating SAP SuccessFactors with S/4HANA via CPI, the SuccessFactors Adapter is
used.
Functionality: It enables bi-directional integration between SuccessFactors and S/4HANA,
supporting scenarios like HR data exchange, employee information synchronization, etc.
Typical Scenarios: HR-related processes, such as synchronizing employee data from
SuccessFactors to S/4HANA.
Configuration: This adapter requires configuration in CPI to facilitate data flow between the
SuccessFactors system and S/4HANA.
9. B2B Adapter
Use Case: For business-to-business (B2B) communication, CPI can use the B2B adapter to facilitate EDI (Electronic Data Interchange) communication between S/4HANA and external partners.
Functionality: This adapter supports EDI-based integration for scenarios like purchase orders,
invoices, and other standard business documents.
Typical Scenarios: EDI-based integration where S/4HANA exchanges standard business
documents with external partners using protocols like AS2, X12, or EDIFACT.
Configuration: The B2B adapter requires configuring communication protocols and document
types in CPI.
10. JDBC Adapter
Use Case: The JDBC adapter is used to integrate CPI with external databases (e.g., when data from an external database needs to be pulled into or pushed out of S/4HANA).
Functionality: It enables CPI to communicate with relational databases via JDBC to fetch or store
data.
Typical Scenarios: Data integration scenarios involving external databases or systems not
directly connected to S/4HANA.
Configuration: The JDBC adapter requires configuration of database connection details and SQL
queries to read/write data to/from the database.
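The JDBC adapter itself runs inside CPI, but its job (execute a configured SQL statement against the external database with bound parameters) can be sketched with Python's stdlib sqlite3 module as an in-memory stand-in for that database. Table and column names are invented for the example.

```python
import sqlite3

# In-memory stand-in for the external database the JDBC adapter would reach.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE material_staging (matnr TEXT, description TEXT)")
conn.executemany(
    "INSERT INTO material_staging VALUES (?, ?)",
    [("MAT-100", "Steel bolt"), ("MAT-200", "Copper wire")],
)

# The adapter's configured query; parameters are bound, never concatenated.
rows = conn.execute(
    "SELECT matnr, description FROM material_staging WHERE matnr = ?",
    ("MAT-100",),
).fetchall()
print(rows)
```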
Step-by-step process to publish standard business events from S/4HANA to Event Mesh
This blog explains the outbound configuration to send events from S/4HANA on-premise to Event Mesh, and how to test it.
You need to have the roles below:
1. SAP_IWXBE_RT_XBE_ADM
2. SAP_IWXBE_RT_XBE_BUSI
3. SAP_IWXBE_RT_XBE_MDT
In the S/4 system, follow the steps below. Create a channel using T-code /n/IWXBE/CONFIG.
Click on "Via Service Key".
Enter the channel name and description. Get the Event Mesh instance key from the Basis team, and copy and paste the service key of the Event Mesh instance.
Select the channel and activate it.
Once it is activated, click on Check Connection. You will get a success message if the connection works.
To create an outbound binding, select the channel and click on Outbound Bindings.
Click on Create and then F4.
Select the topic from the F4 help and save it. If you don't find topics in the F4 help, you need to implement SAP Note 3346777.
Now you need to create a queue in Event Mesh and subscribe it to the topic.
Open Event Mesh and click on Create Queue.
Once the queue is created, open its actions menu and select Queue Subscriptions.
Enter the topic name along with the namespace and click the Add button to subscribe to the topic.
With this, the configuration is done in the S/4HANA system and in the BTP Event Mesh.
To test the messages:
Trigger standard Business Partner events by creating a business partner using T-code BP.
Click on Person.
Enter a first name and last name and then save.
The message is sent to Event Mesh.
The integration team can check the payload after consuming the message in CPI.
The message count in Event Mesh drops to zero after the message is consumed in CPI.
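The consumed payload roughly follows the CloudEvents format that S/4HANA uses for business events. The sketch below parses a hypothetical Business Partner Created event the way a CPI router or script step might; the event type string mirrors SAP's naming pattern, but the ids and values are made up.

```python
import json

# Hypothetical Business Partner "Created" event roughly in the CloudEvents
# shape that S/4HANA publishes to Event Mesh; ids and values are made up.
raw = json.dumps({
    "specversion": "1.0",
    "type": "sap.s4.beh.businesspartner.v1.BusinessPartner.Created.v1",
    "source": "/default/sap.s4.beh/S4H_001",
    "id": "a823e884-5edc-4194-a81b-55a13d81ef7b",
    "time": "2024-01-15T09:30:00Z",
    "data": {"BusinessPartner": "1000001"},
})

event = json.loads(raw)
# In CPI, a router/script step would branch on the event type like this:
is_created = event["type"].endswith("Created.v1")
bp_number = event["data"]["BusinessPartner"]
print(is_created, bp_number)
```

Note that the event carries only the key (the business partner number); the full object is usually fetched afterwards via the corresponding OData API.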
How to set up client certificate authentication between CPI and S/4HANA
To set up Client Certificate Authentication (mTLS) between SAP Cloud Platform Integration (CPI) and
SAP S/4HANA, you need to configure both systems to use mutual TLS (mTLS), where both CPI and
S/4HANA authenticate each other using their respective X.509 certificates.
First, you need to generate the required X.509 certificates for both the CPI system and the S/4HANA
system. This ensures both parties can authenticate each other securely.
For S/4HANA, you need to obtain a certificate from your system administrator or SAP's certificate
management tools.
SAP S/4HANA can either use its own internal certificates or use certificates from a trusted
external CA.
Ensure the CA certificate of the S/4HANA system is available to CPI for trust verification.
In CPI, you must install the client certificate (the one you generated or obtained in Step 1) in the
Keystore.
1. Log into the SAP Cloud Platform Integration (CPI) cockpit.
2. Go to Manage Security Material.
3. Choose Keystore and click on Add.
4. Upload the client certificate (private key and certificate) generated for CPI.
To trust the S/4HANA system, install its server certificate (or the certificate of its trusted CA) in CPI's Truststore.
On the S/4HANA side, upload the CPI client certificate through the SSL Configuration or PSE (Personal Security Environment).
Ensure that S/4HANA has the client certificate installed correctly so that it can authenticate CPI during the TLS handshake.
To trust the CPI client certificate, you must install the root CA certificate in S/4HANA if CPI's certificate
was signed by a third-party CA.
Now that the necessary certificates are installed in both CPI and S/4HANA, you need to configure CPI to
authenticate requests using the client certificate.
1. In CPI, create or open an existing integration flow (iFlow) where you want to trigger
communication from CPI to S/4HANA.
2. Use the HTTP Sender Adapter or HTTP Receiver Adapter, depending on the direction of the
communication.
3. Under the Sender or Receiver Adapter Configuration:
o For the Receiver (S/4HANA), configure the Adapter to use the HTTPS protocol.
o Enable Client Authentication: In the adapter configuration, enable client certificate
authentication.
o Select the Keystore and specify the client certificate (the one you uploaded to CPI).
4. For Secure Communication:
o Select HTTPS as the protocol.
o Ensure TLS/SSL encryption is enabled.
o Choose the Keystore that contains the client certificate.
o If necessary, configure mutual TLS (mTLS) settings on the HTTP adapter to send the
certificate along with the request.
You can also use Basic Authentication or OAuth headers in combination with the client certificate for
additional security. However, mTLS alone will suffice for client authentication.
This will ensure that the S/4HANA system will accept the requests coming from CPI that are
authenticated with the appropriate client certificate.
Ensure that S/4HANA is configured to accept secure TLS (Transport Layer Security) communication.
1. Verify that TLS encryption is enabled for the communication channel with CPI.
2. Set the SSL mode to require mutual authentication.
Once both systems are configured, you should test the communication to ensure that client certificate
authentication is working as expected.
Test from CPI: Send a request from CPI to S/4HANA (via the configured HTTP Adapter in CPI)
and monitor the logs to check for successful authentication.
Test in S/4HANA: Check the S/4HANA logs (transaction SMICM or SM21) to verify that the
authentication attempt was successful and that CPI's client certificate was correctly validated.
6. Troubleshooting
If the authentication fails, the following checks can help diagnose the issue:
Ensure that the client certificate installed in CPI is correctly configured in the Keystore and
associated with the iFlow.
Double-check the Truststore in CPI to ensure that S/4HANA’s server certificate or its root CA
certificate is present and trusted.
Verify that the SSL configuration in S/4HANA is correct, and S/4HANA is configured to accept
the client certificate from CPI.
Review the SSL logs in CPI and S/4HANA for any specific errors related to the TLS handshake.
Summary of Steps:
1. Generate and install certificates: Generate and install the client certificate (for CPI) and server
certificate (for S/4HANA).
2. Configure CPI: Install the client certificate in CPI's Keystore and configure the iFlow to use the
client certificate for authentication.
3. Configure S/4HANA: Ensure S/4HANA can trust the client certificate from CPI, configure SSL
settings, and enable client certificate authentication.
4. Test the setup: Perform tests to verify that mutual authentication works correctly between CPI
and S/4HANA.
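What the handshake amounts to on the client side can be sketched with Python's ssl module: verify the server's certificate, and additionally load a client certificate and key to present during the handshake (mTLS). In practice the certificate paths would point to real files (the material uploaded to the CPI keystore); here the function is exercised without them.

```python
import ssl

def build_mtls_context(certfile=None, keyfile=None, cafile=None):
    """Client-side TLS context: verify the server, and (for mTLS) present
    our own certificate when the server requests it."""
    ctx = ssl.create_default_context(purpose=ssl.Purpose.SERVER_AUTH, cafile=cafile)
    if certfile:
        # The client certificate + private key uploaded to the CPI keystore
        # play this role; the paths passed here would be hypothetical files.
        ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
    return ctx

ctx = build_mtls_context()   # no client cert yet: plain server-auth TLS
print(ctx.verify_mode == ssl.CERT_REQUIRED, ctx.check_hostname)
```

The S/4HANA side mirrors this: its ICM is configured to request the client certificate, and STRUST holds the CA it uses to validate it.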
To set up Client Certificate Authentication (mTLS) between SAP S/4HANA and SAP Cloud Platform
Integration (CPI), you need to configure both systems to securely authenticate each other using X.509
certificates. This setup ensures that both S/4HANA and CPI trust each other by verifying the identity of
each system using client certificates.
Here are the steps to configure client certificate authentication (mutual TLS or mTLS) between
S/4HANA and CPI:
Client Certificate for CPI: This certificate is used by CPI to authenticate itself when
communicating with S/4HANA.
Server Certificate for S/4HANA: This certificate is used by S/4HANA to authenticate itself to CPI.
Both systems will use their respective certificates to establish trust during the SSL handshake.
If you don't already have one, generate or obtain a client certificate for CPI:
Use OpenSSL, Java Keytool, or similar tools to generate a private key and a certificate (either
self-signed or from a trusted Certificate Authority (CA)).
Upload this certificate to the Keystore in CPI for use during the communication.
The server certificate for S/4HANA is typically provided by your S/4HANA administrator or generated
from a trusted CA. This certificate will allow CPI to authenticate S/4HANA during communication.
To enable CPI to use the client certificate for authentication, you need to upload it to CPI’s Keystore:
1. Open your integration flow (iFlow) in CPI where the connection to S/4HANA is established.
2. In the Receiver Adapter configuration (assuming you are calling an S/4HANA system from CPI):
o Select the HTTPS protocol for communication.
o Enable Client Authentication and select the Keystore containing the client certificate.
o Ensure TLS/SSL encryption is enabled.
o Optionally, configure mTLS settings for the mutual authentication process, which will
trigger the use of the client certificate during communication.
To ensure that CPI can trust S/4HANA, install the S/4HANA server certificate or its root CA certificate in
the Truststore in CPI:
On the S/4HANA side, you must configure it to accept the client certificate from CPI. To do this:
1. Log into S/4HANA and navigate to Transaction STRUST (the SSL configuration tool).
2. Select the SSL Server Standard entry.
3. Import the CPI client certificate into the PSE (Personal Security Environment) under the SSL
Client Standard section.
4. Configure S/4HANA to recognize and trust the client certificate being sent by CPI.
4.2 Install the Root Certificate (if applicable)
If the client certificate for CPI is signed by an intermediate or external CA, you must install the root CA
certificate into S/4HANA so it can validate the authenticity of the certificate sent by CPI:
1. In S/4HANA, ensure that client certificate authentication is enabled for the communication
channels that will receive requests from CPI.
2. For HTTPS communication, ensure TLS is enabled, and configure the system to accept client
certificate authentication.
3. Verify the SSL/TLS configuration to ensure mutual authentication (mTLS) is set up correctly.
Once both systems are prepared, configure the actual communication settings between CPI and
S/4HANA.
1. In your iFlow in CPI, set up the receiver to use HTTPS to communicate with S/4HANA.
2. Ensure that the adapter is configured to use mTLS and is set up to use the client certificate for
authentication.
1. Test the communication by sending a message from CPI to S/4HANA using the configured iFlow.
2. Monitor the logs in both CPI and S/4HANA:
o In CPI, use Message Monitoring to check the integration flow's execution.
o In S/4HANA, check the logs (use Transaction SMICM or SM21) to ensure the server
successfully received and validated the request from CPI.
6. Troubleshooting and Verification
Verify the Certificates: Ensure that the client certificate from CPI is correctly installed in
S/4HANA and that S/4HANA's server certificate is installed in the Truststore of CPI.
Check SSL/TLS Logs: Review SSL and TLS logs in both CPI and S/4HANA to check for any
handshake errors or authentication failures.
Validate CA Trust: Make sure that S/4HANA trusts the CA that signed CPI’s certificate and vice
versa.
Summary of Steps:
1. Generate/Obtain Certificates:
o Generate/upload a client certificate for CPI.
o Obtain the server certificate for S/4HANA.
2. Configure CPI:
o Install the client certificate into CPI’s Keystore.
o Install the server certificate from S/4HANA into CPI’s Truststore.
o Configure the iFlow in CPI to use mTLS.
3. Configure S/4HANA:
o Install the client certificate (from CPI) into S/4HANA.
o Install the Root CA certificate in S/4HANA if necessary.
o Enable client certificate authentication in S/4HANA.
By following these steps, you can successfully set up client certificate authentication (mTLS) between
SAP S/4HANA and SAP Cloud Platform Integration (CPI) to ensure secure, mutual authentication during
communication.
Ask your ABAPer to expose those fields in the CDS view; then you have them.
No, we do it the other way: we create a CDS view with the @OData annotation, and this automatically creates the OData service.
Generally, check that the field is present in the entity while querying the OData service, otherwise you will get a 400 Bad Request; also check whether the field is filterable.
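The filterable check can be automated by reading the service's $metadata document: in OData V2, each property carries an sap:filterable annotation that defaults to true when absent. The sketch below parses a trimmed, hypothetical metadata fragment.

```python
import xml.etree.ElementTree as ET

EDM = "http://schemas.microsoft.com/ado/2008/09/edm"
SAP = "http://www.sap.com/Protocols/SAPData"

# Trimmed, hypothetical fragment of an OData V2 $metadata document.
METADATA = f"""<EntityType Name="A_SalesOrderType"
    xmlns="{EDM}" xmlns:sap="{SAP}">
  <Property Name="SalesOrder" Type="Edm.String" sap:filterable="true"/>
  <Property Name="TotalNetAmount" Type="Edm.Decimal" sap:filterable="false"/>
</EntityType>"""

def filterable_fields(metadata_xml: str) -> dict:
    """Map each property to whether $filter may be used on it.
    In OData V2, sap:filterable defaults to true when absent."""
    root = ET.fromstring(metadata_xml)
    return {
        p.get("Name"): p.get(f"{{{SAP}}}filterable", "true") == "true"
        for p in root.findall(f"{{{EDM}}}Property")
    }

print(filterable_fields(METADATA))
# A $filter on a non-filterable property would be rejected by the service.
```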
How many ways can we trigger events from S/4HANA to CPI?
There are several ways to trigger events from SAP S/4HANA to SAP Cloud Platform Integration (CPI),
depending on the integration scenario and the type of event or message that needs to be transferred.
Here are the primary methods to trigger events from S/4HANA to CPI:
REST API Integration: In S/4HANA, you can configure outbound HTTP communication to call an
external CPI endpoint. This method triggers an event in CPI when a specific condition occurs in
S/4HANA.
o How it works: S/4HANA sends an HTTP(S) request (typically using REST APIs) to CPI’s
exposed HTTP Receiver Adapter. This approach is often used when you want to trigger
an event (like creating or updating a document) in CPI based on specific actions or
events in S/4HANA.
o Common Usage: This method is widely used for event-driven integration, such as
sending a message when a business document is created or updated.
o Example: When an order is created in S/4HANA, S/4HANA can send an HTTP request to
CPI to trigger a further process (e.g., notify a partner system).
IDoc Outbound: IDocs are used to transfer data between SAP systems (like S/4HANA) and
external systems (like CPI). S/4HANA can send outbound IDocs to CPI for further processing or
triggering an event.
o How it works: S/4HANA triggers an outbound IDoc (through a message type like
ORDERS05 for sales orders), and this IDoc can be consumed by CPI using the IDoc
Receiver Adapter.
o Common Usage: IDocs are typically used in ALE (Application Link Enabling) scenarios for
integration with external systems or for replicating business data between S/4HANA
and other applications.
o Example: When an order is created in S/4HANA, an outbound IDoc can be triggered to
pass the data to CPI, which can then be routed to an external partner system.
SAP Event Mesh (formerly known as SAP Cloud Platform Enterprise Messaging) enables
asynchronous messaging between S/4HANA and CPI by producing and consuming events. This
approach is useful for event-driven integration, where systems react to specific events triggered
by business actions in S/4HANA.
o How it works: S/4HANA produces an event (using SAP Event Mesh or BEM), and CPI
subscribes to these events to perform further processing or trigger other processes.
o Common Usage: Event Mesh is used in modern cloud-native architectures, where you
can trigger integration flows in CPI based on business events in S/4HANA.
o Example: When an invoice is posted in S/4HANA, an event is published to Event Mesh,
and CPI subscribes to that event to trigger downstream integrations (e.g., notifying a
third-party system).
OData is a protocol used to expose business data and services in S/4HANA for consumption by
external applications, including CPI.
o How it works: In S/4HANA, you can expose business logic or data as OData services. CPI
can consume these services using the OData Receiver Adapter and trigger
corresponding actions based on the data retrieved.
o Common Usage: OData services are commonly used to expose CRUD operations on
S/4HANA business objects to external systems for integration purposes.
o Example: A CPI integration flow can trigger an update in S/4HANA when new sales
order data is received via an OData service.
RFC and BAPI are traditional integration techniques that allow S/4HANA to communicate with
external systems, such as CPI.
o How it works: RFC and BAPI calls from S/4HANA can trigger integration flows in CPI.
S/4HANA can initiate RFC calls or call BAPIs that are exposed as web services, triggering
further processing in CPI.
o Common Usage: This method is typically used for synchronous communications
between S/4HANA and external systems, where the calling system expects an
immediate response.
o Example: A BAPI call in S/4HANA could trigger a request to CPI, which processes the
data and returns a response back to S/4HANA.
SOAP Web Services can be used to trigger events from S/4HANA to CPI.
o How it works: S/4HANA can expose SOAP Web Services to provide data or trigger
specific actions, and CPI can consume these services using the SOAP Receiver Adapter.
o Common Usage: This approach is typically used for more structured, synchronous
communication where precise and defined messaging formats are required.
o Example: A SOAP request from S/4HANA can trigger an event in CPI, such as creating or
updating a record in an external system.
Scheduled Batch Jobs can be used to trigger events based on predefined schedules.
o How it works: S/4HANA can schedule batch jobs to trigger events or process data
asynchronously, and the resulting output can be sent to CPI via HTTP, IDocs, or other
methods.
o Common Usage: This approach is useful when you want to process large volumes of
data on a regular basis, such as bulk updates or periodic data transfers to external
systems.
o Example: A scheduled batch job in S/4HANA can generate IDocs that are sent to CPI for
further processing.
Summary
Here are the main ways to trigger events from S/4HANA to CPI: HTTP/REST calls to a CPI endpoint, outbound IDocs, SAP Event Mesh business events, OData services, RFC/BAPI calls, SOAP web services, and scheduled batch jobs.
Is client certificate authentication the same for S/4HANA to CPI as for CPI to S/4HANA?
How to connect multiple S/4HANA on-premise systems to the same CPI tenant?
To connect both SAP S/4HANA Development (Dev) and Quality Assurance (QA) on-premise systems to
the same SAP Cloud Platform Integration (CPI) tenant, the approach is very similar to connecting
multiple on-premise S/4HANA systems to CPI. However, you need to differentiate between the Dev and
QA environments for each system while ensuring that each system can independently interact with the
CPI tenant.
1. Set Up Communication Channels for Each S/4HANA Environment (Dev and QA)
Both the Dev and QA SAP S/4HANA systems will need their own communication channels within CPI.
These channels will handle the communication between each system and CPI. You can set up separate
channels for each environment to ensure proper segregation.
2. Set Up SAP Cloud Connector for Both S/4HANA Systems (Dev and QA)
SAP Cloud Connector is required to securely connect your on-premise systems (Dev and QA) to CPI. You
will install and configure two instances or two sets of connections within a single Cloud Connector
installation to connect both S/4HANA systems (Dev and QA) to CPI.
For each environment, you will need to create dedicated iFlows in CPI. This way, you can handle the
different business logic, data mapping, and transformation requirements for Dev and QA independently.
Each communication channel needs to be linked to its respective communication agreement. This
ensures the messages are processed and routed correctly.
Both the Dev and QA environments may require different authentication credentials (e.g., service users,
certificates, or OAuth tokens).
Steps:
1. Authentication:
o Set up authentication credentials for both environments in CPI:
Dev S/4HANA System: Configure the required credentials (user, certificate, or
OAuth) to securely connect Dev S/4HANA to CPI.
QA S/4HANA System: Similarly, configure credentials for the QA system.
2. Security Policies:
o Ensure that each environment has its own security policies (SSL certificates, OAuth
tokens) configured for secure communication.
Once you’ve configured the communication channels, iFlows, and security settings, test the integration
thoroughly for both environments (Dev and QA). This step is critical to ensure that the communication
between the on-premise S/4HANA systems and CPI is working as expected.
Steps:
1. Test iFlows:
o Test both iFlows (for Dev and QA environments) by sending test messages from each
S/4HANA system to CPI. Ensure the message is processed correctly and reaches the
correct target system.
2. Monitor Integration:
o Use CPI’s monitoring tools to track the status of both iFlows. Monitor the messages,
error logs, and processing status for both Dev and QA environments to ensure smooth
integration.
For both environments, implement error handling and logging mechanisms to troubleshoot and manage
failed messages.
-------------------------------------------------------------------------------------------------------------------------------------------
How to connect multiple S/4HANA Cloud systems to the same CPI tenant?
To connect Dev (Development) and QA (Quality Assurance) SAP S/4HANA Cloud systems to the same
SAP Cloud Platform Integration (CPI) tenant, the process is similar to connecting multiple cloud systems
to a CPI tenant, but you need to configure separate communication channels, integration flows, and
potentially authentication mechanisms for each environment. The key is to ensure proper segregation of
Dev and QA environments while utilizing a single CPI tenant for integration.
Steps to Connect Dev and QA S/4HANA Cloud Systems to the Same CPI Tenant
You will need to configure different communication channels for both Dev and QA S/4HANA Cloud
systems in your CPI tenant to ensure that messages from both environments are processed
independently.
Key considerations:
Dev Environment: Set up a communication channel that is linked to the Dev S/4HANA Cloud
system.
QA Environment: Set up a communication channel that is linked to the QA S/4HANA Cloud
system.
Each S/4HANA Cloud system (Dev and QA) will require its own authentication setup for secure
communication with CPI.
You will need to create separate iFlows for each environment to ensure data is processed independently
for Dev and QA environments. This allows you to implement different business logic, message
transformations, and error handling for each environment.
If your integration involves on-premise systems (either as sources or targets), you may need to configure
SAP Cloud Connector for secure communication between on-premise systems and CPI. However, for
cloud-to-cloud communication (S/4HANA Cloud to CPI), Cloud Connector is typically not required.
Ensure that SSL/TLS encryption is enabled for secure communication between S/4HANA Cloud and CPI.
You may also need to manage SSL certificates for secure communication between systems.
1. Enable HTTPS: Ensure that communication between S/4HANA Cloud and CPI is secured using
HTTPS.
2. Install SSL Certificates: If using HTTPS, make sure the necessary SSL certificates are in place for
both environments (Dev and QA).
Once the communication channels and iFlows are configured, monitor the integration to ensure that
messages from Dev and QA environments are correctly processed.
1. Monitor CPI:
o Use the CPI monitoring tools to check the status of messages and iFlows for both
environments.
o Monitor for any errors or issues in the integration and resolve them promptly.
2. Set Up Alerts:
o Configure alerts for message failures or integration issues for both Dev and QA
environments to proactively address any issues.
Before moving to production, thoroughly test the integration for both environments (Dev and QA) to
ensure everything is working as expected.
Conclusion
To connect Dev and QA S/4HANA Cloud systems to the same SAP Cloud Platform Integration (CPI)
tenant, you need to: configure separate communication channels for each system, create dedicated
iFlows so business logic and mappings stay independent, set up separate authentication credentials
and certificates per environment, and test and monitor both environments before go-live.
SAP Cloud Integration: Generate Alert if file not found in Sender SFTP Folder
SAP Cloud Integration functionality to raise an alert when polling for a file from source SFTP Folder does
not yield any results (Specified file does not exist in source folder)
I have one doubt. I have a sender File adapter with the FTP transfer protocol and want to poll the file
every 24 hours. If the file does not exist, is there any way to raise an error and send a mail to the client?
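The alerting idea can be illustrated outside CPI with a minimal sketch: the scheduled poll checks the folder, and an empty result raises an error that a downstream notification step (e.g., a mail send) could react to. The folder path and file pattern are assumptions:

```python
import os

def poll_folder(path: str, suffix: str = ".xml"):
    """Return matching files from the polled folder; raise if the poll
    finds nothing, so an alert/mail step can be triggered."""
    matches = sorted(f for f in os.listdir(path) if f.endswith(suffix))
    if not matches:
        raise FileNotFoundError(f"No {suffix} file found in {path}")
    return matches
```

In an actual iFlow, the equivalent would be raising an exception (or routing to an exception subprocess) that ends in a Mail receiver adapter.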
SFTP Processing Parameters, Timestamp to File Name, Message-ID to File Name, Write Mode, etc.
Add Timestamp to filename.
Add the timestamp in the format YYYYMMDD_HHMMSS-xxx before the extension of the filename. If this
option is activated and the File Name parameter is set to ‘Test_.XML’, the receiver file names will be set
to ‘Test_YYYYMMDD_HHMMSS-xxx.XML’. ‘xxx’ is a random sequence number generated by the
Adapter Engine.
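The naming rule described above can be mimicked in a few lines; this is only a sketch of the adapter's behaviour, not its actual implementation:

```python
import random
from datetime import datetime

def add_timestamp(filename: str) -> str:
    """Mimic the 'Add Timestamp to filename' option: insert
    YYYYMMDD_HHMMSS-xxx before the file extension."""
    stem, dot, ext = filename.rpartition(".")
    ts = datetime.now().strftime("%Y%m%d_%H%M%S")
    seq = f"{random.randint(0, 999):03d}"  # 'xxx' random sequence number
    return f"{stem}{ts}-{seq}{dot}{ext}"
```

For example, `add_timestamp("Test_.XML")` yields something like `Test_20240101_120000-042.XML`.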
For example, files starting with “Order” should be moved to “Orders” target folder on SFTP server.
Invoices to “Invoices” folder and all other files to “Other” folder.
In this scenario, we will make use of the following SAP Integration Suite interface development
techniques:
Use an XPath expression (or another method) to extract the values you need from the incoming
message payload and assign them to headers or exchange properties.
To summarize, we can make use of standard and custom Header/Exchange Property parameters to
determine different receiver adapter parameters. Not only use parameters, but you can also use Simple
Expressions to assign custom values during runtime.
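The prefix-based routing described above (files starting with "Order" to the "Orders" folder, "Invoice" files to "Invoices", everything else to "Other") boils down to a small decision function; a sketch:

```python
def target_folder(filename: str) -> str:
    """Pick the receiver SFTP folder from the incoming file name prefix,
    matching the Orders/Invoices/Other scenario above."""
    if filename.startswith("Order"):
        return "Orders"
    if filename.startswith("Invoice"):
        return "Invoices"
    return "Other"
```

In the iFlow, the same decision would be expressed as router conditions (or a script) that set a header later used in the receiver channel's Directory field.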
While reading files using CPI SFTP Sender channel, we want dynamic file names like: {Current date}&File
Name
I am writing this blog to address the requirement where we need to add the SAP CPI message-id in the
FILE-NAME for the FTP/SFTP receiver adapter.
In my scenario, 10+ files were getting created at the same time (down to the millisecond), hence adding
a timestamp was not the solution.
I followed many blogs to make every filename unique but could not achieve the result.
Here are the detailed steps required to add "Message-ID" in File Name.
Order_${date:now:yyyyMMddHHmmss}_${header.SAP_MessageProcessingLogID}.xml
It will add the CPI message-id in the file name and it will be helpful to monitor the message as well in
CPI.
6. Now go to your SFTP/FTP receiver channel and set the filename like this:
Order_${date:now:yyyyMMddHHmmss}_${property.number}_${header.SAP_MessageProcessingLogID}.xml
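To see what such an expression resolves to, here is a toy resolver for the token types used above (`${date:now:...}`, `${header.*}`, `${property.*}`). It only illustrates the substitution; it is not Camel's actual Simple-language engine:

```python
import re
from datetime import datetime

def resolve(pattern, headers=None, properties=None):
    """Resolve a tiny subset of Camel Simple tokens in a filename pattern."""
    headers = headers or {}
    properties = properties or {}

    def repl(m):
        expr = m.group(1)
        if expr.startswith("date:now:"):
            java_fmt = expr[len("date:now:"):]
            # crude Java-to-strftime mapping covering the formats used here
            py = (java_fmt.replace("yyyy", "%Y").replace("MM", "%m")
                          .replace("dd", "%d").replace("HH", "%H")
                          .replace("mm", "%M").replace("ss", "%S"))
            return datetime.now().strftime(py)
        if expr.startswith("header."):
            return str(headers[expr[len("header."):]])
        if expr.startswith("property."):
            return str(properties[expr[len("property."):]])
        return m.group(0)  # leave unknown tokens untouched

    return re.sub(r"\$\{([^}]+)\}", repl, pattern)
```

For example, resolving `Order_${date:now:yyyyMMddHHmmss}_${header.SAP_MessageProcessingLogID}.xml` with a message-id header yields a name like `Order_20240101120000_AGxh....xml`, which is unique per message and easy to correlate in CPI monitoring.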
This blog post is about setting up dynamic filename and dynamic folder on receiver SFTP for different
incoming files.
Business Case: Suppose you have n (in this case 2) different kinds of incoming files which you are polling
from the same SFTP directory path and need to deposit to different folders on the same SFTP server.
Solution Approach:
1. Create your iflow as required, for this case its something like below.
2. Set the directory paths for multiple incoming routes using content modifier.
Keep the variable name same for all the directories you will set.
Note: I am polling file from SFTP, and routing files based on their filenames. For more info, read
my previous blog post.
3. Use another content modifier in main integration process to set the dynamic filename and
dynamic directory path.
Put the name as CamelFileName (the standard Camel header for the filename) and set its value as
/[root directory path]/${header.path}/${file:onlyname}
There are few other parameters as well, you can follow reference link at the end.
4. Configure your receiver SFTP and keep file access parameters as blank as it would dynamically
take the values from step 3.
Your files will be deposited in different folders based on dynamic configuration setup.
After deploying the changes you should get the file name as per expectations.
In a CPI integration project, we had the challenge that we should use SFTP sender to pickup files, those
should be routed differently based on the file name.
We have 2 files on the same SFTP server in the same folder; these differ only in name, for example
test1.txt and test2.txt. They should be processed differently as mentioned above and have to be routed
to a different SFTP server / folder.
IFlow
Solution
We used the Router, so we have 2 processing variants; this brings us to the important part of this blog.
To set the routing condition, 2 steps have to be implemented as shown in the picture below
Step1
Step2
${header.CamelFileName} = 'test1.txt'
This also has to be implemented for the second route, with the related file name obviously.
I am now setting file name in FTP adapter (receiver) for an integration flow.
${date:now:yyyyMMddHHmmss.SSS}.txt
instead of ${date:now:yyyyMMddHHmmssffffff}.txt
and it worked.
In the Scheduler tab, we also need to choose how often this particular interface needs to run.
Myfilename_${date:now:yyyy-MM-dd}.csv
SuccessFactors:
Entity examples
PerPersonal
EmpJob
Job Classification
Effective Date
As-of Date
To Date
From Date
OData Query options
Purchase
Products
Order
Employees
Delta Load /Period Delta :
Which SuccessFactors APIs have you worked on? Name some APIs.
What kind of data have you retrieved from SuccessFactors?
How did you connect to the SuccessFactors system?
What is Compound Employee? What is job data?
How do you read job data or address information from SuccessFactors?
The most common method for integrating SuccessFactors with CPI is through the OData API. This
approach uses REST-based OData services, which are typically exposed by SuccessFactors and can be
consumed by CPI to retrieve, update, or manipulate employee data and other entities.
How It Works:
SuccessFactors exposes OData endpoints for various HR-related data (e.g., employee
information, organizational structure).
In CPI, you set up a REST adapter to consume the OData API and map data between
SuccessFactors and other systems (e.g., SAP S/4HANA, other HR systems).
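A query against such an OData endpoint is just a URL with $-prefixed query options. The host below is a placeholder (not a real tenant URL), and the entity/field names follow the examples mentioned earlier in these notes:

```python
from urllib.parse import urlencode

# Hypothetical SuccessFactors OData v2 base URL -- replace with the
# tenant's actual API endpoint in a real integration.
BASE = "https://api.successfactors.example/odata/v2"

def build_query(entity: str, select: list, top: int = 100) -> str:
    """Assemble an OData query URL with $select, $top and $format options."""
    params = {"$select": ",".join(select), "$top": top, "$format": "json"}
    return f"{BASE}/{entity}?{urlencode(params)}"
```

In CPI the same options are typically entered in the adapter's query configuration rather than hand-built, but the resulting request is equivalent.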
Another method for integrating SuccessFactors with CPI is by using SOAP-based web services.
SuccessFactors provides a set of SOAP-based web services for various operations related to HR data.
How It Works:
SuccessFactors provides a dedicated API called SFAPI (SuccessFactors API) to enable integration with
third-party systems. This API is designed specifically for SAP SuccessFactors data and can be used for
operations such as data retrieval and updating.
How It Works:
SFAPI provides access to a range of SuccessFactors data entities, like employees, positions,
compensation, etc.
It can be consumed in CPI using HTTP or REST adapters, where CPI sends requests to the SFAPI
endpoint and handles the response.
How It Works:
Use SuccessFactors Integration Center to create integration templates (e.g., data exports).
CPI can consume these data exports by connecting to SuccessFactors APIs (like OData or SFAPI)
or using the standard FTP integration method.
The integration templates help automate data flow between SuccessFactors and other systems.
Common Integration Scenarios: Standard HR workflows like employee creation, organizational
structure changes, etc.
How It Works:
Use CPI iFlows to enable employee data replication between SuccessFactors Employee Central
and SAP S/4HANA.
Leverage SuccessFactors APIs (such as OData, SFAPI, or custom integrations) to expose HR data
and transfer it to S/4HANA.
6. Data Replication (SFTP, File-based Integrations)
Another way to integrate SuccessFactors with CPI is through file-based or SFTP (Secure FTP)
integrations. SuccessFactors can generate CSV or XML files for employee data, and these can be
transferred to CPI for further processing.
How It Works:
o API Management acts as a proxy between SuccessFactors and other systems.
o CPI consumes the APIs exposed via API Management, providing a streamlined and
secure way to expose, monitor, and manage APIs.
Security: Supports OAuth 2.0, API keys, and other authentication methods to secure the APIs.
Common Integration Scenarios: Secure and scalable third-party API consumption, exposing
SuccessFactors data to external applications.
SAP is experimenting with SAP Graph for cloud applications, which is an API layer that enables users to
interact with multiple SAP systems in a unified manner. While still in a limited phase, SAP Graph could
be used in the future to integrate SuccessFactors with other cloud and on-premise SAP applications
through a unified interface.
Use Case: Unified API layer for integrating multiple SAP systems, including SuccessFactors.
How It Works:
o SAP Graph aggregates data from multiple SAP systems (including SuccessFactors) into a
single API.
o CPI would consume this unified API to simplify the integration.
Security: OAuth 2.0-based security will likely be used for these APIs.
Conclusion
In summary, there are several ways to connect SAP SuccessFactors to CPI, detailed below:
How it works:
SuccessFactors exposes OData services (standard or custom) for various entities (e.g., employee
data, organizational structure, compensation, etc.).
CPI can consume these services using the REST adapter.
Common Use Cases: Data retrieval and updates (e.g., employee data synchronization, time
management).
SuccessFactors also supports SOAP-based web services for integration. These web services expose
different HR-related functionalities that CPI can consume.
How it works:
SuccessFactors exposes SOAP web services for operations like creating employees, updating
records, or getting employee data.
CPI can integrate with these services using the SOAP adapter.
SOAP messages are exchanged in XML format.
The SuccessFactors API (SFAPI) provides a set of more specialized APIs for interacting with
SuccessFactors data, often used for bulk data operations or custom integrations.
How it works:
o The SFAPI is specifically designed to handle operations like querying, creating, or
updating data in SuccessFactors, including employee records, positions, and
compensation.
For file-based data exchange, SuccessFactors can generate files (CSV, XML) that CPI can consume via FTP
or SFTP protocols.
How it works:
SuccessFactors exports data in file format (e.g., CSV or XML) and sends it to a file server
(FTP/SFTP).
CPI picks up the files from the file server using the FTP/SFTP adapter.
Data is processed, transformed, and sent to the target system.
The Integration Center within SuccessFactors allows for the creation and management of integrations
using templates or pre-built connectors. These integrations can then be consumed by CPI.
How it works:
SuccessFactors Integration Center enables you to configure data export/import templates that
define how data will be transferred.
CPI can consume the data generated by these templates by connecting via OData or FTP
(depending on how the Integration Center is configured).
Predefined connectors in Integration Center make it easier to set up standard integration
scenarios.
In hybrid scenarios where SuccessFactors Employee Central is integrated with SAP S/4HANA, CPI acts as
the middleware to synchronize HR data between SuccessFactors and S/4HANA.
How it works:
o CPI uses iFlows to synchronize employee master data, organizational structure, and
other HR data between SuccessFactors Employee Central and SAP S/4HANA.
o Data is exchanged using OData, SFAPI, or SOAP web services.
Authentication: OAuth 2.0 or SAML for secure communication.
Common Use Cases: Employee data replication, HR policies, compensation, payroll, etc.
API Management in SAP Cloud Platform can be used to expose, secure, and monitor SuccessFactors
APIs. CPI can consume these APIs via the API Gateway.
How it works:
o API Management acts as a middleware layer that exposes SuccessFactors APIs to
external systems.
o CPI can then consume the APIs exposed by API Management to interact with
SuccessFactors.
o API Management provides additional monitoring, security, and throttling capabilities.
Authentication: OAuth 2.0 or API keys.
Common Use Cases: Managing and securing APIs for large-scale integrations, monitoring API
usage, handling external systems.
SAP SuccessFactors Hybrid Cloud Integration is designed to connect on-premise systems with
SuccessFactors. This can also work in conjunction with CPI when the integration involves both cloud and
on-premise components.
How it works:
o SuccessFactors HCI provides pre-built integration packages for connecting with on-
premise systems like SAP ERP and SAP S/4HANA.
o CPI is used as the middleware for routing data between SuccessFactors and the on-
premise systems.
Authentication: Typically uses OAuth 2.0 or other secure authentication methods.
1. OData API (REST-based) – For real-time data exchange (e.g., employee data, organizational
structure).
2. SOAP Web Services – For legacy integrations with SOAP protocols.
3. SFAPI – For bulk data processing and custom integrations.
4. Integration Center – For using pre-built templates or connectors to integrate SuccessFactors
with other systems.
5. File-based Integration (FTP/SFTP) – For exchanging bulk data in file format.
6. Employee Central Integration with SAP S/4HANA – For hybrid HR data integration.
7. SAP Cloud Platform API Management – For managing, securing, and monitoring SuccessFactors
APIs.
8. SAP Graph (Beta) – A unified API layer for data access across multiple SAP systems, including
SuccessFactors.
9. SAP SuccessFactors HCI (Hybrid Cloud Integration) – For hybrid cloud-on-premise integrations.
What are the different adapters used to connect to SuccessFactors from SAP CPI?
In SAP Cloud Platform Integration (CPI), different adapters are used to connect to SAP SuccessFactors
depending on the type of integration, the protocol used, and the type of data being exchanged. Below
are the key adapters that can be used to integrate SAP CPI with SuccessFactors:
The REST Adapter is one of the most commonly used adapters to integrate with SAP SuccessFactors,
particularly when using OData APIs or SuccessFactors API (SFAPI). These APIs are typically REST-based,
and the REST adapter allows CPI to send and receive data from these endpoints.
The SOAP Adapter is used when integrating with SOAP-based web services exposed by SAP
SuccessFactors. These web services allow more structured XML-based communication and are often
used in legacy integrations.
The FTP/SFTP Adapter is used when integrating SuccessFactors with CPI through file-based data
exchange, such as CSV or XML files. This method is commonly used for large data transfers or for
scenarios where batch data needs to be processed.
The IDoc Adapter is used when integrating SAP SuccessFactors with on-premise SAP systems like SAP
ERP or SAP S/4HANA using IDocs. While not directly used with SuccessFactors, it can be part of an
integration flow when data needs to be exchanged between SuccessFactors and SAP ERP systems
through CPI.
The HTTP Adapter is used for custom integrations where SuccessFactors APIs or other endpoints are
consumed over HTTP or HTTPS, but not necessarily using the standard REST or SOAP adapters. This
adapter is useful for connecting to custom SuccessFactors endpoints that might not use OData or SOAP.
Protocols Supported: HTTP, HTTPS.
Use Cases:
o Custom API Integrations: For any custom REST/SOAP services exposed by
SuccessFactors or other systems.
o Can be used when SuccessFactors has custom endpoints that need to be called using
HTTP methods.
How It Works:
o CPI sends HTTP requests (GET, POST, PUT, DELETE) to the specified SuccessFactors
endpoint.
o The response is processed and routed to other systems or applications as needed.
The S/4HANA Adapter is used to connect SAP SuccessFactors with SAP S/4HANA systems. While not
specific to SuccessFactors alone, it plays a crucial role when integrating HR data between SuccessFactors
Employee Central and SAP S/4HANA.
If you're integrating CPI with a legacy SAP Process Integration (PI) or SAP Process Orchestration (PO)
system, the PI/PO Adapter can be used to connect SuccessFactors to these systems for hybrid
scenarios.
The JMS Adapter is used when integrating SuccessFactors with systems that rely on Java Message
Service (JMS) for messaging, though this is less common in standard SuccessFactors-CPI integrations.
SAP Cloud Connector is often used when integrating SuccessFactors with on-premise systems or when a
secure connection is required between SAP CPI and on-premise applications, including SAP systems.
1. REST Adapter – For OData and SFAPI integrations (common for real-time HR data).
2. SOAP Adapter – For SOAP-based web services (legacy integrations).
3. FTP/SFTP Adapter – For file-based integrations (e.g., bulk data exchange).
4. IDoc Adapter – For integration with on-premise SAP ERP/S/4HANA (IDoc-based data exchange).
5. HTTP Adapter – For custom REST or SOAP integrations (non-standard endpoints).
6. S/4HANA Adapter – For integrating SuccessFactors with SAP S/4HANA.
7. SAP PI/PO Adapter – For hybrid integration with SAP PI/PO systems.
8. JMS Adapter – For messaging-based integrations (less common).
9. SAP Cloud Connector – For secure, on-premise connectivity with hybrid systems.
In SuccessFactors, the Compound Employee API is a key part of the system that allows for the
retrieval, management, and manipulation of employee data. It provides a single endpoint to access
multiple data entities associated with an employee.
Here are the different entities involved in the Compound Employee API in SuccessFactors:
This entity includes the employee's personal data such as their name, gender, date of birth, and
nationality.
It contains various fields that describe the employee’s basic details like employee ID, legal name,
marital status, and date of birth.
2. Employment Information (EmpJob)
This entity handles information about the employee's job or employment status.
It covers fields like position, job code, department, pay grade, location, and employment type.
It also tracks employment status (active, inactive, terminated, etc.).
This entity contains the compensation details of the employee, such as base salary, bonuses,
commissions, and other forms of remuneration.
It can also track compensation changes over time, including adjustments, merit increases, and
bonuses.
This section deals with time-related data for the employee, such as time off, attendance, and
hours worked.
It can include fields related to leave, vacation days, sick days, and working hours.
This entity tracks the employee’s position within the organization, which may differ from their
job.
It could include organizational units, reporting relationships, and additional details about the
employee’s role in the company.
This entity provides historical job data, such as previous roles held by the employee, job titles,
and changes in position.
It might include information on job transfers, promotions, and role assignments over time.
It includes the user-specific settings and information such as the user’s login ID, email, and roles
within the system.
It is used for authentication, permissions, and system configuration purposes.
Tracks the physical locations or offices associated with the employee, including headquarters,
remote offices, or client sites.
It can include the geographic location as well as specific office or department assignments.
This entity handles specific pay components like bonuses, allowances, and deductions.
It helps in detailed tracking of different pay structures and components that make up the
employee’s salary.
This entity tracks the learning and development activities of the employee, such as courses
taken, certifications, and training programs.
It includes learning history, competencies, and skills acquired during the employee’s tenure.
This entity covers the employee’s talent data, including potential for growth, talent pool
participation, and succession planning information.
It helps in tracking employee potential for future roles, promotions, and talent development.
This entity is related to employee documents like contracts, performance reviews, or any official
papers stored within the system.
It allows for retrieving metadata about documents associated with the employee’s records.
This entity manages leave of absence requests and approvals, including types of leave (e.g.,
maternity, sick leave), duration, and approval status.
SuccessFactors allows the inclusion of custom fields and entities tailored to an organization’s
specific needs.
These can represent any other data points related to the employee that aren’t covered by
standard entities.
Tracks payroll-related data for employees, including salary calculations, deductions, tax details,
and pay slips.
It integrates with other payroll systems or modules to handle payroll operations.
By using these different entities together, the Compound Employee API helps organizations manage and
retrieve all critical employee-related data efficiently in SuccessFactors.
In SuccessFactors, the Compound Employee API allows you to query multiple related entities (or data
sets) at once, giving you a comprehensive view of an employee's information across various aspects of
the HR system. These entities include personal, job, compensation, and other HR-related data.
Here’s a detailed overview of the entities along with their fields available in the Compound Employee
API in SuccessFactors:
Fields:
o personIdExternal: The unique identifier for the employee.
o firstName: The employee's first name.
o lastName: The employee's last name.
o gender: Gender of the employee.
o birthDate: The employee's date of birth.
o maritalStatus: Marital status of the employee.
o nationality: Nationality of the employee.
o startDate: The start date of employment.
o endDate: The end date of employment (if applicable).
o workEmail: Work email address of the employee.
o address: Employee's address.
Fields:
o jobCode: The job code associated with the employee’s role.
o position: The position held by the employee.
o department: The department in which the employee works.
o location: The physical location or office where the employee is based.
o payGrade: The pay grade assigned to the employee.
o employeeClass: Employee class (e.g., Full-time, Part-time).
o company: The company the employee belongs to.
o employmentStatus: Current employment status (active, terminated, etc.).
o startDate: Job start date.
o endDate: Job end date (if applicable).
o managerId: The ID of the employee's manager.
Fields:
o compensationType: Type of compensation (e.g., base salary, bonus).
o currency: The currency used for the compensation (e.g., USD, EUR).
o amount: The compensation amount.
o payFrequency: Pay frequency (e.g., monthly, weekly).
o compensationAmount: The amount of compensation in specific pay categories.
o bonus: The bonus or incentive associated with the employee.
o payComponent: Breakdown of pay components (e.g., base pay, variable pay).
Fields:
o timeType: Type of time entry (e.g., regular hours, overtime, paid leave).
o hoursWorked: Number of hours worked.
o leaveType: Type of leave taken (e.g., vacation, sick leave).
o startDate: The start date of the time record.
o endDate: The end date of the time record.
o status: Approval status of the time record.
o totalLeaveDays: Total number of leave days taken.
Fields:
o positionId: Unique identifier for the position.
o positionTitle: The title of the position held.
o positionType: The type of position (e.g., regular, temporary).
o organizationUnit: The organizational unit to which the position belongs.
o jobCode: The job code associated with the position.
o supervisor: Employee's supervisor or manager.
o reportsTo: The position to which this position reports.
Fields:
o previousJobTitle: The employee's previous job title.
o previousDepartment: The previous department the employee worked in.
o startDate: The start date of the job history entry.
o endDate: The end date of the job history entry (if applicable).
o location: The previous work location of the employee.
o jobCode: The job code for the previous role.
Fields:
o locationId: Unique identifier for the location.
o locationName: The name of the location.
o city: City of the location.
o country: Country of the location.
o state: State/province of the location.
o address: Physical address of the location.
Fields:
o benefitPlan: The type of benefit plan (e.g., medical, dental, retirement).
o enrollmentDate: Date the employee enrolled in the benefit plan.
o benefitStatus: Status of the benefit enrollment (e.g., active, pending).
o dependents: Dependents associated with the employee's benefits.
o coverageType: Type of coverage (e.g., individual, family).
Fields:
o payComponentType: Type of pay component (e.g., salary, overtime, bonus).
o amount: Amount associated with the pay component.
o currency: Currency for the pay component.
o effectiveDate: Date when the pay component becomes effective.
o frequency: Frequency of the pay component (e.g., monthly, annually).
Fields:
o courseId: ID of the course or training the employee has completed.
o courseTitle: Title of the course or training.
o completionDate: Date the employee completed the course.
o status: Status of the course (e.g., completed, in progress).
o certification: If applicable, the certification obtained from the course.
Fields:
o documentId: Unique identifier for the document.
o documentType: Type of the document (e.g., contract, performance review).
o uploadDate: Date the document was uploaded.
o documentStatus: The status of the document (e.g., approved, pending).
o documentLink: Link to the document for download or viewing.
Fields:
o leaveType: The type of leave taken (e.g., vacation, sick leave).
o leaveStartDate: Start date of the leave.
o leaveEndDate: End date of the leave.
o leaveStatus: Status of the leave (e.g., approved, pending).
o totalDays: Total days of leave taken.
Fields:
o payrollId: Unique identifier for the payroll record.
o payPeriodStartDate: Start date of the pay period.
o payPeriodEndDate: End date of the pay period.
o payAmount: Total pay amount for the employee.
o taxAmount: Total tax deduction for the employee.
o netPay: Employee's net pay after deductions.
Custom entities and fields may be created in SuccessFactors, specific to an organization's needs.
These can represent any additional data that the organization needs to track, such as:
o customField1, customField2: Custom fields created to track data specific to the
organization.
o customEntity: Custom entities created to track unique data points related to employees.
How It Works:
When using the Compound Employee API, you can fetch data related to multiple entities in a
single API request, reducing the number of calls needed for comprehensive employee data.
The API Endpoint typically uses the personIdExternal to filter and gather all relevant employee
data from these entities.
The API returns data in a structured way, where each of these entities and their fields can be
nested, depending on the relationships and dependencies.
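Since Compound Employee is queried with an SFQL-style SELECT, building the request string can be sketched as follows; the entity and field names are illustrative of the convention, not a definitive API reference:

```python
def compound_employee_query(person_id: str, entities: list) -> str:
    """Build an SFQL-style query that fetches several entities for one
    employee in a single Compound Employee API request."""
    select = ", ".join(["person"] + entities)
    return (f"SELECT {select} FROM CompoundEmployee "
            f"WHERE person_id_external = '{person_id}'")
```

One such request replaces the several separate calls that would otherwise be needed to collect personal, job, and compensation data for the same employee.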
This comprehensive structure ensures that organizations can manage, retrieve, and update employee
data efficiently across different modules within SuccessFactors.
In SuccessFactors, the terms Effective Date, From Date, and To Date are often used in different contexts
to indicate time periods for various employee records and actions, such as employment history,
compensation, position changes, and more. Here's an explanation of the differences between them:
1. Effective Date:
Definition: The Effective Date is the date when a change or event becomes applicable. It marks
the actual date from which the data is considered active or valid in the system.
Use Cases: This is the key date used when an update to an employee's record is implemented.
For example:
o If an employee's compensation is increased, the effective date would indicate the day
on which the salary increase takes effect.
o For a job change, the effective date would indicate when the employee officially starts
the new role, even if the approval happened earlier.
Example: If an employee's salary increase is approved on December 1st but is set to take effect
from January 1st, the effective date is January 1st.
2. From Date:
Definition: The From Date indicates the start of a particular period or event. It specifies when
something starts.
Use Cases: It is often used for historical records or for a time range when tracking changes or
activities. For example:
o Job History: When an employee starts a new role, the "From Date" would indicate the
date the employee began that position.
o Compensation: When a new pay rate is implemented, the "From Date" shows the start
of the period for which the new rate applies.
o Leave of Absence: For an employee who takes a leave, the "From Date" specifies the
first day of the leave.
Example: If an employee is promoted to a new position starting on January 1st, the "From Date"
would be January 1st.
3. To Date:
Definition: The To Date is the end date of a period or event, specifying when a particular action
or record ceases to be valid or active.
Use Cases: The To Date is typically used to mark the end of an employee’s assignment, job,
compensation, or benefit. For example:
o Employment: If an employee's position changes or they leave the company, the "To
Date" would represent when the previous role ended or when the employment ended.
o Leave of Absence: For a leave, the "To Date" indicates when the leave ends.
o Compensation: If a pay adjustment ends, the "To Date" marks the end of the period for
that pay rate.
Example: If an employee was in a temporary role from January 1st to March 31st, the "From
Date" would be January 1st and the "To Date" would be March 31st.
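The interplay of these dates can be sketched with a small helper that checks whether a record is active on a given date. The function and field names below are illustrative, not SuccessFactors API names:

```python
from datetime import date

def is_active_on(check_date, from_date, to_date):
    """Return True if a record whose validity runs from from_date
    through to_date (inclusive) is active on check_date."""
    return from_date <= check_date <= to_date

# Temporary role valid January 1st through March 31st:
role_from = date(2024, 1, 1)
role_to = date(2024, 3, 31)

print(is_active_on(date(2024, 2, 15), role_from, role_to))  # True
print(is_active_on(date(2024, 4, 1), role_from, role_to))   # False
```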
Key Differences:
Effective Date: Specifies when a change or update becomes effective in the system (when the
change is applied).
From Date: Specifies the start date of a particular event, period, or status.
To Date: Specifies the end date of a particular event, period, or status.
Example Scenario:
From Date: The date the employee starts the new role (e.g., January 1st).
To Date: The date the employee will leave the current role or if the role has a defined end date
(e.g., December 31st for a fixed-term position).
Effective Date: The date the salary increase becomes applicable, which could be the same as the
promotion date or a later date (e.g., January 1st for both the promotion and the salary change).
Summary:
These dates help organizations track and manage employee records accurately, ensuring proper
handling of transitions, pay changes, job assignments, and other events in the HR system.
In SuccessFactors CPI (Cloud Platform Integration) integrations, delta loads refer to the process of
extracting only the changed or updated records since the last successful data load, rather than
loading all data every time.
Scenario: Employee Data Synchronization with an External HR System
Context:
An organization uses SuccessFactors as their HR system and integrates it with an external payroll or
finance system. The goal is to ensure that employee data (e.g., job details, compensation, status) is
continuously updated in the external system.
The organization doesn't want to perform full data extracts every time (which can be resource-intensive)
but rather wants to synchronize only the newly created, modified, or deleted employee records since
the last synchronization. This is where the delta load comes into play.
Delta Identification:
Key Criteria for Delta: Delta loads are based on identifying changed, updated, or deleted
records. This is typically done by checking fields such as:
o Last Modified Date: A field that indicates the last time the employee
record was modified (e.g., lastModifiedDate, modifiedDate, etc.).
o Change Log: SuccessFactors keeps track of changes to records, and a change log or audit
log is used to track updates, deletions, and creations.
o Status or Active Flag: For deletions, the integration may look for employees whose
status has been marked as "inactive" or "terminated."
Delta Criteria: The system will only extract records that have:
o A modified date after the last successful integration (using the lastModifiedDate field).
o A status change (e.g., from "active" to "terminated").
o New employee records (e.g., newly hired employees).
Delta Extraction:
In the subsequent integrations, the delta logic will filter out unchanged data and only extract
records that meet the delta criteria.
For example, if the last delta load was on December 1st, the next load would fetch only records
where the last modified date is later than December 1st.
Data Filtering: The CPI integration flow can be designed to use SuccessFactors API parameters
(such as $filter in OData queries) to fetch only records with a modification date greater than the
previous load.
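Such a delta query can be sketched as below. The field name lastModifiedDateTime, the host name, and the datetimeoffset literal syntax are assumptions based on common OData v2 conventions, not values taken from a specific tenant:

```python
from urllib.parse import urlencode

def build_delta_query(base_url, entity, last_run_iso):
    """Build an OData query URL that fetches only records modified
    after the last successful load (delta criterion in $filter)."""
    params = {
        "$filter": f"lastModifiedDateTime gt datetimeoffset'{last_run_iso}'",
        "$format": "json",
    }
    return f"{base_url}/odata/v2/{entity}?{urlencode(params)}"

# Fetch only EmpJob records changed after the December 1st load:
url = build_delta_query("https://api.example.com", "EmpJob",
                        "2024-12-01T00:00:00Z")
print(url)
```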
Handling Deletions:
In some cases, the external system may also need to be updated when records are deleted in
SuccessFactors (e.g., an employee leaves the company).
The delta load would identify the employees marked as "terminated" or "inactive" and trigger a
deletion process in the external system.
Deletions are usually handled by checking employee status or a specific delete flag.
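The status-based deletion handling described above can be sketched as a simple partition over the delta records. The status values and field names here are illustrative:

```python
def split_delta(records):
    """Partition delta records into upserts and deletions based on
    employee status (illustrative status values)."""
    upserts, deletions = [], []
    for rec in records:
        if rec.get("status") in ("terminated", "inactive"):
            # Deleted/terminated employees trigger a delete in the target.
            deletions.append(rec["personIdExternal"])
        else:
            upserts.append(rec)
    return upserts, deletions

records = [
    {"personIdExternal": "1001", "status": "active"},
    {"personIdExternal": "1002", "status": "terminated"},
]
upserts, deletions = split_delta(records)
print(deletions)  # ['1002']
```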
In SuccessFactors, the Employee Central (EC) module provides several APIs that allow external systems
to interact with and retrieve employee-related data. These APIs are primarily used for integrations, data
synchronization, and reporting. Below is an overview of the main types of APIs available in the
Employee Central module of SuccessFactors:
1. OData APIs
SuccessFactors provides a set of OData APIs that are widely used for querying and manipulating
employee data. These APIs are based on the OData protocol and allow CRUD (Create, Read, Update,
Delete) operations on employee-related data.
Employee Central OData API: This API exposes entities and allows interactions with employee
records (e.g., personal information, job information, compensation, etc.).
o Entities include:
Employee: Retrieve, update, or manage personal details of employees.
EmpEmployment: Manage employment details (e.g., job, position, department).
EmpJob: Manage employee job data (e.g., job title, company, and location).
Compensation: Retrieve or modify compensation data (e.g., salary, bonuses).
PayComponent: Interact with individual pay components (e.g., salary
increments, bonuses).
Position: Access position data (e.g., job role, department, location).
EmployeeProfile: Retrieve detailed employee profiles including skills, education,
etc.
WorkSchedule: Query employee work schedules, shift timings, and other time-
related data.
Example Endpoints:
/odata/v2/Employee
/odata/v2/EmpJob
/odata/v2/EmpEmployment
/odata/v2/Position
OData Entity-Specific Endpoints: Each OData API has specific endpoints based on the employee
data entities, which can be used to query specific data, filter by certain attributes, or perform
operations such as creation, update, and deletion of records.
o Example: GET /odata/v2/Employee can be used to fetch employee data.
o Common Operations:
GET (Retrieve data)
POST (Create data)
PUT (Update data)
DELETE (Delete data)
PATCH (Partially update data)
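These operations map onto plain HTTP requests against the entity endpoints. The sketch below only constructs the requests (nothing is sent), and the host name is a placeholder:

```python
import json
import urllib.request

BASE = "https://api.example.com/odata/v2"

def odata_request(method, entity, key=None, payload=None):
    """Construct (but do not send) an OData request for an
    Employee Central entity."""
    url = f"{BASE}/{entity}" + (f"('{key}')" if key else "")
    data = json.dumps(payload).encode() if payload is not None else None
    return urllib.request.Request(
        url, data=data, method=method,
        headers={"Content-Type": "application/json"})

# GET all employees; PUT an update to one EmpJob record:
read = odata_request("GET", "Employee")
update = odata_request("PUT", "EmpJob", key="1001",
                       payload={"jobTitle": "Engineer"})
print(read.get_method(), read.full_url)
print(update.get_method(), update.full_url)
```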
2. Compound Employee API
The Compound Employee API is a specialized API that aggregates multiple employee-related entities
into a single call. Unlike the OData APIs above, it is exposed through the SOAP-based SFAPI. It is
ideal for use cases where you need to retrieve a large set of employee-related information in one
request.
Entities in Compound Employee API: The Compound Employee API allows you to access a wide
variety of employee data from multiple entities in one go, such as:
o EmpJob (Job Information)
o EmpEmployment (Employment Information)
o Compensation (Compensation Data)
o Employee Profile
o Personal Information (e.g., contact details, personal info)
o Position Information
o Succession & Talent Information
Example endpoint (the Compound Employee API is served by the SOAP-based SFAPI, not OData):
/sfapi/v1/soap
This endpoint can be used to fetch multiple related entities, such as job, compensation, and
personal information, all in one request, which is particularly useful for integration scenarios.
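The Compound Employee API is queried with a SQL-like statement rather than OData URL parameters. The sketch below only assembles such a query string; the segment names and the to_datetime function are approximations of the SFAPI query language, not verified against a tenant:

```python
def compound_employee_query(last_modified_iso):
    """Assemble a Compound Employee query string (SQL-like SFAPI
    syntax; segment names here are illustrative)."""
    return (
        "SELECT person, employment_information, job_information, "
        "compensation_information "
        "FROM CompoundEmployee "
        f"WHERE last_modified_on > to_datetime('{last_modified_iso}')"
    )

print(compound_employee_query("2024-12-01T00:00:00Z"))
```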
3. Metadata API
The Metadata API allows you to retrieve metadata about entities and fields within the Employee Central
module. This is useful for understanding the structure of data, which can help in building dynamic
integrations or forms that need to adapt to the schema of the data.
Key Functionality:
o Retrieve metadata for entities like Employee, Compensation, Position, etc.
o Understand field names, data types, and relationships between different entities.
Example Metadata API Endpoint:
/odata/v2/$metadata
This will return the metadata of the available entities and their relationships.
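Because $metadata returns an EDMX (XML) document, a dynamic integration can parse it to discover which entities the tenant exposes. The snippet below runs against a tiny, simplified stand-in for a real $metadata response:

```python
import xml.etree.ElementTree as ET

# A tiny, simplified stand-in for a $metadata (EDMX) response.
SAMPLE_METADATA = """<?xml version="1.0"?>
<edmx:Edmx xmlns:edmx="http://schemas.microsoft.com/ado/2007/06/edmx">
  <edmx:DataServices>
    <Schema xmlns="http://schemas.microsoft.com/ado/2008/09/edm">
      <EntityType Name="Employee"/>
      <EntityType Name="EmpJob"/>
      <EntityType Name="Position"/>
    </Schema>
  </edmx:DataServices>
</edmx:Edmx>"""

def entity_names(metadata_xml):
    """List the EntityType names declared in an EDMX document,
    ignoring XML namespaces."""
    root = ET.fromstring(metadata_xml)
    return [el.attrib["Name"] for el in root.iter()
            if el.tag.endswith("}EntityType")]

print(entity_names(SAMPLE_METADATA))  # ['Employee', 'EmpJob', 'Position']
```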
4. Time Off API
The Time Off API is used for managing time off (such as vacation, sick leave, etc.) for employees in
Employee Central. It allows external systems to query, request, and manage leave requests.
Use Cases:
o Requesting time off for employees.
o Retrieving the employee’s leave balance.
o Managing leave requests and approvals.
Example Time Off API Endpoint:
/odata/v2/TimeOffRequest
5. Payroll API
Although typically part of SuccessFactors Payroll, the Payroll API can also interact with employee data
in the Employee Central module, especially for payroll data integration and processing.
Use Cases:
o Retrieve payroll data for employees.
o Synchronize payroll information from Employee Central to external payroll systems.
Example Payroll API Endpoint:
/odata/v2/Payroll
6. User API
The User API provides access to information about users within the SuccessFactors system. This can
include users who are employees, managers, or administrators.
Key Functionality:
o Retrieve user account information.
o Manage user roles and permissions.
Example User API Endpoint:
/odata/v2/User
7. Position Management API
The Position Management API is specifically designed to interact with the Position Management
module within Employee Central, allowing users to manage and query position data.
Use Cases:
o Retrieving position details.
o Creating and updating positions.
Example Position Management API Endpoint:
/odata/v2/Position
8. Time Management API
The Time Management API allows external systems to integrate with SuccessFactors Time Management
for handling employee time data.
Use Cases:
o Retrieving time data (work schedules, hours worked, overtime, etc.).
o Updating employee time data.
Example Time Management API Endpoint:
/odata/v2/TimeManagement
9. Benefits API
The Benefits API allows integration of employee benefits data with external systems. This API can be
used for retrieving, updating, and managing employee benefits information such as enrollment and
coverage.
Example Benefits API Endpoint:
/odata/v2/Benefits
Summary of Key APIs in Employee Central:
These APIs allow for a seamless integration between SuccessFactors Employee Central and external
systems (such as payroll, finance, benefits, etc.) by providing flexible and scalable options for syncing
and managing employee-related data.