# SAP Advanced Financial Closing SDK for CDS
npm install @cap-js-community/sap-afc-sdk



> Integration with SAP Advanced Financial Closing is not yet published.
> Please refer to What's New for SAP Advanced Financial Closing
> for the latest updates.
SAP Advanced Financial Closing SDK for CDS provides an SDK for SAP Advanced Financial Closing to
be consumed with SAP Cloud Application Programming Model.
- Requirements and Setup
- Getting Started
- CAP Node.js
  - Usage
  - Architecture
  - Options
  - Implement
    - Job Processing
    - Job Provider
    - Periodic Job Sync
    - Notification
  - API
  - Additional Settings
- CAP Java
  - Usage
  - Architecture
  - Options
  - Implement
    - Job Processing
    - Job Provider
    - Periodic Job Sync
    - Notification
  - API
- Deployment
  - Service Broker
- Miscellaneous
  - Testing
  - Authorization
  - Work Zone
  - Multitenancy
- Support, Feedback, Contributing
- Code of Conduct
- Licensing
SAP Advanced Financial Closing (AFC) lets you define, automate,
process, and monitor the entity close for your organization.
- To develop and test applications built with this SDK, you need a CAP Node.js or CAP Java project
- To integrate, you need access to an instance of SAP Advanced Financial Closing
A new CDS project can be initialized using SAP Build Code tools on SAP Business Technology Platform (BTP),
or the `@sap/cds-dk` CLI command `cds init` can be used to bootstrap a new CAP application. See capire.
SAP Build Code:
- Open SAP Build Lobby
- Press Create
- Select objective Application
- Choose category Full-Stack
- Select type Full-Stack Node.JS or Full-Stack Java
- Provide the project name and dev space
- Press Review
- Press Create
- Open the project in SAP Business Application Studio
CDS Command-Line-Interface:
- Terminal: npm install -g @sap/cds-dk
- Init a new CDS project:
- Terminal:
- CAP Node.js: cds init
- CAP Java: cds init
- Switch to the project folder:
- Terminal: cd
- Install
- Terminal: npm install
- Add AFC SDK
- Terminal: npm install @cap-js-community/sap-afc-sdk
- Use afc command
- Add globally:
- Terminal: npm install -g @cap-js-community/sap-afc-sdk
- Use locally:
- Terminal: npx afc
- Init target environment
- Cloud Foundry (default):
- Terminal: afc init cf
- Kyma:
- Terminal: afc init kyma
- Add SDK features
- Terminal: afc add sample,broker,http
- Add stub implementation
- Terminal: afc add stub
- Terminal: npm start
- Browser:
- CAP Node.js: http://localhost:4004
- CAP Java: http://localhost:8080
The SAP Advanced Financial Closing SDK for CDS provides a plugin for SAP Cloud Application Programming Model (CAP) for Node.js
to extend and integrate with SAP Advanced Financial Closing (AFC). Specifically, it provides an out-of-the-box
implementation of the SAP Advanced Financial Closing Scheduling Service Provider Interface
to expose a Scheduling Provider service to manage Job definitions and Jobs.
Furthermore, it brings the following out-of-the-box features:
- API: Exposes a RESTful API implementing the AFC Scheduling Provider Interface to manage Job definitions and Jobs
- Event-Queue: Provides an Event Queue to process and synchronize Jobs (periodically) asynchronously and resiliently (circuit breaker, retry,
load-balancing, etc.)
- Websocket: Provides websocket connection support to monitor Job processing live
- Feature-Toggle: Provides a feature toggle library to control the execution of the Event Queue
- UI: Provides a UI5 application to monitor and cancel Jobs
- Broker: Implements a service broker to manage service keys for API access
- Run npm add @cap-js-community/sap-afc-sdk in @sap/cds CAP Node.js project
- Execute npm start to start server
- Access the welcome page at http://localhost:4004
- Access Applications
- /launchpad.html: Sandbox Launchpad
- /scheduling.monitoring.job: Standalone Scheduling Monitoring Job UI
- Access Service Endpoints
- Public API
- /api/job-scheduling/v1: Scheduling Provider API (OpenAPI Swagger UI)
- OData API (UI)
- /odata/v4/job-scheduling/monitoring: Job Scheduling Monitoring ($metadata)
- WebSocket API
- /ws/job-scheduling: Scheduling WebSocket endpoint
- REST API
- /rest/feature: Feature Toggle API
- CDS Internal API
- sapafcsdk.scheduling.ProcessingService: Scheduling Processing service
```js
const schedulingProcessingService = await cds.connect.to("sapafcsdk.scheduling.ProcessingService");
```
- sapafcsdk.scheduling.WebsocketService: Scheduling Websocket service
```js
const schedulingWebsocketService = await cds.connect.to("sapafcsdk.scheduling.WebsocketService");
```
The SAP Advanced Financial Closing SDK for CDS is built on the following open-source
building blocks, as depicted in the following diagram:
- WebSocket Adapter for CDS (https://github.com/cap-js-community/websocket)
- Exposes a WebSocket protocol via WebSocket standard or Socket.IO for CDS services. Runs in the context of the SAP
Cloud Application Programming Model (CAP) using @sap/cds (CDS Node.js).
- Event Queue for CDS (https://github.com/cap-js-community/event-queue)
- The Event-Queue is a framework built on top of CAP Node.js, designed specifically for efficient and streamlined
asynchronous event processing
- Feature Toggle Library for CDS (https://github.com/cap-js-community/feature-toggle-library)
- SAP BTP feature toggle library enables Node.js applications using the SAP Cloud Application Programming Model to
maintain live-updatable feature toggles via Redis
Using the SAP Advanced Financial Closing SDK, a third-party scheduling provider can be built for SAP Advanced Financial Closing.
The architectural design of the SAP Advanced Financial Closing (AFC) SDK for implementing a Scheduling Provider is based
on the SAP Cloud Application Programming Model (CAP) and SAP Build Code. It leverages the @cap-js-community
open-source components to enable scheduling services in AFC.
The following diagram illustrates the high-level architecture of the SAP Advanced Financial Closing SDK for CDS (Node.js):
Key components and processing flow:
- SAP Advanced Financial Closing (AFC):
- Sends scheduling requests via AFC Scheduling Service Provider Interface using REST
API (OpenAPI)
- Scheduling Provider Service:
- Handles incoming scheduling requests
- Creates scheduling jobs synchronously and places asynchronous requests into the Event Queue
- Scheduling Processing Service:
- Processes scheduled jobs asynchronously
- Retrieves job requests from the Event Queue and executes them.
- Scheduling WebSocket Service:
- Listens for status updates of scheduled jobs
- Notifies the Monitoring Scheduling Job UI via WebSockets when job statuses change
- Scheduling Monitoring Service:
- Monitoring Scheduling Job UI (SAP Fiori Elements V4 / SAP UI5 application)
- Reads scheduling job details from the database
- Supports monitoring via OData V4 API
- Displays scheduling job statuses and updates in real-time via WebSockets
- Event Queue & Feature Toggles:
- Event Queue (instrumenting CDS Queue) facilitates asynchronous job execution
- Feature Toggles allow influencing Job and Event Queue processing dynamically
- Database & Redis Caching:
- Stores job scheduling data in the database
- Redis is used for information distribution (e.g., Event Queue, WebSockets, Feature Toggles)
Options can be passed to the SDK via the CDS environment in the `cds.requires.sap-afc-sdk` section:
- `capabilities`: Object: Capabilities configuration. Default is `{}`
- `capabilities.supportsNotification`: Boolean: Supports notification configuration. Default is `true`
- `endpoints`: Object: Endpoint configuration. Default is `{}`
- `endpoints.approuter`: String: Url of approuter. Default is `null` (derived from conventions)
- `endpoints.server`: String: Url of server. Default is `null` (derived from environment, e.g. CF)
- `api`: Object: API configuration on `/api` paths. Default see below
- `api.cors`: Boolean | Object: Cross-Origin Resource Sharing (CORS) configuration for cors module on `/api` paths. Default is `{ origin: true }`
- `api.cors.origin`: Boolean | String | String[]: Cross-Origin Resource Sharing (CORS) origin configuration. Default is `true` (allow approuter url)
- `api.csp`: Object | Boolean: Content Security Policy (CSP) directives for helmet module on `/api` paths. Default is `false`
- `ui`: Object | Boolean: UI configuration. Use `false` to disable UI. Default is `{}`
- `ui.path`: String: Path to the served UI5 application. Default is `''`
- `ui.link`: Boolean: Fill link of jobs to served UI5 launchpad, if `null`. Default is `true`
- `ui.swagger`: Boolean | Object: Serve API docs via Swagger UI. Default is `true`
- `ui.swagger."sapafcsdk.scheduling.ProviderService"`: Boolean: Serve API docs of Scheduling Provider via Swagger UI. Default is `true`
- `ui.launchpad`: Boolean: Serve launchpad. Default is `true`
- `ui.scheduling.monitoring.job`: Boolean: Serve Scheduling Monitoring Job UI separately if no launchpad is served. Default is `true`
- `broker`: Boolean | Object: Broker configuration. Serve broker endpoint, if truthy. Default is `false`, and `true` in production
- `mockProcessing`: Boolean | Object: Activate mocked job processing. Default is `false`
- `mockProcessing.min`: Number: Minimum processing time in seconds. Default is `0`
- `mockProcessing.max`: Number: Maximum processing time in seconds. Default is `10`
- `mockProcessing.default`: String: Default processing status. Default is `completed`
- `mockProcessing.status`: Object: Status distribution values. Default is `{}`
- `mockProcessing.status.completed`: Number: Completed status distribution value. Default is `0`
- `mockProcessing.status.completedWithWarning`: Number: Completed With Warning status distribution value. Default is `0`
- `mockProcessing.status.completedWithError`: Number: Completed With Error status distribution value. Default is `0`
- `mockProcessing.status.failed`: Number: Failed status distribution value. Default is `0`
- `config`: Object: Advanced SDK configuration. See `config.json`. Default is `{}`
The SDK provides a set of services to implement the job processing service and the scheduling provider service.
#### Job Processing
The job processing service is responsible for processing the jobs.
##### Mock Processing
The library includes a mocked processing for jump-start development, which is disabled by default via option
`cds.requires.sap-afc-sdk.mockProcessing: false`.
Setting option `cds.requires.sap-afc-sdk.mockProcessing: true`, a basic mocked job processing completes
jobs after a random processing time between 0-10 seconds:
```json
{
  "cds": {
    "requires": {
      "sap-afc-sdk": {
        "mockProcessing": {
          "min": 0,
          "max": 10,
          "default": "completed"
        }
      }
    }
  }
}
```
The project can be adjusted to use basic mock processing automatically via command:
- Terminal: afc add -b mock
More advanced mocked Job processing can be configured by setting the
following CDS env options (as described in options):
```json
{
  "cds": {
    "requires": {
      "sap-afc-sdk": {
        "mockProcessing": {
          "min": 0,
          "max": 10,
          "default": "completed",
          "status": {
            "completed": 0.5,
            "completedWithWarning": 0.2,
            "completedWithError": 0.2,
            "failed": 0.1
          }
        }
      }
    }
  }
}
```
This default advanced mocked Job processing can also be configured by using the CDS profile `mock` via `--profile mock` or `CDS_ENV=mock`.
The project can be adjusted to use advanced mock processing (without additional mock profile) automatically via command:
- Terminal: afc add -a mock
Mock configuration can be adjusted in package.json afterward.
To disable mock processing remove CDS env cds.requires.sap-afc-sdk.mockProcessing, e.g., via command:
- Terminal: afc add -x mock
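In the advanced configuration, the status values act as relative weights for the final status of a mocked job. A minimal standalone sketch of how such a weighted draw could work (illustrative helper only, not part of the SDK API):

```javascript
// Pick a status from a weighted distribution, e.g. the mockProcessing.status
// values shown above. An injectable random function keeps the sketch testable.
function pickStatus(distribution, random = Math.random) {
  const entries = Object.entries(distribution);
  const total = entries.reduce((sum, [, weight]) => sum + weight, 0);
  let roll = random() * total;
  for (const [status, weight] of entries) {
    roll -= weight;
    if (roll <= 0) return status;
  }
  return entries[entries.length - 1][0]; // guard against rounding drift
}

// Weights from the advanced mock configuration:
const weights = {
  completed: 0.5,
  completedWithWarning: 0.2,
  completedWithError: 0.2,
  failed: 0.1,
};
console.log(pickStatus(weights, () => 0.4)); // falls into the "completed" bucket
```

With these weights, roughly half of all mocked jobs would end up `completed`.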
The default implementation of the job processing is already provided by the SDK. Focus can be put on
custom processing logic and the processing status update handling.
##### Custom Processing
To implement custom job processing, extend the job processing service definition as follows:
CDS file: /srv/scheduling-processing-service.cds
```cds
using sapafcsdk.scheduling.ProcessingService from '@cap-js-community/sap-afc-sdk';

annotate ProcessingService with @impl: '/srv/scheduling-processing-service.js';
```
Implementation file: /srv/scheduling-processing-service.js
```js
const { SchedulingProcessingService, JobStatus } = require("@cap-js-community/sap-afc-sdk");

class CustomSchedulingProcessingService extends SchedulingProcessingService {
  async init() {
    const { processJob, updateJob, cancelJob, syncJob, notify } = this.operations;
    this.on(processJob, async (req, next) => {
      // Your logic goes here
      await next();
    });
    this.on(updateJob, async (req, next) => {
      // Your logic goes here
      await next();
    });
    this.on(cancelJob, async (req, next) => {
      // Your logic goes here
      await next();
    });
    this.on(syncJob, async (req, next) => {
      // Your logic goes here
      await next();
    });
    this.on(notify, async (req, next) => {
      // Your logic goes here
      await next();
    });
    super.init();
  }
}

module.exports = CustomSchedulingProcessingService;
```
A stub implementation for a custom scheduling processing service can be generated via command:
- Terminal: afc add stub
As part of the custom scheduling process service implementation, the following operations can be implemented:
- `on(processJob)`:
  - A new job instance was created and needs to be processed
  - The job is due (start date time is reached), and the job is ready for processing
  - Implement your custom logic, how the job should be processed
  - Job ID is accessible via `req.data.ID` and job data can be accessed via `req.job`
  - Test run can be identified via flag `req.data.testRun` (if job definition supports test mode)
  - Call `await next()` to perform default implementation (set status to running)
  - Job update can be performed via `this.processJobUpdate()` providing the new status and job results
    - e.g. `await this.processJobUpdate(req, job, JobStatus.completed, results)`
  - Result object shall contain stream objects for `data` to prevent data materialization
  - Throwing exceptions will automatically trigger the retry process in Event Queue
  - Disable mocked job processing via `cds.requires.sap-afc-sdk.mockProcessing: false` (default)
- `on(updateJob)`:
  - A job status update is requested and the job results are stored
  - Implement your custom logic, how the job status should be updated
  - Job data can be retrieved via `req.job`
  - Job status transition is validated via `async checkStatusTransition(req, job, statusBefore, statusAfter)`
  - Valid status transitions are defined in `this.statusTransitions`
  - Check function and status transitions can be customized
  - Final statuses are `completed`, `completedWithWarning`, `completedWithError`, `failed`, and `canceled`; no further status transitions are then allowed
  - Job results are checked and processed via `async checkJobResults(req, job, results)`
    - Valid results comply with the job results signature constraints (see below)
    - Returns the processed job results to be inserted
  - Call `await next()` to perform default implementation (update status to requested status)
- `on(cancelJob)`:
  - A job cancellation is requested
  - Implement your custom logic, how the job should be canceled
  - Job data can be retrieved via `req.job`
  - Call `await next()` to perform default implementation (update status to canceled)
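To illustrate the status-transition validation described above, here is a minimal standalone sketch. Only the final statuses are taken from the text; the transition table and the `requested` initial status are assumptions for this example, not the SDK's actual `statusTransitions` configuration:

```javascript
// Final statuses allow no further transitions (per the documentation above).
const FINAL = ["completed", "completedWithWarning", "completedWithError", "failed", "canceled"];

// Hypothetical transition table; "requested" is an assumed initial status.
const transitions = {
  requested: ["running", "canceled"],
  running: [...FINAL],
};

function checkStatusTransition(statusBefore, statusAfter) {
  const allowed = transitions[statusBefore] ?? []; // final/unknown statuses allow nothing
  if (!allowed.includes(statusAfter)) {
    const error = new Error(`Status transition not allowed: ${statusBefore} -> ${statusAfter}`);
    error.code = "statusTransitionNotAllowed"; // stable error code, see Error Codes
    throw error;
  }
}
```

In the SDK, the equivalent check hook receives the request and job as well (`checkStatusTransition(req, job, statusBefore, statusAfter)`) and can be overridden.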
The job results signature is defined as follows:
```cds
type ResultTypeCode : String enum {
  link;
  data;
  message;
};

type MessageSeverityCode : String enum {
  success;
  info;
  warning;
  error;
};

type JobResult {
  name     : String(255) not null;
  type     : ResultTypeCode not null;
  link     : String(5000);
  mimeType : String(255);
  filename : String(5000);
  data     : LargeBinary;
  messages : many JobResultMessage;
};

type JobResultMessage {
  code      : String(255) not null;
  values    : array of String(5000);
  text      : String(5000);
  severity  : MessageSeverityCode not null;
  createdAt : Timestamp;
  texts     : many JobResultMessageText;
};

type JobResultMessageText {
  locale : Locale not null;
  text   : String(5000);
};
```
Multiple job results can be passed for job update.
The following constraints apply for each job result type:
- `link`:
  - Properties `name` and `link` need to be provided
  - Other properties are not allowed
- `data`:
  - Properties `name`, `mimeType`, `filename` and `data` need to be provided
  - Data needs to be provided as a base64 encoded string
  - Other properties are not allowed
- `message`:
  - Properties `name` and `messages` need to be provided
  - Messages need to be provided as an array of job result messages
  - Other properties are not allowed
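The per-type constraints above can be expressed as a small validator. This is an illustrative standalone helper, not the SDK's internal `checkJobResults` implementation:

```javascript
// Required properties per job result type, as listed above.
const REQUIRED = {
  link: ["name", "link"],
  data: ["name", "mimeType", "filename", "data"],
  message: ["name", "messages"],
};

function validateJobResult(result) {
  const required = REQUIRED[result.type];
  if (!required) throw new Error(`Unknown result type: ${result.type}`);
  // All required properties must be provided
  for (const property of required) {
    if (result[property] == null) throw new Error(`Missing property: ${property}`);
  }
  // Other properties are not allowed for the given type
  for (const property of Object.keys(result)) {
    if (property !== "type" && !required.includes(property)) {
      throw new Error(`Property not allowed for type ${result.type}: ${property}`);
    }
  }
}

// ok: a link result with exactly name and link
validateJobResult({ type: "link", name: "Report", link: "https://example.com/report" });
```

A full implementation would additionally check that `data` is a base64 encoded string and that `messages` is an array of job result messages.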
Job processing is performed as part of the Event Queue processing. The Event Queue is a framework built on top of CAP
Node.js, designed specifically for efficient and streamlined asynchronous event processing. In case of errors, the Event Queue
provides resilient processing (circuit breaker, retry, load-balancing, etc.).
In addition to overwriting the default implementation via an `on` handler, additional `before` and `after` handlers can be registered.
##### Test Queue
In test environment the Event Queue processing is disabled per default to simplify testing. See https://cap-js-community.github.io/event-queue/unit-testing.
In order to manually process event queue events of ProcessingService in test environment, the following code can be used in tests:
```js
const cds = require("@sap/cds");
const eventQueue = require("@cap-js-community/event-queue");

await eventQueue.processEventQueue(new cds.EventContext(), "CAP_OUTBOX", "sapafcsdk.scheduling.ProcessingService");
```
##### Error Codes
The following error codes are defined to be used in exceptions as part of the stable interface (x-extensible-enum):
- `jobCannotBeCanceled`: Job cannot be canceled in the current status
- `statusTransitionNotAllowed`: Status transition is not allowed for the current job status
- ...

See the full list in the error schema of the Scheduling Service Provider API.
#### Job Provider
A job provider service is already provided per default by the SDK, implementing
the SAP Advanced Financial Closing Scheduling Service Provider Interface.
Therefore, focus can be put on additional custom provider logic (e.g., streaming of data from a remote location).
The SAP Advanced Financial Closing Scheduling Service Provider Interface is published on SAP Business Accelerator Hub
under the package SAP Advanced Financial Closing at https://api.sap.com/api/SSPIV1.
To implement a custom job provider, extend the job provider service definition as follows:
CDS file: /srv/scheduling-provider-service.cds
```cds
using sapafcsdk.scheduling.ProviderService from '@cap-js-community/sap-afc-sdk';

annotate ProviderService with @impl: '/srv/scheduling-provider-service.js';
```
Implementation file: /srv/scheduling-provider-service.js
```js
const { SchedulingProviderService } = require("@cap-js-community/sap-afc-sdk");

class CustomSchedulingProviderService extends SchedulingProviderService {
  async init() {
    const { Job, JobResult } = this.entities;
    this.on("CREATE", Job, async (req, next) => {
      // Your logic goes here
      await next();
    });
    this.on(Job.actions.cancel, Job, async (req, next) => {
      // Your logic goes here
      await next();
    });
    this.on(JobResult.actions.data, JobResult, async (req, next) => {
      // Your logic goes here
      await next();
    });
    super.init();
  }
}

module.exports = CustomSchedulingProviderService;
```
A stub implementation for a custom scheduling provider service can be generated via command:
- Terminal: afc add stub
As part of the custom scheduling provider service implementation, the following operations can be implemented:
- `on("CREATE", Job)`:
  - Validates and creates a new job instance
  - Call `await next()` to perform default implementation
  - `after`: Calls scheduling processing service function `processJob`
- `on(Job.actions.cancel, Job)`:
  - Cancels a job
  - Call `await next()` to perform default implementation
  - `after`: Calls scheduling processing service function `cancelJob`
- `on(JobResult.actions.data, JobResult)`:
  - Streams data of a job result (type `data`) from DB to response
  - Call `await next()` to perform default implementation
In addition to overwriting the default implementation via an `on` handler, additional `before` and `after` handlers can be registered.
#### Periodic Job Sync
A periodic scheduling job synchronization event named sapafcsdk.scheduling.ProcessingService.syncJob is running per default every 1 minute
in the Event Queue, to perform job synchronization from an external source. The default implementation is a no-op.
The event `syncJob` is registered automatically with cron interval `*/1 * * * *` in the Event Queue configuration.
To change the cron interval, the Event Queue configuration can be adjusted in the CDS env:
CDS Env:
```json
{
  "cds": {
    "requires": {
      "sapafcsdk.scheduling.ProcessingService": {
        "queued": {
          "events": {
            "syncJob": {
              "cron": "*/2 * * * *"
            }
          }
        }
      }
    }
  }
}
```
The cron interval option defines the periodicity of the scheduling job synchronization.
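For orientation, a 5-field cron expression fires whenever all fields match the current time; a minute field like `*/2` means "every 2 minutes". A minimal standalone sketch of minute-field matching (illustration only; the Event Queue uses a full cron implementation):

```javascript
// Match a single cron minute field against a minute value (0-59).
// Supports "*", step expressions like "*/2", and plain numbers.
function minuteMatches(cronMinuteField, minute) {
  if (cronMinuteField === "*") return true;
  const step = cronMinuteField.match(/^\*\/(\d+)$/);
  if (step) return minute % Number(step[1]) === 0;
  return Number(cronMinuteField) === minute;
}

console.log(minuteMatches("*/2", 4)); // true: minute 4 is on the 2-minute grid
console.log(minuteMatches("*/2", 5)); // false
```

So a sync job configured with a `*/2` minute field runs at minutes 0, 2, 4, and so on.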
CDS file: /srv/scheduling-processing-service.cds
```cds
using sapafcsdk.scheduling.ProcessingService from '@cap-js-community/sap-afc-sdk';

annotate ProcessingService with @impl: '/srv/scheduling-processing-service.js';
```
Implementation file: /srv/scheduling-processing-service.js
```js
const { SchedulingProcessingService } = require("@cap-js-community/sap-afc-sdk");

class CustomSchedulingProcessingService extends SchedulingProcessingService {
  async init() {
    const { syncJob } = this.operations;
    this.on(syncJob, async (req, next) => {
      // Your logic goes here
      await next();
    });
    super.init();
  }
}

module.exports = CustomSchedulingProcessingService;
```
A stub implementation for periodic job sync can be generated via command:
- Terminal: afc add stub
Details on how to implement a periodic event via Event Queue can be found in
Event-Queue documentation on Periodic Events.
#### Notification
The service provider can be notified with special scheduling notifications via operation `notify`.
The notification capability is active per default and can be disabled by setting the environment option:
CDS Env:
```json
{
  "cds": {
    "requires": {
      "sap-afc-sdk": {
        "capabilities": {
          "supportsNotification": false
        }
      }
    }
  }
}
```
The notify operation of the Scheduling Service Provider Interface can send multiple notifications at once.
The signature of a single notification is defined as follows:
```cds
type Notification {
  name  : String(255) not null;
  ID    : String(255);
  value : String(5000);
};
```
Available notifications are:
- `taskListStatusChanged`: Notification to inform about changed task list status
  - `name`: Notification name `taskListStatusChanged`
  - `ID`: Task list ID
  - `value`: New task list status
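A notification payload matching the `Notification` type above could be built like this; the task list ID and status value are made-up examples:

```javascript
// Build a taskListStatusChanged notification per the Notification type above.
// "TL-1000" and "closed" are hypothetical example values.
function taskListStatusChanged(taskListId, newStatus) {
  return {
    name: "taskListStatusChanged", // notification name
    ID: taskListId,                // task list ID
    value: newStatus,              // new task list status
  };
}

const notification = taskListStatusChanged("TL-1000", "closed");
console.log(notification.name);
```

Since the `notify` operation accepts multiple notifications at once, several such objects would typically be sent as an array.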
CDS file: /srv/scheduling-processing-service.cds
```cds
using sapafcsdk.scheduling.ProcessingService from '@cap-js-community/sap-afc-sdk';

annotate ProcessingService with @impl: '/srv/scheduling-processing-service.js';
```
Implementation file: /srv/scheduling-processing-service.js
```js
const { SchedulingProcessingService } = require("@cap-js-community/sap-afc-sdk");

class CustomSchedulingProcessingService extends SchedulingProcessingService {
  async init() {
    const { notify } = this.operations;
    this.on(notify, async (req, next) => {
      // Your logic goes here
      await next();
    });
    super.init();
  }
}

module.exports = CustomSchedulingProcessingService;
```
A stub implementation for notification handling can be generated via command:
- Terminal: afc add stub
The SDK-based application exposes the scheduling provider API. The out-of-the-box open service broker implementation
can be used to manage service keys and access tokens to the API.
After Deployment the Service Broker can be registered in Cloud Foundry.
After adding the broker to project via afc add broker, the default configuration is located at:
- `srv/broker.json`: Open service broker configuration
- `srv/catalog.json`: Open service broker catalog configuration
In addition, the broker configuration can be provided via options
as part of CDS environment in cds.requires.sap-afc-sdk.broker section.
More details on how to use the service broker can be found in the Service Broker section.
#### Redis
The application can be scaled by adding a Redis cache to distribute workload across application instances:
For CAP Node.js, Redis support can be added via command:
- Terminal: cds add redis
For Node.js runtime, Redis is used by @cap-js-community/event-queue, @cap-js-community/websocket and @cap-js-community/feature-toggle-library
modules to process events, distribute websocket messages and store and distribute feature toggles values.
#### Feature Toggles
The Feature Toggle Library is used to control the execution of the Event Queue.
It exposes endpoints to manage feature toggles:
- `GET /rest/feature/state()`: Read current feature toggle state
- `POST /rest/feature/redisUpdate`: Update feature toggle state
See .http files in /http/toggles to call feature toggle endpoints.
An internal OAuth token needs to be fetched via /http/auth/uaa.internal.cloud.http
providing credentials from the XSUAA instance or via calling:
- Terminal: afc api key -i
The SAP Advanced Financial Closing SDK for CDS provides a plugin
for SAP Cloud Application Programming Model (CAP) for Java
to extend and integrate with SAP Advanced Financial Closing (AFC). Specifically, it provides an out-of-the-box
implementation of the SAP Advanced Financial Closing Scheduling Service Provider Interface
to expose a Scheduling Provider service to manage Job definitions and Jobs. Furthermore, it brings the following out-of-the-box features:
- API: Exposes a RESTful API implementing the AFC Scheduling Provider Interface to manage Job definitions and Jobs
- Queue: Provides a Queue to process and synchronize Jobs (periodically) asynchronously and resiliently (circuit breaker, retry,
load-balancing, etc.)
- Websocket: Provides websocket connection support to monitor Job processing live
- UI: Provides a UI5 application to monitor and cancel Jobs
- Broker: Implements a service broker to manage service keys for API access
- Run npm add @cap-js-community/sap-afc-sdk in com.sap.cds CAP Java project
- Execute npm start to start server
- Access the welcome page at http://localhost:8080
- Access Applications
- /launchpad.html: Sandbox Launchpad
- /scheduling.monitoring.job: Standalone Scheduling Monitoring Job UI
- Access Service Endpoints
- Public API
- /api/job-scheduling/v1: Scheduling Provider API (OpenAPI Swagger UI)
- OData API (UI)
- /odata/v4/job-scheduling/monitoring: Job Scheduling Monitoring ($metadata)
- WebSocket API
- /ws/job-scheduling: Scheduling WebSocket endpoint
- CDS Internal API
- sapafcsdk.scheduling.processingservice.ProcessingService: Scheduling Processing service
```java
@Autowired
private ProcessingService processingService;
```
- sapafcsdk.scheduling.websocketservice.WebsocketService: Scheduling Websocket service
```java
@Autowired
private WebsocketService websocketService;
```
The SAP Advanced Financial Closing SDK for CDS is built on the following open-source
building blocks, as depicted in the following diagram:
Using the SAP Advanced Financial Closing SDK, a third-party scheduling provider can be built for SAP Advanced Financial Closing.
The architectural design of the SAP Advanced Financial Closing (AFC) SDK for implementing a Scheduling Provider is based
on the SAP Cloud Application Programming Model (CAP) and SAP Build Code.
The following diagram illustrates the high-level architecture of the SAP Advanced Financial Closing SDK for CDS (Java):
Key components and processing flow:
- SAP Advanced Financial Closing (AFC):
- Sends scheduling requests via AFC Scheduling Service Provider Interface using REST
API (OpenAPI)
- Scheduling Provider Service:
- Handles incoming scheduling requests
- Creates scheduling jobs synchronously and places asynchronous requests into the Event Queue
- Scheduling Processing Service:
- Processes scheduled jobs asynchronously
- Retrieves job requests from the Event Queue and executes them.
- Scheduling WebSocket Service:
- Listens for status updates of scheduled jobs
- Notifies the Monitoring Scheduling Job UI via WebSockets when job statuses change
- Scheduling Monitoring Service:
- Monitoring Scheduling Job UI (SAP Fiori Elements V4 / SAP UI5 application)
- Reads scheduling job details from the database
- Supports monitoring via OData V4 API
- Displays scheduling job statuses and updates in real-time via WebSockets
- Transactional Queue:
- CDS Transactional Queue facilitates asynchronous job execution
- Database:
- Stores job scheduling data in the database
Options can be passed to the SDK via the Spring Boot environment in the `sap-afc-sdk` section:
- `capabilities`: Object: Capabilities configuration. Default is `{}`
- `capabilities.supportsNotification`: Boolean: Supports notification configuration. Default is `true`
- `endpoints`: Object: Endpoint configuration. Default is `{}`
- `endpoints.approuter`: String: Url of approuter. Default is `null` (derived from conventions)
- `endpoints.server`: String: Url of server. Default is `null` (derived from environment, e.g. CF)
- `api`: Object: API configuration on `/api` paths. Default see below
- `api.cors`: Object: Cross-Origin Resource Sharing (CORS) configuration for cors module on `/api` paths. Default is `{ origin: true }`
- `api.cors.origin`: Boolean | String | String[]: Cross-Origin Resource Sharing (CORS) origin configuration. Default is `true` (allow approuter url)
- `api.cors.methods`: String | String[]: Cross-Origin Resource Sharing (CORS) 'allow methods' configuration. Default is `[]`
- `api.cors.headers`: String | String[]: Cross-Origin Resource Sharing (CORS) 'allow headers' configuration. Default is `[]`
- `api.cors.credentials`: Boolean: Cross-Origin Resource Sharing (CORS) 'allow credentials' configuration. Default is `true`
- `ui`: Object: UI configuration. Default is `{}`
- `ui.enabled`: Boolean: UI apps are served. Default is `false`, and `true` in cloud
- `ui.link`: Boolean: Fill link of jobs to served UI5 launchpad, if `null`. Default is `true`
- `broker`: Object: Service broker configuration. Default is `{}`
- `broker.name`: String: Name of the broker
- `broker.enabled`: Boolean: Is broker enabled. Default is `false`
- `broker.user`: String: Name of the broker user. Default is `broker-user`
- `broker.credentialsHash`: String: Credentials hash of the broker user. Default is generated
- `broker.endpoints`: Object: Endpoints of the broker. Default is `{ api: "/api", job-scheduling-v1: "/api/job-scheduling/v1" }`
- `broker.oauth2-configuration.credential-types`: String[]: Credential types of the broker oauth2 configuration. Default is `["binding-secret", "x509"]`
- `broker.authorities`: String[]: Scope authorities. Default is `[]`
- `mockProcessing`: Object: Activate mocked job processing. Default is `{}`
- `mockProcessing.min`: Number: Minimum processing time in seconds. Default is `0`
- `mockProcessing.max`: Number: Maximum processing time in seconds. Default is `10`
- `mockProcessing.default`: String: Default processing status. Default is `completed`
- `mockProcessing.status`: Object: Status distribution values. Default is `{}`
- `mockProcessing.status.completed`: Number: Completed status distribution value. Default is `0`
- `mockProcessing.status.completedWithWarning`: Number: Completed With Warning status distribution value. Default is `0`
- `mockProcessing.status.completedWithError`: Number: Completed With Error status distribution value. Default is `0`
- `mockProcessing.status.failed`: Number: Failed status distribution value. Default is `0`
- `syncJob`: Object: Sync job configuration. Default see below
  - `cron`: String: Sync job cron interval. Default is `0 */1 * * * *`
- `tenantCache`: Object: Tenant cache configuration. Default see below
  - `cron`: String: Tenant cache invalidation cron interval. Default is `0 */30 * * * *`
The SDK provides a set of services to implement the job processing service and the scheduling provider service.
#### Job Processing
The job processing service is responsible for processing the jobs.
##### Mock Processing
The library includes a mocked processing for jump-start development, which is disabled by default
(no `sap-afc-sdk.mock-processing` config in `application.yaml`).
Setting option `sap-afc-sdk.mock-processing`, a basic mocked job processing completes
jobs after a random processing time between 0-10 seconds:
```yaml
sap-afc-sdk:
  mock-processing:
    min: 0
    max: 10
    default: completed
```
The project can be adjusted to use basic mock processing automatically via command:
- Terminal: afc add -b mock
More advanced mocked Job processing can be configured by setting the
following configuration options (as described in options):
```yaml
sap-afc-sdk:
  mock-processing:
    min: 0
    max: 10
    default: completed
    status:
      completed: 0.5
      completedWithWarning: 0.2
      completedWithError: 0.2
      failed: 0.1
```
The project can be adjusted to use advanced mock processing automatically via command:
- Terminal: afc add -a mock
Mock configuration can be adjusted in application.yaml afterward.
To disable mock processing, remove the config sap-afc-sdk.mockProcessing, e.g. via command:
- Terminal: afc add -x mock
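The advanced status distribution above amounts to weighted sampling over terminal statuses. The following plain-Java sketch illustrates the idea only; the class and method names are hypothetical and not part of the SDK:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Random;

// Illustrative only: picks a mock processing outcome from a configured
// status distribution (weights do not need to sum exactly to 1.0).
class MockStatusPicker {
    private final Map<String, Double> distribution = new LinkedHashMap<>();
    private final Random random;

    MockStatusPicker(Random random) {
        this.random = random;
        // Values mirror the advanced example configuration above
        distribution.put("completed", 0.5);
        distribution.put("completedWithWarning", 0.2);
        distribution.put("completedWithError", 0.2);
        distribution.put("failed", 0.1);
    }

    String pick() {
        double total = distribution.values().stream().mapToDouble(Double::doubleValue).sum();
        double roll = random.nextDouble() * total;
        double cumulative = 0.0;
        for (Map.Entry<String, Double> entry : distribution.entrySet()) {
            cumulative += entry.getValue();
            if (roll < cumulative) {
                return entry.getKey();
            }
        }
        return "completed"; // fallback, analogous to the configured default status
    }
}
```

With a total weight of 1.0, roughly half of the mocked jobs would end up completed, matching the configured distribution.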
The default implementation of the job processing is already provided by the SDK. Focus can be put on
custom processing logic and the processing status update handling.
##### Custom Processing
To implement custom job processing, extend the job processing service definition as follows:
Implementation file: srv/src/main/java/customer/scheduling/CustomSchedulingProcessingHandler.java
```java
package customer.scheduling;

import com.github.capjscommunity.sapafcsdk.model.sapafcsdk.scheduling.processingservice.*;
import com.github.capjscommunity.sapafcsdk.scheduling.base.SchedulingProcessingBase;
import com.sap.cds.services.handler.annotations.HandlerOrder;
import com.sap.cds.services.handler.annotations.On;
import com.sap.cds.services.handler.annotations.ServiceName;
import org.springframework.stereotype.Component;

@Component
@ServiceName(ProcessingService_.CDS_NAME)
public class CustomSchedulingProcessingHandler extends SchedulingProcessingBase {

  @On(event = ProcessJobContext.CDS_NAME)
  @HandlerOrder(HandlerOrder.EARLY)
  public void processJob(ProcessJobContext context) {
    // Your logic goes here
    context.proceed();
  }

  @On(event = UpdateJobContext.CDS_NAME)
  @HandlerOrder(HandlerOrder.EARLY)
  public void updateJob(UpdateJobContext context) {
    // Your logic goes here
    context.proceed();
  }

  @On(event = CancelJobContext.CDS_NAME)
  @HandlerOrder(HandlerOrder.EARLY)
  public void cancelJob(CancelJobContext context) {
    // Your logic goes here
    context.proceed();
  }

  @On(event = SyncJobContext.CDS_NAME)
  @HandlerOrder(HandlerOrder.EARLY)
  public void syncJob(SyncJobContext context) {
    // Your logic goes here
    context.proceed();
  }

  @On(event = NotifyContext.CDS_NAME)
  @HandlerOrder(HandlerOrder.EARLY)
  public void notify(NotifyContext context) {
    // Your logic goes here
    context.proceed();
  }
}
```
A stub implementation for a custom scheduling processing service can be generated via command:
- Terminal: afc add stub
As part of the custom scheduling process service implementation, the following operations can be implemented:
- processJob:
  - A new job instance was created and needs to be processed
  - The job is due (start date time is reached), and the job is ready for processing
  - Implement your custom logic, how the job should be processed
  - Job ID is accessible via context.get("ID")
  - Test run can be identified via flag context.get("testRun") (if job definition supports test mode)
  - Call context.proceed() to perform default implementation (set status to running)
  - Job update can be performed via this.processJobUpdate(), providing the new status and job results
    - e.g. this.processJobUpdate(context, job, JobStatusCode.completed, results)
  - Throwing exceptions will automatically trigger the retry process in queue
  - Disable mocked job processing by deleting sap-afc-sdk.mockProcessing (default)
- updateJob:
  - A job status update is requested and the job results are stored
  - Implement your custom logic, how the job status should be updated
  - Job ID is accessible via context.get("ID")
  - Job status transition is validated via this.checkStatusTransition(context, job, statusBefore, statusAfter)
  - Valid status transitions are defined in this.statusTransitions
  - Check function and status transitions can be customized
  - Final statuses are completed, completedWithWarning, completedWithError, failed, and canceled; no further status transitions are then allowed
  - Job results are checked and processed via checkJobResults(context, job, results)
    - Results need to be valid according to job results signature constraints (see below)
    - Returns the processed job results to be inserted
  - Call context.proceed() to perform default implementation (update status to requested status)
- cancelJob:
  - A job cancellation is requested
  - Implement your custom logic, how the job should be canceled
  - Job ID is accessible via context.get("ID")
  - Call context.proceed() to perform default implementation (update status to canceled)
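The status transition rules above can be pictured as a lookup table where final statuses allow no further transitions. This is only a sketch of the concept; the actual table lives in this.statusTransitions of the SDK, and the transition graph shown here (e.g. that requested may move to running) is an assumption for illustration:

```java
import java.util.Map;
import java.util.Set;

// Illustrative status transition table: final statuses map to an empty set,
// so any further transition attempt is rejected.
class StatusTransitions {
    private static final Map<String, Set<String>> TRANSITIONS = Map.of(
        "requested", Set.of("running", "canceled"),
        "running", Set.of("completed", "completedWithWarning", "completedWithError", "failed", "canceled"),
        "completed", Set.of(),
        "completedWithWarning", Set.of(),
        "completedWithError", Set.of(),
        "failed", Set.of(),
        "canceled", Set.of()
    );

    // Returns true if the requested status change is permitted
    static boolean isAllowed(String before, String after) {
        return TRANSITIONS.getOrDefault(before, Set.of()).contains(after);
    }
}
```

A customized check function could replace or extend such a table, e.g. to allow re-running a failed job in a specific scenario.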
The job results signature is defined as follows:
```cds
type ResultTypeCode : String enum {
  link;
  data;
  message;
};

type MessageSeverityCode : String enum {
  success;
  info;
  warning;
  error;
};

type JobResult {
  name     : String(255) not null;
  type     : ResultTypeCode not null;
  link     : String(5000);
  mimeType : String(255);
  filename : String(5000);
  data     : LargeBinary;
  messages : many JobResultMessage;
};

type JobResultMessage {
  code      : String(255) not null;
  values    : array of String(5000);
  text      : String(5000);
  severity  : MessageSeverityCode not null;
  createdAt : Timestamp;
  texts     : many JobResultMessageText;
};

type JobResultMessageText {
  locale : Locale not null;
  text   : String(5000);
};
```
Multiple job results can be passed for job update.
The following constraints apply for each job result type:
- link:
  - Properties name and link need to be provided
  - Other properties are not allowed
- data:
  - Properties name, mimeType, filename and data need to be provided
  - Data needs to be provided as a base64 encoded string
  - Other properties are not allowed
- message:
  - Properties name and messages need to be provided
  - Messages need to be provided as an array of job result messages
  - Other properties are not allowed
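The per-type constraints above can be expressed as a small validation routine. The following is a self-contained sketch, not the SDK's checkJobResults implementation; the record mirrors the JobResult signature in simplified form (data as a base64 string, messages as plain strings):

```java
import java.util.Base64;
import java.util.List;

// Illustrative check of the per-type job result constraints described above.
class JobResultCheck {

    // Simplified mirror of the JobResult signature for illustration
    record JobResult(String name, String type, String link, String mimeType,
                     String filename, String data, List<String> messages) {}

    static boolean isValid(JobResult r) {
        if (r.name() == null || r.type() == null) return false;
        switch (r.type()) {
            case "link":
                // name and link required, nothing else allowed
                return r.link() != null && r.mimeType() == null && r.filename() == null
                    && r.data() == null && r.messages() == null;
            case "data":
                // name, mimeType, filename and base64-encoded data required
                if (r.mimeType() == null || r.filename() == null || r.data() == null
                    || r.link() != null || r.messages() != null) return false;
                try {
                    Base64.getDecoder().decode(r.data()); // must decode cleanly
                    return true;
                } catch (IllegalArgumentException e) {
                    return false;
                }
            case "message":
                // name and messages required, nothing else allowed
                return r.messages() != null && r.link() == null && r.mimeType() == null
                    && r.filename() == null && r.data() == null;
            default:
                return false;
        }
    }
}
```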
Job processing is performed as part of the Event Queue processing. The Event Queue is a framework built on top of CAP
Node.js, designed specifically for efficient and streamlined asynchronous event processing. In case of errors, the Event Queue
provides resilient processing (circuit breaker, retry, load-balancing, etc.).
In addition to overwriting the default implementation via an On handler, additional Before and After handlers can be registered.
##### Error Codes
The following error codes are defined to be used in exceptions as part of the stable interface (x-extensible-enum):
- statusTransitionNotAllowed: Status transition is not allowed for the current job status
#### Job Provider
A job provider service is already provided per default by the SDK, implementing
the SAP Advanced Financial Closing Scheduling Service Provider Interface.
Therefore, focus can be put on additional custom provider logic (e.g., streaming of data from a remote location).
The SAP Advanced Financial Closing Scheduling Service Provider Interface is published on SAP Business Accelerator Hub
under the package SAP Advanced Financial Closing at https://api.sap.com/api/SSPIV1.
To implement a custom job provider, extend the job provider service definition as follows:
Implementation file: srv/src/main/java/customer/scheduling/CustomSchedulingProviderHandler.java
```java
package customer.scheduling;

import com.github.capjscommunity.sapafcsdk.model.sapafcsdk.scheduling.providerservice.*;
import com.github.capjscommunity.sapafcsdk.scheduling.base.SchedulingProviderBase;
import com.sap.cds.services.cds.CdsCreateEventContext;
import com.sap.cds.services.cds.CqnService;
import com.sap.cds.services.handler.annotations.HandlerOrder;
import com.sap.cds.services.handler.annotations.On;
import com.sap.cds.services.handler.annotations.ServiceName;
import java.util.List;
import org.springframework.stereotype.Component;

@Component
@ServiceName(ProviderService_.CDS_NAME)
public class CustomSchedulingProviderHandler extends SchedulingProviderBase {

  @On(event = CqnService.EVENT_CREATE, entity = Job_.CDS_NAME)
  @HandlerOrder(HandlerOrder.EARLY)
  public void createJob(CdsCreateEventContext context, List<Job> jobs) {
    // Your logic goes here
    context.proceed();
  }

  @On(event = JobCancelContext.CDS_NAME, entity = Job_.CDS_NAME)
  @HandlerOrder(HandlerOrder.EARLY)
  public void cancelJob(JobCancelContext context) {
    // Your logic goes here
    context.proceed();
  }

  @On(event = JobResultDataContext.CDS_NAME, entity = JobResult_.CDS_NAME)
  @HandlerOrder(HandlerOrder.EARLY)
  public void downloadData(JobResultDataContext context) {
    // Your logic goes here
    context.proceed();
  }
}
```
A stub implementation for a custom scheduling provider service can be generated via command:
- Terminal: afc add stub
As part of the custom scheduling provider service implementation, the following operations can be implemented:
- createJob:
  - Validates and creates a new job instance
  - Call context.proceed() to perform default implementation
  - after: Calls scheduling processing service function processJob
- cancelJob:
  - Cancels a job
  - Call context.proceed() to perform default implementation
  - after: Calls scheduling processing service function cancelJob
- downloadData:
  - Streams data of a job result (type data) from DB to response
  - Call context.proceed() to perform default implementation
In addition to overwriting the default implementation via an On handler, additional Before and After handlers can be registered.
#### Periodic Job Sync
A periodic scheduling job synchronization event named SchedulingProcessingService.syncJob runs per default every 1 minute
as a Spring scheduling task, to perform job synchronization from an external source. The default implementation is a no-op.
The event syncJob is registered automatically via the Spring scheduling configuration.
To change the interval, the configuration can be adjusted in the Spring environment:
Application file: application.yaml
```yaml
sap-afc-sdk:
  syncJob:
    cron: 0 */2 * * * *
```
Implementation file: srv/src/main/java/customer/scheduling/CustomSchedulingProcessingHandler.java
```java
package customer.scheduling;

import com.github.capjscommunity.sapafcsdk.model.sapafcsdk.scheduling.processingservice.*;
import com.github.capjscommunity.sapafcsdk.scheduling.base.SchedulingProcessingBase;
import com.sap.cds.services.handler.annotations.HandlerOrder;
import com.sap.cds.services.handler.annotations.On;
import com.sap.cds.services.handler.annotations.ServiceName;
import org.springframework.stereotype.Component;

@Component
@ServiceName(ProcessingService_.CDS_NAME)
public class CustomSchedulingProcessingHandler extends SchedulingProcessingBase {

  @On(event = SyncJobContext.CDS_NAME)
  @HandlerOrder(HandlerOrder.EARLY)
  public void syncJob(SyncJobContext context) {
    // Your logic goes here
    context.proceed();
  }
}
```
A stub implementation for periodic job sync can be generated via command:
- Terminal: afc add stub
#### Notification
The service provider can be notified with special scheduling notifications via operation notify.
The notification capability is active per default and can be disabled by setting the environment option:
Application file: application.yaml
```yaml
sap-afc-sdk:
  capabilities:
    supportsNotification: false
```
The notify operation of the Scheduling Service Provider Interface can send multiple notifications at once.
The signature of a single notification is defined as follows:
```cds
type Notification {
  name  : String(255) not null;
  ID    : String(255);
  value : String(5000);
};
```
Available notifications are:
- taskListStatusChanged: Notification to inform about changed task list status
  - name: Notification name taskListStatusChanged
  - ID: Task list ID
  - value: New task list status
Implementation file: srv/src/main/java/customer/scheduling/CustomSchedulingProcessingHandler.java
```java
package customer.scheduling;

import com.github.capjscommunity.sapafcsdk.model.sapafcsdk.scheduling.processingservice.*;
import com.github.capjscommunity.sapafcsdk.scheduling.base.SchedulingProcessingBase;
import com.sap.cds.services.handler.annotations.HandlerOrder;
import com.sap.cds.services.handler.annotations.On;
import com.sap.cds.services.handler.annotations.ServiceName;
import org.springframework.stereotype.Component;

@Component
@ServiceName(ProcessingService_.CDS_NAME)
public class CustomSchedulingProcessingHandler extends SchedulingProcessingBase {

  @On(event = NotifyContext.CDS_NAME)
  @HandlerOrder(HandlerOrder.EARLY)
  public void notify(NotifyContext context) {
    // Your logic goes here
    context.proceed();
  }
}
```
A stub implementation for notification handling can be generated via command:
- Terminal: afc add stub
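Inside such a handler, incoming notifications are typically dispatched by name. The following plain-Java sketch shows the idea with a simplified record mirroring the Notification type above; the dispatcher class and describe method are hypothetical, not SDK API:

```java
// Illustrative dispatch of scheduling notifications by name.
class NotificationDispatcher {

    // Simplified mirror of the CDS Notification type
    record Notification(String name, String ID, String value) {}

    // Turns a notification into a human-readable description;
    // a real handler would update application state instead.
    static String describe(Notification n) {
        if ("taskListStatusChanged".equals(n.name())) {
            return "Task list " + n.ID() + " changed status to " + n.value();
        }
        return "Unhandled notification: " + n.name();
    }
}
```

Keeping unknown names in a fallback branch mirrors the x-extensible-enum style of the interface: new notification names can appear without breaking the handler.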
The SDK-based application exposes the scheduling provider API. The out-of-the-box open service broker implementation
can be used to manage service keys and access tokens to the API.
After Deployment the Service Broker can be registered in Cloud Foundry.
After adding the broker to a project via afc add broker, the default configuration is located in application.yaml at:
```yaml
spring:
  cloud:
    openservicebroker:
      catalog:
        services: ...
```
For details see Spring Cloud Open Service Broker documentation.
More details on how to use the service broker can be found in the Service Broker section.
To fully test the application, also accessing APIs from external, a deployment needs to be performed.
BTP offers different deployment options, depending on the target environment (Cloud Foundry or Kyma).
#### Cloud Foundry
- Initialize project for Cloud Foundry:
  - Terminal: afc init cf
- CDS Upgrade:
  - Terminal: cds up --to cf
- For details see guide Deployment to CF
#### Kyma
- Initialize project for Kyma:
  - Terminal: afc init kyma
- Configuration:
  - Set global domain in chart/values.yaml
  - Set global image registry in chart/values.yaml
  - Set repository in containerize.yaml
  - Set endpoints to approuter and server in sap-afc-sdk env to Kyma API rule hosts
- CDS Upgrade:
  - Terminal: cds up --to k8s -n
- For details see guide Deployment to Kyma
An Open Service Broker compliant broker implementation can be added to the CAP project.
The broker is used to manage service key management to the API in a Cloud Foundry environment.
> For AFC SDK feature broker the auth strategy xsuaa with plan broker is required
- Add broker and service configuration:
  - Terminal: afc add broker
  - Auth strategy xsuaa for service with plan broker is applied via cds add xsuaa
- Deploy to CF (see Deployment to Cloud Foundry)
- Get API key credentials:
  - Terminal: afc api key
- Use API key credentials:
  - Swagger UI:
    - Open URL: https://
    - Click Authorize and provide key credentials for client_id and client_secret
    - Try out endpoints
  - HTTP Client:
    - Add .http files
    - Update .http files placeholders:
      - Terminal: afc api key -h
    - Perform OAuth token request using key credentials (clientId, clientSecret)
      - See http/auth/uaa.cloud.http for getting an OAuth token
      - Via CLI:
        - Terminal: afc api key -t
    - Call API using OAuth token
      - See .http files in /http to call API endpoints
      - See .http files in /http/scheduling to call scheduling provider API endpoints
    - Clear credentials in .http files via:
      - Terminal: afc api key -c
- Destination:
  - A destination file for an API endpoint can be created via command:
    - Terminal: afc add key -d -e
  - A destination file for Job Scheduling Provider API can be created via command:
    - Terminal: afc add key -d -j
- Reset API management in CF:
  - Terminal: afc api key -r
The application can be tested locally using the following steps:
- Start application
- Terminal: npm start
- Open the welcome page
- Browser:
- CAP Node.js: http://localhost:4004
- CAP Java: http://localhost:8080
#### Sample data
To add sample job definitions and job instances, run:
- Terminal: afc add sample
Test data will be placed at /db/data
#### Unit-Tests
To add unit-tests for testing the API endpoints, run:
- Terminal: afc add test
Test files will be placed at /test.
#### .http files
To add .http files for testing the API endpoints, run:
- Terminal: afc add http
HTTP files will be placed at /http.
#### Authentication Method
The authentication strategy can be configured via CDS env according to CDS documentation.
For AFC SDK feature broker the auth strategy xsuaa with plan broker is required:
- Terminal: afc add broker
- Service xsuaa with plan broker is applied via cds add xsuaa
- Service Broker feature can be used to manage service keys and access tokens
#### Service Restrictions
The Scheduling Provider Service can be restricted for authorization by adding the @requires annotation:
```cds
using sapafcsdk.scheduling.ProviderService from '@cap-js-community/sap-afc-sdk';

annotate ProviderService with @requires: 'JobScheduling';
```
Details can be found in CDS-based Authorization.
For development and testing purposes SDK UIs are served as part of the server. Exposed UIs can be accessed via the
server welcome page. For productive usage, UIs should be served via HTML5 repo:
- Add SDK Apps to HTML5 Repo (copy to project):
  - Terminal: afc add app
- Work Zone and HTML5 Repo features are added automatically via cds add workzone,html5-repo
- Set up and configure SAP Work Zone instance using HTML5 Apps Content Channel
- Add app Monitor Scheduling Jobs to Content Explorer
- Assign an app to a group, role, and site to be accessible
- (CAP Node.js) Disable UI served in server via CDS env: cds.requires.sap-afc-sdk.ui: false
- (Optional) Apps from AFC SDK can also be copied over into a project at /app for further adjustments
  - Terminal: afc add app
The project can be enabled for multitenancy by following the guide:
https://cap.cloud.sap/docs/guides/multitenancy/#enable-multitenancy
The MTX Tool is used to manage the application lifecycle. It can be used to manage the application in Cloud Foundry.
Details can be found at https://github.com/cap-js-community/mtx-tool.
This project is open to feature requests/suggestions, bug reports, etc. via GitHub issues. Contribution and feedback are encouraged and always welcome. For more information about how to contribute, the project structure, as well as additional contribution information, see our Contribution Guidelines.
We as members, contributors, and leaders pledge to make participation in our community a harassment-free experience for everyone. By participating in this project, you agree to abide by its Code of Conduct at all times.
Copyright 2025 SAP SE or an SAP affiliate company and sap-afc-sdk contributors. Please see our LICENSE for copyright and license information. Detailed information including third-party components and their licensing/copyright information is available via the REUSE tool.