Overview
AFAS Profit is one of the most widely used ERP and HRM platforms in the Netherlands — the system that Dutch businesses use for financial administration, payroll, HR management, project administration, and CRM. For organisations that run AFAS as their operational backbone, the data in AFAS needs to flow to the other systems the organisation uses: the custom web applications that need employee or customer data, the operational tools that need to trigger financial transactions, the reporting systems that need to draw from AFAS financial records, the HR portals that need to read and write employee data, and the external systems that need to exchange data with AFAS on a scheduled or event-driven basis.
AFAS provides a REST API — the App Connector — that exposes AFAS data and functionality to external applications through standard HTTP calls. The App Connector uses a token-based authentication system and a GetConnectors/UpdateConnectors model where GetConnectors retrieve data and UpdateConnectors write data back to AFAS. Understanding the App Connector's specific conventions — the token format, the connector model, the filter syntax, the batch operation structure — is the prerequisite for building reliable AFAS integrations that work correctly in production.
AFAS integrations appear across many of the projects we build for Dutch organisations. Employee data synchronisation between AFAS and custom HR portals. Financial transaction posting from operational systems to AFAS. Customer and debtor data synchronisation between AFAS and CRM systems. Purchase order and invoice data exchange between AFAS and supplier portals. Payroll data flows between AFAS and time tracking systems. In each case, the integration must handle the AFAS App Connector's specific conventions correctly, manage authentication token lifecycle, handle errors and retries appropriately, and maintain data consistency between AFAS and the connected system.
We build AFAS integrations for Dutch organisations that need to connect their custom software, operational tools, and web applications to AFAS Profit — covering the full range of AFAS data domains and the integration patterns that production AFAS connectivity requires.
What AFAS Integration Covers
AFAS App Connector authentication. The AFAS REST API uses a proprietary token-based authentication system that differs from standard OAuth 2.0 or API key authentication. Each integration requires an App Connector token — a token string issued by the AFAS administrator when the app connector is configured in the AFAS environment — which the client wraps in a small XML envelope and base64-encodes before sending.
Token construction: the base64 encoding of the XML token envelope — <token><version>1</version><data>{tokendata}</data></token> — formatted exactly as AFAS expects. The encoded token is passed in the Authorization header of every request using a custom scheme — AfasToken {token} — rather than the standard Bearer scheme that most REST APIs use.
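The token construction reduces to a few lines. This Python sketch assumes the XML token envelope described above and a raw token data string already issued by the AFAS administrator:

```python
import base64

def afas_auth_header(token_data: str) -> dict:
    """Build the Authorization header for an AFAS App Connector call.

    token_data is the raw token string issued by the AFAS administrator
    when the app connector was configured.
    """
    # AFAS expects the token wrapped in a small XML envelope,
    # base64-encoded, and sent with the custom AfasToken scheme.
    token_xml = f"<token><version>1</version><data>{token_data}</data></token>"
    encoded = base64.b64encode(token_xml.encode("utf-8")).decode("ascii")
    return {"Authorization": f"AfasToken {encoded}"}
```

A 401 response on a request carrying this header points at a revoked token or a malformed envelope, which is exactly the failure the token-management error handling should surface rather than retry.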
Token management: AFAS tokens do not expire in the standard JWT sense but are tied to the AFAS environment configuration. Token storage in the integration's configuration, the rotation process when tokens need to be updated, and the error handling that correctly identifies authentication failures and surfaces them for operational attention rather than silently retrying indefinitely.
Environment management: AFAS environments have distinct identifiers (environment numbers) and distinct API base URLs. The integration configuration that correctly targets the production environment versus the test environment, with the environment switching that allows testing against AFAS test data before deploying changes that affect the production environment.
GetConnectors for data retrieval. AFAS exposes data through named GetConnectors — the endpoints that return data from AFAS in structured JSON format. Each GetConnector has a defined schema that describes the fields it returns, and connectors can be filtered, sorted, and paginated.
Filter syntax: the AFAS filter expression language for filtering GetConnector results. Simple filters are passed as the comma-separated filterfieldids, filtervalues, and operatortypes query parameters; compound filters use the JSON format — {"Filters":{"Filter":[{"Field":"FieldName","OperatorType":1,"Value":"FilterValue"}]}} — URL-encoded and passed as the filterjson query parameter. The operator types correspond to equals, not equals, contains, greater than, less than, and other comparison operators. Multiple conditions within one Filter element are combined using AND logic.
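Assembling a compound filter is mechanical once the structure is known. This sketch assumes the JSON variant is accepted via a filterjson query parameter; operator code 1 is equals (per the example above), and the remaining codes should be taken from the AFAS operator table rather than guessed:

```python
import json
from urllib.parse import urlencode

def build_filter_query(conditions: list) -> str:
    """Build the URL-encoded filter query string for a GetConnector call.

    conditions is a list of (field, operator_type, value) tuples; multiple
    conditions within one Filter element are combined with AND.
    """
    filter_body = {"Filters": {"Filter": [
        {"Field": field, "OperatorType": op, "Value": value}
        for field, op, value in conditions
    ]}}
    # The JSON structure is URL-encoded into the query string.
    return urlencode({"filterjson": json.dumps(filter_body, separators=(",", ":"))})

# Hypothetical example: employees in a given department with a fixed contract.
query = build_filter_query([("Department", 1, "ICT"), ("ContractType", 1, "Vast")])
```

The field names here are placeholders — the real field IDs come from the GetConnector's schema in the AFAS environment.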
Pagination: the take and skip parameters that implement pagination over large result sets. The pattern for retrieving all records from a large connector by repeatedly fetching pages until an empty page is returned. The page size selection that balances the number of API calls against the response payload size for each call.
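The page-until-empty pattern is independent of any particular connector. This sketch leaves the actual HTTP call to a caller-supplied function (fetch_page is a hypothetical callable, not an AFAS API):

```python
def fetch_all_rows(fetch_page, page_size: int = 500) -> list:
    """Drain a GetConnector by paging with skip/take.

    fetch_page(skip, take) performs one connector call and returns the
    rows of that page; a short or empty page signals the end of the set.
    """
    rows = []
    skip = 0
    while True:
        page = fetch_page(skip, page_size)
        rows.extend(page)
        if len(page) < page_size:  # last page reached
            return rows
        skip += page_size
```

The page size is the tuning knob described above: larger pages mean fewer API calls but heavier responses, and a few hundred rows per page is a common starting point.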
Sorting: the orderbyfieldids parameter listing the fields to sort by, with a minus prefix on a field name for descending order. The sort configuration appropriate to the data consumer's requirements.
Standard connectors: the built-in GetConnectors that AFAS provides for standard data domains — employee data (Profit_Employees), debtor data (Profit_Debtors), creditor data (Profit_Creditors), project data, financial transactions, leave data, and the other standard AFAS data domains. Custom connectors: the GetConnectors configured in the AFAS environment for data shapes or data combinations that the standard connectors do not provide — the custom connector that joins data from multiple AFAS entities or computes derived fields.
UpdateConnectors for data writing. AFAS accepts data writes through UpdateConnectors — the endpoints that create, update, and delete records in AFAS. UpdateConnectors accept JSON payloads with the data to be written, nested according to the AFAS data model.
Insert operations: the HTTP POST to the UpdateConnector endpoint that tells AFAS to create a new record. The required fields that must be present for each record type. The AFAS validation that rejects inserts missing required fields or with invalid field values, and the error response parsing that identifies which validation rule was violated.
Update operations: the HTTP PUT that modifies an existing record. The primary key field in the payload that identifies the record to update. The partial update that sends only the fields to be changed rather than the full record.
Delete operations: the HTTP DELETE for record deletion. The confirmation that the record to be deleted exists and that deletion is appropriate before submitting the delete operation.
Nested operations: AFAS UpdateConnectors support nested data structures that create or update related records in a single API call — the employee record that includes nested contact records, the order that includes nested order lines. The nested operation structure that matches the AFAS data model's parent-child relationships.
Batch operations: the UpdateConnector payload that performs multiple operations in a single API call — the array of records to be inserted, updated, or deleted in one request. The batch operation that reduces the number of API calls required for bulk data synchronisation.
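An insert payload follows the connector's element/fields nesting. Everything in this sketch — the connector name KnSalesRelationOrg and the field codes — is illustrative: the real schema comes from the UpdateConnector's metadata in the AFAS environment.

```python
import json

def build_insert_payload(connector: str, fields: dict, nested=None) -> str:
    """Build a JSON body for an UpdateConnector insert (sent with HTTP POST).

    Fields (and optionally child Objects for nested records) are placed
    under the connector's Element, following the AFAS parent-child model.
    """
    element = {"Fields": fields}
    if nested:
        element["Objects"] = nested  # child records written in the same call
    return json.dumps({connector: {"Element": element}})

# Hypothetical debtor insert; the field codes are placeholders.
body = build_insert_payload("KnSalesRelationOrg", {"CuId": "EUR", "Bl": False})
```

The same structure extends to batch operations by supplying an array of Element entries, which is what makes bulk synchronisation feasible without one API call per record.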
HR and payroll integrations. The most common AFAS integration domain for Dutch organisations — connecting AFAS's HR and payroll data to the other systems that use it.
Employee data synchronisation: reading employee records from AFAS GetConnectors (Profit_Employees, HrEmployment, and related connectors) and synchronising to the consuming system. Employee lifecycle events — new hires, contract changes, terminations — detected through delta synchronisation or event-based triggers and propagated to connected systems. The employee portal that reads its user data from AFAS rather than maintaining a separate user database.
Leave and absence data: reading leave balances from AFAS and displaying them in HR portals. Submitting leave requests from custom HR applications back to AFAS via UpdateConnectors. Synchronising sick leave registrations between AFAS and case management systems.
Payroll data flows: reading payroll output data from AFAS — the salary components, the payroll totals by period, the employee payslip data — for reporting and financial reconciliation. The time tracking system that submits hours to AFAS through the appropriate UpdateConnector for payroll processing.
Organisational structure: reading department structures, cost centres, and organisational hierarchies from AFAS for the consuming systems that need to reflect AFAS's organisational model. The reporting system that groups data by AFAS department and cost centre.
Financial and accounting integrations. AFAS as the financial system of record — the integration that posts transactions from operational systems into AFAS financial administration.
Journal entry posting: the UpdateConnector that creates general ledger entries in AFAS — the operational system that generates a financial event (a completed sale, a processed invoice) and posts the corresponding journal entry to AFAS. The journal entry structure with the debtor/creditor code, the amount, the VAT code, the cost centre allocation, and the booking period.
Debtor and creditor management: synchronising customer (debtor) and supplier (creditor) master data between AFAS and CRM, procurement, or supplier portal systems. The new customer created in the CRM system propagated to AFAS as a new debtor. The supplier onboarded in the supplier portal created as a creditor in AFAS.
Invoice processing: reading outstanding invoices from AFAS for collection management tools, reconciliation systems, and payment status dashboards. The invoice payment confirmation received from a payment processor posted back to AFAS to close the open item.
Purchase orders: reading purchase orders from AFAS for supplier portals and procurement systems. Updating purchase order status in AFAS as deliveries are received and invoices are matched.
Project administration. AFAS's project module integration for organisations that manage projects in AFAS.
Project data retrieval: reading project structures, project phases, and project budgets from AFAS for project dashboards and reporting tools. The project portal that displays project status from AFAS without requiring users to access AFAS directly.
Hours and cost registration: submitting time entries from custom time tracking applications to AFAS project administration. The hours approval workflow in the custom application that posts approved hours to the AFAS project record.
Project invoicing: reading project invoicing data from AFAS for client billing portals and financial reporting.
CRM integrations. AFAS's CRM module or debtor/creditor data used as the source of truth for customer data in connected systems.
Contact synchronisation: bidirectional synchronisation between AFAS contacts and external CRM systems — new contacts created in either system propagated to the other, contact updates in one system reflected in the other. The deduplication logic that prevents duplicate records from accumulating when the same contact is created in both systems before the synchronisation runs.
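Deduplication usually hinges on a normalised match key. This sketch uses normalised name plus email as the key — a policy choice for illustration, not anything AFAS prescribes:

```python
import unicodedata

def match_key(name: str, email: str) -> tuple:
    """Normalise a contact into a comparable key for deduplication.

    Strips accents, case, and surrounding whitespace so that the same
    person entered slightly differently in two systems still matches.
    """
    decomposed = unicodedata.normalize("NFKD", name)
    ascii_name = decomposed.encode("ascii", "ignore").decode("ascii")
    return (" ".join(ascii_name.lower().split()), email.strip().lower())
```

Two records that produce the same key are merge candidates rather than inserts, which is what keeps the same contact created in both systems from becoming two records after the next synchronisation run.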
Activity logging: writing customer interaction records from CRM systems back to AFAS. The sales call logged in the CRM system that creates a corresponding activity record in AFAS.
Delta synchronisation and change detection. Most AFAS integrations need to synchronise data incrementally — processing only the records that have changed since the last synchronisation run rather than re-processing the entire dataset.
Timestamp-based delta: filtering GetConnector results by the last-modified timestamp to retrieve only records updated since the last synchronisation. The Mutatie_datum field available on most AFAS entities that records the last modification timestamp. The synchronisation state that records the timestamp of the last successful run, used as the filter value for the next run.
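The timestamp-based delta reduces to a small state machine around a watermark. This sketch keeps the watermark in a local JSON file and leaves the connector call as a hypothetical fetch_changed_since function (which would filter on a last-modified field such as Mutatie_datum); note the watermark is captured before the run starts, so records changed mid-run are picked up next time:

```python
import json
from datetime import datetime, timezone
from pathlib import Path

STATE_FILE = Path("sync_state.json")  # illustrative state location

def load_watermark(default: str = "1900-01-01T00:00:00+00:00") -> str:
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["last_run"]
    return default

def save_watermark(timestamp: str) -> None:
    STATE_FILE.write_text(json.dumps({"last_run": timestamp}))

def run_delta_sync(fetch_changed_since, apply_changes) -> int:
    """One incremental run: fetch records modified since the watermark,
    apply them to the target system, then advance the watermark."""
    started = datetime.now(timezone.utc).isoformat()
    changed = fetch_changed_since(load_watermark())
    apply_changes(changed)
    save_watermark(started)  # only reached if apply_changes did not raise
    return len(changed)
```

Advancing the watermark only after a successful apply means a failed run is simply retried over the same window, at the cost of occasionally reprocessing a record — which is why the apply step should be idempotent.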
Sequence number-based delta: AFAS provides sequence numbers on some connectors that increment monotonically with each change. The synchronisation that records the last processed sequence number and retrieves only records with higher sequence numbers on the next run.
Full reconciliation: the periodic full reconciliation that compares the complete dataset from AFAS with the connected system's data, identifying discrepancies that incremental synchronisation may have missed due to errors or edge cases. The reconciliation that runs weekly or monthly alongside the incremental daily synchronisation.
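The reconciliation itself is a set comparison over record keys. In this sketch both systems are represented as plain dicts mapping record key to field values, purely for illustration:

```python
def reconcile(afas_snapshot: dict, target_snapshot: dict) -> dict:
    """Compare a full AFAS snapshot against the connected system's data.

    Returns the keys missing on either side plus the keys whose field
    values differ, for automated repair or manual review.
    """
    afas_keys, target_keys = set(afas_snapshot), set(target_snapshot)
    return {
        "missing_in_target": sorted(afas_keys - target_keys),
        "missing_in_afas": sorted(target_keys - afas_keys),
        "mismatched": sorted(k for k in afas_keys & target_keys
                             if afas_snapshot[k] != target_snapshot[k]),
    }
```

An empty report confirms the incremental synchronisation has not drifted; a non-empty one identifies exactly which records the repair job or an operator needs to look at.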
Integration Patterns
Scheduled synchronisation. The most common AFAS integration pattern — a scheduled job that runs at defined intervals, queries AFAS for changes since the last run, and applies the changes to the connected system. The cron job or scheduled task that handles the routine data synchronisation that keeps connected systems current with AFAS data.
Event-driven integration. AFAS does not natively publish webhook events, so event-driven AFAS integration is typically implemented as frequent polling that approximates real-time behaviour — a scheduled job that runs every few minutes and processes any changes detected since the previous run. For integrations where near-real-time propagation is required, polling intervals are reduced to match the required latency.
On-demand API. The integration that exposes AFAS data through a custom REST API — the middleware service that accepts requests from consuming applications, translates them to AFAS App Connector calls, and returns the results. The API that insulates consuming applications from AFAS's specific conventions and provides a consistent interface regardless of AFAS API changes.
Bidirectional synchronisation. The integration that keeps data consistent between AFAS and another system when both systems can create and modify the same data. Conflict resolution strategy — the rule that determines which system's version takes precedence when both systems have modified the same record since the last synchronisation. Last-write-wins, AFAS-as-authoritative, or manual review for conflicts — the policy appropriate to the data domain and the business process.
Technologies Used
- C# / ASP.NET Core — primary implementation language for AFAS integrations, leveraging the .NET HTTP client and the JSON handling that AFAS API interaction requires
- Rust / Axum — high-performance AFAS data processing for integrations with large data volumes or strict latency requirements
- REST / HTTP — AFAS App Connector API communication
- JSON — AFAS API request and response format
- SQL (PostgreSQL / MySQL) — synchronisation state storage, delta tracking, data staging
- Redis — integration job coordination, rate limit management, caching of frequently accessed AFAS reference data
- Hangfire / Quartz.NET — scheduled job execution for C# integration services
- Docker — containerised integration service deployment
- GitHub Actions — CI/CD pipeline for integration service deployment
AFAS Integration in the Dutch Business Context
AFAS Profit's position in the Dutch business software market means that AFAS integration is a common requirement across the custom software projects we build for Dutch organisations. The employee portal that needs to show AFAS HR data without requiring employees to log into AFAS directly. The operational system that generates financial transactions that need to appear in AFAS without manual re-entry. The supplier portal that needs to sync with AFAS procurement data. The custom dashboard that needs to draw from AFAS financial reports.
In each of these cases, the AFAS App Connector is the integration mechanism — and building integrations that use it correctly, handle errors appropriately, maintain data consistency, and remain operational over the long term requires the specific knowledge of AFAS's integration model that experience with the platform provides.
Connected to AFAS, Working for Your Business
AFAS integrations built to production standards — correct authentication, robust error handling, incremental delta synchronisation, and the monitoring that surfaces integration issues before they affect business operations — keep your systems connected to AFAS reliably rather than requiring ongoing manual attention to maintain the data flows that your operations depend on.