
The Basics of Async Apex

A comprehensive guide to asynchronous Apex — batch classes, scheduled classes, queueable classes, future methods, and platform events. When to use each, how to set them up, and best practices.

Part 53: The Basics of Async Apex

Welcome back to the Salesforce series. Up to this point, most of the Apex we have written runs synchronously — you call a method, it executes, and it returns a result before anything else happens. That works well for small operations, but Salesforce has strict governor limits on synchronous transactions. When you need to process thousands of records, make callouts to external systems, or schedule work to run at a specific time, synchronous Apex is not enough.

That is where asynchronous Apex comes in. Async Apex lets you run code in the background, outside the context of the original transaction. Salesforce provides four distinct async patterns, each designed for different use cases, plus Platform Events for event-driven architectures. This post covers all of them in detail.

By the end you will understand when to use each async pattern, how to implement them, and the governor limit differences that make each one useful.


What is Async Apex?

Synchronous vs Asynchronous Execution

In synchronous execution, code runs line by line. Each statement must complete before the next one begins. The user (or calling process) waits for the entire operation to finish. In a trigger context, the record save does not complete until all trigger logic finishes executing.

In asynchronous execution, the platform queues work for later processing. The calling code continues without waiting for the async work to complete. Salesforce runs the async job when resources are available — typically within a few seconds, but there is no guaranteed timing.

Why Async Exists

The primary reason async Apex exists is governor limits. Synchronous transactions have strict limits:

  • 50,000 SOQL query rows per transaction
  • 10,000 DML rows per transaction
  • 100 SOQL queries per transaction
  • 150 DML statements per transaction
  • 10-second CPU time limit
  • 6 MB heap size limit
  • No callouts from triggers (synchronous context)

When you need to process 500,000 Account records, these limits make it impossible in a single synchronous transaction. Async Apex solves this by breaking the work into smaller chunks, each with its own set of governor limits.

The Four Async Patterns

Salesforce provides four built-in async execution patterns:

Pattern           Primary Use Case
Batch Apex        Processing large data volumes (thousands to millions of records)
Scheduled Apex    Running code at specific times or on a recurring schedule
Queueable Apex    Complex async processing with job chaining and state
Future Methods    Simple, fire-and-forget async work like callouts

Each pattern has different governor limits, different levels of control, and different strengths. We will cover each one in detail.


How to Set Up a Batch Class

Batch Apex is the workhorse of asynchronous processing in Salesforce. When you need to process thousands or millions of records, batch is the answer. It works by dividing a large dataset into smaller chunks (called scopes), processing each chunk in its own transaction with its own governor limits.

The Database.Batchable Interface

To create a batch class, you implement the Database.Batchable<sObject> interface. This interface requires three methods:

public class AccountCleanupBatch implements Database.Batchable<sObject> {

    // Step 1: Gather the records to process
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            'SELECT Id, Name, Phone, LastModifiedDate FROM Account WHERE Phone = null'
        );
    }

    // Step 2: Process each chunk of records
    public void execute(Database.BatchableContext bc, List<sObject> scope) {
        List<Account> accountsToUpdate = new List<Account>();

        for (sObject s : scope) {
            Account acc = (Account) s;
            acc.Phone = '000-000-0000';
            accountsToUpdate.add(acc);
        }

        if (!accountsToUpdate.isEmpty()) {
            update accountsToUpdate;
        }
    }

    // Step 3: Run any post-processing logic
    public void finish(Database.BatchableContext bc) {
        AsyncApexJob job = [
            SELECT Id, Status, NumberOfErrors, JobItemsProcessed, TotalJobItems
            FROM AsyncApexJob
            WHERE Id = :bc.getJobId()
        ];

        System.debug('Batch completed. Status: ' + job.Status);
        System.debug('Items processed: ' + job.JobItemsProcessed + ' of ' + job.TotalJobItems);
        System.debug('Errors: ' + job.NumberOfErrors);
    }
}

The Three Methods Explained

start() — This method runs once at the beginning. It defines the dataset that the batch will process. You almost always return a Database.QueryLocator, which allows you to query up to 50 million records (far beyond the normal 50,000 row limit). You can also return an Iterable<sObject> if you need custom logic to build your dataset, but the Iterable approach is limited to the standard 50,000 row governor limit.

execute() — This method runs once per chunk. The scope parameter contains the records for that chunk. Each execute call gets its own set of governor limits. If your batch processes 10,000 records with a scope size of 200, the execute method runs 50 times — each time with a fresh set of limits.

finish() — This method runs once after all chunks have been processed. Use it for post-processing: sending notification emails, logging results, or chaining another batch job.
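For example, chaining another batch from finish() is just another Database.executeBatch() call. A minimal sketch, where ContactCleanupBatch is a hypothetical follow-up class, not one defined in this post:

public void finish(Database.BatchableContext bc) {
    // ContactCleanupBatch is a hypothetical follow-up batch class.
    // Launching it here guarantees it starts only after every chunk
    // of the current job has finished processing.
    Database.executeBatch(new ContactCleanupBatch(), 200);
}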

Database.BatchableContext

The Database.BatchableContext parameter (commonly named bc) is passed to all three methods. It gives you access to the job ID via bc.getJobId(), which you can use to query the AsyncApexJob object for status information.

Running a Batch and Scope Size

To execute a batch class, use Database.executeBatch():

// Default scope size (200 records per chunk)
AccountCleanupBatch batch = new AccountCleanupBatch();
Id jobId = Database.executeBatch(batch);

// Custom scope size (50 records per chunk)
Id customJobId = Database.executeBatch(batch, 50);

The scope size controls how many records are passed to each execute() call. The default is 200, and the maximum is 2,000. Smaller scope sizes are useful when your execute logic is complex or you are approaching governor limits within a single chunk. If your batch makes callouts, the maximum scope size drops to 100.

Database.Stateful

By default, batch classes do not maintain state between execute calls. Each execute runs in its own transaction with fresh member variables. If you need to accumulate data across chunks — like counting the total number of records processed — implement the Database.Stateful interface:

public class RecordCounterBatch implements Database.Batchable<sObject>, Database.Stateful {

    public Integer totalRecordsProcessed = 0;
    public Integer totalErrors = 0;

    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Name FROM Contact WHERE MailingCity = null');
    }

    public void execute(Database.BatchableContext bc, List<sObject> scope) {
        List<Contact> contactsToUpdate = new List<Contact>();

        for (sObject s : scope) {
            Contact c = (Contact) s;
            c.MailingCity = 'Unknown';
            contactsToUpdate.add(c);
        }

        List<Database.SaveResult> results = Database.update(contactsToUpdate, false);

        for (Database.SaveResult sr : results) {
            if (sr.isSuccess()) {
                totalRecordsProcessed++;
            } else {
                totalErrors++;
            }
        }
    }

    public void finish(Database.BatchableContext bc) {
        System.debug('Total records updated: ' + totalRecordsProcessed);
        System.debug('Total errors: ' + totalErrors);
    }
}

Without Database.Stateful, totalRecordsProcessed and totalErrors would reset to 0 at the start of every execute call. With it, their values persist across all chunks.

Warning: Using Database.Stateful means Salesforce must serialize and deserialize your class between every execute call. Keep your instance variables lean — do not store large collections in stateful batch classes.

Making Callouts from Batch Apex

If your batch needs to make HTTP callouts to external systems, implement Database.AllowsCallouts:

public class ExternalSyncBatch implements Database.Batchable<sObject>, Database.AllowsCallouts {

    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Name, ExternalId__c FROM Account WHERE NeedsSync__c = true');
    }

    public void execute(Database.BatchableContext bc, List<sObject> scope) {
        for (sObject s : scope) {
            Account acc = (Account) s;

            HttpRequest req = new HttpRequest();
            req.setEndpoint('callout:ExternalSystem/api/accounts/' + acc.ExternalId__c);
            req.setMethod('PUT');
            req.setHeader('Content-Type', 'application/json');
            req.setBody(JSON.serialize(acc));

            Http http = new Http();
            HttpResponse res = http.send(req);

            if (res.getStatusCode() == 200) {
                acc.NeedsSync__c = false;
            }
        }

        update scope;
    }

    public void finish(Database.BatchableContext bc) {
        System.debug('External sync batch complete.');
    }
}

Remember that when making callouts, the maximum scope size is 100 (not the usual 2,000), and each execute call is limited to 100 callouts.

Monitoring Batch Jobs

You can monitor batch jobs in several ways:

Setup UI: Navigate to Setup > Environments > Jobs > Apex Jobs. This page shows all async jobs with their status, number of batches processed, and errors.

SOQL: Query the AsyncApexJob object directly:

List<AsyncApexJob> jobs = [
    SELECT Id, Status, JobType, ApexClassId, MethodName,
           NumberOfErrors, JobItemsProcessed, TotalJobItems,
           CreatedDate, CompletedDate
    FROM AsyncApexJob
    WHERE JobType = 'BatchApex'
    ORDER BY CreatedDate DESC
    LIMIT 10
];

Programmatic abort: If a batch is running and you need to stop it:

System.abortJob('707XXXXXXXXXXXX'); // Pass the AsyncApexJob Id

Testing Batch Classes

Testing batch classes requires Test.startTest() and Test.stopTest(). The Test.stopTest() call forces all async work to execute synchronously so your assertions can run against the results.

@isTest
private class AccountCleanupBatchTest {

    @TestSetup
    static void setupData() {
        List<Account> accounts = new List<Account>();
        for (Integer i = 0; i < 200; i++) {
            accounts.add(new Account(
                Name = 'Test Account ' + i
                // Phone is deliberately null
            ));
        }
        insert accounts;
    }

    @isTest
    static void testBatchProcessesRecords() {
        Test.startTest();
        AccountCleanupBatch batch = new AccountCleanupBatch();
        Database.executeBatch(batch);
        Test.stopTest();

        List<Account> updatedAccounts = [
            SELECT Id, Phone FROM Account WHERE Phone = '000-000-0000'
        ];
        System.assertEquals(200, updatedAccounts.size(),
            'All accounts should have their phone updated');
    }
}

Key points about testing batch classes:

  • Always create test data in your test method or @TestSetup
  • Call Database.executeBatch() between Test.startTest() and Test.stopTest()
  • Test.stopTest() forces synchronous execution of the batch
  • In test context, the batch runs as a single execute call regardless of scope size
  • Assertions go after Test.stopTest()

How to Set Up a Scheduled Class

Scheduled Apex lets you run code at a specific time or on a recurring schedule. Common use cases include nightly data cleanup, weekly report generation, and periodic integration syncs. You often pair Scheduled Apex with Batch Apex — the scheduled class kicks off a batch job at the right time.

The Schedulable Interface

To create a scheduled class, implement the Schedulable interface:

public class NightlyCleanupScheduler implements Schedulable {

    public void execute(SchedulableContext sc) {
        // Launch a batch job
        AccountCleanupBatch batch = new AccountCleanupBatch();
        Database.executeBatch(batch, 200);
    }
}

The execute method is the entry point. The SchedulableContext parameter gives you access to the trigger ID via sc.getTriggerId(), which corresponds to the CronTrigger record for this scheduled job.
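For example, you can use that trigger ID to look up the CronTrigger record for the running job. A minimal sketch:

public void execute(SchedulableContext sc) {
    CronTrigger ct = [
        SELECT Id, CronExpression, NextFireTime
        FROM CronTrigger
        WHERE Id = :sc.getTriggerId()
    ];
    System.debug('This job next fires at: ' + ct.NextFireTime);
}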

CRON Expressions

Salesforce uses CRON expressions to define schedules. A CRON expression has seven fields:

Seconds  Minutes  Hours  Day_of_Month  Month  Day_of_Week  Year
Field          Values                       Special Characters
Seconds        0-59                         None
Minutes        0-59                         None
Hours          0-23                         None
Day of Month   1-31                         , - * ? L W
Month          1-12 or JAN-DEC              , - *
Day of Week    1-7 (1=Sunday) or SUN-SAT    , - * ? L #
Year           1970-2099                    , - *

Key rules:

  • You cannot specify both Day_of_Month and Day_of_Week. One must be ? (no specific value).
  • L means “last” — L in Day_of_Month means the last day of the month. 2L in Day_of_Week means the last Monday.
  • W means “nearest weekday” — 15W means the nearest weekday to the 15th.
  • # means “nth occurrence” — 2#3 means the third Monday of the month.

CRON Expression Examples

// Every day at midnight
String cronExp = '0 0 0 * * ?';

// Every weekday at 6:00 AM
String cronExp = '0 0 6 ? * MON-FRI';

// Every Monday at 8:30 AM
String cronExp = '0 30 8 ? * 2';

// First day of every month at 1:00 AM
String cronExp = '0 0 1 1 * ?';

// Every hour on the hour
String cronExp = '0 0 * * * ?';

// Last day of every month at 11:00 PM
String cronExp = '0 0 23 L * ?';

// Every Saturday at 9:00 AM
String cronExp = '0 0 9 ? * 7';

// January 1st every year at midnight
String cronExp = '0 0 0 1 1 ? *';

Scheduling with System.schedule

Use System.schedule() to schedule your class programmatically:

// Schedule to run every day at midnight
NightlyCleanupScheduler scheduler = new NightlyCleanupScheduler();
String jobId = System.schedule(
    'Nightly Account Cleanup',   // Job name (must be unique)
    '0 0 0 * * ?',               // CRON expression
    scheduler                     // Schedulable instance
);

You can execute this from Anonymous Apex in the Developer Console, or from another Apex class.

Scheduling from the Setup UI

You can also schedule classes from the Setup UI:

  1. Navigate to Setup > Custom Code > Apex Classes.
  2. Click Schedule Apex.
  3. Select the class (it must implement Schedulable).
  4. Give the job a name.
  5. Set the frequency (weekly or monthly), start date, end date, and preferred start time.

The Setup UI is more limited than CRON expressions — you cannot express every possible schedule through the UI.

Scheduling Limits and Considerations

  • You can have a maximum of 100 scheduled Apex jobs at one time.
  • Scheduled Apex counts as asynchronous Apex, so the execute method runs with the higher asynchronous governor limits. It is still a single transaction, though.
  • If you need to process large data volumes, schedule a class that kicks off a batch — do not put heavy processing logic directly in the scheduled class.
  • Scheduled Apex runs in system context — it does not run as a specific user in terms of sharing rules unless you explicitly enforce them.
  • You can query scheduled jobs using the CronTrigger object:

List<CronTrigger> scheduledJobs = [
    SELECT Id, CronJobDetail.Name, State, NextFireTime, PreviousFireTime
    FROM CronTrigger
    WHERE CronJobDetail.JobType = '7'
    ORDER BY NextFireTime ASC
];

  • To abort a scheduled job:

System.abortJob('08eXXXXXXXXXXXX'); // Pass the CronTrigger Id
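On the system-context point above: declaring the scheduled class (or the classes it delegates to) with sharing enforces the scheduling user's record access. A minimal sketch; NightlySummaryScheduler is a hypothetical class used for illustration:

// "with sharing" makes queries inside the job respect sharing rules.
public with sharing class NightlySummaryScheduler implements Schedulable {

    public void execute(SchedulableContext sc) {
        // Only cases visible under the running user's sharing rules are counted.
        Integer openCases = [SELECT COUNT() FROM Case WHERE IsClosed = false];
        System.debug('Open cases visible to this job: ' + openCases);
    }
}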

Testing Scheduled Classes

@isTest
private class NightlyCleanupSchedulerTest {

    @isTest
    static void testSchedulerExecutes() {
        Test.startTest();
        String cronExp = '0 0 0 * * ?';
        String jobId = System.schedule(
            'Test Nightly Cleanup',
            cronExp,
            new NightlyCleanupScheduler()
        );
        Test.stopTest();

        CronTrigger ct = [
            SELECT Id, CronExpression, TimesTriggered, NextFireTime
            FROM CronTrigger
            WHERE Id = :jobId
        ];

        System.assertEquals(cronExp, ct.CronExpression, 'CRON expression should match');
        System.assertEquals(0, ct.TimesTriggered, 'Job should not have fired yet');
    }
}

How to Set Up a Queueable Class

Queueable Apex is the modern replacement for future methods in most scenarios. It combines the simplicity of future methods with the power and flexibility of batch classes. You can chain jobs, pass complex data types, and monitor job progress.

The Queueable Interface

To create a queueable class, implement the Queueable interface:

public class OpportunityFollowUpQueueable implements Queueable {

    private List<Id> opportunityIds;

    // Constructor lets you pass data into the job
    public OpportunityFollowUpQueueable(List<Id> opportunityIds) {
        this.opportunityIds = opportunityIds;
    }

    public void execute(QueueableContext context) {
        List<Opportunity> opps = [
            SELECT Id, Name, StageName, OwnerId, CloseDate
            FROM Opportunity
            WHERE Id IN :opportunityIds
        ];

        List<Task> followUpTasks = new List<Task>();

        for (Opportunity opp : opps) {
            followUpTasks.add(new Task(
                Subject = 'Follow up on ' + opp.Name,
                WhatId = opp.Id,
                OwnerId = opp.OwnerId,
                ActivityDate = Date.today().addDays(3),
                Status = 'Not Started',
                Priority = 'High'
            ));
        }

        if (!followUpTasks.isEmpty()) {
            insert followUpTasks;
        }
    }
}

Enqueuing a Queueable Job

Use System.enqueueJob() to add your job to the queue:

List<Id> oppIds = new List<Id>{ '006XXXXXXXXXXXX', '006YYYYYYYYYY' };
OpportunityFollowUpQueueable job = new OpportunityFollowUpQueueable(oppIds);
Id jobId = System.enqueueJob(job);

Passing Data via the Constructor

One of the biggest advantages of Queueable over future methods is the ability to pass complex data types through the constructor. Future methods only accept primitive types and collections of primitives. With Queueable, you can pass sObjects, custom Apex types, Maps, and any serializable data:

public class DataMigrationQueueable implements Queueable {

    private Map<Id, Map<String, Object>> recordUpdates;

    public DataMigrationQueueable(Map<Id, Map<String, Object>> recordUpdates) {
        this.recordUpdates = recordUpdates;
    }

    public void execute(QueueableContext context) {
        List<Account> accountsToUpdate = new List<Account>();

        for (Id accountId : recordUpdates.keySet()) {
            Map<String, Object> fields = recordUpdates.get(accountId);
            Account acc = new Account(Id = accountId);

            for (String fieldName : fields.keySet()) {
                acc.put(fieldName, fields.get(fieldName));
            }

            accountsToUpdate.add(acc);
        }

        if (!accountsToUpdate.isEmpty()) {
            update accountsToUpdate;
        }
    }
}

Chaining Queueable Jobs

You can chain queueable jobs by enqueuing a new job from the execute method of the current one. This is useful for sequential processing where each step depends on the previous one:

public class StepOneQueueable implements Queueable {

    public void execute(QueueableContext context) {
        // Do step one work
        List<Account> accounts = [SELECT Id, Name FROM Account WHERE Industry = null LIMIT 500];

        for (Account acc : accounts) {
            acc.Industry = 'Other';
        }
        update accounts;

        // Chain to step two
        System.enqueueJob(new StepTwoQueueable());
    }
}

public class StepTwoQueueable implements Queueable {

    public void execute(QueueableContext context) {
        // Do step two work
        List<Contact> contacts = [SELECT Id, Department FROM Contact WHERE Department = null LIMIT 500];

        for (Contact c : contacts) {
            c.Department = 'Unassigned';
        }
        update contacts;
    }
}

Important chaining rules:

  • You can enqueue only one child job from an executing queueable job's execute method.
  • Chain depth is unlimited in production orgs; Developer Edition and Trial orgs cap a chain at 5 jobs.
  • In test context, chaining is not supported: enqueuing a child job from a running queueable throws an exception, so guard chain calls with Test.isRunningTest() and test each link individually.
  • Do not create infinite chains. Always include a termination condition.
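A common termination pattern is a depth counter passed through the constructor. A sketch; the cap of 10 links is an arbitrary value chosen for illustration:

public class SelfChainingQueueable implements Queueable {

    private Integer depth;

    public SelfChainingQueueable(Integer depth) {
        this.depth = depth;
    }

    public void execute(QueueableContext context) {
        // ... one unit of work goes here ...

        // Stop after 10 links, and never chain inside a test context.
        if (depth < 10 && !Test.isRunningTest()) {
            System.enqueueJob(new SelfChainingQueueable(depth + 1));
        }
    }
}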

Making Callouts from Queueable Jobs

To make callouts from a Queueable, implement Database.AllowsCallouts:

public class ExternalNotificationQueueable implements Queueable, Database.AllowsCallouts {

    private List<String> emails;

    public ExternalNotificationQueueable(List<String> emails) {
        this.emails = emails;
    }

    public void execute(QueueableContext context) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:NotificationService/api/notify');
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(new Map<String, Object>{
            'recipients' => emails,
            'message' => 'Your records have been processed.'
        }));

        Http http = new Http();
        HttpResponse res = http.send(req);

        if (res.getStatusCode() != 200) {
            System.debug(LoggingLevel.ERROR, 'Notification failed: ' + res.getBody());
        }
    }
}

The Finalizer Interface

The Finalizer interface lets you attach cleanup logic that runs after a Queueable job completes — whether it succeeded or failed. This is especially useful for error handling and retry logic:

public class RobustQueueable implements Queueable {

    public void execute(QueueableContext context) {
        // Attach a finalizer
        System.attachFinalizer(new RobustQueueableFinalizer());

        // Do the actual work
        List<Lead> leads = [SELECT Id, Status FROM Lead WHERE Status = 'Open' LIMIT 1000];
        for (Lead l : leads) {
            l.Status = 'Working';
        }
        update leads;
    }
}

public class RobustQueueableFinalizer implements Finalizer {

    public void execute(FinalizerContext ctx) {
        if (ctx.getResult() == ParentJobResult.SUCCESS) {
            System.debug('Queueable completed successfully.');
        } else {
            System.debug('Queueable failed. Exception: ' + ctx.getException().getMessage());

            // Retry logic — enqueue the job again
            System.enqueueJob(new RobustQueueable());
        }
    }
}

The FinalizerContext gives you access to getResult() (SUCCESS or UNHANDLED_EXCEPTION) and getException() if the job failed. Finalizers are powerful for building resilient async processing.

Testing Queueable Classes

@isTest
private class OpportunityFollowUpQueueableTest {

    @TestSetup
    static void setupData() {
        Account acc = new Account(Name = 'Test Corp');
        insert acc;

        Opportunity opp = new Opportunity(
            Name = 'Test Deal',
            AccountId = acc.Id,
            StageName = 'Prospecting',
            CloseDate = Date.today().addDays(30)
        );
        insert opp;
    }

    @isTest
    static void testFollowUpTasksCreated() {
        Opportunity opp = [SELECT Id FROM Opportunity LIMIT 1];

        Test.startTest();
        System.enqueueJob(new OpportunityFollowUpQueueable(new List<Id>{ opp.Id }));
        Test.stopTest();

        List<Task> tasks = [SELECT Id, Subject, WhatId FROM Task WHERE WhatId = :opp.Id];
        System.assertEquals(1, tasks.size(), 'One follow-up task should be created');
        System.assert(tasks[0].Subject.contains('Follow up on'), 'Task subject should reference the opportunity');
    }
}

Like batch classes, Test.stopTest() forces the queueable to execute synchronously. Be aware that chaining is not supported in test context: guard any System.enqueueJob() call made from within execute with Test.isRunningTest(), and test each job in the chain individually.


How to Set Up a Future Method

Future methods are the simplest form of async Apex. They are regular static methods annotated with @future. When called, they do not execute immediately — Salesforce queues them for later execution.

The @future Annotation

public class AccountService {

    @future
    public static void updateAccountRatings(Set<Id> accountIds) {
        List<Account> accounts = [
            SELECT Id, AnnualRevenue, Rating
            FROM Account
            WHERE Id IN :accountIds
        ];

        for (Account acc : accounts) {
            if (acc.AnnualRevenue != null && acc.AnnualRevenue > 1000000) {
                acc.Rating = 'Hot';
            } else if (acc.AnnualRevenue != null && acc.AnnualRevenue > 500000) {
                acc.Rating = 'Warm';
            } else {
                acc.Rating = 'Cold';
            }
        }

        update accounts;
    }
}

Making Callouts with @future(callout=true)

A very common use case for future methods is making HTTP callouts from trigger context. You cannot make callouts directly from a trigger, but you can call a future method that makes the callout:

public class ExternalLoggingService {

    @future(callout=true)
    public static void logToExternalSystem(String endpoint, String payload) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint(endpoint);
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(payload);

        Http http = new Http();
        HttpResponse res = http.send(req);

        if (res.getStatusCode() != 200) {
            System.debug(LoggingLevel.ERROR, 'External log failed: ' + res.getStatusCode());
        }
    }
}

// Called from a trigger handler
public class AccountTriggerHandler {

    public static void afterInsert(List<Account> newAccounts) {
        for (Account acc : newAccounts) {
            String payload = JSON.serialize(new Map<String, String>{
                'accountName' => acc.Name,
                'event' => 'CREATED'
            });
            ExternalLoggingService.logToExternalSystem(
                'callout:AuditService/api/log',
                payload
            );
        }
    }
}

Limitations of Future Methods

Future methods come with significant restrictions:

  1. Parameters must be primitive types — You cannot pass sObjects or custom Apex types. You must pass Id, String, Set<Id>, List<String>, etc. This is why we pass Set<Id> and then re-query the records inside the method.
  2. No return value — Future methods must return void.
  3. No chaining — You cannot call a future method from another future method.
  4. No monitoring — Future methods do not return a job ID. You cannot track their progress or status the way you can with batch or queueable.
  5. No guaranteed order — If you call multiple future methods, Salesforce does not guarantee the order in which they execute.
  6. Limit of 50 future calls per transaction — You can enqueue a maximum of 50 future method invocations in a single transaction.
  7. Cannot be called from batch Apex — You cannot invoke a future method from a batch class’s execute method.

When to Use Future Methods vs Queueable

In general, prefer Queueable over future methods for new development. Queueable provides everything future methods do, plus:

  • Accepts complex data types (sObjects, Maps, custom classes)
  • Returns a job ID for monitoring
  • Supports chaining
  • Supports Finalizers for error handling

Use future methods when:

  • You have a very simple, fire-and-forget operation
  • You are working in a codebase that already uses them and consistency matters
  • You need a static method that can be called from a trigger context to make a callout

What Are Platform Events?

Platform Events are a completely different paradigm from the async patterns above. While batch, scheduled, queueable, and future all deal with deferring work, Platform Events implement an event-driven architecture using a publish-subscribe model.

The Publish-Subscribe Model

In a publish-subscribe (pub/sub) system:

  • Publishers create and send events to an event bus. They do not know or care who receives the events.
  • Subscribers listen for specific event types on the event bus. They do not know or care who published the events.
  • The Event Bus is the infrastructure layer that manages delivery. Salesforce manages this for you.

This decoupling is powerful. A single event can have multiple subscribers — a trigger, a Flow, a Lightning component, and an external system can all react to the same event independently.

How Platform Events Differ from Custom Objects

Platform Events might look like custom objects in Setup (they have fields and an API name), but they behave very differently:

  • Storage: custom object records are stored permanently in the database; events are transient, held only temporarily on the event bus.
  • CRUD: custom objects support full create, read, update, and delete; events can only be published (inserted), with no update or delete.
  • Triggers: custom objects support before/after insert, update, delete, and undelete; platform events support after insert only.
  • SOQL: custom objects have full SOQL support; platform events cannot be queried with SOQL and are instead delivered to subscribers.
  • API name suffix: __c for custom objects, __e for platform events.
  • Relationships: custom objects support lookup and master-detail fields; platform events have no relationship fields.
  • Retention: custom object records persist indefinitely; events are retained on the event bus for 24 hours (standard volume) or up to 72 hours (high volume).

Use Cases for Platform Events

  • Logging and auditing — Publish events when significant actions occur. Subscribers record them asynchronously.
  • Cross-system integration — External systems subscribe to Salesforce events via the Streaming API or CometD.
  • Decoupled triggers — Instead of complex trigger-to-trigger dependencies, have triggers publish events and other triggers (or Flows) subscribe.
  • Real-time notifications — Lightning Web Components subscribe to events for live UI updates.
  • Error handling — Publish error events from batch or queueable jobs. A subscriber logs them or sends alerts.

How to Set Up a Platform Event

Step 1: Create the Platform Event in Setup

  1. Navigate to Setup > Integrations > Platform Events.
  2. Click New Platform Event.
  3. Fill in the details:
    • Label: Order Status Update
    • Plural Label: Order Status Updates
    • API Name: Order_Status_Update__e (note the __e suffix)
    • Publish Behavior: Choose “Publish After Commit” (default, recommended) or “Publish Immediately”
  4. Add custom fields to the event:
    • Order_Id__c (Text, 18) — The Salesforce Order record ID
    • New_Status__c (Text, 50) — The new status value
    • Changed_By__c (Text, 100) — Who made the change
    • Change_Timestamp__c (DateTime) — When the change happened

Publish After Commit vs Publish Immediately:

  • Publish After Commit — The event is only published if the transaction commits successfully. If the transaction rolls back, the event is discarded. This is safer for most use cases.
  • Publish Immediately — The event is published as soon as EventBus.publish() is called, even if the transaction later rolls back. Useful for logging and auditing where you want to capture the event regardless of transaction outcome.

Step 2: Publishing Events

Use EventBus.publish() to publish events from Apex:

public class OrderEventPublisher {

    public static void publishStatusChange(Id orderId, String newStatus) {
        Order_Status_Update__e event = new Order_Status_Update__e(
            Order_Id__c = orderId,
            New_Status__c = newStatus,
            Changed_By__c = UserInfo.getName(),
            Change_Timestamp__c = DateTime.now()
        );

        Database.SaveResult result = EventBus.publish(event);

        if (result.isSuccess()) {
            System.debug('Event published successfully. ID: ' + result.getId());
        } else {
            for (Database.Error err : result.getErrors()) {
                System.debug(LoggingLevel.ERROR, 'Event publish error: ' + err.getMessage());
            }
        }
    }

    // Publishing multiple events at once
    public static void publishBulkStatusChanges(Map<Id, String> orderStatusMap) {
        List<Order_Status_Update__e> events = new List<Order_Status_Update__e>();

        for (Id orderId : orderStatusMap.keySet()) {
            events.add(new Order_Status_Update__e(
                Order_Id__c = orderId,
                New_Status__c = orderStatusMap.get(orderId),
                Changed_By__c = UserInfo.getName(),
                Change_Timestamp__c = DateTime.now()
            ));
        }

        List<Id> orderIds = new List<Id>(orderStatusMap.keySet());
        List<Database.SaveResult> results = EventBus.publish(events);

        for (Integer i = 0; i < results.size(); i++) {
            if (!results[i].isSuccess()) {
                System.debug(LoggingLevel.ERROR,
                    'Failed to publish event for Order: ' + orderIds[i]);
            }
        }
    }
}

Step 3: Subscribing with Triggers

Platform event triggers look similar to sObject triggers, but they only support the after insert event:

trigger OrderStatusUpdateTrigger on Order_Status_Update__e (after insert) {
    OrderStatusUpdateHandler.handleEvents(Trigger.new);
}

public class OrderStatusUpdateHandler {

    public static void handleEvents(List<Order_Status_Update__e> events) {
        Set<Id> orderIds = new Set<Id>();
        Map<Id, String> orderStatusMap = new Map<Id, String>();

        for (Order_Status_Update__e evt : events) {
            Id orderId = evt.Order_Id__c;
            orderIds.add(orderId);
            orderStatusMap.put(orderId, evt.New_Status__c);
        }

        List<Order> orders = [
            SELECT Id, Status FROM Order WHERE Id IN :orderIds
        ];

        for (Order ord : orders) {
            if (orderStatusMap.containsKey(ord.Id)) {
                ord.Status = orderStatusMap.get(ord.Id);
            }
        }

        if (!orders.isEmpty()) {
            update orders;
        }
    }
}

Key points about platform event triggers:

  • They only run as after insert.
  • They run under the Automated Process user, not the user who published the event.
  • They have their own governor limits — separate from the publishing transaction.
  • Failed deliveries are not retried for ordinary unhandled exceptions. To request a retry, throw EventBus.RetryableException from the trigger; Salesforce then resends the events, up to a limited number of attempts.
  • Use EventBus.TriggerContext.currentContext().setResumeCheckpoint(event.ReplayId) to set a checkpoint. If the trigger fails after this point, retries resume from the checkpoint rather than reprocessing already-handled events.
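
A sketch of that checkpoint pattern, restructuring the Step 3 trigger to process events one at a time (handleSingleEvent is a hypothetical per-event helper, not the bulk handler shown earlier):

```apex
trigger OrderStatusUpdateTrigger on Order_Status_Update__e (after insert) {
    for (Order_Status_Update__e evt : Trigger.new) {
        // Hypothetical per-event helper; assume it may throw
        // EventBus.RetryableException on a transient failure
        OrderStatusUpdateHandler.handleSingleEvent(evt);

        // Mark this event as processed. If a later event in the batch
        // triggers a retry, delivery resumes after this Replay ID
        // instead of redelivering events we already handled.
        EventBus.TriggerContext.currentContext().setResumeCheckpoint(evt.ReplayId);
    }
}
```

Checkpointing per event trades bulkification for safe resumability; for high-volume events you would process and checkpoint in chunks instead.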

Step 4: Subscribing with Flows

Platform events can also trigger Flows:

  1. Create a new Platform Event-Triggered Flow in Flow Builder.
  2. Select your platform event (e.g., Order_Status_Update__e).
  3. Build your Flow logic using the event’s fields as input.

This is useful for declarative automation. For example, you might have a Flow that sends a Slack notification when an order status changes, without writing any code.

Step 5: Subscribing with Emp API in LWC

Lightning Web Components can subscribe to platform events in real time using the lightning/empApi module. This is how you build live-updating UIs:

// orderStatusMonitor.js
import { LightningElement } from 'lwc';
import { subscribe, unsubscribe, onError } from 'lightning/empApi';

export default class OrderStatusMonitor extends LightningElement {
    channelName = '/event/Order_Status_Update__e';
    subscription = {};
    events = [];

    connectedCallback() {
        this.handleSubscribe();
        this.registerErrorListener();
    }

    disconnectedCallback() {
        this.handleUnsubscribe();
    }

    handleSubscribe() {
        const messageCallback = (response) => {
            const eventData = response.data.payload;
            this.events = [
                ...this.events,
                {
                    orderId: eventData.Order_Id__c,
                    status: eventData.New_Status__c,
                    changedBy: eventData.Changed_By__c,
                    timestamp: eventData.Change_Timestamp__c
                }
            ];
        };

        subscribe(this.channelName, -1, messageCallback).then((response) => {
            this.subscription = response;
            console.log('Subscribed to: ', JSON.stringify(response.channel));
        });
    }

    handleUnsubscribe() {
        unsubscribe(this.subscription, (response) => {
            console.log('Unsubscribed: ', JSON.stringify(response));
        });
    }

    registerErrorListener() {
        onError((error) => {
            console.error('EmpApi error: ', JSON.stringify(error));
        });
    }
}

<!-- orderStatusMonitor.html -->
<template>
    <lightning-card title="Live Order Status Updates">
        <template for:each={events} for:item="evt">
            <div key={evt.orderId} class="slds-p-around_small slds-border_bottom">
                <p><strong>Order:</strong> {evt.orderId}</p>
                <p><strong>New Status:</strong> {evt.status}</p>
                <p><strong>Changed By:</strong> {evt.changedBy}</p>
                <p><strong>Time:</strong> {evt.timestamp}</p>
            </div>
        </template>
        <template if:false={events.length}>
            <p class="slds-p-around_small slds-text-color_weak">
                Waiting for events...
            </p>
        </template>
    </lightning-card>
</template>

Replay IDs

Every published event gets a Replay ID — an incrementing number that identifies the event’s position on the event bus. When subscribing, you can specify a Replay ID to control where you start reading:

  • -1 — Subscribe to new events only (from this point forward).
  • -2 — Subscribe to all available events in the retention window (24 hours for standard-volume events, 72 hours for high-volume events) plus new events.
  • Specific Replay ID — Start reading from a specific position. Useful for resuming after a disconnection.

The ReplayId field is available on every platform event instance in Apex triggers via event.ReplayId.

Delivery Guarantees

Platform events provide at-least-once delivery. This means:

  • Every published event will be delivered to every subscriber at least once.
  • In rare cases (network issues, retries), an event may be delivered more than once.
  • Your subscriber logic should be idempotent — it should produce the same result whether it processes an event once or multiple times.
  • Events are delivered in order within a single publisher’s transaction. Cross-transaction ordering is not guaranteed.
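
The Step 3 handler is already close to idempotent; this sketch makes it explicit by skipping orders that are already in the target status, so a redelivered event performs no extra DML (handleEventsIdempotently is a name introduced here for illustration):

```apex
public static void handleEventsIdempotently(List<Order_Status_Update__e> events) {
    Map<Id, String> orderStatusMap = new Map<Id, String>();
    for (Order_Status_Update__e evt : events) {
        orderStatusMap.put((Id) evt.Order_Id__c, evt.New_Status__c);
    }

    List<Order> toUpdate = new List<Order>();
    for (Order ord : [SELECT Id, Status FROM Order WHERE Id IN :orderStatusMap.keySet()]) {
        // Skip records already in the target status, so processing the
        // same event a second time changes nothing
        if (ord.Status != orderStatusMap.get(ord.Id)) {
            ord.Status = orderStatusMap.get(ord.Id);
            toUpdate.add(ord);
        }
    }

    if (!toUpdate.isEmpty()) {
        update toUpdate;
    }
}
```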

Testing Platform Events

Testing platform events uses the Test.startTest() / Test.stopTest() pattern, plus Test.getEventBus().deliver() to force synchronous delivery in test context:

@isTest
private class OrderEventTest {

    @isTest
    static void testEventPublishAndConsume() {
        // Create test data
        Account acc = new Account(Name = 'Test Corp');
        insert acc;

        Order testOrder = new Order(
            AccountId = acc.Id,
            Status = 'Draft',
            EffectiveDate = Date.today()
        );
        insert testOrder;

        Test.startTest();

        // Publish the event
        Order_Status_Update__e event = new Order_Status_Update__e(
            Order_Id__c = testOrder.Id,
            New_Status__c = 'Activated',
            Changed_By__c = 'Test User',
            Change_Timestamp__c = DateTime.now()
        );

        Database.SaveResult result = EventBus.publish(event);
        System.assertEquals(true, result.isSuccess(), 'Event should publish successfully');

        // Force event delivery so the trigger executes
        Test.getEventBus().deliver();

        Test.stopTest();

        // Verify the trigger processed the event
        Order updatedOrder = [SELECT Id, Status FROM Order WHERE Id = :testOrder.Id];
        System.assertEquals('Activated', updatedOrder.Status,
            'Order status should be updated by the event trigger');
    }
}

Key testing points:

  • Use Test.getEventBus().deliver() to force synchronous event delivery in tests.
  • Events published between Test.startTest() and Test.stopTest() are also delivered when Test.stopTest() runs; calling Test.getEventBus().deliver() lets you control exactly when delivery happens mid-test.
  • You can publish and deliver multiple times in a single test to simulate sequences of events.

Comparison Table: All Async Types

Here is a comprehensive comparison of all five async mechanisms:

| Dimension | Batch Apex | Scheduled Apex | Queueable Apex | Future Methods | Platform Events |
| --- | --- | --- | --- | --- | --- |
| Interface / Annotation | Database.Batchable | Schedulable | Queueable | @future | Trigger on __e object |
| Primary Use Case | Large data volume processing | Time-based execution | Complex async with chaining | Simple fire-and-forget | Event-driven decoupled architecture |
| Max Records | 50 million (QueryLocator) | N/A (delegates to other patterns) | Standard limits per transaction | Standard limits per transaction | N/A (message-based) |
| Governor Limits | Each execute() gets fresh limits | Synchronous limits | Async limits | Async limits | Separate transaction per delivery |
| Chaining | Can chain in finish() | Can schedule another job | Unlimited chain depth in production (5 in Developer/Trial orgs) | Cannot chain | N/A (subscribers are independent) |
| Monitoring | AsyncApexJob, Setup UI | CronTrigger, Setup UI | AsyncApexJob | Limited (no job ID returned; visible in AsyncApexJob) | Event Bus metrics in Setup |
| Callouts | Yes (with Database.AllowsCallouts, scope max 100) | Not directly (kick off batch or queueable) | Yes (with Database.AllowsCallouts) | Yes (with @future(callout=true)) | Not from triggers; use queueable from trigger |
| Stateful | Optional (with Database.Stateful) | N/A | Instance variables persist naturally | Stateless | Stateless (events are independent) |
| Parameter Types | N/A (query-based) | N/A | Any serializable type | Primitives and collections of primitives | Event fields only |
| Return Value | Job ID from Database.executeBatch() | Job ID from System.schedule() | Job ID from System.enqueueJob() | None | Database.SaveResult from publish |
| Testing Approach | Test.startTest() / Test.stopTest() | Test.startTest() / Test.stopTest() | Test.startTest() / Test.stopTest() | Test.startTest() / Test.stopTest() | EventBus.publish() + Test.getEventBus().deliver() |
| Concurrent Limit | 5 active batch jobs per org | 100 scheduled jobs | 50 enqueued per transaction | 50 calls per transaction | Publish limits based on org edition |
| Error Handling | try/catch in execute, results in finish | try/catch in execute | Finalizer interface | try/catch in method | Retry mechanism, setResumeCheckpoint |
| Best For | Nightly data migrations, mass updates, archiving | Recurring jobs, kicking off batches | Integration callouts, sequential processing, moderate data | Callouts from triggers, simple DML | Cross-system notifications, real-time UI updates, audit logging |

Choosing the Right Async Pattern

Here is a decision flow to help you pick the right tool:

  1. Do you need to process more than 50,000 records? Use Batch Apex.
  2. Do you need code to run on a schedule (daily, weekly, monthly)? Use Scheduled Apex (typically to launch a batch or queueable).
  3. Do you need to chain sequential async jobs, pass complex data, or handle errors with Finalizers? Use Queueable Apex.
  4. Do you just need a simple callout from a trigger with no complex data passing? Use a Future Method (though Queueable is still preferred for new code).
  5. Do you need decoupled, event-driven communication between systems or components? Use Platform Events.

In practice, you will often combine multiple patterns. A very common architecture is:

  • A Scheduled Class that runs nightly
  • It kicks off a Batch Class that processes records
  • The batch’s finish method publishes a Platform Event to notify the UI
  • A Lightning Web Component subscribes to the event and updates in real time
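
The hand-off between the last two steps might look like this in the batch's finish method (NightlyOrderBatch and the Batch_Complete__e event with its Job_Id__c field are hypothetical names, not objects defined in this post):

```apex
public class NightlyOrderBatch implements Database.Batchable<SObject> {

    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Status FROM Order WHERE Status = \'Draft\'');
    }

    public void execute(Database.BatchableContext bc, List<Order> scope) {
        // ... process the chunk ...
    }

    public void finish(Database.BatchableContext bc) {
        // Notify subscribers (e.g., an LWC using lightning/empApi)
        // that the nightly run has completed
        EventBus.publish(new Batch_Complete__e(Job_Id__c = bc.getJobId()));
    }
}
```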

Best Practices

  1. Always bulkify. Even in async context, process records in collections rather than one at a time.
  2. Use Database methods with allOrNone = false in batch execute methods. This prevents a single bad record from failing the entire chunk. Capture partial successes with Database.SaveResult.
  3. Keep Scheduled Apex lightweight. The scheduled class should only kick off batch or queueable jobs, not do heavy processing itself.
  4. Limit what you store in Database.Stateful classes. Serialization overhead grows with each chunk. Store counters and IDs, not full sObject lists.
  5. Make platform event subscribers idempotent. At-least-once delivery means your code might process the same event twice.
  6. Use Finalizers for Queueable error recovery. They give you a clean way to log failures and retry without losing the job.
  7. Test each async type individually. Do not rely on chaining in tests — Salesforce limits chaining depth in test context.
  8. Set appropriate scope sizes for batch classes. If your execute logic is complex, reduce the scope. If it is simple, you can increase it up to 2,000 (or 100 with callouts).
  9. Monitor your async jobs. Build a custom dashboard querying AsyncApexJob so you can spot failures quickly.
  10. Respect concurrent limits. You can only have 5 active batch jobs per org. If you need more throughput, consider Queueable or redesign your batch to handle multiple object types in one class.
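
Practice 2 above, sketched inside a batch execute method (the Account scope is a placeholder):

```apex
public void execute(Database.BatchableContext bc, List<Account> scope) {
    for (Account acc : scope) {
        // ... mutate the record ...
    }

    // allOrNone = false: a bad record fails individually instead of
    // rolling back the entire chunk
    List<Database.SaveResult> results = Database.update(scope, false);

    for (Integer i = 0; i < results.size(); i++) {
        if (!results[i].isSuccess()) {
            for (Database.Error err : results[i].getErrors()) {
                System.debug(LoggingLevel.ERROR,
                    'Update failed for ' + scope[i].Id + ': ' + err.getMessage());
            }
        }
    }
}
```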

Summary

Asynchronous Apex is essential for any Salesforce developer. The four async patterns — Batch, Scheduled, Queueable, and Future — each solve different problems, and Platform Events add an event-driven dimension that enables decoupled, real-time architectures.

Batch Apex handles massive data volumes by splitting work into chunks. Scheduled Apex runs code on a time-based schedule. Queueable Apex provides the flexibility of chaining, complex parameters, and Finalizers. Future methods offer a simple way to run lightweight async work like callouts from triggers. Platform Events enable publish-subscribe communication across Apex, Flows, LWC, and external systems.

Understanding when to use each pattern — and how to combine them — is what separates a junior developer from a senior one. The comparison table in this post should be your reference whenever you are deciding which async tool to reach for.

In the next post, Part 54: Async Apex Project — Scheduling a Batch Update, we will put everything from this post into practice by building a complete scheduled batch solution from scratch. We will create a scheduled class, a batch class, error handling with Finalizers, and a platform event to notify users when the batch completes.

See you there.