Salesforce · 27 min read

The Most Common Salesforce Limits

A practical guide to Salesforce governor limits — SOQL, DML, CPU timeout, and integration limits. What they are, why they exist, how to hit them, and proven patterns to avoid them.

Part 56: The Most Common Salesforce Limits

Welcome back to the Salesforce series. If there is one topic that every Salesforce developer must understand deeply, it is governor limits. They are not optional knowledge. They are not edge cases you deal with later. They are the constraints you design around from the very first line of code.

This post is a comprehensive guide to the most common Salesforce limits. We will cover what they are, why they exist, the exact numbers you need to know, code examples that hit them, refactored code that avoids them, and the Limits class methods that let you monitor them at runtime. By the end of this post, you will have a practical reference you can come back to whenever you need to debug or prevent a limit exception.


What Are Limits in Salesforce?

Why Governor Limits Exist

Salesforce is a multi-tenant platform. That means your org shares compute resources — CPU, memory, database connections, and storage — with thousands of other orgs on the same infrastructure. If one org ran a query that returned 10 million rows or an Apex class that looped for 5 minutes, it would degrade performance for everyone else on that shared instance.

Governor limits exist to prevent any single tenant from monopolizing shared resources. They enforce fair usage by capping the amount of work any single transaction can perform. Think of them as guardrails, not punishments. They force you to write efficient, bulkified code — which is better code regardless of the platform.

Synchronous vs Asynchronous Limits

Salesforce applies different limit thresholds depending on whether your code runs synchronously or asynchronously.

Synchronous execution happens in real time — a trigger firing on record save, a Visualforce page loading, a Lightning component calling an Apex method. These transactions have tighter limits because the user is waiting for a response.

Asynchronous execution happens in the background — batch Apex, queueable Apex, future methods, scheduled Apex. These transactions get higher limits because they do not block a user and can be scheduled during off-peak times.
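Shared utility code sometimes needs to know which of these contexts it is running in. The System class exposes a check for each async context, and the Limits methods automatically reflect the thresholds for the current context:

```apex
// Anonymous Apex: detect the current execution context
String context = 'Synchronous';
if (System.isBatch())          context = 'Batch Apex';
else if (System.isFuture())    context = 'Future method';
else if (System.isQueueable()) context = 'Queueable Apex';
else if (System.isScheduled()) context = 'Scheduled Apex';

// The Limits class reflects the context automatically —
// getLimitCpuTime() returns 10000 synchronously and 60000 in async contexts
System.debug(context + ' — CPU limit: ' + Limits.getLimitCpuTime() + ' ms');
```

This is useful in trigger handlers and service classes that are reused across sync and async entry points.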

Here is a quick comparison of the key differences:

| Limit | Synchronous | Asynchronous |
| --- | --- | --- |
| SOQL queries | 100 | 200 |
| SOQL rows retrieved | 50,000 | 50,000 |
| DML statements | 150 | 150 |
| DML rows | 10,000 | 10,000 |
| CPU time | 10,000 ms | 60,000 ms |
| Callouts | 100 | 100 |
| Heap size | 6 MB | 12 MB |
| Total query rows via Database.getQueryLocator | 10,000 | 50,000,000 (Batch) |

The row limit for Database.getQueryLocator in a Batch Apex start method is the notable exception — it jumps from 10,000 to 50 million, which is why batch Apex is the go-to pattern for processing large data volumes.

How to Check Limits at Runtime

Salesforce provides the Limits class with static methods that return the current consumption and maximum allowed for each governor limit. Every method comes in a pair: one that returns the amount consumed so far, and one that returns the maximum allowed.

// Check SOQL queries
System.debug('SOQL queries used: ' + Limits.getQueries());
System.debug('SOQL queries limit: ' + Limits.getLimitQueries());

// Check DML statements
System.debug('DML statements used: ' + Limits.getDmlStatements());
System.debug('DML statements limit: ' + Limits.getLimitDmlStatements());

// Check CPU time
System.debug('CPU time used: ' + Limits.getCpuTime() + ' ms');
System.debug('CPU time limit: ' + Limits.getLimitCpuTime() + ' ms');

// Check heap size
System.debug('Heap size used: ' + Limits.getHeapSize() + ' bytes');
System.debug('Heap size limit: ' + Limits.getLimitHeapSize() + ' bytes');

// Check callouts
System.debug('Callouts used: ' + Limits.getCallouts());
System.debug('Callouts limit: ' + Limits.getLimitCallouts());

You can call these methods at any point in your code. A common practice is to add limit checks in utility methods or at the start and end of trigger handlers to monitor consumption across complex transactions.

public class LimitLogger {
    public static void logLimits(String context) {
        System.debug('=== Limits at ' + context + ' ===');
        System.debug('SOQL: ' + Limits.getQueries() + ' / ' + Limits.getLimitQueries());
        System.debug('DML: ' + Limits.getDmlStatements() + ' / ' + Limits.getLimitDmlStatements());
        System.debug('Rows: ' + Limits.getDmlRows() + ' / ' + Limits.getLimitDmlRows());
        System.debug('CPU: ' + Limits.getCpuTime() + ' / ' + Limits.getLimitCpuTime());
        System.debug('Heap: ' + Limits.getHeapSize() + ' / ' + Limits.getLimitHeapSize());
        System.debug('Callouts: ' + Limits.getCallouts() + ' / ' + Limits.getLimitCallouts());
    }
}

The Limits Tab in Debug Logs

When you open a debug log in the Developer Console, the Limits section shows a summary of every governor limit and how much of it was consumed during that transaction. It looks something like this:

Number of SOQL queries: 5 out of 100
Number of query rows: 230 out of 50000
Number of SOSL queries: 0 out of 20
Number of DML statements: 3 out of 150
Number of DML rows: 45 out of 10000
Maximum CPU time: 1203 out of 10000
Maximum heap size: 48210 out of 6000000
Number of callouts: 0 out of 100

This is one of the first places to look when debugging a limit exception. You can also filter debug logs to show only LIMIT_USAGE and LIMIT_USAGE_FOR_NS events to isolate the limits data quickly.

To see this data in your debug log, set the Apex Profiling log level to INFO or higher in your trace flag configuration — the CUMULATIVE_LIMIT_USAGE and LIMIT_USAGE_FOR_NS events that feed the Limits tab are emitted by that log category.


SOQL Limits and How to Avoid Them

SOQL limits are the ones developers hit most frequently, especially when they are new to the platform. The core limits are:

| Limit | Value | Limits Class Method |
| --- | --- | --- |
| Total SOQL queries per transaction (sync) | 100 | Limits.getQueries() / Limits.getLimitQueries() |
| Total SOQL queries per transaction (async) | 200 | Limits.getQueries() / Limits.getLimitQueries() |
| Total rows retrieved by SOQL queries | 50,000 | Limits.getQueryRows() / Limits.getLimitQueryRows() |
| SOSL queries per transaction | 20 | Limits.getSoslQueries() / Limits.getLimitSoslQueries() |

Anti-Pattern: SOQL Inside a Loop

This is the single most common mistake in Salesforce development. If you write a SOQL query inside a for loop, and that loop iterates over more than 100 records, you will hit the 100-query limit.

// BAD: SOQL inside a loop — will hit 100 query limit
trigger SetAccountRating on Contact (before insert) {
    for (Contact con : Trigger.new) {
        // One query per contact — if 200 contacts are inserted, this runs 200 queries
        Account acc = [SELECT Id, Rating FROM Account WHERE Id = :con.AccountId];
        if (acc.Rating == null) {
            con.Description = 'Account has no rating';
        }
    }
}

If a data load inserts 200 contacts at once, this trigger fires 200 SOQL queries — double the synchronous limit of 100.

Fix: Query Before the Loop and Use a Map

The solution is to collect all the IDs you need, run a single query, store the results in a Map, and then reference the Map inside the loop.

// GOOD: Single query before the loop, Map-based lookup inside
trigger SetAccountRating on Contact (before insert) {
    // Step 1: Collect all Account IDs
    Set<Id> accountIds = new Set<Id>();
    for (Contact con : Trigger.new) {
        if (con.AccountId != null) {
            accountIds.add(con.AccountId);
        }
    }

    // Step 2: Single SOQL query — 1 query regardless of batch size
    Map<Id, Account> accountMap = new Map<Id, Account>(
        [SELECT Id, Rating FROM Account WHERE Id IN :accountIds]
    );

    // Step 3: Map-based lookup in the loop — no additional queries
    for (Contact con : Trigger.new) {
        if (con.AccountId != null && accountMap.containsKey(con.AccountId)) {
            Account acc = accountMap.get(con.AccountId);
            if (acc.Rating == null) {
                con.Description = 'Account has no rating';
            }
        }
    }
}

This pattern — collect IDs, query once, build a Map, iterate — is the most important pattern in Apex development. Memorize it.

The 50,000 Row Limit

Even if you only run one query, you can still hit limits if that query returns too many rows. The total rows returned across all SOQL queries in a single transaction cannot exceed 50,000.

// BAD: Query could return more than 50,000 rows
List<Contact> allContacts = [SELECT Id, Name, Email FROM Contact];

If your org has 80,000 contacts, this query will throw a System.LimitException: Too many query rows: 50001.

Fix: Use WHERE Clauses, LIMIT, and SOQL For Loops

Always filter your queries. If you genuinely need to process large datasets, use a SOQL for loop or move the work to Batch Apex.

// GOOD: Filter with WHERE and LIMIT
List<Contact> recentContacts = [
    SELECT Id, Name, Email
    FROM Contact
    WHERE CreatedDate = LAST_N_DAYS:30
    LIMIT 10000
];

For large datasets where you need to iterate through many records without loading them all into memory at once, use a SOQL for loop:

// GOOD: SOQL for loop — retrieves records in chunks of 200
for (Contact con : [SELECT Id, Name, Email FROM Contact WHERE CreatedDate = LAST_N_DAYS:90]) {
    // Each iteration processes one record
    // Salesforce fetches records in batches of 200 behind the scenes
    // This is memory-efficient but still counts against the 50,000 row limit
    if (con.Email == null) {
        con.Email = 'unknown@example.com';
        // Note: to persist this change you would still collect the records
        // in a list and run a single update after the loop
    }
}

The SOQL for loop does not bypass the 50,000 row limit. It reduces heap usage by loading records in chunks of 200 instead of all at once. If you need to process more than 50,000 rows, use Batch Apex with Database.getQueryLocator, which supports up to 50 million rows.
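To illustrate the large-volume case, here is a minimal Batch Apex class — the query and field update are illustrative:

```apex
// Batch Apex: the start method's QueryLocator can return up to 50 million rows,
// and each execute call runs in its own transaction with fresh governor limits
public class ContactCleanupBatch implements Database.Batchable<SObject> {

    public Database.QueryLocator start(Database.BatchableContext bc) {
        // Not subject to the 50,000 row limit — up to 50 million rows allowed here
        return Database.getQueryLocator(
            'SELECT Id, Email FROM Contact WHERE Email = null'
        );
    }

    public void execute(Database.BatchableContext bc, List<Contact> scope) {
        // Default scope is 200 records per execute; each execute gets its own limits
        for (Contact con : scope) {
            con.Email = 'unknown@example.com';
        }
        update scope;
    }

    public void finish(Database.BatchableContext bc) {
        System.debug('Cleanup complete');
    }
}

// Launch with an explicit scope size (records per execute transaction):
// Database.executeBatch(new ContactCleanupBatch(), 2000);
```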

Non-Selective Queries

A query is non-selective when its WHERE clause does not filter selectively on an indexed field and the object has more than 200,000 records. Non-selective queries force full table scans, and in a trigger context they throw a System.QueryException: Non-selective query against large object type at runtime.

// BAD: Non-selective query on a large table
// Title is not indexed, and the table has 500,000 records
// (Note: long text fields like Description cannot be filtered in SOQL at all)
List<Contact> result = [
    SELECT Id, Name
    FROM Contact
    WHERE Title = 'VIP'
];

To fix non-selective queries:

  1. Use indexed fields in your WHERE clause — Id, Name, OwnerId, CreatedDate, SystemModstamp, lookup fields, and any field with an external ID or custom index.
  2. Request a custom index from Salesforce Support for frequently queried fields.
  3. Add selective filters that narrow the result set. For a standard index, the filter must return less than 30% of the first million records (capped at 1 million rows); for a custom index, less than 10% of the first million records (capped at 333,333 rows).
// GOOD: Selective query using an indexed field
List<Contact> result = [
    SELECT Id, Name
    FROM Contact
    WHERE AccountId = :someAccountId
    AND Title = 'VIP'
];

Query Optimization Tips

  • Select only the fields you need. Do not use SELECT * equivalents. Every field you retrieve consumes heap memory.
  • Use relationship queries instead of separate queries for parent and child records.
  • Use aggregate queries (COUNT(), SUM(), AVG()) when you need summary data instead of full records.
  • Avoid querying in utility methods that might be called multiple times in the same transaction. Pass data in as parameters instead.
// GOOD: Relationship query — one query fetches Account and its Contacts
List<Account> accounts = [
    SELECT Id, Name,
        (SELECT Id, FirstName, LastName, Email FROM Contacts WHERE IsActive__c = true)
    FROM Account
    WHERE Industry = 'Technology'
    LIMIT 100
];
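The aggregate-query tip deserves an example. Computing summary data server-side returns compact summary rows instead of full records, which saves heap — though rows processed by the aggregate functions still count toward the 50,000 query row limit:

```apex
// GOOD: Aggregate query — one summary row per grouping instead of full records
List<AggregateResult> counts = [
    SELECT AccountId, COUNT(Id) contactCount
    FROM Contact
    WHERE AccountId != null
    GROUP BY AccountId
];

for (AggregateResult ar : counts) {
    // AggregateResult fields are accessed by alias and cast from Object
    Id accountId = (Id) ar.get('AccountId');
    Integer contactCount = (Integer) ar.get('contactCount');
    System.debug(accountId + ' has ' + contactCount + ' contacts');
}
```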

Runtime SOQL Limit Check

Before executing a query in a method that might be called multiple times, you can check whether you are approaching the limit:

// Note: System.LimitException cannot be thrown from Apex code,
// so define a custom exception class for this guard
public class QueryGuardException extends Exception {}

public static List<Account> safeQuery(Set<Id> ids) {
    if (Limits.getQueries() >= Limits.getLimitQueries() - 5) {
        // Log a warning or throw the custom exception
        throw new QueryGuardException('Approaching SOQL query limit: '
            + Limits.getQueries() + '/' + Limits.getLimitQueries());
    }
    return [SELECT Id, Name FROM Account WHERE Id IN :ids];
}

DML Limits and How to Avoid Them

DML (Data Manipulation Language) operations are inserts, updates, deletes, undeletes, and upserts. The governor limits for DML are:

| Limit | Value | Limits Class Method |
| --- | --- | --- |
| DML statements per transaction | 150 | Limits.getDmlStatements() / Limits.getLimitDmlStatements() |
| Total DML rows per transaction | 10,000 | Limits.getDmlRows() / Limits.getLimitDmlRows() |

Anti-Pattern: DML Inside a Loop

Just like SOQL in loops, performing DML inside a loop is a guaranteed way to hit limits.

// BAD: DML inside a loop — will hit 150 DML statement limit
trigger CreateTaskForNewContacts on Contact (after insert) {
    for (Contact con : Trigger.new) {
        Task t = new Task();
        t.Subject = 'Welcome call for ' + con.FirstName;
        t.WhoId = con.Id;
        t.OwnerId = con.OwnerId;
        t.ActivityDate = Date.today().addDays(3);
        insert t; // One DML per contact
    }
}

If 200 contacts are inserted, this runs 200 DML statements — well over the 150 limit.

Fix: Collect Records and Perform a Single DML

// GOOD: Collect records in a list, insert once after the loop
trigger CreateTaskForNewContacts on Contact (after insert) {
    List<Task> tasksToInsert = new List<Task>();

    for (Contact con : Trigger.new) {
        Task t = new Task();
        t.Subject = 'Welcome call for ' + con.FirstName;
        t.WhoId = con.Id;
        t.OwnerId = con.OwnerId;
        t.ActivityDate = Date.today().addDays(3);
        tasksToInsert.add(t);
    }

    if (!tasksToInsert.isEmpty()) {
        insert tasksToInsert; // Single DML statement regardless of batch size
    }
}

This reduces 200 DML statements to 1.

The 10,000 Row Limit

Even with a single DML statement, you cannot insert, update, or delete more than 10,000 rows in a single transaction.

// BAD: Trying to insert more than 10,000 records in one transaction
List<Contact> contacts = new List<Contact>();
for (Integer i = 0; i < 15000; i++) {
    contacts.add(new Contact(
        FirstName = 'Test',
        LastName = 'Contact ' + i,
        AccountId = someAccountId
    ));
}
insert contacts; // Throws LimitException: Too many DML rows: 15000

If you need to process more than 10,000 records, use Batch Apex. Each batch execute method gets its own transaction with its own set of limits.
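When Batch Apex feels heavyweight, a chained Queueable can also work through a large list in sub-10,000-row chunks, since each chained job runs in a fresh transaction. Here is a sketch — the chunk size and object are illustrative, and note that very large lists held in job state consume heap during serialization, so Batch Apex remains the better fit for truly large volumes:

```apex
// Sketch: process a large list in chunks by chaining Queueable jobs
public class ChunkedInsertJob implements Queueable {
    private static final Integer CHUNK_SIZE = 5000;
    private List<Contact> remaining;

    public ChunkedInsertJob(List<Contact> remaining) {
        this.remaining = remaining;
    }

    public void execute(QueueableContext context) {
        // Take the first chunk — well under the 10,000 row limit
        Integer last = Math.min(CHUNK_SIZE, remaining.size());
        List<Contact> chunk = new List<Contact>();
        for (Integer i = 0; i < last; i++) {
            chunk.add(remaining[i]);
        }
        insert chunk;

        // Chain one follow-up job for the rest (1 chained queueable allowed)
        if (remaining.size() > CHUNK_SIZE) {
            List<Contact> rest = new List<Contact>();
            for (Integer i = CHUNK_SIZE; i < remaining.size(); i++) {
                rest.add(remaining[i]);
            }
            System.enqueueJob(new ChunkedInsertJob(rest));
        }
    }
}
```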

Database Methods with Partial Success

The standard DML statements (insert, update, delete) are all-or-nothing. If one record fails, the entire operation rolls back. The Database class methods let you allow partial success:

// Partial success — successful records are committed, failed records are returned with errors
List<Contact> contactsToInsert = buildContactList();

Database.SaveResult[] results = Database.insert(contactsToInsert, false);

for (Integer i = 0; i < results.size(); i++) {
    if (!results[i].isSuccess()) {
        for (Database.Error err : results[i].getErrors()) {
            System.debug('Error on record ' + i + ': ' + err.getMessage());
            System.debug('Fields: ' + err.getFields());
        }
    }
}

The second parameter false in Database.insert(records, false) means “allow partial success.” This is critical in batch processing and integrations where you do not want one bad record to roll back thousands of successful ones.

The Mixed DML Error

One of the more confusing DML errors in Salesforce is the mixed DML error. It occurs when you try to perform DML on a setup object and a non-setup object in the same transaction.

Setup objects include: User, Group, GroupMember, PermissionSet, PermissionSetAssignment, QueueSObject, ObjectTerritory2Association, and others.

Non-setup objects are standard and custom objects like Account, Contact, Opportunity, and your custom objects.

// BAD: Mixed DML — inserting a non-setup and setup object in the same transaction
Account acc = new Account(Name = 'Test Corp');
insert acc;

// This will throw MIXED_DML_OPERATION error
User u = [SELECT Id, IsActive FROM User WHERE Id = :UserInfo.getUserId()];
u.IsActive = true;
update u;

Fix: Use @future or System.runAs

Option 1: @future method — move the setup object DML to a future method so it runs in a separate transaction.

public class UserUpdateService {
    @future
    public static void activateUser(Id userId) {
        User u = [SELECT Id, IsActive FROM User WHERE Id = :userId];
        u.IsActive = true;
        update u;
    }
}

// In your trigger or main class:
Account acc = new Account(Name = 'Test Corp');
insert acc;
UserUpdateService.activateUser(someUserId); // Runs in a separate transaction

Option 2: System.runAs in tests — if you encounter this in a test class, wrap the setup object DML in System.runAs:

@isTest
static void testMixedDML() {
    // Create a user (setup object)
    Profile p = [SELECT Id FROM Profile WHERE Name = 'Standard User'];
    User testUser = new User(
        FirstName = 'Test',
        LastName = 'User',
        Email = 'testuser@example.com',
        Username = 'testuser' + DateTime.now().getTime() + '@example.com',
        Alias = 'tuser',
        TimeZoneSidKey = 'America/New_York',
        LocaleSidKey = 'en_US',
        EmailEncodingKey = 'UTF-8',
        ProfileId = p.Id,
        LanguageLocaleKey = 'en_US'
    );
    insert testUser;

    // Wrap non-setup DML in System.runAs to avoid mixed DML
    System.runAs(testUser) {
        Account acc = new Account(Name = 'Test Corp');
        insert acc;
        System.assertNotEquals(null, acc.Id);
    }
}

Runtime DML Limit Check

// System.LimitException cannot be thrown from Apex — define a custom exception
public class DmlGuardException extends Exception {}

public static void safeDml(List<SObject> records) {
    if (Limits.getDmlStatements() >= Limits.getLimitDmlStatements() - 5) {
        throw new DmlGuardException('Approaching DML statement limit: '
            + Limits.getDmlStatements() + '/' + Limits.getLimitDmlStatements());
    }
    if (Limits.getDmlRows() + records.size() > Limits.getLimitDmlRows()) {
        throw new DmlGuardException('DML row limit would be exceeded. Current: '
            + Limits.getDmlRows() + ', Requested: ' + records.size()
            + ', Limit: ' + Limits.getLimitDmlRows());
    }
    insert records;
}

CPU Timeout Limits and How to Avoid Them

The CPU time limit caps how long your code can use the CPU within a single transaction:

| Limit | Value | Limits Class Method |
| --- | --- | --- |
| CPU time (synchronous) | 10,000 ms (10 seconds) | Limits.getCpuTime() / Limits.getLimitCpuTime() |
| CPU time (asynchronous) | 60,000 ms (60 seconds) | Limits.getCpuTime() / Limits.getLimitCpuTime() |

What Counts as CPU Time

CPU time includes:

  • All Apex code execution (method calls, loops, calculations)
  • Library function calls (String methods, Math operations, JSON parsing)
  • Trigger execution
  • Flow and process builder execution that runs in the same transaction
  • Formula field evaluation during DML

CPU time does not include:

  • Time spent waiting for SOQL queries to return (database time)
  • Time spent waiting for callout responses (network time)
  • Time spent waiting for DML operations to commit

This distinction is important. A transaction might take 30 seconds wall-clock time but only 2 seconds of CPU time if most of the time was spent waiting for database queries and callouts.
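You can see the distinction yourself by measuring both clocks around a query-heavy block:

```apex
// Compare CPU time vs wall-clock time around a database operation
Integer cpuBefore = Limits.getCpuTime();
Long wallBefore = System.currentTimeMillis();

List<Account> accounts = [SELECT Id, Name FROM Account LIMIT 10000];

Integer cpuDelta = Limits.getCpuTime() - cpuBefore;
Long wallDelta = System.currentTimeMillis() - wallBefore;

// Wall-clock time includes database wait; CPU time largely does not,
// so wallDelta is typically much larger than cpuDelta for query-heavy code
System.debug('CPU: ' + cpuDelta + ' ms, wall clock: ' + wallDelta + ' ms');
```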

Anti-Pattern: Nested Loops and Inefficient Lookups

The fastest way to burn through CPU time is nested loops where the inner loop scans a large collection.

// BAD: Nested loops — O(n*m) complexity, burns CPU time fast
public static void matchContactsToAccounts(
    List<Contact> contacts,
    List<Account> accounts
) {
    for (Contact con : contacts) {
        for (Account acc : accounts) {
            if (con.AccountId == acc.Id) {
                con.Description = 'Belongs to ' + acc.Name;
                break;
            }
        }
    }
}
// If contacts = 5000 and accounts = 5000, worst case is 25,000,000 iterations

Fix: Use Maps for O(1) Lookups

// GOOD: Map-based lookup — O(n+m) complexity, minimal CPU usage
public static void matchContactsToAccounts(
    List<Contact> contacts,
    List<Account> accounts
) {
    Map<Id, Account> accountMap = new Map<Id, Account>(accounts);

    for (Contact con : contacts) {
        if (con.AccountId != null && accountMap.containsKey(con.AccountId)) {
            con.Description = 'Belongs to ' + accountMap.get(con.AccountId).Name;
        }
    }
}
// If contacts = 5000 and accounts = 5000, this runs ~10,000 iterations total

The difference is dramatic. The nested loop version does up to 25 million comparisons. The Map version does about 10,000 operations. That is a 2,500x improvement.

Identifying CPU-Heavy Code

Common CPU-intensive operations:

  • Nested loops — as shown above, these multiply iteration counts.
  • String concatenation in loops — Strings in Apex are immutable. Each concatenation creates a new String object.
  • JSON/XML parsing of large payloads — deserializing large JSON bodies is CPU-expensive.
  • Complex SOQL result processing — iterating through deeply nested relationship query results.
  • Regular expressions — complex regex patterns on large strings consume significant CPU.

Optimization Techniques

1. Early returns and continue statements:

// GOOD: Skip unnecessary processing early
for (Contact con : contacts) {
    if (con.AccountId == null) continue; // Skip orphan contacts immediately
    if (con.Email == null) continue;     // Skip contacts without email

    // Only process contacts that meet all criteria
    processContact(con);
}

2. Avoid string concatenation in loops — use String.join:

// BAD: String concatenation in a loop
String result = '';
for (Contact con : contacts) {
    result += con.Name + ', ';
}

// GOOD: Use String.join
List<String> names = new List<String>();
for (Contact con : contacts) {
    names.add(con.Name);
}
String result = String.join(names, ', ');

3. Move heavy processing to async:

If a synchronous transaction is running close to the 10-second CPU limit, consider offloading the heavy work to a queueable or future method where you get 60 seconds.

// Move CPU-heavy work to a Queueable
public class HeavyProcessingJob implements Queueable {
    private List<Id> recordIds;

    public HeavyProcessingJob(List<Id> recordIds) {
        this.recordIds = recordIds;
    }

    public void execute(QueueableContext context) {
        // 60,000 ms CPU limit instead of 10,000
        List<Contact> contacts = [SELECT Id, Name, AccountId FROM Contact WHERE Id IN :recordIds];
        performExpensiveCalculations(contacts);
    }

    private void performExpensiveCalculations(List<Contact> contacts) {
        // Heavy processing here
    }
}

// Enqueue from a trigger or controller
System.enqueueJob(new HeavyProcessingJob(contactIds));

4. Cache repeated calculations:

// BAD: Recalculating the same value in every iteration
for (Contact con : contacts) {
    Decimal taxRate = TaxService.calculateRate(con.MailingState); // Expensive call
    con.Tax_Amount__c = con.Amount__c * taxRate;
}

// GOOD: Cache the result by state
Map<String, Decimal> taxRateCache = new Map<String, Decimal>();
for (Contact con : contacts) {
    if (!taxRateCache.containsKey(con.MailingState)) {
        taxRateCache.put(con.MailingState, TaxService.calculateRate(con.MailingState));
    }
    con.Tax_Amount__c = con.Amount__c * taxRateCache.get(con.MailingState);
}

Runtime CPU Limit Check

// System.LimitException cannot be thrown from Apex — define a custom exception
public class CpuGuardException extends Exception {}

public static void checkCpuLimit(String context) {
    Integer cpuUsed = Limits.getCpuTime();
    Integer cpuLimit = Limits.getLimitCpuTime();
    Integer percentUsed = (cpuUsed * 100) / cpuLimit;

    if (percentUsed > 80) {
        System.debug(LoggingLevel.WARN,
            'CPU WARNING at ' + context + ': ' + cpuUsed + 'ms / ' + cpuLimit + 'ms (' + percentUsed + '%)');
    }

    if (percentUsed > 95) {
        throw new CpuGuardException('CPU limit nearly exhausted at ' + context
            + ': ' + cpuUsed + 'ms / ' + cpuLimit + 'ms');
    }
}

Integration Limits and How to Avoid Them

When Apex code communicates with external systems via HTTP callouts, a separate set of limits applies:

| Limit | Value | Limits Class Method |
| --- | --- | --- |
| Callouts per transaction | 100 | Limits.getCallouts() / Limits.getLimitCallouts() |
| Maximum timeout per callout | 120,000 ms (120 seconds) | N/A (set per request) |
| Maximum callout request/response size | 6 MB (sync) / 12 MB (async) | N/A |
| Maximum combined callout timeout | 120,000 ms total | N/A |

The 100 Callout Limit

Each HTTP request — whether GET, POST, PUT, or DELETE — counts as one callout. If you need to call an external API for each record in a trigger, you will hit the 100-callout limit quickly.

// BAD: One callout per record — hits 100 callout limit
trigger SyncContactsToExternal on Contact (after insert) {
    for (Contact con : Trigger.new) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:ExternalCRM/contacts');
        req.setMethod('POST');
        req.setBody(JSON.serialize(con));
        req.setHeader('Content-Type', 'application/json');
        new Http().send(req); // One callout per contact
    }
}

This also has a second problem: you cannot make callouts directly from a trigger. A synchronous callout would hold the database transaction open while waiting on an external system, so Salesforce blocks it and throws a System.CalloutException if you try.

Fix: Batch Callouts with Async Processing

// GOOD: Collect records and process via Queueable with callout enabled
trigger SyncContactsToExternal on Contact (after insert) {
    List<Id> contactIds = new List<Id>();
    for (Contact con : Trigger.new) {
        contactIds.add(con.Id);
    }
    System.enqueueJob(new ContactSyncJob(contactIds));
}

public class ContactSyncJob implements Queueable, Database.AllowsCallouts {
    private List<Id> contactIds;

    public ContactSyncJob(List<Id> contactIds) {
        this.contactIds = contactIds;
    }

    public void execute(QueueableContext context) {
        List<Contact> contacts = [
            SELECT Id, FirstName, LastName, Email, Phone
            FROM Contact
            WHERE Id IN :contactIds
        ];

        // Batch records into a single API call if the external system supports it
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:ExternalCRM/contacts/batch');
        req.setMethod('POST');
        req.setBody(JSON.serialize(contacts));
        req.setHeader('Content-Type', 'application/json');
        req.setTimeout(120000);

        HttpResponse res = new Http().send(req);

        if (res.getStatusCode() != 200) {
            System.debug('Sync failed: ' + res.getStatusCode() + ' ' + res.getBody());
        }
    }
}

Key improvements: the callout runs asynchronously via a Queueable job, multiple records are batched into a single API call, and the Database.AllowsCallouts interface enables HTTP callouts in the async context.

Callout Timeout

Each individual callout can have a timeout of up to 120 seconds (120,000 milliseconds). The default timeout is 10 seconds if you do not set it explicitly.

HttpRequest req = new HttpRequest();
req.setEndpoint('callout:SlowExternalService/data');
req.setMethod('GET');
req.setTimeout(120000); // Maximum: 120 seconds
HttpResponse res = new Http().send(req);

If the external service is slow and your transaction is synchronous, the user will be waiting. For long-running callouts in a UI context, consider using a Continuation in Lightning components.

Callout Body Size Limit

The maximum size of a callout request or response body is 6 MB in synchronous transactions and 12 MB in asynchronous transactions. If you are working with large file transfers or data exports, you may need to implement chunked transfers.

// Check response size before processing
HttpResponse res = new Http().send(req);
Blob responseBody = res.getBodyAsBlob();

if (responseBody.size() > 5 * 1024 * 1024) { // 5 MB safety margin
    System.debug('Response is large: ' + responseBody.size() + ' bytes');
    // Consider processing in chunks or storing in a ContentVersion
}

Continuation for Long-Running Callouts

In Lightning components, you can use the Continuation class for long-running callouts. A Continuation lets the server release the request thread while waiting for the external service to respond (with a timeout of up to 120 seconds per continuation), then invokes a callback method when the response arrives — so a slow external service does not tie up server resources or block other requests.

// Aura-enabled controller using Continuation
public class ExternalDataController {
    private static final String CALLBACK_METHOD = 'processResponse';

    @AuraEnabled(continuation=true cacheable=false)
    public static Object startLongCallout() {
        Continuation con = new Continuation(60); // 60-second timeout
        con.continuationMethod = CALLBACK_METHOD;

        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:SlowService/data');
        req.setMethod('GET');

        con.addHttpRequest(req);
        return con;
    }

    @AuraEnabled(cacheable=false)
    public static Object processResponse(List<String> labels, Object state) {
        HttpResponse response = Continuation.getResponse(labels[0]);
        return response.getBody();
    }
}

Named Credential Considerations

Named Credentials simplify authentication for callouts by storing endpoint URLs and credentials securely. Keep these limits in mind:

  • Named Credentials support OAuth 2.0, password authentication, JWT, and custom authentication protocols.
  • Each Named Credential callout still counts against the 100-callout limit per transaction.
  • Named Credentials with OAuth refresh tokens may consume an additional callout for token refresh.
  • External Credentials (the newer model) have their own per-user and per-named-principal limits.
// Using Named Credentials — authentication is handled automatically
HttpRequest req = new HttpRequest();
req.setEndpoint('callout:My_Named_Credential/api/v1/records');
req.setMethod('GET');
req.setHeader('Content-Type', 'application/json');

HttpResponse res = new Http().send(req);

Runtime Callout Limit Check

// System.LimitException cannot be thrown from Apex — define a custom exception
public class CalloutGuardException extends Exception {}

public static HttpResponse safeCallout(HttpRequest req) {
    if (Limits.getCallouts() >= Limits.getLimitCallouts()) {
        throw new CalloutGuardException('Callout limit reached: '
            + Limits.getCallouts() + '/' + Limits.getLimitCallouts());
    }
    return new Http().send(req);
}

Comprehensive Limits Reference Table

Here is a consolidated reference of the most common Salesforce governor limits. Keep this table bookmarked.

| Category | Limit | Synchronous | Asynchronous | Limits Class Method |
| --- | --- | --- | --- | --- |
| SOQL | Queries per transaction | 100 | 200 | getQueries() / getLimitQueries() |
| SOQL | Total rows retrieved | 50,000 | 50,000 | getQueryRows() / getLimitQueryRows() |
| SOQL | Records via QueryLocator | 10,000 | 50,000,000 | N/A |
| SOSL | Queries per transaction | 20 | 20 | getSoslQueries() / getLimitSoslQueries() |
| SOSL | Records returned | 2,000 | 2,000 | N/A |
| DML | Statements per transaction | 150 | 150 | getDmlStatements() / getLimitDmlStatements() |
| DML | Total rows processed | 10,000 | 10,000 | getDmlRows() / getLimitDmlRows() |
| CPU | CPU time | 10,000 ms | 60,000 ms | getCpuTime() / getLimitCpuTime() |
| Heap | Heap size | 6 MB | 12 MB | getHeapSize() / getLimitHeapSize() |
| Callouts | HTTP callouts per transaction | 100 | 100 | getCallouts() / getLimitCallouts() |
| Callouts | Max timeout per callout | 120,000 ms | 120,000 ms | N/A |
| Callouts | Request or response body size | 6 MB | 12 MB | N/A |
| Email | Messaging.sendEmail invocations per transaction | 10 | 10 | getEmailInvocations() / getLimitEmailInvocations() |
| Email | Single emails per day (org-wide) | 5,000 | 5,000 | N/A |
| Future | @future calls per transaction | 50 | 0 (not allowed from future or batch contexts) | getFutureCalls() / getLimitFutureCalls() |
| Queueable | Jobs added per transaction | 50 | 1 (chained from a queueable) | getQueueableJobs() / getLimitQueueableJobs() |
| Describe | Field describe calls (no longer enforced in recent API versions) | 100 | 100 | getFieldsDescribes() / getLimitFieldsDescribes() |
| Publish | Platform events published (publish immediately) | 150 | 150 | getPublishImmediateDML() / getLimitPublishImmediateDML() |

Limits You Cannot Check at Runtime

Some limits do not have corresponding Limits class methods. You need to know them by heart or track them manually:

  • Callout timeout (120s per callout) — no runtime check; set via HttpRequest.setTimeout().
  • Callout body size (6 MB / 12 MB) — no runtime check; monitor payload size manually.
  • SOSL returned rows (2,000) — no runtime check.
  • Query locator rows (10,000 / 50,000,000) — no runtime check.
  • Total email sends per day (5,000) — use Messaging.reserveSingleEmailCapacity() to check.
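That last check looks like this — reserving capacity up front throws an exception (rather than failing mid-send) if the daily limit would be exceeded. The recipient address here is illustrative:

```apex
// Reserve email capacity before sending — throws if the org's daily
// single-email limit would be exceeded by this reservation
try {
    Messaging.reserveSingleEmailCapacity(50);

    // Safe to build and send up to 50 single emails in this transaction
    Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
    mail.setToAddresses(new List<String>{ 'user@example.com' });
    mail.setSubject('Capacity check passed');
    mail.setPlainTextBody('This send fits within the daily email limit.');
    Messaging.sendEmail(new List<Messaging.SingleEmailMessage>{ mail });
} catch (Exception e) {
    System.debug('Email capacity unavailable: ' + e.getMessage());
}
```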

Best Practices Summary

Here is a condensed list of the patterns that keep you within limits:

SOQL Best Practices

  1. Never put a SOQL query inside a loop.
  2. Collect IDs in a Set, query once, build a Map for lookups.
  3. Use WHERE clauses and LIMIT to control result set size.
  4. Use SOQL for loops for memory-efficient iteration over large datasets.
  5. Use relationship queries to reduce the number of separate queries.
  6. Use aggregate queries when you only need counts or sums.
  7. Move large-volume processing to Batch Apex.

DML Best Practices

  1. Never put a DML statement inside a loop.
  2. Collect records in a List and perform a single DML after the loop.
  3. Use Database.insert(records, false) for partial success in integrations and batch processing.
  4. Use @future or Queueable to separate setup object DML from non-setup object DML.
  5. Move operations exceeding 10,000 rows to Batch Apex.

CPU Best Practices

  1. Replace nested loops with Map-based lookups.
  2. Use early return and continue to skip unnecessary iterations.
  3. Avoid string concatenation in loops — use String.join().
  4. Cache repeated calculations in a Map.
  5. Move CPU-heavy work to asynchronous processing for 6x the limit.

Integration Best Practices

  1. Batch multiple records into a single callout when the external API supports it.
  2. Use Queueable with Database.AllowsCallouts for callouts triggered by DML events.
  3. Set explicit timeouts with HttpRequest.setTimeout().
  4. Use Continuation for long-running callouts in Lightning components.
  5. Use Named Credentials to manage authentication securely.
  6. Implement retry logic with exponential backoff for transient failures.

General Best Practices

  1. Use the Limits class proactively to monitor consumption during complex transactions.
  2. Review the Limits tab in debug logs when troubleshooting.
  3. Write bulkified code from day one — never assume triggers will process one record at a time.
  4. Design for the worst case: triggers can receive up to 200 records per batch in data loader scenarios.
  5. Use System.debug with Limits.getCpuTime() to profile slow sections of code.

Wrapping Up

Governor limits are not obstacles to work around — they are design constraints to build within. The patterns you have learned in this post — bulkification, Map-based lookups, single-DML collection, async offloading, and runtime limit checking — are not just limit-avoidance techniques. They are the standard patterns of professional Salesforce development.

Every limit exists because Salesforce is a shared platform. When you write code that respects those limits, you are writing code that is efficient, scalable, and production-ready. When you ignore them, your code will work in your sandbox with 50 records and break in production with 50,000.

The Limits class is your best friend. Use it in your utility methods, your trigger handlers, and your test classes. Know the numbers by heart. And when you hit a limit, do not look for a workaround — look for a better design.

In Part 57, we will shift gears and talk about Version Control — how to manage your Salesforce code and metadata with Git, branching strategies, and deployment pipelines that keep your team productive and your orgs stable.

See you there.