Nikola Knezevic

Bulk Update with EF Core and Dapper in .NET

30 Oct 2025
7 min


Working with large datasets can make bulk operations challenging, especially when performance matters.

When you need to update thousands or even millions of records, the methods or packages you choose make a big difference.

In today's blog post, we'll explore how to perform bulk updates using EF Core and Dapper.

NOTE: This post lays the groundwork for a future article, where we’ll go beyond these examples and achieve even better performance using NuGet packages such as ZZZ Projects' Bulk Operations, which takes performance to another level.

Getting Started

To get a clear picture of performance, we’ll use BenchmarkDotNet.

BenchmarkDotNet is a popular .NET library for accurate and reliable performance measurements. To learn more, check out my detailed blog post: Benchmark Code using BenchmarkDotNet

To get started with BenchmarkDotNet, you need to install the NuGet package. You can do this via the NuGet Package Manager or by running the following command in the Package Manager Console:

bash
Install-Package BenchmarkDotNet
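
To make the setup concrete, here is a minimal sketch of how such a benchmark class can be wired up (the class name, method bodies, and setup details are illustrative placeholders, not the exact benchmark project):

csharp
using System.Threading.Tasks;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

// [MemoryDiagnoser] produces the Allocated column in the results table.
[MemoryDiagnoser]
public class BulkUpdateBenchmarks
{
    [GlobalSetup]
    public void Setup()
    {
        // Create the DbContext / open the database connection here.
    }

    [Benchmark]
    public Task EfCoreFetchAndUpdateAll() => Task.CompletedTask; // load, mutate, SaveChangesAsync

    [Benchmark]
    public Task EfCoreExecuteUpdate() => Task.CompletedTask;     // ExecuteUpdateAsync
}

public static class Program
{
    public static void Main() => BenchmarkRunner.Run<BulkUpdateBenchmarks>();
}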

For this benchmark, we will use a simple Product entity:

csharp
public sealed class Product
{
    public Guid Id { get; set; }

    public DateTime CreatedAt { get; set; }

    public DateTime? ModifiedAt { get; set; }

    public string Name { get; set; }

    public string Description { get; set; }

    public decimal Price { get; set; }
}

We also need rows to update for the benchmark:

sql
DO $$
DECLARE
    BatchSize INT := 1000; 
    TotalRows INT := 100000;
    CurrentBatch INT := 0;
BEGIN

    WHILE CurrentBatch * BatchSize < TotalRows LOOP
        INSERT INTO Products (Id, CreatedAt, ModifiedAt, Name, Description, Price)
        SELECT 
            gen_random_uuid() AS Id, 
            CURRENT_TIMESTAMP - (floor(random() * 730)) * interval '1 day' AS CreatedAt, 
            NULL AS ModifiedAt,
            CONCAT('Product ', CurrentBatch * BatchSize + s) AS Name, 
            CONCAT('Description for Product ', CurrentBatch * BatchSize + s) AS Description, 
            ROUND(((random() * 10) + 10)::numeric, 2) AS Price  
        FROM generate_series(1, BatchSize) AS s; 
        
        CurrentBatch := CurrentBatch + 1;
    END LOOP;
END $$;

NOTE: This script is for PostgreSQL. If you’re using a different database, the script may need adjustments.

EF Core 1 by 1

The simplest way to update entities looks like this:

csharp
var productsToUpdate = await _dbContext.Products
    .ToListAsync();

foreach (var product in productsToUpdate)
{
    product.Price *= 1.02m;
}

await _dbContext.SaveChangesAsync();

Fetching the products into memory, updating the price, and saving the changes is completely fine for a few entities.

However, loading thousands or millions of records into memory just to update them is wasteful; there are far more efficient approaches.

Using Raw SQL

One way to handle this is by relying directly on raw SQL:

csharp
await using var connection = new NpgsqlConnection(ConnectionString);
await connection.OpenAsync();

await using var transaction = await connection.BeginTransactionAsync();

await connection.ExecuteAsync(
    """UPDATE "Products" SET "Price" = "Price" * 1.02""",
    transaction: transaction);

await transaction.CommitAsync();

This is a very efficient and fast way to perform a bulk update.
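
Dapper also parameterizes values cleanly when the multiplier comes from runtime input. A small variation of the snippet above (the @Factor parameter is illustrative, not from the benchmark):

csharp
await using var connection = new NpgsqlConnection(ConnectionString);
await connection.OpenAsync();

// Dapper binds the anonymous object's Factor property to @Factor.
await connection.ExecuteAsync(
    """UPDATE "Products" SET "Price" = "Price" * @Factor""",
    new { Factor = 1.02m });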

We can achieve the same result using EF Core as well:

csharp
await _dbContext.Database.ExecuteSqlInterpolatedAsync(
    $"""UPDATE "Products" SET "Price" = "Price" * 1.02""");

Both approaches run the SQL directly against the database, bypassing the EF Core change tracker, which makes them ideal when speed and simplicity matter more than entity tracking.

EF Core ExecuteUpdate

However, if you’re using EF Core and want to avoid raw SQL, there’s a cleaner and nearly as performant alternative:

csharp
await _dbContext.Products
    .ExecuteUpdateAsync(set => set
        .SetProperty(p =>
            p.Price, p => p.Price * 1.02m));

ExecuteUpdate runs a bulk update directly in the database.

SetProperty specifies which property to update and how.
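
SetProperty calls can be chained to update several columns in a single statement, and a Where clause scopes the update to matching rows. A hypothetical variation (not part of the benchmark) that also stamps ModifiedAt:

csharp
await _dbContext.Products
    .Where(p => p.Price < 100m) // only matching rows are updated
    .ExecuteUpdateAsync(set => set
        .SetProperty(p => p.Price, p => p.Price * 1.02m)
        .SetProperty(p => p.ModifiedAt, _ => DateTime.UtcNow));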

NOTE: ExecuteUpdate runs the SQL immediately; it does not wait for SaveChangesAsync().

If you want it to execute within a transaction, start one on the DbContext and wrap the call manually:

csharp
await using var transaction = await _dbContext.Database.BeginTransactionAsync();

await _dbContext.Products
    .ExecuteUpdateAsync(set => set
        .SetProperty(p =>
            p.Price, p => p.Price * 1.02m));

await transaction.CommitAsync();

Benchmark Results

To back this up with real data, here are the benchmark results:

benchmark
| Method                  | Mean      | Error     | StdDev    | Allocated  |
|-------------------------|-----------|-----------|-----------|------------|
| EfCoreFetchAndUpdateAll | 577.26 ms | 3.911 ms  | 3.467 ms  | 51580.97 kB|
| EfCoreExecuteUpdate     | 66.01 ms  | 1.311 ms  | 1.347 ms  | 2.71 kB    |
| EfCoreSql               | 63.32 ms  | 0.686 ms  | 0.537 ms  | 2.84 kB    |
| Dapper                  | 63.22 ms  | 0.876 ms  | 0.819 ms  | 2.5 kB     |

Using raw SQL is consistently a bit faster, but ExecuteUpdate remains the cleanest and safest approach overall.

Next Step

If you’re looking to push performance even further, stay tuned for the future blog posts where we’ll explore NuGet packages that take EF Core bulk update performance to an entirely new level.

Conclusion

Dapper is amazingly fast and easy to use, although EF Core can be just as fast when used properly.

With EF Core’s ExecuteUpdate, you get an even cleaner API while staying on par with Dapper’s performance.

If you want to check out examples I created, you can find the source code here:

Source Code

I hope you enjoyed it! Subscribe to get a notification when a new blog post is up.
