Your database query takes 3 seconds. Users complain. You add an index. Still 2.8 seconds. The problem isn't SQL. It's how you're modeling data. You're loading entire object graphs when you need three fields. You're running thirty UPDATE statements when one would do. You're missing EF Core's advanced features that turn slow code into fast code.
Most EF Core tutorials cover CRUD operations and basic relationships. That gets you 60% of the way. The remaining 40% separates applications that struggle under load from those that scale. This tutorial focuses on that gap. You'll learn owned types that enforce domain boundaries, value converters that protect invariants, JSON columns for flexible schemas, temporal tables for auditing, and compiled queries that eliminate translation overhead. These aren't academic patterns. They solve real production problems.
What You'll Build
You'll build an e-commerce order system that demonstrates production-grade patterns:
Owned types for value objects like Money and Address with immutability guarantees
Value converters for smart types, enums, and encrypted columns with transparent encryption
JSON columns for product specifications that query deep into nested structures
Temporal tables for order history with point-in-time queries and restores
Compiled queries for dashboard hot paths that eliminate query translation overhead
Bulk operations using ExecuteUpdate/Delete for maintenance tasks without loading entities
Performance labs comparing tracking vs no-tracking, split queries, and projection strategies
Project Setup & Sample Domain
Start with a focused domain that demonstrates advanced patterns without excess complexity. An e-commerce order system has the right mix: value objects, flexible attributes, audit requirements, and performance constraints.
Create the Project
Create a .NET 8 console application with EF Core 8 packages. You'll need the SQL Server provider and optionally Npgsql for PostgreSQL examples.
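If you're starting from scratch, the commands below sketch one way to set it up; the project name is arbitrary, and the packages are the standard EF Core 8 providers:
Setup Commands
dotnet new console -n EcommerceAdvanced
cd EcommerceAdvanced
dotnet add package Microsoft.EntityFrameworkCore.SqlServer
dotnet add package Microsoft.EntityFrameworkCore.Design
# Optional, for the PostgreSQL examples:
dotnet add package Npgsql.EntityFrameworkCore.PostgreSQL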
Model products, orders, and customers with relationships. Keep it simple initially. You'll add owned types and converters next.
Models/Product.cs
public class Product
{
public int Id { get; set; }
public string Sku { get; set; } = string.Empty;
public string Name { get; set; } = string.Empty;
public decimal Price { get; set; }
public string Currency { get; set; } = "USD";
public string AttributesJson { get; set; } = "{}";
public bool IsDeleted { get; set; }
public int TenantId { get; set; }
}
public class Order
{
public int Id { get; set; }
public DateTime OrderDate { get; set; }
public int CustomerId { get; set; }
public Customer Customer { get; set; } = null!;
public List<OrderLine> Lines { get; set; } = new();
public decimal TotalAmount { get; set; }
public byte[] RowVersion { get; set; } = Array.Empty<byte>();
}
public class OrderLine
{
public int Id { get; set; }
public int OrderId { get; set; }
public int ProductId { get; set; }
public Product Product { get; set; } = null!;
public int Quantity { get; set; }
public decimal UnitPrice { get; set; }
}
public class Customer
{
public int Id { get; set; }
public string Email { get; set; } = string.Empty;
public string AddressLine1 { get; set; } = string.Empty;
public string City { get; set; } = string.Empty;
public string PostalCode { get; set; } = string.Empty;
}
Configure DbContext
Set up the DbContext with SQL Server by default. Configure connection strings via appsettings or environment variables in real applications.
Data/EcommerceDbContext.cs
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;
public class EcommerceDbContext : DbContext
{
public DbSet<Product> Products => Set<Product>();
public DbSet<Order> Orders => Set<Order>();
public DbSet<OrderLine> OrderLines => Set<OrderLine>();
public DbSet<Customer> Customers => Set<Customer>();
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
// SQL Server
optionsBuilder.UseSqlServer(
"Server=localhost;Database=EcommerceAdvanced;Trusted_Connection=true;TrustServerCertificate=true");
// Or PostgreSQL (comment out SQL Server)
// optionsBuilder.UseNpgsql("Host=localhost;Database=ecommerce;Username=postgres;Password=postgres");
optionsBuilder.EnableSensitiveDataLogging();
optionsBuilder.LogTo(Console.WriteLine, LogLevel.Information);
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
// Configurations go here in subsequent sections
}
}
Domain Design Tip
Start with plain properties and basic relationships. Don't jump to owned types or JSON columns until you identify which fields cluster together (owned types) or need schema flexibility (JSON). Let the domain reveal its patterns through initial development.
Owned & Value Objects
Owned types enforce aggregate boundaries. Money doesn't exist without an Order. Address doesn't exist without a Customer. These are value objects with no identity. They're always loaded with their parent and never referenced independently.
Create Value Objects
Define Money and Address as immutable value types. Use init-only setters or constructors that enforce invariants. Override Equals to compare by value, not reference.
Models/ValueObjects.cs
public class Money
{
public decimal Amount { get; init; }
public string Currency { get; init; } = "USD";
public Money() { }
public Money(decimal amount, string currency = "USD")
{
if (amount < 0) throw new ArgumentException("Amount cannot be negative");
Amount = amount;
Currency = currency ?? "USD";
}
public Money Add(Money other)
{
if (Currency != other.Currency)
throw new InvalidOperationException("Cannot add different currencies");
return new Money(Amount + other.Amount, Currency);
}
public override bool Equals(object? obj) =>
obj is Money m && Amount == m.Amount && Currency == m.Currency;
public override int GetHashCode() => HashCode.Combine(Amount, Currency);
}
public class Address
{
public string Line1 { get; init; } = string.Empty;
public string Line2 { get; init; } = string.Empty;
public string City { get; init; } = string.Empty;
public string State { get; init; } = string.Empty;
public string PostalCode { get; init; } = string.Empty;
public string Country { get; init; } = string.Empty;
public Address() { }
public Address(string line1, string city, string postalCode, string country)
{
Line1 = line1;
City = city;
PostalCode = postalCode;
Country = country;
}
public override bool Equals(object? obj) =>
obj is Address a &&
Line1 == a.Line1 && Line2 == a.Line2 &&
City == a.City && State == a.State &&
PostalCode == a.PostalCode && Country == a.Country;
public override int GetHashCode() =>
HashCode.Combine(Line1, Line2, City, State, PostalCode, Country);
}
Configure Owned Types
Map owned types in OnModelCreating. EF stores them in the parent table using column prefixes. Changes to owned properties trigger updates on the parent entity.
Updated Models
public class Order
{
public int Id { get; set; }
public DateTime OrderDate { get; set; }
public int CustomerId { get; set; }
public Customer Customer { get; set; } = null!;
public List<OrderLine> Lines { get; set; } = new();
public Money Total { get; set; } = new();
public byte[] RowVersion { get; set; } = Array.Empty<byte>();
}
public class Customer
{
public int Id { get; set; }
public string Email { get; set; } = string.Empty;
public Address ShippingAddress { get; set; } = new();
public Address? BillingAddress { get; set; }
}
// In EcommerceDbContext.OnModelCreating:
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
modelBuilder.Entity<Order>(entity =>
{
entity.OwnsOne(o => o.Total, money =>
{
money.Property(m => m.Amount)
.HasColumnName("TotalAmount")
.HasPrecision(18, 2);
money.Property(m => m.Currency)
.HasColumnName("TotalCurrency")
.HasMaxLength(3);
});
});
modelBuilder.Entity<Customer>(entity =>
{
entity.OwnsOne(c => c.ShippingAddress, addr =>
{
addr.Property(a => a.Line1).HasColumnName("ShipLine1");
addr.Property(a => a.City).HasColumnName("ShipCity");
addr.Property(a => a.PostalCode).HasColumnName("ShipPostalCode");
addr.Property(a => a.Country).HasColumnName("ShipCountry");
});
entity.OwnsOne(c => c.BillingAddress, addr =>
{
addr.Property(a => a.Line1).HasColumnName("BillLine1");
addr.Property(a => a.City).HasColumnName("BillCity");
addr.Property(a => a.PostalCode).HasColumnName("BillPostalCode");
addr.Property(a => a.Country).HasColumnName("BillCountry");
});
});
}
Update Owned Values
Updating owned types requires replacing the entire value. EF detects the change via snapshot comparison. Partial updates don't work. Create new instances with updated values.
Updating Owned Types
var order = await context.Orders.FindAsync(orderId);
if (order != null)
{
// Replace the entire owned value
order.Total = new Money(order.Total.Amount + 10.00m, order.Total.Currency);
await context.SaveChangesAsync();
}
var customer = await context.Customers.FindAsync(customerId);
if (customer != null)
{
// Update address by creating new instance
customer.ShippingAddress = new Address(
"123 New Street",
customer.ShippingAddress.City,
customer.ShippingAddress.PostalCode,
customer.ShippingAddress.Country
);
await context.SaveChangesAsync();
}
Owned Type Boundaries
Use owned types for value objects that define identity by their properties, not by an ID. Money, DateRange, Coordinates, and Address are classic examples. Avoid owned types for entities that need independent queries, foreign keys from other tables, or separate lifecycle management.
Value Converters
Value converters translate between C# types and database types. They enforce invariants, enable strongly-typed primitives, and handle custom serialization. Use them for enums stored as strings, single-value smart types, and encryption.
Enum String Converter
Store enums as strings for readability in the database. Numeric enums break when you reorder members or insert new ones between existing values. Strings survive schema evolution.
Enum Converter
public enum OrderStatus
{
Pending,
Processing,
Shipped,
Delivered,
Cancelled
}
public class Order
{
public int Id { get; set; }
public OrderStatus Status { get; set; }
// ... other properties
}
// In OnModelCreating:
modelBuilder.Entity<Order>()
.Property(o => o.Status)
.HasConversion<string>()
.HasMaxLength(20);
Smart Type Converter
Create single-value types that wrap primitives with validation. Sku and Email prevent invalid values from entering your domain. Converters map them to simple columns.
Smart Types
public readonly record struct Sku
{
private readonly string _value;
public Sku(string value)
{
if (string.IsNullOrWhiteSpace(value) || value.Length > 50)
throw new ArgumentException("Invalid SKU");
_value = value.ToUpperInvariant();
}
public override string ToString() => _value;
public static implicit operator string(Sku sku) => sku._value;
}
public class Product
{
public int Id { get; set; }
public Sku Sku { get; set; }
public string Name { get; set; } = string.Empty;
}
// In OnModelCreating:
modelBuilder.Entity<Product>()
.Property(p => p.Sku)
.HasConversion(
sku => sku.ToString(),
value => new Sku(value))
.HasMaxLength(50);
Encrypted Column Converter
Encrypt sensitive fields transparently using a custom converter. Store encrypted bytes in the database. Your domain code works with plain text. Handle encryption keys via configuration or key vaults.
Encryption Converter
using System.Security.Cryptography;
using System.Text;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
public class EncryptedStringConverter : ValueConverter<string, byte[]>
{
private static readonly byte[] Key = Encoding.UTF8.GetBytes("YourSecure32ByteKeyForAES256!!!!"); // Must be exactly 32 bytes for AES-256
public EncryptedStringConverter() : base(
plainText => Encrypt(plainText),
encrypted => Decrypt(encrypted))
{ }
private static byte[] Encrypt(string plainText)
{
if (string.IsNullOrEmpty(plainText)) return Array.Empty<byte>();
using var aes = Aes.Create();
aes.Key = Key;
aes.GenerateIV();
using var encryptor = aes.CreateEncryptor(aes.Key, aes.IV);
using var ms = new MemoryStream();
ms.Write(aes.IV, 0, aes.IV.Length);
using (var cs = new CryptoStream(ms, encryptor, CryptoStreamMode.Write))
using (var sw = new StreamWriter(cs))
{
sw.Write(plainText);
}
return ms.ToArray();
}
private static string Decrypt(byte[] encrypted)
{
if (encrypted.Length == 0) return string.Empty;
using var aes = Aes.Create();
aes.Key = Key;
var iv = new byte[16];
Array.Copy(encrypted, 0, iv, 0, iv.Length);
aes.IV = iv;
using var decryptor = aes.CreateDecryptor(aes.Key, aes.IV);
using var ms = new MemoryStream(encrypted, 16, encrypted.Length - 16);
using var cs = new CryptoStream(ms, decryptor, CryptoStreamMode.Read);
using var sr = new StreamReader(cs);
return sr.ReadToEnd();
}
}
public class Customer
{
public int Id { get; set; }
public string Email { get; set; } = string.Empty;
public string CreditCardLast4 { get; set; } = string.Empty; // Encrypted
}
// In OnModelCreating:
modelBuilder.Entity<Customer>()
.Property(c => c.CreditCardLast4)
.HasConversion(new EncryptedStringConverter())
.HasColumnType("varbinary(max)");
Converter Performance
Converters run for every value read or written. Keep conversion logic fast. Avoid heavy allocations or I/O. For encryption, consider using database-native encryption (Always Encrypted for SQL Server) for better performance. Custom converters work for moderate-throughput scenarios where you control the encryption key lifecycle.
JSON Columns
JSON columns store semi-structured data without schema changes. Use them for attributes that vary by record or evolve frequently. EF Core 8 maps JSON to strongly-typed C# objects and enables querying into JSON documents.
SQL Server JSON Mapping
Map a C# class to a JSON column. EF serializes the entire object graph. Query mapped JSON properties with ordinary LINQ; EF translates the property access to JSON_VALUE. For paths EF can't translate, such as dictionary keys, fall back to raw SQL.
JSON on SQL Server
public class ProductAttributes
{
public string Color { get; set; } = string.Empty;
public string Size { get; set; } = string.Empty;
public Dictionary<string, string> Specs { get; set; } = new();
}
public class Product
{
public int Id { get; set; }
public string Name { get; set; } = string.Empty;
public ProductAttributes Attributes { get; set; } = new();
}
// In OnModelCreating:
modelBuilder.Entity<Product>()
.OwnsOne(p => p.Attributes, options => options.ToJson());
// Querying JSON properties
var redProducts = await context.Products
.Where(p => p.Attributes.Color == "Red")
.ToListAsync();
// Query nested dictionary (requires raw SQL or computed columns)
var products = await context.Products
.FromSqlRaw(@"
SELECT * FROM Products
WHERE JSON_VALUE(Attributes, '$.Specs.Material') = 'Cotton'")
.ToListAsync();
PostgreSQL JSON Mapping
PostgreSQL has native jsonb support with operators and indexes. Use Npgsql extensions to query JSON using LINQ. Create GIN indexes on jsonb columns for containment queries.
JSON on PostgreSQL
// Same entity model
public class Product
{
public int Id { get; set; }
public string Name { get; set; } = string.Empty;
public ProductAttributes Attributes { get; set; } = new();
}
// In OnModelCreating:
modelBuilder.Entity<Product>()
.OwnsOne(p => p.Attributes, options => options.ToJson());
// Query with Npgsql extensions
var redProducts = await context.Products
.Where(p => p.Attributes.Color == "Red")
.ToListAsync();
// Create GIN index for fast JSON queries (in migration):
// CREATE INDEX idx_product_attributes ON products USING GIN (attributes);
// Containment query using @> operator
var products = await context.Products
.FromSqlRaw(@"
SELECT * FROM ""Products""
WHERE ""Attributes"" @> '{""Color"": ""Red""}'::jsonb")
.ToListAsync();
Updating JSON Properties
Modify JSON properties on tracked entities. EF detects changes via snapshot comparison. On SQL Server it can patch individual changed properties with JSON_MODIFY rather than rewriting the whole document.
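A minimal sketch of such an update, assuming productId refers to an existing row:
Updating a JSON Attribute
var product = await context.Products.FirstAsync(p => p.Id == productId);
product.Attributes.Color = "Blue"; // tracked change inside the JSON document
await context.SaveChangesAsync();  // persists the updated JSON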
JSON vs Tables
Use JSON for schema-less data like product specs, user preferences, or metadata. Use tables for structured data with foreign keys, indexes, or heavy query requirements. JSON trades query performance for flexibility. Index frequently-queried JSON paths using computed columns (SQL Server) or GIN indexes (PostgreSQL).
Temporal Tables
Temporal tables track row-level history automatically. SQL Server maintains a shadow history table. Query data as it existed at any point in time. PostgreSQL requires manual audit tables and triggers.
Enable Temporal Tables (SQL Server)
Configure temporal support in OnModelCreating. EF generates migrations that enable SYSTEM_VERSIONING. The database tracks PeriodStart and PeriodEnd columns automatically.
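A minimal mapping sketch; IsTemporal() accepts EF's defaults for the history table name and period columns:
Temporal Configuration
// In OnModelCreating:
modelBuilder.Entity<Order>()
    .ToTable("Orders", tb => tb.IsTemporal());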
Use TemporalAsOf to query historical data. Retrieve orders as they existed on a specific date. Useful for compliance, reporting, and debugging.
Temporal Queries
// Query current state
var currentOrder = await context.Orders
.Include(o => o.Lines)
.FirstOrDefaultAsync(o => o.Id == orderId);
// Query state as of specific date
var historicalDate = new DateTime(2025, 10, 1);
var historicalOrder = await context.Orders
.TemporalAsOf(historicalDate)
.Include(o => o.Lines)
.FirstOrDefaultAsync(o => o.Id == orderId);
// Query all changes between dates
var startDate = new DateTime(2025, 10, 1);
var endDate = new DateTime(2025, 11, 1);
var orderHistory = await context.Orders
.TemporalBetween(startDate, endDate)
.Where(o => o.Id == orderId)
.OrderBy(o => EF.Property<DateTime>(o, "PeriodStart"))
.ToListAsync();
// Get all versions of an order
var allVersions = await context.Orders
.TemporalAll()
.Where(o => o.Id == orderId)
.OrderBy(o => EF.Property<DateTime>(o, "PeriodStart"))
.ToListAsync();
Restore Previous State
Revert an entity to a previous version by querying history and applying those values. This creates a new current version with historical data.
Restore from History
async Task RestoreOrderToDate(int orderId, DateTime restoreDate)
{
var historical = await context.Orders
.TemporalAsOf(restoreDate)
.AsNoTracking()
.FirstOrDefaultAsync(o => o.Id == orderId);
if (historical == null) return;
var current = await context.Orders.FindAsync(orderId);
if (current != null)
{
current.Status = historical.Status;
current.Total = historical.Total;
// Copy other properties as needed
await context.SaveChangesAsync();
}
}
PostgreSQL Alternative
PostgreSQL lacks native temporal tables. Implement audit tables with triggers or use application-level history tracking. Create a view that unions current and history tables.
Manual Audit (PostgreSQL)
-- Create history table
CREATE TABLE orders_history (
id INT,
order_date TIMESTAMP,
status VARCHAR(20),
total_amount DECIMAL(18,2),
valid_from TIMESTAMP,
valid_to TIMESTAMP
);
-- Trigger to archive the superseded row on update
CREATE OR REPLACE FUNCTION orders_history_trigger()
RETURNS TRIGGER AS $$
BEGIN
INSERT INTO orders_history (id, order_date, status, total_amount, valid_from, valid_to)
SELECT OLD.id, OLD.order_date, OLD.status, OLD.total_amount,
COALESCE((SELECT MAX(h.valid_to) FROM orders_history h WHERE h.id = OLD.id), '-infinity'::timestamp),
NOW();
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER orders_audit
BEFORE UPDATE ON orders
FOR EACH ROW EXECUTE FUNCTION orders_history_trigger();
Temporal vs Audit Logs
Temporal tables capture what changed but not who or why. Use them for compliance and point-in-time queries. Add application-level audit logs to capture user identity, IP addresses, and business reasons. Temporal tables give you data snapshots; audit logs provide operational context.
Global Query Filters & Soft-Delete
Global query filters apply WHERE clauses automatically to all queries. Use them for soft deletes, multi-tenant isolation, and row-level security. Filters compose with your query logic and can be temporarily disabled.
Soft Delete Pattern
Add an IsDeleted flag to entities. Configure a filter that excludes deleted rows. Queries automatically respect the filter unless explicitly ignored.
Soft Delete Filter
public class Product
{
public int Id { get; set; }
public string Name { get; set; } = string.Empty;
public bool IsDeleted { get; set; }
}
// In OnModelCreating:
modelBuilder.Entity<Product>()
.HasQueryFilter(p => !p.IsDeleted);
// Regular queries automatically exclude deleted products
var products = await context.Products.ToListAsync();
// Admin queries can see all products including deleted
var allProducts = await context.Products
.IgnoreQueryFilters()
.ToListAsync();
// Soft delete operation
var product = await context.Products.FindAsync(productId);
if (product != null)
{
product.IsDeleted = true;
await context.SaveChangesAsync();
}
Multi-Tenant Filter
Add a TenantId to entities. Filter queries to the current tenant. Store tenant context in a service that OnModelCreating can access. This prevents cross-tenant data leaks.
Multi-Tenant Filter
public interface ITenantProvider
{
int CurrentTenantId { get; }
}
public class Product
{
public int Id { get; set; }
public string Name { get; set; } = string.Empty;
public int TenantId { get; set; }
public bool IsDeleted { get; set; }
}
public class EcommerceDbContext : DbContext
{
private readonly ITenantProvider _tenantProvider;
public EcommerceDbContext(
DbContextOptions<EcommerceDbContext> options,
ITenantProvider tenantProvider) : base(options)
{
_tenantProvider = tenantProvider;
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
// Combine tenant and soft-delete filters
modelBuilder.Entity<Product>()
.HasQueryFilter(p =>
p.TenantId == _tenantProvider.CurrentTenantId &&
!p.IsDeleted);
}
}
// Queries automatically filter by tenant
var tenantProducts = await context.Products.ToListAsync();
Filter Composition
Filters compose with query predicates. EF combines them with AND logic. Temporal queries and filters interact correctly. Test filter combinations thoroughly.
Filter Composition
// Filter + query predicate + ordering
var expensiveProducts = await context.Products
.Where(p => p.Price > 100) // Your predicate
.OrderBy(p => p.Name)
.ToListAsync();
// Generated SQL includes: WHERE TenantId = @tenant AND IsDeleted = 0 AND Price > 100
// Ignore filters for admin operations
var allTenantsProducts = await context.Products
.IgnoreQueryFilters()
.Where(p => p.IsDeleted == false)
.ToListAsync();
Filter Gotchas
Filters don't apply to SQL you run directly with ExecuteSqlRaw, and Find can return an already-tracked entity that a filter would otherwise exclude (see the sketch below). Always test admin operations that bypass filters. Use IgnoreQueryFilters explicitly rather than relying on implicit bypasses. Filters compose, so multiple filters on the same entity combine with AND logic.
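A sketch of the Find gotcha, assuming productId points at a product you then soft-delete:
Find vs Filters
var product = await context.Products.FindAsync(productId);
product!.IsDeleted = true;
await context.SaveChangesAsync();

// LINQ queries respect the filter and no longer see the product
var viaQuery = await context.Products.FirstOrDefaultAsync(p => p.Id == productId); // null

// Find checks the change tracker first and returns the deleted, still-tracked instance
var viaFind = await context.Products.FindAsync(productId); // not null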
Compiled Queries
Compiled queries cache the query translation and execution plan. EF normally translates LINQ to SQL on every execution. Compilation eliminates that overhead for hot paths. Use compiled queries for stable query shapes called frequently.
Create Compiled Queries
Use EF.CompileQuery or EF.CompileAsyncQuery to define queries once. Store them in static fields. The query shape must be fixed. Parameters are allowed but dynamic projections aren't.
Compiled Query Definitions
public static class CompiledQueries
{
// Product lookup by ID (async, single result)
public static readonly Func<EcommerceDbContext, int, Task<Product?>> GetProductById =
EF.CompileAsyncQuery((EcommerceDbContext context, int id) =>
context.Products.FirstOrDefault(p => p.Id == id));
// Recent orders for customer (async collection returns IAsyncEnumerable)
public static readonly Func<EcommerceDbContext, int, int, IAsyncEnumerable<Order>> GetRecentOrders =
EF.CompileAsyncQuery((EcommerceDbContext context, int customerId, int limit) =>
context.Orders
.Where(o => o.CustomerId == customerId)
.OrderByDescending(o => o.OrderDate)
.Take(limit));
// Order total by status (sync, scalar result)
public static readonly Func<EcommerceDbContext, OrderStatus, decimal> GetTotalByStatus =
EF.CompileQuery((EcommerceDbContext context, OrderStatus status) =>
context.Orders
.Where(o => o.Status == status)
.Sum(o => o.Total.Amount));
// Top products by revenue (sync collection returns IEnumerable)
public static readonly Func<EcommerceDbContext, int, IEnumerable<ProductRevenue>> GetTopProducts =
EF.CompileQuery((EcommerceDbContext context, int limit) =>
context.OrderLines
.GroupBy(ol => new { ol.ProductId, ol.Product.Name })
.Select(g => new ProductRevenue
{
ProductId = g.Key.ProductId,
ProductName = g.Key.Name,
TotalRevenue = g.Sum(ol => ol.UnitPrice * ol.Quantity)
})
.OrderByDescending(pr => pr.TotalRevenue)
.Take(limit));
}
public class ProductRevenue
{
public int ProductId { get; set; }
public string ProductName { get; set; } = string.Empty;
public decimal TotalRevenue { get; set; }
}
Use Compiled Queries
Call compiled queries like functions. Pass the context and parameters. The query plan is cached after first execution. Subsequent calls skip translation.
Using Compiled Queries
// Regular query (translates LINQ every time)
var product = await context.Products
.Where(p => p.Id == productId)
.FirstOrDefaultAsync();
// Compiled query (translation cached)
var compiledProduct = await CompiledQueries.GetProductById(context, productId);
// Dashboard hot path (IAsyncEnumerable streams results efficiently)
var recentOrders = new List<Order>();
await foreach (var order in CompiledQueries.GetRecentOrders(context, customerId, 10))
recentOrders.Add(order);
var pendingTotal = CompiledQueries.GetTotalByStatus(context, OrderStatus.Pending);
var topProducts = CompiledQueries.GetTopProducts(context, 10).ToList();
// Translation savings depend on query complexity and frequency; the Performance Lab below measures them
When Not to Compile
Don't compile dynamic queries where projections or filters vary. Don't compile rarely-used queries. The cache overhead exceeds the benefit. Reserve compilation for hot paths with stable shapes.
Dynamic Query (Don't Compile)
// Dynamic projection - doesn't benefit from compilation
var products = await context.Products
.Select(p => includeDetails
? new { p.Id, p.Name, p.Price, p.Attributes }
: new { p.Id, p.Name, Price = 0m, Attributes = (ProductAttributes?)null })
.ToListAsync();
// Dynamic filtering - doesn't benefit from compilation
IQueryable<Order> query = context.Orders;
if (startDate.HasValue) query = query.Where(o => o.OrderDate >= startDate);
if (status.HasValue) query = query.Where(o => o.Status == status);
var orders = await query.ToListAsync();
Compilation Tradeoffs
Compiled queries save 10-50ms per execution depending on query complexity. The benefit scales with frequency. A query executed 1000 times per minute saves 10-50 seconds of CPU. Use them for dashboards, hot API endpoints, and repeated lookups. Skip them for admin operations, reports, and one-off queries.
Bulk Operations & Batching
ExecuteUpdate and ExecuteDelete run server-side operations without loading entities into memory. They bypass change tracking, SaveChanges interceptors, and domain events. Use them for maintenance tasks where you want raw performance.
Bulk Update
Update multiple rows with a single SQL statement. ExecuteUpdate translates to an UPDATE statement with your SET clause and WHERE condition. No entities are loaded or tracked.
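A sketch of a set-based update; the price threshold and 10% markup are illustrative:
Bulk Update
// Raise prices across matching rows with one UPDATE statement
var updatedCount = await context.Products
    .Where(p => p.Price < 50m)
    .ExecuteUpdateAsync(setters => setters
        .SetProperty(p => p.Price, p => p.Price * 1.10m));
Console.WriteLine($"Updated {updatedCount} products");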
Delete rows without loading them. ExecuteDelete translates to a DELETE statement. Database-level cascade deletes fire if configured; EF's client-side cascade behavior doesn't run. Foreign key constraints apply.
Bulk Delete
// Delete old soft-deleted records permanently
// (assumes Product also carries a DeletedDate set when it's soft-deleted)
var deletedCount = await context.Products
.IgnoreQueryFilters()
.Where(p => p.IsDeleted && p.DeletedDate < DateTime.UtcNow.AddYears(-1))
.ExecuteDeleteAsync();
Console.WriteLine($"Permanently deleted {deletedCount} products");
// Delete orders without lines
await context.Orders
.Where(o => !o.Lines.Any())
.ExecuteDeleteAsync();
Command Batching
EF automatically batches INSERT, UPDATE, and DELETE statements. Configure batch size via MaxBatchSize. Batching reduces round trips but requires careful error handling.
Command Batching
// Configure batch size
optionsBuilder.UseSqlServer(connectionString, options =>
{
options.MaxBatchSize(100); // Batch up to 100 commands
options.CommandTimeout(60);
});
// Import 10,000 products
var products = LoadProductsFromCsv(); // Returns 10,000 products
await context.Products.AddRangeAsync(products);
await context.SaveChangesAsync(); // Sends ~100 batches of 100 INSERTs each
// For massive imports, use a fresh context per chunk so the change tracker stays small
foreach (var batch in products.Chunk(1000))
{
await using var batchContext = new EcommerceDbContext();
batchContext.Products.AddRange(batch);
await batchContext.SaveChangesAsync();
}
Bulk Operation Caveats
ExecuteUpdate and ExecuteDelete bypass EF's change tracking pipeline: domain events don't fire and SaveChanges interceptors don't run. SQL Server temporal tables still version the rows, since history is maintained by the database, but application-level auditing is skipped. Use these operations for maintenance tasks where you explicitly want to skip the EF machinery. For business operations that need auditing, load entities and use SaveChanges.
Transactions & Concurrency
Transactions ensure atomic operations. Concurrency tokens prevent lost updates. EF supports optimistic concurrency with row versions and pessimistic concurrency with explicit locks.
Explicit Transactions
Wrap multiple SaveChanges calls in a transaction. Use BeginTransaction to start a unit of work. Commit on success or rollback on failure. Transactions scope to a single DbContext.
Explicit Transactions
using var transaction = await context.Database.BeginTransactionAsync();
try
{
// Create order
var order = new Order
{
CustomerId = customerId,
OrderDate = DateTime.UtcNow,
Status = OrderStatus.Pending
};
context.Orders.Add(order);
await context.SaveChangesAsync();
// Add order lines
foreach (var item in cartItems)
{
var line = new OrderLine
{
OrderId = order.Id,
ProductId = item.ProductId,
Quantity = item.Quantity,
UnitPrice = item.Price
};
context.OrderLines.Add(line);
}
await context.SaveChangesAsync();
// Update inventory
await context.Products
.Where(p => cartItems.Select(c => c.ProductId).Contains(p.Id))
.ExecuteUpdateAsync(setters => setters
.SetProperty(p => p.StockQuantity, p => p.StockQuantity - 1)); // assumes Product.StockQuantity; decrements one unit per product for brevity
await transaction.CommitAsync();
}
catch
{
await transaction.RollbackAsync();
throw;
}
Optimistic Concurrency
Use RowVersion or concurrency tokens to detect conflicts. When two users edit the same row, the second save fails with DbUpdateConcurrencyException. Handle by reloading and retrying.
Concurrency Handling
public class Order
{
public int Id { get; set; }
public decimal TotalAmount { get; set; }
[Timestamp]
public byte[] RowVersion { get; set; } = Array.Empty<byte>();
}
// Or configure in OnModelCreating:
modelBuilder.Entity<Order>()
.Property(o => o.RowVersion)
.IsRowVersion();
// Handle concurrency conflict
async Task UpdateOrderWithRetry(int orderId, decimal newTotal, int maxRetries = 3)
{
for (int attempt = 0; attempt < maxRetries; attempt++)
{
try
{
var order = await context.Orders.FindAsync(orderId);
if (order == null) return false;
order.TotalAmount = newTotal;
await context.SaveChangesAsync();
return true;
}
catch (DbUpdateConcurrencyException ex)
{
// Another user modified the order
var entry = ex.Entries.Single();
var databaseValues = await entry.GetDatabaseValuesAsync();
if (databaseValues == null)
{
// Order was deleted
return false;
}
// Reload with current database values
entry.OriginalValues.SetValues(databaseValues);
if (attempt == maxRetries - 1)
throw; // Max retries exceeded
}
}
return false;
}
PostgreSQL xmin Concurrency
PostgreSQL uses xmin system column for concurrency. Configure it as a concurrency token. EF includes it in UPDATE statements automatically.
PostgreSQL Concurrency
public class Order
{
public int Id { get; set; }
public decimal TotalAmount { get; set; }
public uint Version { get; set; } // Maps to xmin
}
// In OnModelCreating:
modelBuilder.Entity<Order>()
.Property(o => o.Version)
.HasColumnName("xmin")
.HasColumnType("xid")
.ValueGeneratedOnAddOrUpdate()
.IsConcurrencyToken();
Retry Strategy
Implement exponential backoff for transient errors. Use Microsoft.EntityFrameworkCore.SqlServer's EnableRetryOnFailure for automatic retries. For concurrency conflicts, reload the entity and reapply business logic rather than blindly overwriting. Always log conflicts for monitoring.
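A sketch of enabling the SQL Server execution strategy; the retry count and delay values are illustrative:
Retry Configuration
optionsBuilder.UseSqlServer(connectionString, options =>
    options.EnableRetryOnFailure(
        maxRetryCount: 5,
        maxRetryDelay: TimeSpan.FromSeconds(10),
        errorNumbersToAdd: null));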
Performance Lab
Benchmarking reveals the cost of tracking, eager loading, and query patterns. Test scenarios that mirror production workloads. Small differences compound under load.
Tracking vs No-Tracking
AsNoTracking skips change detection. Use it for read-only queries. Tracking queries are 2-5x slower depending on entity graph size.
Tracking Comparison
var sw = Stopwatch.StartNew();
// Tracking query - EF snapshots entities for change detection
var tracked = await context.Products.ToListAsync();
Console.WriteLine($"Tracking: {sw.ElapsedMilliseconds}ms"); // ~150ms for 10k rows
sw.Restart();
// No-tracking query - skips snapshots
var noTracking = await context.Products.AsNoTracking().ToListAsync();
Console.WriteLine($"No-Tracking: {sw.ElapsedMilliseconds}ms"); // ~60ms for 10k rows
sw.Restart();
// Projection - most efficient (only loads needed columns)
var projection = await context.Products
.Select(p => new { p.Id, p.Name, p.Price })
.ToListAsync();
Console.WriteLine($"Projection: {sw.ElapsedMilliseconds}ms"); // ~40ms for 10k rows
Include vs Split Queries
Include uses LEFT JOINs by default. Split queries use separate SQL statements. Split queries avoid cartesian explosion for multiple collections.
Include Strategies
// Single query with JOIN - can cause cartesian explosion
var singleQuery = await context.Orders
.Include(o => o.Lines)
.Include(o => o.Customer)
.ToListAsync();
// SQL: LEFT JOIN OrderLines LEFT JOIN Customers
// Returns N_orders × N_lines rows (duplicate order/customer data)
// Split queries - separate SQL statements
var splitQuery = await context.Orders
.Include(o => o.Lines)
.Include(o => o.Customer)
.AsSplitQuery()
.ToListAsync();
// SQL 1: SELECT * FROM Orders
// SQL 2: SELECT * FROM OrderLines WHERE OrderId IN (...)
// SQL 3: SELECT * FROM Customers WHERE Id IN (...)
// Configure split queries globally
optionsBuilder.UseSqlServer(connectionString,
options => options.UseQuerySplittingBehavior(QuerySplittingBehavior.SplitQuery));
Compiled vs Regular Queries
Measure the translation overhead. Compiled queries save 10-50ms per execution. The benefit scales with call frequency.
Query Compilation Benchmark
const int iterations = 1000;
var sw = Stopwatch.StartNew();
// Regular query
for (int i = 0; i < iterations; i++)
{
var product = await context.Products
.Where(p => p.Id == i % 100)
.FirstOrDefaultAsync();
}
Console.WriteLine($"Regular: {sw.ElapsedMilliseconds}ms"); // ~5000ms
sw.Restart();
// Compiled query
for (int i = 0; i < iterations; i++)
{
var product = await CompiledQueries.GetProductById(context, i % 100);
}
Console.WriteLine($"Compiled: {sw.ElapsedMilliseconds}ms"); // ~3000ms
// Savings: ~2ms per query × 1000 queries = 2000ms total
Batch Size Tuning
Test different batch sizes for bulk inserts. Small batches increase round trips. Large batches risk timeout. Find the sweet spot for your data shape.
Batch Size Testing
foreach (var batchSize in new[] { 10, 50, 100, 500, 1000 })
{
var products = GenerateProducts(10000); // fresh entities per run so already-saved rows don't collide
var sw = Stopwatch.StartNew();
var options = new DbContextOptionsBuilder<EcommerceDbContext>()
.UseSqlServer(connectionString, opts => opts.MaxBatchSize(batchSize))
.Options;
using var context = new EcommerceDbContext(options);
await context.Products.AddRangeAsync(products);
await context.SaveChangesAsync();
Console.WriteLine($"Batch size {batchSize}: {sw.ElapsedMilliseconds}ms");
}
// Results: 10=12000ms, 50=3000ms, 100=1800ms, 500=1500ms, 1000=1600ms
// Sweet spot: 500 for this data shape
Benchmark Principles
Use production-like data volumes. Warm up queries before measuring. Run multiple iterations and average results. Test on the actual database server, not SQLite or in-memory. Profile with SQL Profiler or pg_stat_statements to verify generated SQL. Monitor database metrics alongside application metrics.
Testing & Seeding
Use SQLite in-memory for fast unit tests. Use Testcontainers for provider-correct integration tests. Seed test data consistently to avoid flaky tests.
SQLite In-Memory Tests
SQLite runs entirely in memory. Tests execute in milliseconds. It lacks some features like temporal tables and full JSON support. Use it for basic logic tests.
SQLite Test Setup
public class TestDbContextFactory
{
public static EcommerceDbContext CreateSqliteContext()
{
var options = new DbContextOptionsBuilder<EcommerceDbContext>()
.UseSqlite("DataSource=:memory:")
.Options;
var context = new EcommerceDbContext(options);
context.Database.OpenConnection(); // Keep in-memory database alive
context.Database.EnsureCreated();
return context;
}
}
// Test example
[Fact]
public async Task Order_Total_Calculated_Correctly()
{
using var context = TestDbContextFactory.CreateSqliteContext();
var product = new Product { Name = "Test Product", Price = 10.00m };
context.Products.Add(product);
await context.SaveChangesAsync();
var order = new Order
{
CustomerId = 1,
Lines = new List<OrderLine>
{
new() { ProductId = product.Id, Quantity = 2, UnitPrice = 10.00m },
new() { ProductId = product.Id, Quantity = 3, UnitPrice = 10.00m }
}
};
order.Total = new Money(50.00m);
context.Orders.Add(order);
await context.SaveChangesAsync();
var saved = await context.Orders.Include(o => o.Lines).FirstAsync();
Assert.Equal(50.00m, saved.Total.Amount);
Assert.Equal(5, saved.Lines.Sum(l => l.Quantity));
}
Testcontainers Integration
Testcontainers spins up real database containers for tests. Use it for provider-specific features like temporal tables, JSON queries, and stored procedures. Tests run slower but cover actual behavior.
Testcontainers Setup
// Add package: Testcontainers.MsSql or Testcontainers.PostgreSql
using Testcontainers.MsSql;
public class DatabaseFixture : IAsyncLifetime
{
private readonly MsSqlContainer _container = new MsSqlBuilder()
.WithImage("mcr.microsoft.com/mssql/server:2022-latest")
.Build();
public string ConnectionString => _container.GetConnectionString();
public async Task InitializeAsync()
{
await _container.StartAsync();
// Run migrations
var options = new DbContextOptionsBuilder<EcommerceDbContext>()
.UseSqlServer(ConnectionString)
.Options;
using var context = new EcommerceDbContext(options);
await context.Database.MigrateAsync();
}
public async Task DisposeAsync()
{
await _container.DisposeAsync();
}
}
[Collection("Database")]
public class TemporalTableTests : IClassFixture<DatabaseFixture>
{
private readonly DatabaseFixture _fixture;
public TemporalTableTests(DatabaseFixture fixture)
{
_fixture = fixture;
}
[Fact]
public async Task Temporal_Query_Returns_Historical_Data()
{
var options = new DbContextOptionsBuilder<EcommerceDbContext>()
.UseSqlServer(_fixture.ConnectionString)
.Options;
using var context = new EcommerceDbContext(options);
// Create order
var order = new Order { Status = OrderStatus.Pending };
context.Orders.Add(order);
await context.SaveChangesAsync();
var orderDate = DateTime.UtcNow;
// Update order
await Task.Delay(100);
order.Status = OrderStatus.Shipped;
await context.SaveChangesAsync();
// Query historical state
var historical = await context.Orders
.TemporalAsOf(orderDate)
.FirstAsync(o => o.Id == order.Id);
Assert.Equal(OrderStatus.Pending, historical.Status);
}
}
Seed Test Data
Create reusable seed methods for test data. Use builders or factories for complex object graphs. Reset state between tests to avoid coupling.
Test Data Seeding
public static class TestDataSeeder
{
public static async Task SeedBasicData(EcommerceDbContext context)
{
var products = new[]
{
new Product { Name = "Product A", Sku = new Sku("SKU-A"), Price = 10.00m },
new Product { Name = "Product B", Sku = new Sku("SKU-B"), Price = 20.00m },
new Product { Name = "Product C", Sku = new Sku("SKU-C"), Price = 30.00m }
};
context.Products.AddRange(products);
var customer = new Customer
{
Email = "test@example.com",
ShippingAddress = new Address("123 Main St", "City", "12345", "US")
};
context.Customers.Add(customer);
await context.SaveChangesAsync();
}
public static OrderBuilder Order() => new OrderBuilder();
}
public class OrderBuilder
{
private readonly Order _order = new() { OrderDate = DateTime.UtcNow };
public OrderBuilder WithCustomer(int customerId)
{
_order.CustomerId = customerId;
return this;
}
public OrderBuilder WithLine(int productId, int quantity, decimal price)
{
_order.Lines.Add(new OrderLine
{
ProductId = productId,
Quantity = quantity,
UnitPrice = price
});
return this;
}
public Order Build() => _order;
}
Test Strategy
Use SQLite for 80% of tests covering business logic. Use Testcontainers for 20% covering provider-specific features like temporal tables, JSON queries, and concurrency. Mock external dependencies. Reset database state between tests using transactions or database recreation. Keep tests fast to encourage running them frequently.
Migrations & CI/CD
Migrations evolve your database schema safely. Generate idempotent SQL scripts for production. Test migrations in staging before production. Handle online migrations to avoid downtime.
Create and Apply Migrations
Use the dotnet ef tool to generate migrations from model changes. Review generated SQL before applying. Test rollback scenarios.
Migration Commands
# Install EF Core tools
dotnet tool install --global dotnet-ef
# Create migration
dotnet ef migrations add AddTemporalTables
# Review generated migration
# Edit Up/Down methods if needed
# Apply to local database
dotnet ef database update
# Generate SQL script for production
dotnet ef migrations script --idempotent --output migrations.sql
# Apply specific migration
dotnet ef database update AddOwnedTypes
# Rollback to previous migration
dotnet ef database update PreviousMigration
# Remove last migration (if not applied)
dotnet ef migrations remove
Idempotent Scripts
Idempotent scripts can run multiple times safely. They check for existing schema before making changes. Use them in CI/CD pipelines where execution state is uncertain.
Idempotent Migration Script
# Generate idempotent script for all migrations
dotnet ef migrations script --idempotent --output deploy.sql
# Script includes checks like:
# IF NOT EXISTS (SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20250114_AddTemporalTables')
# BEGIN
# -- Migration code here
# INSERT INTO [__EFMigrationsHistory] ([MigrationId], [ProductVersion])
# VALUES (N'20250114_AddTemporalTables', N'8.0.0');
# END
Online Migrations
Avoid downtime by making migrations backward-compatible. Add columns as nullable, deploy code, backfill data, then enforce constraints. Use feature flags to toggle new behavior.
Online Migration Strategy
// Phase 1: Add nullable column
public partial class AddStatusColumn : Migration
{
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.AddColumn<string>(
name: "Status",
table: "Orders",
nullable: true,
maxLength: 20);
}
}
// Phase 2: Deploy code that writes to new column (reads from old)
public class Order
{
public string? Status { get; set; }
public string GetEffectiveStatus() => Status ?? "Pending";
}
// Phase 3: Backfill data
await context.Database.ExecuteSqlRawAsync(
"UPDATE Orders SET Status = 'Pending' WHERE Status IS NULL");
// Phase 4: Make column non-nullable
public partial class MakeStatusRequired : Migration
{
protected override void Up(MigrationBuilder migrationBuilder)
{
migrationBuilder.AlterColumn<string>(
name: "Status",
table: "Orders",
maxLength: 20,
nullable: false);
}
}
CI/CD Integration
Run migrations as part of deployment pipeline. Use migration bundles for containerized deployments. Validate migrations in staging before production.
CI/CD Pipeline (GitHub Actions)
name: Deploy
on:
push:
branches: [main]
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Setup .NET
uses: actions/setup-dotnet@v3
with:
dotnet-version: '8.0.x'
- name: Install EF Core tools
run: dotnet tool install --global dotnet-ef
- name: Generate migration bundle
run: dotnet ef migrations bundle --configuration Release
- name: Run migrations on staging
run: ./efbundle --connection "${{ secrets.STAGING_CONNECTION }}"
- name: Run tests
run: dotnet test
- name: Deploy application
run: dotnet publish -c Release
- name: Run migrations on production
run: ./efbundle --connection "${{ secrets.PROD_CONNECTION }}"
Migration Safety
Never delete columns immediately. Mark them unused, deploy, verify, then remove in a later migration. Test rollback scenarios. Keep migrations small and focused. Use transactions where supported. Monitor lock times in production. Consider maintenance windows for large schema changes that require table rebuilds.
Observability
EF Core emits detailed logs, diagnostic events, and metrics. Configure logging levels to balance verbosity and performance. Use interceptors to inject tags for distributed tracing. Monitor query performance and connection pool health.
Configure Logging
Control which EF categories log. Enable sensitive data logging in development. Send logs to Application Insights or other sinks in production.
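A sketch of category-scoped logging; the Command category and Information level shown here suit development:
Logging Configuration
optionsBuilder
    .LogTo(
        Console.WriteLine,
        new[] { DbLoggerCategory.Database.Command.Name },
        LogLevel.Information)
    .EnableSensitiveDataLogging(); // development only: logs parameter values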
Add tags to queries that appear as SQL comments. Use them to correlate queries with application traces. Tag with operation name, user ID, or request ID.
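A sketch using TagWith; requestId and customerId stand in for whatever correlation values your app carries:
Query Tags
// The tag appears as a comment ahead of the generated SQL
var orders = await context.Orders
    .TagWith($"GetRecentOrders: request {requestId}")
    .Where(o => o.CustomerId == customerId)
    .OrderByDescending(o => o.OrderDate)
    .Take(10)
    .ToListAsync();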
Track query durations, connection pool usage, and SaveChanges latency. Export metrics to Prometheus or Application Insights. Alert on slow queries and pool exhaustion.
Metrics Collection
public class MetricsInterceptor : DbCommandInterceptor
{
private readonly IMetrics _metrics;
private readonly ILogger<MetricsInterceptor> _logger;
public MetricsInterceptor(IMetrics metrics, ILogger<MetricsInterceptor> logger)
{
_metrics = metrics;
_logger = logger;
}
// ReaderExecutedAsync fires after the command completes;
// eventData.Duration carries the measured execution time
public override ValueTask<DbDataReader> ReaderExecutedAsync(
DbCommand command,
CommandExecutedEventData eventData,
DbDataReader result,
CancellationToken cancellationToken = default)
{
_metrics.RecordQueryDuration(
command.CommandText,
(long)eventData.Duration.TotalMilliseconds,
eventData.Context?.Database.GetDbConnection().Database ?? "unknown");
if (eventData.Duration.TotalMilliseconds > 1000)
{
_metrics.IncrementSlowQueryCount();
_logger.LogWarning(
"Slow query detected: {Duration}ms - {Query}",
eventData.Duration.TotalMilliseconds,
command.CommandText);
}
return base.ReaderExecutedAsync(command, eventData, result, cancellationToken);
}
}
// Monitor connection pool usage via EF's DiagnosticSource events.
// Observer is a small IObserver<KeyValuePair<string, object?>> helper that forwards to the callback (not shown).
DiagnosticListener.AllListeners.Subscribe(listener =>
{
if (listener.Name == "Microsoft.EntityFrameworkCore")
{
listener.Subscribe(new Observer(
onNext: kvp =>
{
if (kvp.Key == "Microsoft.EntityFrameworkCore.Database.Connection.ConnectionOpening")
{
_metrics.IncrementConnectionOpened();
}
else if (kvp.Key == "Microsoft.EntityFrameworkCore.Database.Connection.ConnectionClosed")
{
_metrics.IncrementConnectionClosed();
}
}));
}
});
Observability Strategy
Log SQL at Warning level in production to capture errors and slow queries. Tag queries with request IDs for trace correlation. Monitor connection pool metrics to detect leaks. Set alerts for queries exceeding 5 seconds. Use structured logging to enable filtering by operation type. Export metrics to centralized systems for dashboards and alerting.
Frequently Asked Questions
When should I use owned types vs separate entities?
Use owned types for value objects that have no identity outside their parent. Address, Money, and DateRange are perfect candidates. They're always queried with their parent and never referenced independently. Use separate entities when you need independent queries, relationships to other entities, or identity-based equality.
Are temporal tables a replacement for audit logging?
Temporal tables capture what changed, but not who changed it or why. Use them for point-in-time queries and compliance requirements. Complement them with application-level audit logs that capture user identity, IP addresses, and business context. Temporal tables give you the data history; audit logs give you the operational context.
How do compiled queries improve performance?
Compiled queries cache the query translation and execution plan, saving 10-50ms per query. The benefit compounds with query frequency. A dashboard hitting the same query 1000 times/minute saves 10-50 seconds of CPU time. Use them for hot paths with stable query shapes. Don't use them for dynamic queries with variable projections or filters.
Should I use JSON columns or separate tables for flexible data?
Use JSON for schema-less attributes that vary by record and are queried infrequently. Product specifications, user preferences, and metadata work well. Use tables for data with consistent structure, foreign keys, or heavy query requirements. JSON trades query performance for schema flexibility. Index JSON paths you query frequently.
What's the catch with ExecuteUpdate and ExecuteDelete?
These operations bypass change tracking and SaveChanges interceptors. Entity events don't fire. Database-maintained history like SQL Server temporal tables still records the change, but application-level audit hooks are skipped. Use them for maintenance tasks and bulk operations where you explicitly want to skip the EF pipeline. For business operations that need auditing or domain events, load entities normally and call SaveChanges.