Efficient File I/O Operations with System.IO in .NET

Why Efficient File I/O Matters

File operations are fundamental to most applications. Whether you're reading configuration files, processing user uploads, logging application events, or generating reports, you'll work with files constantly.

The System.IO namespace provides classes for reading and writing files efficiently. Understanding when to use FileStream versus higher-level abstractions like StreamReader affects both performance and code clarity. Async operations become critical when handling multiple concurrent file operations in server environments.

This guide covers reading and writing text files, working with binary data, handling large files without memory issues, and using async patterns for scalable file operations.

Basic Text File Operations

The File class provides simple methods for common operations. These methods handle opening, reading, and closing files automatically, making them perfect for straightforward scenarios.

Simple file read and write
// Write all text to a file (overwrites if exists)
string content = "Hello, World!\nThis is a test file.";
File.WriteAllText("output.txt", content);

// Read all text from a file
string readContent = File.ReadAllText("output.txt");
Console.WriteLine(readContent);

// Write lines to a file
var lines = new[] { "Line 1", "Line 2", "Line 3" };
File.WriteAllLines("lines.txt", lines);

// Read all lines from a file
string[] readLines = File.ReadAllLines("lines.txt");
foreach (var line in readLines)
{
    Console.WriteLine(line);
}

// Append text to existing file
File.AppendAllText("output.txt", "\nAppended line");

// Check if file exists before reading
if (File.Exists("config.txt"))
{
    string config = File.ReadAllText("config.txt");
    Console.WriteLine(config);
}

These methods load the entire file into memory, which works fine for small to medium files. For large files, you'll need streaming approaches to keep memory usage bounded.
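
One way to decide between the two approaches is to check the file size up front. This is a minimal sketch; the 10 MB cutoff is an arbitrary assumption, not a framework rule, and File.ReadLines is covered in the next section.

// Pick an approach based on file size (threshold is illustrative)
var info = new FileInfo("data.txt");
const long maxInMemoryBytes = 10 * 1024 * 1024; // assumed 10 MB cutoff

if (info.Length <= maxInMemoryBytes)
{
    string all = File.ReadAllText("data.txt"); // small: load at once
    Console.WriteLine($"Loaded {all.Length} characters");
}
else
{
    foreach (var line in File.ReadLines("data.txt")) // large: stream lazily
    {
        // process each line without holding the whole file in memory
    }
}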

StreamReader and StreamWriter for Text

StreamReader and StreamWriter give you more control over text file processing. They detect and apply text encodings for you (UTF-8 by default) and let you read or write data incrementally.

Reading files line by line
// Read file line by line
using (var reader = new StreamReader("large-file.txt"))
{
    string line;
    int lineNumber = 0;

    while ((line = reader.ReadLine()) != null)
    {
        lineNumber++;
        Console.WriteLine($"{lineNumber}: {line}");

        // Process each line without loading entire file
        if (line.Contains("ERROR"))
        {
            Console.WriteLine($"Found error on line {lineNumber}");
        }
    }
}

// Using File.ReadLines for lazy enumeration
foreach (var line in File.ReadLines("log.txt"))
{
    if (line.StartsWith("[ERROR]"))
    {
        Console.WriteLine(line);
    }
}

ReadLine() returns null when reaching the end of the file. The using statement ensures the file handle closes properly even if exceptions occur.

Writing files with StreamWriter
// Write to file with StreamWriter
using (var writer = new StreamWriter("output.txt"))
{
    writer.WriteLine("Header");
    writer.WriteLine("========");

    for (int i = 1; i <= 10; i++)
    {
        writer.WriteLine($"Line {i}");
    }

    writer.WriteLine("Footer");
}

// Append to existing file
using (var writer = new StreamWriter("log.txt", append: true))
{
    writer.WriteLine($"[{DateTime.Now}] Application started");
}

// Specify encoding explicitly (Encoding requires using System.Text;)
using (var writer = new StreamWriter("utf8-file.txt", false, Encoding.UTF8))
{
    writer.WriteLine("Text with special characters: © ® ™");
}

StreamWriter buffers output for efficiency. The Dispose() method flushes the buffer and closes the file. You can call Flush() manually if you need to ensure data gets written before disposal.
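
If data must reach the operating system before the writer is disposed, for example in a crash-tolerant log, you can flush explicitly or enable AutoFlush. A minimal sketch; note that flushing after every write trades away most of the buffering benefit.

// Flush buffered output explicitly, or after every write
using (var writer = new StreamWriter("audit.log", append: true))
{
    writer.WriteLine($"[{DateTime.Now}] Critical event");
    writer.Flush(); // pushes buffered text to the file now, not at Dispose()

    writer.AutoFlush = true; // flush automatically after each write from here on
    writer.WriteLine($"[{DateTime.Now}] Another event");
}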

FileStream for Binary Data

FileStream works directly with bytes, giving you maximum control over file operations. Use it for binary files or when you need specific positioning and buffer control.

Reading and writing binary files
// Write binary data
byte[] data = { 0x48, 0x65, 0x6C, 0x6C, 0x6F }; // "Hello" in ASCII

using (var stream = new FileStream("data.bin", FileMode.Create, FileAccess.Write))
{
    stream.Write(data, 0, data.Length);
}

// Read binary data
using (var stream = new FileStream("data.bin", FileMode.Open, FileAccess.Read))
{
    byte[] buffer = new byte[stream.Length];
    int bytesRead = stream.Read(buffer, 0, buffer.Length);

    Console.WriteLine($"Read {bytesRead} bytes");
    // Decode only the bytes actually read; Read may return fewer than requested
    string text = Encoding.ASCII.GetString(buffer, 0, bytesRead);
    Console.WriteLine(text);  // Output: Hello
}

// Copy file in chunks
using (var source = new FileStream("source.dat", FileMode.Open, FileAccess.Read))
using (var dest = new FileStream("destination.dat", FileMode.Create, FileAccess.Write))
{
    byte[] buffer = new byte[4096];  // 4KB buffer
    int bytesRead;

    while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        dest.Write(buffer, 0, bytesRead);
    }
}

Chunked reading keeps memory usage constant regardless of file size. The buffer size affects performance: too small causes many read operations, too large wastes memory.
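
For straightforward copies you don't need to write the loop yourself: Stream.CopyTo performs the same chunked transfer internally. A brief sketch:

// Built-in chunked copy; the explicit buffer size is optional
using (var source = new FileStream("source.dat", FileMode.Open, FileAccess.Read))
using (var dest = new FileStream("destination.dat", FileMode.Create, FileAccess.Write))
{
    source.CopyTo(dest, bufferSize: 81920); // 80 KB, the framework default
}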

Async File Operations

Async file methods prevent blocking threads during I/O operations. This matters for server applications handling multiple concurrent requests or desktop apps that need to stay responsive.

Async read and write
// Async file reading
public static async Task<string> ReadFileAsync(string path)
{
    using (var reader = new StreamReader(path))
    {
        return await reader.ReadToEndAsync();
    }
}

// Async file writing
public static async Task WriteFileAsync(string path, string content)
{
    using (var writer = new StreamWriter(path))
    {
        await writer.WriteAsync(content);
    }
}

// Async line-by-line processing
public static async Task ProcessLogFileAsync(string path)
{
    using (var reader = new StreamReader(path))
    {
        string line;
        while ((line = await reader.ReadLineAsync()) != null)
        {
            if (line.Contains("ERROR"))
            {
                await LogErrorAsync(line); // placeholder for your own async logging method
            }
        }
    }
}

// Usage
var content = await ReadFileAsync("data.txt");
await WriteFileAsync("output.txt", content.ToUpper());
await ProcessLogFileAsync("application.log");

Async methods free up threads while waiting for disk operations. Your application can handle other work instead of blocking. This improves scalability in environments where many operations happen concurrently.
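
One practical consequence: independent files can be processed concurrently. This sketch fans out over several logs with Task.WhenAll, reusing ProcessLogFileAsync from above; the file names are illustrative.

// Process several files concurrently (requires using System.Linq;)
string[] logFiles = { "app1.log", "app2.log", "app3.log" };

var tasks = logFiles.Select(path => ProcessLogFileAsync(path));
await Task.WhenAll(tasks); // all files processed without blocking threads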

File Management Operations

Beyond reading and writing, you'll need to check existence, copy, move, and delete files. The File and Directory classes provide these operations.

File management operations
// Check if file or directory exists
if (File.Exists("config.json"))
{
    Console.WriteLine("Config file found");
}

if (Directory.Exists("logs"))
{
    Console.WriteLine("Logs directory exists");
}

// Create directory if it doesn't exist
Directory.CreateDirectory("output/reports");

// Copy file
File.Copy("source.txt", "backup.txt", overwrite: true);

// Move file (the target directory must already exist)
File.Move("temp.dat", "archive/temp.dat");

// Delete file
if (File.Exists("old-file.txt"))
{
    File.Delete("old-file.txt");
}

// Get file information
var fileInfo = new FileInfo("data.txt");
Console.WriteLine($"Size: {fileInfo.Length} bytes");
Console.WriteLine($"Created: {fileInfo.CreationTime}");
Console.WriteLine($"Modified: {fileInfo.LastWriteTime}");

// List files in directory
string[] files = Directory.GetFiles("logs", "*.log");
foreach (var file in files)
{
    Console.WriteLine(Path.GetFileName(file));
}

These operations throw exceptions if files are in use or locked, or if permissions are insufficient. Always wrap file operations in try-catch blocks when dealing with user-provided paths or external files.
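
The next section covers error handling in depth; as a minimal illustration of guarding one of these calls, IOException covers files that are locked or in use:

try
{
    File.Delete("old-file.txt");
}
catch (IOException ex)
{
    Console.WriteLine($"File is locked or in use: {ex.Message}");
}
catch (UnauthorizedAccessException)
{
    Console.WriteLine("Insufficient permissions to delete the file");
}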

Best Practices and Error Handling

File operations can fail for many reasons: files not found, permissions denied, disk full, or files locked by other processes. Proper error handling and resource management are critical.

Robust file handling
public static bool TryReadFile(string path, out string content)
{
    content = null;

    try
    {
        if (!File.Exists(path))
        {
            Console.WriteLine($"File not found: {path}");
            return false;
        }

        using (var reader = new StreamReader(path))
        {
            content = reader.ReadToEnd();
            return true;
        }
    }
    catch (UnauthorizedAccessException)
    {
        Console.WriteLine($"Access denied: {path}");
        return false;
    }
    catch (IOException ex)
    {
        Console.WriteLine($"I/O error: {ex.Message}");
        return false;
    }
}

// Safe file writing with temp file
public static void SafeWriteFile(string path, string content)
{
    string tempPath = path + ".tmp";

    try
    {
        // Write to temp file first
        File.WriteAllText(tempPath, content);

        // Replace original only if write succeeded
        if (File.Exists(path))
        {
            File.Replace(tempPath, path, path + ".bak");
        }
        else
        {
            File.Move(tempPath, path);
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Write failed: {ex.Message}");

        // Clean up temp file
        if (File.Exists(tempPath))
        {
            File.Delete(tempPath);
        }

        throw;
    }
}

Writing to a temporary file first protects against corruption if the write fails partway through. The File.Replace method provides atomic replacement with automatic backup creation.
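
A quick usage example for the helper above. Note one assumption baked into its design: File.Replace requires both files to live on the same volume, which is why the temp file is created next to the target rather than in the system temp directory.

// Usage: atomically update a settings file (contents are illustrative)
SafeWriteFile("settings.json", "{ \"logLevel\": \"Debug\" }");
// On success, settings.json holds the new content and
// settings.json.bak holds the previous version (if one existed)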

Frequently Asked Questions (FAQ)

When should you use FileStream versus StreamReader?

Use FileStream when working with binary data or when you need fine-grained control over reading and writing bytes. StreamReader is designed for text files and handles encoding automatically, making it easier to read text line-by-line or character-by-character. For most text file operations, StreamReader provides a simpler, more convenient API.

Why should you use async file operations?

Async file operations prevent blocking threads while waiting for disk I/O to complete. This matters in server applications where you need to handle many concurrent requests. By using async methods, threads remain available to process other work instead of sitting idle during file operations. This improves scalability and responsiveness.

How do you properly dispose of file streams?

Always wrap stream operations in using statements to ensure proper disposal. The using statement automatically calls Dispose() even if exceptions occur, closing the file handle and flushing buffers. Failing to dispose streams can keep files locked, leak operating system handles, and prevent other processes from accessing the files.
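
In C# 8 and later you can also use a using declaration, which disposes at the end of the enclosing scope and removes a level of nesting. A small sketch:

// Using declaration (C# 8+): reader is disposed when the method returns
static string ReadFirstLine(string path)
{
    using var reader = new StreamReader(path);
    return reader.ReadLine() ?? string.Empty;
}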

Can you read and write to the same file simultaneously?

You can open a file for both reading and writing by specifying FileMode and FileAccess parameters when creating a FileStream. However, you must carefully manage the stream position to avoid corrupting data. For most scenarios, reading the entire file, processing it, and writing results to a new file is safer and simpler.
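
If you do need both directions on a single handle, the sketch below opens a FileStream with FileAccess.ReadWrite and uses Seek to manage the position explicitly; the file name and offsets are illustrative.

using (var stream = new FileStream("data.bin", FileMode.Open, FileAccess.ReadWrite))
{
    // Read the first four bytes
    byte[] header = new byte[4];
    int read = stream.Read(header, 0, header.Length);

    // Rewind and overwrite them in place
    stream.Seek(0, SeekOrigin.Begin);
    stream.Write(new byte[] { 0x01, 0x02, 0x03, 0x04 }, 0, 4);
}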

What's the most efficient way to read large files?

Process large files line-by-line or in chunks rather than reading everything into memory at once. Use File.ReadLines() for line-by-line enumeration or FileStream with a buffer for binary data. These approaches keep memory usage constant regardless of file size, preventing OutOfMemoryException for very large files.
