Hello, .NET developers! 👋
When your application starts running tasks in parallel — multiple threads reading, writing, and updating data — one of the biggest challenges is data safety.
Traditional collections like List<T> or Dictionary<TKey,TValue> are not thread-safe, meaning concurrent modifications can cause exceptions or corrupt data.
To solve this, .NET introduced Concurrent Collections — a set of specialized thread-safe classes under System.Collections.Concurrent.
These include ConcurrentDictionary, ConcurrentBag, ConcurrentQueue, and ConcurrentStack.
Each serves a different purpose in multi-threaded scenarios, balancing speed, safety, and structure.
ConcurrentDictionary — When You Need Key-Based Access
ConcurrentDictionary<TKey,TValue> is a thread-safe version of Dictionary<TKey,TValue>.
It allows multiple threads to read and write simultaneously without locking the entire collection.
Real-Time Use Case
Imagine an API gateway caching user tokens. Each user ID maps to a token.
Multiple threads (requests) might try to update or validate the same token concurrently — that’s a perfect case for ConcurrentDictionary.
Example
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class TokenCache
{
    private readonly ConcurrentDictionary<string, string> _userTokens = new();

    public void AddOrUpdate(string userId, string token)
    {
        // Adds the token if the key is new, or replaces the old value otherwise.
        _userTokens.AddOrUpdate(userId, token, (key, oldValue) => token);
    }

    public string GetToken(string userId)
    {
        _userTokens.TryGetValue(userId, out var token);
        return token ?? "Token not found";
    }

    public int Count => _userTokens.Count;
}

class Program
{
    static void Main()
    {
        var cache = new TokenCache();

        Parallel.For(0, 100, i =>
        {
            cache.AddOrUpdate($"user{i}", $"token{i}");
        });

        Console.WriteLine($"Total tokens stored: {cache.Count}");
    }
}
AddOrUpdate and TryGetValue handle concurrency gracefully, ensuring data consistency without explicit locks.
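Alongside AddOrUpdate, ConcurrentDictionary offers GetOrAdd, which fits cache-miss scenarios: the value factory runs only when the key is absent. A minimal sketch (the key and token names are illustrative); note that under heavy contention the factory can run more than once, but only one result is ever stored:

```csharp
using System;
using System.Collections.Concurrent;

var tokens = new ConcurrentDictionary<string, string>();

// First call: "user1" is missing, so the factory creates the token.
string first = tokens.GetOrAdd("user1", id => $"token-for-{id}");

// Second call: the key exists, so the cached value is returned
// and this factory is never invoked.
string second = tokens.GetOrAdd("user1", id => "this-is-never-used");

Console.WriteLine(first);   // token-for-user1
Console.WriteLine(second);  // token-for-user1
```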
ConcurrentBag — When Order Doesn’t Matter
ConcurrentBag<T> is an unordered, thread-safe collection designed for scenarios where multiple threads add and remove items, but order is irrelevant.
It’s efficient because each thread keeps items in its own thread-local storage, reducing contention with other threads.
Real-Time Use Case
Imagine a background job scheduler pooling reusable objects, such as HttpClient instances, file handles, or task results.
You don’t care who gets which one — you just want safe, fast access.
Example
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class WorkerPool
{
    private readonly ConcurrentBag<string> _completedJobs = new();

    public void ProcessJob(int jobId)
    {
        _completedJobs.Add($"Job {jobId} completed at {DateTime.Now}");
    }

    public void DisplayResults()
    {
        foreach (var job in _completedJobs)
            Console.WriteLine(job);
    }
}

class Program
{
    static void Main()
    {
        var pool = new WorkerPool();
        Parallel.For(0, 10, i => pool.ProcessJob(i));
        pool.DisplayResults();
    }
}
Since order doesn’t matter, ConcurrentBag optimizes for speed by reducing lock contention, which makes it perfect for logging or collecting independent results.
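The pooling use case mentioned above can be sketched with Add and TryTake: TryTake returns false when the bag is empty, so the caller simply creates a fresh instance on a miss. This is a minimal sketch, with StringBuilder standing in for whatever reusable object you pool:

```csharp
using System;
using System.Collections.Concurrent;
using System.Text;

var pool = new ConcurrentBag<StringBuilder>();

// Rent: reuse a pooled instance if one is available, otherwise create one.
StringBuilder Rent() => pool.TryTake(out var sb) ? sb.Clear() : new StringBuilder();

// Return: put the instance back for the next caller.
void Return(StringBuilder sb) => pool.Add(sb);

var builder = Rent();   // pool is empty, so this is a new instance
builder.Append("hello");
Return(builder);

var reused = Rent();    // the returned instance comes back out, cleared
Console.WriteLine(ReferenceEquals(builder, reused)); // True
```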
ConcurrentStack — When You Need Last-In, First-Out (LIFO)
ConcurrentStack<T> behaves like a traditional stack, but it’s safe for multiple threads to push or pop items simultaneously.
It’s ideal when the latest data should be processed first.
Real-Time Use Case
Think of an image processing service where new tasks arrive faster than old ones are processed. You might prioritize the most recent images (LIFO order) to give users faster feedback.
Example
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class ImageProcessor
{
    private readonly ConcurrentStack<string> _images = new();

    public void AddImage(string fileName)
    {
        _images.Push(fileName);
    }

    public void ProcessImages()
    {
        // TryPop removes the most recently pushed image first (LIFO).
        while (_images.TryPop(out var file))
        {
            Console.WriteLine($"Processing {file}");
        }
    }
}

class Program
{
    static void Main()
    {
        var processor = new ImageProcessor();
        Parallel.For(1, 5, i => processor.AddImage($"image_{i}.png"));
        processor.ProcessImages();
    }
}
ConcurrentStack ensures the most recently added image is processed first without worrying about synchronization issues.
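ConcurrentStack also supports batch operations, PushRange and TryPopRange, which push or pop several items in a single atomic step and cut synchronization overhead compared to item-by-item calls. A small sketch:

```csharp
using System;
using System.Collections.Concurrent;

var stack = new ConcurrentStack<int>();

// Push three items as one atomic batch; the last array element (3) ends up on top.
stack.PushRange(new[] { 1, 2, 3 });

// Pop up to two items in one call; returns how many were actually popped,
// filling the buffer starting from the top of the stack.
var buffer = new int[2];
int popped = stack.TryPopRange(buffer);

Console.WriteLine(popped);                   // 2
Console.WriteLine(string.Join(",", buffer)); // 3,2 (LIFO order)
```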
ConcurrentQueue — When You Need First-In, First-Out (FIFO)
ConcurrentQueue<T> guarantees that items are processed in the same order they were added, even under concurrent access.
It’s the backbone of many message-queue or producer-consumer systems in .NET.
Real-Time Use Case
Consider a real-time chat application or task queue where messages must be delivered in sequence — no skipping or reordering allowed.
ConcurrentQueue makes sure every producer thread adds messages safely, and consumer threads dequeue them correctly.
Example
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class MessageQueue
{
    private readonly ConcurrentQueue<string> _messages = new();

    public void EnqueueMessage(string message)
    {
        _messages.Enqueue(message);
    }

    public void ProcessMessages()
    {
        // TryDequeue removes the oldest message first (FIFO).
        while (_messages.TryDequeue(out var msg))
        {
            Console.WriteLine($"Delivered: {msg}");
        }
    }
}

class Program
{
    static void Main()
    {
        var queue = new MessageQueue();
        Parallel.For(1, 6, i => queue.EnqueueMessage($"Message {i}"));
        queue.ProcessMessages();
    }
}
In high-throughput systems, ConcurrentQueue helps you maintain reliable message flow without race conditions.
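With a single producer, the FIFO guarantee is easy to observe: items come out in exactly the order they went in, even when the enqueueing happens on another thread. A minimal sketch:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

var queue = new ConcurrentQueue<int>();
var delivered = new List<int>();

// A single producer thread enqueues 1..5 in order.
var producer = Task.Run(() =>
{
    for (int i = 1; i <= 5; i++) queue.Enqueue(i);
});
producer.Wait();

// Drain the queue; FIFO means we see 1..5 in the original order.
while (queue.TryDequeue(out var item)) delivered.Add(item);

Console.WriteLine(string.Join(",", delivered)); // 1,2,3,4,5
```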
Choosing the Right Concurrent Collection
Each concurrent collection has its own specialty:
ConcurrentDictionary → Key/value lookups and updates (like caches or session stores).
ConcurrentBag → Unordered data collection (parallel results, logs, jobs).
ConcurrentStack → LIFO structure (undo operations, latest-first processing).
ConcurrentQueue → FIFO structure (message queues, background task pipelines).
All of them avoid coarse locking (internally they use lock-free algorithms or fine-grained synchronization), are optimized for multi-threaded workloads, and are designed for modern parallelism with Task and Parallel.For.
Wrapping Up
Concurrent collections make multi-threaded programming approachable and reliable. Instead of locking entire collections manually, these classes handle synchronization under the hood, providing both speed and safety. Whether you’re caching tokens, processing images, queuing messages, or gathering results from background tasks — these structures let multiple threads cooperate without colliding.