Building a Low-Latency gRPC Service for Real-Time Inter-Microservice Communication in C# and ASP.NET Core
Building a gRPC Service in C# and ASP.NET Core – Part 2

In modern distributed systems, low-latency communication is crucial for maintaining performance and responsiveness, especially in microservice architectures. This blog demonstrates how to use gRPC in C# and ASP.NET Core to achieve low-latency inter-microservice communication, using a real-world scenario: real-time order tracking in a delivery system.
Why gRPC for Low-Latency Communication?
gRPC excels in microservice communication because of its:
Efficient Protocol: gRPC uses HTTP/2, enabling multiplexed streams, binary serialization (via Protocol Buffers), and reduced overhead compared to REST.
Streaming Support: gRPC supports client, server, and bidirectional streaming, making it ideal for real-time use cases.
Strong Typing: Protocol Buffers ensure schema validation, reducing runtime errors.
Compact Payloads: Protobuf serialization produces smaller payloads, improving network efficiency.
These features make gRPC an excellent choice for scenarios demanding low latency and high throughput.
Scenario: Real-Time Order Tracking
In a delivery system, different microservices (e.g., Order Management, Delivery Updates, Notifications) need to communicate with minimal delay. We’ll build a gRPC service for the Order Management microservice to enable real-time updates and interactions with other microservices.
Step 1: Define the gRPC Contract
The service will allow:
Fetching the current status of an order.
Streaming real-time status updates for inter-microservice communication.
Define the order.proto file:
syntax = "proto3";

option csharp_namespace = "OrderService";

package order;

// Service definition.
service OrderTracking {
  // Unary call: fetch the current status of an order.
  rpc GetOrderStatus (OrderRequest) returns (OrderResponse);
  // Server streaming: push real-time status updates.
  rpc StreamOrderUpdates (OrderRequest) returns (stream OrderUpdate);
}

// Request message containing the order ID.
message OrderRequest {
  string order_id = 1;
}

// Response message for the current order status.
message OrderResponse {
  string order_id = 1;
  string status = 2;
}

// Real-time order update message.
message OrderUpdate {
  string order_id = 1;
  string status = 2;
  int64 timestamp = 3; // Unix time, in seconds.
}
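For the C# types used below (OrderTracking.OrderTrackingBase, OrderRequest, and so on) to be generated, the proto file has to be registered with the build. A typical setup looks like the following, assuming the file lives under a Protos folder and the Grpc.AspNetCore package is installed (the path and version wildcard are illustrative):

```xml
<!-- In the service project's .csproj file. -->
<ItemGroup>
  <PackageReference Include="Grpc.AspNetCore" Version="2.*" />
</ItemGroup>

<ItemGroup>
  <!-- GrpcServices="Server" generates the base class the service overrides.
       A client project would use GrpcServices="Client" instead. -->
  <Protobuf Include="Protos\order.proto" GrpcServices="Server" />
</ItemGroup>
```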
Step 2: Implement the Service
Create an OrderTrackingService.cs file in the Services folder and implement efficient, low-latency logic:
using Grpc.Core;
using System.Collections.Concurrent;

namespace OrderService.Services
{
    public class OrderTrackingService : OrderTracking.OrderTrackingBase
    {
        // Simulated in-memory order store.
        private static readonly ConcurrentDictionary<string, string> Orders = new()
        {
            ["123"] = "Processing",
            ["456"] = "Shipped",
            ["789"] = "Delivered"
        };

        // Get the current status of an order.
        public override Task<OrderResponse> GetOrderStatus(OrderRequest request, ServerCallContext context)
        {
            Orders.TryGetValue(request.OrderId, out var status);
            return Task.FromResult(new OrderResponse
            {
                OrderId = request.OrderId,
                Status = status ?? "Unknown"
            });
        }

        // Stream real-time updates until the order is delivered or the client disconnects.
        public override async Task StreamOrderUpdates(OrderRequest request, IServerStreamWriter<OrderUpdate> responseStream, ServerCallContext context)
        {
            var statuses = new[] { "Processing", "Shipped", "Out for Delivery", "Delivered" };
            var random = new Random();

            foreach (var status in statuses)
            {
                // Honor client cancellation so the server stops work when the caller disconnects.
                await Task.Delay(random.Next(500, 1500), context.CancellationToken); // Simulate real-time delay.
                await responseStream.WriteAsync(new OrderUpdate
                {
                    OrderId = request.OrderId,
                    Status = status,
                    Timestamp = DateTimeOffset.UtcNow.ToUnixTimeSeconds()
                });

                if (status == "Delivered") break; // End the stream once delivered.
            }
        }
    }
}
Step 3: Optimize for Low Latency
To ensure low latency, consider these techniques:
Efficient Serialization: Protocol Buffers serialize data into a compact binary format, reducing transmission time.
Connection Multiplexing: HTTP/2 allows multiple simultaneous streams over a single connection, reducing overhead.
Asynchronous Operations: Both the service and client use non-blocking, async APIs to minimize delays.
In-Memory Caching: Use ConcurrentDictionary to store order data for quick access without database calls.
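On the client side, connection behavior can also be tuned. The sketch below configures the underlying SocketsHttpHandler so that pooled connections stay warm, HTTP/2 keepalive pings detect dead peers promptly, and extra connections are opened when the concurrent-stream limit is hit. The address and timeout values are illustrative, not benchmarked recommendations:

```csharp
using Grpc.Net.Client;

// Assumes the Grpc.Net.Client package (.NET 5 or later).
var handler = new SocketsHttpHandler
{
    // Keep pooled connections alive so repeated calls skip the TCP/TLS handshake.
    PooledConnectionIdleTimeout = Timeout.InfiniteTimeSpan,
    // Send HTTP/2 keepalive pings to detect broken connections quickly.
    KeepAlivePingDelay = TimeSpan.FromSeconds(60),
    KeepAlivePingTimeout = TimeSpan.FromSeconds(30),
    // Open additional HTTP/2 connections when the stream limit is reached.
    EnableMultipleHttp2Connections = true
};

var channel = GrpcChannel.ForAddress("https://localhost:5001", new GrpcChannelOptions
{
    HttpHandler = handler
});
```

Reusing a single channel across calls is itself a latency win: each GrpcChannel.ForAddress call would otherwise pay connection-establishment costs again.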
Step 4: Configure the gRPC Server
Ensure the server is configured to handle gRPC's HTTP/2 requirements. Modify Program.cs:
using OrderService.Services;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddGrpc();

var app = builder.Build();

app.MapGrpcService<OrderTrackingService>();
app.MapGet("/", () => "This service supports gRPC. Use a gRPC client to connect.");

app.Run();
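gRPC requires HTTP/2, which Kestrel negotiates over TLS by default. If the service must run without TLS (for example, behind a trusted internal load balancer), the protocol can be pinned explicitly in appsettings.json; this is a minimal sketch of that configuration:

```json
{
  "Kestrel": {
    "EndpointDefaults": {
      "Protocols": "Http2"
    }
  }
}
```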
Step 5: Test the Service
Create a Client for Low-Latency Communication
Here's a sample client to test the real-time streaming:
using Grpc.Core;
using Grpc.Net.Client;
using OrderService;

// Grpc.Core supplies the ReadAllAsync extension used below.
using var channel = GrpcChannel.ForAddress("https://localhost:5001");
var client = new OrderTracking.OrderTrackingClient(channel);

// Fetch the current order status.
var statusResponse = await client.GetOrderStatusAsync(new OrderRequest { OrderId = "123" });
Console.WriteLine($"Order ID: {statusResponse.OrderId}, Status: {statusResponse.Status}");

// Stream real-time updates.
using var call = client.StreamOrderUpdates(new OrderRequest { OrderId = "123" });
await foreach (var update in call.ResponseStream.ReadAllAsync())
{
    Console.WriteLine($"Update: {update.Status} at {update.Timestamp}");
}
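To verify the latency figures in your own environment, the unary call can be timed directly. A minimal sketch, reusing the client above; a warm-up call is made first because the initial request pays the connection-establishment cost:

```csharp
using System.Diagnostics;

// Warm up: the first call includes TCP/TLS/HTTP-2 setup and should not be counted.
await client.GetOrderStatusAsync(new OrderRequest { OrderId = "123" });

const int iterations = 100;
var stopwatch = Stopwatch.StartNew();

for (var i = 0; i < iterations; i++)
{
    await client.GetOrderStatusAsync(new OrderRequest { OrderId = "123" });
}

stopwatch.Stop();
Console.WriteLine($"Average round-trip: {stopwatch.Elapsed.TotalMilliseconds / iterations:F2} ms");
```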
Performance Comparison: gRPC vs REST
| Feature | gRPC | REST |
| --- | --- | --- |
| Latency | ~10-20 ms overhead | ~50-100 ms overhead |
| Payload Size | Compact (Protobuf) | Larger (JSON) |
| Streaming | Native support | Workaround needed |
| Connection Overhead | Low (HTTP/2) | Higher (HTTP/1.1) |
Conclusion
gRPC is a robust solution for low-latency inter-microservice communication, offering performance benefits over traditional REST APIs. By leveraging features like HTTP/2, efficient serialization, and streaming, you can build real-time, responsive systems tailored for modern distributed architectures.