NestJS Communication Between Microservices

A brief overview of microservice communication patterns

Communication between microservices can happen through a synchronous request/response pattern or an asynchronous event/message pattern.

The synchronous request/response pattern is useful where a response/acknowledgement is needed before proceeding with the next task, e.g. a user authentication service that returns an auth token as the response: until we receive the auth token for the requested user ID, we cannot proceed with tasks that can only be performed by an authenticated user.

An asynchronous message/event pattern is useful where an immediate response/acknowledgement is not expected and we are fine with eventual consistency. It is worth noting that eventual consistency is not acceptable for every use case.
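Jumping slightly ahead, here is a minimal sketch of how these two patterns look with NestJS's ClientProxy. The injection token 'Auth-Service' and the message patterns are illustrative only; registering such a client is covered later with ClientsModule.

import { Inject, Injectable } from '@nestjs/common';
import { ClientProxy } from '@nestjs/microservices';
import { firstValueFrom } from 'rxjs';

@Injectable()
export class AuthGatewayService {
  // 'Auth-Service' is a hypothetical client token registered via ClientsModule.
  constructor(@Inject('Auth-Service') private readonly client: ClientProxy) {}

  // Synchronous request/response: wait for the auth token before moving on.
  async login(userId: string, password: string): Promise<string> {
    return firstValueFrom(
      this.client.send<string>('auth.login', { userId, password }),
    );
  }

  // Asynchronous event: fire and forget, relying on eventual consistency.
  notifyUserLoggedIn(userId: string): void {
    this.client.emit('user.logged_in', { userId });
  }
}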

How does NestJS establish communication between microservices?

NestJS has built-in transporters for communication between microservices, supporting both synchronous and asynchronous communication. The transporter-related types are imported from '@nestjs/microservices'.

NestJS abstracts the implementation of these transporters, so it is easy to switch between them without major changes to our code base.

npm i --save @nestjs/microservices

import { Transport, MicroserviceOptions } from '@nestjs/microservices';

export declare enum Transport {
  TCP = 0,
  REDIS = 1,
  NATS = 2,
  MQTT = 3,
  GRPC = 4,
  RMQ = 5,
  KAFKA = 6
}

Specify the transporter for the microservice. TCP is the default transporter.

NestJS provides a factory (NestFactory) for creating a microservice, which accepts the application module and the MicroserviceOptions, including one of the named constants from the Transport enum.

The snippet below shows a sample of creating an Auth microservice that uses TCP as its transporter via NestFactory.
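A minimal sketch of such a bootstrap file, assuming the Auth service's root module is named AuthModule and the service listens on port 3001 (both illustrative values):

import { NestFactory } from '@nestjs/core';
import { Transport, MicroserviceOptions } from '@nestjs/microservices';
import { AuthModule } from './auth.module';

async function bootstrap() {
  // Create the Auth microservice with TCP as its transporter.
  const app = await NestFactory.createMicroservice<MicroserviceOptions>(AuthModule, {
    transport: Transport.TCP,
    options: {
      host: '0.0.0.0',
      port: 3001,
    },
  });
  await app.listen();
}
bootstrap();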

The synchronous communication pattern is supported by TCP, whereas the asynchronous communication pattern is supported by Kafka, MQTT, and NATS.

For illustration purposes, let us create three microservices and see how they communicate with each other. In addition, we will see an example that uses Mongoose as the ODM (Object Data Mapper) to save data into MongoDB as the persistence layer, and Kafka (via KafkaJS) as the message broker.

  1. API Gateway — Acts as the gateway for incoming requests and redirects each request to the appropriate service registered with the API gateway.

  2. Auth Microservice — A simple authentication service that hashes user passwords before they are saved to the persistence layer, and returns a valid JWT token for the requested user ID.

  3. User Microservice — A service for the user entity; it follows the CQRS pattern for CRUD operations on the user entity.

API Gateway

This acts as the gateway for incoming requests and holds the hosting information for the other microservices. It is also the client and the triggering point for the event-driven communication that happens between the microservices.

The User microservice uses Kafka as its transport layer. I am running Kafka using the docker-compose file below, which also provides a UI for Kafka monitoring. I find this very useful for viewing the messages sent over topics.

version: "2"

services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - 22181:2181
  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - 29092:29092
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  kafka_ui:
    image: provectuslabs/kafka-ui:latest
    depends_on:
      - kafka
    ports:
      - 8080:8080
    environment:
      KAFKA_CLUSTERS_0_ZOOKEEPER: zookeeper:2181
      KAFKA_CLUSTERS_0_NAME: local
      KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: kafka:9092
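
With this file in place, docker-compose up -d starts ZooKeeper, Kafka, and the Kafka UI; given the port mappings above, the broker is reachable from the host at localhost:29092 and the Kafka UI at localhost:8080.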

Registering clients with ClientsModule

The ClientsModule registers each microservice along with the transport protocol used for communication. These settings can be passed to the application by injecting the ConfigService from the ConfigModule. In the API gateway's app.module.ts file, the ClientsModule registers the list of services the API gateway needs to access, together with their transport protocols. In the code snippet below, "User-Service" is registered as one of the microservices the API gateway needs to communicate with. Note that the ConfigService is loaded with variables from the .env file and injected into each client registration in the ClientsModule. We use Kafka as the transport protocol for communication.

/* eslint-disable prettier/prettier */
import { Module } from '@nestjs/common';
import { AppController } from './app.controller';
import { AppService } from './app.service';
import { Transport, ClientsModule } from '@nestjs/microservices';
import { EventEmitterModule } from '@nestjs/event-emitter';
import { ConfigModule, ConfigService } from '@nestjs/config';
import { AuthModule } from './auth/auth.module';

@Module({
  imports: [
    EventEmitterModule.forRoot({ global: true }),
    ConfigModule.forRoot({ isGlobal: true, envFilePath: `.env.${process.env.NODE_ENV}` }),
    AuthModule,

    // Load the ClientsModule in an async manner, reading the config from the ConfigModule (i.e. the .env file)
    ClientsModule.registerAsync([
      {
        name: 'User-Service',
        imports: [ConfigModule],
        useFactory: async (configService: ConfigService) => ({
          transport: Transport.KAFKA,
          options: {
            client: {
              brokers: [configService.getOrThrow('KAFKA_BROKER_URL')],
            },
            consumer: {
              groupId: 'user-service-consumer',
            },
          },
        }),
        inject: [ConfigService],
      },
    ]),
  ],
  controllers: [AppController],
  providers: [AppService],
})
export class AppModule {}

The Kafka transport supports both request-response style and asynchronous style messaging. NestJS provides an option to listen to the response topic from the Kafka broker: the client subscribes to this reply topic to receive the response to the query it initiated. On the other hand, NestJS also supports emitting events and listening to the messages that a producer service publishes to a topic.
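As a rough sketch of both styles (the patterns 'get_user' and 'user_created' and the payload shapes are illustrative, not part of the actual project), the gateway subscribes to the reply topic before sending a request, while the User microservice handles messages with @MessagePattern or @EventPattern:

import { Controller, Get, Inject, OnModuleInit, Param } from '@nestjs/common';
import { ClientKafka, MessagePattern, EventPattern, Payload } from '@nestjs/microservices';
import { firstValueFrom } from 'rxjs';

// API gateway side: request/response over Kafka using the 'User-Service' client registered above.
@Controller('users')
export class UserGatewayController implements OnModuleInit {
  constructor(@Inject('User-Service') private readonly userClient: ClientKafka) {}

  async onModuleInit() {
    // Subscribe to the reply topic of 'get_user' so responses can be received.
    this.userClient.subscribeToResponseOf('get_user');
    await this.userClient.connect();
  }

  @Get(':id')
  getUser(@Param('id') id: string) {
    // Request/response: waits for the User microservice to reply.
    return firstValueFrom(this.userClient.send('get_user', { id }));
  }
}

// User microservice side: message and event handlers.
@Controller()
export class UserController {
  @MessagePattern('get_user')
  getUser(@Payload() data: { id: string }) {
    // The return value is sent back on the reply topic.
    return { id: data.id, name: 'example user' };
  }

  @EventPattern('user_created')
  handleUserCreated(@Payload() data: { id: string }) {
    // Fire-and-forget: no response is sent back to the producer.
    console.log('user_created event received', data);
  }
}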

Continued in the below article.
