Secure access to a cross-account Amazon MSK cluster from Amazon MSK Connect using IAM authentication


Amazon Managed Streaming for Apache Kafka (MSK) Connect is a fully managed, scalable, and highly available service that enables streaming data between Apache Kafka and other data systems. Amazon MSK Connect is built on top of Kafka Connect, an open-source framework that provides a standard way to connect Kafka with external data systems. Kafka Connect supports a variety of connectors, which are used to stream data in and out of Kafka. MSK Connect extends the capabilities of Kafka Connect by providing a managed service with added security features, simple configuration, and automatic scaling, enabling businesses to focus on their data streaming needs without the overhead of managing the underlying infrastructure.

In some use cases, you might need to use an MSK cluster in one AWS account while MSK Connect is located in a separate account. In this post, we demonstrate how to create a connector for this use case. At the time of writing, MSK Connect connectors can be created only for MSK clusters that have AWS Identity and Access Management (IAM) role-based authentication or no authentication. We demonstrate how to implement IAM authentication after establishing network connectivity. IAM provides enhanced security measures, making sure your systems are protected against unauthorized access.

Solution overview

The connector can be configured for a variety of purposes, such as sinking data to an Amazon Simple Storage Service (Amazon S3) bucket, tracking source database changes, or serving as a migration tool such as MirrorMaker2 on MSK Connect to transfer data from a source cluster to a target cluster located in a different account.

The following diagram illustrates a use case using Debezium and Amazon S3 source connectors.

The following diagram illustrates using an S3 sink connector and migration to a cross-account failover cluster using a MirrorMaker connector deployed on MSK Connect.

The launch of multi-VPC private connectivity (powered by AWS PrivateLink) and cluster policy support for MSK clusters simplifies the connectivity of Kafka clients to brokers. By enabling this feature on the MSK cluster, you can use a cluster-based policy to manage all access control centrally in one place. In this post, we cover the process of enabling this feature on the source MSK cluster.

We don't fully utilize the multi-VPC connectivity provided by this new feature, because it requires you to use different bootstrap URLs with port numbers (14001-14003) that aren't supported by MSK Connect at the time of this writing. Instead, we use a secure network connectivity solution based on private connectivity patterns, as detailed in How Goldman Sachs builds cross-account connectivity to their Amazon MSK clusters with AWS PrivateLink.

Connecting to a cross-account MSK cluster from MSK Connect involves the following steps.

Steps to configure the MSK cluster in Account A:

  1. Enable the multi-VPC private connectivity (PrivateLink) feature for the IAM authentication scheme that's enabled on your MSK cluster.
  2. Configure the cluster policy to allow a cross-account connector.
  3. Implement one of the preceding network connectivity patterns according to your use case to establish connectivity with the Account B VPC, and make network changes accordingly.

Steps to configure the MSK connector in Account B:

  1. Create an MSK connector in private subnets using the AWS Command Line Interface (AWS CLI).
  2. Verify the network connectivity to Account A and make network changes accordingly.
  3. Check the destination service to verify the incoming data.

Prerequisites

To follow along with this post, you should have an MSK cluster in one AWS account and MSK Connect in a separate account.

Set up the MSK cluster in Account A

In this post, we show only the mandatory steps that are required to enable the multi-VPC feature on an MSK cluster:

  1. Create a provisioned MSK cluster in Account A's VPC with the following considerations, which are required for the multi-VPC feature:
    • Cluster version must be 2.7.1 or higher.
    • Instance type must be m5.large or higher.
    • Authentication should be IAM (you must not enable unauthenticated access for this cluster).
  2. After you create the cluster, go to the Networking settings section of your cluster and choose Edit. Then choose Turn on multi-VPC connectivity.

  3. Select IAM role-based authentication and choose the Turn on option.

It might take around 30 minutes to enable. This step is required to enable the cluster policy feature that allows the cross-account connector to access the MSK cluster.
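
If you prefer the AWS CLI over the console, the same setting can be turned on with the update-connectivity API. The following is a minimal sketch; <cluster-arn> and <current-cluster-version> are placeholders, and the current version is returned by aws kafka describe-cluster-v2:

    # Turn on multi-VPC private connectivity for the IAM auth scheme
    aws kafka update-connectivity \
      --cluster-arn <cluster-arn> \
      --current-version <current-cluster-version> \
      --connectivity-info '{"vpcConnectivity":{"clientAuthentication":{"sasl":{"iam":{"enabled":true}}}}}'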

  4. After it has been enabled, scroll down to Security settings and choose Edit cluster policy.
  5. Define your cluster policy and choose Save changes.

  6. The new cluster policy allows you to define a Basic or Advanced cluster policy. The Basic option only allows the CreateVpcConnection, GetBootstrapBrokers, DescribeCluster, and DescribeClusterV2 actions that are required for creating the cross-VPC connectivity to your cluster. However, we have to use Advanced to allow additional actions that are required by the MSK connector. The policy should be as follows:
    {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {
                "AWS": "<Connector-AccountId>"
            },
            "Action": [
                "kafka:CreateVpcConnection",
                "kafka:GetBootstrapBrokers",
                "kafka:DescribeCluster",
                "kafka:DescribeClusterV2",
                "kafka-cluster:Connect",
                "kafka-cluster:DescribeCluster",
                "kafka-cluster:ReadData",
                "kafka-cluster:DescribeTopic",
                "kafka-cluster:WriteData",
                "kafka-cluster:CreateTopic",
                "kafka-cluster:AlterGroup",
                "kafka-cluster:DescribeGroup"
            ],
            "Resource": [
                "arn:aws:kafka:<region>:<account-id>:cluster/<cluster-name>/<cluster-uuid>",
                "arn:aws:kafka:<region>:<account-id>:topic/<cluster-name>/<cluster-uuid>/<topic-name>",
                "arn:aws:kafka:<region>:<account-id>:group/<cluster-name>/<cluster-uuid>/<group-name>"
            ]
        }]
    }

You might need to modify the preceding permissions to limit access to your resources (topics and groups). You can also restrict access to a specific connector by specifying the connector's IAM role as the principal, or you can specify the account number to allow all connectors in that account.
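
If you prefer to attach and verify the policy with the AWS CLI instead of the console, the put-cluster-policy and get-cluster-policy APIs cover this. A minimal sketch, assuming the preceding policy is saved locally as cluster-policy.json and <cluster-arn> is a placeholder:

    # Attach the cluster policy defined above
    aws kafka put-cluster-policy \
      --cluster-arn <cluster-arn> \
      --policy file://cluster-policy.json

    # Confirm the policy that is attached
    aws kafka get-cluster-policy --cluster-arn <cluster-arn>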

Now the cluster is ready. However, you still need to establish network connectivity between the cross-account connector VPC and the MSK cluster VPC.

If you're using VPC peering or AWS Transit Gateway when connecting to MSK Connect, either cross-account or in the same account, don't configure your connector to reach the peered VPC resources with IPs in the following CIDR ranges (for more details, see Connecting from connectors):

  • 10.99.0.0/16
  • 192.168.0.0/16
  • 172.21.0.0/16

In the MSK cluster security group, make sure you allow port 9098 from Account B network resources, and make changes to the subnets according to your network connectivity pattern.
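
For reference, the corresponding ingress rule can be added with the AWS CLI; a minimal sketch, where the security group ID and the Account B CIDR are placeholders for your own values:

    # Allow IAM-authenticated Kafka traffic (port 9098) from Account B's network
    aws ec2 authorize-security-group-ingress \
      --group-id <msk-cluster-security-group-id> \
      --protocol tcp \
      --port 9098 \
      --cidr <account-b-vpc-cidr>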

Set up the MSK connector in Account B

In this section, we demonstrate how to use the S3 sink connector. However, you can use a different connector according to your use case and make the changes accordingly.

  1. Create an S3 bucket (or use an existing bucket).
  2. Make sure that the VPC you're using in this account has a security group and private subnets. If your connector for MSK Connect needs access to the internet, refer to Enable internet access for Amazon MSK Connect.
  3. Verify the network connectivity between Account A and Account B by running telnet against the broker endpoints on port 9098 (see the reachability sketch after this list).
  4. Create an S3 VPC endpoint.
  5. Create a connector plugin according to your connector plugin provider (Confluent or Lenses). Make a note of the custom plugin Amazon Resource Name (ARN) to use in a later step.
  6. Create an IAM role for your connector to allow access to your S3 bucket and the MSK cluster.
    • The IAM role's trust relationship should be as follows:
      {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Effect": "Allow",
                  "Principal": {
                      "Service": "kafkaconnect.amazonaws.com"
                  },
                  "Action": "sts:AssumeRole"
              }
          ]
      }

    • Add the following S3 access policy to your IAM role:
      {
          "Version": "2012-10-17",
          "Statement": [{
              "Effect": "Allow",
              "Action": [
                  "s3:ListAllMyBuckets",
                  "s3:ListBucket",
                  "s3:GetBucketLocation",
                  "s3:DeleteObject",
                  "s3:PutObject",
                  "s3:GetObject",
                  "s3:AbortMultipartUpload",
                  "s3:ListMultipartUploadParts",
                  "s3:ListBucketMultipartUploads"
              ],
              "Resource": [
                  "arn:aws:s3:::<bucket-name>",
                  "arn:aws:s3:::<bucket-name>/*"
              ],
              "Condition": {
                  "StringEquals": {
                      "aws:SourceVpc": "vpc-xxxx"
                  }
              }
          }]
      }

    • The following policy contains the actions required by the connector:
      {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Effect": "Allow",
                  "Action": [
                      "kafka-cluster:Connect",
                      "kafka-cluster:DescribeCluster",
                      "kafka-cluster:ReadData",
                      "kafka-cluster:DescribeTopic",
                      "kafka-cluster:WriteData",
                      "kafka-cluster:CreateTopic",
                      "kafka-cluster:AlterGroup",
                      "kafka-cluster:DescribeGroup"
                  ],
                  "Resource": [
                      "arn:aws:kafka:<region>:<account-id>:cluster/<cluster-name>/<cluster-uuid>",
                      "arn:aws:kafka:<region>:<account-id>:topic/<cluster-name>/<cluster-uuid>/<topic-name>",
                      "arn:aws:kafka:<region>:<account-id>:group/<cluster-name>/<cluster-uuid>/<group-name>"
                  ]
              }
          ]
      }

You might need to modify the preceding permissions to limit access to your resources (topics and groups).
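
To verify the connectivity called out in step 3, a quick reachability check from a host in Account B's connector subnets might look like the following, where <broker-endpoint> stands in for one of Account A's bootstrap broker hostnames:

    # Confirm the broker is reachable on the IAM port
    telnet <broker-endpoint> 9098

    # Or, if telnet isn't installed:
    nc -vz <broker-endpoint> 9098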

Finally, it's time to create the MSK connector. Because the Amazon MSK console doesn't allow viewing MSK clusters in other accounts, we show you how to use the AWS CLI instead. We also use a basic Amazon S3 configuration for testing purposes. You might need to modify the configuration according to your connector's use case.

  1. Create a connector using the AWS CLI with the following command, providing the required parameters of the connector, including Account A's MSK cluster broker endpoints:
    aws kafkaconnect create-connector \
    --capacity "autoScaling={maxWorkerCount=2,mcuCount=1,minWorkerCount=1,scaleInPolicy={cpuUtilizationPercentage=10},scaleOutPolicy={cpuUtilizationPercentage=80}}" \
    --connector-configuration \
    "connector.class=io.confluent.connect.s3.S3SinkConnector, \
    s3.region=<region>, \
    schema.compatibility=NONE, \
    flush.size=2, \
    tasks.max=1, \
    topics=<topic-name>, \
    security.protocol=SASL_SSL, \
    s3.compression.type=gzip, \
    format.class=io.confluent.connect.s3.format.json.JsonFormat, \
    sasl.mechanism=AWS_MSK_IAM, \
    sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;, \
    sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler, \
    value.converter=org.apache.kafka.connect.storage.StringConverter, \
    storage.class=io.confluent.connect.s3.storage.S3Storage, \
    s3.bucket.name=<bucket-name>, \
    timestamp.extractor=Record, \
    key.converter=org.apache.kafka.connect.storage.StringConverter" \
    --connector-name "Connector-name" \
    --kafka-cluster '{"apacheKafkaCluster": {"bootstrapServers": "<bootstrap-brokers>:9098","vpc": {"securityGroups": ["sg-0b36a015789f859a3"],"subnets": ["subnet-07950da1ebb8be6d8","subnet-026a729668f3f9728"]}}}' \
    --kafka-cluster-client-authentication "authenticationType=IAM" \
    --kafka-cluster-encryption-in-transit "encryptionType=TLS" \
    --kafka-connect-version "2.7.1" \
    --log-delivery workerLogDelivery='{cloudWatchLogs={enabled=true,logGroup="<log-group-name>"}}' \
    --plugins "customPlugin={customPluginArn=<custom-plugin-arn>,revision=1}" \
    --service-execution-role-arn "<connector-role-arn>"

  2. After you create the connector, connect your producer to the topic and insert data into it. In the following code, we use a Kafka client to insert data for testing purposes:
    bin/kafka-console-producer.sh --broker-list <bootstrap-brokers> --producer.config client.properties --topic <topic-name>
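
For IAM authentication, the client.properties file referenced in the command would typically contain the standard MSK IAM client settings (this assumes the aws-msk-iam-auth JAR is on the client's classpath):

    security.protocol=SASL_SSL
    sasl.mechanism=AWS_MSK_IAM
    sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
    sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler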

If everything is set up correctly, you should see the data in your destination S3 bucket. If not, check the troubleshooting tips in the following section.
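
One quick way to confirm delivery is to list the bucket contents; the following sketch assumes the connector writes under the S3 sink connector's default topics/ prefix and that <bucket-name> is your bucket:

    # List objects written by the S3 sink connector
    aws s3 ls s3://<bucket-name>/topics/ --recursive --human-readable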

Troubleshooting tips

After deploying the connector, if it remains in the CREATING state on the connector details page, access the Amazon CloudWatch log group specified in your connector creation request and review the logs for any errors. If no errors are found, wait for the connector to complete its creation process.
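
You can also poll the connector state from the AWS CLI; a small sketch, with <connector-arn> as a placeholder:

    # Returns CREATING, RUNNING, FAILED, and so on
    aws kafkaconnect describe-connector \
      --connector-arn <connector-arn> \
      --query 'connectorState'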

Additionally, make sure the IAM roles have the required permissions, and check the security groups and network ACLs for proper connectivity between the VPCs.

Clean up

When you're done testing this solution, clean up any unwanted resources to avoid ongoing charges.
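
For example, the resources created in this walkthrough could be removed along these lines (the ARNs are placeholders; delete the MSK cluster only if you created it solely for this test):

    # Remove the connector and its custom plugin in Account B
    aws kafkaconnect delete-connector --connector-arn <connector-arn>
    aws kafkaconnect delete-custom-plugin --custom-plugin-arn <custom-plugin-arn>

    # Remove the MSK cluster in Account A
    aws kafka delete-cluster --cluster-arn <cluster-arn>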

Conclusion

In this post, we demonstrated how to create an MSK connector when you need to use an MSK cluster in one AWS account but MSK Connect is located in a separate account. This architecture includes an S3 sink connector for demonstration purposes, but it can accommodate other types of sink and source connectors. Additionally, this architecture focuses solely on IAM-authenticated connectors. If an unauthenticated connector is desired, the multi-VPC connectivity (PrivateLink) and cluster policy components can be ignored. The remaining process, which involves creating a network connection between the account VPCs, stays the same.

Try out the solution for yourself, and let us know your questions and feedback in the comments section.

Check out more AWS Partners or contact an AWS Representative to learn how we can help accelerate your business.


About the Author

Venkata Sai Mahesh Swargam is a Cloud Engineer at AWS in Hyderabad. He specializes in Amazon MSK and Amazon Kinesis services. Mahesh is dedicated to helping customers by providing technical guidance and resolving issues related to their Amazon MSK architectures. In his free time, he enjoys spending time with family and traveling around the world.
