Architecting with Ease: Harnessing the Power of Amazon Q Assistant for Streamlined AWS Solutions

Note: This blog was written while Amazon Q was still in Public Preview. Features are described as they were at the time of writing (December 2023) and may have changed by the time you read this.

Amazon Web Services provides a wide range of powerful cloud computing options. Yet designing AWS architectures can be challenging, even for experienced users. Amazon Q Assistant comes to your aid as a helpful guide in this complex environment.

In this blog, we’ll explore the Amazon Q Assistant, an innovative tool that makes creating AWS architectures easier and more efficient. We’ll show you how to set up a basic AWS architecture with the Amazon Q Assistant, demonstrating that deep AWS knowledge isn’t essential for this task, thanks to the capabilities of Amazon Q.

Highlights of this Blog Post

To demonstrate the capabilities of Amazon Q, we’ll design a straightforward AWS architecture, using the tool’s ability to understand and act on instructions given in natural language. We will divide the questions into separate conversations to get the most effective responses for common scenarios, minimizing the impact of previous inquiries on the dialogue. Our goal is to minimize manual configuration by letting Amazon Q interpret our requirements and provide us with the necessary code and configuration settings. We will create an Amazon S3 bucket to store our data; whenever the bucket detects the upload of a new file, an AWS Lambda function will be triggered to process the incoming data and store it in Amazon DynamoDB.

Getting Started: Essential Prerequisites for This Blog

To start, make sure you have an active AWS account with the appropriate permissions to create resources via the AWS console. Once you have verified your access rights, you can begin using Amazon Q by navigating to the AWS console and selecting the Amazon Q assistant.

Effortless S3 Bucket Creation with Amazon Q Assistant

Once the Amazon Q user interface is open, we can start by asking the first question: “How do I create an S3 bucket?” The AI model provides a step-by-step plan for creating an S3 bucket and lists links to its sources beneath the response, so you can see where the information came from.
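
If you prefer to script this step rather than click through the console, the same result can be achieved with the AWS SDK. Below is a minimal boto3 sketch; the bucket name and region are placeholders, and bucket names must be globally unique.

```python
import boto3

s3 = boto3.client("s3", region_name="eu-west-1")

s3.create_bucket(
    Bucket="my-netflix-data-bucket",  # placeholder; pick a globally unique name
    # Required for every region except us-east-1.
    CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
)
```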

DynamoDB Database Creation Made Easy

After successfully setting up an S3 bucket, the next step will be creating a DynamoDB database. This is a crucial step before we proceed to configure the Lambda function. Similar to our approach with the S3 bucket, we will ask the Amazon Q Assistant for help on how to create a DynamoDB database.

Once again, the Amazon Q Assistant provides us with a detailed, step-by-step tutorial on setting up a DynamoDB database. Since the table should match the structure of the data in the S3 bucket, we first need to upload some data into the bucket. For our purposes, we’ll use a small dataset from the well-known site Kaggle, specifically focusing on Netflix data. Uploading this data into an S3 bucket is typically a straightforward process, but if you run into any issues, don’t hesitate to ask Amazon Q how it is done.
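
For reference, the upload itself can also be scripted. A minimal boto3 sketch, assuming the Kaggle file was saved locally as netflix_titles.csv (the file and bucket names are placeholders):

```python
import boto3

s3 = boto3.client("s3")

# Upload the local dataset to the bucket created earlier (names are placeholders).
s3.upload_file("netflix_titles.csv", "my-netflix-data-bucket", "netflix_titles.csv")
```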

Although the AI model recommends setting up a DynamoDB table first, a closer examination of the console suggests that it might be possible to import data directly from S3 without an existing table. To verify this, we ask the model whether it is possible to import data without first creating a DynamoDB table.

As indicated in the screenshot provided earlier, it turns out that creating a table is not needed. With this straightforward, step-by-step guide, you should be able to effortlessly create a DynamoDB table using your S3 data.
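
For comparison, the same S3 import can also be started programmatically through the DynamoDB ImportTable API. The sketch below assumes a CSV file and show_id as the partition key of the Netflix dataset; adjust both to your own data.

```python
import boto3

dynamodb = boto3.client("dynamodb")

dynamodb.import_table(
    # Where the source data lives (placeholder names).
    S3BucketSource={
        "S3Bucket": "my-netflix-data-bucket",
        "S3KeyPrefix": "netflix_titles.csv",
    },
    InputFormat="CSV",
    # The table is created as part of the import; no need to create it first.
    TableCreationParameters={
        "TableName": "NetflixTitles",
        "AttributeDefinitions": [
            {"AttributeName": "show_id", "AttributeType": "S"}
        ],
        "KeySchema": [{"AttributeName": "show_id", "KeyType": "HASH"}],
        "BillingMode": "PAY_PER_REQUEST",
    },
)
```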

Additionally, Amazon Q offers information about pricing, which is incredibly beneficial when working within a cloud infrastructure. This information helps with planning and budgeting, ensuring that you avoid any unforeseen expenses.

Automating with Lambda: A Step-by-Step Guide

At this point, we have successfully set up both the S3 bucket and the DynamoDB table. However, the current setup requires manual intervention for each data load. To automate this process, we can implement a Lambda function that automatically transfers data placed in the S3 bucket to the DynamoDB table. We will present the current challenge to Amazon Q and compare its solution with our own plan.

As you can see, the solution offered by Amazon Q is very similar to our initial plan. This demonstrates the effectiveness of Amazon Q in providing relevant and practical guidance for AWS configurations, particularly when automating processes. For the creation of the Lambda function, we can use the following guide.

The initial step is to create a role with the required permissions. This might be challenging for those who are new to AWS or other cloud platforms. Therefore, we will turn to the AI model to help us create the needed IAM role and policies.

This guide significantly simplifies the process of creating IAM roles. The provided response uses AWS managed policies. However, if you prefer to create your own policies, you can ask the AI model to generate the steps and code for creating a custom inline policy.
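
To give an idea of what those steps boil down to, here is a boto3 sketch that creates the role and attaches AWS managed policies. The role name is a placeholder, and the chosen policies are broader than strictly necessary; a custom policy could scope them down to a single bucket and table.

```python
import json

import boto3

iam = boto3.client("iam")

ROLE_NAME = "s3-to-dynamodb-lambda-role"  # placeholder name

# Trust policy that allows the Lambda service to assume this role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

iam.create_role(
    RoleName=ROLE_NAME,
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# AWS managed policies: read from S3, write to DynamoDB, and write logs
# to CloudWatch.
for policy_arn in [
    "arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
    "arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess",
    "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
]:
    iam.attach_role_policy(RoleName=ROLE_NAME, PolicyArn=policy_arn)
```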

Once you’ve followed the instructions as shown in the earlier picture, your IAM role’s permissions should look like the following.

Having successfully set up the IAM role, we can now start writing the Lambda function. To do this, navigate to the Lambda console and click on “Create function”. This will take you to the next screen.

Once you have selected a name for your function and chosen the programming language you wish to use, click on “Create function”. After your function is created, navigate to the Code tab within the function. Here, you can use Amazon Q to generate the necessary code to transfer data from S3 to DynamoDB.
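
The exact code Amazon Q generates will vary, but a minimal sketch of such a handler could look like the following. It assumes the uploaded file is a JSON array of items and that show_id is the table’s partition key; both are assumptions you should adapt to your own data.

```python
import json
import urllib.parse

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("yourTable")  # replace with the name of your table


def lambda_handler(event, context):
    for record in event["Records"]:
        # The S3 event notification tells us which object was uploaded.
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Read the uploaded file, assumed to be a JSON array of items.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        items = json.loads(body)

        for item in items:
            try:
                # Only insert items whose primary key is not already present.
                # "show_id" is an assumed partition key; adjust to your table.
                table.put_item(
                    Item=item,
                    ConditionExpression="attribute_not_exists(show_id)",
                )
            except ClientError as e:
                if e.response["Error"]["Code"] != "ConditionalCheckFailedException":
                    raise

    return {"statusCode": 200}
```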

Remember to replace ‘yourTable’ in the code with the name of the DynamoDB table you created earlier. Once you have made the change, click on “Deploy” to save the changes.

When we try to generate the correct code using references to other running services, for example via the following question: “Replace ‘yourTable’ with the name of the DynamoDB table running in this account”, we get an answer showing this is not possible.

We do get a step-by-step guide on where to find the name, but the model was not able to retrieve the table name itself.

Important: The provided code is written to handle JSON data. If you are working with CSV files, you will need to make some adjustments to the code. If you need guidance on what needs to be changed, ask the Amazon Q Assistant.

The answer is very clear and straightforward.

After applying these changes, the code should look something like this.
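
A CSV-capable version of the handler, with the adjustments applied, could look roughly like this (again assuming show_id as the partition key):

```python
import csv
import urllib.parse

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("yourTable")  # replace with the name of your table


def lambda_handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Read the CSV file and turn each row into a dict keyed by the header row.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        for row in csv.DictReader(body.splitlines()):
            try:
                # Skip rows whose primary key already exists in the table.
                table.put_item(
                    Item=row,
                    ConditionExpression="attribute_not_exists(show_id)",
                )
            except ClientError as e:
                if e.response["Error"]["Code"] != "ConditionalCheckFailedException":
                    raise

    return {"statusCode": 200}
```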

Once the correct code for your project is deployed, the only remaining task before testing the architecture is to add the S3 bucket notification. While implementing this type of notification is not overly complicated, finding the location within the AWS console to create it can be. Therefore, we’ll use Amazon Q to explain where we can set up these notifications.
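
If you prefer to configure the notification programmatically instead, a boto3 sketch could look like the following. The bucket and function names are hypothetical, and note that S3 must be granted permission to invoke the function before the notification is accepted.

```python
import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

# Hypothetical names; replace with your own bucket, region, account ID, and function.
BUCKET = "my-netflix-data-bucket"
FUNCTION_ARN = "arn:aws:lambda:eu-west-1:123456789012:function:s3-to-dynamodb"

# Allow S3 to invoke the Lambda function.
lambda_client.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId="AllowS3Invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn=f"arn:aws:s3:::{BUCKET}",
)

# Trigger the Lambda function whenever a new object is created in the bucket.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": FUNCTION_ARN,
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```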

Once you have correctly set everything up, you should be able to upload data to your S3 bucket, and it will automatically be transferred to your DynamoDB table. However, it’s important that the file you upload meets certain criteria: it should contain the same columns as your DynamoDB table, its file type should match the one specified in your Lambda code, and the data in the file shouldn’t already exist in the table.

If you attempt to upload data where the primary key already exists in the DynamoDB table, those records will not be added. Below you can see an example of data added to the DynamoDB table. For testing purposes, you could remove the existing records from the table and re-upload the file to the bucket; this helps verify that the architecture works correctly.
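
If you would rather script that cleanup than click through the console, a small boto3 sketch that clears the table could look like this, assuming show_id is the partition key and the dataset is small enough for a single scan:

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("yourTable")  # replace with the name of your table

# Scan only the key attribute and delete every item in batches.
# Fine for a small test dataset; large tables would need paginated scans.
items = table.scan(ProjectionExpression="show_id")["Items"]
with table.batch_writer() as batch:
    for item in items:
        batch.delete_item(Key={"show_id": item["show_id"]})
```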

Conclusion

In conclusion, the Amazon Q Assistant is an extremely useful tool, ideal for creating various architectures within your AWS account. At the time of writing, the model isn’t capable of referring to other AWS resources running in the same account. For the IAM service, however, the model can refer to the AWS managed policies, which are often sufficient. While AWS knowledge may not be as essential as it once was, it is still highly recommended: familiarity with the different AWS services helps in understanding why the model made certain decisions.

Amazon Q is likely to become a popular choice among AWS Solutions Architects looking to further improve their architectures.

Aron Coosemans

I’m Aron Coosemans, specialized in Information Management and Security, and I joined Aivix as a data engineer/scientist. I hold two AWS certifications: “AWS Cloud Practitioner” and “AWS Solutions Architect”. I’m currently working on an AWS data platform project, responsible for the creation and maintenance of pipelines built in Airflow.