Getting Started with Collecting and Managing AWS Logs | Better Stack Community (2023)

Logging is a critical component of any cloud-based infrastructure, and AWS offers a wide range of services for logging, monitoring, and analyzing logs, such as CloudWatch Logs, CloudTrail, and Elasticsearch. These services allow you to collect and store log data, set up alerts, and perform advanced analysis on your logs. By using these services, you can gain visibility into the health and performance of your infrastructure, troubleshoot issues, and comply with regulatory requirements.

In this article, we will go over the basics of logging on AWS, including setting up log collection, sending logs to different destinations, and creating alerts. We will also provide examples of how to use these services to solve common logging challenges. Whether you're new to AWS or an experienced user, this guide will help you get started with logging on AWS and make the most of the services provided by AWS.


🔭 Want a more cost-effective way to centralize and monitor your AWS logs?

Head over to Logtail and start ingesting your logs in 5 minutes.

What logs does AWS generate?

Before we proceed with the rest of this article, let's briefly discuss how logs are generated in AWS and some of the commonly generated logs that you are likely to encounter when using the AWS platform.

There are two primary log sources: the AWS services themselves, and the applications running on those services. Amazon's built-in log management service, CloudWatch, is the primary tool for collecting and aggregating such logs, but there's also CloudTrail, which stores events describing all user and API activity.

Once the log data is aggregated, it can be monitored and analyzed within these tools or archived in Amazon S3. It can also be forwarded to a different log management solution such as Logtail.

AWS Lambda logs

AWS Lambda is an event-driven compute service that lets you execute business logic in response to a wide variety of triggers without provisioning or managing servers. Several logs are generated by the service whenever a function is executed, and these logs provide various details about the function execution, such as the start and end time, any error messages, and various metrics that can help you optimize performance and alert you to application-level issues.

Lambda logs are automatically sent to CloudWatch Logs in real time, and can be viewed using the CloudWatch Logs console, the CloudWatch Logs API, or the AWS CLI. There are three types of logs that are generated by AWS Lambda:

  1. Function logs: these are messages emitted to the standard output and standard error streams from a Lambda function. Each Lambda function outputs its logs to a separate log group (/aws/lambda/<FunctionName>) and stream (YYYY/MM/DD/[<FunctionVersion>]<InstanceId>). Be sure to use a logging framework to classify your log messages appropriately (through log levels) and output them in JSON format so that they can be queried, filtered, and exported easily in CloudWatch Logs.

  2. Extension logs: Lambda extensions are used to integrate the Lambda execution environment with various tools for observability, monitoring, security, and more. Log output from such extensions is streamed to CloudWatch so that you can analyze it to identify extension-related problems.

  3. Platform logs: these logs are generated by the Lambda execution environment, and they record events and errors related to function invocations and extensions. Such events include the start and end time for an invocation, various metrics about the invocation (such as the duration of the function's execution and the memory usage), the request ID that uniquely identifies the function execution, and more.

```json
{
  "time": "2020-08-20T12:31:32.123Z",
  "type": "",
  "record": {
    "requestId": "6f7f0961f83442118a7af6fe80b88d56",
    "metrics": {
      "durationMs": 101.51,
      "billedDurationMs": 300,
      "memorySizeMB": 512,
      "maxMemoryUsedMB": 33,
      "initDurationMs": 116.67
    }
  }
}
```
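To illustrate the JSON logging recommendation above, here is a minimal sketch of a Lambda handler that emits structured JSON log lines. The handler name, helper function, and log fields are illustrative choices, not part of any AWS API:

```python
import json
import logging

# Configure the logger once, outside the handler, so the
# configuration is reused across warm invocations.
logger = logging.getLogger()
logger.setLevel(logging.INFO)

def log_json(level, message, **fields):
    """Emit a single JSON-formatted log line so CloudWatch Logs
    can filter and query on individual fields."""
    logger.log(level, json.dumps({"message": message, **fields}))

def handler(event, context):
    # Illustrative business logic: echo back an order ID from the event.
    order_id = event.get("order_id", "unknown")
    log_json(logging.INFO, "processing order", order_id=order_id)
    return {"statusCode": 200, "body": json.dumps({"order_id": order_id})}
```

Because each line is a single JSON object, CloudWatch Logs can automatically discover fields like `order_id` for filtering and querying.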


AWS API Gateway logs


AWS API Gateway is a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale. It generates logs that provide detailed information about the requests and responses processed by the API Gateway, which fall into two categories:

1. Access logs

They are similar to Apache or NGINX access logs as they contain details about each request that passes through the API Gateway. Such logs provide a summary of the request by including details such as the time the request occurred, the HTTP status code, the resource that was requested, and more.

```json
{
  "requestId": "e6d3cd70-655b-4e07-8405-d257b768a90b",
  "ip": "",
  "caller": "-",
  "user": "-",
  "requestTime": "18/Jan/2023:12:40:20 +0000",
  "httpMethod": "GET",
  "resourcePath": "/",
  "status": "200",
  "protocol": "HTTP/1.1",
  "responseLength": "1310"
}
```
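For reference, a record like the one above is produced by the access log format you define when enabling access logging on a stage, using API Gateway's built-in `$context` variables. A JSON format definition along these lines (adjust the fields to your needs) would yield matching output:

```json
{
  "requestId": "$context.requestId",
  "ip": "$context.identity.sourceIp",
  "caller": "$context.identity.caller",
  "user": "$context.identity.user",
  "requestTime": "$context.requestTime",
  "httpMethod": "$context.httpMethod",
  "resourcePath": "$context.resourcePath",
  "status": "$context.status",
  "protocol": "$context.protocol",
  "responseLength": "$context.responseLength"
}
```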



2. Execution logs

These document all the processes happening within the API Gateway for each request. They contain a lot more detail about each request compared to access logs so they should typically only be enabled during troubleshooting sessions to avoid incurring heavy CloudWatch costs. For example, the logs in the screenshot below are for a single request.

[Screenshot: execution logs generated by a single request]

Amazon S3 logs

Amazon S3 (Simple Storage Service) is a fully managed service that enables you to store, retrieve, and manage data in the cloud. It stores data as objects (files and their metadata) within buckets (containers for objects), and gives each object a unique identifier.

S3 access logs help you keep track of how each object in your various buckets is accessed, which is useful for auditing or compliance purposes. They are stored in a separate S3 bucket and include information such as the requester ID, the request type (e.g. GET, PUT, DELETE), the request date and time, the bucket name, the object size, and more.

Here's what an S3 access log looks like:

```
79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be DOC-EXAMPLE-BUCKET1 [06/Feb/2019:00:00:38 +0000] 79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be 3E57427F3EXAMPLE REST.GET.VERSIONING - "GET /DOC-EXAMPLE-BUCKET1?versioning HTTP/1.1" 200 - 113 - 7 - "-" "S3Console/0.4" - s9lzHYrFp76ZVxRcpX9+5cjAnEH2ROuNkd2BHfIa6UkFVdtjf5mKR3/eTPFvsiP/XV/VLi31234= SigV4 ECDHE-RSA-AES128-GCM-SHA256 AuthHeader TLSV1.2 arn:aws:s3:us-west-1:123456789012:accesspoint/example-AP Yes
```


You'll need to enable S3 server access logging for the bucket you'd like to track and specify the target bucket (where the logs will be stored) and a prefix for the logs. Once enabled, S3 will automatically deliver the logs to the specified bucket.
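Because the access log format is space-delimited with quoted and bracketed fields, it can be split with a small regular expression. Here is a minimal Python sketch (the field names and the sample line are illustrative; only the first nine fields are mapped):

```python
import re

# Tokens in an S3 server access log line are plain words,
# "quoted strings", or [bracketed timestamps].
FIELD = re.compile(r'\[[^\]]*\]|"[^"]*"|\S+')

def parse_s3_access_log(line):
    """Split one S3 server access log line into tokens and map the
    leading ones to names (the full format has many more fields)."""
    tokens = FIELD.findall(line)
    names = ["bucket_owner", "bucket", "time", "remote_ip",
             "requester", "request_id", "operation", "key", "request_uri"]
    return dict(zip(names, tokens))

# Illustrative log line, shortened to the first nine fields.
line = ('79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be '
        'DOC-EXAMPLE-BUCKET1 [06/Feb/2019:00:00:38 +0000] 192.0.2.3 '
        '79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be '
        '3E57427F3EXAMPLE REST.GET.VERSIONING - '
        '"GET /DOC-EXAMPLE-BUCKET1?versioning HTTP/1.1"')

record = parse_s3_access_log(line)
```

A parser like this is handy when you want to filter or aggregate access logs yourself before loading them into an analysis tool.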


Amazon RDS logs


Amazon RDS (Relational Database Service) is a fully managed service provided by AWS that makes it easy to set up, operate, and scale a relational database in the cloud. It supports various database engines such as MySQL, MariaDB, Oracle, SQL Server, and PostgreSQL.

Amazon RDS generates a variety of logs to help you monitor and troubleshoot your database instances. These logs include:

  1. Error logs: containing information about errors and warning messages that occur when running your database.

  2. Slow query logs: indicating slow-performing queries that take longer than a specified time to execute.

  3. Audit logs: for tracking database activities such as logins, user activity, and database changes.

  4. General logs: containing information about all client connections and disconnections, as well as SQL statements executed by the server.

  5. Binary logs: recording all changes made to the database, such as data modification statements and table structure changes.

It's worth noting that for some database engines, the log types that are available and how to access them may differ. You can view the logs in the RDS console, download them to your local computer, or stream them to CloudWatch Logs for further analysis and storage.

AWS CloudTrail logs


AWS CloudTrail is a service that allows you to record and track activity in your AWS account. It records all AWS Management Console sign-in events and API calls made in your account so that such data may be used to monitor user activity, troubleshoot issues, and audit the use of your AWS resources.

The logs generated by CloudTrail are stored in an S3 bucket, and you can access them using the S3 console, the AWS CLI, or the S3 API. CloudTrail logs include information such as:

  • The identity of the user or role that made the request.
  • The time of the request.
  • The source IP address of the request.
  • The request parameters.
  • The response elements returned by the service.

CloudTrail logs can be used for various use cases such as compliance, security, operational troubleshooting, and incident response. You can also use CloudTrail logs in conjunction with other AWS services such as Amazon CloudWatch, Amazon Elasticsearch Service, and AWS Lambda to create custom monitoring and automation solutions.

Collecting and viewing AWS logs in CloudWatch

Before you can derive value from your AWS logs, you need to collect them and centralize them in one place. CloudWatch is the primary logging and monitoring service for the AWS platform, and it can help with collecting and centralizing logs and metrics from various AWS offerings such as the ones discussed in the previous section. It also provides search and analysis functionality to help you derive value from your log data, and it can alert you to anomalies or other patterns in your logs.

Here are the steps to collect logs from AWS services using CloudWatch Logs:

  1. Create a log group: A log group is a container for your logs. It is a collection of log streams that share the same log retention policy, access control, and monitoring settings. Generally, you'll create a different log group for each service or application that you want to monitor.

  2. Create a log stream: A log stream is a sequence of log events that come from the same source. Each distinct source of logs in CloudWatch Logs is represented by a separate log stream, and there can be an unlimited number of streams in each log group.

  3. Enable logging for the service or application: You can enable logging for most AWS services through the AWS Management Console or by using the AWS CLI or SDKs.

  4. Send the service/application logs to CloudWatch: Once logging is enabled for a service or application, ensure that the logs are being transmitted to CloudWatch Logs. In many cases, the functionality to automatically send logs to CloudWatch is already built into the service. However, if such functionality is not available, you can employ the CloudWatch agent to collect metrics and logs from EC2 instances and on-premises servers, or use the AWS CLI or API as appropriate.

  5. View your log data: The CloudWatch Logs console, API, or the AWS CLI may be used to view and analyze the collected log data. If you're using the web console, select the log group and log stream that you want to view and then inspect the individual log events in the stream.


Searching and querying your log data

CloudWatch Logs provides several ways to search, query, and filter log data in order to find the specific information you need. For example, the CloudWatch console provides a basic way to search your log data using filter patterns. Once you're on the Log events page, you can enter a filter pattern to search for and match terms, phrases, or values in your log events.

CloudWatch Logs Insights is another powerful tool that offers a query language that allows you to filter, aggregate, and perform calculations on your log data. You can use the CloudWatch Logs Insights console or the CloudWatch Logs API to run queries and visualize the results.
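For example, a simple Insights query that surfaces the most recent log lines containing "ERROR" might look like this (`@timestamp` and `@message` are fields that Insights generates automatically for every log event):

```
fields @timestamp, @message
| filter @message like /ERROR/
| sort @timestamp desc
| limit 20
```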

Creating metrics from log data

Another useful way to use log data stored in CloudWatch Logs is by turning it into numerical CloudWatch metrics that you can visualize using dashboards, or monitor and respond to via CloudWatch alarms. For example, you can monitor your error rate, your 4xx rate, the occurrences of a specific term in your logs, and more.
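Conceptually, a metric filter matches log events against a pattern and emits a count per time period. The idea can be sketched in plain Python; this is an illustration of the concept only, not CloudWatch's actual matching engine, and the field layout is assumed:

```python
from collections import Counter

def errors_per_minute(events):
    """Count log events containing 'ERROR', bucketed by minute.

    `events` is a list of (timestamp, message) pairs where the
    timestamp is an ISO-8601 string such as '2023-01-18T12:40:20Z'.
    """
    counts = Counter()
    for ts, message in events:
        if "ERROR" in message:
            counts[ts[:16]] += 1  # truncate to YYYY-MM-DDTHH:MM
    return dict(counts)

events = [
    ("2023-01-18T12:40:20Z", "ERROR database timeout"),
    ("2023-01-18T12:40:45Z", "INFO request served"),
    ("2023-01-18T12:41:02Z", "ERROR database timeout"),
]
metrics = errors_per_minute(events)
```

In CloudWatch itself, you would define the pattern (e.g. the term `ERROR`) on a log group as a metric filter, and the service maintains the resulting metric for you.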


Visualizing AWS log data through CloudWatch dashboards

CloudWatch Dashboards is a feature provided in CloudWatch that allows you to view the performance and health of your AWS resources and applications in a single place, which can help you quickly identify trends and troubleshoot issues.

With Dashboards, you can create custom visualizations to display CloudWatch metrics, alarms, and logs. You can do this through the AWS Management Console, the AWS CLI, or the CloudWatch Dashboards API. Once a dashboard is created, you can add one or more widgets to it, and each one can display a specific metric, alarm, or log group. Widgets can also be customized with different visualizations, such as line charts, stacked area charts, bar charts, pie charts, and more.


CloudWatch Dashboards also support CloudWatch Logs Insights, which allows you to have a dynamic view of your logs by running ad-hoc queries and viewing the results in widget form. You can also use dashboards to communicate the status of your services with other stakeholders by creating a URL or embedding the dashboard in your application or website using an iframe.

Exporting CloudWatch logs to Amazon S3

By default, CloudWatch Logs retains logs indefinitely, meaning that they will not be automatically deleted. However, as your services grow, this behavior can incur prohibitive costs and make searching through your logs more challenging. You can configure a log retention period for individual log groups in CloudWatch Logs so that logs older than that period will be deleted automatically.

If you want to retain your older logs for a longer period without incurring heavy costs, exporting them to Amazon S3 is a common way to archive them. It also allows you to use S3's lifecycle policies to automatically move the logs to other storage classes, such as S3 Standard-IA, or to archival storage such as Amazon S3 Glacier.

Once you've configured CloudWatch to archive your logs in S3, you'll be able to access them and use S3 features such as versioning, access management, and data lifecycle policies to store, protect, and analyze your logs.

Exporting CloudWatch logs to other destinations

Besides archiving your AWS logs in S3, you can also send them to a third-party platform. This gives you advanced visibility and analysis capabilities for your logs and allows you to use your preferred log analysis tool. For example, you can send your CloudWatch logs to Logtail to benefit from its more modern interface, a more cost-effective log management solution for your serverless logs, and the ability to correlate them with the rest of your telemetry data.


To achieve this, you can create a CloudWatch Logs subscription filter for the specific log group and choose the Lambda subscription option, which allows you to automatically send log events to an AWS Lambda function for further processing and analysis.

You'll need to create a Lambda function that can process the log events from CloudWatch Logs, and this function should include the necessary logic to parse, filter, and send the log events to the desired location. With Logtail, you can use the HTTP REST API in your Lambda function to route a single event or a list of events to the service.
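CloudWatch Logs delivers events to a subscribed Lambda function as a gzipped, base64-encoded payload under the event's `awslogs.data` key. Here is a minimal sketch of a handler that decodes that payload; the forwarding step is left as a comment since the HTTP call depends on the destination's API:

```python
import base64
import gzip
import json

def decode_cloudwatch_payload(event):
    """Decode the gzipped, base64-encoded payload that CloudWatch Logs
    delivers to a subscribed Lambda function and return its log events."""
    raw = base64.b64decode(event["awslogs"]["data"])
    payload = json.loads(gzip.decompress(raw))
    return payload["logEvents"]

def handler(event, context):
    for log_event in decode_cloudwatch_payload(event):
        # Forwarding would happen here, e.g. an HTTP POST carrying the
        # event's "message" and "timestamp" fields to your log service.
        print(log_event["message"])
```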

See the AWS CloudWatch documentation for more details on how to access CloudWatch logs in AWS Lambda.

Best practices for managing AWS log data

Monitoring logs in AWS is an important part of maintaining the security, performance, and availability of your resources and applications. Here are a few best practices for collecting and managing your AWS log data:

  • Collect and store logs from all of your AWS services and resources, including EC2 instances, Lambda functions, and CloudTrail. Use CloudWatch Logs or other services like S3, Elasticsearch, or Kinesis Data Streams to store and aggregate your logs.

  • Use CloudWatch Logs Insights to search, query, and analyze your logs in real time. This will allow you to quickly identify issues and troubleshoot problems.

  • Create CloudWatch alarms to automatically notify you when specific conditions are met in your logs. You can also use SNS to send email or SMS notifications, or invoke a Lambda function to take automated actions.

  • Visualize your metrics, alarms, and logs in a single place through CloudWatch Dashboards. This allows you to quickly view the performance and health of your resources and applications, and share this information with others.

  • Use third-party log analysis tools like Logtail to gain advanced visibility and analysis capabilities for your logs.

  • Implement security and compliance best practices for your logs, such as encryption.

  • Regularly review your logging strategy and archive your logs, keeping only the logs that you need for a certain period of time and deleting the rest. This way, you can save on storage and processing costs and comply with legal and regulatory requirements.

Final thoughts

In this article, we covered the basics of logging on AWS and showed you how to get started. We hope this guide has been helpful in understanding the different logging services provided by AWS and how to use them effectively. With the right logging strategy in place, you can ensure that your infrastructure is running smoothly and that you have the information you need to make informed decisions.

Thanks for reading, and happy logging!


This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Article information

Author: Moshe Kshlerin

Last Updated: 02/05/2023
