Monitor Industrial Processes Using AWS IoT Core


Learn to build an IoT data pipeline using AWS services, including IoT Core, Kinesis Data Streams, and Amazon S3, for efficient data management and storage.

13 Tasks

intermediate

2hr

Certificate of Completion

Desktop Only
No Setup Required
Amazon Web Services

Learning Objectives

Ability to collect data from IoT devices using AWS IoT Core
Understanding of setting up and maintaining Kinesis Data Streams for seamless data flow
Hands-on experience developing ETL jobs for data transformation using AWS Glue

Technologies
IoT Core
IAM
Kinesis
S3
Glue
Skills Covered
Using AWS Cloud Services
Cloud Lab Overview

In this Cloud Lab, we will learn to create a data pipeline that efficiently gathers, processes, and securely stores industrial data. We will start by creating the IAM roles and the S3 bucket required for the pipeline. Next, we will connect our device to AWS IoT Core: we will create the required IoT Core infrastructure and then use its security credentials to establish a connection between the device and IoT Core. This completes the first part of the pipeline: sending the device data to IoT Core.
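As a rough sketch of that first step, the snippet below publishes a JSON telemetry message to AWS IoT Core over MQTT with TLS, using the X.509 credentials generated for the thing. The endpoint, certificate file names, topic, and device values are all placeholders, and paho-mqtt is just one common client choice (the lab may use the AWS IoT Device SDK instead).

```python
import json
import time

# Hypothetical values -- substitute your own account's IoT endpoint and topic.
IOT_ENDPOINT = "abc123-ats.iot.us-east-1.amazonaws.com"
TOPIC = "factory/sensors/telemetry"


def make_payload(device_id: str, temperature: float) -> str:
    """Serialize one telemetry reading as the JSON message we publish."""
    return json.dumps({
        "device_id": device_id,
        "temperature": temperature,
        "timestamp": int(time.time()),
    })


def publish_reading() -> None:
    # paho-mqtt is an assumption here; any MQTT 3.1.1 client over TLS works.
    import paho.mqtt.client as mqtt

    client = mqtt.Client()
    # Certificate files downloaded when registering the IoT thing.
    client.tls_set(
        ca_certs="AmazonRootCA1.pem",
        certfile="device.pem.crt",
        keyfile="private.pem.key",
    )
    client.connect(IOT_ENDPOINT, port=8883)  # MQTT over TLS
    client.publish(TOPIC, make_payload("press-01", 72.4), qos=1)
    client.disconnect()

# On a configured device, call publish_reading() to send one message.
```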

The next step is to create a Kinesis Data Stream that will act as a data conduit, allowing data from IoT Core to flow through it efficiently. It will also set the stage for the effective and secure transfer of data to Amazon S3 for long-term storage. Once the data stream is created, we will integrate it with IoT Core and verify that the integration works.
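The stream creation and the IoT Core integration can be sketched with boto3 as follows. The stream name, rule name, topic, and role ARN are hypothetical; the role must grant IoT permission to write to the stream (e.g., kinesis:PutRecord).

```python
def build_rule_sql(topic: str) -> str:
    """IoT rule SQL that selects every field from messages on the given topic."""
    return f"SELECT * FROM '{topic}'"


def create_stream_and_rule() -> None:
    import boto3  # AWS SDK; requires credentials configured locally

    kinesis = boto3.client("kinesis")
    iot = boto3.client("iot")

    # 1. Create the stream that will carry the device data.
    kinesis.create_stream(StreamName="iot-telemetry", ShardCount=1)

    # 2. Route matching MQTT messages into the stream via an IoT rule.
    #    The role ARN below is a placeholder for the role created earlier.
    iot.create_topic_rule(
        ruleName="forward_to_kinesis",
        topicRulePayload={
            "sql": build_rule_sql("factory/sensors/telemetry"),
            "actions": [{
                "kinesis": {
                    "streamName": "iot-telemetry",
                    "roleArn": "arn:aws:iam::123456789012:role/iot-kinesis-role",
                    "partitionKey": "${topic()}",
                }
            }],
        },
    )
```

To test the integration, publish a message on the rule's topic and read it back from the stream (for example, with `kinesis.get_records`).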

Next, we will set up an ETL (Extract, Transform, Load) job. This job is tasked with extracting data from the Kinesis Data Stream and efficiently transferring it to an Amazon S3 bucket for long-term storage. For that, we will first create a catalog table with our Kinesis Data Stream as its source and then write an ETL job that uses this catalog table to transfer the data from the stream to an Amazon S3 bucket.
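A minimal sketch of the catalog-table step, assuming boto3 and a Glue table whose storage descriptor points at the Kinesis stream. The database name, table name, stream ARN, and the three-column schema are illustrative; the schema should match whatever the device actually publishes. The ETL job itself would then read from this table and write to S3 (typically authored in Glue Studio).

```python
def build_kinesis_table_input(table_name: str, stream_arn: str) -> dict:
    """TableInput for a Glue catalog table backed by a Kinesis stream."""
    return {
        "Name": table_name,
        "Parameters": {"classification": "json"},
        "StorageDescriptor": {
            "Columns": [
                {"Name": "device_id", "Type": "string"},
                {"Name": "temperature", "Type": "double"},
                {"Name": "timestamp", "Type": "bigint"},
            ],
            "Location": stream_arn,
            "Parameters": {
                # Marks the table's source as a Kinesis stream.
                "streamARN": stream_arn,
                "typeOfData": "kinesis",
            },
        },
    }


def create_catalog_table() -> None:
    import boto3

    glue = boto3.client("glue")
    glue.create_database(DatabaseInput={"Name": "iot_lab"})
    glue.create_table(
        DatabaseName="iot_lab",
        TableInput=build_kinesis_table_input(
            "telemetry_stream",
            "arn:aws:kinesis:us-east-1:123456789012:stream/iot-telemetry",
        ),
    )
```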

After completing this Cloud Lab, you will have gained hands-on experience in setting up an end-to-end data pipeline for IoT data using AWS IoT Core, Kinesis Data Streams, and Amazon S3. 

Here is a high-level architecture diagram of the infrastructure that you will create in this Cloud Lab:

Architecture diagram
Cloud Lab Tasks
1. Introduction
Getting Started
2. Create the Prerequisite Resources
Create IAM Roles
Create an S3 Bucket
3. Register the Device
Create an IoT Thing
Generate the Required Security Credentials
Publish Messages
4. Send Data to the Kinesis Stream
Create a Kinesis Stream
Create an IoT Rule
Publish Messages
5. Use a Glue ETL Job
Create a Catalog Table
Create an ETL Job
6. Conclusion
Clean Up
Wrap Up
Lab Rules Apply
Stay within resource usage requirements.
Do not engage in cryptocurrency mining.
Do not engage in or encourage activity that is illegal.

Before you start...

Try these optional labs before starting this lab.

Relevant Course

Use the following content to review prerequisites or explore specific concepts in detail.
