
Chapter 1: IoT Ecosystem

Rekha M N
Asst. Professor,
Dept. Of E &EE
JSSSTU, Mysuru
Contents:
➢ Meaning of IoT
➢ Vision Of IoT
➢ Commonly Used Definitions
➢ Characteristics Of IoT
➢ IoT v/s IoE v/s M2M
➢ Enabling Technologies in IoT
➢ Applications of IoT
➢ General IoT Framework and IoT reference model
➢ Communication Models in IoT
➢ IoT Platforms

7 hours
IoT Ecosystem

Meaning of IoT:
The inter-networking of physical devices, vehicles, buildings, and other items—embedded with
electronics, software, sensors, actuators, and network connectivity that enable these objects to
collect and exchange data. IoT allows objects to be sensed or controlled remotely across existing
network infrastructure, creating opportunities for more direct integration of the physical world
into computer-based systems, and resulting in improved efficiency, accuracy and economic
benefit in addition to reduced human intervention.
The IoT harnesses the power of the internet to connect physical devices in real time. Thus, any
physical object connected through the internet can be converted into an IoT device. For
example, the smartphone, tablet, or PC on which you read this article is an IoT device. Similarly,
a light bulb, which is a physical object itself, becomes an IoT device when it is connected
through a smartphone app that turns it on and off.
IoT examples may include:
➢ Smart home security systems
➢ Autonomous farming equipment
➢ Wearable health monitors
➢ Smart factory equipment
➢ Wireless inventory trackers
➢ Ultra-high-speed wireless internet
➢ Biometric cybersecurity scanners
➢ Shipping container and logistics tracking
➢ Home automation

Figure 1: Internet Of Things


Vision Of IoT:
Things become intelligent, smart, and behave as if they were alive.
Here things (wearable watches, alarm clocks, home devices, surrounding objects) become
‘Smart’ and function like living entities by sensing, computing and communicating through
embedded devices which interact with remote objects (servers, clouds, applications, services and
processes) or persons through the internet.

Commonly Used Definitions and Concepts in IoT:


IoT: It is a system of internet-connected objects that are able to gather and exchange information
using embedded sensors.
IoT Device: A standalone entity connected to the web that can be identified and monitored from
a remote location.

Figure 2: IoT devices

IoT Ecosystem: The IoT ecosystem is not easy to define, and it is difficult to capture a complete
picture of it because of its vastness, its emerging possibilities, and the rapidity with which it is
expanding across every sector. In essence, however, the IoT ecosystem is a connection of various
kinds of devices that sense and analyze data and communicate with each other over networks.

In the IoT ecosystem, the user employs smart devices such as smartphones, tablets, and sensors to
send commands or requests for information to devices over the network. The device responds,
performs the command, analyzes the data, and sends the information back to the user through the
network.

A typical IoT ecosystem is shown in the image below, where smart devices send data to and
receive data from the other devices in the environment, integrated over the network and cloud
computing.
Figure 3: IoT Ecosystem

The IoT is itself an ecosystem of networked devices that transfer data. This ecosystem is commonly described in terms of the following four layers.

1. Perception Layer
This is the basic layer of the entire IoT architecture. It is responsible for gathering all of the
data from various sensors. Furthermore, the devices in this layer are responsible for sending
and receiving data to and from the upper layers. Some of the information that can be collected from
the sensors, in static or dynamic states, includes the objects’ state, the environment of the
surrounding areas, and the objects’ characteristics. Therefore, all of the objects that are used for
collecting data in the IoT, such as sensors, people, electronics, and smartphones, are called
“things”.

2. Network Layer
This layer is responsible for transmitting data and providing network access to the Internet.
Therefore, all of the information that is collected from the sensors (perception layer) is transmitted
through this layer. Various communication technologies, such as GSM, WLAN, and IPv6, are
used in this layer to achieve the main function of transmitting data. This layer can contain one or
more network devices and facilities, such as gateways, edge computing nodes, and mobile
communication networks, which provide the lower layer with three main functionalities: network
communications, software protocols, and communication security.

3. Processing Layer
This layer contains high computational resources to process the massive amount of data
collected from the sensors in the perception layer. This layer links the upper and lower
layers by processing the collected information intelligently and presenting it to the
application layer. Various computational resources, such as high-performance computing devices,
cloud computing services, and clusters, are utilized in this layer to achieve its main functions.
4. Application Layer
The application layer is the top layer of the IoT architecture. It provides users with many
services, such as device management and the device’s display interface. This layer
has an intelligent decision system that responds quickly to the needs of various businesses, such
as healthcare, energy management, and environment monitoring. The accuracy of the response
result depends on the latest information that is used to train the intelligent decision system.

Figure 4: IoT Layers
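
To make the flow across these four layers concrete, the following minimal Python sketch passes a set of simulated temperature readings from a perception function up through network, processing, and application functions. The device name, value ranges, and alert threshold are purely illustrative assumptions, not part of any real platform.

```python
# Illustrative sketch of data moving through the four IoT layers.
# All names and values are hypothetical; real systems use dedicated
# hardware drivers, network stacks, and analytics platforms.

import random
import statistics


def perception_layer():
    """Gather raw readings from a (simulated) temperature sensor."""
    return [random.uniform(20.0, 30.0) for _ in range(10)]


def network_layer(readings):
    """Transmit readings upward; here we just hand them over in-process."""
    return {"device_id": "sensor-01", "payload": readings}


def processing_layer(message):
    """Process the collected data, e.g. compute summary statistics."""
    payload = message["payload"]
    return {
        "device_id": message["device_id"],
        "mean_temp": statistics.mean(payload),
        "max_temp": max(payload),
    }


def application_layer(summary):
    """Present the processed information to the user or a business rule."""
    if summary["max_temp"] > 28.0:
        print(f"{summary['device_id']}: temperature alert "
              f"(max {summary['max_temp']:.1f} C)")
    else:
        print(f"{summary['device_id']}: normal, "
              f"mean {summary['mean_temp']:.1f} C")


application_layer(processing_layer(network_layer(perception_layer())))
```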

Dashboard: An Internet of Things (IoT) dashboard is a data visualization tool that transforms,
displays, and organizes a collection of data captured and transmitted by network-connected
devices. The primary purpose of an IoT dashboard is to provide human-readable information-at-
a-glance to remotely monitor historical and real-time IoT data.
Figure 5: IoT Dashboard

Internet of Things (IoT) analytics: It is a data analysis tool that assesses the wide range of data
collected from IoT devices. IoT analytics assesses vast quantities of data and produces useful
information from it.
IoT analytics are usually discussed in tandem with Industrial IoT (IIoT). Data is collected from
a wide range of sensors on manufacturing infrastructure, weather stations, smart meters,
delivery trucks, and all forms of machinery. IoT analytics can be applied to managing data
centers and applications handling retail and healthcare.

Figure 6: IoT Analytics


Characteristics Of IoT:
Any IoT device exhibits the following characteristics:
1. Connectivity
Connectivity is an important requirement of the IoT infrastructure. It is the ability to
communicate with and share information between two or more devices. With this connectivity,
comes a plethora of opportunities for businesses to create new products and services. The
internet of things has created a world where everything is connected, which opens up a world
of possibilities for the future.
These objects/devices can be anything from a fridge to a car, or even a dog’s collar. This connectivity
enables these objects to be controlled remotely, and also allows them to exchange data with
other objects.

2. Intelligence
The intelligence of IoT devices is the ability of smart sensors and devices to sense data,
interact with each other, and collect huge amounts of data for analysis. Complex software,
algorithms, and protocols are used to connect IoT devices to the networks and process the data
from millions of data nodes. Intelligence in IoT is only concerned with the interaction between
devices, while user and device interaction is achieved by standard input methods and graphical
user interfaces.

Figure 7: Characteristics Of IoT


3. Heterogeneity

The devices in the IoT are heterogeneous, as they are based on different hardware platforms and
networks. They can interact with other devices or service platforms through different networks.
The key requirements for heterogeneous networks in IoT are scalability, modularity, extensibility,
and interoperability.
4. Safety(Security)
IoT devices are vulnerable to security threats, and IoT raises significant transparency and
privacy concerns. To create a security paradigm, it is important to secure the
endpoints, the networks, and the data that is transferred across all of them.
5. Sensing
IoT without sensors cannot be imagined. IoT sensors help in detecting or measuring
changes in the environment, generating data that the system can use to interact with that environment.
6. Enormous Scale
The number of devices that communicate with each other will be much larger than the devices
connected to the current internet. The management of these devices and interpretation for
application purposes becomes more critical. Gartner (2015) confirmed the enormous scale of IoT in a
report estimating that 5.5 million new things would be connected every day
and that 6.4 billion connected devices would be in use worldwide in 2016, up 30% from
2015. In 2022, the market for the Internet of Things was expected to grow 18 percent to 14.4
billion active connections.
7. Dynamic Nature
The most important part of IoT is gathering data from its environment, which requires responding
to the dynamic changes that take place around the devices. The state of these devices changes
dynamically, for example from connected to disconnected, as does their context, including
temperature, location, and speed. The number of devices also changes dynamically with
person, place, and time.

IoT v/s IoE v/s M2M

Figure 8: IoT v/s IoE v/s M2M


M2M
➢ M2M is a subset of IoT.
➢ M2M encompasses three key components:
▪ Devices that generate or receive data from other devices
▪ Communication for efficient data transfer between devices and gateways
▪ An application designed to meet the needs of end users
➢ Point-to-point communication exists between the devices.
➢ M2M communication may exist without the internet.

IoT
➢ IoT is a superset of M2M.
➢ IoT encompasses four key components:
▪ A sensor or device used to generate or receive data from another device
▪ Communication for transferring data to the internet or between devices
▪ Storage services, for efficient storage of data into a database or the cloud
▪ An application to provide the intended service
➢ An IP network exists between devices, integrating various communication protocols.
➢ Devices in IoT require an active internet connection in most cases.

IoE
➢ IoE is a superset of IoT.
➢ IoE consists of four key components: people, things, data, and processes.
➢ IoE is a network connection of people, data, and things.
➢ Devices and their applications require an active internet connection.

Enabling Technologies in IoT:

Figure 9: Enabling Technologies in IoT


1. Wireless Sensor Networks

A wireless sensor network (WSN) comprises distributed devices with sensors that are used to
monitor environmental and physical conditions. Sensor nodes in a WSN have an
onboard processor that manages and monitors the environment in a particular area. The nodes are
connected to a base station, which acts as the processing unit in the WSN system, and the base
station is connected to the internet to share data.

An example of WSNs used in IoT systems is a weather monitoring system, in which the nodes
collect temperature, humidity, and other data that is aggregated and analysed.

Figure 10: Wireless Sensor Networks
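
As a companion to the weather-monitoring example above, here is a small, hypothetical Python sketch of sensor nodes reporting readings to a base station that aggregates them before uploading. The node names, value ranges, and the in-process "network" are all assumptions made for illustration.

```python
# Hypothetical sketch of a WSN weather-monitoring deployment: several
# sensor nodes report readings to a base station, which aggregates them
# before forwarding to the internet. Node names and values are made up.

import random


class SensorNode:
    def __init__(self, node_id):
        self.node_id = node_id

    def sample(self):
        # On real hardware this would read temperature/humidity sensors.
        return {
            "node": self.node_id,
            "temperature_c": round(random.uniform(18, 35), 1),
            "humidity_pct": round(random.uniform(30, 90), 1),
        }


class BaseStation:
    def __init__(self):
        self.buffer = []

    def receive(self, reading):
        self.buffer.append(reading)

    def aggregate(self):
        n = len(self.buffer)
        avg_t = sum(r["temperature_c"] for r in self.buffer) / n
        avg_h = sum(r["humidity_pct"] for r in self.buffer) / n
        return {"nodes": n,
                "avg_temperature_c": round(avg_t, 1),
                "avg_humidity_pct": round(avg_h, 1)}


station = BaseStation()
for node in (SensorNode(f"node-{i}") for i in range(1, 6)):
    station.receive(node.sample())

print(station.aggregate())   # summary the base station would upload
```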

2. Cloud Computing

The Cloud is a centralised system that helps to deliver and transport data and various files
across the Internet to data centres. The different data and programmes can be accessed easily
from the centralised Cloud system. Cloud Computing is an economic solution, as it does not
require on-site infrastructure for storage, processing and analytics.

The three main cloud service models are IaaS, PaaS, and SaaS. Each cloud service model
covers different user and company needs, and provides a different level of control, security, and
scalability.

• Infrastructure as a Service (IAAS)


IaaS is a cloud service model that consists of provisioning and managing computing
resources over the Internet, such as servers, storage, networking, and virtualization. The cloud
provider offers resources like object storage, runtime, queuing, databases, etc.; however, the
responsibility for configuration and implementation tasks rests with the consumer.
IaaS can be used for many purposes such as deploying web applications, running a
CRM, doing big data analysis, and implementing backup or disaster recovery plans. Some examples
of IaaS are Stackscale, AWS, and VMware.

• Platform as a Service (PAAS)


PAAS is a cloud service model that provides a ready-to-use development
environment where developers can focus on writing and executing high-quality code in
order to create customized applications. Platform as a Service is delivered via the web,
allowing developers to build scalable and highly-available applications without worrying
about the OS, storage or updates. It provides a framework developers can use for
developing, managing, distributing and testing software applications. Some examples of
PaaS are Heroku, Apache Stratos and OpenShift.

• Software as a Service (SAAS)


SAAS is a cloud service model that consists of delivering cloud-based applications to users
over the Internet. Software is hosted online and made available to customers on a
subscription basis or for purchase. SaaS cloud providers host applications in their networks,
and users can access them through a browser or app from different devices. Software as a
Service is also known as “on-demand software” or “cloud application services”. SaaS
providers are responsible for developing, hosting, maintaining and updating the software.
Some examples of SaaS are Google Workspace, Dropbox and Salesforce.

Figure 11: Cloud Service Model


3. Big Data Analytics
Big data analytics means working with large sets (gigabytes to petabytes) of structured, unstructured,
or semi-structured data (called big data) and analyzing that data to gain insight into business
trends.
The role of big data in IoT is to process large amounts of data on a real-time basis and
store them using different storage technologies.

IoT big data processing follows four sequential steps –

1. A large amount of unstructured data generated by IoT devices is collected into the
big data system. This IoT-generated big data is largely characterized by its 3V factors:
volume, velocity, and variety.
2. In the big data system, which is basically a shared distributed database, the huge
amount of data is stored in big data files.
3. Analysing the stored IoT big data using analytic tools like Hadoop MapReduce or
Spark
4. Generating the reports of analysed data.
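
The sketch below illustrates steps 2 to 4 of this pipeline using PySpark, assuming PySpark is installed and that the raw IoT records are stored as JSON files; the HDFS path and the field names (device_id, temperature) are placeholders to be adapted to the actual big data store.

```python
# A minimal PySpark sketch of steps 2-4: reading stored IoT data files,
# analysing them, and producing a simple report. The file path and field
# names are assumptions; adapt them to the actual big data system.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("iot-big-data-report").getOrCreate()

# Step 2: the raw IoT records are assumed to be stored as JSON files.
readings = spark.read.json("hdfs:///iot/raw/sensor_readings/*.json")

# Step 3: analyse the stored data, e.g. average temperature per device.
report = (readings
          .groupBy("device_id")
          .agg(F.avg("temperature").alias("avg_temperature"),
               F.count("*").alias("num_readings")))

# Step 4: generate a report from the analysed data.
report.orderBy(F.desc("avg_temperature")).show(20)

spark.stop()
```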

Since in IoT the unstructured data is collected via the internet, big data for the Internet of
Things needs lightning-fast analysis over large queries to gain rapid insights from the data and
make quick decisions. Hence the need for big data in IoT is compelling; from the big data
perspective, data is the fuel that keeps the Internet of Things running.

Figure 12: Big Data Analytics

Characteristics

Big data can be described by the following characteristics (3 Vs…)

• Volume of data being stored and used by organizations;
• Variety of data being generated by IoT devices; and
• Velocity, or speed, at which that data is generated and updated.

Figure 13: 3 V’s in Big Data

Some examples of big data generated by IoT systems are described as follows:

▪ Sensor data generated by IoT system such as weather monitoring stations.


▪ Machine sensor data collected from sensors embedded in industrial and energy
systems for monitoring their health and detecting Failures.

4. Communication protocols
Communication protocols form the backbone of IoT systems and enable network connectivity
and coupling to applications. Communication protocols allow devices to exchange data over
the network. Multiple protocols often describe different aspects of a single communication. A
group of protocols designed to work together are known as a protocol suite; when implemented
in software they are a protocol stack.
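
As a concrete, hedged example of such a protocol, the snippet below publishes and reads back a single reading over MQTT using the paho-mqtt package. The public test broker, topic name, and payload fields are illustrative assumptions only, not part of the original text.

```python
# A hedged example of one application-layer protocol widely used in IoT:
# MQTT. Assumes the paho-mqtt package is installed and that the public
# test broker at test.mosquitto.org is reachable; topic and payload are
# illustrative only.

import json
import paho.mqtt.publish as publish
import paho.mqtt.subscribe as subscribe

TOPIC = "demo/home/livingroom/temperature"

# Publisher: a sensor node sends one JSON-encoded reading to the broker.
# retain=True keeps the last value on the broker so a late subscriber
# (like the one-shot read below) still receives it.
publish.single(
    TOPIC,
    payload=json.dumps({"device": "sensor-01", "temperature_c": 22.5}),
    hostname="test.mosquitto.org",
    port=1883,
    retain=True,
)

# Subscriber: an application receives the next message on that topic.
msg = subscribe.simple(TOPIC, hostname="test.mosquitto.org", port=1883,
                       retained=True)
print(msg.topic, msg.payload.decode())
```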

Internet communication protocols are published by the Internet Engineering Task Force (IETF).
The IEEE handles wired and wireless networking, and the International Organization for
Standardization (ISO) handles other types. The ITU-T handles telecommunication protocols
and formats for the public switched telephone network (PSTN). As the PSTN and Internet
converge, the standards are also being driven towards convergence.

Figure 14: Communication Protocols


5. Embedded Systems
An embedded system can be thought of as a computer hardware system having software
embedded in it. An embedded system can be an independent system or it can be a part of a
large system. An embedded system is a controller programmed and controlled by a real-time
operating system (RTOS) with a dedicated function within a larger mechanical or electrical
system, often with real-time computing constraints. It is embedded as part of a complete
device often including hardware and mechanical parts. Embedded systems control many
devices in common use today. Ninety-eight percent of all microprocessors are manufactured
to serve as embedded system components.
An embedded system has three components −

▪ It has hardware.
▪ It has application software.
▪ It has a real-time operating system (RTOS) that supervises the application software
and provides a mechanism to let the processor run processes according to a schedule,
following a plan to control latencies. The RTOS defines the way the system works and
sets the rules during execution of the application program. A small-scale embedded
system may not have an RTOS.

Applications of IoT:

Consumer: Smart home control, lighting, maintenance, etc.

Industrial: Smart meters, wear-out monitoring, climate control, product tracking

Automotive: Parking, traffic control, anti-theft location

Environmental: Weather prediction, resource management

Agriculture: Crop management, soil analysis

Military: Troop monitoring, threat analysis

Medical: Wearable devices


Figure 15: Applications of IoT

General IoT Framework and IoT reference model:


General IoT Framework:
In general, any IoT-based product can be described by a 4-stage IoT architecture.
In simple terms, the 4-stage IoT architecture consists of
1. Sensors and actuators
2. Internet gateways and data acquisition systems
3. Edge IT
4. Data center and cloud.

The detailed presentation of these stages can be found on the diagram below.

Figure 16: 4 Stage IoT Solutions Architecture


Stage 1. Networked things (wireless sensors and actuators)
The outstanding feature of sensors is their ability to convert information obtained from the
outer world into data for analysis. In other words, it is important to start the 4-stage IoT
architecture with sensors so that information is captured in a form that can actually be processed.

For actuators, the process goes even further: these devices are able to intervene in physical
reality. For example, they can switch off the light or adjust the temperature in a room.
Because of this, the sensing and actuating stage covers and adjusts everything needed in the physical
world to gain the necessary insights for further analysis.
Stage 2. Sensor data aggregation systems and analog-to-digital data conversion
Internet/network gateways and data acquisition systems (DAS) are present in this layer. The DAS
performs data aggregation and conversion (collecting and aggregating data, then converting the
analog sensor data to digital data). Advanced gateways, which mainly open up the connection
between sensor networks and the Internet, also perform basic gateway functions such as malware
protection and filtering, and sometimes make decisions based on the incoming data and provide
data management services.
In short, Stage 2 makes data both digitized and aggregated.

Figure 17: Sensor data aggregation systems and analog-to-digital data conversion
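
The following Python sketch illustrates the Stage 2 idea of analog-to-digital conversion plus aggregation in a data acquisition system. The ADC resolution, reference voltage, and the linear 0 to 100 °C transfer function are assumptions chosen only for this example.

```python
# Illustrative sketch of Stage 2: a data acquisition system (DAS) samples
# an analog sensor voltage, converts it to a digital value, and aggregates
# a window of readings before handing them to the gateway. The transfer
# function (0-3.3 V mapped to 0-100 degrees C) is an assumption.

import random

ADC_BITS = 12                 # a typical ADC resolution
ADC_MAX = (1 << ADC_BITS) - 1
V_REF = 3.3                   # reference voltage in volts


def read_analog_voltage():
    """Stand-in for the analog sensor; real code would read hardware."""
    return random.uniform(0.5, 1.2)


def to_digital(voltage):
    """Analog-to-digital conversion: quantize the voltage to ADC counts."""
    return round(voltage / V_REF * ADC_MAX)


def counts_to_celsius(counts):
    """Scale ADC counts back to engineering units (assumed linear sensor)."""
    return counts / ADC_MAX * 100.0


window = [counts_to_celsius(to_digital(read_analog_voltage()))
          for _ in range(16)]

aggregated = {"samples": len(window),
              "avg_temperature_c": round(sum(window) / len(window), 2)}
print("packet for the gateway:", aggregated)
```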

Stage 3. The appearance of edge IT systems


In this stage, the data prepared in Stage 2 is transferred to the IT world. To be
precise, the edge IT systems perform enhanced analytics and pre-processing, particularly
machine learning and visual representation.
Some additional processing may also happen here before the data enters the data centers.
Stage 3 enables data to be captured at local sensors while at the same time transferring it to
remote locations.
Figure 18: Appearance of edge IT systems

Stage 4. Analysis, management, and storage of data


The main processes of the last stage of the IoT architecture happen in the data center or the cloud.
Precisely, this stage enables in-depth processing, along with follow-up revision for feedback. Here,
the skills of both IT and OT (operational technology) professionals are needed; in other words,
this phase calls for analytical skills of the highest rank, in both the digital and human worlds.
Therefore, data from other sources may be included here to ensure an in-depth analysis.
After meeting all quality standards and requirements, the information is brought back to the
physical world, but now in a processed and precisely analyzed form.
Note: Stage 5 of IoT architecture?
In fact, there is an option to extend the process of building a sustainable IoT architecture by
introducing an extra stage. It refers to giving the user control over the structure, assuming the
solution does not aim for full automation. The main tasks here are visualization and management.
After including Stage 5, the system turns into a loop in which a user sends commands to
sensors/actuators (Stage 1) to perform some actions.

Architectural frameworks are high-level descriptions of an organization as a system; they
capture the structure of its main components at varied levels, the interrelationships among these
components, and the principles that guide their evolution.

IoT reference model:

Figure 19: IoT reference Model


1. Physical Devices and Controllers – The model calls this layer the “things” of the IoT.
Unfortunately, this layer is a little ambiguous. On one hand, the “things” are the assets being
managed; from a system design perspective, the “things” are the sensors and devices that are
directly managed by the IoT architecture. These two are not often identical, at least not yet, as
many assets (“things”) under management do not yet have integrated sensors and Edge Node
intelligence. So, we should think of this layer as consisting of the “things”
themselves and the sensors and Edge Node devices connected to them, as well as a more
modern class of “things” with integrated sensors and Edge Node functionality.

It is worth noting that in the latter case, and when connecting to existing field assets, there can
be significant design effort in connecting the sensors and Edge Node intelligent hardware, as well
as in mapping these systems to any management or intelligence that may exist in legacy assets
(unless they are just “dumb” assets that need instrumentation).

An important IoT concept, Edge Intelligence, which allows low-latency reaction to field events
and higher levels of autonomy and distributed processing, needs to be implemented at this
layer.

2. Connectivity – This layer spans from the “middle” of an Edge Node device up through
transport to the cloud. Many alternatives can be used for communications, and this layer
includes the mapping of field data onto the logical and physical technologies used, as well as the
backhaul to on-premises or cloud systems and to the next layer, Edge Computing.

In deployment, this layer can use a single solution or multiple technologies, depending on the
need. Field Area Network (FAN) alternatives include wired, cellular, LPWAN, and many
other wireless options, as well as multi-tiered solutions, and can be built out of private, public,
or mixed private and public transport solutions.

3. Edge Computing – The next layer in the World Forum Model architecture is Edge
Computing, or more properly “Cloud Edge” or “Cloud Gateway” computing. Required to some
degree in any IoT system, this layer interfaces the data and control planes to the higher layers of
cloud, SaaS, or enterprise software. Protocol conversion, routing to higher-layer software
functions, and even “fast path” logic for low-latency decision making are implemented at
this layer.

4. Data Accumulation – Given the velocity, volume, and variety that IoT systems can produce,
it is essential to provide storage for incoming data for subsequent processing, normalization,
integration, and preparation for upstream applications. While part of the overall “data lake”
architecture, this layer serves as intermediate storage for incoming data and for outgoing
traffic queued for delivery to lower layers. It may be implemented with simple SQL or may
require more sophisticated solutions such as Hadoop and the Hadoop File System, MongoDB,
Cassandra, Spark, or other NoSQL technologies.
5. Data Abstraction – In the data abstraction layer we “make sense” of the data: collecting
“like” information from multiple IoT sensors or measurements, expediting high-priority traffic or
alarms, and organizing incoming data from the data lake into appropriate schemas and flows for
upstream processing. Similarly, application data destined for downstream layers is reformatted
appropriately for device interaction and queued for processing.

A key architectural element for larger high-performance deployments is a publish/subscribe or
data distribution service (DDS) software framework to simplify data movement between Edge
Computing, Data Accumulation, the Application Layer, and user processes. Whether this is a
high-performance service or a simpler message bus, this infrastructure simplifies implementation
and improves performance for all but the simplest applications.
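
A toy in-process publish/subscribe dispatcher can illustrate how such a bus decouples the layers; real deployments would use a broker or DDS middleware rather than this sketch, and the topic and device names below are invented.

```python
# A minimal in-process publish/subscribe sketch to illustrate how a data
# distribution / message-bus layer decouples Edge Computing, Data
# Accumulation, and the Application Layer. Real deployments would use a
# broker or DDS middleware rather than this toy dispatcher.

from collections import defaultdict


class MessageBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers[topic]:
            handler(message)


bus = MessageBus()

# Data Accumulation layer stores everything it sees on the sensor topic.
data_lake = []
bus.subscribe("sensors/flow", data_lake.append)

# Application layer only reacts to high-priority readings.
bus.subscribe("sensors/flow",
              lambda m: print("ALARM:", m) if m["value"] > 90 else None)

# Edge Computing layer publishes normalized readings onto the bus.
bus.publish("sensors/flow", {"device": "pump-7", "value": 42})
bus.publish("sensors/flow", {"device": "pump-7", "value": 95})

print("records accumulated:", len(data_lake))
```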

6. Application Layer – This layer is self-explanatory and is where control plane and data plane
application logic are executed. Monitoring, process optimization, alarm management, statistical
analysis, control logic, logistics, and consumer patterns are just a few examples of IoT applications.

7. Collaboration and Processes – At this layer, application processing is presented to users,
and data processed at lower layers is integrated into business applications. This layer is about
human interaction with all of the layers of the IoT system, and it is where economic value is
delivered. The challenge at this layer is to effectively leverage the value of IoT and the layers
of infrastructure and services below it, turning that value into economic growth, business
optimization, and/or social good.

Communication Models in IoT:

In March 2015, the Internet Architecture Board released a guide to IoT networking that
outlined four common communication models used by IoT “smart objects”: Device-to-Device,
Device-to-Cloud, Device-to-Gateway, and Back-End Data-Sharing.

1.Device-to-Device

Device-to-device communication represents two or more devices that directly connect and
communicate between one another. They can communicate over many types of networks,
including IP networks or the Internet, but most often use protocols like Bluetooth, Z-Wave, and
ZigBee.

This model is commonly used in home automation systems to transfer small data packets of
information between devices at a relatively low data rate. This could be light bulbs, thermostats,
and door locks sending small amounts of information to each other.

With Device-to-Device connectivity “security is specifically simplified because you have these
short-range radio technology [and a] one-to-one relationship between these two devices.”
Figure 20: Example of device-to-device communication model.

Device-to-device is popular among wearable IoT devices like a heart monitor paired to a
smartwatch, where data doesn’t necessarily have to be shared with multiple people.

There are several standards being developed around Device-to-Device, including Bluetooth
Low Energy (also known as Bluetooth Smart or Bluetooth Version 4.0+), which is popular
among portable and wearable devices because its low power requirements mean devices
can operate for months or years on one battery. Its lower complexity can also reduce size
and cost.

2. Device-to-Cloud

Device-to-cloud communication involves an IoT device connecting directly to an Internet cloud
service, like an application service provider, to exchange data and control message traffic. It
often uses traditional wired Ethernet or Wi-Fi connections, but can also use cellular technology.

Cloud connectivity lets the user (and an application) obtain remote access to a device. It also
potentially supports pushing software updates to the device.

A use case for cellular-based Device-to-Cloud would be a smart tag that tracks your dog while
you’re not around, which would need wide-area cellular communication because you wouldn’t
know where the dog might be.

For example, if you’re away and want to see what’s on your webcam at home, you contact
the cloud infrastructure, and the cloud infrastructure relays the request to your IoT device.

Figure 21 :Device-to-cloud communication model diagram.
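
A hedged sketch of the device-to-cloud pattern follows: the device posts one JSON reading over HTTPS to a cloud endpoint using a device credential. The endpoint URL and token are hypothetical placeholders, not a real provider API, so the upload is wrapped in error handling.

```python
# A hedged device-to-cloud sketch: the device posts a reading directly to
# a cloud application service over HTTPS. The endpoint URL and the token
# are hypothetical placeholders, not a real provider API.

import json
import time
import urllib.error
import urllib.request

CLOUD_ENDPOINT = "https://example-iot-cloud.invalid/api/v1/telemetry"
DEVICE_TOKEN = "replace-with-device-credential"   # cloud access credential

reading = {
    "device_id": "webcam-kitchen-01",
    "timestamp": int(time.time()),
    "status": "online",
}

request = urllib.request.Request(
    CLOUD_ENDPOINT,
    data=json.dumps(reading).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {DEVICE_TOKEN}",
    },
    method="POST",
)

try:
    with urllib.request.urlopen(request, timeout=10) as response:
        print("cloud accepted reading:", response.status)
except urllib.error.URLError as err:
    # Expected with the placeholder endpoint; a real device would retry
    # with backoff and queue the reading locally in the meantime.
    print("upload failed:", err)
```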


From a security perspective, it is more complicated than Device-to-Device because it involves
two different types of credentials – the network access credentials (such as the mobile device’s
SIM card) and then the credentials for cloud access.

The IAB’s report also mentioned that interoperability is also a factor with Device-to-Cloud
when attempting to integrate devices made by different manufacturers given that the device and
cloud service are typically from the same vendor. An example would be the Nest Labs Learning
Thermostat, where the Learning Thermostat can only work with Nest’s cloud service.

There is also work on low-power wide-area standards such as LoRa, Sigfox, and NB-IoT that let
devices make cloud connections while consuming less power.

3. Device-to-Gateway

In the Device-to-Gateway model, IoT devices basically connect to an intermediary device to
access a cloud service. This model often involves application software operating on a local
gateway device (like a smartphone or a “hub”) that acts as an intermediary between an IoT
device and a cloud service.

This gateway could provide security and other functionality such as data or protocol translation.
If the application-layer gateway is a smartphone, this application software might take the form
of an app that pairs with the IoT device and communicates with a cloud service.

This might be a fitness device that connects to the cloud through a smartphone app like Nike+,
or home automation applications that involve devices that connect to a hub like Samsung’s
SmartThings ecosystem.

Figure 22: Device-to-gateway communication model diagram.

Gateway devices can also potentially bridge the interoperability gap between devices that
communicate on different standards. For instance, SmartThings’ Z-Wave and Zigbee
transceivers can communicate with both families of devices.
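
The sketch below illustrates the gateway's role in this model: read from a local device over the short-range protocol, translate the data into the cloud service's schema, and forward it. Both the local read and the upload are simulated, and all names and fields are assumptions for illustration.

```python
# A sketch of the device-to-gateway model: a hub application reads data
# from a local device over the LAN-side protocol and relays it to a cloud
# service. Both the local read and the cloud upload are simulated here;
# names and endpoints are illustrative assumptions.

import json


def read_from_local_device():
    """Stand-in for a Bluetooth/Zigbee/Z-Wave read on the local side."""
    return {"device_id": "fitness-band-42", "steps": 8123, "heart_rate": 67}


def translate_for_cloud(local_payload):
    """Gateway duty: protocol/data translation into the cloud's schema."""
    return {
        "deviceId": local_payload["device_id"],
        "metrics": {
            "stepCount": local_payload["steps"],
            "heartRateBpm": local_payload["heart_rate"],
        },
    }


def forward_to_cloud(cloud_payload):
    """Stand-in for an HTTPS/MQTT upload performed by the gateway app."""
    print("uploading:", json.dumps(cloud_payload))


forward_to_cloud(translate_for_cloud(read_from_local_device()))
```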
4. Backend Data Sharing

Back-End Data-Sharing essentially extends the single device-to-cloud communication model
so that IoT devices and sensor data can be accessed by authorized third parties. Under this
model, users can export and analyze smart object data from a cloud service in combination with
data from other sources, and send it to other services for aggregation and analysis.

The app Map My Fitness is a good example of this because it compiles fitness data from various
devices ranging from the Fitbit to the Adidas miCoach to the Wahoo Bike Cadence Sensor.
“They provide hooks, REST APIs to allow security and privacy-friendly data sharing to Map
My Fitness.” This means an exercise can be analyzed from the viewpoint of various sensors.

Figure 23: Back-end data sharing model diagram.
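
The following sketch shows the back-end data-sharing idea from the third party's side: pulling device data from two providers' REST APIs with bearer tokens and aggregating it. The URLs, tokens, and record fields are hypothetical, so failed requests are simply treated as empty results.

```python
# A hedged back-end data-sharing sketch: an authorized third-party service
# pulls device data from two different cloud providers' REST APIs and
# combines them. URLs, fields, and tokens are hypothetical placeholders.

import json
import urllib.error
import urllib.request

SOURCES = {
    "fitness_cloud": ("https://fitness-cloud.invalid/api/activities",
                      "token-for-fitness-cloud"),
    "bike_cloud": ("https://bike-cloud.invalid/api/rides",
                   "token-for-bike-cloud"),
}


def fetch(url, token):
    """Pull a list of activity records from one provider's REST API."""
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return json.load(resp)
    except urllib.error.URLError:
        # Placeholder endpoints will not resolve; a real integration
        # would handle errors, authentication refresh, and pagination.
        return []


# Aggregate calories reported by every upstream source for one user.
total_calories = 0
for name, (url, token) in SOURCES.items():
    for activity in fetch(url, token):          # assumed list of records
        total_calories += activity.get("calories", 0)

print("calories across all connected services:", total_calories)
```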

Note: There’s No Clear IoT Deployment Model; It All Depends on the Use Case

The decision process for IoT developers is quite complicated when considering how a device
will be integrated and how its connectivity to the internet will work.

To further complicate things, newer technologies with lower power consumption, size and cost
are often lacking in maturity compared to traditional Ethernet or Wi-Fi.

“The equation is not just what is most convenient for me, but what are the limitations of those
radio technologies and how do I deal with factors like the size limitations, energy consumption,
the cost – these aspects play a big role.”

IoT Platforms:

IoT platforms are the middleware solutions that connect IoT devices to the cloud and help
them seamlessly exchange data over the network. They act as a mediator between the application
layer and the hardware.
Figure 24: Top Seven IoT Platforms

Amazon Web Services (AWS)

AWS IoT Platform:

One of the most reliable and secure Internet of Things platforms, AWS IoT not only helps
connect devices to the cloud but also safeguards their interactions with the applications available
on the cloud and with other devices. Even when devices are not connected to the Internet, the
AWS IoT platform allows applications to monitor them and facilitates round-the-clock
communication between them.

Architecture building blocks

The three building blocks of this technical architecture, illustrated here with Meshify’s AWS-based
IoT solution, are the edge portfolio, data ingestion, and data processing and analytics, shown below.

Figure 25:Building blocks of Meshify’s technical architecture


A. Edge portfolio (EP)

Starting with the edge sensors, the Meshify edge portfolio covers two types of sensors:

➢ LoRaWAN (Low power, long range WAN) sensor suite: This sensor provides the
long connectivity range (> 1000 feet) and extended battery life (~5 years) needed for
enterprise environments.
➢ Cellular-based sensors: This sensor is a narrow band/LTE-M device that operates at
LTE-M band 2/4/12 radio frequency and uses edge intelligence to conserve battery
life.

B. Data ingestion (DI)

For the LoRaWAN solution, aggregated sensor data at the Meshify gateway is sent to AWS
using AWS IoT Core and Meshify’s REST service endpoints. AWS IoT Core is a managed
cloud platform that lets IoT devices easily and securely connect using multiple protocols like
HTTP, MQTT, and WebSockets. It expands its protocol coverage through a new fully managed
feature called AWS IoT Core for LoRaWAN. This gives Meshify the ability to connect
LoRaWAN wireless devices with the AWS Cloud.
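
As a hedged illustration of this ingestion path, the snippet below publishes one gateway-aggregated reading to AWS IoT Core using boto3's iot-data client. It assumes AWS credentials and the account's IoT data endpoint are already configured; the topic name and payload fields are illustrative, not Meshify's actual schema.

```python
# A hedged sketch of publishing a gateway-aggregated reading to AWS IoT
# Core using boto3's "iot-data" client. Assumes AWS credentials and the
# IoT data endpoint are already configured; the topic name and payload
# fields are illustrative placeholders only.

import json
import boto3

iot_data = boto3.client("iot-data", region_name="us-east-1")

payload = {
    "gateway_id": "gw-001",
    "sensor_id": "lora-temp-17",
    "temperature_c": 4.2,
    "battery_pct": 96,
}

response = iot_data.publish(
    topic="sensors/gw-001/telemetry",   # rules engine can route this topic
    qos=1,
    payload=json.dumps(payload),
)
print("publish HTTP status:",
      response["ResponseMetadata"]["HTTPStatusCode"])
```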

C. Data processing and analytics (DPA)

Initial processing of the data is done at the ingestion layer, using Meshify REST API endpoints
and the Rules Engine of AWS IoT Core. Meshify applies filtering logic to route relevant events
to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Amazon MSK is an AWS
streaming data service that manages Apache Kafka infrastructure and operations, streamlining
the process of running Apache Kafka applications on AWS.

Meshify’s applications then consume the events from Amazon MSK according to the configured
topic subscriptions. They enrich and correlate the events with records stored in a managed service,
Amazon Relational Database Service (RDS). These applications run as scalable containers on
another managed service, Amazon Elastic Kubernetes Service (EKS).
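
A minimal consumer-application sketch in the spirit of this pipeline is shown below, using the kafka-python package against an MSK topic. The bootstrap servers, topic name, and enrichment step are placeholders; a production consumer on EKS would also configure TLS/IAM authentication, consumer groups, and error handling.

```python
# A minimal consumer-application sketch using kafka-python against an
# Amazon MSK topic. The bootstrap servers and topic name are placeholders;
# a production consumer would also configure TLS/IAM auth, consumer
# groups, and error handling.

import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "sensor-events",                                  # assumed MSK topic
    bootstrap_servers=["b-1.example-msk.invalid:9092"],
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for event in consumer:
    record = event.value
    # Enrichment step: attach a human-readable site name before persisting
    # the record to the relational database (persistence omitted here).
    record["site_name"] = f"site-{record.get('gateway_id', 'unknown')}"
    print("enriched event:", record)
```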

The figure below illustrates the technical workflow from the ingestion of field events to their
processing, enrichment, and persistence. Finally, these events are used to power risk-avoidance
decision making.
Figure 26:Technical workflow for Meshify IoT architecture

1. After installation, Meshify-designed LoRa sensors transmit information to the cloud through
Meshify’s gateways. LoRaWAN capabilities create connectivity between the sensors and the
gateways. They establish a low power, wide area network protocol that securely transmits
data over a long distance, through walls and floors of even the largest buildings.
2. The Meshify Gateway is a redundant edge system, capable of sending sensor data from
various sensors to the Meshify cloud environment. Once the LoRa sensor information is
received by the Meshify Gateway, it converts the incoming radio frequency (RF) signals
into a form that supports a faster transfer rate to Meshify’s cloud environment.
3. Data from the Meshify Gateway and sensors is initially processed at Meshify’s AWS IoT
Core and REST service endpoints. These destinations for IoT streaming data help with the
initial intake and introduce field data to the Meshify cloud environment. The initial ingestion
points can scale automatically based upon the volume of sensor data received. This enables
rapid scaling and ease of implementation.
4. After the data has entered the Meshify cloud environment, Meshify uses Amazon EKS and
Amazon MSK to process the incoming data stream. Amazon MSK producer and consumer
applications within the EKS systems enrich the data streams for the end users and systems to
consume.
5. Producer applications running on EKS send processed events to the Amazon MSK service.
These events include storing and retrieval of raw data, enriched data, and system-level data.
6. Consumer applications hosted on the EKS pods receive events per the subscribed Amazon
MSK topic. Web, mobile, and analytic applications enrich and use these data streams to
display data to end users, business teams, and systems operations.
7. Processed events are persisted in Amazon RDS. The databases are used for reporting, machine
learning, and other analytics and processing services.

Vital features:

The following are the features of AWS:

Figure 27: Features of AWS

1) Flexibility
• The flexibility of AWS allows organizations to choose the programming models, languages, and
operating systems that are better suited for their project, so they do not have to learn new skills
to adopt new technologies.
• Flexibility means that migrating legacy applications to the cloud is easy and cost-effective.
Instead of re-writing the applications to adopt new technologies, you just need to move the
applications to the cloud and tap into advanced computing capabilities.
• The larger organizations run in a hybrid mode, i.e., some pieces of the application run in
their data center, and other portions of the application run in the cloud.
• The flexibility of AWS is a great asset for organizations to deliver the product with updated
technology in time, and overall enhancing the productivity.
2) Cost-effective
• Cost is one of the most important factors that need to be considered in delivering IT
solutions.
• You can scale up or scale down as the demand for resources increases or decreases
respectively.
• AWS allows you to access resources almost instantly. The ability to respond to changes
quickly, no matter whether the changes are large or small, means that new opportunities can be
taken to meet business challenges that could increase revenue and reduce cost.

3) Scalable and elastic


• Scalability in AWS has the ability to scale the computing resources up or down when
demand increases or decreases respectively.
• Elasticity in AWS is the ability to automatically acquire and release resources as demand
changes, for example by distributing incoming application traffic across multiple targets such as
Amazon EC2 instances, containers, IP addresses, and Lambda functions.

4) Secure

AWS provides

Physical security: The data centers are physically secured to prevent unauthorized access.

Secure services: Each service provided by the AWS cloud is secure.

Data privacy: Personal and business data can be encrypted to maintain data privacy.

Microsoft Azure:

The Azure application architecture fundamentals guidance is organized as a series of steps,
from architecture and design to implementation.

Architecture styles

The first decision point is the most fundamental one: the architecture style. It might be a
microservices architecture, a more traditional N-tier application, or a big data solution. Microsoft
has identified several distinct architectural styles, and there are benefits and challenges to each.
Technology choices

Knowing the type of architecture being built, one can start to choose the main technology pieces
for the architecture. The following technology choices are critical:

• Compute refers to the hosting model for the computing resources that your
applications run on.
• Data stores include databases but also storage for message queues, caches, logs,
and anything else that an application might persist to storage.
• Messaging technologies enable asynchronous messages between components of
the system.

Additional technology choices can be made along the way, but these three elements (compute,
data, and messaging) are central to most cloud applications and will determine many aspects of
your design.

Design the architecture

After choosing the architecture style and the major technology components, the specific design of
the application can be tackled. Every application is different, but the following resources are
commonly used:

Reference architectures
Reference architecture includes recommended practices, along with considerations for
scalability, availability, security, resilience, and other aspects of the design. Most also include
a deployable solution or reference implementation.

Design principles
Microsoft Azure has identified some high-level design principles that make an application
more scalable, resilient, and manageable. These design principles can be applied to any
architectural style. Throughout the design process, keep these high-level design principles in mind
as a reference.

Design patterns
Software design patterns are repeatable patterns that are proven to solve specific problems.
Microsoft Azure’s catalog of cloud design patterns addresses specific challenges in distributed
systems. The patterns address aspects such as availability, high availability, operational excellence,
resiliency, performance, and security.

Best practices

Microsoft Azure’s best-practices articles cover various design considerations including API
design, autoscaling, data partitioning, caching, and so forth. Review these and apply the best
practices that are appropriate for the application being designed.
Security best practices
Security best practices describe how to ensure that the confidentiality, integrity, and availability
of the application aren't compromised by malicious actors.

Quality pillars
A successful cloud application will focus on five pillars of software quality: Reliability,
Security, Cost Optimization, Operational Excellence, and Performance Efficiency.

Leverage the Microsoft Azure Well-Architected Framework to assess the architecture across
these five pillars.

Reliability: The ability of a system to recover from failures and continue to function.

Security: Protecting applications and data from threats.

Cost Optimization: Managing costs to maximize the value delivered.

Operational Excellence: Operations processes that keep a system running in production.

Performance Efficiency: The ability of a system to adapt to changes in load.

Figure 28: Microsoft Azure Application Architecture


Vital Features:

Scalability – Based on demand, storage and access can be added or reduced in a few simple
clicks.

Security – An added security layer at the data-center level gives an additional physical security
option. Azure has many security assessments, both public and proprietary, to counter security
threats of any kind.

Disaster Management –With regular security updates, regional and global fail-over options,
hot and cold standby models and rolling reboot capabilities, Azure is easily the most resilient
cloud infrastructure provider in the market.

Cost-Effective – With its pay-as-you-go model and extreme flexibility, Azure’s IaaS, when
used effectively, is bound to save an organization a lot of money in both the short and the long
run.

Project Management Capabilities – Azure has built-in project management capabilities. It is
one of the very few services that is development-focused and integrated into the delivery
pipeline, which makes it especially popular among geographically distributed teams around the
world.

Higher Acceptance and Access – Microsoft is one of the largest companies in the world and
Windows is the most popular OS in the world. This makes the Azure cloud accessible from
almost any place in the world. With its continuous integration capability with all the Microsoft
apps, third-party software, and other cloud services, Azure is gaining popularity among
hard-core Linux users as well.

Figure 29: Features of Microsoft Azure

------------------------o-o-o-----------------------------
