Chapter 1 - IoT Ecosystem
Rekha M N
Asst. Professor,
Dept. Of E &EE
JSSSTU, Mysuru
Contents:
➢ Meaning of IoT
➢ Vision Of IoT
➢ Commonly Used Definitions
➢ Characteristics Of IoT
➢ IoT v/s IoE v/s M2M
➢ Enabling Technologies in IoT
➢ Applications of IoT
➢ General IoT Framework and IoT reference model
➢ Communication Models in IoT
➢ IoT Platforms
7 hours
IoT Ecosystem
Meaning of IoT:
The inter-networking of physical devices, vehicles, buildings, and other items—embedded with
electronics, software, sensors, actuators, and network connectivity that enable these objects to
collect and exchange data. IoT allows objects to be sensed or controlled remotely across existing
network infrastructure, creating opportunities for more direct integration of the physical world
into computer-based systems, and resulting in improved efficiency, accuracy and economic
benefit in addition to reduced human intervention.
The IoT harnesses the power of the internet to connect physical devices in real-time. Thus, any
physical object connected through the internet can be converted into an IoT device. For
example, the smartphone, tablet, or PC used to read this article is such a device. Similarly, a light
bulb, a physical object in itself, is considered an IoT device when it is connected through a
smartphone app that turns it on and off.
IoT examples may include:
➢ Smart home security systems
➢ Autonomous farming equipment
➢ Wearable health monitors
➢ Smart factory equipment
➢ Wireless inventory trackers
➢ Ultra-high-speed wireless internet
➢ Biometric cybersecurity scanners
➢ Shipping container and logistics tracking
➢ Home automation
IoT Ecosystem: The IoT ecosystem is not easy to define, and it is difficult to capture a complete
picture of it because of its vastness, its emerging possibilities, and the rapidity with which it is
expanding across every sector. In essence, however, the IoT ecosystem is a collection of different
kinds of devices that sense and analyse data and communicate with each other over networks.
In the IoT ecosystem, the user employs smart devices such as smartphones, tablets, and sensors to
send commands or requests for information to other devices over the network. The device performs
the command, analyses the data, and sends the information back to the user through the network.
A typical IoT ecosystem is shown in the image below, where smart devices send and receive data
among themselves and their environment, integrated over the network and cloud computing.
Figure 3: IoT Ecosystem
The IoT is itself an ecosystem of network devices that transfer the data.
1. Perception Layer
This is the basic layer of the entire IoT architecture. It is responsible for gathering all of the
data from various sensors. Furthermore, the devices in this layer are responsible for sending
and receiving data to and from the upper layers. Some information that could be collected from
the sensors in the static or dynamic states are the objects’ state, the environment of the
surrounding areas, and the objects’ characteristics. Therefore, all of the objects that are used for
collecting data in the IoT, such as sensors, people, electronics, and smartphones, are called
“things”.
2. Network Layer
This layer is responsible for transmitting data and providing network access to the Internet.
Therefore, all of the information that is collected from the sensors (perception layer) is transmitted
through this layer. Various communication technologies, such as GSM, WLAN, and IPv6, are
used in this layer to achieve the main function of transmitting data. This layer can contain one or
more network devices, such as gateways, edge computing nodes, and mobile communication
networks, which provide the lower layer with three main functionalities: network communication,
software protocols, and communication security.
3. Processing Layer
This layer contains high computational resources to process the massive amount of data
collected from the sensors in the perception layer. The layer links the upper and the lower
layer by processing the collected information intelligently and presenting this information
in the application layer. Various computational resources, such as high computing devices,
cloud computing devices, and clusters, are utilized in this layer to achieve the main functionalities
of this layer.
4. Application Layer
The application layer is the top layer of the IoT architecture. It provides users with many
services, such as device management and the device display interface. This layer
has an intelligent decision system that responds quickly to the needs of various businesses, such
as healthcare, energy management, and environment monitoring. The accuracy of the response
result depends on the latest information that is used to train the intelligent decision system.
Dashboard: An Internet of Things (IoT) dashboard is a data visualization tool that transforms,
displays, and organizes a collection of data captured and transmitted by network-connected
devices. The primary purpose of an IoT dashboard is to provide human-readable information-at-
a-glance to remotely monitor historical and real-time IoT data.
Figure 5: IoT Dashboard
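To make this concrete, here is a minimal, purely illustrative Python sketch (device names and readings are invented) of how raw telemetry could be reduced to the at-a-glance values a dashboard tile typically shows: the latest reading plus a min/max summary per device.

```python
from collections import defaultdict

# Hypothetical raw telemetry: (device_id, unix timestamp, temperature in degrees C)
readings = [
    ("sensor-1", 1712000000, 24.1),
    ("sensor-2", 1712000005, 19.8),
    ("sensor-1", 1712000060, 25.3),
    ("sensor-2", 1712000065, 20.2),
]

tiles = defaultdict(lambda: {"latest": None, "min": float("inf"), "max": float("-inf")})

for device, ts, value in sorted(readings, key=lambda r: r[1]):
    tile = tiles[device]
    tile["latest"] = value                 # most recent value wins after sorting by time
    tile["min"] = min(tile["min"], value)
    tile["max"] = max(tile["max"], value)

for device, tile in tiles.items():
    print(f"{device}: latest={tile['latest']} C  min={tile['min']}  max={tile['max']}")
```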
Internet of Things (IoT) analytics: It is a data analysis tool that assesses the wide range of data
collected from IoT devices. IoT analytics assesses vast quantities of data and produces useful
information from it.
IoT analytics are usually discussed in tandem with Industrial IoT (IIoT). Data is collected from
a wide range of sensors on manufacturing infrastructure, weather stations, smart meters,
delivery trucks, and all forms of machinery. IoT analytics can be applied to managing data
centers and applications handling retail and healthcare.
Characteristics of IoT:
2. Intelligence
The intelligence of IoT devices is the intelligence of smart sensors and devices to sense data,
interact with each other and collect a huge amount of data for analysis. Complex software,
algorithms, and protocols are used to connect IoT devices to the networks and process the data
from millions of data nodes. Intelligence in IoT is only concerned with the interaction between
devices, while user and device interaction is achieved by standard input methods and graphical
user interfaces.
3. Heterogeneity
The devices in IoT are heterogeneous, as they are based on different hardware platforms and
networks. They can interact with other devices or service platforms through different networks.
The key requirements for heterogeneous things and networks in IoT are scalability, modularity,
extensibility, and interoperability.
4. Safety (Security)
IoT devices are vulnerable to security threats, and there are significant transparency and privacy
issues with IoT. To create a security paradigm, it is important to secure the endpoints, the
networks, and the data that is transferred across all of them.
5. Sensing
IoT cannot be imagined without sensors. IoT sensors help in detecting or measuring changes in
the environment and generate data that the system can act upon.
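As a small illustration of the sensing characteristic, the sketch below polls a DHT22 temperature/humidity sensor with the third-party Adafruit_DHT library; the library, the GPIO pin, and the 0.5 degree change threshold are all assumptions, not part of the text above.

```python
import time
import Adafruit_DHT  # assumed installed: pip install Adafruit_DHT

SENSOR = Adafruit_DHT.DHT22
PIN = 4              # hypothetical GPIO pin the sensor is wired to
last_temp = None

while True:
    humidity, temperature = Adafruit_DHT.read_retry(SENSOR, PIN)
    if temperature is not None:
        # Generate data only when the environment actually changes (> 0.5 C)
        if last_temp is None or abs(temperature - last_temp) > 0.5:
            print(f"temperature={temperature:.1f}C humidity={humidity:.1f}%")
            last_temp = temperature
    time.sleep(10)
```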
6. Enormous Scale
The number of devices that communicate with each other will be much larger than the devices
connected to the current internet. The management of these devices and interpretation for
application purposes become even more critical. Gartner (2015) highlighted this enormous scale,
estimating that 5.5 million new things would get connected every day and that 6.4 billion
connected devices would be in use worldwide in 2016, up 30% from 2015. In 2022, the Internet of
Things market was expected to grow 18 percent to 14.4 billion active connections.
7. Dynamic Nature
The most important part of IoT is gathering data from its environment, which is achieved with
the dynamic changes that take place around the devices. The state of these devices changes
dynamically, for example between connected and disconnected. In addition, the context of
devices (such as temperature, location, and speed) and the number of devices change
dynamically with person, place, and time.
M2M v/s IoT v/s IoE:
M2M:
▪ Encompasses three key components: devices that generate or receive data from other devices,
communication for efficient data transfer between devices and gateways, and an application
designed to meet the needs of end users.
▪ Point-to-point communication exists between the devices.
▪ M2M communication may exist without the internet.
IoT:
▪ Encompasses four key components: a sensor or device used to generate or receive data from
another device, communication for transferring data to the internet or between devices, storage
services for efficient storage of data in a database or the cloud, and an application to provide
the intended service.
▪ An IP network exists between devices, integrating various communication protocols.
▪ Devices in IoT require an active internet connection in most cases.
IoE:
▪ Consists of four key components: people, things, data, and processes.
▪ IoE is a network connection of people, data, and things.
▪ Devices and their applications require an active internet connection.
Enabling Technologies in IoT:
1. Wireless Sensor Networks (WSN)
A wireless sensor network comprises distributed devices with sensors that are used to monitor
environmental and physical conditions. Sensor nodes in a WSN carry an onboard processor that
manages and monitors the environment in a particular area. The nodes are connected to a Base
Station, which acts as the processing unit of the WSN system; the Base Station is in turn
connected to the Internet to share data.
An example of WSNs used in IoT systems is a weather monitoring system, in which the nodes
collect temperature, humidity, and other data that is aggregated and analysed.
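A minimal sketch of the weather-monitoring example, with simulated node readings aggregated at the base station (node names and values are invented):

```python
# Simulated WSN: each node reports (temperature, humidity); the base station aggregates.
node_reports = {
    "node-A": [(24.1, 60), (24.4, 61)],
    "node-B": [(23.7, 58), (23.9, 59)],
    "node-C": [(25.0, 63)],
}

def aggregate(reports):
    """Base-station style aggregation: average temperature and humidity per node."""
    summary = {}
    for node, samples in reports.items():
        temps = [t for t, _ in samples]
        hums = [h for _, h in samples]
        summary[node] = (sum(temps) / len(temps), sum(hums) / len(hums))
    return summary

for node, (avg_t, avg_h) in aggregate(node_reports).items():
    print(f"{node}: avg temp {avg_t:.1f} C, avg humidity {avg_h:.1f} %")
```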
2. Cloud Computing
The Cloud is a centralised system that helps to deliver and transport data and various files
across the Internet to data centres. The different data and programmes can be accessed easily
from the centralised Cloud system. Cloud Computing is an economic solution, as it does not
require on-site infrastructure for storage, processing and analytics.
The three main cloud service models are IaaS, PaaS, and SaaS. Each cloud service model covers
different user and company needs, and provides a different level of control, security, and
scalability.
3. Big Data Analytics
1. A large amount of unstructured data is generated by IoT devices and collected into the
big data system. This IoT-generated big data is largely characterised by the 3V factors:
volume, velocity, and variety.
2. In the big data system, which is basically a shared distributed database, the huge
amount of data is stored in big data files.
3. The stored IoT big data is analysed using analytic tools such as Hadoop MapReduce or
Spark (a minimal sketch of such an analysis appears below).
4. Reports are generated from the analysed data.
Since in IoT the unstructured data is collected via the internet, big data for the Internet of Things
needs lightning-fast analysis with large queries to gain rapid insights from the data and make
quick decisions. Hence the need for big data in IoT is compelling; from this perspective, big data
is the fuel that makes the Internet of Things run.
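A minimal PySpark sketch of this kind of analysis, assuming the device readings have already landed as line-delimited JSON files at a hypothetical local path (the path and field names are assumptions):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("iot-big-data-sketch").getOrCreate()

# Hypothetical landing zone of raw device readings, one JSON object per line, e.g.:
# {"device_id": "sensor-1", "ts": 1712000000, "temperature": 24.1}
readings = spark.read.json("data/iot-readings/")

# A large-scale query: per-device averages and reading counts
summary = (readings
           .groupBy("device_id")
           .agg(F.avg("temperature").alias("avg_temp"),
                F.count("*").alias("num_readings")))

summary.show()
spark.stop()
```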
4. Communication protocols
Communication protocols form the backbone of IoT systems and enable network connectivity
and coupling to applications. Communication protocols allow devices to exchange data over
the network. Multiple protocols often describe different aspects of a single communication. A
group of protocols designed to work together is known as a protocol suite; when implemented
in software, they form a protocol stack.
Internet communication protocols are published by the Internet Engineering Task Force (IETF).
The IEEE handles wired and wireless networking, and the International Organization for
Standardization (ISO) handles other types. The ITU-T handles telecommunication protocols
and formats for the public switched telephone network (PSTN). As the PSTN and Internet
converge, the standards are also being driven towards convergence.
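As one concrete example of an application-layer IoT protocol within such a stack, the sketch below publishes a sensor reading over MQTT with the paho-mqtt client library; the broker address, topic, and payload fields are assumptions.

```python
import json
import paho.mqtt.client as mqtt  # assumed installed: pip install paho-mqtt

BROKER = "test.mosquitto.org"       # hypothetical broker; any reachable MQTT broker works
TOPIC = "demo/room1/temperature"    # hypothetical topic

client = mqtt.Client()              # paho-mqtt 1.x style; 2.x also takes a CallbackAPIVersion
client.connect(BROKER, 1883, keepalive=60)
client.loop_start()                 # run the network loop in a background thread

payload = json.dumps({"device_id": "sensor-1", "temperature": 24.1})
info = client.publish(TOPIC, payload, qos=1)   # QoS 1: delivered at least once
info.wait_for_publish()

client.loop_stop()
client.disconnect()
```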
5. Embedded Systems
An embedded system is a combination of hardware and software designed to perform a dedicated
function within a larger system:
▪ It has hardware.
▪ It has application software.
▪ It has a Real-Time Operating System (RTOS) that supervises the application software
and provides mechanisms to let the processor run a process as per a schedule, following
a plan to control latencies. The RTOS defines the way the system works and sets the
rules during the execution of the application program. A small-scale embedded system
may not have an RTOS.
Applications of IoT:
General IoT Framework:
An IoT framework is commonly described as a sequence of stages; the detailed presentation of
these stages can be found in the diagram below.
Stage 1. Sensing and actuating
Sensors gather data from the surrounding environment and convert it into useful information.
For actuators, the process goes even further: these devices are able to intervene in physical
reality. For example, they can switch off the light or adjust the temperature in a room.
Because of this, the sensing and actuating stage covers and adjusts everything needed in the
physical world to gain the necessary insights for further analysis.
Stage 2. Sensor data aggregation systems and analog-to-digital data conversion
Internet/network gateways and Data Acquisition Systems (DAS) are present in this layer. The
DAS performs the data aggregation and conversion function: it collects and aggregates data and
converts the analog data from the sensors into digital data. Advanced gateways, which mainly
open up the connection between sensor networks and the Internet, also perform basic gateway
functionalities such as malware protection and filtering, and sometimes decision making based
on the input data, as well as data management services.
In short, Stage 2 makes the data both digitized and aggregated.
Figure 17: Sensor data aggregation systems and analog-to-digital data conversion
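A minimal sketch of what a DAS does at this stage, assuming a 10-bit ADC, a 3.3 V reference, and a hypothetical linear 10 mV per degree temperature sensor: raw analog counts are converted to engineering units and aggregated into a single digital record.

```python
ADC_MAX = 1023          # assumed 10-bit ADC
V_REF = 3.3             # assumed reference voltage in volts

def counts_to_celsius(raw_counts: int) -> float:
    """Convert a raw ADC reading to degrees C for a hypothetical linear 10 mV/C sensor."""
    volts = (raw_counts / ADC_MAX) * V_REF
    return volts / 0.010

# Raw analog samples captured over one reporting interval
raw_samples = [75, 76, 78, 77, 76]

temps = [counts_to_celsius(c) for c in raw_samples]
record = {
    "samples": len(temps),
    "avg_temp_c": round(sum(temps) / len(temps), 2),
    "max_temp_c": round(max(temps), 2),
}
print(record)   # one aggregated, digital record instead of five analog readings
```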
IoT Reference Model (IoT World Forum):
1. Physical Devices and Controllers – This layer contains the "things" of IoT: the sensors,
actuators, and Edge Node devices that generate field data and can be queried or controlled. These
may be new, purpose-built devices or existing field assets retrofitted with instrumentation.
It is worth noting that in the latter case, where connecting to existing field assets, there can be
significant design effort to connect the sensors and Edge Node intelligent hardware, as well as in
mapping these systems to any management or intelligence that may exist in legacy assets (unless
they are just "dumb" assets that need instrumentation).
An important IoT concept, Edge Intelligence, which allows low-latency reaction to field events
and higher levels of autonomy and distributed processing, needs to be implemented at this layer.
2. Connectivity – This layer spans from the "middle" of an Edge Node device up through
transport to the cloud. Many alternatives can be used for communications, and this layer includes
the mapping of field data to the logical and physical technologies used, as well as the backhaul
to on-premises or cloud systems and the next layer, Edge Computing.
In deployment, this layer can use a single solution or multiple technologies, depending on the
need. Field Area Network (FAN) alternatives can include wired, cellular, LPWAN, and many
other wireless options, as well as multi-tiered solutions, and can be built out of private, public,
or a mix of private and public transport solutions.
3. Edge Computing – The next layer in the World Forum Model architecture is Edge
Computing, or more properly "Cloud Edge" or "Cloud Gateway" computing. Required to some
degree in any IoT system, this layer interfaces the data and control planes to the higher layers of
cloud, SaaS, or enterprise software. Protocol conversion, routing to higher-layer software
functions, and even "fast path" logic for low-latency decision making are implemented at
this layer.
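A minimal sketch of such "fast path" logic at the cloud edge; the threshold, the event fields, and the forwarding function are assumptions, with printing standing in for the actual backhaul.

```python
TEMP_ALARM_C = 60.0   # assumed alarm threshold for the fast-path rule
buffer = []           # slow-path events, aggregated and forwarded later

def forward_to_cloud(message: dict) -> None:
    # Placeholder for the backhaul (e.g., an MQTT or HTTPS publish to the cloud layer)
    print("forwarded:", message)

def handle_edge_event(event: dict) -> None:
    """Low-latency decision at the cloud edge: alarm immediately, otherwise batch."""
    if event["temperature"] >= TEMP_ALARM_C:
        forward_to_cloud({"type": "ALARM", **event})   # fast path, no cloud round trip
    else:
        buffer.append(event)                           # slow path

handle_edge_event({"device_id": "sensor-1", "temperature": 72.4})
handle_edge_event({"device_id": "sensor-2", "temperature": 21.0})
forward_to_cloud({"type": "BATCH", "count": len(buffer)})
```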
4. Data Accumulation – Given the Velocity, Volume, and Variety that IoT systems can produce,
it is essential to provide storage for incoming data ahead of subsequent processing,
normalization, integration, and preparation for upstream applications. While part of the overall
"data lake" architecture, this layer serves as intermediate storage for incoming data and for
outgoing traffic queued for delivery to lower layers. It may be implemented with simple SQL
databases, or it may require more sophisticated solutions such as Hadoop and the Hadoop File
System (HDFS), MongoDB, Cassandra, Spark, or other NoSQL technologies.
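A minimal sketch of this intermediate storage, using SQLite as a stand-in for the larger SQL/NoSQL options named above (table and field names are assumptions):

```python
import json
import sqlite3

conn = sqlite3.connect("iot_accumulation.db")
conn.execute("""CREATE TABLE IF NOT EXISTS raw_events (
                    device_id TEXT,
                    ts        INTEGER,
                    payload   TEXT)""")

incoming = [
    {"device_id": "sensor-1", "ts": 1712000000, "payload": {"temperature": 24.1}},
    {"device_id": "sensor-2", "ts": 1712000005, "payload": {"leak": False}},
]

# Event-based data in motion is persisted as query-able data at rest
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [(e["device_id"], e["ts"], json.dumps(e["payload"])) for e in incoming],
)
conn.commit()

for row in conn.execute("SELECT device_id, ts FROM raw_events ORDER BY ts"):
    print(row)
conn.close()
```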
5. Data Abstraction – In the data abstraction layer we "make sense" of the data: collecting "like"
information from multiple IoT sensors or measurements, expediting high-priority traffic or
alarms, and organizing incoming data from the data lake into appropriate schemas and flows for
upstream processing. Similarly, application data destined for downstream layers is reformatted
appropriately for device interaction and queued for processing.
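A minimal sketch of the "make sense of the data" step, normalizing two hypothetical vendor payload formats into one common schema (both formats and all field names are assumptions):

```python
def normalize(raw: dict) -> dict:
    """Map vendor-specific payloads onto one common schema."""
    if "temp_f" in raw:                       # hypothetical vendor A reports Fahrenheit
        celsius = (raw["temp_f"] - 32) * 5 / 9
        return {"device_id": raw["id"], "temperature_c": round(celsius, 2)}
    if "temperature" in raw:                  # hypothetical vendor B reports Celsius
        return {"device_id": raw["device"], "temperature_c": raw["temperature"]}
    raise ValueError("unknown payload format")

events = [
    {"id": "A-17", "temp_f": 75.2},
    {"device": "B-03", "temperature": 22.9},
]
print([normalize(e) for e in events])
```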
6. Application Layer – This layer is self-explanatory and is where control plane and data plane
application logic are executed. Monitoring, process optimization, alarm management, statistical
analysis, control logic, logistics, and consumer patterns are just a few examples of IoT applications.
Communication Models in IoT:
In March 2015, the Internet Architecture Board released a guide to IoT networking that outlined
four common communication models used by IoT "smart objects": Device-to-Device,
Device-to-Cloud, Device-to-Gateway, and Back-End Data-Sharing.
1. Device-to-Device
Device-to-device communication represents two or more devices that directly connect and
communicate between one another. They can communicate over many types of networks,
including IP networks or the Internet, but most often use protocols like Bluetooth, Z-Wave, and
ZigBee.
This model is commonly used in home automation systems to transfer small data packets of
information between devices at a relatively low data rate. This could be light bulbs, thermostats,
and door locks sending small amounts of information to each other.
With Device-to-Device connectivity “security is specifically simplified because you have these
short-range radio technology [and a] one-to-one relationship between these two devices.”
Figure 20: Example of device-to-device communication model.
Device-to-device communication is popular among wearable IoT devices, such as a heart monitor
paired to a smartwatch, where data does not necessarily have to be shared with multiple people.
There are several standards being developed around Device-to-Device, including Bluetooth Low
Energy (also known as Bluetooth Smart or Bluetooth 4.0+), which is popular among portable and
wearable devices because its low power requirements mean a device can operate for months or
years on one battery. Its lower complexity can also reduce device size and cost.
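A minimal Bluetooth Low Energy sketch using the third-party bleak library (an assumption; not tied to any product named above) that simply discovers nearby BLE advertisers, which is the first step toward a device-to-device pairing; it requires a BLE adapter to return results.

```python
import asyncio
from bleak import BleakScanner  # assumed installed: pip install bleak

async def discover_neighbours() -> None:
    # Scan for nearby Bluetooth Low Energy advertisers for about 5 seconds
    devices = await BleakScanner.discover(timeout=5.0)
    for d in devices:
        print(d.address, d.name)

asyncio.run(discover_neighbours())
```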
2. Device-to-Cloud
Cloud connectivity lets the user (and an application) obtain remote access to a device. It also
potentially supports pushing software updates to the device.
A use case for cellular-based Device-to-Cloud would be a smart tag that tracks your dog while
you’re not around, which would need wide-area cellular communication because you wouldn’t
know where the dog might be.
For example, if you are away and want to see what is on your webcam at home, you contact the
cloud infrastructure, and the cloud infrastructure then relays the request to your IoT device.
The IAB's report also noted that interoperability is a factor with Device-to-Cloud when
attempting to integrate devices made by different manufacturers, given that the device and
cloud service are typically from the same vendor. An example is the Nest Labs Learning
Thermostat, which can only work with Nest's cloud service.
There is also work going into low-power wide-area standards such as LoRa, Sigfox, and NB-IoT
(Narrowband IoT), which let devices make cloud connections while consuming far less power than Wi-Fi.
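A minimal device-to-cloud sketch in which a device pushes a reading directly to a cloud HTTPS endpoint using the requests library; the endpoint URL, API key, and payload fields are placeholders, not a real service.

```python
import requests  # assumed installed: pip install requests

CLOUD_ENDPOINT = "https://api.example-iot-cloud.com/v1/telemetry"  # hypothetical endpoint
API_KEY = "replace-with-device-key"                                # hypothetical credential

reading = {"device_id": "dog-tag-42", "lat": 12.30, "lon": 76.65, "battery": 87}

resp = requests.post(
    CLOUD_ENDPOINT,
    json=reading,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
resp.raise_for_status()
print("cloud accepted reading:", resp.status_code)
```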
3. Device-to-Gateway
In the device-to-gateway model, the IoT device connects to a cloud service through an
intermediary gateway device. This gateway could provide security and other functionality such
as data or protocol translation. If the application-layer gateway is a smartphone, this software
might take the form of an app that pairs with the IoT device and communicates with a cloud service.
This might be a fitness device that connects to the cloud through a smartphone app like Nike+,
or home automation applications that involve devices that connect to a hub like Samsung’s
SmartThings ecosystem.
Gateway devices can also potentially bridge the interoperability gap between devices that
communicate on different standards. For instance, SmartThings’ Z-Wave and Zigbee
transceivers can communicate with both families of devices.
4. Back-End Data-Sharing
The back-end data-sharing model extends the device-to-cloud model so that data collected from
IoT devices can be exported, combined with data from other sources, and shared with authorized
third parties. The app Map My Fitness is a good example of this because it compiles fitness data
from various devices, ranging from the Fitbit to the Adidas miCoach to the Wahoo Bike Cadence
Sensor. "They provide hooks, REST APIs to allow security and privacy-friendly data sharing to
Map My Fitness." This means an exercise can be analyzed from the viewpoint of various sensors.
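A minimal sketch of the back-end data-sharing pattern from the aggregator's side, pulling a user's data from another provider's REST API with an OAuth bearer token; the endpoint, token, query parameter, and response fields are assumptions.

```python
import requests

PROVIDER_API = "https://api.example-fitness.com/v1/workouts"  # hypothetical provider API
ACCESS_TOKEN = "user-granted-oauth-token"                     # hypothetical token

resp = requests.get(
    PROVIDER_API,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"since": "2024-01-01"},
    timeout=10,
)
resp.raise_for_status()

# Combine data from this provider with data already held by the aggregator
for workout in resp.json().get("workouts", []):
    print(workout.get("sport"), workout.get("duration_min"))
```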
Note: There’s No Clear IoT Deployment Model; It All Depends on the Use Case
The decision process for IoT developers is quite complicated when considering how a device
will be integrated and how its connectivity to the internet will be established.
To further complicate things, newer technologies with lower power consumption, size and cost
are often lacking in maturity compared to traditional Ethernet or Wi-Fi.
“The equation is not just what is most convenient for me, but what are the limitations of those
radio technologies and how do I deal with factors like the size limitations, energy consumption,
the cost – these aspects play a big role.”
IoT Platforms:
IoT platforms are middleware solutions that connect IoT devices to the cloud and help them
exchange data seamlessly over the network. An IoT platform acts as a mediator between the
application layer and the hardware.
Figure 24: Top Seven IoT Platforms
AWS IoT:
One of the most reliable and secure Internet of Things platforms, AWS IoT not only helps to
connect devices to the cloud but also safeguards their interactions with the applications available
on the cloud and with other devices. Even when devices are not connected to the Internet, the
AWS IoT platform allows applications to monitor them and facilitates round-the-clock
communication between them.
As an example deployment on AWS, consider the Meshify IoT architecture. The three building
blocks of this technical architecture are the edge portfolio, data ingestion, and data processing
and analytics, shown below.
Starting with the edge sensors, the Meshify edge portfolio covers two types of sensors:
➢ LoRaWAN (low power, long range WAN) sensor suite: This sensor suite provides the
long connectivity range (> 1000 feet) and extended battery life (~5 years) needed for
enterprise environments.
➢ Cellular-based sensors: This sensor is a narrow band/LTE-M device that operates at
LTE-M band 2/4/12 radio frequency and uses edge intelligence to conserve battery
life.
For the LoRaWAN solution, aggregated sensor data at the Meshify gateway is sent to AWS
using AWS IoT Core and Meshify’s REST service endpoints. AWS IoT Core is a managed
cloud platform that lets IoT devices easily and securely connect using multiple protocols like
HTTP, MQTT, and WebSockets. It expands its protocol coverage through a new fully managed
feature called AWS IoT Core for LoRaWAN. This gives Meshify the ability to connect
LoRaWAN wireless devices with the AWS Cloud.
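A minimal sketch of publishing a message to AWS IoT Core from the server side with boto3's iot-data client; the region, topic, and payload are assumptions, and real field devices would normally connect over MQTT with X.509 certificates instead.

```python
import json
import boto3  # assumed installed and configured with AWS credentials

iot_data = boto3.client("iot-data", region_name="us-east-1")  # region is an assumption

iot_data.publish(
    topic="sensors/building-7/leak",   # hypothetical MQTT topic
    qos=1,
    payload=json.dumps({"device_id": "lora-123", "leak": True}).encode(),
)
```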
Initial processing of the data is done at the ingestion layer, using Meshify REST API endpoints
and the Rules Engine of AWS IoT Core. Meshify applies filtering logic to route relevant events
to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Amazon MSK is an AWS
streaming data service that manages Apache Kafka infrastructure and operations, streamlining
the process of running Apache Kafka applications on AWS.
Meshify's applications then consume the events from Amazon MSK as per the configured topic
subscription. They enrich and correlate the events with records stored in a managed database
service, Amazon Relational Database Service (Amazon RDS). These applications run as scalable
containers on another managed service, Amazon Elastic Kubernetes Service (Amazon EKS).
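A minimal sketch of a consumer application of the kind described above, reading events from an Amazon MSK (Apache Kafka) topic with the kafka-python library; the bootstrap server, topic, group id, and event fields are assumptions.

```python
import json
from kafka import KafkaConsumer  # assumed installed: pip install kafka-python

consumer = KafkaConsumer(
    "meshify-sensor-events",                                               # hypothetical topic
    bootstrap_servers=["b-1.example.kafka.us-east-1.amazonaws.com:9092"],  # hypothetical broker
    group_id="enrichment-service",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Enrich/correlate here (e.g., look up device metadata in Amazon RDS), then persist
    print(event["device_id"], event.get("leak"))
```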
The figure below illustrates the technical workflow from the ingestion of field events to their
processing, enrichment, and persistence. Finally, these events are used to power risk-avoidance
decision-making.
Figure 26: Technical workflow for Meshify IoT architecture
1. After installation, Meshify-designed LoRa sensors transmit information to the cloud through
Meshify’s gateways. LoRaWAN capabilities create connectivity between the sensors and the
gateways. They establish a low power, wide area network protocol that securely transmits
data over a long distance, through walls and floors of even the largest buildings.
2. The Meshify Gateway is a redundant edge system, capable of sending data from various
sensors to the Meshify cloud environment. Once the LoRa sensor information is received by
the Meshify Gateway, it converts the incoming radio frequency (RF) signals into a form that
supports a faster transfer rate to Meshify's cloud environment.
3. Data from the Meshify Gateway and sensors is initially processed at Meshify’s AWS IoT
Core and REST service endpoints. These destinations for IoT streaming data help with the
initial intake and introduce field data to the Meshify cloud environment. The initial ingestion
points can scale automatically based upon the volume of sensor data received. This enables
rapid scaling and ease of implementation.
4. After the data has entered the Meshify cloud environment, Meshify uses Amazon EKS and
Amazon MSK to process the incoming data stream. Amazon MSK producer and consumer
applications within the EKS systems enrich the data streams for the end users and systems to
consume.
5. Producer applications running on EKS send processed events to the Amazon MSK service.
These events include storing and retrieval of raw data, enriched data, and system-level data.
6. Consumer applications hosted on the EKS pods receive events per the subscribed Amazon
MSK topic. Web, mobile, and analytic applications enrich and use these data streams to
display data to end users, business teams, and systems operations.
7. Processed events are persisted in Amazon RDS. The databases are used for reporting, machine
learning, and other analytics and processing services.
Vital features:
1) Flexibility
• The flexibility of AWS allows users to choose the programming models, languages, and
operating systems that are better suited for their project, so they do not have to learn new
skills to adopt new technologies.
• Flexibility means that migrating legacy applications to the cloud is easy, and cost-effective.
Instead of re-writing the applications to adopt new technologies, you just need to move the
applications to the cloud and tap into advanced computing capabilities.
• Larger organizations often run in a hybrid mode, i.e., some pieces of the application run in
their data center, and other portions of the application run in the cloud.
• The flexibility of AWS is a great asset for organizations, enabling them to deliver products
with up-to-date technology on time while enhancing overall productivity.
2) Cost-effective
• Cost is one of the most important factors that need to be considered in delivering IT
solutions.
• You can scale up or scale down as the demand for resources increases or decreases
respectively.
• AWS allows you to access resources almost instantly. The ability to respond to changes
quickly, whether the changes are large or small, means that we can take new opportunities
to meet business challenges that could increase revenue and reduce cost.
4) Secure
AWS provides
Physical security: The data centers are physically secured to prevent unauthorized access.
Data privacy: Personal and business data can be encrypted to maintain data privacy.
Microsoft Azure:
Architecture styles
The first and most fundamental decision point is the architecture style. It might be a
microservices architecture, a more traditional N-tier application, or a big data solution.
Microsoft has identified several distinct architectural styles, and there are benefits and
challenges to each.
Technology choices
Knowing the type of architecture being built, one can start to choose the main technology pieces
for the architecture. The following technology choices are critical:
• Compute refers to the hosting model for the computing resources that your
applications run on.
• Data stores include databases but also storage for message queues, caches, logs,
and anything else that an application might persist to storage.
• Messaging technologies enable asynchronous messages between components of
the system.
Additional technology choices can be made along the way, but these three elements (compute,
data, and messaging) are central to most cloud applications and will determine many aspects of
your design.
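As a small example of the messaging choice, the sketch below sends one telemetry message to an Azure Service Bus queue with the azure-servicebus SDK; the connection string, queue name, and payload are assumptions.

```python
from azure.servicebus import ServiceBusClient, ServiceBusMessage  # pip install azure-servicebus

CONN_STR = "<your-service-bus-connection-string>"  # hypothetical connection string
QUEUE = "telemetry"                                # hypothetical queue name

# Send one asynchronous message from a device-facing component to the rest of the system
with ServiceBusClient.from_connection_string(CONN_STR) as client:
    with client.get_queue_sender(queue_name=QUEUE) as sender:
        sender.send_messages(ServiceBusMessage('{"device_id": "sensor-1", "temp": 24.1}'))
```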
After choosing the architecture style and the major technology components, the specific design
of the application has to be tackled. Every application is different, but the following resources
are most commonly used:
Reference architectures
Reference architecture includes recommended practices, along with considerations for
scalability, availability, security, resilience, and other aspects of the design. Most also include
a deployable solution or reference implementation.
Design principles
Microsoft Azure has identified some high-level design principles that make applications more
scalable, resilient, and manageable. These design principles apply to any architectural style.
Throughout the design process, keep these high-level principles in mind as a reference.
Design patterns
Software design patterns are repeatable patterns that are proven to solve specific problems.
Microsoft Azure's catalog of cloud design patterns addresses specific challenges in distributed
systems. The patterns address aspects such as availability, high availability, operational
excellence, resiliency, performance, and security.
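One pattern from that catalog, Retry, handles transient failures; below is a minimal Python sketch of the idea with exponential backoff (the attempt count, delays, and the simulated flaky call are arbitrary illustrative choices).

```python
import time
import random

def call_flaky_service() -> str:
    # Stand-in for a remote call that sometimes fails transiently
    if random.random() < 0.6:
        raise TimeoutError("transient network error")
    return "ok"

def with_retry(operation, attempts: int = 4, base_delay: float = 0.5):
    """Retry pattern: retry transient failures with exponential backoff, then give up."""
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except TimeoutError as err:
            if attempt == attempts:
                raise                      # re-raise after the final attempt
            delay = base_delay * 2 ** (attempt - 1)
            print(f"attempt {attempt} failed ({err}); retrying in {delay:.1f}s")
            time.sleep(delay)

print(with_retry(call_flaky_service))
```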
Best practices
Microsoft Azure's best practices articles cover various design considerations, including API
design, autoscaling, data partitioning, caching, and so forth. Review these and apply the best
practices that are appropriate for the application being designed.
Security best practices
Security best practices describe how to ensure that the confidentiality, integrity, and availability
of the application are not compromised by malicious actors.
Quality pillars
A successful cloud application will focus on five pillars of software quality: Reliability,
Security, Cost Optimization, Operational Excellence, and Performance Efficiency.
Leverage the Microsoft Azure Well-Architected Framework to assess the architecture across
these five pillars.
Pillar – Description
Reliability – The ability of a system to recover from failures and continue to function.
Scalability – Based on demand, storage and access can be added or reduced in a few simple clicks.
Security - An added security layer at the datacentre level gives an additional physical security
option. Azure has many security assessments both public and proprietary to counter security
threats of any kind.
Disaster Management – With regular security updates, regional and global fail-over options,
hot and cold standby models, and rolling reboot capabilities, Azure is among the most resilient
cloud infrastructure providers in the market.
Cost-Effective – With its pay-as-you-go model and extreme flexibility, Azure's IaaS, when used
effectively, is bound to save the organization money in both the short and long run.
Higher Acceptance and Access - Microsoft is one of the largest companies in the world, and
Windows is the most popular OS in the world. This makes the Azure cloud accessible from
almost any place in the world. With its continuous integration capability with Microsoft apps,
third-party software, and other cloud services, Azure is gaining popularity among hard-core
Linux users as well.
------------------------o-o-o-----------------------------