Edge computing is a technology that brings computation to end devices. Instead of performing all the heavy processing at the centre of the network, i.e. the cloud, the work is moved out to the devices at the edge. This reduces the complexity and overhead of running large computations in the cloud, cuts network usage, and enables faster responses.
In this article, we will dive into a simple experiment to implement edge computing. First, let's frame the problem. Say we have a device that collects temperature and humidity at a particular place and transfers the readings to the cloud. Seems pretty simple. Now suppose we want to record data continuously for some kind of analysis. Here comes the problem: what if the sensor goes down with some sort of issue? We would be missing data until the sensor is up and running again. One common way to address this is to deploy an ML model in the cloud that generates the missing values whenever a sensor goes down. That is fine for one or two devices, but as we scale up we need more and more computational power to cover many device failures. This is where edge computing comes into the picture: the well-trained model is deployed on the device itself, so even when the sensor is down the device still manages to send data to the cloud hassle-free.
Workflow:
We’ll start by building a TensorFlow Keras Sequential model, then convert it to TensorFlow Lite, which is built specifically for microcontrollers. Finally, we will generate a C array from the TensorFlow Lite model. This C array holds the trained model and is what gets deployed to the edge device.
In this project, we will predict both Temperature and Humidity whenever the sensor fails to produce data. First, we’ll go with Humidity.
1. Building TensorFlow Model
Let’s import the necessary modules
import tensorflow as tf
from tensorflow.keras import layers
import pandas as pd
Now let us separate independent and dependent variables from the dataset and store them in ‘x’ and ‘y’ respectively.
dataset = pd.read_csv("dataset_humidity.csv")
x = dataset.iloc[:,:-1].values
y = dataset.iloc[:,-1:].values
Now, using Keras Sequential, we will create a neural network with two hidden layers of 16 neurons each and the ‘relu’ activation function. We train the model with the ‘fit’ function by passing the ‘x’ and ‘y’ values, and after training we store the model to disk with the ‘save’ function, as shown below.
model = tf.keras.Sequential()
model.add(layers.Dense(16, activation='relu'))
model.add(layers.Dense(16, activation='relu'))
model.add(layers.Dense(1))
model.compile(optimizer='rmsprop', loss='mse', metrics=['mse'])
model.fit(x, y, epochs=1000, batch_size=16)
model.save('Humidity_predictor_model')
With this, we have created and trained our neural network model using TensorFlow Keras.
2. Converting TF Model to TF Lite Model
Now we will convert the saved TF model to a TF Lite model and write it to a file with the ‘.tflite’ extension. For optimization, we use ‘tf.lite.Optimize.DEFAULT’.
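The conversion itself is only a few lines of Python. Here is a minimal sketch of that step, assuming the model was saved as ‘Humidity_predictor_model’ as in the previous section; the output filename is illustrative.
import tensorflow as tf
# Load the saved Keras model and convert it to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_saved_model('Humidity_predictor_model')
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
# Write the converted model to a '.tflite' file
with open('humidity_predictor.tflite', 'wb') as f:
    f.write(tflite_model)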
We’ll use the Linux command ‘xxd’ to convert the TF Lite model to a C array. This command converts a file to its equivalent hexadecimal representation, known as a “hex dump”.
The ‘.tflite’ file generated in the previous step goes in as input to the ‘xxd’ command, and for the output we specify a filename with an extension. Here I will use ‘.h’ as the file extension; you could use other formats such as ‘.cc’. An example command is shown below.
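As a sketch, with the filenames assumed above (adjust them to match yours), the command looks like this:
xxd -i humidity_predictor.tflite > humidity_predictor.h
The ‘-i’ flag tells ‘xxd’ to emit C include-style output, i.e. an unsigned char array plus a length variable, which is exactly what we need to embed the model in the sketch.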
Since the DHT11 sensor reads both Humidity and Temperature, we will predict both values. So far you have prepared the files for Humidity only; for Temperature, repeat all the steps from the beginning up to this point to get your Temperature C array.
If you get any errors, visit this GitHub repository to find the exact code: https://github.com/MohithGowdaHR/Edge_Computing.git
You will find the code in the ‘Models’ directory of the ‘Edge_Computing’ repository.
The temperature_predictor.h and humidity_predictor.h files generated in the previous steps should be stored in the same directory as the ‘.ino’ file, as shown in the picture below.
Create a TensorFlow Lite library instance as shown in the code below.
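Here is a minimal sketch of that setup. It assumes the EloquentTinyML wrapper library (whose predict() call matches the snippet that follows; the exact class name can differ between library versions) and the Adafruit DHT library. The pin number, tensor arena size, and generated array names are assumptions that depend on your wiring and on the xxd output.
#include <EloquentTinyML.h>          // assumed TFLite-for-microcontrollers wrapper
#include <DHT.h>                     // assumed DHT sensor library
#include "temperature_predictor.h"   // C arrays generated with xxd
#include "humidity_predictor.h"

#define DHTPIN 2                     // assumed sensor data pin
#define DHTTYPE DHT11
#define N_INPUTS 8                   // year, month, day, hour, min, sec, plus two previous readings
#define N_OUTPUTS 1
#define TENSOR_ARENA_SIZE 2 * 1024   // increase if the interpreter fails to allocate tensors

DHT dht(DHTPIN, DHTTYPE);
Eloquent::TinyML::TfLite<N_INPUTS, N_OUTPUTS, TENSOR_ARENA_SIZE> temprature;
Eloquent::TinyML::TfLite<N_INPUTS, N_OUTPUTS, TENSOR_ARENA_SIZE> humudity;

float prevtemp = 0;
float prevhum = 0;

void setup() {
  Serial.begin(115200);
  dht.begin();
  temprature.begin(temperature_predictor_tflite);  // array names come from the xxd-generated headers
  humudity.begin(humidity_predictor_tflite);
}
With the two model instances in place, the loop() body below reads the sensor and falls back to the models whenever the reading fails.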
If the sensor fails to read the values, we’ll predict them until the sensor is back. As input to the model we pass the date-time split into its components along with the previously recorded temperature and humidity values.
void loop() {
  float h = dht.readHumidity();
  float t = dht.readTemperature(true);   // 'true' returns the temperature in Fahrenheit
  delay(1000);
  if (isnan(h) || isnan(t)) {
    Serial.println(F("Failed to read from DHT sensor!"));
    // year, month, day, hour, min, sec, plus the two previous readings;
    // use an RTC or GPS module to get the real date and time
    float input_array[8] = {2020, 5, 26, 11, 30, 0, prevtemp, prevhum};
    float input_array2[8] = {2020, 2, 4, 6, 40, 0, prevhum, prevtemp};
    float hum = humudity.predict(input_array2);
    float temp = temprature.predict(input_array);
    prevhum = hum;
    prevtemp = temp;
    Serial.print("\t predicted humidity: ");
    Serial.println(hum);
    Serial.print("\t predicted temp: ");
    Serial.println(temp);
    delay(1000);
    return;
  }
  else {
    Serial.print("\t humidity: ");
    Serial.println(h);
    Serial.print("\t temp: ");
    Serial.println(t);
    prevtemp = t;
    prevhum = h;
  }
}
Results:
Finally, we are done! Now we’ll be recording continuous, uninterrupted values from the edge device.