Energy and performance optimization for resource and data management in Cloud of Things with Semantic Networks
The falling cost and rising performance of smart devices have driven the fast evolution of the Internet of Things (IoT) paradigm. IoT ecosystems consist of a large number of devices, ranging from sensors to small embedded computers, that collect, process, and transfer data for purposes that depend on the nature of the scenario. The large number of devices and the high monitoring frequencies, often close to real time, make the volume and variety of the data in an IoT system so high that additional infrastructure is required for its processing and storage. The direct solution to this problem has been to incorporate Cloud infrastructures into IoT systems, an approach known as the Cloud of Things (CoT).
The main problem of using the Cloud in IoT architectures is the need to transport large amounts of data from the devices that generate it to the Cloud infrastructure, which may be physically very distant. To mitigate this problem, it has been proposed to exploit the processing and storage capacity of the devices that take part in transporting the data between the devices and the Cloud. The data then does not have to travel the full device-to-Cloud path: it can be stored and processed at an intermediate point, reducing the load on the Cloud infrastructure and on the device-to-Cloud communication systems. This is the defining idea of Fog Computing architectures.
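The traffic-reduction argument can be made concrete with a toy calculation. The following sketch is purely illustrative (the hop names, data volume, and 10% reduction factor are assumptions, not figures from the proposal): it compares the total bytes that traverse the network when a job's raw data is only processed in the Cloud versus when it is aggregated at a Fog node on the way.

```python
# Illustrative sketch (assumed numbers): bytes moved per hop when a job is
# processed in the Cloud vs. at an intermediate Fog node.

def transfer_cost(path_hops, data_bytes, processed_at):
    """Sum the bytes moved over each hop of a device->fog->cloud path.

    path_hops: ordered hop names, e.g. ["device->fog", "fog->cloud"]
    processed_at: index of the hop after which the raw data is processed
                  and thereby reduced (e.g. by aggregation).
    """
    REDUCTION = 0.1  # assumption: processing keeps 10% of the raw volume
    cost = 0
    volume = data_bytes
    for i, _hop in enumerate(path_hops):
        cost += volume
        if i == processed_at:  # data is processed here; payload shrinks
            volume = int(volume * REDUCTION)
    return cost

raw = 1_000_000  # 1 MB of raw sensor readings (assumed)
hops = ["device->fog", "fog->cloud"]

# Cloud-only: the raw payload crosses every hop unreduced.
cloud_cost = transfer_cost(hops, raw, processed_at=len(hops))
# Fog: processed after the first hop, so only 10% reaches the Cloud.
fog_cost = transfer_cost(hops, raw, processed_at=0)
```

Under these assumed numbers the Fog placement moves 1.1 MB in total versus 2 MB for the Cloud-only path, which is the load reduction the text describes.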
Our proposal aims to reduce energy consumption and resource usage, in both the Cloud infrastructure and the intermediate components, by improving the placement policies and the scheduling of the jobs that process the data. Using complex networks together with semantic information, we intend to model the Cloud architecture, the data generated by the devices, and the interrelation between them (resource allocation). We will define metaheuristics that take the architecture and data models as inputs and produce a resource-allocation model (the location and scheduling of the data processing) as output.
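To illustrate the shape of such a metaheuristic (without claiming this is the method the proposal will use), the following sketch runs a simple stochastic hill climber that assigns jobs to Fog or Cloud nodes so as to minimize a combined transfer-plus-energy cost. The node names, cost figures, and cost model are all hypothetical placeholders for the architecture and data models described above.

```python
# Hypothetical sketch of a placement metaheuristic: a hill climber assigning
# jobs to Fog/Cloud nodes to minimize transfer + processing-energy cost.
# All node names and cost coefficients below are illustrative assumptions.

import random

NODES = {  # node -> per-data-unit processing energy and transfer cost
    "fog-1": {"energy": 3.0, "transfer": 1.0},
    "fog-2": {"energy": 2.5, "transfer": 1.5},
    "cloud": {"energy": 1.0, "transfer": 4.0},
}
JOBS = {"j1": 10, "j2": 4, "j3": 7}  # job -> data units to process

def cost(assignment):
    """Total cost: moving each job's data to its node plus processing it there."""
    return sum(
        JOBS[j] * (NODES[n]["transfer"] + NODES[n]["energy"])
        for j, n in assignment.items()
    )

def hill_climb(steps=200, seed=0):
    """Start from a random placement, then repeatedly mutate one job's node
    and keep the candidate whenever it does not worsen the total cost."""
    rng = random.Random(seed)
    best = {j: rng.choice(list(NODES)) for j in JOBS}
    for _ in range(steps):
        cand = dict(best)
        cand[rng.choice(list(JOBS))] = rng.choice(list(NODES))
        if cost(cand) <= cost(best):
            best = cand
    return best, cost(best)

placement, total = hill_climb()
```

Because this toy cost decomposes per job, the climber converges to placing every job on a Fog node (4.0 per data unit versus 5.0 at the Cloud). A real instance would replace the dictionaries with the complex-network and semantic models of the architecture and data, and the climber with a full metaheuristic.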