Search Results (75)

Search Parameters:
Journal = Network

Review
A Review of Blockchain Technology in Knowledge-Defined Networking, Its Application, Benefits, and Challenges
Network 2023, 3(3), 343-421; https://doi.org/10.3390/network3030017 - 30 Aug 2023
Abstract
Knowledge-Defined Networking (KDN) necessarily consists of a knowledge plane for the generation of knowledge, typically using machine learning techniques, and the dissemination of knowledge, in order to make knowledge-driven intelligent network decisions. In one way, KDN can be recognized as knowledge-driven Software-Defined Networking (SDN), having additional management and knowledge planes. On the other hand, KDN encapsulates all knowledge-/intelligence-/cognition-/machine learning-driven networks, emphasizing knowledge generation (KG) and dissemination for making intelligent network decisions, unlike SDN, which emphasizes logical decoupling of the control plane. Blockchain is a technology created for secure and trustworthy decentralized transaction storage and management using a sequence of immutable and linked transactions. The decision-making trustworthiness of a KDN system is reliant on the trustworthiness of the data, knowledge, and AI model sharing. To this end, a KDN may make use of the capabilities of the blockchain system for trustworthy data, knowledge, and machine learning model sharing, as blockchain transactions prevent repudiation and are immutable, pseudo-anonymous, optionally encrypted, reliable, access-controlled, and tamper-proof, protecting the sensitivity, integrity, and legitimacy of sharing entities. Furthermore, blockchain has been integrated with knowledge-based networks for traffic optimization, resource sharing, network administration, access control, protecting privacy, traffic filtering, anomaly or intrusion detection, network virtualization, massive data analysis, edge and cloud computing, and data center networking. Although many academics have employed the concept of blockchain in cognitive networks to achieve various objectives, challenges such as high energy consumption, scalability issues, and difficulty processing big data act as barriers to integrating the two concepts.
Academics have not yet reviewed blockchain-based network solutions across diverse application categories for knowledge-defined networks in general, which consider knowledge generation and dissemination using various techniques such as machine learning, fuzzy logic, and meta-heuristics. Therefore, this article fills a gap in the literature by first reviewing the diverse existing blockchain-based applications in diverse knowledge-based networks, analyzing and comparing the existing works, describing the advantages and difficulties of using blockchain systems in KDN, and, finally, providing propositions based on identified challenges and presenting prospects for the future.
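The review above characterizes blockchain as a sequence of immutable, linked transactions. As an editorial illustration of that linking (not code from the article), a minimal hash chain in Python shows why tampering with one entry invalidates the chain:

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block whose hash covers its payload and its predecessor."""
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    """Recompute each link; any tampering breaks a hash or a back-pointer."""
    for i, block in enumerate(chain):
        body = json.dumps({"data": block["data"], "prev": block["prev"]},
                          sort_keys=True)
        if hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("genesis", "0")]
chain.append(make_block("share ML model v1", chain[-1]["hash"]))
chain.append(make_block("share dataset D", chain[-1]["hash"]))
assert verify(chain)
chain[1]["data"] = "tampered"   # non-repudiation: the edit is detectable
assert not verify(chain)
```

The same property is what lets a KDN trust shared data, knowledge, and model updates without a central authority.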

Article
Route Optimization of Unmanned Aerial Vehicle Sensors for Localization of Wireless Emitters in Outdoor Environments
Network 2023, 3(3), 326-342; https://doi.org/10.3390/network3030016 - 18 Aug 2023
Abstract
Localization methods of unknown emitters are used for the monitoring of illegal radio waves. Localization methods using ground-based sensors suffer from degraded accuracy in environments where the path between the emitter and the sensor is non-line-of-sight (NLoS). Therefore, research is being conducted to improve localization accuracy by utilizing Unmanned Aerial Vehicles (UAVs) as sensors to ensure a line-of-sight (LoS) condition. However, UAVs can fly freely over the sky, making it difficult to optimize flight paths based on particle swarm optimization (PSO) for efficient and accurate localization. This paper examines the optimization of UAV flight paths to achieve highly efficient and accurate outdoor localization of unknown emitters via two approaches: a circular orbit and a free-path trajectory. Our numerical results reveal the improved localization estimation error performance of the proposed approach. In particular, when evaluated at the 90th percentile of the error’s cumulative distribution function (CDF), the proposed approach reaches an error of 28.59 m with a circular orbit and 12.91 m with a free-path orbit, compared to the conventional fixed-sensor case, whose localization estimation error is 55.02 m.
(This article belongs to the Special Issue Innovative Mobile Computing, Communication, and Sensing Systems)
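The abstract above builds on particle swarm optimization. As a hedged illustration of the underlying optimizer only (the paper's objective, constraints, and parameters are not reproduced here), a minimal PSO minimizing a toy range-error surface:

```python
import random
random.seed(7)

def pso(cost, dim=2, swarm=20, iters=100, bounds=(-50.0, 50.0)):
    """Minimal particle swarm optimizer: each particle tracks its personal
    best; the swarm shares a global best that steers all velocities."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)[:]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]                      # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest

# Toy stand-in for the paper's objective: steer a UAV waypoint toward an
# emitter at (10, -20) (unknown to the optimizer) by minimizing range error.
emitter = (10.0, -20.0)
best = pso(lambda p: (p[0] - emitter[0]) ** 2 + (p[1] - emitter[1]) ** 2)
assert abs(best[0] - emitter[0]) < 2 and abs(best[1] - emitter[1]) < 2
```

The paper's contribution is in how the waypoints are constrained (circular orbit vs. free path), not in the basic PSO loop sketched here.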

Article
Arithmetic Study about Efficiency in Network Topologies for Data Centers
Network 2023, 3(3), 298-325; https://doi.org/10.3390/network3030015 - 26 Jun 2023
Abstract
Data centers are receiving more and more attention due to the rapid increase of IoT deployments, which may result in the implementation of smaller facilities closer to the end users as well as larger facilities up in the cloud. In this paper, an arithmetic study has been carried out to measure a coefficient related to both the average number of hops among nodes and the average number of links among devices for a range of typical network topologies fit for data centers. Such topologies are either tree-like or graph-like designs, where this coefficient provides a balance between performance and simplicity: lower coefficient values indicate a better compromise between the two factors in redundant architectures. The motivation of this contribution is to craft a coefficient that is easy to calculate by applying simple arithmetic operations. This coefficient can be seen as another tool to compare network topologies in data centers, acting as a tie-breaker when other parameters offer contradictory results.
(This article belongs to the Special Issue Advances on Networks and Cyber Security)
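The abstract names the two ingredients of the coefficient (average hops and average links) but not the exact formula. A sketch of both ingredients, combined here in an assumed way purely for illustration (the paper's actual coefficient may differ):

```python
from collections import deque

def avg_hops(adj):
    """Average shortest-path length (in hops) over all node pairs, via BFS."""
    nodes = list(adj)
    total = pairs = 0
    for s in nodes:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for t in nodes:
            if t != s:
                total += dist[t]
                pairs += 1
    return total / pairs

def avg_links(adj):
    """Average number of links per device (mean degree)."""
    return sum(len(adj[u]) for u in adj) / len(adj)

def coefficient(adj):
    # Hypothetical combination: lower values would mean a better balance of
    # performance (few hops) and simplicity (few links).
    return avg_hops(adj) * avg_links(adj)

ring6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}        # sparse, long paths
mesh6 = {i: [j for j in range(6) if j != i] for i in range(6)}   # dense, 1-hop paths
assert coefficient(ring6) < coefficient(mesh6)
```

On six nodes, the ring's coefficient (1.8 hops × 2 links = 3.6) beats the full mesh's (1 × 5 = 5) under this assumed product, illustrating the performance/simplicity trade-off the paper quantifies.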

Review
Recent Development of Emerging Indoor Wireless Networks towards 6G
Network 2023, 3(2), 269-297; https://doi.org/10.3390/network3020014 - 12 May 2023
Abstract
Sixth-generation (6G) mobile technology is currently under development, and is envisioned to fulfill the requirements of a fully connected world, providing ubiquitous wireless connectivity for diverse users and emerging applications. Transformative solutions are expected to drive the surge to accommodate a rapidly growing number of intelligent devices and services. In this regard, wireless local area networks (WLANs) have a major role to play in indoor spaces, from supporting explosive growth in high-bandwidth applications to massive sensor arrays with diverse network requirements. Sixth-generation technology is expected to have a superconvergence of networks, including WLANs, to support this growth in applications in multiple dimensions. To this end, this paper comprehensively reviews the latest developments in diverse WLAN technologies, including WiFi, visible light communication, and optical wireless communication networks, as well as their technical capabilities. This paper also discusses how well these emerging WLANs align with supporting 6G requirements. The analyses presented in the paper provide insight into the research opportunities that need to be investigated to overcome the challenges in integrating WLANs in a 6G ecosystem.

Article
Clustered Distributed Learning Exploiting Node Centrality and Residual Energy (CINE) in WSNs
Network 2023, 3(2), 253-268; https://doi.org/10.3390/network3020013 - 23 Apr 2023
Abstract
With the explosion of big data, the implementation of distributed machine learning mechanisms in wireless sensor networks (WSNs) is becoming necessary for reducing the amount of data traveling throughout the network and for identifying anomalies promptly and reliably. In WSNs, this need has to be balanced against the limited energy and processing resources available at the nodes. In this paper, we tackle the resulting complex problem by designing CINE, a multi-criteria protocol for “Clustered distributed learnIng exploiting Node centrality and residual Energy” in WSNs. More specifically, considering the energy and processing capabilities of nodes, we design a scheme that assumes nodes are partitioned into clusters and selects a central node in each cluster, called the cluster head (CH), that executes the training of the machine learning (ML) model for all the other nodes in the cluster, called cluster members (CMs); the CMs are responsible for executing the inference only. Since the CH role consumes more resources, the proposed scheme rotates the CH role among all nodes in the cluster. The protocol has been simulated and tested using real environmental data sets.
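A hedged sketch of the CH selection idea the abstract describes. The weighted score, the weights, and the centrality values are illustrative assumptions, not CINE's actual criteria:

```python
def pick_cluster_head(nodes, w_centrality=0.5, w_energy=0.5):
    """Choose the CH as the node maximizing a weighted mix of centrality
    and residual energy (weights are made up for illustration)."""
    def score(n):
        return w_centrality * n["centrality"] + w_energy * n["energy"]
    return max(nodes, key=score)["id"]

cluster = [
    {"id": "n1", "centrality": 0.9, "energy": 0.2},  # central but drained
    {"id": "n2", "centrality": 0.6, "energy": 0.8},  # good balance
    {"id": "n3", "centrality": 0.3, "energy": 0.9},
]
assert pick_cluster_head(cluster) == "n2"

# Rotation: serving as CH drains energy, which shifts the next choice.
for n in cluster:
    if n["id"] == "n2":
        n["energy"] -= 0.5
assert pick_cluster_head(cluster) == "n3"
```

Re-scoring after each round is what lets the CH role rotate and spread the training cost across the cluster.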

Article
Improvement of Network Flow Using Multi-Commodity Flow Problem
Network 2023, 3(2), 239-252; https://doi.org/10.3390/network3020012 - 04 Apr 2023
Abstract
In recent years, Internet traffic has increased due to its widespread use. This can be attributed to the growth of social games on smartphones and video distribution services with increasingly high image quality. In these situations, a routing mechanism is required to control congestion, but most existing routing protocols select a single optimal path. This concentrates the load on certain links, increasing the risk of congestion, even though the network has redundant paths to the destination node beyond the optimal one. In this study, we propose multipath control based on the multi-commodity flow problem. Comparing the proposed method with OSPF, which is single-path control, and OSPF-ECMP, which is multipath control, we confirmed that the proposed method records higher packet arrival rates, which is expected to reduce congestion.
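A toy illustration, not the paper's formulation, of why splitting a commodity over redundant paths (the core idea of multi-commodity flow routing) lowers the peak link load compared with a single optimal path:

```python
def max_link_load(paths_with_rates):
    """Max load on any link when each path carries its assigned rate."""
    load = {}
    for path, rate in paths_with_rates:
        for link in zip(path, path[1:]):
            load[link] = load.get(link, 0) + rate
    return max(load.values())

# Two edge-disjoint routes from S to D; total demand of 10 units.
p1 = ["S", "A", "D"]
p2 = ["S", "B", "D"]
single = max_link_load([(p1, 10)])           # OSPF-style: one best path
multi = max_link_load([(p1, 5), (p2, 5)])    # MCF-style: split the demand
assert single == 10 and multi == 5
```

With the load halved on every link, queues build more slowly and packet arrival rates improve, which is the effect the paper measures against OSPF and OSPF-ECMP.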

Article
SDN-Based Routing Framework for Elephant and Mice Flows Using Unsupervised Machine Learning
Network 2023, 3(1), 218-238; https://doi.org/10.3390/network3010011 - 02 Mar 2023
Abstract
Software-defined networks (SDNs) have the capability of controlling the efficient movement of data flows through a network to fulfill sufficient flow management and effective usage of network resources. Currently, most data center networks (DCNs) suffer from the exploitation of network resources by large flows (elephant flows) that enter the network at any time, which penalizes smaller flows (mice flows). Therefore, it is crucial to find a solution for identifying flows and finding an appropriate routing path in order to improve the network management system. This work proposes an SDN application to find the best path based on the type of flow, using network performance metrics. These metrics are used to characterize and identify flows as elephant or mice by utilizing unsupervised machine learning (ML) and a thresholding method. A routing algorithm was developed to select the path based on the type of flow. A validation test was performed by testing the proposed framework on different DCN topologies and comparing the performance of an SDN Ryu controller with that of the proposed framework based on three factors: throughput, bandwidth, and data transfer rate. The results show that, 70% of the time, the proposed framework achieves higher performance for different types of flows.
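The abstract combines unsupervised ML with thresholding to separate elephant from mice flows. A minimal sketch using 1-D k-means on flow rates (an assumed stand-in for the paper's actual features and method):

```python
def two_means(sizes, iters=50):
    """Unsupervised split of flow sizes into two clusters (mice vs. elephant)
    with 1-D k-means; returns the midpoint threshold between centroids."""
    c_lo, c_hi = min(sizes), max(sizes)
    for _ in range(iters):
        lo = [s for s in sizes if abs(s - c_lo) <= abs(s - c_hi)]
        hi = [s for s in sizes if abs(s - c_lo) > abs(s - c_hi)]
        c_lo = sum(lo) / len(lo)
        c_hi = sum(hi) / len(hi)
    return (c_lo + c_hi) / 2

flows = [12, 8, 15, 9, 11, 950, 1020, 14, 990, 10]   # KB/s, toy numbers
threshold = two_means(flows)
elephants = [f for f in flows if f > threshold]
assert sorted(elephants) == [950, 990, 1020]
```

Once a flow is labeled, a controller application can install a wide, lightly loaded path for elephants and a low-latency path for mice; the routing step itself is not sketched here.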

Article
Machine Learning Applied to LoRaWAN Network for Improving Fingerprint Localization Accuracy in Dense Urban Areas
Network 2023, 3(1), 199-217; https://doi.org/10.3390/network3010010 - 09 Feb 2023
Abstract
In the area of low-power wireless networks, one technology that many researchers are focusing on relates to positioning methods such as fingerprinting in densely populated urban areas. This work presents an experimental study aimed at quantifying the mean location estimation error in populated areas. Using a dataset provided by the University of Antwerp, a neural network was implemented with the aim of providing the end-device location. In this way, we were able to measure the mean localization error in areas of high urban density. The results obtained show a deviation of less than 150 m in locating the end device. This offset can be reduced to a few meters, provided that there is a greater density of nodes per square meter. This result could enable Internet of Things (IoT) applications to use fingerprinting in place of energy-consuming alternatives.
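As a simplified stand-in for the paper's neural network (fingerprinting is the shared idea; the k-NN model, the gateway layout, and the RSSI values below are assumptions), a nearest-neighbor estimate over a toy radio map:

```python
def knn_locate(fingerprint, radio_map, k=3):
    """Fingerprint localization: average the positions of the k reference
    points whose stored RSSI vectors are closest to the observed one."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(radio_map, key=lambda r: dist(r["rssi"], fingerprint))[:k]
    n = len(nearest)
    return (sum(r["pos"][0] for r in nearest) / n,
            sum(r["pos"][1] for r in nearest) / n)

# Toy radio map: RSSI (dBm) from three gateways at known positions.
radio_map = [
    {"pos": (0.0, 0.0),   "rssi": [-50, -90, -90]},
    {"pos": (100.0, 0.0), "rssi": [-90, -50, -90]},
    {"pos": (0.0, 100.0), "rssi": [-90, -90, -50]},
    {"pos": (50.0, 50.0), "rssi": [-70, -70, -70]},
]
est = knn_locate([-68, -72, -71], radio_map, k=1)
assert est == (50.0, 50.0)
```

The accuracy of any such scheme is bounded by the spacing of the reference points, which matches the paper's observation that a denser node deployment shrinks the error.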

Article
Improving Bundle Routing in a Space DTN by Approximating the Transmission Time of the Reliable LTP
Network 2023, 3(1), 180-198; https://doi.org/10.3390/network3010009 - 03 Feb 2023
Abstract
Because the operation of space networks is carefully planned, it is possible to predict future contact opportunities from link budget analysis using the anticipated positions of the nodes over time. In the standard approach to space delay-tolerant networking (DTN), such knowledge is used by contact graph routing (CGR) to decide the paths for data bundles. However, the computation assumes nearly ideal channel conditions, disregarding the impact of the convergence-layer retransmissions (e.g., as implemented by the Licklider transmission protocol (LTP)). In this paper, the effect of the bundle forwarding time estimation (i.e., the link service time) on routing optimality is analyzed, and an accurate expression for lossy channels is discussed. The analysis is performed first from a general and protocol-agnostic perspective, assuming knowledge of the statistical properties and general features of the contact opportunities. Then, a practical case is studied using the standard space DTN protocol, evaluating the performance improvement of CGR under the proposed forwarding time estimation. The results of this study provide insight into the optimal routing problem for a space DTN and a suggested improvement to the current routing standard.
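The intuition behind the proposed estimation is that the expected service time of a reliable link grows with the loss rate, which an ideal-channel CGR ignores. A deliberately simplified model (not the paper's exact expression) under i.i.d. losses:

```python
def expected_forwarding_time(bundle_bits, rate_bps, rtt_s, loss_prob):
    """Rough expected bundle service time over a reliable ARQ link: each
    attempt costs one transmission plus one round trip for the report, and
    the expected number of attempts under i.i.d. loss p is 1 / (1 - p)."""
    per_attempt = bundle_bits / rate_bps + rtt_s
    return per_attempt / (1.0 - loss_prob)

# 1 MB bundle at 1 Mbit/s with a 10 s deep-space RTT (toy numbers):
ideal = expected_forwarding_time(8e6, 1e6, 10.0, 0.0)   # no loss
lossy = expected_forwarding_time(8e6, 1e6, 10.0, 0.5)   # 50% frame loss
assert lossy == 2 * ideal   # losses double the effective service time
```

Feeding the lossy estimate, rather than the ideal one, into the contact graph changes which contacts look cheapest, which is how a better forwarding-time expression improves CGR's path choices.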

Article
A Federated Learning-Based Approach for Improving Intrusion Detection in Industrial Internet of Things Networks
Network 2023, 3(1), 158-179; https://doi.org/10.3390/network3010008 - 30 Jan 2023
Abstract
The Internet of Things (IoT) is a network of electrical devices that are connected to the Internet wirelessly. This group of devices generates a large amount of data with information about users, which makes the whole system sensitive and prone to malicious attacks. The rapidly growing number of IoT-connected devices under a centralized ML system could threaten data privacy, and popular centralized machine learning (ML)-assisted approaches are difficult to apply due to their requirement of enormous amounts of data in a central entity. Owing to the growing distribution of data over numerous networks of connected devices, decentralized ML solutions are needed. In this paper, we propose a Federated Learning (FL) method for detecting unwanted intrusions to guarantee the protection of IoT networks. This method ensures privacy and security by federated training on local IoT device data. Local IoT clients share only parameter updates with a central global server, which aggregates them and distributes an improved detection algorithm. After each round of FL training, each IoT client receives an updated model from the global server and continues training on its local dataset, so IoT devices can keep their own privacy intact while optimizing the overall model. To evaluate the efficiency of the proposed method, we conducted exhaustive experiments on a new dataset named Edge-IIoTset. The performance evaluation demonstrates the reliability and effectiveness of the proposed intrusion detection model, which achieves an accuracy (92.49%) close to that of conventional centralized ML models (93.92%) while using the FL method.
(This article belongs to the Special Issue Networking Technologies for Cyber-Physical Systems)
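The aggregation step the abstract describes can be sketched with federated averaging (FedAvg is an assumed concrete choice; the paper's exact aggregation rule may differ):

```python
def fed_avg(client_weights, client_sizes):
    """Federated averaging: the server combines client model parameters
    weighted by local dataset size; raw data never leaves the clients."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [sum(w[d] * n for w, n in zip(client_weights, client_sizes)) / total
            for d in range(dim)]

# Three IoT clients report locally trained parameters (toy 2-D models).
updates = [[1.0, 0.0], [3.0, 2.0], [2.0, 1.0]]
sizes = [100, 100, 200]          # local dataset sizes
global_model = fed_avg(updates, sizes)
assert global_model == [2.0, 1.0]
```

Only the parameter vectors cross the network; each client's traffic records stay on the device, which is the privacy argument the paper makes for FL-based intrusion detection.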

Article
Formal Algebraic Model of an Edge Data Center with a Redundant Ring Topology
Network 2023, 3(1), 142-157; https://doi.org/10.3390/network3010007 - 30 Jan 2023
Abstract
Data center organization and optimization present the opportunity to design systems with specific characteristics. In this sense, the combination of artificial intelligence methodology and sustainability may lead to optimal topologies with enhanced features, whilst taking care of the environment by lowering carbon emissions. In this paper, a model for a field monitoring system is proposed, where an edge data center topology in the form of a redundant ring has been designed to join together nodes spread apart while providing redundancy. Additionally, a formal algebraic model of such a design has been presented and verified.

Article
IoT and Blockchain Integration: Applications, Opportunities, and Challenges
Network 2023, 3(1), 115-141; https://doi.org/10.3390/network3010006 - 24 Jan 2023
Abstract
During the recent decade, two variants of evolving computing networks have augmented the Internet: (i) the Internet of Things (IoT) and (ii) Blockchain Networks (BCNs). The IoT is a network of heterogeneous digital devices embedded with sensors and software for various automation and monitoring purposes. A Blockchain Network is a broadcast network of computing nodes provisioned for validating digital transactions and recording the “well-formed” transactions in a unique data storage called a blockchain ledger. The power of a blockchain network is that (ideally) every node maintains its own copy of the ledger and takes part in validating the transactions. Integrating IoT and BCNs brings promising applications in many areas, including education, health, finance, agriculture, industry, and the environment. However, the complex, dynamic, and heterogeneous computing and communication needs of IoT technologies, optionally integrated with blockchain technologies (if mandated), raise several challenges regarding scaling, interoperability, and security goals. In recent years, numerous models integrating IoT with blockchain networks have been proposed, tested, and deployed for businesses, and numerous studies are underway to uncover the applications of IoT and blockchain technology. However, a close look reveals that very few applications successfully cater to the security needs of an enterprise, and it makes little sense to integrate blockchain technology with an existing IoT system that already serves the security needs of an enterprise. In this article, we investigate several frameworks for IoT operations, the applicability of integrating them with blockchain technology, and the security considerations that security personnel must make during the deployment and operation of IoT and BCNs. Furthermore, we discuss the underlying security concerns and recommendations for blockchain-integrated IoT networks.

Article
Edge Data Center Organization and Optimization by Using Cage Graphs
Network 2023, 3(1), 93-114; https://doi.org/10.3390/network3010005 - 18 Jan 2023
Abstract
Data center organization and optimization are increasingly receiving attention due to the ever-growing deployments of edge and fog computing facilities. The main aim is to achieve a topology that processes the traffic flows as fast as possible, which depends not only on AI-based computing resources but also on the network interconnection among physical hosts. In this paper, graph theory is introduced due to its features related to network connectivity and stability, which lead to more resilient and sustainable deployments, where cage graphs may have an advantage over the rest. In this context, the Petersen graph is studied as a convenient candidate for small data centers due to its small number of nodes and small network diameter, thus providing an interesting solution for edge and fog data centers.
(This article belongs to the Special Issue Advances in Edge and Cloud Computing)
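The Petersen graph properties the abstract appeals to (few nodes, small diameter) are easy to check directly. A short verification of the standard construction:

```python
from collections import deque

# Standard Petersen graph: outer 5-cycle (0-4), inner pentagram (5-9), spokes.
adj = {v: set() for v in range(10)}
for i in range(5):
    adj[i] |= {(i + 1) % 5, (i - 1) % 5, i + 5}          # outer ring + spoke
    adj[i + 5] |= {5 + (i + 2) % 5, 5 + (i - 2) % 5, i}  # inner pentagram

def diameter(adj):
    """Longest shortest path over all pairs, by BFS from every node."""
    worst = 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        worst = max(worst, max(dist.values()))
    return worst

assert len(adj) == 10                          # only 10 hosts needed
assert all(len(n) == 3 for n in adj.values())  # 3-regular: 15 links total
assert diameter(adj) == 2                      # any two hosts within 2 hops
```

Ten switches, three ports each, and a worst case of two hops between any pair is the combination that makes the topology attractive for small edge data centers.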

Editorial
Acknowledgment to the Reviewers of Network in 2022
Network 2023, 3(1), 91-92; https://doi.org/10.3390/network3010004 - 17 Jan 2023
Abstract
High-quality academic publishing is built on rigorous peer review [...]
Review
On Attacking Future 5G Networks with Adversarial Examples: Survey
Network 2023, 3(1), 39-90; https://doi.org/10.3390/network3010003 - 30 Dec 2022
Abstract
The introduction of 5G technology, along with the exponential growth in connected devices, is expected to challenge efficient and reliable network resource allocation. Network providers are now required to dynamically create and deploy multiple services which function under various requirements in different vertical sectors while operating on top of the same physical infrastructure. The recent progress in artificial intelligence and machine learning is theorized to be a potential answer to the arising resource allocation challenges. It is therefore expected that future-generation mobile networks will heavily depend on their artificial intelligence components, which may result in those components becoming a high-value attack target. In particular, a smart adversary may exploit vulnerabilities of the state-of-the-art machine learning models deployed in a 5G system to initiate an attack. This study focuses on the analysis of adversarial example generation attacks against machine learning-based frameworks that may be present in next-generation networks. First, the various AI/ML algorithms and the data used for their training and evaluation in mobile networks are discussed. Next, multiple AI/ML applications found in recent scientific papers devoted to 5G are overviewed. After that, existing adversarial example generation attack algorithms are reviewed, and frameworks which employ these algorithms for fuzzing state-of-the-art AI/ML models are summarized. Finally, adversarial example generation attacks against several of the AI/ML frameworks described are presented.
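A minimal adversarial example in the FGSM family, one of the attack classes such surveys cover. The logistic "traffic classifier," its weights, and the numbers are all illustrative assumptions, not taken from the survey:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fgsm(x, w, b, y, eps):
    """Fast Gradient Sign Method against a logistic model p = sigmoid(w.x + b):
    perturb each feature by eps in the direction that increases the loss.
    For label y, d(loss)/dx_i has the sign of (p - y) * w_i."""
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    return [xi + eps * (1 if (p - y) * wi > 0 else -1)
            for xi, wi in zip(x, w)]

# Toy traffic-feature classifier (weights made up for illustration).
w, b = [2.0, -1.0], 0.0
x = [1.0, 0.5]                      # classified positive: w.x = 1.5 > 0
adv = fgsm(x, w, b, y=1, eps=1.0)
score = sum(wi * xi for wi, xi in zip(w, adv)) + b
assert score < 0                    # the perturbed sample flips the decision
```

A bounded per-feature perturbation is enough to flip a linear decision, which is why ML components embedded in network control loops become attractive attack targets.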