Search Results (11,380)

Search Parameters:
Journal = Entropy

Article
Quantum-Walk-Inspired Dynamic Adiabatic Local Search
Entropy 2023, 25(9), 1287; https://doi.org/10.3390/e25091287 - 31 Aug 2023
Abstract
We investigate the irreconcilability issue that arises when translating the search algorithm from the Continuous Time Quantum Walk (CTQW) framework to the Adiabatic Quantum Computing (AQC) framework. For the AQC formulation to evolve along the same path as the CTQW, it requires a constant energy gap in the Hamiltonian throughout the AQC schedule. To resolve the constant gap issue, we modify the CTQW-inspired AQC catalyst Hamiltonian from an XZ operator to a Z oracle operator. Through simulation, we demonstrate that the total running time of the proposed AQC approach with the modified catalyst Hamiltonian remains as optimal as that of the CTQW. Inspired by this solution, we further investigate adaptive scheduling of the catalyst Hamiltonian and its coefficient function along the adiabatic path of Grover-inspired AQC to improve adiabatic local search.
(This article belongs to the Special Issue Advances in Quantum Computing)
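The constant-gap issue above can be made concrete with the textbook Grover-type adiabatic interpolation with no catalyst term. The sketch below is an illustrative baseline, not the authors' modified-catalyst construction: it shows numerically that the spectral gap of H(s) = (1 - s)H0 + s Hf closes to 1/sqrt(N) at mid-schedule.

```python
import numpy as np

n = 3                  # qubits; N = 2**n basis states (toy size)
N = 2 ** n
m = 5                  # index of the marked item (arbitrary choice)

psi0 = np.full(N, 1 / np.sqrt(N))       # uniform superposition
H0 = np.eye(N) - np.outer(psi0, psi0)   # initial Hamiltonian I - |psi0><psi0|
Hf = np.eye(N)
Hf[m, m] = 0.0                          # oracle Hamiltonian I - |m><m|

def gap(s):
    """Gap between the two lowest eigenvalues of H(s) = (1-s)H0 + sHf."""
    evals = np.linalg.eigvalsh((1 - s) * H0 + s * Hf)
    return evals[1] - evals[0]

gaps = [gap(s) for s in np.linspace(0.0, 1.0, 201)]
min_gap = min(gaps)    # attained at s = 1/2, equal to 1/sqrt(N)
```

Because the minimum gap shrinks as 1/sqrt(N), a naive linear schedule needs total time of order N; catalyst terms and adaptive schedules of the kind the abstract studies are ways to concentrate evolution time where the gap is smallest.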
Article
Effective Excess Noise Suppression in Continuous-Variable Quantum Key Distribution through Carrier Frequency Switching
Entropy 2023, 25(9), 1286; https://doi.org/10.3390/e25091286 - 31 Aug 2023
Abstract
Continuous-variable quantum key distribution (CV-QKD) is a promising protocol that can be easily integrated with classical optical communication systems. However, in quantum-classical co-transmission, such as dense wavelength division multiplexing with classical channels or time division multiplexing with a large-power classical signal, the quantum signal is more susceptible to crosstalk caused by the classical signal, leading to signal distortion and reduced key distribution performance. To address this issue, we propose a noise-suppression scheme based on carrier frequency switching (CFS) that can effectively mitigate the influence of large-power random noise on the weak coherent state. In this scheme, a minimum-value window of the channel's noise power spectrum is searched for, and the transmission signal spectrum is shifted to the corresponding frequency to avoid large-power channel noise. A digital filter is also utilized to filter out most of the channel noise. Simulation results show that, compared to the traditional fixed-carrier-frequency scheme, the proposed scheme can reduce the excess noise to 1.8%, and the secret key rate can be increased by 1.43 to 2.86 times at different distances. This scheme is expected to be applied in scenarios like quantum-classical co-transmission and multi-QKD co-transmission.
(This article belongs to the Special Issue Quantum and Classical Physical Cryptography)
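The core of the CFS scheme, searching the channel noise power spectrum for a minimum-value window and moving the carrier there, can be sketched as follows. The function name, the window width, and the toy channel are illustrative assumptions, not details from the paper.

```python
import numpy as np

def quietest_carrier(noise, fs, window_hz=2e6):
    """Return the centre frequency of the minimum-power window of the
    channel noise spectrum, i.e. where the carrier would be switched to."""
    psd = np.abs(np.fft.rfft(noise)) ** 2
    freqs = np.fft.rfftfreq(len(noise), d=1.0 / fs)
    bins = max(1, int(window_hz / (freqs[1] - freqs[0])))
    # Sliding-window average of the power spectral density
    smoothed = np.convolve(psd, np.ones(bins) / bins, mode="valid")
    start = int(np.argmin(smoothed))          # quietest window
    return freqs[start + bins // 2]

# Toy channel: white noise plus a strong classical interferer at 10 MHz
rng = np.random.default_rng(0)
fs = 50e6
t = np.arange(4096) / fs
noise = rng.normal(size=t.size) + 20.0 * np.sin(2 * np.pi * 10e6 * t)
fc = quietest_carrier(noise, fs)              # lands well away from 10 MHz
```

A digital bandpass filter centred on the returned frequency would then remove most of the out-of-window channel noise, as described in the abstract.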
Article
Spectrum Sensing Based on Hybrid Spectrum Handoff in Cognitive Radio Networks
Entropy 2023, 25(9), 1285; https://doi.org/10.3390/e25091285 - 31 Aug 2023
Abstract
The rapid advancement of wireless communication combined with insufficient spectrum exploitation opens the door for the expansion of novel wireless services. Cognitive radio network (CRN) technology makes it possible to periodically access the open spectrum bands, which in turn improves the effectiveness of CRNs. Spectrum sensing (SS), which allows unlicensed users to locate open spectrum bands, plays a fundamental part in CRNs. A precise approximation of the power spectrum is essential to accomplish this. On the assumption that each secondary user's (SU's) parameter vector contains some globally and some partially shared parameters, spectrum sensing is viewed as a parameter estimation problem. Distributed and cooperative spectrum sensing (CSS) is a key component of this concept. This work introduces a new component-specific cooperative spectrum sensing model (CSCSSM) for CRNs that considers the amplitude and phase components of the input signal, including Component Specific Adaptive Estimation (CSAE) for the mean squared deviation (MSD) formulation. The proposed concept ensures minimum information loss compared to traditional methods that compute errors among the direct signal vectors. The experimental results and performance analysis prove the robustness and efficiency of the proposed work over the traditional methods.
Article
Defining the Scale to Build Complex Networks with a 40-Year Norwegian Intraplate Seismicity Dataset
Entropy 2023, 25(9), 1284; https://doi.org/10.3390/e25091284 - 31 Aug 2023
Abstract
We present a new complex network-based study focused on intraplate earthquakes recorded in southern Norway during the period 1980–2020. One of the most recognized limitations of spatial complex network procedures and analyses concerns the definition of an adequate cell size, which is the focus of this approach. In the present study, we analyze the influence of observational errors in the hypocentral and epicentral locations of seismic events on the construction of a complex network, looking for the best cell size to build it and to develop a basis for interpreting the results in terms of the structure of the complex network in this seismic region. We focus the analysis on the degree distribution of the complex networks. We observed a strong effect of the cell size on the slope of the degree distribution of the nodes, the critical exponent γ. Based on the Abe–Suzuki method, the slope (γ) showed negligible variation between the construction of 3- and 2-dimensional complex networks. The results were also very similar for complex networks built with subsets of the seismic events. These results suggest a weak influence of the observational errors in latitude, longitude, and depth on the outcomes of this particular methodology for this high-quality dataset. They also imply stable behavior of the complex network, which shows a structure of hubs for small cell sizes and a more homogeneous degree distribution as the cell size increases. In all the analyses, the γ parameter showed smaller error bars for larger cell sizes. To preserve the hub structure while keeping error bars small, the best range of cell side lengths was determined to be between 8 and 16 km. These values can henceforth be used as the most stable cell sizes for complex network studies in southern Norway.
(This article belongs to the Special Issue Complexity and Statistical Physics Approaches to Earthquakes)
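The cell-size question can be made concrete with an Abe–Suzuki-style construction: hypocentres are binned into cubic cells, and the cells of successive events are linked. The sketch below uses a synthetic catalogue; the gridding rule and data are illustrative, not the Norwegian dataset.

```python
import numpy as np

def seismic_degrees(events_km, cell_km):
    """Abe-Suzuki-style construction (sketch): bin hypocentres into cubic
    cells of side cell_km and link the cells of successive events.
    Returns the degree (number of distinct neighbour cells) per node."""
    cells = np.floor(events_km / cell_km).astype(int)
    neighbours = {tuple(c): set() for c in cells}
    for a, b in zip(cells[:-1], cells[1:]):
        a, b = tuple(a), tuple(b)
        if a != b:                      # ignore self-transitions
            neighbours[a].add(b)
            neighbours[b].add(a)
    return {node: len(nb) for node, nb in neighbours.items()}

# Synthetic catalogue: (x, y, depth) in km; a real study would use the
# 1980-2020 southern Norway events
rng = np.random.default_rng(1)
events = rng.uniform(0.0, 100.0, size=(500, 3))
deg_8 = seismic_degrees(events, cell_km=8.0)    # finer grid, more nodes
deg_16 = seismic_degrees(events, cell_km=16.0)  # coarser grid, fewer nodes
```

Fitting a power law to the degree distribution of such a network yields the critical exponent γ; repeating the fit across cell sizes reproduces the kind of sensitivity analysis the abstract describes.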
Article
Nighttime Image Stitching Method Based on Image Decomposition Enhancement
Entropy 2023, 25(9), 1282; https://doi.org/10.3390/e25091282 - 31 Aug 2023
Abstract
Image stitching technology aligns and fuses a series of images with common pixel areas, taken from different viewpoints of the same scene, to produce a wide-field-of-view panoramic image with natural structure. Nighttime scenes are an important part of everyday life, and nighttime image stitching is of urgent practical significance in fields such as security monitoring and intelligent driving at night. Due to artificial light sources at night, image brightness is unevenly distributed and large dark areas appear; these dark areas often contain rich structural information. The structural features hidden in darkness are difficult to extract, causing ghosting and misalignment during stitching that make it hard to meet practical application requirements. Therefore, a nighttime image stitching method based on image decomposition enhancement is proposed to address the problem of insufficient line-feature extraction when stitching nighttime images. The proposed algorithm performs luminance enhancement on the structure layer, smooths nighttime image noise with a denoising algorithm on the texture layer, and finally complements the texture of the fused image with an edge enhancement algorithm. Experimental results show that the proposed algorithm improves image quality in terms of information entropy, contrast, and noise suppression compared with other algorithms. Moreover, the proposed algorithm extracts the most line features from the processed nighttime images, which is more helpful for stitching nighttime images.
(This article belongs to the Topic Color Image Processing: Models and Methods (CIP: MM))
Article
Systems of Precision: Coherent Probabilities on Pre-Dynkin Systems and Coherent Previsions on Linear Subspaces
Entropy 2023, 25(9), 1283; https://doi.org/10.3390/e25091283 - 31 Aug 2023
Abstract
In the literature on imprecise probability, little attention is paid to the fact that imprecise probabilities are precise on a set of events. We call these sets systems of precision. We show that, under mild assumptions, the system of precision of a lower and upper probability forms a so-called (pre-)Dynkin system. Interestingly, there are several settings, ranging from machine learning on partial data over frequential probability theory to quantum probability theory and decision making under uncertainty, in which, a priori, the probabilities are only desired to be precise on a specific underlying set system. Here, (pre-)Dynkin systems have been adopted as systems of precision, too. We show that, under extendability conditions, those pre-Dynkin systems equipped with probabilities can be embedded into algebras of sets. Surprisingly, the extendability conditions elaborated in a strand of work in quantum probability are equivalent to coherence from the imprecise probability literature. On this basis, we spell out a lattice duality which relates systems of precision to credal sets of probabilities. We conclude the presentation with a generalization of the framework to expectation-type counterparts of imprecise probabilities. The analogue of pre-Dynkin systems turns out to be (sets of) linear subspaces in the space of bounded, real-valued functions. We introduce partial expectations, natural generalizations of probabilities defined on pre-Dynkin systems. Again, coherence and extendability are equivalent. A related but more general lattice duality preserves the relation between systems of precision and credal sets of probabilities.
(This article belongs to the Special Issue Quantum Probability and Randomness IV)
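For readers unfamiliar with the terminology, one standard formulation of the underlying set systems is the following sketch; the paper's exact definitions may differ in detail.

```latex
\textbf{Definition (sketch).} A family $\mathcal{D} \subseteq 2^{\Omega}$ is a
\emph{Dynkin system} if (i) $\Omega \in \mathcal{D}$; (ii) $A \in \mathcal{D}$
implies $\Omega \setminus A \in \mathcal{D}$; (iii) for pairwise disjoint
$A_1, A_2, \ldots \in \mathcal{D}$, $\bigcup_i A_i \in \mathcal{D}$.
A \emph{pre-Dynkin system} requires closure only under finite disjoint unions.
A probability $P$ given only on $\mathcal{D}$ induces lower and upper
envelopes outside it, e.g.
$\underline{P}(B) = \sup\{P(A) : A \in \mathcal{D},\, A \subseteq B\}$ and
$\overline{P}(B)  = \inf\{P(A) : A \in \mathcal{D},\, B \subseteq A\}$,
which agree on $\mathcal{D}$.
```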
Article
Origins of Genetic Coding: Self-Guided Molecular Self-Organisation
Entropy 2023, 25(9), 1281; https://doi.org/10.3390/e25091281 - 31 Aug 2023
Abstract
The origin of genetic coding is characterised as an event of cosmic significance in which quantum mechanical causation was transcended by constructive computation. Computational causation entered the physico-chemical processes of the pre-biotic world by the incidental satisfaction of a condition of reflexivity between polymer sequence information and system elements able to facilitate their own production through translation of that information. This event, which has previously been modelled in the dynamics of Gene–Replication–Translation systems, is properly described as a process of self-guided self-organisation. The spontaneous emergence of a primordial genetic code between two-letter alphabets of nucleotide triplets and amino acids is easily possible, starting with random peptide synthesis that is RNA-sequence-dependent. The evident self-organising mechanism is the simultaneous quasi-species bifurcation of the populations of information-carrying genes and enzymes with aminoacyl-tRNA synthetase-like activities. This mechanism allowed the code to evolve very rapidly to the ~20 amino acid limit apparent for the reflexive differentiation of amino acid properties using protein catalysts. The self-organisation of semantics in this domain of physical chemistry conferred on emergent molecular biology exquisite computational control over the nanoscopic events needed for its self-construction.
(This article belongs to the Special Issue Recent Advances in Guided Self-Organization)
Article
HE-YOLOv5s: Efficient Road Defect Detection Network
Entropy 2023, 25(9), 1280; https://doi.org/10.3390/e25091280 - 31 Aug 2023
Abstract
In recent years, the number of traffic accidents caused by road defects has increased dramatically worldwide, making the repair and prevention of road defects an urgent task. Researchers in different countries have proposed many models for this task, but most are either accurate but slow in detection or fast but inaccurate, and models that achieve both good accuracy and speed often generalize poorly to other datasets. Given this, this paper takes YOLOv5s as a baseline model and proposes an optimized model for road defect detection. First, we significantly reduce the model's parameters by pruning it and removing unimportant modules, propose an improved Spatial Pyramid Pooling-Fast (SPPF) module to improve feature fusion, and finally add an attention module to focus on key information. The activation function, sampling method, and other strategies were also replaced in this study. Test results on the Global Road Damage Detection Challenge (GRDDC) dataset show that the proposed model is not only faster (higher FPS) than the baseline model but also improves mAP by 2.08%, while the model size is reduced by 6.07 M.
Editorial
Quantum Chaos—Dedicated to Professor Giulio Casati on the Occasion of His 80th Birthday
Entropy 2023, 25(9), 1279; https://doi.org/10.3390/e25091279 - 31 Aug 2023
Abstract
Quantum chaos is the study of phenomena in the quantum domain which correspond to classical chaos [...]
Article
Investigation of Feature Engineering Methods for Domain-Knowledge-Assisted Bearing Fault Diagnosis
Entropy 2023, 25(9), 1278; https://doi.org/10.3390/e25091278 - 30 Aug 2023
Abstract
The engineering challenge of rolling bearing condition monitoring has led to a large number of method developments over the past few years. Most commonly, vibration measurement data are used for fault diagnosis using machine learning algorithms. In current research, purely data-driven deep learning methods are becoming increasingly popular, aiming for accurate predictions of bearing faults without requiring bearing-specific domain knowledge. Opposing this trend in popularity, the present paper takes a more traditional approach, incorporating domain knowledge by evaluating a variety of feature engineering methods in combination with a random forest classifier. For a comprehensive feature engineering study, a total of 42 mathematical feature formulas are combined with the preprocessing methods of envelope analysis, empirical mode decomposition, wavelet transforms, and frequency band separations. While each single processing method and feature formula is known from the literature, the presented paper contributes to the body of knowledge by investigating novel series connections of processing methods and feature formulas. Using the CWRU bearing fault data for performance evaluation, feature calculation based on the processing method of frequency band separation leads to particularly high prediction accuracies, while at the same time being very efficient in terms of low computational effort. Additionally, in comparison with deep learning approaches, the proposed feature engineering method provides excellent accuracies and enables explainability.
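As an illustration of the frequency-band-separation preprocessing, the sketch below splits a vibration signal into bands via FFT masking and computes two simple statistical features (RMS and kurtosis) per band. The band edges and feature choices are illustrative stand-ins; the paper's 42 feature formulas are not reproduced here.

```python
import numpy as np

def band_features(signal, fs, bands):
    """Frequency-band separation followed by per-band RMS and kurtosis."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    feats = []
    for lo, hi in bands:
        # Zero out everything outside [lo, hi) and transform back
        masked = np.where((freqs >= lo) & (freqs < hi), spectrum, 0)
        x = np.fft.irfft(masked, n=signal.size)
        rms = np.sqrt(np.mean(x ** 2))
        kurt = np.mean((x - x.mean()) ** 4) / (np.var(x) ** 2 + 1e-12)
        feats.extend([rms, kurt])
    return np.array(feats)

rng = np.random.default_rng(0)
fs = 12_000                        # typical vibration sampling rate
t = np.arange(fs) / fs             # one second of synthetic data
sig = np.sin(2 * np.pi * 157 * t) + 0.3 * rng.normal(size=t.size)
f = band_features(sig, fs, bands=[(0, 500), (500, 2000), (2000, 6000)])
```

The concatenated feature vectors would then be fed to a random forest classifier, as in the study.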
Article
Multi-UAV Cooperative Trajectory Planning Based on the Modified Cheetah Optimization Algorithm
Entropy 2023, 25(9), 1277; https://doi.org/10.3390/e25091277 - 30 Aug 2023
Abstract
The capacity for autonomous functionality serves as the fundamental ability and driving force for the cross-generational upgrading of unmanned aerial vehicles (UAVs). With the disruptive transformation of artificial intelligence technology, autonomous trajectory planning based on intelligent algorithms has emerged as a key technique for enhancing UAVs' capacity for autonomous behavior, and thus holds significant research value. To address the challenges of UAV trajectory planning in complex 3D environments, this paper proposes a multi-UAV cooperative trajectory-planning method based on a Modified Cheetah Optimization (MCO) algorithm. Firstly, a spatiotemporal cooperative trajectory-planning model is established, incorporating UAV cooperative constraints and performance constraints; evaluation criteria including fuel consumption, altitude, and threat-distribution-field cost functions are introduced. Then, building on its parent Cheetah Optimization (CO) algorithm, the MCO algorithm incorporates a logistic chaotic mapping strategy and an adaptive search agent strategy, improving the home-returning mechanism. Finally, extensive simulation experiments are conducted on a large test set of benchmark functions with four characteristics: unimodal, multimodal, separable, and inseparable. Meanwhile, a dimensionality-reduction search strategy is employed to solve the autonomous trajectory-planning problem in real-world scenarios. The simulation results show that the MCO algorithm outperforms several related algorithms, with smaller trajectory costs, faster convergence, and more stable performance, demonstrating the correctness and effectiveness of the proposed approach to multi-UAV cooperative trajectory planning.
(This article belongs to the Section Multidisciplinary Applications)
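The logistic chaotic mapping strategy mentioned above is a common way to initialise a metaheuristic's population; a minimal sketch follows, with parameter names chosen for illustration rather than taken from the paper.

```python
import numpy as np

def logistic_chaos_init(n_agents, dim, lb, ub, r=4.0, seed=0.7):
    """Initialise search agents with a logistic chaotic map,
    x_{k+1} = r * x_k * (1 - x_k), rescaled to the bounds [lb, ub]."""
    x = np.empty((n_agents, dim))
    c = seed                     # chaotic state in (0, 1)
    for i in range(n_agents):
        for j in range(dim):
            c = r * c * (1.0 - c)
            x[i, j] = lb + c * (ub - lb)
    return x

# 20 agents in a 3-dimensional search space bounded by [-10, 10]
pop = logistic_chaos_init(n_agents=20, dim=3, lb=-10.0, ub=10.0)
```

With r = 4 the logistic map is chaotic on (0, 1), so successive values are deterministic yet non-repeating, a property often credited with improving initial population diversity in metaheuristics.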
Article
Generalized Bell Scenarios: Disturbing Consequences on Local-Hidden-Variable Models
Entropy 2023, 25(9), 1276; https://doi.org/10.3390/e25091276 - 30 Aug 2023
Abstract
Bell nonlocality and Kochen–Specker contextuality are among the main topics in the foundations of quantum theory. Both of them are related to stronger-than-classical correlations, with the former usually referring to spatially separated systems, while the latter considers a single system. In recent works, a unified framework for these phenomena was presented. This article reviews, expands, and obtains new results regarding this framework. Contextual and disturbing features inside the local models are explored, which allows for the definition of different local sets with a non-trivial relation among them. The relations between the set of quantum correlations and these local sets are also considered, and post-quantum local behaviours are found. Moreover, examples of correlations that are both local and non-contextual but such that these two classical features cannot be expressed by the same hidden variable model are shown. Extensions of the Fine–Abramsky–Brandenburger theorem are also discussed.
(This article belongs to the Special Issue Quantum Correlations, Contextuality, and Quantum Nonlocality)
Review
Mental Gravity: Depression as Spacetime Curvature of the Self, Mind, and Brain
Entropy 2023, 25(9), 1275; https://doi.org/10.3390/e25091275 - 30 Aug 2023
Abstract
The principle of mental gravity contends that the mind uses physical gravity as a mental model or simulacrum to express the relation between the inner self and the outer world in terms of “UP”-ness and “DOWN”-ness. The simulation of increased gravity characterises a continuum of mental gravity states that includes depression as the paradigmatic example of being down, low, heavy, and slow. The physics of gravity can also be used to model spacetime curvature in depression, particularly gravitational time dilation as a property of mental gravity analogous to subjective time dilation (i.e., the slowing of temporal flow in conscious experience). The principle has profound implications for the Temporo-spatial Theory of Consciousness (TTC) with regard to the temporo-spatial alignment that establishes a “world-brain relation” centred on embodiment and the socialisation of conscious states. The principle of mental gravity provides the TTC with a way to incorporate the structure of the world into the structure of the brain, conscious experience, and thought. In concert with other theories of cognitive and neurobiological spacetime, the TTC can also work towards the “common currency” approach that potentially connects it to predictive processing frameworks such as free energy, neuronal gauge theories, and active inference accounts of depression. The principle gives the up/down dimension of space, as defined by the gravitational field, a unique status connected both to our embodied interaction with the physical world and to the inverse, reflective, emotional but still embodied experience of ourselves.
(This article belongs to the Special Issue Temporo-Spatial Theory of Consciousness (TTC))
Article
A Semi-Supervised Stacked Autoencoder Using the Pseudo Label for Classification Tasks
Entropy 2023, 25(9), 1274; https://doi.org/10.3390/e25091274 - 30 Aug 2023
Abstract
The low efficiency and cognitive limitations of manual sample labeling result in large numbers of unlabeled training samples in practical applications. Making full use of both labeled and unlabeled samples is the key to solving the semi-supervised problem. However, as a supervised algorithm, the stacked autoencoder (SAE) considers only labeled samples and is difficult to apply to semi-supervised problems. Thus, by introducing the pseudo-labeling method into the SAE, a novel pseudo-label-based semi-supervised stacked autoencoder (PL-SSAE) is proposed to address semi-supervised classification tasks. The PL-SSAE first performs unsupervised pre-training on all samples with the autoencoder (AE) to initialize the network parameters. Then, through iterative fine-tuning of the network parameters on the labeled samples, the unlabeled samples are identified and their pseudo labels are generated. Finally, the pseudo-labeled samples are used to construct a regularization term and fine-tune the network parameters to complete the training of the PL-SSAE. Unlike the traditional SAE, the PL-SSAE uses all samples in pre-training and the unlabeled samples with pseudo labels in fine-tuning to fully exploit the feature and category information of the unlabeled samples. Empirical evaluations on various benchmark datasets show that the semi-supervised performance of the PL-SSAE is more competitive than that of the SAE, sparse stacked autoencoder (SSAE), semi-supervised stacked autoencoder (Semi-SAE) and semi-supervised stacked autoencoder (Semi-SSAE).
(This article belongs to the Section Information Theory, Probability and Statistics)
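The three-stage training logic (supervised fit, pseudo-label generation, regularised fine-tuning) can be illustrated with a deliberately simplified stand-in: a plain logistic regression takes the place of the stacked autoencoder, so only the pseudo-label mechanism is shown. All names and the lam = 0.5 weight are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, w=None, X_pl=None, y_pl=None, lam=0.0,
                 lr=0.1, epochs=200):
    """Gradient descent for logistic regression with an optional
    pseudo-label regularisation term lam * loss(X_pl, y_pl)."""
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(epochs):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)
        if X_pl is not None:
            grad += lam * X_pl.T @ (sigmoid(X_pl @ w) - y_pl) / len(y_pl)
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
# Two Gaussian blobs; only 10 of the 200 samples keep their labels
X = np.vstack([rng.normal(-2.0, 1.0, (100, 2)),
               rng.normal(2.0, 1.0, (100, 2))])
y = np.r_[np.zeros(100), np.ones(100)]
labeled = np.r_[np.arange(5), np.arange(100, 105)]
unlabeled = np.setdiff1d(np.arange(200), labeled)

# Stage 1: supervised fit on the few labeled samples
w = train_logreg(X[labeled], y[labeled])
# Stage 2: generate pseudo labels for the unlabeled samples
pseudo = (sigmoid(X[unlabeled] @ w) > 0.5).astype(float)
# Stage 3: fine-tune with the pseudo-label regularisation term
w = train_logreg(X[labeled], y[labeled], w, X[unlabeled], pseudo, lam=0.5)
accuracy = np.mean((sigmoid(X @ w) > 0.5) == y)
```

In the PL-SSAE itself, stage 1 is additionally preceded by unsupervised autoencoder pre-training on all samples, and the classifier is a stacked autoencoder rather than this linear model.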
Article
Graph Regression Model for Spatial and Temporal Environmental Data—Case of Carbon Dioxide Emissions in the United States
Entropy 2023, 25(9), 1272; https://doi.org/10.3390/e25091272 - 29 Aug 2023
Abstract
We develop a new model for spatio-temporal data. More specifically, a graph penalty function is incorporated in the cost function in order to estimate the unknown parameters of a spatio-temporal mixed-effect model based on a generalized linear model. This model allows for more flexible and general regression relationships than classical linear ones through the use of generalized linear models (GLMs) and also captures the inherent structural dependencies of the data through a regularization based on the graph Laplacian. We use a publicly available dataset from the National Centers for Environmental Information (NCEI) in the United States of America and perform statistical inference on future CO2 emissions in 59 counties. We empirically show how the proposed method outperforms widely used methods, such as ordinary least squares (OLS) and ridge regression, on this challenging problem.
(This article belongs to the Special Issue Spatial–Temporal Data Analysis and Its Applications)
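In the Gaussian special case, the graph-penalised estimator has a closed form: minimising ||y - Zb||^2 + lam * b'Lb gives b = (Z'Z + lam * L)^(-1) Z'y, where L is the graph Laplacian. The toy sketch below uses an illustrative county graph and synthetic data, not the NCEI dataset or the paper's full mixed-effect GLM.

```python
import numpy as np

def graph_ridge(Z, y, L, lam):
    """Penalised least squares: argmin ||y - Z b||^2 + lam * b' L b.
    A Gaussian special case of the GLM-plus-graph-penalty model."""
    return np.linalg.solve(Z.T @ Z + lam * L, Z.T @ y)

# Toy setting: 4 "counties" on a path graph 0-1-2-3, one mean per county
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian D - A
rng = np.random.default_rng(0)
county = rng.integers(0, 4, size=80)
Z = np.eye(4)[county]                   # one-hot design matrix
true_b = np.array([1.0, 1.2, 2.8, 3.0])
y = Z @ true_b + 0.3 * rng.normal(size=80)

b_ols = graph_ridge(Z, y, L, lam=0.0)   # reduces to per-county means
b_graph = graph_ridge(Z, y, L, lam=5.0) # smooths neighbouring counties
```

Relative to ordinary ridge regression (L replaced by the identity), the Laplacian penalty shrinks differences between connected counties rather than the coefficients themselves.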