https://revistas.usal.es/cinco/index.php/2255-2863/issue/feedADCAIJ: Advances in Distributed Computing and Artificial Intelligence Journal2023-09-19T13:42:39+02:00Juan M. CORCHADOadcaij@usal.esOpen Journal Systems<p dir="ltr">The <a title="adcaij" href="http://adcaij.usal.es" target="_blank" rel="noopener">Advances in Distributed Computing and Artificial Intelligence Journal</a> (ISSN: 2255-2863) is an open access journal that publishes articles contributing new results in distributed computing and artificial intelligence, and their application in areas such as the Internet, electronic commerce, mobile communications and wireless devices. These technologies change constantly as a result of the considerable research and technical effort undertaken in both universities and businesses. Authors are invited to contribute articles that present research results, projects, survey work and industrial experiences describing significant advances in these areas of computing.</p> <p dir="ltr">ADCAIJ focuses on the exchange of ideas between scientists and technicians. Both the academic and business communities are essential to facilitate the development of systems that meet the demands of today's society. The journal is supported by the research group and start-up <a title="bisite" href="http://bisite.usal.es/en/research/research-lines" target="_blank" rel="noopener">BISITE</a>.</p> <p dir="ltr">The journal commenced publication in 2012, is published quarterly, and has published 192 peer-reviewed articles.
All articles are written in English.</p> <p dir="ltr">The journal is indexed in DOAJ, ProQuest, Scholar, WorldCat, Dialnet, Sherpa ROMEO, Dulcinea, UlrichWeb, Emerging Sources Citation Index of Thomson Reuters, BASE and Academic Journals Database.</p>https://revistas.usal.es/cinco/index.php/2255-2863/article/view/29081Restricted Computations and Parameters in Type-Theory of Acyclic Recursion2023-09-19T13:42:36+02:00Roussanka Loukanovarloukanova@gmail.com<p>The paper extends the formal language and the reduction calculus of Moschovakis type-theory of recursion by adding a restrictor operator on terms with predicative restrictions. Terms with restrictions over memory variables formalise inductive algorithms with generalised, restricted parameters. The extended type-theory of restricted recursion (TTRR) provides computations for the algorithmic semantics of mathematical expressions and definite descriptors, in formal and natural languages.</p> <p>The reduction calculi of TTRR provide a mathematical foundation for the work of compilers that reduce recursive programs to iterative ones. The type-theory of acyclic recursion (TTAR) is of special importance to syntax-semantics interfaces in computational grammars.</p>2023-07-20T00:00:00+02:00Copyright (c) 2023 Roussanka Loukanovahttps://revistas.usal.es/cinco/index.php/2255-2863/article/view/29199Eta-Reduction in Type-Theory of Acyclic Recursion2023-09-19T13:42:39+02:00Roussanka Loukanovarloukanova@gmail.com<p>We investigate the applicability of classic eta-conversion in the type-theory of acyclic algorithms. While denotationally valid, classic eta-conversion is not algorithmically valid in the type-theory of algorithms, with the exception of a few limited cases.
The paper shows how the restricted, algorithmic eta-rule can recover algorithmic eta-conversion in the reduction calculi of the type-theory of algorithms.</p>2023-07-18T00:00:00+02:00Copyright (c) 2022 Roussanka Loukanovahttps://revistas.usal.es/cinco/index.php/2255-2863/article/view/29969Comparison of Swarm-based Metaheuristic and Gradient Descent-based Algorithms in Artificial Neural Network Training2023-09-19T13:42:33+02:00Erdal Ekere.eker@alparslan.edu.trMurat Kayrimuratkayri@yyu.edu.trSerdar Ekinciserdar.ekinci@batman.edu.trDavut İzcidavut.izci@batman.edu.tr<p><em>This paper compares gradient descent-based algorithms under the classical training model with swarm-based metaheuristic algorithms in feed-forward backpropagation artificial neural network training. The batch weight and bias rule, Bayesian regularization, the cyclical weight and bias rule and the Levenberg-Marquardt algorithm are used as the classical gradient descent-based algorithms. As the swarm-based metaheuristic algorithms, hunger games search, the grey wolf optimizer, Archimedes optimization, and the Aquila optimizer are adopted. The Iris data set is used for training. Mean square error, mean absolute error and the coefficient of determination are used as statistical measures to determine the effect of the network architecture and the adopted training algorithm. The metaheuristic algorithms are shown to have superior capability over the gradient descent-based algorithms in artificial neural network training. In addition to their low error rates, the metaheuristic algorithms achieve classification accuracies in the range of 94%-97%.
The hunger games search algorithm stands out amongst the metaheuristic algorithms, as it maintains good performance in classification ability as well as in the other statistical measures.</em></p>2023-09-19T00:00:00+02:00Copyright (c) 2023 Erdal Ekerhttps://revistas.usal.es/cinco/index.php/2255-2863/article/view/30240A Framework for Improving the Performance of QKDN using Machine Learning Approach2023-09-19T13:42:30+02:00R Arthiarthir2@srmist.edu.inA Saravanandean.academic@srmrmp.edu.inJ S Nayananj1672@srmist.edu.inChandresh MuthuKumarancm7061@srmist.edu.in<div class="page" title="Page 1"> <div class="layoutArea"> <div class="column"> <p>Reliable, secure communication between two remote parties can be achieved through key sharing, and quantum key distribution (QKD) has attracted wide attention because the information in QKD is safeguarded by the laws of quantum physics. Many techniques deal with the quantum key distribution network (QKDN); however, only a few of them use machine learning (ML) and soft computing techniques to improve it. ML can analyze data and improve itself through model training without having to be programmed manually. There has been considerable progress in both the hardware and software of ML technologies. Given ML’s advantageous features, it can help improve and resolve issues in QKDN, facilitating its commercialization.
The proposed work provides a detailed understanding of the role of each layer of QKDN, addresses the limitations of each layer, and suggests a framework to improve the performance metrics of various QKDN applications by applying machine learning techniques, such as support vector machine and decision tree algorithms.</p> </div> </div> </div>2023-09-19T00:00:00+02:00Copyright (c) 2023 Arthi R, Saravanan A, Nayana J S, Chandresh MuthuKumaranhttps://revistas.usal.es/cinco/index.php/2255-2863/article/view/30381Energy Efficient 4-2 and 5-2 Compressor for Low Power Computing2023-09-19T13:42:27+02:00Rahul Mani Upadhyayrahulmaniupadhyay@gmail.comR.K. Chauhanrkchauhana27@gmail.comManish Kumar mkecemmmut@gmail.com<p><em>With the rising use of multimedia devices, power management has become a major challenge. Various types of compressors are designed in this study. The compressor circuits are built from several XOR-XNOR gate circuits and multiplexers; combinations of XOR-XNOR gates and multiplexer circuits construct the suggested compressor designs. The proposed compressor circuits using these low-power XOR-XNOR gate and multiplexer blocks are found to be area- and power-efficient. This study proposes low-power and high-speed 3-2, 4-2, and 5-2 compressors for digital signal processing applications. A new compressor is also proposed that is faster and uses less energy than the traditional compressor; it is developed from a full-adder circuit constructed using various combinations of XOR-XNOR gates. The proposed 4-2 compressor shows an average power dissipation of 1235 nW and an average delay of 2.7 ns, while the 5-2 compressor shows an average power dissipation of 2973.50 nW and an average delay of 3.75 ns.</em></p>2023-09-19T00:00:00+02:00Copyright (c) 2023 Rahul Mani Upadhyay, R.K. Chauhan, Manish Kumar
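<p dir="ltr">The 4-2 compressor named in the last abstract is defined by the arithmetic identity x1 + x2 + x3 + x4 + cin = sum + 2·(carry + cout). The sketch below is a behavioural Python model using the standard textbook XOR/2:1-multiplexer decomposition; it illustrates the compressor's function only, and is not the authors' transistor-level XOR-XNOR circuit or the source of the power and delay figures above.</p>

```python
from itertools import product

def compressor_4_2(x1, x2, x3, x4, cin):
    """Behavioural 4-2 compressor: three bits out satisfying
    x1 + x2 + x3 + x4 + cin == s + 2 * (carry + cout)."""
    s = x1 ^ x2 ^ x3 ^ x4 ^ cin                  # 5-input XOR gives the sum bit
    cout = x3 if (x1 ^ x2) else x1               # 2:1 MUX, select = x1 XOR x2
    carry = cin if (x1 ^ x2 ^ x3 ^ x4) else x4   # 2:1 MUX, select = 4-input XOR
    return s, carry, cout

# Exhaustive check of the defining identity over all 32 input combinations
for bits in product((0, 1), repeat=5):
    s, carry, cout = compressor_4_2(*bits)
    assert sum(bits) == s + 2 * (carry + cout)
```

<p dir="ltr">Because carry is independent of cin, a column of such cells in a multiplier accumulates partial products without a ripple through the cin chain, which is the property the hardware designs above exploit.</p>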