Content related to the keyword:
Quantum Computing
The finance sector is experiencing substantial technological disruption as Quantum Computing and Artificial Intelligence (AI) continue to advance at a rapid pace. This study employs bibliometric analysis, specifically VOSviewer, to investigate the academic landscape at the intersection of financial risk, AI, and quantum computation. A comprehensive bibliometric analysis was performed on 145 journal articles published in Scopus and Web of Science (WoS) from 2014 to 2023. Articles are categorized by their homogeneity within the disciplines of Quantum Computing, Financial Risk, and AI, as well as by their interdisciplinary compositions. The results, which include authorship trends, keyword dynamics, and linked works, are analyzed and presented. This extensive bibliometric analysis offers critical insights into contemporary research and pinpoints areas necessitating further exploration. As quantum computers and AI algorithms become more sophisticated, the paper also investigates the potential weaknesses and issues that financial institutions may encounter. By analyzing the intersection of these two transformative technologies, the report contributes to the discourse surrounding the safeguarding of financial systems in the quantum era. The analysis not only enhances the quality of the review but also directs researchers to significant papers and identifies where publications cluster, thereby facilitating a more comprehensive understanding of the research environment.
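At the core of such a VOSviewer-based study is a keyword co-occurrence network. The short Python sketch below illustrates that step under stated assumptions: the record list and keyword sets are hypothetical stand-ins for a parsed Scopus/WoS export, not data from the article.

```python
from itertools import combinations
from collections import Counter

# Hypothetical author-keyword sets, one per article, as they might be
# parsed from a Scopus / Web of Science export (illustrative only).
records = [
    {"quantum computing", "financial risk", "machine learning"},
    {"artificial intelligence", "financial risk", "portfolio optimization"},
    {"quantum computing", "artificial intelligence", "cryptography"},
]

# Count how often each pair of keywords appears in the same article.
cooccurrence = Counter()
for keywords in records:
    for pair in combinations(sorted(keywords), 2):
        cooccurrence[pair] += 1

# The resulting weighted edge list is what a tool like VOSviewer turns
# into a co-occurrence network of research themes.
for (kw1, kw2), weight in cooccurrence.most_common():
    print(f"{kw1} -- {kw2}: {weight}")
```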
Harnessing Quantum Computing for Real-Time Data Analytics: A 2025 Perspective (Scientific Research Article, Ministry of Science)
Source:
Iranian Journal of Information Processing and Management (پژوهشنامه پردازش و مدیریت اطلاعات), Vol. 40, Summer 2025 (1404), English Special Issue 4 (Serial No. 125)
pp. 315-340
Background: Quantum computing has introduced an entirely new paradigm for computational processing, offering unparalleled capability for data analysis. With worldwide data production expected to exceed 180 zettabytes by 2025, conventional computing frameworks hamper the real-time processing of data. Quantum computing, which uses principles of quantum mechanics, is regarded as a way to solve certain problems 100 to 1,000 times faster than classical computing. Objective: The article examines quantum computing and its relevance to real-time data analytics, assessing its likely impact by the year 2025. Particular emphasis is placed on comparing quantum algorithms with traditional approaches to large, data-intensive workloads in various fields. Methods: Quantum and classical computing algorithms were compared on criteria such as throughput, precision, and flexibility. Data sets from finance (real-time stock analysis), supply chain and logistics, and healthcare (genomic sequencing) were used. Over 10 million simulation experiments were performed to identify trends and insights into the operational problems of quantum simulation. Results: The study establishes clear differences in efficiency between the two approaches, with quantum algorithms speeding up particular tasks by as much as one hundred times over classical algorithms, and error rates decreasing by almost 15% when quantum error-correction modes were used. Scalability tests showed that quantum systems could process data sets larger than 10 terabytes with little slowdown, whereas classical systems lost as much as 30% efficiency. However, present-day quantum hardware has limited processing capability, and problems remain with error-correction protocols. Conclusion: Quantum computing offers a distinctive prospect for real-time data analytics, operating at high efficiency and large scale on data-bound problems. However, much progress is still required in improving coherence times and reducing error rates, advances that are crucial for the full realization of quantum computing's potential by 2025.
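As a rough illustration of how speedup and error-reduction figures of this kind are computed, the Python sketch below derives both from paired benchmark runs; all runtime and error-rate values are made-up placeholders, not the study's measurements.

```python
from statistics import median

# Hypothetical wall-clock times (seconds) for the same workload on a
# classical solver and a quantum(-simulated) solver (placeholder values).
classical_runtimes = [420.0, 415.5, 430.2, 425.8]
quantum_runtimes = [4.1, 4.3, 4.0, 4.2]

# Hypothetical logical error rates with and without error correction.
error_rate_uncorrected = 0.020
error_rate_corrected = 0.017

speedup = median(classical_runtimes) / median(quantum_runtimes)
error_reduction = (error_rate_uncorrected - error_rate_corrected) / error_rate_uncorrected

print(f"median speedup: {speedup:.0f}x")               # ~100x in this toy example
print(f"error-rate reduction: {error_reduction:.0%}")  # ~15% in this toy example
```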
Next-Gen Machine Learning Models: Pushing the Boundaries of AI (Scientific Research Article, Ministry of Science)
Source:
Iranian Journal of Information Processing and Management (پژوهشنامه پردازش و مدیریت اطلاعات), Vol. 40, Summer 2025 (1404), English Special Issue 4 (Serial No. 125)
pp. 435-464
Background: Machine learning (ML) has developed significantly over the years, transforming several industries through automation and Big Data. By building better next-generation machine learning models, AI has the potential to overcome persistent challenges such as scalability, interpretability, and generalization. Objective: This article examines how the new generation of ML models is developed and used, and what it reveals about the capabilities of AI in different fields. In particular, it focuses on changes in model architectures, specific methods of training them, and the application of emerging technologies such as quantum computing. Methods: A review of the state of the art and several case studies were carried out on the latest work on different types of ML algorithms, including transformer models, reinforcement learning, and Neural Architecture Search. Moreover, the models were tested in experiments on their applicability to tasks including image recognition, natural language processing, and autonomous systems. Results: The next-gen models outperformed traditional models in terms of accuracy, computational speed, and flexibility. The identified benefits were decreased training time, better interpretability, and better performance on multi-modal and cross-domain tasks. Conclusion: This new generation of ML models is a game changer in AI development, solving previous challenges while providing opportunities across numerous sectors. Further research in this field is needed to extend the range of problems AI can solve.
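Transformer models, one of the next-generation architectures the article reviews, rest on scaled dot-product self-attention. The minimal NumPy sketch below shows that single operation for one attention head; shapes and inputs are illustrative, and multi-head projection, masking, and training are omitted.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Single-head attention: softmax(QK^T / sqrt(d)) V."""
    d = q.shape[-1]
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)          # (batch, seq, seq)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)           # row-wise softmax
    return weights @ v                                       # (batch, seq, d)

# Toy input: a batch of 1 sequence with 4 tokens and 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4, 8))
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V = x
print(out.shape)                             # (1, 4, 8)
```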
Quantum Key Distribution Protocols for Enhancing Cryptographic Resilience in Next-Generation 5G Network Infrastructures (Scientific Research Article, Ministry of Science)
Source:
Iranian Journal of Information Processing and Management (پژوهشنامه پردازش و مدیریت اطلاعات), Vol. 40, Summer 2025 (1404), English Special Issue 4 (Serial No. 125)
pp. 797-829
Background: Quantum computing poses a profound threat to classical cryptographic systems: it is advancing at an exponential rate, and quantum algorithms such as Shor's and Grover's can break the Rivest–Shamir–Adleman (RSA) and Elliptic Curve Cryptography (ECC) schemes. The pressing need for cryptographic frameworks that can withstand quantum attacks has motivated Quantum Key Distribution (QKD), Post-Quantum Cryptography (PQC), and hybrid systems that combine both. Objective: The aim of this article is to review the performance, scalability, and integration of quantum-secure cryptographic services, with a practical lens on how they can be used in real-time environments such as self-driving cars, industrial IoT, and intelligent health systems. It also aims to establish the drawbacks of current models and directions for further enhancement. Methods: The study employs simulative experimentation to understand exposure to quantum algorithms and rates cryptographic systems on metrics such as latency, Quantum Bit Error Rate (QBER), computational overhead, scalability, and cost. A comparative assessment furnishes an integrated analysis of QKD, PQC, and hybrid systems, identifying the advantages and disadvantages of each. Results: Hybrid systems provided the best or comparable median results, with the lowest latency in real-time applications (~45 ms or lower) compared with alternative Multi-Access Edge Computing (MEC) architectures and security elements, while remaining highly scalable. QKD, while exceptional in security, suffers from scalability problems, and PQC showed average results on the given parameters. Conclusion: Hybrid cryptographic systems deal adequately with quantum threats, as this study points out. Future work should distribute resources effectively, expedite PQC standardization, and embrace AI-driven network frameworks for flexibility and expansiveness across different networks.
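To make the QBER metric concrete, the following Python sketch simulates the sifting step of a BB84-style QKD exchange with a configurable channel error probability; it is a toy model of the protocol, not the evaluation framework used in the article.

```python
import random

def simulate_bb84(n_bits=10_000, channel_error_prob=0.03, seed=42):
    """Toy BB84 run: random bits and bases, basis sifting, QBER estimate."""
    rng = random.Random(seed)
    sifted = errors = 0
    for _ in range(n_bits):
        alice_bit = rng.randint(0, 1)
        alice_basis = rng.randint(0, 1)   # 0 = rectilinear, 1 = diagonal
        bob_basis = rng.randint(0, 1)
        if alice_basis != bob_basis:
            continue                      # mismatched bases are discarded (sifting)
        # With matching bases Bob should read Alice's bit, except for channel
        # noise (or eavesdropping), modeled here as a random bit flip.
        bob_bit = alice_bit ^ (rng.random() < channel_error_prob)
        sifted += 1
        errors += (bob_bit != alice_bit)
    return sifted, errors / sifted

sifted_len, qber = simulate_bb84()
print(f"sifted key length: {sifted_len}, estimated QBER: {qber:.3f}")
```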