Iranian Journal of Information Processing and Management (formerly Journal of Information Science and Technology)
Iranian Journal of Information Processing and Management, Volume 40, Summer 1404 (2025), English Special Issue 4 (Serial No. 125) (Ministry of Science accredited)
Articles
Background: The exponential growth of data centers has significantly increased their carbon footprint, raising concerns about their environmental impact. As the demand for digital services and cloud computing intensifies, sustainable computing practices have become crucial for mitigating climate change. Objective: This paper aims to explore strategies for reducing the carbon footprint of data centers by integrating sustainable computing practices, including energy-efficient hardware, renewable energy sources, and optimized cooling technologies. Methods: A comprehensive review of existing literature was conducted, along with an analysis of case studies from major technology firms employing green computing strategies. Data center energy consumption patterns and carbon emissions were evaluated using energy efficiency metrics such as Power Usage Effectiveness (PUE) and Carbon Usage Effectiveness (CUE). Results: Findings indicate that adopting energy-efficient hardware, coupled with renewable energy sources, can significantly reduce energy consumption and carbon emissions. Optimized cooling techniques, such as liquid cooling and free-air cooling, further contribute to energy savings. Companies employing these practices reported a reduction in carbon emissions by up to 30%. Conclusion: Sustainable computing practices offer a viable path for reducing the environmental impact of data centers. By prioritizing energy efficiency and renewable energy integration, data centers can minimize their carbon footprint while maintaining operational efficiency, thus contributing to global sustainability goals.
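The two metrics named above have standard definitions: PUE divides total facility energy by IT equipment energy (ideal value 1.0), and CUE divides total carbon emissions by IT equipment energy. A minimal sketch with hypothetical figures, not the paper's data:

```python
# Illustrative computation of the two efficiency metrics named in the
# abstract; the input figures are hypothetical, not the paper's data.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy (ideal = 1.0)."""
    return total_facility_kwh / it_equipment_kwh

def cue(total_co2_kg: float, it_equipment_kwh: float) -> float:
    """Carbon Usage Effectiveness: total CO2-equivalent emissions / IT energy."""
    return total_co2_kg / it_equipment_kwh

if __name__ == "__main__":
    it_kwh, facility_kwh = 1_000_000, 1_450_000   # hypothetical annual figures
    grid_kg_per_kwh = 0.4                          # assumed grid carbon intensity
    emissions_kg = facility_kwh * grid_kg_per_kwh
    print(f"PUE = {pue(facility_kwh, it_kwh):.2f}")              # 1.45
    print(f"CUE = {cue(emissions_kg, it_kwh):.2f} kgCO2e/kWh")   # 0.58
```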
Advancing Natural Language Processing with New Models and Applications in 2025
Background: Recent advancements in Natural Language Processing (NLP) have been significantly influenced by transformer models. However, challenges related to scalability, discrepancies between pretraining and finetuning, and suboptimal performance on tasks with diverse and limited data remain. The integration of Reinforcement Learning (RL) with transformers has emerged as a promising approach to address these limitations. Objective: This article aims to evaluate the performance of a transformer-based NLP model integrated with RL across multiple tasks, including translation, sentiment analysis, and text summarization. Additionally, the study seeks to assess the model's efficiency in real-time operations and its fairness. Methods: The hybrid model's effectiveness was evaluated using task-oriented metrics such as BLEU, F1, and ROUGE scores across various task difficulties, dataset sizes, and demographic samples. Fairness was measured based on demographic parity and equalized odds. Scalability and real-time performance were assessed using accuracy and latency metrics. Results: The hybrid model consistently outperformed the baseline transformer across all evaluated tasks, demonstrating higher accuracy, lower error rates, and improved fairness. It also exhibited robust scalability and significant reductions in latency, enhancing its suitability for real-time applications. Conclusion: This article illustrates that the proposed hybrid model effectively addresses issues related to scale, diversity, and fairness in NLP. Its flexibility and efficacy make it a valuable tool for a wide range of linguistic and practical applications. Future research should focus on improving time complexity and exploring the use of deep unsupervised learning for low-resource languages.
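Demographic parity and equalized odds, the two fairness criteria named above, can be computed directly from predictions and group labels. A small sketch on toy arrays; the paper's data and model are not reproduced:

```python
import numpy as np

# Hedged sketch of the two fairness criteria the abstract names; the
# arrays below are toy values, not the paper's evaluation data.

def demographic_parity_gap(y_pred, group):
    """Absolute gap in positive-prediction rates between two groups."""
    y_pred, group = np.asarray(y_pred), np.asarray(group)
    return abs(y_pred[group == 0].mean() - y_pred[group == 1].mean())

def equalized_odds_gap(y_true, y_pred, group):
    """Max gap across groups in TPR and FPR (0 = perfectly equalized odds)."""
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    gaps = []
    for label in (1, 0):  # label 1 gives the TPR gap, label 0 the FPR gap
        rates = [y_pred[(group == g) & (y_true == label)].mean() for g in (0, 1)]
        gaps.append(abs(rates[0] - rates[1]))
    return max(gaps)

y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
group  = [0, 0, 0, 0, 1, 1, 1, 1]
print(demographic_parity_gap(y_pred, group))      # 0.0
print(equalized_odds_gap(y_true, y_pred, group))  # 0.33
```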
AI Future of Augmented Reality in Education: From Concept to Classroom
Background: The integration of artificial intelligence (AI) with augmented reality (AR) has significantly revolutionized educational practices. By blending digital content with the physical environment, AR enhances student engagement, while AI-driven tools personalize learning experiences. Objective: This article aims to explore the future of AI-powered AR in education, analyzing its potential to transform traditional learning environments by improving student interaction, knowledge retention, and personalized learning. Methods: A comprehensive literature review was conducted, examining current AI-AR applications in educational settings. Additionally, case studies from early adopters of this technology in classrooms were analyzed. Interviews with educators and experts were conducted to gain insights into the challenges and opportunities associated with AI-enhanced AR. Results: The findings indicate that AI-AR systems significantly enhance student engagement, promote interactive learning experiences, and offer personalized feedback based on individual learning styles. However, challenges such as high implementation costs, technical expertise requirements, and the need for curriculum alignment were identified. Conclusion: AI-AR has the potential to reshape educational practices by fostering a more interactive, engaging, and tailored learning experience. Future efforts should focus on addressing the technical and pedagogical challenges to ensure successful adoption across various educational contexts.
AI-Driven Automation for Transforming the Future of Software Development
Background: Artificial Intelligence (AI) has recently emerged as a transformative innovation within the software industry, disrupting conventional approaches to application development by automating tasks, refining code, and enhancing resource efficiency. Prior research indicates the effectiveness of AI-powered tools across various domains. However, contemporary studies lack a detailed analysis of the diverse sectors utilizing AI tools for software development. Objective: This article aims to identify the potential benefits and impacts of AI in software development, specifically regarding time-to-market, productivity, code quality, bug-fixing rates, resource flexibility, and developer satisfaction. The goal is to present fact-based information about AI’s impact on multiple industries and scopes of work. Methods: A mixed-methods research design was employed to analyze quantitative data from 40 projects across healthcare, financial services, retail, technology, and e-commerce industries. Data were collected using various project management tools, automated testing environments, and online questionnaires addressed to developers. The study incorporated a comparative evaluation of AI-based projects and traditional projects, with statistical analysis. Results: AI-driven software development projects demonstrated a mean reduction in time-to-market by 34.6%, an improvement in code quality by 70%, and a mean reduction in bug-fixing time by 57.7%. Productivity per sprint increased by over 70%, resource flexibility was higher (90.2% in AI projects vs. 67.8% in traditional projects), and developers reported higher satisfaction levels. These findings reinforce the concept that AI significantly enhances workflow and the achievement of optimal results. Conclusion: AI substantially improves both the speed and quality of software development. Further research should expand to explore the experiences of different sectors, the application of AI-driven tools, their differentiation, and usage, as well as the ethical considerations to promote sustainable and innovative software engineering solutions.
AI-Powered Network Management: Enhancing Reliability and Security
Background: Contemporary multi-protocol networks necessitate scalability, reliability, energy efficiency, and security due to the increasing number of devices and the diversification of network traffic. Conventional network management methods are inadequate to meet these demands, necessitating sophisticated solutions. Artificial intelligence (AI) has emerged as a significant field, offering advanced methods including predictive maintenance, anomaly detection, and intelligent resource management. Objective: This article aims to critically evaluate the effectiveness, flexibility, and productivity of AI-based applications in addressing major challenges in network management, including performance, scalability, energy consumption, threat detection rates, and cost. Methods: The study employs simulations and modeled datasets to assess AI-oriented solutions across various network environments, such as industrial IoT, smart cities, and telecommunications. The evaluation encompasses factors including Mean Time Between Failure (MTBF), resource utilization, delay minimization, and operating cost reduction. Digital twins, intelligent routing algorithms, and self-attention-based anomaly detection models are utilized, and the overall performance of these integrated technologies is analyzed. Results: The analysis demonstrates that AI-powered systems achieve near-optimal performance across all evaluated indicators. Specifically, the Manufacturing and Automotive Knowledge (MAK) sector observed a 52% increase in MTBF, the Banking, Financial Services, and Insurance (BFSI) sector noted a 32.39% improvement in energy efficiency, and the Defense and Public Enterprise (DPE) sector experienced a 94% increase in advanced threat detection. Conclusion: The findings indicate that AI solutions can effectively address many of the challenges present in current networks, offering cost-efficient and secure methods for implementing new communication networks with vast potential. Nonetheless, further empirical research is necessary to generalize these results and validate their applicability in real-world scenarios.
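MTBF, one of the evaluation factors above, is simply operating time divided by failure count; a tiny illustration with made-up failure logs, not the paper's measurements:

```python
# Simple illustration of the MTBF metric used in the evaluation; the
# failure counts are hypothetical, not the paper's data.

def mtbf(total_operating_hours: float, failure_count: int) -> float:
    """Mean Time Between Failures = operating time / number of failures."""
    return total_operating_hours / failure_count

baseline  = mtbf(8_760, 12)   # conventional management: 12 failures/year
ai_driven = mtbf(8_760, 8)    # hypothetical AI-assisted operation
improvement = (ai_driven - baseline) / baseline * 100
print(f"{baseline:.0f} h -> {ai_driven:.0f} h  (+{improvement:.0f}%)")  # +50%
```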
Artificial Intelligence in Healthcare: Revolutionizing Diagnostics with Predictive Algorithms
Background: Artificial Intelligence (AI) has rapidly integrated into healthcare, proving indispensable in diagnostic processes. Predictive algorithms in medicine offer solutions to longstanding issues related to early diagnosis and personalized patient care. Objective: This article aims to explore best practices in objective and quantitative diagnostic prediction using AI and predictive algorithms. It seeks to revolutionize healthcare diagnostics by enhancing effectiveness and reducing diagnostic error rates. Methods: This study involves a literature review of the past five years, focusing on recent innovations in AI for healthcare diagnostics. The review includes fields such as oncology, cardiology, and others, to evaluate the efficacy of prediction algorithms in practice. Results: The findings indicate that machine learning-based computer-aided diagnosis models significantly improve diagnostic accuracy by detecting diseases at early stages and personalizing treatment programs. The integration of these algorithms has led to reduced diagnostic errors and improved patient experiences across various medical fields. Conclusion: AI predictive algorithms represent the future of diagnostic medicine. Their adoption is set to personalize and advance patient treatment, enhance health outcomes, and improve the efficiency of healthcare systems. However, comprehensive research and precise implementation are essential to fully harness the potential of AI in diagnostics.
Beyond 5G: Strategic Pathways to 6G Development and Emerging Applications
Background: The rapid evolution from 4G to 5G has transformed the telecommunications landscape, but as technological demands continue to grow, the shift toward 6G is gaining attention. 6G aims to address the limitations of 5G, such as latency and bandwidth constraints, while introducing new capabilities like terahertz communication and ubiquitous AI integration. Objective: This article explores the development roadmap of 6G, highlighting its applications across industries and addressing key challenges in its deployment. Methods: A comprehensive review of current literature on 5G advancements and emerging 6G technologies was conducted. Comparative analyses were performed on the theoretical frameworks of 6G’s core capabilities, including network architecture, spectrum management, and AI integration. Results: The study identified key applications for 6G, such as smart cities, autonomous transportation, healthcare, and industrial automation. It also highlighted the anticipated improvements in data transmission speed, reliability, and connectivity. Conclusion: 6G represents a pivotal evolution in telecommunications, offering transformation in numerous sectors. However, challenges such as infrastructure development, regulatory frameworks, and energy efficiency must be addressed.
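The bandwidth argument for terahertz links can be made concrete with the Shannon capacity formula C = B log2(1 + SNR); the bandwidths and SNR values below are assumptions for illustration, not figures from the study:

```python
import math

# Shannon capacity: a back-of-the-envelope view of why terahertz bandwidth
# matters for 6G. All bandwidth and SNR figures are assumed, not measured.

def capacity_gbps(bandwidth_hz: float, snr_db: float) -> float:
    snr = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr) / 1e9

print(f"5G sub-6, 100 MHz @ 20 dB: {capacity_gbps(100e6, 20):.2f} Gbit/s")  # ~0.67
print(f"mmWave,     1 GHz @ 10 dB: {capacity_gbps(1e9, 10):.2f} Gbit/s")    # ~3.46
print(f"THz,       10 GHz @  5 dB: {capacity_gbps(10e9, 5):.2f} Gbit/s")    # ~20.6
```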
Blockchain Beyond Cryptocurrency: Emerging Applications in Secure Data Sharing
Background: Blockchain, known mainly for supporting cryptocurrencies, has a much broader role. Its fundamental features of decentralization and immutability guarantee improved security and transparency across multiple spheres of human life. Objective: The article reviews current literature on emerging uses of blockchain for secure data sharing, with the purpose of establishing its advantages and disadvantages in this field. Methods: Relevant academic articles and papers published in the last 5 years were considered, along with research cases of blockchain applications in numerous fields, including healthcare, finance, and supply chains. The review examines blockchain's capacity to provide data integrity, confidentiality, and availability. Results: The results show that blockchain can greatly improve the security and credibility of shared data by reducing common vulnerabilities and offering reliable traceability. The technology ensures safe data transactions and minimizes the possibility of data manipulation in fields that involve sensitive data. Conclusion: Current advancements demonstrate opportunities for blockchain-based secure data sharing across several industries. However, limitations remain, including scalability, compatibility, and legislative issues, which still have to be solved. Future studies should address these barriers in order to enhance the application of blockchain in secure data sharing.
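The immutability property the abstract relies on can be shown with a minimal hash chain: altering any record invalidates every later hash. A sketch only; with no consensus, signatures, or networking it is not a full blockchain:

```python
import hashlib
import json

# Minimal hash-chain sketch of the immutability property discussed above.
# Tampering with any record breaks every subsequent link.

def block_hash(record: dict, prev_hash: str) -> str:
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64                      # genesis predecessor
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain) -> bool:
    prev = "0" * 64
    for blk in chain:
        if blk["prev"] != prev or block_hash(blk["record"], prev) != blk["hash"]:
            return False
        prev = blk["hash"]
    return True

chain = build_chain([{"owner": "A", "value": 7}, {"owner": "B", "value": 9}])
print(verify(chain))                   # True
chain[0]["record"]["value"] = 99       # tamper with shared data
print(verify(chain))                   # False: integrity violation detected
```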
Blockchain Technology and Its Impact on Transparency, Security, and Efficiency in Supply Chain Management
Background: The length and depth of global supply networks have been growing over time, causing persistent problems in visibility, protection, and performance. Blockchain has emerged as a disruptive technology to tackle these problems by providing decentralized, safe, and transparent systems. Objective: This article examines the use of blockchain technology within the supply chain context, digging deeper into increased transparency, security, and more efficient supply chain transactions. The aim is to show that blockchain can transform supply chain operations and engender trust among stakeholders. Methods: A permissioned blockchain system was designed and implemented using the Proof of Authority (PoA) consensus algorithm. IoT sensors were deployed to obtain data in real time, and smart contracts were incorporated to perform product evaluation and payment authorization. System performance was assessed against indicators including transaction throughput, per-transaction latency, auditability, and workability. Results: The combination of high transaction throughput with low latency made the blockchain system scalable and operationally stable. Smart contracts minimized processing time and mistakes and improved IoT integration for tracing transactions in real time. The system also proved able to withstand cyber assaults, and no data were compromised. Conclusion: Based on the analysis of supply chain problems, blockchain technology can be used to transform supply chain management. Further studies should examine the compatibility of such a system with new technology trends and its implementation in multi-region supply chains to harness this valuable system.
Cloud-Native Architectures: Transforming Enterprise IT Operations
Background: Cloud-native architectures have reinvented companies’ original IT infrastructure strategies and become popular due to the concepts of modularity, scalability, and resilience. These architectures address the shortcomings of monolithic architectures in meeting new business challenges and workloads, including the embrace of innovative technologies such as Artificial Intelligence and big data processing. Objective: This study was designed to assess the performance and business viability of cloud-native systems based on critical indicators such as availability, resilience to failure, resource use, and compatibility with innovative technologies. The objective was to define the barriers to and possibilities for improving cloud-native architectures in various enterprises. Methods: A cross-sectional design combined experimental testing, case studies, and performance analysis. Response time, CPU and memory consumption, and recovery time were compared across a throughput range of 1,000 to 12,000 requests per second. To enhance the interpretational framework, key usage scenarios in the healthcare, retail, and finance sectors were collected and compared with the results. Results: Cloud-native systems provided high availability (> 99.9%), resource scalability, and component-level resource efficiency. Combining AI with big data analytics yielded further performance improvement. However, observed problems included vendor lock-in, integration issues, and fluctuating peak loads. Conclusion: The identified improvements signify the potential of cloud-native architectures for improving enterprise IT operations. Continued work on the identified challenges can enhance their effectiveness in the current dynamic digital environment.
Exploring the Synergy between AI and Cybersecurity for Threat Detection
Background: Security has become a major issue of discussion due to the increase in the number and sophistication of cyber threats in the modern era. Conventional approaches to threat identification struggle with relevance and with the ability to process new and constantly evolving threats. Machine learning (ML) and deep learning (DL) approaches present AI as a potential solution to the problem of efficient threat detection. Objective: The article aims to compare RF, SVM, CNN, and RNN models in terms of performance, computational cost, and resilience in identifying potential cyber threats such as malware, phishing, and DoS attacks. Methods: The models were trained and evaluated on the NSL-KDD and CICIDS 2017 datasets using common indicators including accuracy, precision, recall, F1 score, detection efficiency, AUC-ROC, False Alarm Rate (FAR), and robustness to adversaries. Computational efficiency was rated by training time and memory consumption. Results: The findings indicate that CNNs gave the best accuracy (96%) and resisted perturbation better, while RF showed good performance with little computational load. RNNs proved effective for sequential data analysis, and SVM performed fairly well on binary classification, although it has scalability problems. Conclusion: CNN-based AI models are the strongest solutions for protection against threats in the cybersecurity space. Nevertheless, some models still require computational optimization to be beneficial in scenarios with limited computational resources. These findings can inform subsequent research and practical applications.
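The evaluation scheme described (accuracy, precision, recall, F1, AUC-ROC, and FAR) maps directly onto standard scikit-learn calls. A sketch on synthetic features standing in for NSL-KDD/CICIDS 2017:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score, confusion_matrix)

# Sketch of the evaluation scheme the abstract describes, on synthetic
# traffic features rather than the real datasets. FAR = FP / (FP + TN).

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) > 0).astype(int)
X_train, X_test, y_train, y_test = X[:1500], X[1500:], y[:1500], y[1500:]

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
y_pred = clf.predict(X_test)
y_prob = clf.predict_proba(X_test)[:, 1]

tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
print("accuracy ", accuracy_score(y_test, y_pred))
print("precision", precision_score(y_test, y_pred))
print("recall   ", recall_score(y_test, y_pred))
print("F1       ", f1_score(y_test, y_pred))
print("AUC-ROC  ", roc_auc_score(y_test, y_prob))
print("FAR      ", fp / (fp + tn))
```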
Harnessing Quantum Computing for Real-Time Data Analytics: A 2025 Perspective
Background: Quantum computing has introduced an entirely new paradigm for computational processing, providing unparalleled ability for data analysis. With worldwide data production expected to exceed 180 zettabytes by 2025, conventional computing frameworks hamper the real-time processing of data. Quantum computing, which uses principles of quantum mechanics, is seen as able to solve certain problems 100 to 1,000 times faster than classical computing. Objective: The article examines quantum computing and its relevance to real-time data analytics, assessing its likely impact by the year 2025. Emphasis is placed on comparing quantum algorithms with traditional approaches for handling extensive, data-centered workloads in various fields. Methods: Quantum and classical computing algorithms were compared on criteria such as throughput, precision, and scalability. Datasets from finance (including real-time stock analysis), supply chain and logistics, and genomic sequencing from the healthcare domain were used. Over 10 million simulation experiments were performed to identify trends and operational problems in quantum simulation. Results: The study establishes differences in the efficiency of the two approaches, with quantum algorithms speeding up particular tasks by as much as one hundred times over classical algorithms, and error rates decreasing by almost 15% when quantum error correction was used. Scalability tests showed that quantum systems could process datasets larger than 10 terabytes with little slowdown, whereas a classical system's efficiency dropped by as much as 30%. However, present-day quantum hardware is limited in processing capability, and problems remain with error correction protocols. Conclusion: Quantum computing offers an unconventional prospect for real-time data analytics, operating at high efficiency and large scale on data-bound problems. However, much progress is required in improving coherence times and reducing error rates, advances that are crucial to the full realization of quantum potential by 2025.
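The quoted speedup is problem-dependent; for unstructured search, for example, Grover's algorithm needs about (pi/4)*sqrt(N) oracle queries versus roughly N/2 classical comparisons on average. An illustrative calculation, not a claim about the paper's workloads:

```python
import math

# Query-count comparison for unstructured search: classical ~N/2 on
# average vs Grover ~(pi/4)*sqrt(N). Illustrative of a quantum speedup,
# not a universal "100x to 1,000x" rule.

for n in (10**6, 10**9, 10**12):
    classical = n / 2
    grover = math.pi / 4 * math.sqrt(n)
    print(f"N={n:>14,}: classical ~{classical:.1e}, "
          f"Grover ~{grover:.1e}, speedup ~{classical / grover:.0f}x")
```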
Integrating IoT, Artificial Intelligence, and Blockchain Technologies for the Development of Smart Networks
Background: Smart networks are the latest creation of smart technology, merging Internet of Things (IoT), Artificial Intelligence (AI), and Blockchain technologies. These technologies can increase performance, security, and scalability in fields such as smart cities, healthcare, and manufacturing. Even so, organisations continue to encounter several issues when implementing these systems together to address diversified network requirements. Objective: The study aims to define how IoT, AI, and Blockchain technologies can be integrated to develop smart networks, and how their integration addresses issues of performance, data integrity, and resource utilization. Methods: The solution consisted of three components: IoT for instant data gathering, AI for modeling and efficient traffic control, and Blockchain for secure data storage. Objectives such as data throughput, latency, energy consumption, and security were analyzed for smart city applications through simulations. Results: The integrated system obtained a 45% increase in data transfer rate, a 40% cut in response time, and a 50% enhancement in power utilization compared with other systems. Blockchain transactions were recorded with complete accuracy, achieving a success rate of 99.9%, and there were no cases of hacking. AI algorithms reduced network congestion by 55%, and IoT devices remained available 98% of the time. Conclusion: The incorporation of IoT, AI, and Blockchain greatly enhances the effectiveness and stability of smart networks. These findings indicate significant potential for broad utility and the need for further research on scaling, integration, and testing in practice.
Network Slicing: Customizing 5G Networks for Industry-Specific Needs
Background: Network slicing has turned out to be one of the key enablers in 5G networks due to its ability to support diverse applications, such as ultra-reliable low-latency communications for self-driving cars and massive machine-type communications for IoT. Prior efforts lacked integrated tools for dynamic resource assignment and allocation and could not maintain constant QoS. Objective: The primary aim of this article is to synthesize and test a reinforcement learning–driven slicing framework that orchestrates resources across the three slice types: URLLC, mMTC, and eMBB. The goal is to improve slice resource performance, ensure high availability, and minimize resource contention in multi-tenant 5G scenarios. Methods: The study design includes a focus on key stakeholders and their needs for requirements gathering, plus an experimental field implementation. Resource distribution is guided by reinforcement learning algorithms that minimize a cost function incorporating the relations among latency, isolation, throughput, and energy expended. Across a number of runs, performance is monitored to assess stability and response rates. Results: Experimental results show that the proposed framework achieves fewer latency violations and less capacity oversubscription than heuristic methods. Furthermore, it consistently achieves nearly 2.5x higher throughput for telemedicine slices and guarantees less than 5 ms latency for time-sensitive services under dynamic traffic conditions. Conclusion: The study shows how reinforcement learning can be applied effectively to end-to-end 5G network slicing. This sort of adaptive orchestration can increase service dependability while optimizing overhead, heralding instantly scalable multi-tenant networks compatible with various industries.
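The paper's exact framework is not reproduced here, but the idea of learning an allocation that minimizes a cost over latency, isolation, throughput, and energy can be sketched as an epsilon-greedy learner over a toy simulator; all weights, actions, and the simulator itself are hypothetical:

```python
import random

# Hedged sketch of RL-guided slice allocation: an epsilon-greedy bandit
# picks among candidate URLLC/mMTC/eMBB resource shares to minimize a
# weighted cost. Weights, actions, and the simulator are illustrative.

ACTIONS = [(0.6, 0.2, 0.2), (0.4, 0.4, 0.2), (0.2, 0.3, 0.5)]  # URLLC/mMTC/eMBB shares

def cost(latency_ms, isolation, throughput_mbps, energy_j, w=(1.0, 0.5, 0.8, 0.3)):
    """Lower is better: penalize latency/energy, reward isolation/throughput."""
    return (w[0] * latency_ms - w[1] * isolation
            - w[2] * throughput_mbps / 100 + w[3] * energy_j)

def simulate(action):
    """Toy network response to an allocation (stochastic stand-in)."""
    urllc, mmtc, embb = action
    latency = 10 / (urllc + 0.1) + random.gauss(0, 1)
    return cost(latency, urllc, 300 * embb, 50 * (1 - mmtc))

q = [0.0] * len(ACTIONS)
n = [0] * len(ACTIONS)
for step in range(5000):
    if random.random() < 0.1:                                 # explore
        a = random.randrange(len(ACTIONS))
    else:                                                     # exploit lowest cost
        a = min(range(len(ACTIONS)), key=lambda i: q[i])
    c = simulate(ACTIONS[a])
    n[a] += 1
    q[a] += (c - q[a]) / n[a]          # incremental mean of observed cost

best = min(range(len(ACTIONS)), key=lambda i: q[i])
print("learned best allocation:", ACTIONS[best])
```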
Neuromorphic Computing: A Paradigm Shift in Energy-Efficient and Scalable AI Hardware for Real-Time Applications
Background: Neuromorphic computing is a newly developed technology based on brain-like, event-driven architectures, with the potential to power energy-constrained, latency-sensitive, and large-scale applications. Where traditional systems lack flexibility in energy consumption and response time, neuromorphic platforms shine in real-time applications such as robotics, IoT, and autonomous systems. Objective: The article aims to assess neuromorphic computing platforms against conventional schemes, both quantitatively and qualitatively, in terms of energy consumption, response time, modularity, and application-dependent adaptability, and to determine the drawbacks and prospects for further development. Methods: The study uses a comparative analysis approach, making statistical comparisons of the performance measures across the identified factors. Neuromorphic platforms such as Intel Loihi and IBM TrueNorth are compared with conventional platforms such as the NVIDIA Tesla V100 and Google TPU, based on applications in robotics, IoT, and especially healthcare. Data are derived from experimental assessments and theoretical paradigms reported in prior research. Results: Neuromorphic systems showed better energy consumption, system size, and delay characteristics. Nevertheless, excelling at particular tasks does not mean the approach can be used regardless of purpose or adapted freely to further-reaching trends such as quantum computing. Regression results demonstrate a high degree of dependency among these measures and their potential for real-time data processing. Conclusion: Neuromorphic computing can be regarded as a new paradigm of energy-efficient and scalable AI and is especially promising for latency-sensitive deployments. Despite the shortcomings discussed, extending these approaches with hybrid systems and more sophisticated integration frameworks may open new opportunities and eventually establish them as a foundation for next-generation computation models.
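The event-driven primitive behind chips like Loihi and TrueNorth is the spiking neuron; a leaky integrate-and-fire model shows the mechanism in a few lines, with all constants illustrative:

```python
import numpy as np

# Leaky integrate-and-fire (LIF) neuron: the canonical spiking primitive
# behind neuromorphic hardware. Sparse, event-driven spikes rather than
# dense matrix multiplies are where the energy savings come from.

def lif(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    v, spikes = 0.0, []
    for i in input_current:
        v += dt / tau * (-v + i)       # leaky integration toward the input
        if v >= v_thresh:              # threshold crossing: fire and reset
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return np.array(spikes)

rng = np.random.default_rng(1)
current = rng.uniform(0.0, 2.5, size=100)   # noisy input drive (arbitrary units)
spike_train = lif(current)
print(f"{spike_train.sum()} spikes over {len(spike_train)} steps")
```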
Next-Gen Machine Learning Models: Pushing the Boundaries of AI
Background: Machine learning (ML) has developed significantly over the years, changing several industries through automation and Big Data. Next-generation machine learning models have the potential to improve on existing weaknesses such as scalability, interpretability, and generalization. Objective: This article examines how the new generation of ML models is developed and used, to explain the capabilities of AI in different fields. In particular, it focuses on changes in model architectures, methods of training them, and the application of emerging technologies such as quantum computing. Methods: A review of the state of the art and several case studies were carried out on the latest work on different types of ML algorithms, including transformer models, reinforcement learning, and Neural Architecture Search. The models were also tested in experiments on tasks including image recognition, natural language processing, and autonomous systems. Results: The next-generation models outperformed traditional models in accuracy, computational speed, and flexibility. The identified benefits were decreased training time, better interpretability, and better performance on multi-modal and cross-domain tasks. Conclusion: The new generation of ML models is a game changer in AI development, solving previous challenges while providing opportunities across numerous sectors. Further research in this field is needed to extend AI's problem-solving capabilities.
Quantum Cryptography in Telecommunications: A New Era of Secure Communications
Background: Quantum Key Distribution (QKD) has become crucial for secure communication in the era of quantum networks. QKD provides an information-theoretically secure key by taking advantage of the principles of quantum mechanics, countering the threat quantum computing poses to classical cryptography. However, photon loss and other limitations constrain scalability and integration with existing networks. Objective: The study seeks to assess the viability of QKD systems, review the challenges associated with them, and investigate methods of combining QKD with Post-Quantum Cryptography (PQC) to cope with new security threats in the telecommunications industry. Methods: An in-depth analysis was made of experimental observations of key generation rates, photon loss, error correction, data throughput, and latency. Quantum repeaters were tested to measure their ability to extend transmission distance. A combined QKD-PQC approach was assessed for integration in restricted settings. Results: QKD offered high security and high performance over short distances, and when quantum repeaters were implemented the distance could be greatly extended. The QKD-PQC model showed higher error correction rates, throughput, and scalability than standalone QKD. Challenges encountered were photon loss, processing latency, and system vulnerabilities. Conclusion: QKD, supported by quantum repeaters and hybrid cryptographic approaches, opens new opportunities for secure communication. The technical and operational issues need to be resolved for these technologies to play their potential role in global mass-market telecommunications.
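A toy BB84 simulation illustrates the basis sifting and QBER estimation that QKD evaluations rest on; the channel error parameter stands in for photon loss and noise, and real systems add optics, privacy amplification, and authentication:

```python
import random

# Toy BB84 sketch: random bits and bases, basis sifting, and QBER
# estimation. The channel_error parameter is an assumed stand-in for
# photon loss and noise; this is not a real QKD implementation.

def bb84(n_bits=10_000, channel_error=0.03):
    alice_bits  = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.choice("XZ") for _ in range(n_bits)]
    bob_bases   = [random.choice("XZ") for _ in range(n_bits)]
    sifted, errors = 0, 0
    for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases):
        if ab != bb:
            continue                                  # mismatched basis: discard
        sifted += 1
        received = bit ^ (random.random() < channel_error)  # channel bit flip
        errors += (received != bit)
    return sifted, errors / sifted

sifted, qber = bb84()
print(f"sifted key: {sifted} bits (~50% of sent), QBER ~ {qber:.3f}")
```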
Synergizing 5G and Artificial Intelligence: Catalyzing the Evolution of Industry 4.0
Background: The marriage of 5G and Artificial Intelligence (AI) has been put forward as a key enabler of Industry 4.0 and smart city applications. These technologies address latency, scalability, and energy use, supporting real-time decision-making and efficient organization of work. Nevertheless, studies of their individual and combined effects across industrial and urban contexts remain limited. Objective: The objective of this research is to assess the performance, energy savings, and scalability of 5G-AI synergies in manufacturing, logistics, healthcare, and smart city applications, and to highlight their challenges and potential for further exploration. Methods: Experimental data collection, mathematical modeling, and comparative analysis were employed. Performance indicators including latency, theoretical and actual throughput, power usage, and prediction accuracy were measured in real pilot tests implemented in dense networks and IoT contexts. The data were compared with similar studies to contextualize the results. Results: Combining 5G and AI optimized processes markedly: latency decreased by more than 90%, predictive maintenance was sharpened, and power consumption was reduced by up to 75%. Scalability and system reliability were confirmed in dense IoT environments, with further potential for emission reduction. Conclusion: The study shows that combining 5G and AI in Industry 4.0 addresses dynamic issues, though scalability and security remain potential drawbacks. Further studies should examine novel hybrid architectures and 6G integration across more extensive areas.
The Role of Edge Computing in Enhancing IoT Performance in 2025
Background: The growth in connected devices and the extent of Internet of Things (IoT) integration have created new needs, including big-data management, real-time responsiveness, efficient bandwidth utilization, and security. Due to intrinsic latency, network load, and scalability issues, standard cloud computing models do not satisfy these requirements. In response, edge computing analyzes data closer to its source, leading to performance gains. Objective: This article explores the impact of incorporating edge computing into IoT systems, specifically for latency minimization, bandwidth utilization, security, processing capability, flexibility of expansion, and data reliability. Methods: A combined computational model was used to mimic edge and cloud platforms. Performance metrics were evaluated under three primary IoT scenarios: smart city traffic management, industrial applications, and healthcare management. Regression models and confidence intervals provided general support for the findings. Results: The findings showed edge computing to be a more effective substitute for cloud-based systems, with latency reduced by 82% and data bandwidth usage by 65-68%. Perennial threats, including data interception, were cut by 50-66%, while processing was 73% more efficient. Criteria such as scalability and data consistency also supported the application of edge computing for resilience in larger IoT environments. Conclusion: Edge computing helps overcome the limitations of cloud-based IoT systems and is therefore imperative for real-time, secure, and scalable IoT. Future work should consider hybrid edge-cloud models, self-healing schemes, and more rigorous security solutions to fine-tune its applicability.
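A first-order latency model shows why moving computation toward the data source pays off; the distances, hop counts, and processing times below are assumptions, not the paper's simulation parameters:

```python
# First-order round-trip latency model contrasting a distant cloud path
# with a nearby edge path. All figures are illustrative assumptions.

C_FIBER_KM_PER_MS = 200          # ~2/3 of c: light speed in optical fiber

def round_trip_ms(distance_km: float, processing_ms: float, hops: int,
                  per_hop_ms: float = 0.5) -> float:
    propagation = 2 * distance_km / C_FIBER_KM_PER_MS   # there and back
    return propagation + hops * per_hop_ms + processing_ms

cloud = round_trip_ms(distance_km=1500, processing_ms=5, hops=12)
edge  = round_trip_ms(distance_km=10,   processing_ms=5, hops=2)
print(f"cloud ~ {cloud:.1f} ms, edge ~ {edge:.1f} ms, "
      f"reduction ~ {(1 - edge / cloud) * 100:.0f}%")   # ~77% under these assumptions
```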
The Role of Software-Defined Networking (SDN) in Modern Telecommunications
Background: Software-Defined Networking (SDN) is widely considered a paradigm shift in today’s telecommunications, introducing centralized control, programmable interfaces, and dynamic resource configuration. SDN-enabled networks, whether reached through single-hop or multi-hop communication, still face persistent challenges in scalability, security, energy consumption, and Quality of Service (QoS). Objective: The article compares SDN-enabled networks with legacy networks on established parameters such as scalability, security, power consumption, traffic control, and path finding. The research aims to fill existing gaps by employing state-of-the-art methods and to offer useful recommendations for SDN implementation. Methods: Both simulation and analytical modeling were used to evaluate the proposed SDN architectures under different loads. Congestion control was based on a neural network, optimization involved multiple objectives, and security was assessed via game theory. Statistical significance analyses further supported the performance enhancements observed. Results: The results show 44% improved latency, 33% better energy consumption, and better load balancing in the SDN-enabled network. Neural network-based mechanisms rerouted traffic successfully 95% of the time under low traffic conditions, while the distributed controller strategy provided high scalability and security. Conclusion: This study points to the capacity of SDN to revolutionize contemporary telecommunications with strong techniques for comprehensive problems. Future work should validate these results in operational conditions and incorporate emerging technologies into the system hierarchy to improve flexibility and operating characteristics.
Advancing Sustainability in IT by Transitioning to Zero-Carbon Data Centers
Cyber threats are changing constantly: more than 560,000 new malware variants are launched daily, so rudimentary network protection measures are of little help in handling real-time threats. Single static security controls and manual intervention are insufficient against APTs, zero-day exploits, and high-volume DDoS attacks. This is where AI in network security lays its foundation: real-time threat response programs become possible, trained to automatically identify, categorize, and mitigate highly complex attacks without requiring massive amounts of time and effort. This work examines the changing role of AI in network security and its contribution to improved threat detection, decreased response time, and reduced reliance on human factors. The research reviews more than 150 AI-based security frameworks and 25 case studies across industries including finance, healthcare, and telecommunications, to assess the efficiency of machine learning and deep learning algorithms for autonomous threat response. The insights show that in challenging contexts, AI-based solutions provide anomaly detection scores of up to 97%, far higher than the average scores of 80% obtained by conventional systems. Response time improved by up to 75%, with AI systems responding in under 3 seconds during large-scale cyberattack simulations. Significant scalability was achieved across networks of more than ten thousand nodes, at 90% reliability in different threat scenarios. These findings underscore the importance of AI as the cornerstone of today’s cybersecurity, delivering accurate and timely threat coverage and demonstrating high resilience to threat evolution. However, issues such as algorithmic bias, ethical concerns, and weakness to adversarial perturbation call for research into effective measures for the longevity of AI-integrated security systems. This study emphasizes the importance of searching for new strategies to strengthen digital environments against the increasing number of threats.
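A small unsupervised anomaly-detection sketch in the spirit of the AI-driven detection discussed above, using scikit-learn's IsolationForest on synthetic flow features standing in for real network telemetry:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Unsupervised anomaly detection on synthetic traffic features; the flow
# statistics are made up, standing in for real network telemetry.

rng = np.random.default_rng(42)
normal = rng.normal(loc=[500, 0.2], scale=[50, 0.05], size=(950, 2))   # bytes/s, error rate
attack = rng.normal(loc=[5000, 0.8], scale=[500, 0.1], size=(50, 2))   # volumetric burst
flows = np.vstack([normal, attack])

model = IsolationForest(contamination=0.05, random_state=0).fit(flows)
labels = model.predict(flows)          # -1 = anomaly, 1 = normal
print(f"flagged {np.sum(labels == -1)} of {len(flows)} flows as anomalous")
```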
Artificial Intelligence and Machine Learning in Telecommunications: Revolutionizing Customer Experience and Enhancing Service Delivery
Background: The telecommunications industry is at a crossroads of change precipitated by the use of Artificial Intelligence (AI) and Machine Learning (ML). These technologies have yielded new capabilities such as network automation, prescriptive analytics, and contextual consumer engagement, solving traditional dilemmas in service delivery and operations. Objective: The article seeks to understand how AI and ML have affected customer experience and service provision in the telecommunications industry. The research focuses on improving KPIs such as service latency, network reliability, and customer retention, while also establishing the problems associated with large-scale big-data implementation. Methods: Evidence was gathered through systematic reviews of the current literature, meta-analysis of case studies, and assessment of industry datasets. The focus was AI-enabled operations such as dynamic resource management, real-time customer sentiment analysis, and real-time fault detection. Regression analysis and time series models were used to measure performance indices. Results: AI and ML integration led to multifaceted advancements: average service latency decreased by 55%, network downtime fell by 70%, and maintenance prediction accuracy increased by 35%. Customer retention improved by 25%, credited to better personalization of services and proper service management. AI-equipped resource allocation also raised bandwidth utilization efficiency by 60%. Conclusion: AI and ML are positively disrupting telecommunications, delivering remarkable enhancements in service quality and client satisfaction. Despite the challenges in data governance and interoperability, their adoption promises a great chance to raise current standards in the telecommunications field and lay the basis for a more sophisticated environment.
Drone-Based Network Coverage Expansion in 6G Networks
Background: The emergence of 6G networks requires new approaches to extend coverage, increase network availability, and optimize performance in difficult conditions, in both urban and rural areas. UAV systems have developed into a powerful candidate for countering these problems by offering on-demand contingent coverage and varied communication services. Objective: The study examines how UAVs can extend network coverage in high-density 6G environments, with attention to energy efficiency, latency, and inter-user interference. Methods: A three-layered optimization architecture was devised, including multi-agent reinforcement learning (MARL) for interference control, trajectory optimization techniques, and energy-aware deployment schemes. Small-scale scenarios covering urban, suburban, and rural environments were considered, and results were analyzed for network coverage, energy efficiency, end-to-end latency, and interference encountered by UAVs. Results: The outcome revealed significant enhancements in spatial network coverage: UAVs closed considerable gaps and improved coverage in rural and suburban regions. Achievements include up to 30.5% energy efficiency enhancement, more than 50% latency reduction, and interference management that enabled a 35.4% SINR improvement. Conclusion: Integrating drones into 6G networks is invaluable for enhancing coverage, providing massive reach along with scalable solutions to coverage gaps, power demands, and real-time network adjustment. Future studies should channel efforts toward increasing real-time adaptability and reducing energy consumption to suit large-scale deployments.
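SINR, the interference metric reported above, follows from received powers under a path-loss model; a sketch with free-space path loss and illustrative geometry, not the study's scenario parameters:

```python
import math

# SINR for a ground user served by one UAV amid co-channel interferers,
# using free-space path loss. Distances, powers, and the carrier
# frequency are illustrative assumptions.

def fspl_db(distance_m: float, freq_hz: float = 3.5e9) -> float:
    """Free-space path loss in dB (distance in metres, frequency in Hz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

def rx_dbm(tx_dbm: float, distance_m: float) -> float:
    return tx_dbm - fspl_db(distance_m)

def sinr_db(serving_m, interferer_m_list, tx_dbm=30, noise_dbm=-95):
    signal = 10 ** (rx_dbm(tx_dbm, serving_m) / 10)                 # mW
    interf = sum(10 ** (rx_dbm(tx_dbm, d) / 10) for d in interferer_m_list)
    noise = 10 ** (noise_dbm / 10)
    return 10 * math.log10(signal / (interf + noise))

print(f"SINR ~ {sinr_db(120, [600, 900]):.1f} dB")   # ~12 dB for this geometry
```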
Drones as Mobile 5G Base Stations: Expanding Coverage in Remote Areas
Background: The rapid development of fifth-generation (5G) networks highlights challenges in extending coverage to remote and underserved areas due to infrastructure limitations and cost constraints. UAVs (drones) equipped with 5G base stations emerge as an innovative solution to this problem. Objective: This study aims to analyze the potential of drones as mobile 5G base stations to enhance connectivity in remote regions, addressing challenges like optimal deployment, energy efficiency, and user coverage. Methods: The research utilizes algorithms like Particle Swarm Optimization (PSO) and Grey Wolf Optimization (GWO) for placement and energy management of drone-based 5G stations. Simulation models were employed to test these algorithms, with key metrics including coverage efficiency and energy consumption. Results: The study shows that drone-based stations can significantly improve coverage in remote areas, achieving up to 95% user coverage with optimized algorithms. Tethered drones and advanced energy management strategies were instrumental in enhancing endurance. Conclusion: Drones as mobile 5G base stations present a feasible and scalable approach to bridging the digital divide in remote regions. However, energy and regulatory challenges remain critical areas for future research.
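A compact PSO sketch for the placement problem, positioning one aerial base station to cover the most users within a fixed radius; the paper's exact objective, constraints, and the GWO variant are not reproduced:

```python
import random

# Particle Swarm Optimization for a simplified drone-placement problem:
# maximize the number of users inside an assumed coverage radius. The
# user layout, radius, and PSO hyperparameters are all illustrative.

random.seed(7)
USERS = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(200)]
RADIUS = 2.5   # km, assumed coverage footprint

def covered(pos):
    return sum((u[0] - pos[0])**2 + (u[1] - pos[1])**2 <= RADIUS**2 for u in USERS)

def pso(n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    pts = [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pts]
    pscore = [covered(p) for p in pts]
    g = max(range(n_particles), key=lambda i: pscore[i])
    gbest, gscore = pbest[g][:], pscore[g]
    for _ in range(iters):
        for i, p in enumerate(pts):
            for d in (0, 1):                       # update each coordinate
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d] + c1 * r1 * (pbest[i][d] - p[d])
                             + c2 * r2 * (gbest[d] - p[d]))
                p[d] = min(10, max(0, p[d] + vel[i][d]))   # stay in the area
            s = covered(p)
            if s > pscore[i]:                      # update personal best
                pbest[i], pscore[i] = p[:], s
                if s > gscore:                     # and the global best
                    gbest, gscore = p[:], s
    return gbest, gscore

pos, score = pso()
print(f"best site ({pos[0]:.2f}, {pos[1]:.2f}) covers {score}/{len(USERS)} users")
```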
Drones for Disaster Recovery: Rapid Deployment of Communication Networks
Background: UAV-assisted communication networks have emerged as vital tools for disaster recovery, offering rapid deployment and scalability in dynamic environments. However, challenges such as regulatory compliance, data security, energy efficiency, and real-time adaptability limit their widespread implementation. Objective: This study aims to develop a multi-objective optimization framework for UAV-assisted networks that enhances coverage efficiency, reduces latency, and optimizes energy consumption while addressing regulatory and data security challenges. Methods: The proposed framework integrates k-means clustering, genetic algorithms, and real-time adaptation mechanisms. Key metrics (coverage, latency, energy efficiency, and regulatory compliance) were evaluated across urban, suburban, and rural disaster scenarios. Dynamic geofencing, end-to-end encryption, and anomaly detection were incorporated to ensure compliance and secure operations. Results: The framework achieved significant improvements: coverage efficiency increased by 8%, latency reduced by 43%, and battery life extended by 33%. Regulatory compliance rose from 75% to 95%, and data security was enhanced with a 50% improvement in threat detection. The framework demonstrated robust scalability, maintaining high performance across diverse user densities. Conclusion: The study presents a scalable and adaptable UAV-assisted communication framework that addresses operational, regulatory, and security challenges. Its results validate its potential for real-world disaster recovery, paving the way for further innovations in this critical domain.
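The k-means step of such a framework can be sketched by clustering stranded users and hovering one UAV over each centroid; user positions are synthetic, and the genetic-algorithm refinement and adaptation layers are omitted:

```python
import numpy as np
from sklearn.cluster import KMeans

# k-means placement sketch: cluster user positions and station one UAV
# at each centroid. User coordinates are synthetic; the paper's full
# framework (GA refinement, geofencing, encryption) is not reproduced.

rng = np.random.default_rng(3)
users = np.vstack([rng.normal(c, 0.8, size=(60, 2))      # three impact zones
                   for c in [(2, 2), (7, 3), (5, 8)]])

k = 3                                                    # UAVs available
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(users)
for uav, centroid in enumerate(km.cluster_centers_):
    served = int(np.sum(km.labels_ == uav))
    print(f"UAV {uav}: hover at ({centroid[0]:.2f}, {centroid[1]:.2f}), "
          f"serving {served} users")
```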
Green Telecommunications: Innovations in Energy-Efficient Networking
Background: Telecommunication systems are major contributors to the fast growth of energy demand and carbon dioxide emissions. As sustainability becomes part of corporate goals, green telecommunications strive to bring innovation in energy efficiency. Objective: The article examines state-of-the-art developments in energy-efficient networking technologies and approaches to minimizing power consumption in telecommunication facilities, highlighting the important global task of using green telecommunications for sustainable development goals. Methods: A literature review and analysis were performed to examine the use of advanced hardware technologies, SDN, NFV, and intelligent renewable energy integration. Some of the green telecommunications solutions that have been implemented are explained with case studies in this article. Results: The studies reveal that new practices, including energy-aware algorithms, state-of-the-art cooling solutions, and the integration of renewable power into telecommunications networks, have improved energy efficiency standards. In addition, SDN and NFV improve resource allocation in data centers, which further boosts energy efficiency. Conclusion: Green telecommunications offer practical strategies for cutting back energy use in the telecom sector. Environmental impacts can be mitigated by incorporating energy efficiency measures and renewable energy technology into utility services without compromising quality of service, thereby catalyzing progress toward sustainability.
Low-Latency Communication with Drone-Assisted 5G Networks
Background: UAVs actively interfacing with 5G networks have become the new frontier for tackling problems of latency, energy efficiency, interference, and resource management. Although prior research has explained the benefits of UAV-integrated networks, overall assessment across parameters and cases is still scarce. Objective: The article seeks to assess the performance of UAV-integrated 5G networks in terms of latency, power, signal quality, task coordination, and coverage optimization, and to ascertain the efficiency of optimization algorithms in improving the integrated network. Methods: Emulations were done on MATLAB and NS-3 platforms in urban, suburban, and emergency-call settings. Latency, power consumption, SINR, and completion time were the chosen performance indicators. The optimization algorithms Particle Swarm Optimization (PSO), Genetic Algorithm (GA), and the Multi-Objective Evolutionary Algorithm (MOEA) were evaluated in terms of convergence time and solution quality. Results: UAV-aided networks showed 36.7% and 29.2% improvements in latency and energy consumption, respectively, and a 33.6% enhancement in SINR. MOEA offered the best results with 98.3% solution quality, while PSO converged fastest. Minor deviations between simulated and real results highlight the need for adaptive mechanisms. Conclusion: The results highlight the strong potential of UAV-assisted 5G networks and their influence on performance across different criteria. Further research should focus on implementing and deploying the proposed solutions and broadening the study context to include 6G technologies.
Quantum Key Distribution Protocols for Enhancing Cryptographic Resilience in Next-Generation 5G Network Infrastructures
Background: Quantum computing poses a profound threat to classical cryptographic systems as it advances at an exponential rate; quantum algorithms such as Shor’s and Grover’s can break the Rivest–Shamir–Adleman (RSA) and Elliptic Curve Cryptography (ECC) algorithms. The pressing requirement for cryptographic frameworks that can withstand quantum attacks has inspired Quantum Key Distribution (QKD), Post-Quantum Cryptography (PQC), and systems that combine both. Objective: The aim of this article is to review the performance, scalability, and integration of quantum-secure cryptographic services, with a practical lens on real-time environments such as self-driving cars, industrial IoT, and intelligent health systems. It also establishes the drawbacks of current models and directions for further enhancement. Methods: The study employs simulative experimentation to understand exposure to quantum algorithms, and rates cryptographic systems on standards such as latency, Quantum Bit Error Rate (QBER), computational overhead, scalability, and cost. A comparative assessment provides an integrated analysis of QKD, PQC, and hybrid systems, identifying the advantages and disadvantages of each. Results: Hybrid systems provided the best or comparable median results, with the lowest latency in real-time applications (~45 ms or lower) when combined with Multi-Access Edge Computing (MEC) architectures and security elements, at high scalability. QKD, while exceptional in security, has scalability problems, while PQC achieved average results on the given parameters. Conclusion: Hybrid cryptographic systems deal adequately with quantum threats, as this study has shown. Future work may distribute resources more effectively, expedite PQC standardization, and embrace artificially intelligent network frameworks for flexibility and expansiveness across different networks.
Revolutionizing Telecom Latency with Edge Computing and 5G
Background: The growth of telecommunications, especially with the emergence of 5G, has created a requirement for low-latency solutions. Current cloud computing models possess architectural constraints that prevent real-time service delivery, which is critical in applications such as autonomous vehicles and augmented reality. Objective: This article reviews how edge computing can be combined with 5G networks to overcome the latency issues in today’s telecommunication systems. It examines how this combination can cut latency by processing data closer to the end consumer, and its potential to disrupt several industries. Methods: The research uses a literature review of current information on 5G and edge computing systems, architectures, practices, and theoretical frameworks. The findings are based on case analyses of existing solutions for implementing edge computing within the 5G environment. Results: The analysis shows that applications such as self-driving cars and industrial robotics experienced 40 to 70% reduced latency. Edge computing also improves resource management in telecommunications, since it offloads many computing tasks from the cloud to localized edge nodes. Conclusion: Combining edge computing with 5G networking provides a distinctive model for addressing latency problems while enhancing the network and boosting industry development. Future research should explore ways of improving the efficiency of resource allocation to meet operators’ needs and examine scalability issues.
The Future of Airborne Networks Through Integrating Drones into Next-Gen Telecom
Background: Unmanned aerial vehicle (UAV) networks, an important part of modern telecommunications, are gaining importance in various situations, including rural coverage, urban settings, and emergencies. However, interference, scalability, and energy efficiency still pose problems for the advancement of wireless networks. Objective: The aim of the current study is to integrate and assess an adaptive frequency management technique for improving the performance of UAV communication networks in terms of interference, transmission rates, and reliability across different deployment settings. Methods: Experimental and simulated studies were performed to evaluate the effectiveness of the algorithm. Performance measurements of latency, throughput, packet loss, energy consumption, and signal strength were made under rural, urban, and emergency conditions. The adaptive algorithm selected operating frequencies depending on the interference present and the recorded network performance parameters. Results: The algorithm showed distinct enhancements across all models and positions, decreasing latency by 20.5%, enhancing throughput by 14.5%, and decreasing packet loss by 57.6% in urban settings. Further experiments documented improved energy efficiency and communication reliability in rural and emergency situations. Conclusion: The proposed adaptive frequency management algorithm effectively solves issues of critical concern in UAV networks while offering robust scalability for next-generation telecommunications infrastructure. Future research should combine the proposed optimization with complementary approaches and test the developed system under more severe real-world conditions.
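The adaptive idea (scan candidate channels and switch to the least-interfered one, with hysteresis so the link does not flap) can be sketched briefly; the channels, readings, and thresholds are assumptions, not the authors' algorithm:

```python
import random

# Minimal adaptive-frequency sketch: periodically measure interference on
# candidate channels and switch only when another channel is clearly
# better. All frequencies, levels, and the margin are illustrative.

CHANNELS = [2.4e9, 5.8e9, 28e9]          # candidate operating frequencies (Hz)

def measure_interference(ch):
    """Stand-in for a real interference/RSSI scan on channel `ch` (dBm)."""
    base = {2.4e9: -60, 5.8e9: -75, 28e9: -85}[ch]
    return base + random.gauss(0, 5)     # noisy measurement

def pick_channel(current, margin_db=6):
    readings = {ch: measure_interference(ch) for ch in CHANNELS}
    best = min(readings, key=readings.get)      # lowest power = least interference
    # hysteresis: switch only if clearly better than the current channel
    if readings[best] < readings[current] - margin_db:
        return best
    return current

ch = CHANNELS[0]
for t in range(5):
    ch = pick_channel(ch)
    print(f"t={t}: operating at {ch / 1e9:.1f} GHz")
```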
The Integration of Drones and IoT in Smart City Networks(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Smart city solutions have recently ramped up the use of drones combined with Internet of Things (IoT) technologies to improve urban systems. Drones carrying IoT sensors and communicating over real-time ad hoc networks show great potential for traffic monitoring, environmental management, disaster management, and similar tasks. Nevertheless, energy consumption, network density, the number of nodes a network can accommodate, and collisions between signals transmitted by different nodes remain essential impediments to the wide application of such wireless sensor networks (WSNs). Objective: The article proposes and assesses algorithms for operating drone-IoT systems while dealing with energy efficiency, real-time data communication, mid-air collision avoidance, and the growing number of systems in crowded urban areas. Methods: This study uses a two-algorithm approach adapted from prior work. The first algorithm controls drone speed and position, ensuring that the separation between drones stays above a safe minimum. The second algorithm centers on energy reduction, minimizing energy usage through real-time path planning. The effectiveness of these algorithms was determined using simulation models with respect to metrics including latency, energy consumption, and scalability. Results: The proposed system improved energy efficiency, reduced collisions, and scaled well for drone management. The experiments indicate that the system generalizes to different urban situations and remains stable under changing traffic conditions. Conclusion: The article presents a scalable and efficient solution for extending drone applications to smart cities using IoT platforms. The results can serve as a theoretical and experimental base for further investigation of city management and infrastructure trends.
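To make the first algorithm's idea concrete, the toy rule below scales a drone's speed down as its nearest neighbour approaches a safety threshold. The threshold and the control rule are assumptions for illustration, not the algorithm adopted in the study.

```python
import math

# Toy separation rule: keep pairwise drone distance above a safety
# threshold by scaling speed down inside a buffer zone. The threshold
# and the linear scaling are illustrative assumptions.

MIN_SEPARATION_M = 10.0

def safe_speed(own_pos, own_speed, neighbours):
    nearest = min(math.dist(own_pos, n) for n in neighbours)
    if nearest <= MIN_SEPARATION_M:
        return 0.0                      # stop: separation already violated
    if nearest < 2 * MIN_SEPARATION_M:  # buffer zone: slow down smoothly
        return own_speed * (nearest - MIN_SEPARATION_M) / MIN_SEPARATION_M
    return own_speed

# Nearest neighbour at 15 m is inside the buffer, so speed halves.
print(safe_speed((0, 0, 30), 12.0, [(15, 0, 30), (80, 40, 30)]))  # 6.0
```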
The Role of UAVs in Enhancing Network Resilience During Natural Disasters(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Communication failures commonly occur during natural disasters, underscoring the importance of flexible, resilient networks. Unmanned Aerial Vehicles (UAVs) can address this problem by acting as mobile network nodes in disaster-stricken regions. However, barriers including UAV control coordination, resource allocation, and the security of collected data still hold the technology back. Objective: The article designs and analyzes enhanced heuristics for employing UAVs in disaster communications to improve performance, availability, and security. Methods: Data were collected from primary sources (semi-structured interviews, surveys, and post-disaster reports) and from secondary computational analysis based on MATLAB and NS-3 simulations. Five algorithms were implemented and integrated: Multi-UAV Coordination, Dynamic Resource Allocation with Security, Hybrid Communication Framework, AI-Driven Path Optimization, and Privacy-Preserving Data Sharing. Theoretical models built on multi-objective optimization and game theory confirmed the framework's ability to scale. Results: The introduced algorithms increased coverage by 75%, decreased latency by 27%, and improved energy efficiency by 30%. Average privacy compliance remained above 90%, and an advanced resource allocation model achieved equitable distribution. These enhancements were confirmed in urban, rural, and mountainous regions, demonstrating versatility and stability. Conclusion: The article proposes a framework for UAV-enabled disaster communication that incorporates advanced algorithms and theoretical models to overcome the coordination challenge while ensuring efficiency and security. The presented results provide a reliable basis for UAV use in disaster situations.
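As a flavor of the coverage side of multi-UAV coordination, the sketch below greedily places a small assumed fleet so that each UAV covers the most still-uncovered demand points. The grid, coverage radius, and fleet size are illustrative assumptions, not the paper's model.

```python
import itertools

# Greedy UAV placement for disaster coverage: at each step, place the
# next UAV where it covers the most uncovered demand points.
# Grid, radius, and fleet size are assumed for illustration.

DEMAND = [(x, y) for x, y in itertools.product(range(0, 100, 10), repeat=2)]
RADIUS2 = 25 ** 2   # coverage radius squared (assumed 25 m)
FLEET = 5

def covered(site, points):
    return {p for p in points
            if (p[0] - site[0]) ** 2 + (p[1] - site[1]) ** 2 <= RADIUS2}

uncovered = set(DEMAND)
placements = []
for _ in range(FLEET):
    best = max(DEMAND, key=lambda s: len(covered(s, uncovered)))
    placements.append(best)
    uncovered -= covered(best, uncovered)

print(placements, f"{1 - len(uncovered) / len(DEMAND):.0%} covered")
```

Greedy set-cover placement like this carries a well-known approximation guarantee, which is one reason it is a common baseline for coverage problems.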
Digital Twins in Smart Cities for Building Resilient Urban Infrastructures(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Digital twin (DT) technologies have become significant enablers of urban management, using real-time information, data analytics, and IoT connectivity to manage challenging urban issues. Nonetheless, while existing studies reveal the capacity of DTs, their generalization, flexibility, and cross-disciplinary application across varied urban environments have not been thoroughly studied. Objective: This article evaluates the effectiveness of DT technologies in improving traffic management, energy efficiency, infrastructure maintenance, and public safety across six case study cities: Singapore, Helsinki, Barcelona, Dubai, New York, and Tokyo. The study examines how DTs can be extended and implemented to target urban issues and how their operational performance might be optimized. Methods: The study used quantitative data processing, online data analysis with factorization and machine learning, and assessment of the case studies. Quantitative measures including traffic flow, energy loss, downtime, and emergency response were investigated before and after DT deployment. The reported improvements were statistically confirmed, and metrics of scalability and adaptability were evaluated across the cities. Results: DT technologies improved traffic flow by up to 42.9%, reduced energy losses by 35%, cut downtime by 42%, and improved emergency response by 44.9%. Gains were largest where IoT coverage was high and where DTs were applied to contexts that specifically needed them. Conclusion: The study shows that DTs can be implemented in different environments thanks to their flexibility in accommodating varied urban conditions. AI and cross-domain integration can further increase the effectiveness of DTs, and both are now crucial for the management of contemporary urban environments.
Coordinated Communication Networks Using Drone Swarms for Advanced Telecommunication Systems(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: The increasing demand for flexible, resilient, and high-performance telecommunication systems, especially in dynamic environments, has led to growing interest in the use of autonomous drones. Their mobility and adaptability make drone swarms a promising solution for enhancing communication networks, particularly in 6G and edge computing applications. Objective: This study explores the application of drone swarms to improve network formation, synchronization, and resilience in both urban and rural telecommunication scenarios, with an emphasis on their feasibility, robustness, and adaptability. Methods: A series of simulations were conducted using multi-agent coordination algorithms and network optimization models under varying conditions. Key performance indicators, including Packet Delivery Ratio (PDR), latency, energy efficiency, and system reliability, were evaluated across different deployment scenarios. Results: The findings indicate that drone swarms achieved a 92% PDR, a significant improvement over the 75% observed with static wireless network (WN) base stations. Additionally, average latency decreased by 35%, while energy efficiency increased by 28%. The swarm-based system maintained robust performance even with up to 20% node loss, demonstrating strong fault tolerance and adaptability. Conclusion: The study confirms the potential of drone swarms as a scalable and resilient solution to critical telecommunication challenges such as disaster response, rural connectivity, and real-time data transmission. Future work should address remaining deployment barriers, including regulatory concerns and seamless integration with existing telecommunications infrastructure.
Digital Transformation in Telecommunications from Legacy Systems to Modern Architectures(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Telecommunications has been rapidly moving from legacy systems to highly flexible modern architectures to accommodate the expanding demand for its services. This evolution is critical in providing the capacity needed for new technologies like 5G, IoT, and AI-powered applications. Objective: The study reviews the literature on the evolution from largely obsolete telecommunication structures to next-generation digital architectures, the drivers and enabling technologies of this change, and the value the evolution adds. Methods: The literature review was followed by an examination of industry case studies of 50 telecommunications firms across the globe. The study examined practices including network resource utilization, operational cost, and service delivery effectiveness before and after the implementation of technologies like software-defined networking (SDN), network function virtualization (NFV), and cloud-native architectural strategies. Results: The analyses showed that with the new architectures, network scalability was enhanced by 70%, operating costs were reduced by up to 30%, and service delivery rates improved by 40%. Nonetheless, 85% of the firms that implemented the upgrade faced system integration issues, which took fifteen months on average to resolve before the new system was fully incorporated, and the firms incurred an additional 20% in implementation costs to accommodate them. Conclusion: Moving telecommunication architectures toward a digital landscape improves performance, capacity, and affordability, allowing providers to support next-generation applications. However, the transition carries a number of risks that organizations must manage in order to obtain the maximum benefit from new digital technologies.
Edge AI for Transforming Autonomous Systems and Telecommunications for Enhanced Efficiency and Responsiveness(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Implementing Edge Artificial Intelligence (Edge AI) in autonomous systems and telecommunications can offer improved real-time data processing, reduced latency, and enhanced operational proficiency. Empirical research suggests that Edge AI minimizes latency by 70%, enhances computing speed by 50%, and cuts bandwidth consumption by 30% in the most demanding cases. Objective: The purpose of this article is to investigate how Edge AI can serve as an enabling technology for the future of self-sustaining environments such as autonomous mobility and telecommunications, in terms of measured utility and differentiation. Methods: Screening 120 refereed articles and 25 case studies on Edge AI applications in telecoms and autonomous systems, this systematic review looked for patterns in recent research and promising agendas. The review encompassed work on latency minimization, bandwidth enhancement, and processing capacity. It focused on application areas such as self-driving cars, industrial IoT, and smart city platforms, and analyzed performance in each. Results: The review indicates that when employed in autonomous systems, Edge AI improves decision-making reaction time by 40-60%, while enhancing data traffic throughput within telecommunications networks by 35%. Further, Edge AI lowers overall energy consumption in IoT-based applications, cutting average usage by a quarter and supporting more sustainable networks. Conclusion: Edge AI is becoming a central tool in the development of self-driving cars and telecommunications, increasing performance and the ability to handle massive amounts of data at low latency. These developments place Edge AI at the foundation of future intelligent systems and smarter, more responsive technological landscapes.
Emerging Trends in IT Governance for Addressing the Complexities and Challenges of 2025(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: As digital transformation accelerates globally, effective IT governance has become critical for organizational success. With global spending on IT governance and risk management projected to reach $16 billion by 2025, emerging technologies such as artificial intelligence (AI), blockchain, and cloud computing are introducing new governance complexities that demand adaptive strategies. Objective: The article explores the key factors and anticipated trends in IT governance that are expected to shape organizational management by 2025. The aim is to understand how evolving technological landscapes influence governance models and risk management practices. Methods: A qualitative methodology was adopted, involving a systematic review of 100 scholarly and industry articles focused on recent trends and future directions in IT governance. The analysis highlights issues related to risk management, regulatory compliance, cybersecurity, and technology integration. Results: The review revealed that 83% of organizations reported significant governance challenges due to technological disruption, while 68% indicated a transition toward decentralized governance models, particularly within blockchain-based systems. Additionally, AI-powered decision-making tools are projected to be adopted by over 70% of large enterprises for IT governance functions by 2025. Conclusion: The findings underscore the growing need for flexible and adaptive IT governance frameworks that align with both agile and traditional business objectives. By anticipating and addressing future risks and compliance demands, organizations can enhance their current governance strategies to remain resilient and competitive in the digital era.
Leveraging AI for Predictive Maintenance to Minimize Downtime in Telecommunications Networks(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Telecommunications networks are exposed to numerous equipment issues that cause network outages, which prove very expensive. Basic maintenance methodologies such as reactive or even scheduled preventive maintenance cannot keep up with the growing complexity of telecom companies' facilities. Objective: The article examines how AI is applied to support predictive maintenance so that telecommunication networks can perform as intended with reduced downtime. Methods: A review of existing AI algorithms is presented, focusing on ML models and deep learning methods. Network operations and maintenance logs are analyzed to assess the predictive capabilities of the AI models. The article identifies and analyzes quantifiable parameters such as failure-prediction accuracy and response-time reduction. Results: AI-based predictive maintenance reduced equipment failure incidents and the time lost to unscheduled outages. Through improved network performance, responses to potential problems were quicker than before, and services became more reliable and less expensive to offer. Conclusion: To reduce network outages, lower network vulnerability, and maximize the efficiency of telecommunications operations, AI-based predictive maintenance is a promising approach. As technology advances, newer AI algorithms will provide greater predictive strength and tighter incorporation into telecommunications systems.
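The article does not publish its models, so the sketch below is only a generic illustration of the failure-prediction idea: a random-forest classifier trained on synthetic equipment telemetry. The feature set, the synthetic risk model, and all thresholds are assumptions.

```python
# Generic failure-prediction sketch on synthetic telemetry.
# Features and the risk model are illustrative assumptions,
# not the models reviewed in the article.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 2000
# Synthetic telemetry: temperature (C), error count, hours since service.
X = np.column_stack([rng.normal(45, 8, n),
                     rng.poisson(2, n),
                     rng.uniform(0, 5000, n)])
# Failures become likelier with heat, errors, and service age.
risk = 0.02 * (X[:, 0] - 45) + 0.15 * X[:, 1] + 0.0004 * X[:, 2]
y = (risk + rng.normal(0, 0.4, n) > 1.2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))
```

In practice the same pattern applies to real maintenance logs: engineer features from telemetry, label past failures, and report precision and recall on held-out data.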
Optimizing Telecommunications Network Performance through Big Data Analytics: A Comprehensive Evaluation(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: The telecommunications industry is witnessing unparalleled growth in data traffic along with growing network complexity. As operators seek to keep networks highly available, employing Big Data Analytics (BDA) has become almost compulsory for improved quality of service and operational performance. Objective: The study provides a systematic review of BDA deployment for enhancing the primary performance indicators of telecommunications networks, including latency, throughput, and network dependability. Methods: The research combined quantitative analyses of key network performance parameters with qualitative results from case studies conducted with major telecommunications operators. Data were collected from multiple networks and analyzed with machine learning to predict potential performance issues. Results: The study demonstrates that BDA can reduce latency by up to 40%. In addition, throughput rose by an average of 30%, and predictive analytics led to a 25% reduction in network downtime, improving reliability and user satisfaction. Conclusion: The findings highlight the importance of Big Data Analytics for the telecommunications industry, showing that proper integration can bring tangible improvements to existing networks. Rising data traffic and increasingly sophisticated network requirements will continue to drive the need for innovative analytical technologies.
A Pathway to Ultra-Fast Data Transmission for Next-Generation Networks through Terahertz Communication in 6G(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: As the demand for ultra-fast, low-latency communication continues to rise, Terahertz (THz) communication has emerged as a promising candidate for enabling next-generation 6G networks. However, environmental sensitivity and hardware challenges pose significant limitations. Objective: This study investigates the potential of THz communication to support ultra-high data transfer rates in 6G networks, with a focus on the impact of environmental conditions, hardware complexity, and modulation techniques. Methods: Through simulation analysis under both optimal and adverse environmental conditions, the performance of THz communication was assessed. The study also explores emerging materials and adaptive technologies to mitigate performance degradation. Results: Under optimal conditions, THz communication demonstrated the ability to achieve data rates up to 8.5 Tbps with approximately 1 ms latency at 10 THz. However, in high humidity and non-line-of-sight (NLOS) scenarios, performance declined significantly, with the signal-to-noise ratio (SNR) dropping from 35 dB to 18 dB and the bit error rate (BER) increasing from 3×10⁻³ to 4×10⁻². Orthogonal Frequency Division Multiplexing (OFDM) outperformed Quadrature Amplitude Modulation (QAM) in BER under varying conditions. The integration of advanced materials such as graphene and photonic crystals, along with intelligent reflecting surfaces (IRS), showed promise in enhancing signal quality and thermal management. Conclusion: While THz communication exhibits strong potential for supporting the high-speed, low-latency demands of 6G, environmental vulnerabilities and hardware complexity remain key challenges. Future research should prioritize the development of cost-effective, scalable materials and adaptive technologies to improve performance and deployment feasibility in diverse conditions.
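The reported SNR drop can be put in perspective with the Shannon limit. In the sketch below the 50 GHz channel bandwidth is an assumed figure (the abstract does not state one); it shows how falling from 35 dB to 18 dB roughly halves the theoretical capacity ceiling of a single channel.

```python
import math

# Shannon-capacity check on the reported SNR drop (35 dB -> 18 dB).
# The 50 GHz channel bandwidth is an assumed value for illustration.

def capacity_gbps(bandwidth_hz, snr_db):
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e9

B = 50e9  # assumed 50 GHz channel
for snr_db in (35, 18):
    print(f"SNR {snr_db} dB -> {capacity_gbps(B, snr_db):.0f} Gbps ceiling")
```

This back-of-the-envelope check illustrates why humidity and NLOS conditions, which erode SNR, translate so directly into lost throughput at THz frequencies.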
Advancements in Open RAN and the Decentralization of Telecom Networks(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: The article explores the possibilities of Open Radio Access Network (Open RAN) as a revolutionary idea for democratizing telecom networks. Objective: The study compares the efficiency, cost, flexibility, scalability, and performance of Open RAN against conventional RAN systems. Methods: The study used simulation, cost modeling, and real-world case studies with support from Rakuten Mobile, Vodafone, Telefónica, MTN, and DISH Network. The approach also employed prescriptive analytics to evaluate the deployment of relatively new paradigms like blockchain and AI in Open RAN environments. Results: The study shows that Open RAN leads to substantial CAPEX and OPEX savings, together with improvements in key network performance metrics such as latency by 20% and throughput by 25%. An additional 30% improvement demonstrates that Open RAN is also an environmentally friendly solution. The validations also showed how it could expand to both heavily populated large cities and sparsely populated rural areas, improving both coverage and mobility. Conclusion: Some disadvantages surfaced, however, including compatibility problems, high initial implementation costs, and compliance with regulatory standards; these underscore the need for standardized, coherent protocols and frameworks to enable widespread implementation. Open RAN is highly transformative in modern telecommunications because it is affordable, expandable, and eco-friendly. Its flexible, modular design, combined with advanced technologies, makes it a key enabler for future networks such as 5G and 6G and addresses global connectivity and efficiency problems.
Advancing Global Connectivity Through Low Earth Orbit Satellite Systems(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Satellite systems in Low Earth Orbit (LEO) are an innovative way to connect isolated parts of the world that are left out of the digital society, as they provide almost instantaneous, high-speed connections. Despite the progress made, barriers to deployment remain, including limited scalability, high costs, traffic management, and environmental vulnerability. Objective: The aim of the present study is to enhance overall throughput while keeping LEO satellite networks operating reliably across various applications, using effective traffic control and adaptive routing techniques and taking costs and other factors into account. Methods: The study combined quantitative and qualitative research, using both theoretical analysis and simulations of LEO satellite networks. Traffic engineering was done using software-defined networking (SDN), while bio-inspired routing, including bee colony optimization algorithms, was evaluated for adaptive routing. Parameters like latency, throughput, packet loss, and costs, which change with conditions such as atmospheric interference, were also considered. Results: The results showed that latency can be cut by up to 60% and packet loss by up to 90%, with operating expenses reduced by more than 85% and resource utilization above 85%. Improved routing techniques increased transmission reliability under dynamic network loads, and simulations validated the environmental suitability of LEO networks. Conclusion: The article offers a coherent framework for the appropriate design of LEO satellite networks and discusses their ability to address the digital divide and guarantee economical, effective, and highly accessible network access globally.
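To give a flavor of bee-colony-style routing, the toy sketch below lets "scout" walks sample paths through a small assumed topology, then "recruits" a path with probability proportional to its fitness (inverse latency). It is a cartoon of the approach, not the study's algorithm; the topology and latencies are assumptions.

```python
import random

# Toy bee-colony-flavored path selection over an assumed LEO topology.
LINKS = {  # inter-node latency in ms (illustrative)
    ("A", "B"): 12, ("B", "D"): 15, ("A", "C"): 20,
    ("C", "D"): 9,  ("B", "C"): 6,
}
GRAPH = {}
for (u, v), ms in LINKS.items():
    GRAPH.setdefault(u, []).append((v, ms))
    GRAPH.setdefault(v, []).append((u, ms))

def scout(src, dst, max_hops=5):
    """Random loop-free walk from src toward dst (a 'scout bee')."""
    path, cost, node = [src], 0, src
    while node != dst and len(path) <= max_hops:
        choices = [(v, ms) for v, ms in GRAPH[node] if v not in path]
        if not choices:
            return None, float("inf")
        node, ms = random.choice(choices)
        path.append(node)
        cost += ms
    return (path, cost) if node == dst else (None, float("inf"))

found = [p for p in (scout("A", "D") for _ in range(200)) if p[0]]
weights = [1 / cost for _, cost in found]       # fitness = inverse latency
path, cost = random.choices(found, weights)[0]  # probabilistic recruitment
print(path, f"{cost} ms")
```

The probabilistic recruitment is what lets such schemes adapt: when a link degrades, its paths lose fitness and scouts gradually shift traffic elsewhere.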
AI-Driven Drones for Real-Time Network Performance Monitoring(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: The growing complexity of telecommunications networks, fueled by advancements like the Internet of Things (IoT) and 5G, necessitates dynamic and real-time network performance monitoring. Traditional static systems often fail to address challenges related to scalability, adaptability, and response speed in high-demand environments. Integrating artificial intelligence (AI) with unmanned aerial vehicles (UAVs) presents a transformative approach to overcoming these limitations. Objective: This study aims to evaluate the effectiveness of AI-driven drones for real-time network performance monitoring, focusing on key metrics such as latency, signal strength, throughput, and anomaly detection. Methods: A comprehensive framework was developed, employing reinforcement learning (RL) for path planning and a hybrid temporal-spectral anomaly detection (HTS-AD) algorithm. Experimental validation was conducted using 10 UAVs across simulated and real-world environments, collecting over 3.2 million data points. Statistical analyses, including MANOVA and Bayesian regression, were used to evaluate performance. Results: The proposed system demonstrated significant improvements over traditional methods, including a 24.6% increase in anomaly detection accuracy, a 30% reduction in energy consumption, and 99.9% network coverage in high-density UAV deployments. Conclusion: AI-driven drones offer a scalable, efficient, and reliable solution for network monitoring. By addressing limitations of traditional systems, this study establishes a foundation for next-generation telecommunications infrastructure. Future research should focus on real-world deployment and hybrid security models.
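The abstract does not publish the internals of HTS-AD, so the sketch below shows only a generic stand-in for the anomaly-detection step: a rolling z-score detector that flags latency samples deviating sharply from a recent window. Window size and threshold are assumptions.

```python
import numpy as np

# Generic rolling z-score anomaly detector on a latency series,
# a stand-in for (not an implementation of) the paper's HTS-AD.

def anomalies(series, window=30, z_thresh=3.0):
    series = np.asarray(series, dtype=float)
    flags = []
    for t in range(window, len(series)):
        hist = series[t - window:t]
        mu, sigma = hist.mean(), hist.std()
        if sigma > 0 and abs(series[t] - mu) / sigma > z_thresh:
            flags.append(t)
    return flags

rng = np.random.default_rng(1)
latency = rng.normal(20, 2, 300)
latency[150] = 45                 # injected spike
print(anomalies(latency))         # the spike at index 150 is flagged
```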
Drone-Assisted Network Maintenance as a Revolution in Telecom Infrastructure(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Telecommunication infrastructure requires regular maintenance and upkeep of its network assets, but existing approaches suffer from high time consumption, cost concerns, and safety hazards. Newer developments in drone technology offer an opportunity to improve current maintenance processes through automation, prediction, and real-time computation. Objective: The article assesses whether the use of drones in telecommunication maintenance enhances operational productivity in terms of efficiency, cost, safety, environmental impact, and scalability across different terrains. Methods: The methods included experimental surveys of drone operations in five different telecommunication settings. The areas of interest were inspection efficiency, the accuracy of condition-based maintenance, received signal power, delay reduction through edge computing, and energy consumption. Numerical methods such as Kalman filters, together with various edge computing frameworks, were used to draw analytical insights from the collected data. Results: Drone-based methods lowered inspection time by three quarters, cut expenses by 49.3%, and increased safety and coverage quality. Predictive maintenance achieved 89.7% accuracy, with a system response time of 246 ms across different sites. The energy consumption model showed errors under 2%, confirming the approach's suitability for operational planning. Conclusion: By evaluating the applicability of drones to telecom maintenance, the paper shows that drones are promising in this context both now and in the future. The results point to incorporating drone technology into infrastructure management solutions to address emerging needs in the industry.
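As an illustration of the Kalman filtering the study mentions, the minimal one-dimensional filter below smooths noisy received-signal-power readings. The process and measurement variances and the constant-level state model are assumptions, not the paper's configuration.

```python
# One-dimensional Kalman filter smoothing noisy received-signal-power
# readings. Noise variances (q, r) and the constant-level model are
# illustrative assumptions.

def kalman_1d(measurements, q=0.01, r=4.0):
    x, p = measurements[0], 1.0      # initial state estimate and variance
    estimates = [x]
    for z in measurements[1:]:
        p = p + q                    # predict: level persists, variance grows
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update with the new measurement
        p = (1 - k) * p
        estimates.append(x)
    return estimates

readings = [-72.1, -70.8, -74.3, -69.9, -73.0, -71.5]  # dBm, illustrative
print([round(e, 1) for e in kalman_1d(readings)])
```

The gain k balances trust between the prediction and the measurement, which is why the filtered series reacts to trends while damping sensor noise.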
Smart Contracts and Blockchain: Transforming Telecommunications Contracts(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: A smart contract is a self-executing contract that runs on distributed ledger technology (blockchain) and has attracted much attention as a promising means of improving efficiency, accountability, and reliability in telecommunications and related sectors. However, scalability issues, recurrent resource inefficiencies, and threats posed by emerging quantum computing technologies hinder their broad use and effectiveness. Solving these problems is crucial to the further development of blockchain systems and to their ongoing stability in complex contexts. Objective: Toward this goal, the current study proposes a comprehensive blockchain framework that incorporates computational intelligence techniques and quantum-safe cryptography to address scalability, security, and efficiency issues. The research aims to solve practical problems and identify potential applications of blockchain in telecommunications and other fields. Methods: An evidence-based approach was adopted, including detailed literature reviews, qualitative expert interviews, and simulation studies. Experimental conditions covered latency, throughput, energy, and scalability factors. Telecommunications providers engaged in pilot tests to determine the practical usability of the system. Results: The proposed system achieved substantial improvements: 75% in scalability, 25% in latency, and effective quantum-resistant cryptography. Energy efficiency gains were estimated at 40%, while field implementations demonstrated the system's versatility in environments as different as cities and deserts. Conclusion: These findings support the proposition that blockchain systems can transform telecommunications. By addressing these critical limitations, the work provides a basis for further development that keeps blockchain technology secure, scalable, and sustainable in the quantum era.
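For readers unfamiliar with the underlying data structure, the toy sketch below shows the hash chaining that gives a ledger its tamper evidence. It is a teaching example only, not the framework proposed in the article, which layers computational intelligence and quantum-safe cryptography on top of such a ledger; the record fields are made up for illustration.

```python
import hashlib, json, time

# Toy hash-chained ledger: each block commits to the previous block's
# hash, so altering any block breaks the chain downstream.

def make_block(data, prev_hash):
    block = {"time": time.time(), "data": data, "prev": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify(chain):
    for prev, blk in zip(chain, chain[1:]):
        if blk["prev"] != prev["hash"]:
            return False
    return True

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block({"subscriber": "illustrative", "plan": "5G-basic"},
                        chain[-1]["hash"]))
print(verify(chain))          # True
chain[0]["hash"] = "tampered"
print(verify(chain))          # False: the link to block 1 no longer matches
```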
The Future of Optical Fiber Networks for Speeding Up the Internet of Tomorrow(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: The availability of advanced digital technology and the evolving need for high-speed, low-latency connections have put pressure on existing optical fiber networks. New technologies such as Wavelength Division Multiplexing (WDM), Photonic Integrated Circuits (PICs), Mode Division Multiplexing (MDM), and Quantum Communication will be valuable in meeting these demands. Objective: The study examines the capacity, scalability, and cost-effectiveness of current and emerging optical fiber systems for the development of future Internet technology. The research also assesses how these architectures can improve data transmission rates, network response time, and the security and efficiency of network solutions. Methods: This is a mixed-methods study in which experimental and computational data were collected and analyzed alongside theoretical insight. The compared results included transmission rate, spectral efficiency, signal integrity, and lifecycle costs. Specific work was done on multi-band WDM, PIC-based systems, and optical QKD, along with simulation studies of large, scalable multi-core and mode-division architectures. Results: The results show improved network capability, with throughput per watt increased by 300% in multi-band WDM and latency reduced by employing edge computing. The tested PIC-based systems proved more efficient than comparable existing systems, and quantum communication proved a reliable method for transmitting data over short to medium distances. Conclusion: Advanced optical fiber technologies are of great value for building high-speed, high-bandwidth, secure Internet connections. Their integration can address future connectivity issues, but further development is required to overcome deployment and sustainability barriers.
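For a rough sense of scale, total WDM capacity is the product of channel count, symbol rate, bits per symbol, and polarizations. The figures below are typical assumed values for a C-band system, chosen for illustration rather than taken from the study.

```python
# Back-of-the-envelope WDM capacity:
# total rate = channels x symbol rate x bits/symbol x polarizations.
# All figures are assumed, typical values for illustration.

channels      = 96   # C-band DWDM channels at 50 GHz spacing
symbol_gbaud  = 64   # symbol rate per channel
bits_per_sym  = 4    # 16-QAM
polarizations = 2

total_tbps = channels * symbol_gbaud * bits_per_sym * polarizations / 1000
print(f"{total_tbps:.1f} Tbps per fiber")   # 49.2 Tbps
```

Multi-band, multi-core, and mode-division designs scale this product further by adding spectral bands and spatial paths, which is exactly the direction the reviewed technologies pursue.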
Trends and Challenges of Autonomous Drones in Enabling Resilient Telecommunication Networks(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Advances in resilient telecommunication networks have shown that autonomous drones can support connectivity in unpredictable and complex terrains. Current network infrastructures have limitations in delivering optimized service under traffic congestion, in sparsely covered areas, and during disasters, which calls for innovation. Objective: The article proposes a framework for using autonomous drones in practical telecommunication systems, with emphasis on the energy consumption, scalability, dependability, and flexibility of the solution in various situations. Methods: The study applies state-of-the-art approaches such as trajectory optimization, swarm coordination, dynamic spectrum management, and machine-learning-based resource allocation. Various simulation setups covering urban, rural, and disaster-prone scenarios were used to assess performance indices including energy input, network connectivity, signal strength, and lag time. The simulation results were supported by field experiments providing insights into various circumstances. Results: Simulations of the proposed framework show network scalability enhancements, with coverage of up to 50 km² and power savings above 15%. Performance improvements included near-perfect trajectory anticipation at a rate of 98%, while resource utilization was also optimized. Dynamic spectrum management was useful in reducing interference and increasing efficiency, especially in high-density areas. Conclusion: The article demonstrates UAV-based telecommunication networks in which challenging questions of scalability and reliability are raised and addressed. The work provides strong theoretical and empirical foundations for concepts that will solidify next-generation communication networks.
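Dynamic spectrum management can be illustrated with a greedy channel assignment over an interference graph: drones within interference range must not share a channel. The graph and channel pool below are assumed for illustration; the framework's actual method is not published in the abstract.

```python
# Greedy channel assignment for dynamic spectrum management.
# Drones connected in the interference graph must use different channels.
# The graph and channel pool are illustrative assumptions.

INTERFERES = {  # adjacency: drones within interference range
    "d1": {"d2", "d3"}, "d2": {"d1", "d3"}, "d3": {"d1", "d2", "d4"},
    "d4": {"d3"},
}
CHANNELS = [1, 2, 3]

assignment = {}
# Assign the most-constrained (highest-degree) drones first.
for drone in sorted(INTERFERES, key=lambda d: -len(INTERFERES[d])):
    taken = {assignment[n] for n in INTERFERES[drone] if n in assignment}
    assignment[drone] = next(c for c in CHANNELS if c not in taken)

print(assignment)  # {'d3': 1, 'd1': 2, 'd2': 3, 'd4': 2}
```

Note how d4 safely reuses channel 2: spatial reuse of spectrum is precisely what raises efficiency in high-density deployments.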
5G Deployment in Rural Areas: Advancing Connectivity and Bridging the Digital Divide(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Digital literacy, education, and the right to Internet access are hampered in rural areas, leaving rural populations behind in technological advancement and limiting their chances of socioeconomic development. Deploying new 5G technologies offers a fresh perspective on eliminating such disparities through improved connectivity, lower latency, and higher data transfer rates. This research established that adopting 5G for rural coverage is challenged by technical, economic, and policy factors. Objective: The article analyzes the technical, economic, and social aspects of 5G technology in the rural context in order to identify problems and develop appropriate solutions for infrastructure costs, spectrum availability, and consumer adoption. In pursuing this goal, the research seeks to develop practical recommendations for enhancing deployment strategies to increase Internet access where it is scarce. Methods: The study used an exploratory, theoretical-evaluative, and finally empirical approach involving quantitative modeling, case studies, and key informant interviews. Latency, data throughput, and coverage, the most significant factors, were tested using network simulation. Cost levels were used to determine economic feasibility, while qualitative data were obtained from a survey of policymakers and telecom operators and focus group discussions with rural community leaders. Results: Implementing 5G led to a 75% reduction in latency, a 600% improvement in data throughput, and a 300% increase in coverage area. The findings revealed that partnership-based deployment models were the most effective, yielding a 58% ROI and lower infrastructure costs. Conclusion: The study reaffirms the value of 5G innovation for rural areas and its benefits for agriculture, healthcare services, and educational systems. Sound policy, well-targeted investment, optimal management of available spectrum, and community engagement should be key focuses for deployment to be equitable.
Artificial Intelligence in Network Security with Autonomous Threat Response Systems(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: As cyber threats continue to advance, traditional network security systems offer diminishing returns to organizations. AI has proven useful for improving network security because it proactively identifies and responds to threats in a short time. Objective: This article discusses the role of AI self-defending mechanisms in autonomous network security, examining their effectiveness in threat detection, response time, and limiting the harm cyber criminals can cause to networks. Methods: Three separate studies were conducted, analytically comparing conventional security systems with the AI-driven system across 100 different network environments. Machine learning (ML), deep learning (DL), and other forms of AI were applied to identify and counteract distinct threats such as viruses, phishing, and DDoS attacks. Detection accuracy, response time, and the ability to mitigate attacks were among the factors examined. Results: Automated threat intelligence systems achieved 92% detection accuracy, while legacy systems achieved only 78%. Mean response time decreased by 65%, from 45 seconds to 15 seconds. A significant increase in attack mitigation rates was noted, with the AI programs averting 85% of threats within the first 30 seconds of identification. Conclusion: Autonomous, AI-based threat response systems function as a markedly superior replacement for conventional network security structures, minimizing threat response time and boosting overall threat neutralization. Incorporating such mechanisms into contemporary security landscapes is important for countering new forms of cyber threats.
Cybersecurity in the Age of Quantum Computing: New Challenges and Solutions(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Today's mobile networks, specifically 5G, require strong security because of the emerging risks that accompany the growing deployment of network infrastructure. The vulnerability of conventional cryptographic methods to quantum computing breakthroughs makes it necessary to develop quantum-resistant solutions. Objective: The article analyzes Quantum Key Distribution (QKD) protocols for improving cryptographic performance in 5G networking environments, with emphasis on incorporating QKD into 5G network designs. Methods: The study performed a systematic literature review and an evaluation of current QKD deployments, as well as a qualitative assessment of data from 20 key informant interviews on QKD in telecommunications and 15 technical reports. Latency and key-generation-rate experiments were conducted with relay mechanisms, including both trusted and untrusted optical fiber and wireless relay links, and integration issues were explored using simulations over emulated fiber and wireless networks. Results: The outcomes emphasize that QKD brings radically enhanced key security together with low delay and high rate within integrated 5G architectures. Hybrid relay-based QKD increased key generation rates by 23% compared with previous techniques. Concerns remain about internationally agreed standards, including inconsistencies between the standards used in different countries and the high costs of implementing them. Conclusion: QKD implementation increases the cryptographic protection of 5G networks and makes infrastructure resistant to quantum-age threats. Wider adoption requires additional standardization and a reduction in cost.
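For readers unfamiliar with QKD, the sketch below simulates the sifting step of BB84, one of the canonical QKD protocols. The abstract does not name which protocols were evaluated, so this is background illustration only; no channel noise or eavesdropper model is included.

```python
import secrets

# Toy BB84 sifting: sender and receiver pick random bases, and only
# bits measured in matching bases are kept for the shared key.

n = 16
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 = +, 1 = x
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# When bases match, Bob reads Alice's bit; otherwise his result is random.
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

sifted_key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
              if ab == bb]
print("sifted key:", sifted_key)  # ~half of the raw bits survive sifting
```

Because roughly half the raw bits are discarded at sifting (before error correction and privacy amplification), raw photon rates matter, which is why the key-generation-rate gains reported above are significant.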
Adaptive AI-Driven Network Slicing in 6G for Smart Cities: Enhancing Resource Management and Efficiency(مقاله علمی وزارت علوم)
حوزههای تخصصی:
Background: Smart city evolution is fast-paced and imposes severe demands on telecom infrastructure, which must be highly flexible and scalable to cope with bursty traffic loads and heterogeneous service needs. Legacy network systems are not well suited to the changing requirements of smart city environments with autonomous cars, IoT, and public safety systems. Objective: The study offers an AI-native network slicing framework for 6G smart city networks in order to improve dynamic resource control and management. The framework aims to enhance the delay, energy, and resource performance metrics that are significant for smart city services. Methods: To facilitate real-time network resource orchestration in response to changing traffic requirements and user preferences, the authors apply adaptive artificial intelligence based on a Deep Reinforcement Learning (DRL) model. Simulations were carried out to compare the AI-native model to conventional and AI-supported slicing methods. Results: Simulation results validate that the AI-native network slicing framework outperforms current 5G solutions, with a 25% reduction in latency and a 20% increase in energy efficiency. Furthermore, the model's online resource allocation scheme improves bandwidth and energy utilization efficiency by 15% compared with traditional approaches. Such improvements are especially important for critical applications like traffic management, emergency response, and healthcare. Conclusion: The presented results demonstrate that AI-native network slicing is a viable, flexible, and scalable solution for 6G smart city networks. The framework is designed to support the future sustainable and high-performance requirements of urban infrastructures, providing energy-efficient, real-time adaptability. The study provides an end-to-end outlook on managing sophisticated resource systems and positions AI-native network slicing as a foundation for emerging smart cities.
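A full DRL slicer is beyond a short example, but the sketch below conveys the learning loop in radically simplified form: an epsilon-greedy agent learns which of three assumed bandwidth splits best matches a hidden traffic mix. The action set, reward proxy, and demand values are all illustrative assumptions, far simpler than the paper's DRL model.

```python
import random

# Epsilon-greedy stand-in for learned slice allocation: pick a bandwidth
# split across three slices (IoT, vehicles, public safety) and learn from
# a reward that penalizes under-provisioned slices.

ACTIONS = [(0.5, 0.3, 0.2), (0.3, 0.5, 0.2), (0.2, 0.3, 0.5)]
DEMAND = (0.25, 0.45, 0.30)  # hidden "true" traffic mix (assumed)

def reward(split):
    # Deficit of each slice acts as a proxy for latency violations.
    return -sum(max(0.0, d - s) for s, d in zip(split, DEMAND))

q = [0.0] * len(ACTIONS)
counts = [0] * len(ACTIONS)
for step in range(2000):
    explore = random.random() < 0.1
    a = random.randrange(len(ACTIONS)) if explore \
        else max(range(len(ACTIONS)), key=lambda i: q[i])
    r = reward(ACTIONS[a]) + random.gauss(0, 0.01)  # noisy observation
    counts[a] += 1
    q[a] += (r - q[a]) / counts[a]  # incremental mean update

best = max(range(len(ACTIONS)), key=lambda i: q[i])
print("learned split:", ACTIONS[best])  # converges to (0.3, 0.5, 0.2)
```

A real DRL slicer replaces the fixed action table with a neural policy over continuous allocations and a richer state (queue lengths, per-slice SLAs), but the explore-observe-update loop is the same.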