Mohammed Isam Al-Hiyali


Articles

Showing 1 to 2 of 2 items.
1.

Leveraging AI for Predictive Maintenance with Minimizing Downtime in Telecommunications Networks (Ministry of Science accredited scientific article)

Keywords: Predictive maintenance, artificial intelligence (AI), Machine Learning, Telecom Networks, Downtime Reduction, Network Reliability, deep learning, Failure Prediction, Operational Efficiency, Network Optimization

Views: 29 · Downloads: 46
Background: Telecommunications networks suffer frequent equipment problems that cause costly outages. Traditional maintenance approaches, whether reactive or scheduled preventive maintenance, cannot keep pace with the growing scale and complexity of telecom operators' facilities. Objective: The article examines how AI can support predictive maintenance so that telecommunications networks perform as intended with reduced downtime. Methods: Existing AI algorithms are reviewed, with a focus on machine learning models and deep learning methods. Data from network operations and maintenance logs are analyzed to assess the predictive capabilities of the AI models, using quantifiable parameters such as failure-prediction accuracy and reduction in response time. Results: AI-driven predictive maintenance reduced equipment failure incidents and the overall time lost to unscheduled outages. Improved network monitoring enabled faster responses to potential faults, making services more reliable and less expensive to deliver. Conclusion: AI-based predictive maintenance is a promising way to reduce network outages, lower network vulnerability, and maximize the efficiency of telecommunications operations. As the technology advances, newer generations of AI algorithms will offer stronger predictive power and tighter integration into telecommunications systems.
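The kind of log-based failure prediction the abstract describes can be illustrated with a toy sketch. Everything here is an illustrative assumption (the field names, the trailing-window heuristic, the threshold), not the models actually evaluated in the article, which uses ML and deep-learning methods rather than a fixed rule:

```python
# Toy sketch of predictive maintenance on maintenance-log data.
# The rolling-mean rule and all thresholds are illustrative
# assumptions, not the article's actual ML/DL models.

def predict_failure(error_counts, window=3, threshold=5.0):
    """Flag the first time step whose trailing-window mean error
    count exceeds `threshold`; return None if none does."""
    for t in range(window - 1, len(error_counts)):
        window_mean = sum(error_counts[t - window + 1 : t + 1]) / window
        if window_mean > threshold:
            return t
    return None

# Hourly error counts from a synthetic network-element log:
log = [1, 2, 1, 3, 4, 6, 8, 9]
print(predict_failure(log))  # -> 6 (mean of 4, 6, 8 exceeds 5.0)
```

A real system would replace the fixed rule with a trained classifier over many sensor features, but the workflow is the same: monitor a degradation signal and raise a maintenance ticket before the element fails.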
2.

Neuromorphic Computing with a Paradigm Shift in Energy-Efficient and Scalable AI Hardware for Real-Time Applications (Ministry of Science accredited scientific article)

Keywords: Neuromorphic computing, AI hardware, spiking neural networks (SNNs), brain-inspired architecture, Loihi, TrueNorth, Energy Efficiency, real-time processing, edge computing, scalable AI systems

Views: 32 · Downloads: 33
Background: Neuromorphic computing is an emerging technology built on brain-like, event-driven architectures, with the potential to power energy-constrained, latency-sensitive, and large-scale applications. Where traditional systems are inflexible in energy consumption and response time, neuromorphic platforms excel in real-time applications such as robotics, IoT, and autonomous systems. Objective: The article assesses the capabilities of neuromorphic computing platforms against conventional architectures, both quantitatively and qualitatively, in terms of energy consumption, response time, modularity, and application-specific adaptability, and identifies the drawbacks of and prospects for their further development. Methods: The study takes a comparative-analysis approach, making statistical comparisons of performance measures. Neuromorphic platforms (Intel Loihi, IBM TrueNorth) are compared with conventional hardware (NVIDIA Tesla V100, Google TPU) across applications in robotics, IoT, and especially healthcare. Data are drawn from experimental assessments and theoretical frameworks reported in prior research. Results: Neuromorphic systems showed better energy consumption, system size, and latency characteristics. However, excellent performance on particular tasks does not imply that these systems can be used for any purpose or adapted freely to farther-reaching trends such as quantum computing. Regression results demonstrate a strong relationship between these measures and the platforms' potential for real-time data processing. Conclusion: Neuromorphic computing can be regarded as a new paradigm for energy-efficient and scalable AI, and is especially promising for latency-sensitive deployments. Despite the shortcomings discussed, extending these approaches with hybrid systems and more sophisticated integration frameworks could open new opportunities and eventually establish them as a foundation for next-generation computation models.
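The spiking neural networks named in the keywords are built from event-driven units such as the leaky integrate-and-fire (LIF) neuron, which fires only when its membrane potential crosses a threshold; this sparse, event-driven activity is the source of the energy savings discussed above. The sketch below is a generic textbook LIF model with illustrative parameters, not the dynamics implemented on Loihi or TrueNorth:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, the basic unit of
# spiking neural networks. The leak factor and threshold are
# illustrative assumptions, not Loihi or TrueNorth parameters.

def lif_spikes(inputs, leak=0.9, threshold=1.0):
    """Integrate a sequence of input currents with leaky membrane
    dynamics; emit 1 and reset when the potential reaches threshold."""
    v, spikes = 0.0, []
    for i in inputs:
        v = leak * v + i          # leaky integration of input current
        if v >= threshold:
            spikes.append(1)      # spike event
            v = 0.0               # reset membrane potential
        else:
            spikes.append(0)      # no spike; potential persists
    return spikes

print(lif_spikes([0.3, 0.4, 0.5, 0.1, 0.9]))  # -> [0, 0, 1, 0, 0]
```

Because computation happens only at spike events, idle neurons consume almost no energy, which is why such hardware suits the latency- and power-constrained edge deployments the article evaluates.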
