Conference paper presentations

Paper title: Deep Learning-Based Channel Adaptive Resource Allocation in LoRaWAN.

Conference Details: The 21st IEEE International Conference on Electronics, Information, and Communication (ICEIC 2022), 6-9 February 2022, Jeju, Korea.

Paper title: Feasibility Study of the LoRaWAN Blind Adaptive Data Rate.

Conference Details: Twelfth IEEE International Conference on Ubiquitous and Future Networks (ICUFN), pp. 1-4, 17-20 August 2021, Jeju Island, Korea.

Paper title: An Improved Adaptive Data Rate for LoRaWAN Networks.

Conference Details: IEEE International Conference on Consumer Electronics - Asia (ICCE-Asia), pp. 1-4, 1-3 November 2020, Seoul, Korea.

Paper title: Scalability of LoRaWAN in an Urban Environment: A Simulation Study.

Conference Details: Eleventh IEEE International Conference on Ubiquitous and Future Networks (ICUFN), pp. 677-681, 2-5 July 2019, Zagreb, Croatia.

Paper title: Interference-Aware Spreading Factor Assignment Scheme for the Massive LoRaWAN Network.

Conference Details: IEEE International Conference on Electronics, Information, and Communication (ICEIC), pp. 1-2, 22-25 January 2019, Auckland, New Zealand.

Poster Presentations

Paper title: An Improved Adaptive Data Rate for LoRaWAN Networks.

Conference Details: 2020 IEEE International Conference on Consumer Electronics - Asia (ICCE-Asia), 1-3 November 2020, Seoul, Korea.

Ph.D. Thesis 

Thesis title: Reactive and Proactive Resource Allocation for LoRa-Enabled IoT Applications.

Summary: A long-range wide area network (LoRaWAN) is one of the leading communication technologies adopted for Internet of Things (IoT) applications. To meet application requirements, LoRaWAN employs an adaptive data rate (ADR) mechanism that assigns the communication resources, namely the spreading factor (SF) and transmit power (TP), to end devices; ADR is recommended for static IoT applications such as smart grid and metering. However, ADR cannot predict, or take evasive measures against, the massive packet loss caused by inefficient SF and TP settings in static and mobile IoT applications. This dissertation therefore addresses the packet loss issue through reactive, proactive, and hybrid resource allocation paradigms.
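The network-side ADR logic the summary refers to can be sketched roughly as follows. This is a hedged illustration modelled on the Semtech reference ADR algorithm, not code from the dissertation; the function name, margin, and power bounds are illustrative defaults:

```python
# Illustrative sketch of network-server ADR (after the Semtech reference
# algorithm). Constants are assumptions, not values from the dissertation.

# LoRa demodulation-floor SNRs (dB) per spreading factor.
SNR_REQ = {7: -7.5, 8: -10.0, 9: -12.5, 10: -15.0, 11: -17.5, 12: -20.0}
DEVICE_MARGIN_DB = 10.0   # installation margin (assumed)
TP_MIN, TP_MAX = 2, 14    # dBm, EU868-style bounds (assumed)

def adr_step(sf, tp_dbm, snr_max_db):
    """One ADR adjustment, driven by the max SNR of recent uplinks."""
    nstep = int((snr_max_db - SNR_REQ[sf] - DEVICE_MARGIN_DB) // 3)
    while nstep > 0 and sf > 7:           # surplus margin: speed up data rate
        sf -= 1
        nstep -= 1
    while nstep > 0 and tp_dbm > TP_MIN:  # then lower transmit power
        tp_dbm -= 3
        nstep -= 1
    while nstep < 0 and tp_dbm < TP_MAX:  # weak link: raise power back up
        tp_dbm += 3
        nstep += 1
    return sf, max(tp_dbm, TP_MIN)
```

For example, a device at SF12/14 dBm whose recent uplinks peaked at 0 dB SNR would be stepped to SF9, whereas a device already at SF7 with the minimum power and a deeply negative SNR would only be stepped back up in power, which is exactly the reactive, convergence-limited behaviour the dissertation targets.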

 

The most prominent of these approaches was a proactive solution, the AI-empowered resource allocation (AI-ERA) framework. First, the AI-ERA framework trained an AI model offline on a dataset of channel conditions generated with the ns-3 simulator. Second, the pre-trained AI model was deployed online to allocate resources efficiently to mobile and static IoT applications (e.g., asset tracking and smart metering). Compared with state-of-the-art approaches, simulation results showed that the proposed paradigms improve the packet success ratio, energy consumption, and convergence period. These methods are therefore well suited to IoT applications that require reliability, a short convergence period, a high packet success ratio, and ultra-low energy consumption.
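The offline-train / online-infer split described above can be illustrated with a minimal sketch. A toy nearest-centroid model stands in for the dissertation's AI model, and the labelled dataset here is synthetic (the thesis generated its dataset with ns-3); all names and thresholds beyond the standard LoRa SNR floors are assumptions:

```python
# Toy sketch of the AI-ERA two-phase idea: offline training on labelled
# channel samples, then online SF prediction. Illustrative only.
import numpy as np

# LoRa demodulation-floor SNRs (dB) per spreading factor.
SNR_FLOOR = {7: -7.5, 8: -10.0, 9: -12.5, 10: -15.0, 11: -17.5, 12: -20.0}

def label_sf(snr_db, margin_db=10.0):
    """Lowest (fastest) SF whose floor is still met with a link margin."""
    for sf in range(7, 13):
        if snr_db - margin_db >= SNR_FLOOR[sf]:
            return sf
    return 12

# --- Offline phase: build a labelled dataset (synthetic here) and
# "train" a nearest-centroid model, one SNR centroid per SF class. ---
rng = np.random.default_rng(0)
snrs = rng.uniform(-15.0, 10.0, size=2000)
labels = np.array([label_sf(s) for s in snrs])
centroids = {sf: snrs[labels == sf].mean() for sf in np.unique(labels)}

# --- Online phase: predict an SF for a fresh SNR measurement without
# waiting for the slow reactive ADR loop to converge. ---
def predict_sf(snr_db):
    return int(min(centroids, key=lambda sf: abs(centroids[sf] - snr_db)))
```

The point of the sketch is the workflow, not the model: once the centroids (or, in the thesis, a trained deep model) are fixed offline, the online step is a single cheap lookup per uplink, which is what shortens the convergence period relative to reactive ADR.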
