"Deploying On-Device AIGC Inference Services in 6G via Optimal MEC-Devi" by Changshi Zhou, Weiqi Liu et al.
 

Deploying On-Device AIGC Inference Services in 6G via Optimal MEC-Device Offloading

Document Type

Article

Publication Date

1-1-2024

Abstract

From AI-assisted art creation to large language model (LLM)-powered ChatGPT, AI-generated content (AIGC) and services are becoming a transformative force. This calls for the telecom industry to embrace the prospects of AIGC services and to address the unique challenges of incorporating generative model services into the AI-native 6G wireless network paradigm. We propose enabling AIGC inference services on mobile devices by optimizing MEC-device computing offloading, in which AIGC task latency is minimized by a reinforcement learning-based policy agent in a compute-constrained and bandwidth-limited wireless environment. Simulation results demonstrate the performance advantage.
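To illustrate the flavor of the MEC-device offloading decision described in the abstract, the sketch below uses a simple epsilon-greedy Q-learning agent that chooses, per AIGC inference task, between on-device execution and offloading to the MEC server so as to minimize latency. This is a minimal illustration, not the authors' method: the latency model, compute rates, bandwidth, and task sizes are all assumed values, and the letter's actual reinforcement learning formulation is not reproduced here.

```python
# Minimal sketch (assumptions only, not the letter's implementation):
# an epsilon-greedy Q-learning agent that decides, per AIGC inference task,
# whether to run on-device or offload to the MEC server, rewarding low latency.
import random

DEVICE_FLOPS = 2e12      # assumed on-device compute rate (FLOP/s)
MEC_FLOPS = 20e12        # assumed MEC-server compute rate (FLOP/s)
BANDWIDTH = 50e6         # assumed uplink bandwidth (bit/s)

ACTIONS = [0, 1]         # 0 = run locally, 1 = offload to MEC
q_table = {a: 0.0 for a in ACTIONS}
alpha, epsilon = 0.1, 0.1

def task_latency(action, task_flops, upload_bits):
    """Latency model: local compute vs. uplink transfer plus MEC compute."""
    if action == 0:
        return task_flops / DEVICE_FLOPS
    return upload_bits / BANDWIDTH + task_flops / MEC_FLOPS

def step(task_flops, upload_bits):
    """One offloading decision: epsilon-greedy action, reward = -latency."""
    if random.random() < epsilon:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: q_table[a])
    latency = task_latency(action, task_flops, upload_bits)
    q_table[action] += alpha * (-latency - q_table[action])
    return action, latency

# Example: a stream of AIGC inference tasks with random sizes.
for _ in range(1000):
    step(task_flops=random.uniform(1e11, 1e12),
         upload_bits=random.uniform(1e6, 1e7))
print({a: round(q, 4) for a, q in q_table.items()})
```

Under these assumed rates the agent learns to favor offloading when uplink transfer is cheap relative to the device's compute deficit, which is the qualitative trade-off the letter's policy agent optimizes under bandwidth and compute constraints.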

Identifier

85209464414 (Scopus)

Publication Title

IEEE Networking Letters

External Full Text Location

https://doi.org/10.1109/LNET.2024.3490954

e-ISSN

2576-3156

First Page

232

Last Page

236

Issue

4

Volume

6

