A Theory for Semantic Channel Coding With Many-to-one Source

Shuai Ma, Huayan Qi, Hang Li, Guangming Shi, Yong Liang, Naofal Al-Dhahir

Jan 26, 2024

I. INTRODUCTION

Under the classic information theory established by Claude Shannon in 1948, advances in communication systems have been made by exploring new spectrum utilization methods and new coding schemes. However, due to the explosive growth of intelligent services, such as augmented reality/virtual reality, holographic communication, and autonomous driving, fifth generation (5G) communication systems face many bottlenecks: channel capacity is approaching the Shannon limit, source coding efficiency is close to the limits set by the Shannon entropy and the rate-distortion function, energy consumption is high, and high-quality spectrum resources are scarce. To meet the needs of future sixth generation (6G) communications, new information representation spaces and degrees of freedom are urgently needed to improve communication efficiency and transmission capacity.

Semantic communications extract semantic features from raw data and then encode and transmit only the semantic information, which is expected to alleviate the bottlenecks faced by current communication networks. Back in 1949, Weaver proposed a three-level theory of communication:

  • Technical level: How accurately can the symbols of communication be transmitted?
  • Semantic level: How precisely do the transmitted symbols convey the desired meaning?
  • Effectiveness level: How effectively does the received meaning affect conduct in the desired way?

Essentially, semantic communications focus on the accurate transmission of semantic information, and reduce communication resource overhead, e.g., bandwidth, power consumption, or delay, by exploiting computing power. Furthermore, semantic communications transform traditional syntactic communications into content-oriented communications, and are expected to be one of the key technologies of 6G.

A. Related Works

Recently, with the great progress of artificial intelligence (AI), neural networks have become able to extract semantic information from sources such as images, text, and speech, which makes semantic communications feasible. However, many problems in semantic communications remain open; in particular, information metrics and theoretical guidelines for implementing and analyzing semantic communications are still lacking.

To measure the quantity of semantic information of a source, many works have proposed semantic entropies based on logical probability, fuzzy mathematics, language understanding models, or the complexity of query tasks. Specifically, in 1952, Carnap and Bar-Hillel used the logical probability of propositions instead of statistical probability to measure the semantic information contained in a sentence: the higher the probability that a sentence is logically true, the less semantic information it carries. The amount of semantic information has also been represented by the distance from the real event, which requires the “true” semantic event as a reference...
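
For concreteness, a minimal sketch of this logical measure in our own notation (the symbols $m(s)$, $\mathrm{cont}(s)$, and $\mathrm{inf}(s)$ follow Carnap and Bar-Hillel, not the present paper): for a sentence $s$ with logical probability $m(s)$, the content and the semantic information of $s$ are

$$\mathrm{cont}(s) = 1 - m(s), \qquad \mathrm{inf}(s) = -\log_2 m(s).$$

A tautology has $m(s) = 1$ and thus carries no semantic information, whereas a self-contradiction has $m(s) = 0$ and infinite semantic information, which is the well-known Bar-Hillel–Carnap paradox.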

Besides, in contrast to traditional communications, semantic communications can achieve accurate transmission of semantic information even with a non-zero bit error rate. However, how to establish the semantic channel capacity is still a challenging issue, and some works have tried to prove its achievability. For example, a “semantic channel capacity” was derived based on the semantic ambiguity and the logical semantic entropy of the received signal...
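
To make this concrete, what we take to be the result referenced here (due to Bao et al.; the symbols $W$, $X$, $Y$, and $\overline{H_S(Y)}$ are that line of work's, not the present paper's) states that the semantic channel capacity is

$$C_s = \sup_{P(X|W)} \left\{ I(X;Y) - H(W|X) + \overline{H_S(Y)} \right\},$$

where $W$ is the intended semantic message, $X$ and $Y$ are the channel input and output, $H(W|X)$ captures the semantic ambiguity of the encoding, and $\overline{H_S(Y)}$ is the average logical semantic entropy of the received signal.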

B. Contributions

Motivated by the above, we propose a more general definition of semantic entropy with the help of the Shannon information entropy, and prove a semantic channel coding theorem for a typical semantic communication system...

II. GENERAL DEFINITION OF SEMANTIC ENTROPY

Shannon entropy is a functional of the distribution of a random variable: it depends not on the actual states taken by the random variable, but only on their probabilities. As a result, Shannon entropy cannot be directly applied to measure semantic information. Different from Shannon information theory, semantic information has the following two characteristics...
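
As a concrete illustration of this limitation (standard notation, not the paper's): for a discrete random variable $X$ with alphabet $\mathcal{X}$ and distribution $p$,

$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x),$$

which is invariant under any relabeling of the states. A binary source emitting “fire”/“no fire” with probabilities $(0.5, 0.5)$ and a source emitting two meaningless symbols with the same probabilities have exactly the same Shannon entropy of one bit, even though their semantic content is entirely different.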