In a new review paper, scientists from the Quantum Information Science Section at Oak Ridge National Laboratory and the University of Michigan presented a survey of data compression algorithms with a focus on edge computing, which processes data at or near sensors.
Compressing data saves storage space and network bandwidth.
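As a rough illustration of the classical case (a minimal sketch of ours, not an algorithm from the review), a lossless compressor such as zlib can shrink repetitive sensor-style data considerably before it is sent over the network:

```python
import zlib

# Minimal illustration (hypothetical sensor data, not from the paper):
# repetitive timestamped readings compress well under a lossless codec.
readings = "".join(f"{t},{20.0 + (t % 7) * 0.1:.1f}\n" for t in range(1000))
raw = readings.encode("utf-8")
packed = zlib.compress(raw, level=9)

print(f"raw: {len(raw)} bytes, compressed: {len(packed)} bytes, "
      f"ratio: {len(raw) / len(packed):.1f}x")

# Lossless: decompressing recovers the original bytes exactly.
assert zlib.decompress(packed) == raw
```

Fewer bytes stored and transmitted translates directly into the bandwidth and energy savings that matter at the edge.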
Classical computing stores information in bits equal to 0 or 1.
Quantum computing stores information in qubits, which can exist in more than one state simultaneously and can carry more information than classical bits.
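In textbook notation (a standard illustration on our part, not taken from the review), a single qubit state is a superposition of the two classical bit values:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1,
```

and a register of $n$ qubits occupies a $2^n$-dimensional state space, which is why quantum states can in principle encode far more information than the same number of classical bits.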
Classical data compression is well defined; quantum compression is not. University of Michigan researcher Maryam Bagherian and her colleagues set out to identify where quantum compression stands as an emerging enabling tool for edge applications, with the aim of starting conversations on a definition and standards.
“Edge computing aims to address the challenges associated with communicating and transferring large amounts of data generated remotely to a data center in a timely and efficient manner,” they said.
“A central pillar of edge computing is local (i.e., at- or near-source) data processing capability so that data transfer to a data center for processing can be minimized.”
“Data compression at the edge is therefore a natural component of edge workflows.”
“Not all compression algorithms can accommodate the data type heterogeneity, tight processing and communication time constraints, or energy efficiency requirement characteristics of edge computing,” they added.
“We discussed specific examples of compression algorithms that are being explored in the context of edge computing.”
The authors surveyed techniques for compressing data generated by sensors in edge computing and compared classical techniques with quantum approaches, which are mostly in development.
“We provided a brief survey of emerging quantum compression techniques that are of importance in quantum information processing, including the…