Statistical models for term compression

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Summary form only given. Computing systems frequently deal with symbolic tree data structures, also known as terms in universal algebra and logic. Our goal is to develop universal, effective, and efficient term compression techniques superior to the specialized or universal compression techniques currently available. Our approach is to use knowledge of term structure to build accurate universal statistical models of terms. These models can compress terms faster or more effectively than comparable sequential methods. We present two statistical term models, related to Markov random fields over trees, that gather statistical information about parent-child symbol relationships in terms. Huffman or arithmetic codes generated from these probability estimates are used to encode the terms. In the first model, a symbol's value is predicted by the value of its parent symbol alone. Thus, in compressing a subterm of the form t+u, the + operator would be used to select a specialized code for the root symbols of subterms t and u. The second model also uses the symbol's argument position as a predictor. For example, in compressing t+u, we would make different probability estimates for the first and second arguments, using one code to encode the first children of + and another for the second children. It might be the case that t+1 (but not 1+t) occurs frequently for many different terms t, in which case we could give 1 a shorter code as the second child of +. We have not achieved our goal of improved term compression, but we believe that more sophisticated and more effective techniques remain to be investigated. For example, one improvement would be a term version of PPM in which contexts are ancestor symbols in the term rather than predecessors in a sequence; our implementation can be viewed as a first step in that direction.
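The second model described in the abstract can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: terms are represented here as `(symbol, [children])` tuples, and the helper names are invented. It collects child-symbol counts keyed by (parent symbol, argument position) and derives per-context Huffman code lengths from those counts; dropping the position key from the context yields the first (parent-only) model.

```python
import heapq
from collections import Counter, defaultdict
from itertools import count

# Assumed term representation: (symbol, [child terms]).
def collect_stats(term, stats, parent=None, pos=None):
    """Count each symbol in the context (parent symbol, argument position)."""
    sym, args = term
    stats[(parent, pos)][sym] += 1
    for i, arg in enumerate(args):
        collect_stats(arg, stats, sym, i)

def huffman_lengths(freqs):
    """Standard Huffman construction; returns symbol -> code length."""
    if len(freqs) == 1:
        return {s: 1 for s in freqs}
    tiebreak = count()  # unique ints so heap never compares the dicts
    heap = [(f, next(tiebreak), {s: 0}) for s, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Merging two subtrees deepens every symbol in them by one bit.
        merged = {s: d + 1 for s, d in {**c1, **c2}.items()}
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]

# Example term (t + 1) + (u + 1), with t, u, 1 as leaves.
term = ('+', [('+', [('t', []), ('1', [])]),
              ('+', [('u', []), ('1', [])])])
stats = defaultdict(Counter)
collect_stats(term, stats)

# One code per context: second children of '+' are coded
# separately from first children, as in the abstract's example.
lengths = {ctx: huffman_lengths(freqs) for ctx, freqs in stats.items()}
```

Because `1` occurs twice as the second child of `+` but never as the first, the context `('+', 1)` assigns it a short code, matching the t+1 example in the abstract.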
Original language: English
Title of host publication: Data Compression Conference, 2000. Proceedings. DCC 2000
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 550-
ISBN (Print): 0-7695-0592-9
DOIs
Publication status: Published - 2000

Keywords

  • Huffman codes
  • Markov processes
  • arithmetic codes
  • data compression
  • estimation theory
  • mathematical operators
  • prediction theory
  • probability
  • statistical analysis
  • tree data structures
  • Markov random fields
  • operator
  • parent-child symbol relationships
  • probability estimates
  • statistical models
  • symbol argument position
  • symbol prediction
  • symbolic tree data structures
  • term compression
  • trees
  • Algebra
  • Arithmetic
  • HTML
  • Logic functions
  • Markup languages
  • Predictive models
  • SGML
