Burrows-Wheeler Transform

From: "andrew cooke" <andrew@...>

Date: Fri, 17 Feb 2006 09:26:10 -0300 (CLST)

This is very neat.  A reversible transform that rearranges data so that
it becomes much easier to compress.

The Transform:

Take the text, of length n, and generate the n different texts given by
its rotations.  Note that, reading cyclically, the last character of
each line immediately precedes its first.  Then sort the rotations and
note the index of the "correct" (unrotated) line.

The transform is the index and the final characters of the sorted list.
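
As a sketch (mine, not from the original paper) - the naive version
builds all n rotations explicitly; real implementations use suffix
arrays instead:

  def bwt(text):
      # All n rotations; the last character of each rotation precedes
      # its first character in the (cyclic) original text.
      rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
      index = rotations.index(text)         # the "correct" unrotated line
      last_column = ''.join(r[-1] for r in rotations)  # final characters
      return index, last_column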

The Inverse:

The transform contains all the characters in the text, so the leading
character column can be reconstructed by sorting it.  This gives the
first and last character of each rotated string - a sequence of
(character, next character) pairs - which can be chained together to
give the original text (except for an arbitrary rotation, which is
corrected by the index).
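
One detail the description glosses over: when a character repeats, the
pairs are matched by occurrence order - the k-th occurrence of a
character in the transform pairs with its k-th occurrence in the sorted
first column.  With that, the chaining looks roughly like this (again
my own sketch):

  def inverse_bwt(index, last_column):
      n = len(last_column)
      # A stable sort recovers the first column and, at the same time,
      # records which last-column position pairs with each row of it.
      order = sorted(range(n), key=lambda i: last_column[i])
      first_column = [last_column[i] for i in order]
      # Chain the (character, next character) pairs, starting from the
      # row that holds the unrotated text.
      out, row = [], index
      for _ in range(n):
          out.append(first_column[row])
          row = order[row]   # step to the rotation starting one place later
      return ''.join(out)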

The compression:

The first characters of each line are sorted.  The transform character
on each line is the one that precedes that first character in the
original text.  So if character pairs are strongly correlated the
transform will be strongly ordered, with long runs of similar characters
that compress easily.
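
A tiny round trip with the sketches above - even for "banana" the
repeated letters group into runs in the transform, which is what the
back-end coder (move-to-front, run-length and Huffman in bzip2) then
exploits:

  index, transformed = bwt('banana')
  print(index, transformed)                # 3 nnbaaa - a's and n's cluster
  print(inverse_bwt(index, transformed))   # banana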


Wikipedia explanation -
http://en.wikipedia.org/wiki/Burrows-Wheeler_transform

Article - http://dogma.net/markn/articles/bwt/bwt.htm

Paper - http://citeseer.ist.psu.edu/76182.html

(This is the bzip/bzip2 compression method; note that it requires the
whole input - or at least a large block size - and that decompression is
faster than compression).

From http://pmd.sourceforge.net/cpd.html
From someone on pragprog (sorry, lost reference)

Andrew

Second Order Correlations?

From: "andrew cooke" <andrew@...>

Date: Fri, 17 Feb 2006 13:20:36 -0300 (CLST)

Seems to me that the sort on the input is arbitrary.  In the different
descriptions I've read it's lexical, but why?  You could perhaps
fine-tune the algorithm by using a different ordering - one, for
example, that keeps letters which often occur together as close as
possible.

This would increase compression because it reduces "churn" as the lead
character changes (and, to a lesser extent, for subsequent characters).

Since the sorting is necessary for the inverse, there's an additional
problem - the sorting order has to be one of:
- universal
- encoded in the file (extra length)
- inferred from the transform itself

The third option is attractive because the ordering seems to reflect
similar information to the (character, prefix) pair information in the
transform.  But off the top of my head I don't see how it helps.

On the other hand, how much would this help?  As a first approximation
perhaps it is only significant when the first character changes.  That
occurs, proportionally, less and less often in large texts (since the
alphabet is fixed).

Still, it's tempting to play around...
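
For what it's worth, the forward half of that experiment is just a
different sort key - a sketch, assuming the alphabet permutation is
supplied (the inverse, per the list above, would have to sort with the
same key):

  def bwt_with_order(text, alphabet):
      # Sort the rotations under an arbitrary permutation of the
      # alphabet instead of lexically; everything else is unchanged.
      # (Assumes every character of the text appears in 'alphabet'.)
      rank = {c: i for i, c in enumerate(alphabet)}
      key = lambda s: [rank[c] for c in s]
      rotations = [text[i:] + text[:i] for i in range(len(text))]
      rotations.sort(key=key)
      return rotations.index(text), ''.join(r[-1] for r in rotations)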
