Phillip Pearson - web + electronics notes

tech notes and web hackery from a new zealander who was vaguely useful on the web back in 2002 (see: python community server, the blogging ecosystem, the new zealand coffee review, the internet topic exchange).


Repairing MetaKit databases

WARNING: this wasn't the magic fix I hoped for, and may have contributed to a real crash a day or so later. Use at your own risk.

The Python Community Server has been a little unstable over the last few days, and it's reminding me of what happened a year or so ago when the database file got corrupted after a system crash. Basically, the server process suddenly starts using huge quantities of CPU time for no apparent reason. This time, I can just restart it and it goes back to normal, but last time it was hanging completely.

The solution was to take the MetaKit database and dump out all the data, then create another database with the same structure and fill it up with the data. Radio/Frontier calls this 'compacting', and Zope calls it 'packing', I think -- it's not exactly an uncommon thing to do. However, I don't know of any utilities that will pack MetaKit databases.

Does anybody know of such a tool?

Update: Scratch that request, I've done it myself. Wasn't hard - MetaKit lets you just pick up a view and drop it into another database. Great!

import metakit

# path to your (possibly broken) settings.dat file:
FN = 'settings.dat'
# where to save the compacted version:
NEW_FN = 'settings_2.dat'

# open the databases
s = metakit.storage(FN, 1)
new_s = metakit.storage(NEW_FN, 1)

# we'll build up a copy of the description string here, just to make
# sure we didn't do anything dumb.

total_desc = []
def process_table(desc):
    "copy a single table between the databases"
    print "copying table:", desc

    global total_desc
    total_desc.append(desc)

    # get the table from the source, and create it on the dest
    src = s.getas(desc)
    dest = new_s.getas(desc)

    # copy all rows across
    for row in src:
        dest.append(row)


def main():
    # grab full database description
    desc = s.description()

    # split it up into tables and process them one at a time
    sofar = []
    add = sofar.append
    depth = 0
    for c in desc:
        if c == '[':
            depth += 1
        elif c == ']':
            depth -= 1
        if depth == 0 and c == ',':
            # hit a top-level comma: copy the table we just finished reading
            process_table("".join(sofar))
            sofar[:] = []
        else:
            add(c)

    # the last table has no trailing comma, so catch it here
    if sofar:
        process_table("".join(sofar))

    # all done: make sure we really did process them all
    assert ",".join(total_desc) == desc, "didn't get everything!"

    # and save the compacted copy to disk
    new_s.commit()

if __name__ == '__main__':
    main()
    print "done."
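The description-splitting part of the script is pure Python, so it can be sanity-checked without opening a MetaKit database at all. Here's a standalone version of the same bracket-depth trick (the example description string is made up for illustration, not taken from a real settings.dat):

```python
def split_description(desc):
    """Split a MetaKit database description into per-table descriptions,
    ignoring commas nested inside [...] column lists."""
    tables = []
    sofar = []
    depth = 0
    for c in desc:
        if c == '[':
            depth += 1
        elif c == ']':
            depth -= 1
        if depth == 0 and c == ',':
            # top-level comma: one complete table description
            tables.append("".join(sofar))
            sofar[:] = []
        else:
            sofar.append(c)
    # the last table has no trailing comma
    if sofar:
        tables.append("".join(sofar))
    return tables

# split_description("users[name:S,age:I],posts[title:S,body:S]")
# -> ["users[name:S,age:I]", "posts[title:S,body:S]"]
```

Note that joining the results back with commas reproduces the original string, which is exactly what the assert in main() checks.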
