Beware of the ECS A740GM-M and 4+ GB of RAM

My ZFS storage server needed more RAM to give the VMs running on it a bit more elbow room. I got a pair of 4GB 240-pin RAM sticks, and they were a no-go on the ECS A740GM-M motherboard I have. It seems the BIOS needs a memory-remap option, which this board's BIOS doesn't have. Too bad the manufacturer's page says it supports up to 16GB.

Updating GCD Data

So you have loaded the Grand Comicbook Database into a local PostgreSQL instance and written some code that makes use of the data… They just did a new data dump… Now how do you update your copy of the data?

Prep the data

Do the steps in the “create mysql clean up script” and “dump data to tab separated value files” sections from before.
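The dump step leaves one tab-separated file per table in /tmp/gcd_dump. If you ever need to sanity-check or hand-build one of those files, this is roughly the text format that Postgres COPY expects; a sketch only, the helper functions here are mine and not part of the original workflow:

```python
# Sketch of COPY's text format: tab-separated columns, backslash-escaped
# tabs/newlines/backslashes, and \N as the NULL marker.

def copy_text_field(value):
    """Render one column value the way COPY's text format wants it."""
    if value is None:
        return r'\N'                      # NULL marker
    s = str(value)
    return (s.replace('\\', '\\\\')       # escape backslashes first,
             .replace('\t', '\\t')        # then tabs,
             .replace('\n', '\\n'))       # then newlines

def copy_text_line(row):
    """Join one row of column values into a single COPY-format line."""
    return '\t'.join(copy_text_field(v) for v in row)

# Hypothetical row: an id, a title, and a NULL column.
print(copy_text_line([1, 'Amazing Spider-Man', None]))
```

If a COPY fails partway through, a malformed line in one of these files (usually an unescaped tab or newline) is the first thing to look for.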

Now copy this Python script:

#!/usr/bin/env python

"""
update gcd data that's prep'ed in /tmp/gcd_dump
"""

import os, glob
import psycopg2, psycopg2.extras

# one dump file per table; the file's base name is the table name
table_names = [os.path.splitext(os.path.basename(fp))[0] for fp in glob.glob('/tmp/gcd_dump/*.txt')]

conn = psycopg2.connect("dbname='gcd' user='postgres'")
cur = conn.cursor(cursor_factory=psycopg2.extras.DictCursor)

def sql_logger(sql):
    print sql
    cur.execute(sql)

constraints = []
for ii in table_names:
    sql_logger("""
select t.constraint_name, t.table_name, t.constraint_type,
  c.table_name as c_table_name, c.column_name as c_column_name, k.column_name as k_column_name
from information_schema.table_constraints t,
  information_schema.constraint_column_usage c,
  information_schema.key_column_usage k

  where t.constraint_name = c.constraint_name
    and t.constraint_name = k.constraint_name
    and t.constraint_type = 'FOREIGN KEY'
    and c.table_name = '%s'
  """ % ii)
    for row in cur:
        constraints.append(dict(row))

sql_logger('begin')
for ii in constraints:
    sql_logger('alter table %s drop constraint %s;' % (ii['table_name'], ii['constraint_name'],))

for table_name in table_names:
    sql_logger("DELETE FROM %(table_name)s" % locals())
    sql_logger("COPY %(table_name)s FROM '/tmp/gcd_dump/%(table_name)s.txt'" % locals())

for ii in constraints:
    sql_logger("""
ALTER TABLE ONLY %(table_name)s
  ADD CONSTRAINT %(constraint_name)s
    FOREIGN KEY (%(k_column_name)s) REFERENCES %(c_table_name)s(%(c_column_name)s) DEFERRABLE INITIALLY DEFERRED;
""" % ii)

sql_logger('commit')

You’ll have to run this as the postgres user, just as before. It records the FOREIGN KEY constraints, drops them, deletes the old data, copies in the new, and recreates the constraints, all in one transaction! Eat that, MySQL.
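The recreate step at the end is plain string templating over the rows saved from information_schema. Pulled out on its own, with a made-up constraint row so you can eyeball the generated SQL before trusting it against a live database:

```python
# The recreate step is string templating over a saved constraint row.

def recreate_fk_sql(c):
    """Build the ALTER TABLE statement that restores one foreign key."""
    return (
        "ALTER TABLE ONLY %(table_name)s\n"
        "  ADD CONSTRAINT %(constraint_name)s\n"
        "    FOREIGN KEY (%(k_column_name)s) REFERENCES "
        "%(c_table_name)s(%(c_column_name)s) DEFERRABLE INITIALLY DEFERRED;" % c
    )

# Hypothetical row, shaped like what the information_schema query returns:
example = {
    'table_name': 'gcd_issue',
    'constraint_name': 'gcd_issue_series_id_fkey',
    'k_column_name': 'series_id',
    'c_table_name': 'gcd_series',
    'c_column_name': 'id',
}
print(recreate_fk_sql(example))
```

The table and constraint names above are illustrative, not from the actual GCD schema.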

#django #python

django_loader.py

I got tired of putting

import os, sys
sys.path.append(<django project parent dir>)
sys.path.append(<django project dir>)
os.environ['DJANGO_SETTINGS_MODULE']='<django project name>.settings'

at the top of all my scripts that do command-line things with my Django models. So I share with you ‘django_loader.py’. Note the use of traceback to figure out which file is importing ‘django_loader.py’.

"""
Put this in your python path.  At the top of your script put 'import
django_loader'.  This will start with the directory your file is in and
search through it and it's parent directories until it finds a file named
'settings.py'.  It will then add that directory and it's parent to your
sys.path, and set DJANGO_SETTINGS env var.
"""

import os, sys, traceback

class CouldNotFindSettings(StandardError):
    pass
def find_settings(current_dir):
    if current_dir == '/':
        raise CouldNotFindSettings
    if 'settings.py' in os.listdir(current_dir):
        return current_dir
    return find_settings(os.path.dirname(current_dir))
def load(filepath):
    django_project_dir = find_settings(os.path.dirname(filepath))
    django_project_name = os.path.basename(django_project_dir)

    sys.path.append(os.path.dirname(django_project_dir))
    sys.path.append(django_project_dir)
    os.environ['DJANGO_SETTINGS_MODULE']='%s.settings' % (django_project_name,)

current_filepath = os.path.normpath(os.path.join(os.getcwd(), traceback.extract_stack(limit=2)[0][0]))
load(current_filepath)
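If you want to watch the upward search work without a real Django project, here's a throwaway sketch (all directory names made up) that builds a fake project tree in a temp dir and runs the same walk. I've swapped the `/` root check for a parent-equals-self check so the demo also stops cleanly off Linux:

```python
# Demo of the upward settings.py search against a fake project tree.
import os
import tempfile

def find_settings(current_dir):
    """Walk up from current_dir to the directory containing settings.py."""
    parent = os.path.dirname(current_dir)
    if parent == current_dir:            # hit the filesystem root
        raise RuntimeError('could not find settings.py')
    if 'settings.py' in os.listdir(current_dir):
        return current_dir
    return find_settings(parent)

# Fake tree: <tmp>/myproject/settings.py, with a script buried two dirs down.
root = tempfile.mkdtemp()
project = os.path.join(root, 'myproject')            # hypothetical project
scripts = os.path.join(project, 'scripts', 'deep')   # where a script might live
os.makedirs(scripts)
open(os.path.join(project, 'settings.py'), 'w').close()

print(find_settings(scripts))  # the myproject directory
```

Starting from `scripts/deep`, the walk checks that directory, then `scripts`, then lands on `myproject` because it holds `settings.py`.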
#django #python

Now playing in 2010

XBox 360

  • Dead Rising - I’ve been playing this off and on since 2008. This last weekend I made a huge amount of progress, but I may have to start over because I saved when I was almost out of time for a mission. Punishing difficulty, but it’s more fun that way.
  • Dragon Age - There’s no way I’ll finish before Mass Effect 2 comes out this month, and I like the Mass Effect story more. I suspect this one will be around for a while.
  • Left 4 Dead 2 - It’s intense, but difficult to get the right people together for a good game on Expert. Clearly I need a second TV and Xbox to put in the family room.

D&D

Just finished a 3-year, once-a-week campaign. My character destroyed the universe. Sorry about that, guys.

Started a new campaign as a Binder/Bard based way too closely on Eddie Riggs from Brütal Legend. I know it’s cheap to copy, but it’s fun. Since the world he lives in looks more like a heavy metal album cover than this one does, his album covers will have folks in offices, staring at computers.

#video-games

Why Django? Why Postgres?

I use Django and Postgres at home because I use Rails and MySQL all day at work. Working with totally different solutions to the same problem keeps you fresh.