I am using sqlite3 with Python 2.5 and the pysqlite wrapper. I am trying
to copy tables from one database (in memory) to another database (a file)
using ATTACH. I found a couple of sites on the internet that show how to
do this, but the table schema is not copied.
# Attach the external db file under a fixed handle ("filedb");
# attaching creates the file if it does not already exist.
self.__curs.execute("ATTACH DATABASE '%s' AS filedb" % filename)
table_names = self.__curs.execute(
    "SELECT name FROM main.sqlite_master WHERE type='table'").fetchall()
for table_name, in table_names:
    # Copy each in-memory table to the persistent db
    self.__curs.execute("CREATE TABLE filedb.%s AS SELECT * FROM main.%s"
                        % (table_name, table_name))
self.__curs.execute("DETACH DATABASE filedb")  # Detach external db
Is there a way to copy the schema from the sqlite_master table? I know we
can use iterdump from within Python to dump to text-based SQL, but I would
prefer not to do this. Eventually my goal is to dynamically load and unload
sections of a file-based database (could be tables or rows) in and out
of memory for efficiency purposes. Any help is appreciated. Thanks in
advance.
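For what it's worth, the snippet above can be made schema-preserving by executing each table's stored CREATE statement (the sql column of sqlite_master) against the attached database before bulk-copying the rows. A minimal standalone sketch of that idea; the copy.db filename and the points table are just placeholders:

```python
import os
import sqlite3

# Start fresh so the sketch can be re-run.
if os.path.exists("copy.db"):
    os.remove("copy.db")

# A small in-memory database to copy from.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE points (x INTEGER PRIMARY KEY, y REAL)")
src.executemany("INSERT INTO points VALUES (?, ?)", [(1, 2.0), (2, 3.5)])
src.commit()

# Attach the destination file under a fixed handle; the schema name
# does not have to match the filename.
src.execute("ATTACH DATABASE 'copy.db' AS dest")

# Recreate each table from the CREATE statement stored in sqlite_master,
# so column types and constraints survive, then bulk-copy the rows.
tables = src.execute(
    "SELECT name, sql FROM main.sqlite_master "
    "WHERE type='table' AND name NOT LIKE 'sqlite_%'").fetchall()
for name, sql in tables:
    src.execute(sql.replace("CREATE TABLE ", "CREATE TABLE dest.", 1))
    src.execute("INSERT INTO dest.%s SELECT * FROM main.%s" % (name, name))
src.commit()
src.execute("DETACH DATABASE dest")
```

Note the naive string replace on the DDL only handles plain CREATE TABLE statements; quoted or unusually formatted schemas would need real parsing.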
On 08/30/2010 10:25 PM, Denis Gomes wrote:
> Eventually my goal is to dynamically load and unload
> sections of a file based database (could be tables or rows) in and out
> of memory for efficiency purposes.
What makes you believe that SQLite is inefficient on your operating system?
In any event, you can meet that same goal using two built-in pieces of
SQLite functionality: a VFS lets you do it at the file level, while
virtual tables let you do it at the table level.
RAM disks, memory mapping, and your operating system's built-in cache
manager are already designed for "efficiency".
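For readers on a newer stack than the Python 2.5 setup in the question: SQLite's online backup API copies a whole database, schema and all, in either direction between memory and disk, and Python's sqlite3 module has exposed it as Connection.backup() since Python 3.7. A rough sketch; the snapshot.db name and the t table are just placeholders:

```python
import os
import sqlite3

# Start fresh so the sketch can be re-run.
if os.path.exists("snapshot.db"):
    os.remove("snapshot.db")

# Build an in-memory database.
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE t (k TEXT PRIMARY KEY, v INTEGER)")
mem.execute("INSERT INTO t VALUES ('a', 1)")
mem.commit()

# Flush the whole database, schema included, to disk.
disk = sqlite3.connect("snapshot.db")
mem.backup(disk)

# Later, pull it back into a fresh in-memory database.
mem2 = sqlite3.connect(":memory:")
disk.backup(mem2)
disk.close()
```

This avoids hand-copying tables entirely, at the cost of moving the full database rather than selected tables or rows.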