
ddl is easy
Schema is generated from the meta model, and various utils exist to drop/create tables/constraints/sequences.
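For illustration, a minimal sketch of the kind of drop/create DDL such utils emit (table, constraint, and sequence names here are hypothetical):

    DROP TABLE accounts CASCADE;      -- CASCADE also drops dependent constraints
    DROP SEQUENCE accounts_id_seq;

    CREATE SEQUENCE accounts_id_seq;
    CREATE TABLE accounts (
        id   integer PRIMARY KEY DEFAULT nextval('accounts_id_seq'),
        name text NOT NULL
    );
    ALTER TABLE accounts ADD CONSTRAINT accounts_name_uniq UNIQUE (name);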

OTOH, I have a metric assload of data that took a couple weeks to amass - this I've been keeping in a pg_dump generated file.

What about deployment - is this still the way to go for production? It kind of looks like pg_dump to an archive file with pg_restore gives quite a bit more flexibility - but I am suspicious of binary formats.
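For concreteness, the archive-format round trip being weighed here looks roughly like this (database and table names are hypothetical; the flags are standard pg_dump/pg_restore options):

    # dump to the custom archive format (compressed, binary)
    pg_dump -Fc mydb > mydb.dump

    # inspect the archive's table of contents
    pg_restore -l mydb.dump

    # restore selectively into an existing database, e.g. just one table
    pg_restore -d mydb --table=accounts mydb.dump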




[link|http://www.blackbagops.net|Black Bag Operations Log]

[link|http://www.objectiveclips.com|Artificial Intelligence]

[link|http://www.badpage.info/seaside/html|Scrutinizer]
Re: ddl is easy
OTOH, I have a metric assload of data that took a couple weeks to amass - this I've been keeping in a pg_dump generated file.

What about deployment - is this still the way to go for production? It kind of looks like pg_dump to archive file with pg_restore gives quite a bit more flexibility - but I am suspicious of binary formats.
As noted above, I use pg_dump to dump my output to a text file. The --inserts flag tells pg_dump to create INSERT INTO ... statements for each row instead of a tab-delimited (or other) format. I prefer to have SQL statements because I can read them a lot more easily than I can read tab-delimited files. Maybe I'm just set in my ways and I need to upgrade... But I'll take the performance degradation that goes along with INSERT statements in favor of something that I *could* read more easily if I *had* to.

Importing the output of my pg_dump command into a new copy of my database is a simple \i filename.sql at the Postgres CLI.
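In other words, the whole cycle is just (database names hypothetical):

    # dump as plain SQL, one INSERT per row
    pg_dump --inserts mydb > filename.sql

    # reload into a fresh database from the psql prompt
    psql newdb
    newdb=# \i filename.sql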
-YendorMike

"They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety."
- Benjamin Franklin, 1759 Historical Review of Pennsylvania
If the issue is
accumulating a large qty of data that needs to be put back, then you could have a two-stage build.

Can you keep your data external to the database to start off with? I.e., you go through a development phase, accumulating data in the database, but then export the data to a flat file so it's available for a rebuild.

If so, then create your database without any data, and then use the COPY command to quickly bring it back in.

[link|http://www.postgresql.org/docs/8.1/interactive/sql-copy.html|http://www.postgresq...ive/sql-copy.html]

You can have the data saved in text or binary format, depending on your speed requirements. You get much better control over which tables are done when, and you can place the data physically on the server before the import (if you are doing client/server and the round trip hurts you).
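A rough sketch of both variants (paths and table name are hypothetical; note that server-side COPY ... FROM reads files as the backend user and requires superuser rights):

    -- text format: slower to load, but readable and diffable
    COPY accounts FROM '/var/lib/pgsql/staging/accounts.txt';

    -- binary format: faster to load, but opaque
    COPY accounts FROM '/var/lib/pgsql/staging/accounts.dat' WITH BINARY;

    -- client-side alternative when the file lives on the client machine
    \copy accounts from 'accounts.txt'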
     PostgreSQL backup and restore strategies - (tuberculosis) - (5)
         Re: PostgreSQL backup and restore strategies - (Yendor)
         Ditto on Mike - (admin) - (3)
             ddl is easy - (tuberculosis) - (2)
                 Re: ddl is easy - (Yendor)
                 If the issue is - (crazy)
