Data Migration from Sybase to PostgreSQL?
hp9000 asks: "I've been asked to work on a project that will involve data migration from Sybase to PostgreSQL. I'd like to know if there's any tool, similar to Oracle's Migration Workbench, or even a shell script, to move all data from Sybase tables to PostgreSQL tables, creating the tables/tablespaces/etc in the process or at least generating a script to do so. Any kind of tool, so to speak, that would allow me to perform the migration will be great. I'm not interested in the sql code migration (that will come later if necessary), only tables, indexes, etc. at this point."
Just gonna throw this out there... (Score:3, Informative)
Once the schemas are set up, you can use any number of raw copy tools out there. Borland Delphi comes with one (Data Pump), and there are several Java-based ones as well. Or you could just write a series of "INSERT INTO" SQL statements over ODBC. This assumes that you are a competent DBA and not like all those Oracle "experts" out there who can't write SQL.
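The "series of INSERT INTO statements over ODBC" idea can be sketched in a few lines of Python. This is only an illustration, not a tested migration tool: it assumes DB-API drivers on each side (e.g. pyodbc against Sybase, psycopg2 against PostgreSQL), and the table and column names are placeholders.

```python
def build_insert(table, columns):
    """Build a parameterized INSERT for the target PostgreSQL table.

    Uses the %s 'format' paramstyle (psycopg2's); switch to '?' if
    your driver uses the 'qmark' style instead.
    """
    cols = ", ".join(columns)
    params = ", ".join(["%s"] * len(columns))
    return f"INSERT INTO {table} ({cols}) VALUES ({params})"


def copy_table(src_cur, dst_cur, table, columns):
    """Stream rows from a Sybase cursor into a PostgreSQL cursor,
    letting the driver handle quoting via bound parameters."""
    src_cur.execute(f"SELECT {', '.join(columns)} FROM {table}")
    stmt = build_insert(table, columns)
    for row in src_cur:
        dst_cur.execute(stmt, row)
```

Binding parameters rather than pasting values into the SQL sidesteps the quoting and escaping problems that bite the string-concatenation approaches below.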
It's easy (Score:1)
Aaarrgh! I just woke from a bizarre dream! Seriously, the Perl DBI/DBD modules should take care of it, without the need for an intermediary "file based" stage. And I echo another poster's sentiment: exporting the schema and then reading it by hand has some significant benefits, including the ability to "tweak" it and do some "what ifs". I did this during the design of a PostgreSQL database once and, using a simple shell script, was able to build and tear down different schemas for testing very quickly.
Check out techdocs.postgresql.org (Score:1)
Scripting approach (Score:2)
select 'insert into emp values ( ''' + empname + ''', ''' + address1 + ''', ''' + city + ''' )'
from emp
This generates the insert statements, which can then be run against PostgreSQL. (Note the doubled single quotes: PostgreSQL expects string literals in single quotes and treats double-quoted strings as identifiers, and each pair of ''s inside a T-SQL literal produces one quote character in the output.)
If you have a large number of tables and columns, you could generate these queries with a small application (e.g. in Perl or whatever your favourite language is) that is driven by the Sybase catalogues (data dictionary).
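As a sketch of that catalogue-driven idea, a small Python helper can emit the generator query for each table once you have pulled the table and column names out of sysobjects/syscolumns. This assumes character columns only (the names here are just the example names from above):

```python
def generator_query(table, columns):
    """Build the T-SQL SELECT that, run against Sybase, prints one
    PostgreSQL-ready INSERT statement per row.

    Assumes all columns are character types; embedded single quotes
    in the data still need to be escaped separately.
    """
    parts = " + ''', ''' + ".join(columns)
    return (f"select 'insert into {table} values ( ''' + {parts}"
            f" + ''' )' from {table}")
```

Feeding it the catalogue contents for each table gives you one generator query per table, instead of hand-writing them all.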
Another approach is to just extract the data in flat file format, e.g. CSV, and generate the 'insert' statements using a small app. However, the first approach outlined avoids writing any apps and is fine for a not-too-complex database.
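A minimal sketch of that flat-file route, assuming a header row in each CSV file and character data only (numeric columns shouldn't be quoted, and NULL handling is left out):

```python
import csv


def sql_quote(value):
    """Render a CSV field as a PostgreSQL string literal,
    doubling any embedded single quotes."""
    return "'" + value.replace("'", "''") + "'"


def csv_to_inserts(table, fileobj):
    """Yield one INSERT statement per CSV row; the first row
    is taken as the column header."""
    reader = csv.reader(fileobj)
    columns = next(reader)
    for row in reader:
        yield (f"INSERT INTO {table} ({', '.join(columns)}) "
               f"VALUES ({', '.join(sql_quote(v) for v in row)});")
```

A name like O'Brien comes out as 'O''Brien', which is why the escaping step matters for real data.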
Whatever you do, be sure to run 'select count(*)' on each table before and after, and consider using the flat-file approach with detailed error reporting for any row that can't be transferred. Things will go wrong, but typically only on a subset of the data, so make it easy to retry just that subset.
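Those before-and-after counts are easy to compare mechanically. A sketch, assuming you've collected a {table: rowcount} mapping from each side:

```python
def mismatched_tables(source_counts, target_counts):
    """Given {table: rowcount} dicts from 'select count(*)' runs on
    Sybase and on PostgreSQL, return the tables whose counts differ,
    i.e. the subset worth re-extracting and retrying."""
    return sorted(t for t, n in source_counts.items()
                  if target_counts.get(t, 0) != n)
```

Anything this returns is a table to re-run with the error-reporting path turned on.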
Moving the Data Is Only Half The Battle (Score:3, Informative)
However, we have hundreds and hundreds (my guess is nearly 3000) of triggers and stored procedures. Migrating the data is completely useless for me unless we can somehow migrate the T-SQL too... and I strongly suspect that's not gonna happen.
So, does anyone have any thoughts on this?