sql - Running a SELECT command on a Postgres relational table containing terabytes of data


I have a relational table in Postgres that is 3 TB in size. I want to dump its contents to a CSV file. I am following this tutorial: http://www.mkyong.com/database/how-to-export-table-data-to-file-csv-postgresql/

My problem: after I specify the file to export to and run the SELECT statement, Postgres shows "Killed". Is this because the table is 3 TB in size? If yes, how should I export the data from Postgres to a file (txt, csv, etc.)? If not, how should I figure out the possible cause of the SELECT command getting killed?

"Killed" suggests you're running on a system with the out-of-memory killer (OOM killer) enabled by its memory over-commit settings. This configuration isn't recommended by the manual.

If you disable overcommit, you'll get a neater 'out of memory' error on the client instead of a SIGKILL and a server restart.
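For reference, on Linux this is controlled through sysctl; a minimal sketch, assuming a Linux host and root access (the right values depend on your RAM and swap, so treat this as a starting point rather than a prescription):

    # Disable heuristic over-commit so large allocations fail with a
    # clean 'out of memory' error instead of triggering the OOM killer.
    sysctl -w vm.overcommit_memory=2

    # Persist the setting across reboots:
    echo 'vm.overcommit_memory = 2' >> /etc/sysctl.conf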

As for the COPY: are you running COPY (SELECT ...) TO ... or COPY tablename TO ...? Try a direct COPY without a query and see if that helps.
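To make the two forms concrete, here is a sketch; big_table and the output paths are placeholders, not names from the question:

    -- Query form: the server executes the SELECT and streams its result.
    COPY (SELECT * FROM big_table)
        TO '/path/big_table.csv' WITH (FORMAT csv, HEADER);

    -- Direct form: the server reads the table itself, no query involved.
    COPY big_table TO '/path/big_table.csv' WITH (FORMAT csv, HEADER);

Note that server-side COPY writes the file with the permissions of the PostgreSQL server process; from psql you can use the \copy meta-command instead to write the file on the client side.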

When diagnosing faults, you should be looking at the PostgreSQL error logs (which will tell you more about the problem) and at the system logs, kernel logs, or dmesg output.
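On a typical Linux box, a quick check might look like this; the log path below is an assumption, since locations vary by distribution and by your logging configuration:

    # Look for OOM-killer activity in the kernel log:
    dmesg | grep -i -E 'out of memory|killed process'

    # Tail the PostgreSQL server log (path is an assumption; check your
    # log_directory / log_destination settings if it differs):
    tail -n 100 /var/log/postgresql/postgresql-*.log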

When asking questions about PostgreSQL on Stack Overflow, always include the exact server version from select version(), the exact command text/code you ran, the exact unedited text of any error messages, etc.
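For example, the version string comes straight from:

    SELECT version();

Paste its full output rather than paraphrasing it.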

