Resize a column in a PostgreSQL table without changing data: Repost

This morning I realized that one of the columns in a table I recently created is running out of space. It is varchar(10) and I want to make it varchar(20), and of course I wanted to do this without losing any of the data that has been filled in over the last couple of days. There is a standard way to change the size and type of a column using ALTER commands, but with existing data it can get tricky. So I looked for an alternative and found an excellent post on how to do this easily without disturbing the data. Since that post is almost 10 years old, I am reposting the commands here in case they get lost. For a detailed read, please visit the original blog link given below.
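For completeness, the standard ALTER approach mentioned above looks like this (the table and column names TABLE1/COL1 are placeholders, matching the examples below):

```sql
-- In recent PostgreSQL versions, widening a varchar does not rewrite
-- the table, but it still briefly takes an ACCESS EXCLUSIVE lock.
ALTER TABLE TABLE1 ALTER COLUMN COL1 TYPE varchar(20);
```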

Command to check the current size of a given column:

SELECT atttypmod FROM pg_attribute WHERE attrelid = 'TABLE1'::regclass AND attname = 'COL1';

In my case the present declared size is 10, so the query above returns 14: for varchar columns, atttypmod stores the declared length plus a 4-byte header. I want to increase it to 25, so the command to update the column is (editing the system catalogs directly is risky, so take a backup first):

UPDATE pg_attribute SET atttypmod = 25 + 4 WHERE attrelid = 'TABLE1'::regclass AND attname = 'COL1';
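To confirm the change took effect, the same catalog query can be re-run; remember the 4-byte header when reading the result (table and column names as in the example above):

```sql
-- atttypmod minus the 4-byte varchar header gives the declared length:
SELECT atttypmod - 4 AS declared_length
FROM pg_attribute
WHERE attrelid = 'TABLE1'::regclass AND attname = 'COL1';
```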

I hope this info and re-post helps.

Original blog link:

Import RDBMS table to HDFS with sqoop from postgreSQL


1. Download the PostgreSQL JDBC driver (postgresql-9.3-1102.jdbc4.jar) from jdbc.postgresql.org:


[code lang="bash"]$ wget [/code]


2. Copy the driver into Sqoop's lib directory:

[code lang="bash"]cp /home/cloudera/Desktop/postgresql-9.3-1102.jdbc4.jar /usr/lib/sqoop/lib/[/code]


3. Configure client authentication in the

[code lang="bash"]/var/lib/pgsql/data/pg_hba.conf[/code]

file: you need to allow the IP/host of the machine running Hadoop.
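A minimal example entry (the database, user, and subnet here are assumptions; use the actual address of the machine running Hadoop):

```
# TYPE  DATABASE  USER      ADDRESS         METHOD
host    Testdb    postgres  192.168.1.0/24  md5
```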

Restart PostgreSQL so the change takes effect:

[code lang="bash"]$ pg_ctl -D /var/lib/pgsql/data restart[/code]


4. Run Sqoop: open a terminal on the machine running Hadoop and type the command below.


[code lang="bash"]cloudera@cloudera-vm:/usr/lib/sqoop$ bin/sqoop import --connect jdbc:postgresql://<host>/Testdb --table employee --username postgres -P --target-dir /sqoopOut1 -m 1[/code]


Enter password:
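For readability, here is the same invocation with each flag on its own line (the host is a placeholder; the database name Testdb comes from the prerequisites below):

```bash
# Hypothetical host; adjust to your environment.
bin/sqoop import \
  --connect "jdbc:postgresql://<host>/Testdb" \
  --table employee \
  --username postgres \
  -P \
  --target-dir /sqoopOut1 \
  -m 1
# --connect    JDBC URL of the source PostgreSQL instance
# -P           prompt for the database password interactively
# --target-dir HDFS directory that will receive the imported files
# -m 1         a single mapper, so no --split-by column is needed
```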



Prerequisites:

  • A Cloudera Hadoop VM distribution, or any other machine running Hadoop.
  • A PostgreSQL installation.
  • A database Testdb and an employee table on a running PostgreSQL instance (used in step 4).


All set! Your PostgreSQL table data is now available on the HDFS of the VM's Hadoop cluster.


Enjoy hadoop learning!

PostgreSQL useful tips

I am posting some of the PostgreSQL commands which I use frequently.

Create a new database named testdb:

[code lang="bash"]
# createdb <dbname>
# createdb testdb
[/code]

Remove a PostgreSQL database:

[code lang="bash"]
# dropdb <dbname>
# dropdb testdb
[/code]

Backing up a PostgreSQL database:

[code lang="bash"]
# su - postgres
# pg_dump --blobs -Fc testdb -f /var/lib/pgsql/backups/testdb_backup
[/code]

Restoring a PostgreSQL database from a backup dump:

[code lang="bash"]
# pg_restore --dbname=testdb /var/lib/pgsql/backups/testdb_backup
[/code]

Writing query output to a file from psql:

[code lang="sql"]
# \o 'tmp/logs/query_out_dump.csv'
[/code]

After this operation, all query results will be written to that file instead of the console.
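Note that \o redirects output in whatever format psql is currently using (aligned text by default), so the file above will not necessarily be valid CSV. For real CSV, psql's \copy can export a query directly; a sketch, reusing the employee table from the Sqoop post as a stand-in:

```sql
-- \copy runs COPY on the client side, writing a proper CSV file:
\copy (SELECT * FROM employee) TO 'tmp/logs/query_out_dump.csv' WITH CSV HEADER
```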
Using the console again for query output:

[code lang="sql"]
# \o
[/code]
For more on pg_dump and pg_restore, please check the PostgreSQL documentation.