postgresql - Inserting an Array into a Postgres Database from a Ruby File


I'm trying to transfer information from a Ruby file to a Postgres database. I was able to transfer the information when I didn't have an array column, so I'm assuming the error message I'm getting is because of the array column I'm trying to add. The error message I'm getting is:

in `exec_prepared': ERROR:  missing dimension value (PG::InvalidTextRepresentation)

Here is the code I use to connect the Ruby file to the Postgres database:

require 'pg'

class Postgres
  # Create the connection instance. 'scraping' is the name of the database we are adding the information to.
  def connect
    @conn = PG.connect(:dbname => 'scraping')
  end

  # Create our venue table.
  def createVenueTable
    @conn.exec("CREATE TABLE venues (venue_number varchar(15) UNIQUE,...,img_array varchar[]);")
  end

  ...

  def prepareInsertVenueStatement
    @conn.prepare("insert_venue", "INSERT INTO venues(venue_number,...,img_array) VALUES ($1,...,$24)")
  end

  # Add a venue with the prepared statement.
  def addVenue(venue_number,...,img_array)
    @conn.exec_prepared("insert_venue", [venue_number,...,img_array])
  end
end

When I check the Postgres database, the img_array column has been created; however, I'm unable to populate it. Please help! Thank you.

I suggest using serialization to handle this, writing a string to the column rather than an actual array.

require 'pg'
require 'yaml'

class Postgres
  # Create the connection instance. 'scraping' is the name of the database we are adding the information to.
  def connect
    @conn = PG.connect(:dbname => 'scraping')
  end

  # Create our venue table.
  def createVenueTable
    # Changed img_array to varchar(8000) for storing the serialized array.
    @conn.exec("CREATE TABLE venues (venue_number varchar(15) UNIQUE,...,img_array varchar(8000));")
  end

  ...

  def prepareInsertVenueStatement
    @conn.prepare("insert_venue", "INSERT INTO venues(venue_number,...,img_array) VALUES ($1,...,$24)")
  end

  # Add a venue with the prepared statement.
  def addVenue(venue_number,...,img_array)
    @conn.exec_prepared("insert_venue", [venue_number,...,serialized(img_array)])
  end

  # Serialize the object.
  def serialized(obj)
    YAML.dump(obj)
  end

  # Deserialize the object.
  def deserialized(obj)
    YAML.load(obj)
  end
end

An abstracted usage example to show the serialization:

a = [1,2,3,4,5]
serialized = YAML.dump(a)
#=> "---\n- 1\n- 2\n- 3\n- 4\n- 5\n"
YAML.load(serialized)
#=> [1,2,3,4,5]

# Also works on Hash objects
h = {name: "image", type: "jpeg", data: [1,2,3,4,5]}
serial = YAML.dump(h)
#=> "---\n:name: image\n:type: jpeg\n:data:\n- 1\n- 2\n- 3\n- 4\n- 5\n"
YAML.load(serial)
#=> {:name=>"image", :type=>"jpeg", :data=>[1, 2, 3, 4, 5]}
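To get the array back out of the database, the stored string just needs the matching load. A minimal sketch of a reader, assuming a hypothetical getVenueImages method added to the Postgres class above (table and column names taken from the question):

# Hypothetical reader: fetch the serialized string for one venue and
# turn it back into a Ruby array with the deserialized helper above.
def getVenueImages(venue_number)
  result = @conn.exec_params("SELECT img_array FROM venues WHERE venue_number = $1", [venue_number])
  return [] if result.ntuples.zero?
  deserialized(result[0]['img_array'])
end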

Hope this helps you out in handling the issue.

If you need to store more than 8000 characters you can switch to varchar with no length limit or to a text column definition. In Postgres the two are stored the same way under the hood, so text is usually the simplest choice; large values are handled automatically without you having to pick a maximum size.
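If the table already exists with the varchar(8000) column, the definition can be widened in place. A minimal sketch, reusing the @conn connection and venues table from the code above:

# Assumes @conn and the venues table from the examples above.
# Widen the serialized-image column from varchar(8000) to unlimited-length text.
@conn.exec("ALTER TABLE venues ALTER COLUMN img_array TYPE text;")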

