How can I fetch data from a database chunk by chunk using Python petl, pygrametl, or pandas? -


Is there a way to fetch data from a database chunkwise?

I have around 30 million rows in the database, and fetching them all at once causes very high memory usage. I tried this with pandas version 0.17.1:

for sd in psql.read_sql(sql, myconn, chunksize=100):
    print sd

but it throws the following error:

/usr/bin/python2.7 /home/subin/pythonide/workspace/python/pygram.py
Traceback (most recent call last):
  File "/home/subin/pythonide/workspace/python/pygram.py", line 20, in <module>
    sd in psql.read_sql(sql,myconn,chunksize=100):
  File "/usr/lib/python2.7/dist-packages/pandas/io/sql.py", line 1565, in _query_iterator
    parse_dates=parse_dates)
  File "/usr/lib/python2.7/dist-packages/pandas/io/sql.py", line 137, in _wrap_result
    coerce_float=coerce_float)
  File "/usr/lib/python2.7/dist-packages/pandas/core/frame.py", line 969, in from_records
    coerce_float=coerce_float)
  File "/usr/lib/python2.7/dist-packages/pandas/core/frame.py", line 5279, in _to_arrays
    dtype=dtype)
  File "/usr/lib/python2.7/dist-packages/pandas/core/frame.py", line 5357, in _list_to_arrays
    content = list(lib.to_object_array_tuples(data).T)
TypeError: argument 'rows' has incorrect type (expected list, got tuple)

Please help me.
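One way to work around this kind of error is to skip the `read_sql` chunk iterator entirely and chunk manually with the DB-API cursor's `fetchmany()`, coercing each batch to a `list` before handing it to pandas (the TypeError above complains about getting a tuple where a list is expected). The sketch below is a minimal, self-contained illustration that uses an in-memory SQLite database as a stand-in for the real MySQL/PostgreSQL connection; the table name `items` and the chunk size are assumptions for illustration, and you would substitute your own connection object and query:

```python
import sqlite3
import pandas as pd

# Stand-in for the real DB connection: an in-memory SQLite table
# with 25 sample rows (replace with your own connection and query).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [(i, "row%d" % i) for i in range(25)])

CHUNK_SIZE = 10  # rows per batch; tune for your memory budget

cursor = conn.cursor()
cursor.execute("SELECT id, name FROM items")
columns = [d[0] for d in cursor.description]  # column names for the frame

chunks = []
while True:
    rows = cursor.fetchmany(CHUNK_SIZE)
    if not rows:
        break
    # list(rows) guards against drivers that return a tuple of tuples,
    # which is exactly what triggers "expected list, got tuple"
    chunks.append(pd.DataFrame(list(rows), columns=columns))

cursor.close()
# Each element of `chunks` is a small DataFrame you can process in turn,
# so at most CHUNK_SIZE rows are materialized as a frame at a time.
```

Only one batch of rows is pulled from the cursor per iteration, so memory stays bounded by the chunk size rather than the full 30 million rows.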

