How to insert bulk data with libpq in C?

5 messages

a-2
May I ask the proper (fast) way of inserting bulk data and if possible, an example would be very appreciated.

Please note that the data I would like to insert contains arrays (float8[]).

By the way, my environment is Windows 10, PostgreSQL 11 and compiling with Visual Studio 2017.

Re: How to insert bulk data with libpq in C?

Laurenz Albe
a wrote:
> May I ask the proper (fast) way of inserting bulk data and if possible, an example would be very appreciated.
>
> Please note that the data I would like to insert contains arrays (float8[]).
>
> By the way, my environment is Windows 10, PostgreSQL 11 and compiling with Visual Studio 2017.

That would be COPY:
https://www.postgresql.org/docs/current/sql-copy.html

Yours,
Laurenz Albe
--
Cybertec | https://www.cybertec-postgresql.com



Re: How to insert bulk data with libpq in C?

Christian Mair
>> May I ask the proper (fast) way of inserting bulk data and if possible, an example would be very appreciated.
>>
>> Please note that the data I would like to insert contains arrays (float8[]).
>>
>> By the way, my environment is Windows 10, PostgreSQL 11 and compiling with Visual Studio 2017.
>
> That would be COPY:
> https://www.postgresql.org/docs/current/sql-copy.html

In particular,

copy table (col1, col2, ...) from stdin with binary

see also

https://www.postgresql.org/docs/10/static/protocol-flow.html#PROTOCOL-COPY

https://www.postgresql.org/docs/10/static/sql-copy.html#id-1.9.3.52.9.4

Also, I suggest transaction bundling (one commit every 1000 records or so),
depending on context.

Bye,
Chris.



Re: How to insert bulk data with libpq in C?

Laurenz Albe
Chris Mair wrote:

> > > May I ask the proper (fast) way of inserting bulk data and if possible, an example would be very appreciated.
> > >
> > > Please note that the data I would like to insert contains arrays (float8[]).
> > >
> > > By the way, my environment is Windows 10, PostgreSQL 11 and compiling with Visual Studio 2017.
> >
> > That would be COPY:
> > https://www.postgresql.org/docs/current/sql-copy.html
>
> In particular,
>
> copy table (col1, col2, ...) from stdin with binary

Hmm, binary requires that you know the binary PostgreSQL representation.
PostgreSQL might be faster converting text to its internal representation
than your application program...

The big exception here is of course bytea, where the binary representation
is definitely smaller and easy to create.

> see also
>
> https://www.postgresql.org/docs/10/static/protocol-flow.html#PROTOCOL-COPY
>
> https://www.postgresql.org/docs/10/static/sql-copy.html#id-1.9.3.52.9.4
>
> Also, I suggest transaction bundling (one commit every 1000 records or so),
> depending on context.

Why that?
It might be useful if you expect the load to fail in the middle and want
to be able to restart from there.  But normally, doing it all in one
transaction is simpler.

Yours,
Laurenz Albe



Re: How to insert bulk data with libpq in C?

Christian Mair

>>>> May I ask the proper (fast) way of inserting bulk data and if possible, an example would be very appreciated.
>>>>
>>>> Please note that the data I would like to insert contains arrays (float8[]).
>>>>
>>>> By the way, my environment is Windows 10, PostgreSQL 11 and compiling with Visual Studio 2017.
>>>
>>> That would be COPY:
>>> https://www.postgresql.org/docs/current/sql-copy.html
>>
>> In particular,
>>
>> copy table (col1, col2, ...) from stdin with binary
>
> Hmm, binary requires that you know the binary PostgreSQL representation.
> PostgreSQL might be faster converting text to its internal representation
> than your application program...
>
> The big exception here is of course bytea, where the binary representation
> is definitely smaller and easy to create.

Hi,

Yes, I understood that the data is already binary and that the OP wants to
store it as binary, but of course this might not be the case.

>> Also, I suggest transaction bundling (one commit every 1000 records or so),
>> depending on context.
>
> Why that?
> It might be useful if you expect the load to fail in the middle and want
> to be able to restart from there.  But normally, doing it all in one
> transaction is simpler.

Yes. That would be the special case with one commit every N records, where
N is the total count :)

I just wanted to make sure the OP will not commit after each COPY.

Bye.
Chris.