Advice on logging strategy

Advice on logging strategy

Mike Martin
I have a question about logging strategy.

I have logging set to log_statement = 'all' on a network database, with log output in CSV format so I can import it into a logging table.

However, the database is populated via a nightly routine that downloads data via a REST API using prepared statements.

This results in enormous log files which take ages to import using COPY, because each execute statement is logged with the parameters chosen.

Is there any way around this?

I can't find any way to filter DML statements.

Thanks
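
For context, a minimal sketch of the setup described above (the file path is illustrative, and the full postgres_log table definition for CSV-format logs is in the PostgreSQL documentation):

    -- postgresql.conf: log every statement to CSV
    --   logging_collector = on
    --   log_destination = 'csvlog'
    --   log_statement = 'all'

    -- import a day's log file into a table matching the csvlog columns
    COPY postgres_log FROM '/var/log/postgresql/postgresql-2018-10-11.csv' WITH csv;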

Re: Advice on logging strategy

lup


> On Oct 11, 2018, at 4:26 AM, Mike Martin <[hidden email]> wrote:
>
> This results in enormous log files which take ages to import using COPY,
> because each execute statement is logged with the parameters chosen.
>
> Is there any way around this?

Do you want all the log lines in your logging table?
There was a thread yesterday (10.Oct.2018) on COPY which mentioned the possibility of multiple processes COPYing to the same table.

Re: Advice on logging strategy

Mike Martin
I suppose the ideal would be to log the prepared statement once, and detail only on error, rather than once per execution.
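
There is no built-in way to log a prepared statement just once, but the "detail only on error" part can be approximated: even with log_statement = 'none', log_min_error_statement (default 'error') still writes the failing statement to the log. A sketch, assuming the nightly job connects as a role named loader (an illustrative name, not from this thread):

    -- suppress routine statement logging for the load role only
    ALTER ROLE loader SET log_statement = 'none';
    -- failing statements are still logged at severity ERROR and above
    ALTER ROLE loader SET log_min_error_statement = 'error';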

On Thu, 11 Oct 2018 at 11:33, Rob Sargent <[hidden email]> wrote:

> Do you want all the log lines in your logging table?
> There was a thread yesterday (10.Oct.2018) on COPY which mentioned the
> possibility of multiple processes COPYing to the same table.

Re: Advice on logging strategy

Jeff Janes
In reply to this post by Mike Martin
On Thu, Oct 11, 2018 at 6:27 AM Mike Martin <[hidden email]> wrote:
> This results in enormous log files which take ages to import using COPY,
> because each execute statement is logged with the parameters chosen.
>
> Is there any way around this?

One option is to convert to using COPY...FROM STDIN rather than prepared INSERTs.
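
A sketch of that approach using psql's \copy, which runs COPY ... FROM STDIN under the hood (the table and file names are invented for illustration):

    -- one COPY per batch instead of one logged EXECUTE per row
    \copy staging_api_data FROM 'nightly_batch.csv' WITH (FORMAT csv)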

Another is to create a user specifically for bulk population, and do an ALTER USER bulk_load SET log_statement = 'none' to override the global log_statement setting.
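
Spelled out (bulk_load is Jeff's example name; note that log_statement is a superuser-only setting, and the per-role value only applies to sessions that role starts, so the nightly job must connect as this user):

    -- must be run as a superuser, since log_statement is superuser-only
    CREATE USER bulk_load;
    ALTER USER bulk_load SET log_statement = 'none';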

Cheers,

Jeff

Re: Advice on logging strategy

David Steele
In reply to this post by Mike Martin
On 10/11/18 11:26 AM, Mike Martin wrote:
>
> This results in enormous log files which take ages to import using COPY
> because each execute statement is logged with the parameters chosen
>
> Is there any way around this?
>
> I can't find any way to filter DML statements

pgAudit (https://github.com/pgaudit/pgaudit) gives you fine-grained
control over what is logged by command type, table, or user, as well as
a lot more detail.

--
-David
[hidden email]
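
For reference, a minimal pgAudit sketch (the class list is illustrative; see the project README for the full set of settings):

    -- postgresql.conf: pgAudit must be preloaded
    --   shared_preload_libraries = 'pgaudit'
    CREATE EXTENSION pgaudit;
    -- session audit logging: record DDL and role changes, but skip the
    -- bulk WRITE traffic from the nightly load
    ALTER SYSTEM SET pgaudit.log = 'ddl, role';
    SELECT pg_reload_conf();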


Re: Advice on logging strategy

Mike Martin
Thanks!

On Fri, 12 Oct 2018 at 14:33, David Steele <[hidden email]> wrote:
> pgAudit (https://github.com/pgaudit/pgaudit) gives you fine-grained
> control over what is logged by command type, table, or user, as well
> as a lot more detail.