time taking deletion on large tables



Atul Kumar-2
Hi,

The feed_posts table has over 50 million rows.

I am deleting all rows of a certain type that are over 60 days old.

When I run a delete like the one below, it hangs for an entire day, so I
have to kill it with pg_terminate_backend(pid):

DELETE FROM feed_posts
WHERE feed_definition_id = 'bf33573d-936e-4e55-8607-72b685d2cbae'
AND created_at > '2020-05-11 00:00:00'
AND created_at < '2020-05-12 00:00:00';

So I need help figuring out how to do large deletes on a production
database during normal hours.

The EXPLAIN plan is given below:



"Delete on feed_posts  (cost=1156.57..195748.88 rows=15534 width=6)"
"  ->  Bitmap Heap Scan on feed_posts  (cost=1156.57..195748.88 rows=15534 width=6)"
"        Recheck Cond: ((created_at >= '2020-05-11 00:00:00'::timestamp without time zone) AND (created_at <= '2020-05-12 00:00:00'::timestamp without time zone))"
"        Filter: (feed_definition_id = 'bf33573d-936e-4e55-8607-72b685d2cbae'::uuid)"
"        ->  Bitmap Index Scan on feed_posts_created_at  (cost=0.00..1152.68 rows=54812 width=0)"
"              Index Cond: ((created_at >= '2020-05-11 00:00:00'::timestamp without time zone) AND (created_at <= '2020-05-12 00:00:00'::timestamp without time zone))"


Please help me with deleting these rows. Do I need to change anything in
the Postgres configuration, or in the table structure?
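One approach I have seen suggested (an untested sketch only; the table and
column names come from the query above, while the index name and the batch
size of 10000 are assumptions) is to add a composite index matching the
full WHERE clause, so the planner does not have to filter
feed_definition_id row by row, and then delete in small batches so each
transaction stays short:

```sql
-- Sketch: composite index covering both predicates in the WHERE clause.
-- CONCURRENTLY avoids taking a lock that blocks writes while it builds.
CREATE INDEX CONCURRENTLY feed_posts_def_created_idx
    ON feed_posts (feed_definition_id, created_at);

-- Sketch: delete in batches (10000 is a guessed batch size) by selecting
-- a limited set of physical row ids (ctid). Rerun this statement until it
-- reports "DELETE 0"; autovacuum can reclaim space between batches.
DELETE FROM feed_posts
WHERE ctid IN (
    SELECT ctid
    FROM feed_posts
    WHERE feed_definition_id = 'bf33573d-936e-4e55-8607-72b685d2cbae'
      AND created_at > '2020-05-11 00:00:00'
      AND created_at < '2020-05-12 00:00:00'
    LIMIT 10000
);
```

Is something like this the right direction, or is a configuration change
needed as well?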

Regards,
Atul