How to Fix PostgreSQL Error: Too Many Connections for Database

This error occurs when a specific database reaches its connection limit as defined by the datconnlimit setting, which is separate from the server-wide max_connections limit.
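
The two limits can be told apart quickly (a minimal sketch; replace mydb with the affected database):

    -- Server-wide ceiling on concurrent connections
    SHOW max_connections;

    -- Per-database ceiling (-1 means no per-database limit)
    SELECT datname, datconnlimit
    FROM pg_database
    WHERE datname = 'mydb';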

Impact

New connection attempts to the affected database are rejected while other databases on the same server remain accessible. Applications that target the affected database start failing to open new sessions, even though existing sessions keep working.

Common Causes

  1. Per-database connection limit set too low
  2. Connection leaks in applications (see the query sketch after this list)
  3. Missing connection pooling
  4. Traffic spike to specific database
  5. Idle connections not cleaned up
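
To narrow down which of these applies, the sketch below ('mydb' stands in for the affected database) groups that database's sessions by application and state, which usually points at a leaking or unpooled client:

    -- Which applications hold connections to the affected database, and in what state
    SELECT application_name, state, COUNT(*) as connections
    FROM pg_stat_activity
    WHERE datname = 'mydb'
    GROUP BY application_name, state
    ORDER BY connections DESC;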

Troubleshooting and Resolution Steps

  1. Check database connection limit:

    -- View per-database limits
    SELECT datname, datconnlimit
    FROM pg_database
    WHERE datname = 'mydb';
    
    -- -1 means no limit (uses server max_connections)
    -- Positive number is the limit for that database
    
  2. Increase database connection limit:

    -- Increase limit for specific database
    ALTER DATABASE mydb CONNECTION LIMIT 100;
    
    -- Remove limit (use server-wide max_connections)
    ALTER DATABASE mydb CONNECTION LIMIT -1;
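
    The new limit applies to connection attempts made after the change; existing sessions are untouched. It is also worth sanity-checking it against the server-wide cap (a sketch, reusing mydb from above), since a per-database limit higher than max_connections can never actually be reached:

    -- Compare the per-database limit with the server-wide cap
    SELECT
        (SELECT setting::int FROM pg_settings WHERE name = 'max_connections') as max_connections,
        datconnlimit
    FROM pg_database
    WHERE datname = 'mydb';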
    
  3. Check current connections to database:

    -- Count connections per database
    SELECT
        a.datname,
        COUNT(*) as connections,
        (SELECT setting::int FROM pg_settings WHERE name = 'max_connections') as max_conn,
        d.datconnlimit
    FROM pg_stat_activity a
    JOIN pg_database d ON a.datname = d.datname
    GROUP BY a.datname, d.datconnlimit
    ORDER BY connections DESC;
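
    To see what those connections are doing, the same data can be broken down by state for the affected database (a sketch; 'mydb' is a placeholder):

    -- Connections to one database grouped by state (active, idle, idle in transaction, ...)
    SELECT state, COUNT(*) as connections
    FROM pg_stat_activity
    WHERE datname = 'mydb'
    GROUP BY state
    ORDER BY connections DESC;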
    
  4. Terminate idle connections:

    -- Find idle connections to specific database
    SELECT
        pid,
        usename,
        application_name,
        state,
        state_change
    FROM pg_stat_activity
    WHERE datname = 'mydb'
    AND state = 'idle'
    ORDER BY state_change;
    
    -- Terminate a specific connection (replace <pid> with a pid from the query above)
    SELECT pg_terminate_backend(<pid>);
    
    -- Terminate all idle connections to database
    SELECT pg_terminate_backend(pid)
    FROM pg_stat_activity
    WHERE datname = 'mydb'
    AND state = 'idle'
    AND state_change < now() - interval '5 minutes';
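
    A slightly safer variant (a sketch, not tied to any particular setup) also skips the current session and catches sessions stuck in 'idle in transaction'; recent PostgreSQL versions can additionally expire such sessions on their own via idle_in_transaction_session_timeout (9.6+) and idle_session_timeout (14+):

    -- Terminate idle and idle-in-transaction sessions older than 5 minutes,
    -- excluding the current session ('mydb' is the placeholder database name)
    SELECT pg_terminate_backend(pid)
    FROM pg_stat_activity
    WHERE datname = 'mydb'
    AND pid <> pg_backend_pid()
    AND state IN ('idle', 'idle in transaction')
    AND state_change < now() - interval '5 minutes';

    -- Optional: have the server time out idle-in-transaction sessions automatically
    -- (applies to sessions started after the change)
    ALTER DATABASE mydb SET idle_in_transaction_session_timeout = '10min';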
    
  5. Implement connection pooling:

    # PgBouncer per-database configuration
    # /etc/pgbouncer/pgbouncer.ini
    
    [databases]
    mydb = host=localhost port=5432 dbname=mydb pool_size=20
    
    [pgbouncer]
    pool_mode = transaction
    max_client_conn = 1000
    default_pool_size = 20
    
    # Up to 1000 client connections now share a small pool of server connections (20 per database/user pair)
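
    Applications then connect through PgBouncer (default listen port 6432) instead of PostgreSQL directly. Pool usage can be checked from PgBouncer's admin console, reached with e.g. psql -p 6432 -U pgbouncer pgbouncer (a sketch assuming the default port and an admin user named pgbouncer):

    -- Per-pool usage: client and server connection counts
    SHOW POOLS;

    -- Individual client connections routed through PgBouncer
    SHOW CLIENTS;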
    
  6. Monitor connections per database:

    -- Create monitoring view
    CREATE VIEW db_connection_stats AS
    SELECT
        d.datname,
        COUNT(a.pid) as current_connections,
        d.datconnlimit as connection_limit,
        CASE
            WHEN d.datconnlimit = -1 THEN 'unlimited'
            WHEN d.datconnlimit = 0 THEN 'disabled'
            ELSE ROUND(COUNT(a.pid)::numeric / d.datconnlimit * 100, 2) || '%'
        END as percent_used
    FROM pg_database d
    LEFT JOIN pg_stat_activity a ON d.datname = a.datname
    GROUP BY d.datname, d.datconnlimit
    ORDER BY current_connections DESC;
    
    SELECT * FROM db_connection_stats;
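
    Building on the view's logic, an alerting sketch that flags databases at or above 80% of their per-database limit (the threshold is arbitrary):

    -- Databases close to their datconnlimit (unlimited databases are skipped)
    SELECT
        d.datname,
        COUNT(a.pid) as current_connections,
        d.datconnlimit
    FROM pg_database d
    LEFT JOIN pg_stat_activity a ON d.datname = a.datname
    GROUP BY d.datname, d.datconnlimit
    HAVING d.datconnlimit > 0
    AND COUNT(a.pid) >= d.datconnlimit * 0.8
    ORDER BY current_connections DESC;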
    
