Asynchronous queries to PostgreSql database from the browser with node.js and socket.io

Normally we perform our database connection on the server side, with PHP and PDO for example. In this post I will show a simple technique to send queries to our database (PostgreSql in this example) asynchronously from the browser, using node.js and socket.io.

The idea is pretty straightforward. We will send the SQL string and the values with a WebSocket and we will execute a callback in the client when the server (node.js script) fetches the recordset.

Our server:

var pg = require('pg');

var conString = "tcp://user:password@localhost:5432/db";
var client = new pg.Client(conString);
client.connect();

var io = require('socket.io').listen(8888);

io.sockets.on('connection', function (socket) {
    socket.on('sql', function (data) {
        // Run the parameterised query and stream each row back to the browser.
        var query = client.query(data.sql, data.values);
        query.on('row', function (row) {
            socket.emit('sql', row);
        });
    });
});

And our client:

<script src="http://localhost:8888/socket.io/socket.io.js"></script>
<script>
var socket = io.connect('http://vmserver:8888');
var onRow = function () {};

// Register a single listener; the server emits one 'sql' event per row.
socket.on('sql', function (data) {
    onRow(data);
});

function sendSql(sql, values, cbk) {
    onRow = cbk; // swap the callback instead of stacking a new listener per call
    socket.emit('sql', { sql: sql, values: values });
}
</script>

<a href="#" onclick="sendSql('select * from users', [], function(data) {console.log(data);})">select * from users</a>
<a href="#" onclick="sendSql('select * from users where uid=$1', [4], function(data) {console.log(data);})">select * from users where uid=$1</a>

Simple, isn’t it?
You must take care if you use this script in production: the database is exposed to raw SQL sent from the client. This is a proof of concept. It would probably be better not to send SQL at all. Store the queries in a key-value table on the server and send only an ID from the browser.
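A minimal sketch of that idea: keep a server-side map from query IDs to parameterised SQL, and let the browser send only the ID plus the values. The query IDs and the `resolveQuery` helper below are hypothetical names I'm introducing for illustration; the map would live in the node.js server next to the socket handler.

```javascript
// Whitelist of queries the browser is allowed to run, keyed by ID.
// The IDs and SQL here are made-up examples.
var queries = {
    'users.all':  { sql: 'select * from users', params: 0 },
    'users.byId': { sql: 'select * from users where uid=$1', params: 1 }
};

// Resolve an ID sent by the browser into real SQL, rejecting unknown
// IDs and wrong parameter counts before anything reaches the database.
function resolveQuery(id, values) {
    var q = queries[id];
    if (!q) {
        throw new Error('Unknown query id: ' + id);
    }
    if (!Array.isArray(values) || values.length !== q.params) {
        throw new Error('Wrong number of values for ' + id);
    }
    return { sql: q.sql, values: values };
}

// In the socket handler the browser would then emit { id: ..., values: [...] }:
// socket.on('sql', function (data) {
//     var q = resolveQuery(data.id, data.values);
//     client.query(q.sql, q.values).on('row', function (row) {
//         socket.emit('sql', row);
//     });
// });
```

This way the browser can trigger only the queries you have chosen to expose, and the parameter count is validated before the database ever sees the request.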

What do you think?


7 thoughts on “Asynchronous queries to PostgreSql database from the browser with node.js and socket.io”

  1. How many database connections do you think this approach supports? Or is it better to use pg’s client pooling?

    1. In fact, in this example the pg client lives on the server (the node.js process), so the “limitations” are the same as in a “normal” application.

  2. I’m actually in the process of doing this now that I have socket.io set up properly (I’m using Heroku), because I want control over the queuing of DB transactions, as well as realtime notification to the correct recipients based upon their assignment, using the LISTEN and NOTIFY method within Postgres. The goal is to update the scope of content for each of the users who have access to the same data, avoiding “refresh”-style content updates.

    I’m not sending any SQL – I can just as easily determine the JSON object based upon modules within the core code, via routing and object ordinal assignment, which will automatically fire the correct SQL stored-proc functions within Postgres. I’m also dealing with SQL injection on the client prior to sending anything to the server, and everything is forced to integer and binary, with the requirement of a unique key on a per-transaction, per-user basis.

    I’ll let you know how convenient this is… I suspect it’ll be efficient, though very strict from a management perspective 🙂
