Normally we handle our database connections on the server side, with PHP and PDO for example. In this post I will show a simple technique to send queries to our database (PostgreSQL in this example) asynchronously, using node.js and socket.io.
The idea is pretty straightforward. We will send the SQL string and the bind values over a WebSocket, and we will execute a callback in the client when the server (a node.js script) fetches the recordset.
Our server:
var pg = require('pg');

// Single shared connection to PostgreSQL.
var conString = "tcp://user:password@localhost:5432/db";
var client = new pg.Client(conString);
client.connect();

// socket.io server listening on port 8888.
var io = require('socket.io').listen(8888);

io.sockets.on('connection', function (socket) {
    socket.on('sql', function (data) {
        // Parametrized query: data.sql holds $1, $2... placeholders,
        // data.values holds the bind values.
        var query = client.query(data.sql, data.values);
        // Emit one 'sql' event back to the client per fetched row.
        query.on('row', function (row) {
            socket.emit('sql', row);
        });
    });
});
And our client:
<script src="http://localhost:8888/socket.io/socket.io.js"></script>
<script>
    var socket = io.connect('http://vmserver:8888');

    // One shared listener; the callback of the latest sendSql call
    // is invoked once per row emitted by the server. This avoids
    // stacking a new listener on every call.
    var currentCbk = null;
    socket.on('sql', function (data) {
        if (currentCbk) {
            currentCbk(data);
        }
    });

    function sendSql(sql, values, cbk) {
        currentCbk = cbk;
        socket.emit('sql', { sql: sql, values: values });
    }
</script>
<p>
    <a href="#" onclick="sendSql('select * from users', [], function(data) {console.log(data);})">select * from users</a>
</p>
<p>
    <a href="#" onclick="sendSql('select * from users where uid=$1', [4], function(data) {console.log(data);})">select * from users where uid=$1</a>
</p>
Simple, isn’t it?
You must take care if you use this script in production: our database is exposed to raw SQL sent from the client. It’s a concept example. Maybe it would be better not to send the SQL at all. Store the queries in a key-value table on the server and send only an ID from the browser, as sketched below.
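A minimal sketch of that idea, reusing the client and io objects from the server script above (the queries table and the 'sqlById' event name are made up for the example):

// Hypothetical variant: the browser sends only an ID plus bind values;
// the raw SQL never leaves the server.
var queries = {
    allUsers:  'select * from users',
    userByUid: 'select * from users where uid=$1'
};

io.sockets.on('connection', function (socket) {
    socket.on('sqlById', function (data) {
        var sql = queries[data.id];
        if (!sql) {
            return; // unknown ID: ignore the request
        }
        var query = client.query(sql, data.values);
        query.on('row', function (row) {
            socket.emit('sqlById', row);
        });
    });
});

And from the browser: socket.emit('sqlById', { id: 'userByUid', values: [4] });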
What do you think?
Nice article. Accessing a database from the browser is gaining traction, with the ‘security-concern’ rider. Meteor is one framework that comes to mind. Another interesting implementation: http://podefr.tumblr.com/post/22553968711/an-innovative-way-to-replace-ajax-and-jsonp-using
It’s pretty nice. I love how Node.js and other technologies let you do these kinds of (experimental) things nicely and cheaply.
How many database connections do you think this approach can support? Or is it better to use the client pooling of pg?
In fact, in this example the pg client lives on the server (the node.js server), so the “limitations” are the same as in a “normal” application.
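For reference, a minimal sketch of the pooled variant (assuming a pg version whose built-in pool is exposed through pg.connect with a done callback) checks a client out per query instead of sharing one connection:

var pg = require('pg');
var conString = "tcp://user:password@localhost:5432/db";

io.sockets.on('connection', function (socket) {
    socket.on('sql', function (data) {
        // Check a client out of pg's built-in pool for this query only.
        pg.connect(conString, function (err, client, done) {
            if (err) {
                return console.error(err);
            }
            var query = client.query(data.sql, data.values);
            query.on('row', function (row) {
                socket.emit('sql', row);
            });
            query.on('end', function () {
                done(); // return the client to the pool
            });
        });
    });
});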
Nice… it works very well.
I’m actually in the process of doing this now that I have socket.io set up properly (I’m using Heroku), because I want control over the queuing of DB transactions as well as realtime notification of the correct recipients based upon their assignment, using the LISTEN and NOTIFY mechanism within Postgres. The goal is to update the scope of content for each of the users who have access to the same data, avoiding “refresh”-style content updates.
I’m not sending any SQL. I can just as easily determine the JSON object based upon modules within the core code, via routing and object ordinal assignment, which will automatically fire the correct SQL stored-procedure functions within Postgres. I’m also dealing with SQL injection on the client prior to sending anything to the server, and everything is forced to integer and binary, with the requirement of a unique key per transaction, per user.
I’ll let you know how convenient this is… I suspect it’ll be efficient, though very strict from a management perspective 🙂
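A minimal sketch of that LISTEN/NOTIFY idea on the node side (the channel name 'users_changed' is made up for the example, and the io server from the post above is reused; a dedicated client holds the listening connection):

var pg = require('pg');

// Dedicated connection that only listens for notifications.
var listener = new pg.Client("tcp://user:password@localhost:5432/db");

listener.connect(function () {
    // Triggers in Postgres would run: NOTIFY users_changed, 'payload'
    listener.query('LISTEN users_changed');
});

listener.on('notification', function (msg) {
    // Push the change to every connected browser,
    // avoiding refresh-style updates.
    io.sockets.emit('users_changed', msg.payload);
});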
I don’t understand the problem, but complex solutions are normally a bad idea.