Blog Archives

Calling a Silex backend from the command line. Creating SAAS command line tools

Sometimes we need to create command line tools. We can build those tools using different technologies. In the Symfony world there's the Symfony Console. I feel very comfortable using it, but if we want to distribute our tool we have to face one "problem": users will need to have PHP installed. It sounds trivial, but PHP isn't installed on every computer. We could use Node.js to build our tool; nowadays Node.js is a de facto standard, but we'd still have the same problem. Another "problem" is how to distribute new versions of our tool. Problems everywhere.

Software as a service tools are great. We can build a service (a web-based one, for example) and even monetize it with one kind of paid plan or another. With SAAS we don't need to worry about redistributing our software with each release. But what happens when our service is a command line one?

Imagine, for example, that we're going to build a service to convert text to uppercase (I think this idea will make me rich, indeed :)

We can create a simple Silex application to convert strings to upper case:

<?php
include __DIR__ . "/../vendor/autoload.php";

use Silex\Application;
use Symfony\Component\HttpFoundation\Request;

$app = new Application();
$app->post("/", function (Request $request) {
    return strtoupper($request->getContent());
});
$app->run();

And now we only need to call this service from the command line. We can use curl, for example, to convert a file's content to upper case:

cat myfile.txt | curl -d @- localhost:8080 > MYFILE.txt
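
If curl isn't an option, even a tiny PHP script can act as the client (ironic, given the motivation above, but handy for testing). A sketch using PHP's stream functions, assuming the service runs on localhost:8080:

<?php
// Hypothetical test client: POST stdin to the uppercase service and print the response
$body = stream_get_contents(STDIN);

$context = stream_context_create([
    'http' => [
        'method'  => 'POST',
        'header'  => 'Content-Type: text/plain',
        'content' => $body,
    ],
]);

echo file_get_contents('http://localhost:8080', false, $context);

We'd use it the same way: cat myfile.txt | php client.php > MYFILE.txt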

You can see the example in my github account here

Working with Ionic and PHP Backends. Remote debugging with PHP7 and Xdebug working with real devices

Sometimes I speak with PHP developers and they don't use remote debugging in their development environments. Some people don't like to use remote debugging. They prefer to use TDD and rely on the unit tests. That's a good point of view, but sometimes they don't use remote debugging only because they don't know how to do it, and that's inadmissible. A remote debugger is a powerful tool, especially when dealing with legacy applications. I've been using Xdebug with my Linux workstation for years. These days I'm using a Mac, and it's also very simple to set up Xdebug here.

First we need to install PHP:

brew install php70

Then Xdebug

brew install php70-xdebug

(on an Ubuntu box we only need to use apt-get instead of brew)

Now we need to set up Xdebug to enable remote debugging. In a standard installation the Xdebug configuration is located at /usr/local/etc/php/7.0/conf.d/ext-xdebug.ini:

[xdebug]
zend_extension="/usr/local/opt/php70-xdebug/xdebug.so"

xdebug.remote_enable=1
xdebug.remote_port=9000
xdebug.profiler_enable=0
xdebug.profiler_output_dir="/tmp"
xdebug.idekey= "PHPSTORM"
xdebug.remote_connect_back = 1
xdebug.max_nesting_level = 250

And basically that's all. To set/unset the cookie you can use a bookmarklet in your browser (you can generate your bookmarklets here), or use a Chrome extension to enable Xdebug.

Now we only need to start the built-in server with

php -S 0.0.0.0:8080

And remote debugging will be available. The remote debugger works this way:

  • We open one port within our IDE, in my case PhpStorm (it happens when we click on “Start listening for PHP debug connections”)
  • We set one cookie in our browser (it happens when we click on the Chrome extension)
  • When our server receives one request with the cookie, it connects back to the port that our IDE has opened (usually port 9000). If you use a personal firewall on your workstation, make sure you allow incoming connections to this port.
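
A quick sanity check that the extension is actually loaded and remote debugging is on is a tiny script like this (just a sketch; run it through the built-in server or the CLI):

<?php
// Sanity check: is Xdebug loaded and configured for remote debugging?
var_dump(extension_loaded('xdebug'));
var_dump(ini_get('xdebug.remote_enable'));
var_dump(ini_get('xdebug.remote_connect_back'));
var_dump(ini_get('xdebug.idekey'));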

Nowadays I’m involved with several projects building hybrid applications with Apache Cordova. In the Frontend I’m using ionic and Silex in the Backend. When I’m working with hybrid applications normally I go through two phases.

In the first one I build a working prototype. To do this I run a local server and use my browser to develop the application. This phase is very similar to a traditional Web development process. If we also set up LiveReload properly, our application will be reloaded each time we change a JavaScript file. The Ionic framework integrates LiveReload and we only need to run:

ionic serve -l

to start our application. We also need to start our backend server. For example

php -S 0.0.0.0:8080 -t api/www

Now we can debug our Backend with the remote debugger and our Frontend with Chrome's developer tools. Chrome also allows us to edit Frontend files and save them to the filesystem using workspaces. This phase is the easy one. But sooner or later we'll need to start working with a real device. We basically need a real device if we use plugins such as the Camera plugin, the Geolocation plugin, or things like that. OK, there are emulators, but usually emulators don't let us use all plugins in the same way we use them on a real device. Chrome also allows us to see the device's console logs from our workstation. OK, we can see all the logs of our plugged Android device using "adb logcat", but following the flow of our logs with logcat is like trying to read Matrix code. It's a mess.

If we plug our Android device into our computer and open this URL in Chrome:

chrome://inspect/#devices

We can see our device’s console, use breakpoints and things like that. Cool, isn’t it? Of course it only works if we compile our application without “–release” option. We can do something similar with Safary and iOS devices.

With ionic, if we want to use LiveReload from the real device and not recompile and reinstall our application again and again each time we change our JavaScript files, we can run the application using

ionic run android --device -l

When we’re developing our application and we’re in this phase we also need to handle with CORS. CORS isn’t a problem when we run our hybrid application in production. When we run the hybrid application with our device our “origin” is the local filesystem. That’s means CORS don’t apply, but when we run our application in the device, but served from our computer (when we use “-l” option), our origin isn’t local filesystem. So if our Backend is served from another origin we need to enable CORS.

We can enable CORS in the backend. I've written about it here, but the ionic people give us an easier way: we can set up a local proxy to serve our backend through the same origin as the application and forget about CORS. Here we can read a good article about it.
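
For reference, a minimal sketch of what enabling CORS in the Silex backend can look like (the exact headers and allowed origin depend on your setup):

use Silex\Application;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;

$app = new Application();

// Answer the CORS preflight requests that the browser sends
$app->match('/{anything}', function () {
    return new Response('', 204);
})->assert('anything', '.*')->method('OPTIONS');

// Add the CORS headers to every response
$app->after(function (Request $request, Response $response) {
    $response->headers->set('Access-Control-Allow-Origin', '*');
    $response->headers->set('Access-Control-Allow-Headers', 'Content-Type, X-Requested-With');
    $response->headers->set('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE, OPTIONS');
});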

Anyway, if we want to start the remote debugger we need to create a cookie called XDEBUG_SESSION. In the browser we can use the Chrome extension, but when we inspect the plugged device it isn't so simple. It would be cool if the ionic people allowed us to inject cookies through the proxy server. I've tried to see how to do it with ionic-cli; maybe it's possible, but I couldn't figure out how. Because of that I've created a simple AngularJS service to inject this cookie. Then, if I start listening for debug connections in my IDE, I'll be able to use the remote debugger just as I do when I work with the browser.

First we need to install the service via Bower:

bower install ng-xdebugger --save

Now we need to include the JavaScript files:

<script src="lib/angular-cookies/angular-cookies.min.js"></script>
<script src="lib/ng-xdebugger/dist/gonzalo123.xdebugger.min.js"></script>

Then we add the module to our project:

angular.module("starter", ["ionic", "gonzalo123.xdebugger"])

Now we only need to configure our application and set the debugger key (it must be the same key we use in the server-side Xdebug configuration):

.config(function (xdebuggerProvider) {
    xdebuggerProvider.setKey('PHPSTORM');
})

And that’s all. The service is very simple. It only uses one http interceptor to inject the cookie in our http requests:

(function () {
    "use strict";

    angular.module("gonzalo123.xdebugger", ["ngCookies"])
        .provider("xdebugger", ['$httpProvider', function ($httpProvider) {
            var debugKey;

            this.$get = function () {
                return {
                    getDebugKey: function () {
                        return debugKey;
                    }
                };
            };

            this.setKey = function (string) {
                if (string) {
                    debugKey = string;
                    $httpProvider.interceptors.push("xdebuggerCookieInterceptor");
                }
            };
        }])

        .factory("xdebuggerCookieInterceptor", ['$cookieStore', 'xdebugger', function ($cookieStore, xdebugger) {
            return {
                response: function (response) {
                    $cookieStore.put("XDEBUG_SESSION", xdebugger.getDebugKey());

                    return response;
                }
            };
        }])
    ;
})();

And of course you can see the whole project in my github account.

POST Request logger using websockets

Last days I’ve been working with background geolocation with an ionic application. There’s a cool plugin to do that. The free version of the plugin works fine. But there’s a also a premium version with improvements, especially in battery consumption with Android devices.

Basically this plugin performs a POST request to the server with the GPS data. While I was developing my application I needed a simple HTTP server to see those POST requests; later I'll code the backend that actually handles them. I could have built a simple Silex application with a POST route and logged the requests to a file, or flushed them to the console. That would have been easy, but since I'm a big fan of WebSockets (yes, I must admit that I want to use WebSockets everywhere :) I had an idea in my mind: create a simple HTTP server to handle my GPS POST requests, but instead of logging the request, emit a WebSocket event. Then I can create a site that connects to the WebSocket server and registers the POST requests on screen. OK, today I'm a bit lazy to fight with the Frontend, so my log will be in the browser's console.

To build the application I'll reuse one of my projects on github: the PHP dumper. The idea is almost the same. I'll create a simple HTTP server with Silex with two routes: one POST route to handle the GPS requests and one GET route that serves the page that connects to the WebSocket.

That’s the server. Silex, a bit of Twig, another bit of Guzzle and that’s all

use GuzzleHttp\Client;
use Silex\Application;
use Silex\Provider\TwigServiceProvider;
use Symfony\Component\HttpFoundation\Request;

$app = new Application([
    'debug'       => true,
    'ioServer'    => '//localhost:8888',
    'wsConnector' => 'http://127.0.0.1:26300'
]);

$app->register(new TwigServiceProvider(), [
    'twig.path' => __DIR__ . '/../views',
]);

$app['http.client'] = new Client();

$app->get("/{channel}", function (Application $app, $channel) {
    return $app['twig']->render('index.twig', [
        'channel'  => $channel,
        'ioServer' => $app['ioServer']
    ]);
});

$app->post("/{channel}", function (Application $app, $channel, Request $request) {
    $app['http.client']->get($app['wsConnector'] . "/info/{$channel}/" . json_encode($request->getContent()));

    return $app->json('OK');
});

$app->run();

That’s the Twig template. Nothing especial: A bit of Bootstrap and one socket.io client. Each time user access to one “channel”‘s url (GET /mychannel). It connects to websocket server

var CONF = {
        IO: {HOST: '0.0.0.0', PORT: 8888},
        EXPRESS: {HOST: '0.0.0.0', PORT: 26300}
    },
    express = require('express'),
    expressApp = express(),
    server = require('http').Server(expressApp),
    io = require('socket.io')(server, {origins: 'localhost:*'})
    ;

expressApp.get('/:type/:session/:message', function (req, res) {
    console.log(req.params);
    var session = req.params.session,
        type = req.params.type,
        message = req.params.message;

    io.sockets.emit('dumper.' + session, {title: type, data: JSON.parse(message)});
    res.json('OK');
});

io.sockets.on('connection', function (socket) {
    console.log("Socket connected!");
});

expressApp.listen(CONF.EXPRESS.PORT, CONF.EXPRESS.HOST, function () {
    console.log('Express started');
});

server.listen(CONF.IO.PORT, CONF.IO.HOST, function () {
    console.log('IO started');
});

And each time the background geolocation plugin POSTs GPS data, the Silex POST route will emit a WebSocket event to the desired channel. Our WebSocket client just logs the GPS data using console.log. It's harder to explain than it is to build: a very simple process.

We can also emulate POST requests with this simple node script:

var request = require('request');

request.post('http://localhost:8080/Hello', {form: {key: 'value'}}, function (error, response, body) {
    if (!error && response.statusCode == 200) {
        console.log(body)
    }
});
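
Or, staying in PHP, roughly the same request with Guzzle (a sketch, assuming Guzzle 6 and its form_params option):

use GuzzleHttp\Client;

$client = new Client();

// POST some dummy data to the "Hello" channel, like the node script above
$response = $client->post('http://localhost:8080/Hello', [
    'form_params' => ['key' => 'value'],
]);

echo $response->getBody();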

And that’s all. You can see the whole code within my github account.

Alternative way to inject providers in a Silex application

I normally use Silex when I need to build a Backend. It's simple and straightforward to build an API endpoint using this micro framework. But there's something that I don't like: the "array access" way of accessing the dependency injection container. I need to remember what kind of object my service provider provides, and my IDE doesn't help me with autocompletion either. OK, I can use PHPDoc comments or even create a class that inherits from Silex\Application and uses traits, but normally I'm too lazy to do it. Because of that I've created this simple service provider to help me do what I'm looking for. Let me explain it a little bit.

Imagine that I’ve got this class

namespace Foo;

class Math
{
    public function sum($i, $j)
    {
        return $i+$j;
    }
}

And I want to add this service to my DIC

$app['math'] = $app->share(function () {
    return new Math();
});

Now I can use my service within my Silex application

$app->get("/", function () use ($app) {
    return $app['math']->sum(1, 2);
});

But I want to use my service in the same way that I use services within my AngularJS applications. I want to do something like this:

use Foo\Math;
...
$app->get("/", function (Math $math) {
    return $math->sum(1, 2);
});

And that’s exactly what my service provider does. I only need to append my provider to my Application and tell to the provider what’s the relationship between Pimple’s services keys and its provided Instance

$app->register(new InjectorServiceProvider([
    'Foo\Math' => 'math',
]));

This is one example. First we install the library:

composer require gonzalo123/injector

And then the application itself:

include __DIR__ . "/../vendor/autoload.php";

use Silex\Application;
use Injector\InjectorServiceProvider;
use Foo\Math;

$app            = new Application(['debug' => true]);

$app->register(new InjectorServiceProvider([
    'Foo\Math' => 'math',
]));

$app['math'] = function () {
    return new Math();
};

$app->get("/", function (Math $math) {
    return $math->sum(1, 2);
});

$app->run();

And this is the Service Provider

namespace Injector;
use Silex\Application;
use Silex\ServiceProviderInterface;
use Symfony\Component\HttpKernel\Event\FilterControllerEvent;
use Symfony\Component\HttpKernel\KernelEvents;
class InjectorServiceProvider implements ServiceProviderInterface
{
    private $injectables;
    public function __construct($injectables = [])
    {
        $this->injectables = $injectables;
    }
    public function appendInjectables($providedClass, $key)
    {
        $this->injectables[$providedClass] = $key;
    }
    public function register(Application $app)
    {
        $app->on(KernelEvents::CONTROLLER, function (FilterControllerEvent $event) use ($app) {
            $reflectionFunction = new \ReflectionFunction($event->getController());
            $parameters         = $reflectionFunction->getParameters();
            foreach ($parameters as $param) {
                $class = $param->getClass();
                if ($class && array_key_exists($class->name, $this->injectables)) {
                    $event->getRequest()->attributes->set($param->name, $app[$this->injectables[$class->name]]);
                }
            }
        });
    }
    public function boot(Application $app)
    {
    }
}

As we can see I’m listening to CONTROLLER event from event dispatcher and I inject the dependency form container to requests attributes.

Full code in my github account

Building an HTTP client in PostgreSQL with PL/Python

Don’t ask me way, but I need to call to a HTTP server (one Silex application) from a PostgreSQL database.

I want to do something like this:

select get('http://localhost:8080?name=Gonzalo')->'hello';

PostgreSQL has a json datatype. It's really cool and it allows us to connect our HTTP server and our SQL database using the same datatype.

PostgreSQL also allows us to create stored procedures using different languages. The default language is PL/pgSQL, a simple language where we can embed SQL. But we can also use Python, and with Python we can easily create HTTP clients, for example with urllib2. That means that developing an HTTP client for a PostgreSQL database is pretty straightforward.

CREATE OR REPLACE FUNCTION get(uri character varying)
  RETURNS json AS
$BODY$
import urllib2

data = urllib2.urlopen(uri)

return data.read()

$BODY$
  LANGUAGE plpython2u VOLATILE
  COST 100;
ALTER FUNCTION get(character varying)
  OWNER TO gonzalo;

Ok that’s a GET client, but we also want a POST client to do something like this:

select post('http://localhost:8080', '{"name": "Gonzalo"}'::json)->'hello';

As you can see, I want to use application/json instead of application/x-www-form-urlencoded to send request parameters. I wrote about it here some time ago. So I will create an endpoint within my Silex server to handle those POST requests:

<?php
include __DIR__ . '/../vendor/autoload.php';

use Silex\Application;
use Symfony\Component\HttpFoundation\Request;
use G\AngularPostRequestServiceProvider;

$app = new Application(['debug' => true]);

$app->register(new AngularPostRequestServiceProvider());

$app->post('/', function (Application $app, Request $request) {
    return $app->json(['hello' => $request->get('name')]);
});

$app->get('/', function (Application $app, Request $request) {
    return $app->json(['hello' => $request->get('name')]);
});

$app->run();

And now we only need to create a stored procedure to send POST requests:

CREATE OR REPLACE FUNCTION post(
    uri character varying,
    parameters json)
  RETURNS json AS
$BODY$
import urllib2

clen = len(parameters)
req = urllib2.Request(uri, parameters, {'Content-Type': 'application/json', 'Content-Length': clen})
f = urllib2.urlopen(req)
return f.read()

$BODY$
  LANGUAGE plpython2u VOLATILE
  COST 100;
ALTER FUNCTION post(character varying, json)
  OWNER TO gonzalo;

And that’s all. At least this simple script is exactly what I need.

Generating push notifications with Pushbullet and Silex

Sometimes I need to send push notifications to mobile apps (Android or iOS). It's not difficult. Maybe it's a bit of a nightmare the first few times, but once you understand the process it's straightforward. A few days ago I discovered a cool service called Pushbullet. It allows us to install a client on our Android/iOS devices, or even on our desktop computer, and send push notifications between them.

Pushbullet also has a good API that allows us to automate our push notifications. I've played a little bit with the API and my Raspberry Pi home server. It's really simple to integrate the API with our Silex backend and send push notifications to our registered devices.

I’ve created one small service provider to enclose the API. The idea is to use one Silex application like this

use Silex\Application;
use PushSilex\Silex\Provider\PushbulletServiceProvider;

$app = new Application(['debug' => true]);

$myToken = include(__DIR__ . '/../conf/token.php');

$app->register(new PushbulletServiceProvider($myToken));

$app->get("/", function () {
    return "Usage: GET /note/{title}/{body}";
});

$app->get("/note/{title}/{body}", function (Application $app, $title, $body) {
    return $app->json($app['pushbullet.note']($title, $body));
});

$app->run();

As we can see we’re using one service providers called PushbulletServiceProvider. This service provides us ‘pushbullet.note’ and allows to send push notifications. We only need to configure our Service Provider with our Pushbulled’s token and that’s all.

<?php
namespace PushSilex\Silex\Provider;
use Silex\ServiceProviderInterface;
use Silex\Application;
class PushbulletServiceProvider implements ServiceProviderInterface
{
    private $accessToken;
    const URI = 'https://api.pushbullet.com/v2/pushes';
    const NOTE = 'note';
    public function __construct($accessToken)
    {
        $this->accessToken = $accessToken;
    }
    public function register(Application $app)
    {
        $app['pushbullet.note'] = $app->protect(function ($title, $body) {
            return $this->push(self::NOTE, $title, $body);
        });
    }
    private function push($type, $title, $body)
    {
        $data = [
            'type'  => $type,
            'title' => $title,
            'body'  => $body,
        ];
        $ch = curl_init();
        curl_setopt_array($ch, [
            CURLOPT_URL            => self::URI,
            CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
            CURLOPT_CUSTOMREQUEST  => 'POST',
            CURLOPT_POSTFIELDS     => json_encode($data),
            CURLOPT_HTTPAUTH       => CURLAUTH_BASIC,
            CURLOPT_USERPWD        => $this->accessToken . ':',
            CURLOPT_RETURNTRANSFER => true
        ]);
        $out = curl_exec($ch);
        curl_close($ch);

        return json_decode($out);
    }
    public function boot(Application $app)
    {
    }
}

Normally I use Guzzle to handle HTTP requests, but in this example I've created a raw curl connection.
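
For comparison, the same push with Guzzle would look roughly like this (a sketch, assuming Guzzle 6 and its auth/json request options):

use GuzzleHttp\Client;

$client = new Client();

// Same call as the curl version: basic auth with the token, JSON body
$response = $client->post('https://api.pushbullet.com/v2/pushes', [
    'auth' => [$accessToken, ''],
    'json' => [
        'type'  => 'note',
        'title' => $title,
        'body'  => $body,
    ],
]);

$out = json_decode($response->getBody(), true);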

You can see the project in my github account

Microservice container with Guzzle

This days I’m reading about Microservices. The idea is great. Instead of building a monolithic script using one language/framowork. We create isolated services and we build our application using those services (speaking HTTP between services and application).

That’s means we’ll have several microservices and we need to use them, and maybe sometimes change one service with another one. In this post I want to build one small container to handle those microservices. Similar idea than Dependency Injection Containers.

As we’re going to speak HTTP, we need a HTTP client. We can build one using curl, but in PHP world we have Guzzle, a great HTTP client library. In fact Guzzle has something similar than the idea of this post: Guzzle services, but I want something more siple.

Imagine we have different services:
One Silex service (PHP + Silex)

use Silex\Application;

$app = new Application();

$app->get('/hello/{username}', function($username) {
    return "Hello {$username} from silex service";
});

$app->run();

Another PHP service, this one using the Slim framework:

use Slim\Slim;

$app = new Slim();

$app->get('/hello/:username', function ($username) {
    echo "Hello {$username} from slim service";
});

$app->run();

And finally a Python service using the Flask framework:

from flask import Flask, jsonify
app = Flask(__name__)

@app.route('/hello/<username>')
def show_user_profile(username):
    return "Hello %s from flask service" % username

if __name__ == "__main__":
    app.run(debug=True, host='0.0.0.0', port=5000)

Now, with our simple container we can use one service or another

use Symfony\Component\Config\FileLocator;
use MSIC\Loader\YamlFileLoader;
use MSIC\Container;

$container = new Container();

$ymlLoader = new YamlFileLoader($container, new FileLocator(__DIR__));
$ymlLoader->load('container.yml');

echo $container->getService('flaskServer')->get('/hello/Gonzalo')->getBody() . "\n";
echo $container->getService('silexServer')->get('/hello/Gonzalo')->getBody() . "\n";
echo $container->getService('slimServer')->get('/hello/Gonzalo')->getBody() . "\n";
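
Under the hood the idea boils down to a map from service names to preconfigured Guzzle clients. A stripped-down sketch (not the actual MSIC code, and assuming Guzzle 6's base_uri option) could be something like:

use GuzzleHttp\Client;

class NaiveContainer
{
    private $services = [];

    public function registerService($name, $baseUri)
    {
        // One preconfigured HTTP client per microservice
        $this->services[$name] = new Client(['base_uri' => $baseUri]);
    }

    public function getService($name)
    {
        return $this->services[$name];
    }
}

$container = new NaiveContainer();
$container->registerService('silexServer', 'http://localhost:8080');

echo $container->getService('silexServer')->get('/hello/Gonzalo')->getBody() . "\n";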

And that’s all. You can see the project in my github account.

PHP Dumper using Websockets

Another crazy idea: I want to dump my backend output in the browser's console. There are several PHP dumpers, for example Raul Fraile's LadyBug. There are also libraries that do exactly what I want, such as Chrome Logger. But I wanted to use WebSockets and dump values in real time, without waiting for the end of the backend script. Why? The answer is simple: because I wanted to :)

I’ve written several post about Websockets, Silex, PHP. In this case I’ll use a similar approach than the previous posts. First I’ve created a simple Webscocket server with socket.io. This server also starts a Express server to handle internal messages from the Silex Backend

var CONF = {
        IO: {HOST: '0.0.0.0', PORT: 8888},
        EXPRESS: {HOST: '0.0.0.0', PORT: 26300}
    },
    express = require('express'),
    expressApp = express(),
    server = require('http').Server(expressApp),
    io = require('socket.io')(server, {origins: 'localhost:*'})
    ;

expressApp.get('/:type/:session/:message', function (req, res) {
    console.log(req.params);
    var session = req.params.session,
        type = req.params.type,
        message = req.params.message;

    io.sockets.emit('dumper.' + session, {title: type, data: JSON.parse(message)});
    res.json('OK');
});

io.sockets.on('connection', function (socket) {
    console.log("Socket connected!");
});

expressApp.listen(CONF.EXPRESS.PORT, CONF.EXPRESS.HOST, function () {
    console.log('Express started');
});

server.listen(CONF.IO.PORT, CONF.IO.HOST, function () {
    console.log('IO started');
});

Now we create a simple service provider to connect our Silex Backend to our Express server (and send the dumper's messages over the WebSocket connection):

<?php

namespace Dumper\Silex\Provider;

use Silex\Application;
use Silex\ServiceProviderInterface;
use Dumper\Dumper;
use Silex\Provider\SessionServiceProvider;
use GuzzleHttp\Client;

class DumperServiceProvider implements ServiceProviderInterface
{
    private $wsConnector;
    private $client;

    public function __construct(Client $client, $wsConnector)
    {
        $this->client = $client;
        $this->wsConnector = $wsConnector;
    }

    public function register(Application $app)
    {
        $app->register(new SessionServiceProvider());

        $app['dumper'] = function () use ($app) {
            return new Dumper($this->client, $this->wsConnector, $app['session']->get('uid'));
        };

        $app['dumper.init'] = $app->protect(function ($uid) use ($app) {
            $app['session']->set('uid', $uid);
        });

        $app['dumper.uid'] = function () use ($app) {
            return $app['session']->get('uid');
        };
    }

    public function boot(Application $app)
    {
    }
}
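
The Dumper class itself isn't listed in this post. Judging from how the provider builds it and how the routes use it below (error() and info() calls, a Guzzle client and the connector URL), a minimal sketch could look like this (the send() helper and anything beyond error/info are assumptions):

namespace Dumper;

use GuzzleHttp\Client;

class Dumper
{
    private $client;
    private $wsConnector;
    private $uid;

    public function __construct(Client $client, $wsConnector, $uid)
    {
        $this->client = $client;
        $this->wsConnector = $wsConnector;
        $this->uid = $uid;
    }

    public function error($data)
    {
        $this->send('error', $data);
    }

    public function info($data)
    {
        $this->send('info', $data);
    }

    // GET /{type}/{session}/{message} on the Express connector (see the node server above)
    private function send($type, $data)
    {
        $this->client->get($this->wsConnector . "/{$type}/{$this->uid}/" . json_encode($data));
    }
}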

Finally our Silex application looks like this:

include __DIR__ . '/../vendor/autoload.php';

use Silex\Application;
use Silex\Provider\TwigServiceProvider;
use Dumper\Silex\Provider\DumperServiceProvider;
use GuzzleHttp\Client;

$app = new Application([
    'debug' => true
]);

$app->register(new DumperServiceProvider(new Client(), 'http://192.168.1.104:26300'));

$app->register(new TwigServiceProvider(), [
    'twig.path' => __DIR__ . '/../views',
]);

$app->get("/", function (Application $app) {
    $uid = uniqid();

    $app['dumper.init']($uid);

    return $app['twig']->render('index.twig', [
        'uid' => $uid
    ]);
});

$app->get('/api/hello', function (Application $app) {
    $app['dumper']->error("Hello world1");
    $app['dumper']->info([1,2,3]);

    return $app->json('OK');
});


$app->run();

On the client side we have one index.html. I've created a Twig template to pass the uid to the dumper object (the WebSocket channel to listen to), but we could also fetch this uid from the backend with an ajax call.

<!DOCTYPE html>
<html>
<head lang="en">
    <meta charset="UTF-8">
    <title>Dumper example</title>
</head>
<body>

<a href="#" onclick="api('hello')">hello</a>

<!-- We use jQuery just for the demo. Library doesn't need jQuery -->
<script src="assets/jquery/dist/jquery.min.js"></script>
<!-- We load the library -->
<script src="js/dumper.js"></script>

<script>
    dumper.startSocketIo('{{ uid }}', '//localhost:8888');
    function api(name) {
        // we perform a remote api ajax call that triggers websockets
        $.getJSON('/api/' + name, function (data) {
            // Doing nothing. We only call the api to test php dumper
        });
    }
</script>
</body>
</html>

I use jQuery to handle the ajax request and to connect to the WebSocket dumper object (the dumper itself doesn't depend on jQuery, only on socket.io):

var dumper = (function () {
    var socket, sessionUid, socketUri, init;

    init = function () {
        if (typeof(io) === 'undefined') {
            setTimeout(init, 100);
        } else {
            socket = io(socketUri);

            socket.on('dumper.' + sessionUid, function (data) {
                console.group('Dumper:', data.title);
                switch (data.title) {
                    case 'emergency':
                    case 'alert':
                    case 'critical':
                    case 'error':
                        console.error(data.data);
                        break;
                    case 'warning':
                        console.warn(data.data);
                        break;
                    case 'notice':
                    case 'info':
                    //case 'debug':
                        console.info(data.data);
                        break;
                    default:
                        console.log(data.data);
                }
                console.groupEnd();
            });
        }
    };

    return {
        startSocketIo: function (uid, uri) {
            var script = document.createElement('script');
            var node = document.getElementsByTagName('script')[0];

            sessionUid = uid;
            socketUri = uri;
            script.src = socketUri + '/socket.io/socket.io.js';
            node.parentNode.insertBefore(script, node);

            init();
        }
    };
})();

Source code is available in my github account

Building a TCP server daemon with PHP and Ratchet

In my daily work I normally play a lot with TCP servers, clients and things like that. I like to use Linux's xinetd daemon to handle the TCP ports.

I’ve also written something about it. This approach works fine. You don’t need to open any port. Xinet.d opens the ports and invoke the PHP scripts. The problem appears when we call intensively our xinet.d server. It creates one PHP instance per request. It isn’t a problem with one request in, for example, 3 seconds, but if we need to handle 10 requests per second our server load will grow. The solution: a dedicated server.

With PHP we can create dedicated servers using, for example, Ratchet. I want to create a library on top of Ratchet to open TCP ports and register callbacks on those ports to handle the requests (Reactor pattern). Do you know Silex? Of course you do. This library brings the Silex idea (registering callbacks on routes) to the TCP world.

Let me show examples:

Example 1:

use React\EventLoop\Factory as LoopFactory;
use G\Pxi\Pxinetd;

$loop = LoopFactory::create();
$service = new Pxinetd('0.0.0.0');

$service->on(8080, function ($data) {
    echo $data;
});

$service->register($loop);
$loop->run();

That’s the simplest example. A TCP echo server. We open 8080 port to all interfaces (0.0.0.0) and we return a simple input echo.

We can also listen on several ports:

use G\Pxi\Pxinetd;
use React\EventLoop\Factory as LoopFactory;

$loop = LoopFactory::create();
$service = new Pxinetd('0.0.0.0');

$service->on(8080, function ($data) {
    echo $data;
});

$service->on(8888, function ($data) {
    echo $data;
});

$service->register($loop);
$loop->run();

Example 2:
We can also work with the connection

use G\Pxi\Pxinetd;
use G\Pxi\Connection;
use React\EventLoop\Factory as LoopFactory;

$loop = LoopFactory::create();
$service = new Pxinetd('0.0.0.0');

$service->on(8080, function ($data) {
    echo $data;
});

$service->on(8088, function ($data, Connection $conn) {
    var_dump($conn->getRemoteAddress());
    echo $data;
    $conn->send("....");
    $conn->close();
});

$service->register($loop);
$loop->run();

Example 3:
I’m a big fan of YAML configurations, so we can load configurations from a YAML file, of course:

conf3.yml:

ports:
  9999:
    class: Services\Reader1
Services/Reader1.php:

namespace Services;

use G\Pxi\Connection;
use G\Pxi\MessageIface;

class Reader1 implements MessageIface
{
    public function onMessage($data, Connection $conn)
    {
        echo $data . $conn->getRemoteAddress();
    }
}

And the main script:

use G\Pxi\Pxinetd;
use G\Pxi\YamlFileLoader;
use Symfony\Component\Config\FileLocator;
use React\EventLoop\Factory as LoopFactory;

$loop = LoopFactory::create();
$service = new Pxinetd('0.0.0.0');

$loader = new YamlFileLoader($service, new FileLocator(__DIR__ ));
$loader->load('conf3.yml');

$service->on(8080, function ($data) {
    echo "$data";
});

$service->register($loop);
$loop->run();

Example 4:
We’re using symfony/config and symfony/yaml components, so we can use hierarchy within our yaml files:

use G\Pxi\Pxinetd;
use G\Pxi\YamlFileLoader;
use Symfony\Component\Config\FileLocator;
use React\EventLoop\Factory as LoopFactory;

$loop = LoopFactory::create();
$service = new Pxinetd('0.0.0.0');

$loader = new YamlFileLoader($service, new FileLocator(__DIR__));
$loader->load('conf4.yml');

$service->on(8080, function ($data) {
    echo "$data";
});

$service->register($loop);
$loop->run();

conf4.yml:

imports:
  - { resource: conf4_2.yml }
ports:
  9999:
    class: Services\Reader1

conf4_2.yml:

ports:
  7777:
    class: Services\Reader1

Example 5:
And finally one bonus. This script is single threaded. That means if one process takes too much time it will block the rest of the processes. We could implement threads, but I try to avoid them like the plague. I prefer to create a Silex app (behind an http server) and perform http requests to "emulate" threads in a simple way.

use G\Pxi\Pxinetd;
use G\Pxi\YamlFileLoader;
use Symfony\Component\Config\FileLocator;
use React\EventLoop\Factory as LoopFactory;

$loop = LoopFactory::create();
$service = new Pxinetd('0.0.0.0');

$loader = new YamlFileLoader($service, new FileLocator(__DIR__ ));
$loader->load('conf5.yml');

$service->on(8080, function ($data) {
    echo "$data";
});

$service->register($loop);
$loop->run();

conf5.yml

ports:
  9999:
    class: Services\Reader1
  9991:
    url: http://localhost:8899/onMessage/{data}
  9992:
    url: http://localhost:8899/simulateError/{data}

And now our Silex server running on port 8899:

include __DIR__ . "/../../vendor/autoload.php";

use Silex\Application;
use Symfony\Component\HttpKernel\Exception\NotFoundHttpException;

$app = new Application();

$app->get('/onMessage/{data}', function ($data) {
    return "OK" . "'{$data}'";
});

$app->get('/simulateError/{data}', function ($data) {
    throw new NotFoundHttpException();
});

$app->run();

And that’s all. What do you think? You can see the whole library in my github account.

Handling AngularJs POST requests with a Silex Backend

These days I'm working a lot with AngularJS applications (who isn't?). Normally my backend is a Silex application. It's pretty straightforward to build a REST API with Silex, but when we play with an AngularJS client we have to face a problem: POST requests "don't" work. That's not 100% true. They work, indeed, but they speak different languages.

Silex assumes our POST requests are encoded as application/x-www-form-urlencoded, but Angular encodes POST requests as application/json. That's not a problem in itself; neither encoding is mandatory.

For example

name: Gonzalo
surname: Ayuso

If we use application/x-www-form-urlencoded, it’s encoded to:
name=Gonzalo&surname=Ayuso

And if we use application/json, it’s encoded to:
{ "name" : "Gonzalo", "surname" : "Ayuso" }

It’s the same but it isn’t.

Imagine this Silex example.

use Silex\Application;
use Symfony\Component\HttpFoundation\Request;

$app = new Application();

$app->post("/post", function (Application $app, Request $request) {
    return $app->json([
        'status' => true,
        'name'   => $request->get('name')
    ]);
});

This example works with application/x-www-form-urlencoded, but it doesn't work with application/json: we cannot use Symfony\Component\HttpFoundation\Request's parameter bag as usual. We can, however, get the raw request body with:

$request->getContent();

The content of an application/json encoded request is a JSON string, so we can use json_decode to access those parameters.
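
For example, a route that decodes the body by hand could look like this sketch:

$app->post("/post", function (Application $app, Request $request) {
    // Decode the raw JSON body ourselves
    $data = json_decode($request->getContent(), true);

    return $app->json([
        'status' => true,
        'name'   => isset($data['name']) ? $data['name'] : null
    ]);
});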

If we read the Silex documentation we can see how to handle those requests

http://silex.sensiolabs.org/doc/cookbook/json_request_body.html

In this post we’re going to enclose this code within a service provider. OK, that’s not really a service provider (it doesn’t provide any service). It just change the request (when we get application/json) without copy and paste the same code within each project.

use Silex\Application;
use G\AngularPostRequestServiceProvider;
use Symfony\Component\HttpFoundation\Request;

$app = new Application();
$app->register(new AngularPostRequestServiceProvider());

$app->post("/post", function (Application $app, Request $request) {
    return $app->json([
        'status' => true,
        'name'   => $request->get('name')
    ]);
});

The service provider is very simple

namespace G;

use Silex\Application;
use Symfony\Component\HttpFoundation\Request;
use Pimple\ServiceProviderInterface;
use Pimple\Container;

class AngularPostRequestServiceProvider implements ServiceProviderInterface
{
    public function register(Container $app)
    {
        $app->before(function (Request $request) {
            if ($this->isRequestTransformable($request)) {
                $transformedRequest = $this->transformContent($request->getContent());
                $request->request->replace($transformedRequest);
            }
        });
    }

    public function boot(Application $app)
    {
    }

    private function transformContent($content)
    {
        return json_decode($content, true);
    }

    private function isRequestTransformable(Request $request)
    {
        return 0 === strpos($request->headers->get('Content-Type'), 'application/json');
    }
}

You can see the whole code in my github account and also in packagist
