Happy logins. Only the happy user will pass

Login forms are boring. In this example we’re going to create a special login form, only for happy users. Happiness is something complicated, but at least a smile is easier to obtain, and everything is better with a smile :). Our login form will only appear if the user smiles. Let’s start.

I must admit that this project is just an excuse to play with different technologies I wanted to try. A few weeks ago I discovered a library called face_classification. With this library I can perform emotion classification from a picture. The idea is simple. We create a RabbitMQ RPC server script that answers with the emotion of the face within a picture. Then we obtain one frame from the webcam video stream (with HTML5) and send it over a websocket to a socket.io server. This websocket server (node) asks the RabbitMQ RPC server for the emotion and sends back to the browser the emotion and the original picture with a rectangle drawn over the face.

Frontend

Since we’re going to use socket.io for the websockets, we’ll use the same script to serve the frontend (the login form and the HTML5 video capture)

<!doctype html>
<html>
<head>
    <title>Happy login</title>
    <link rel="stylesheet" href="css/app.css">
</head>
<body>

<div id="login-page" class="login-page">
    <div class="form">
        <h1 id="nonHappy" style="display: block;">Only the happy user will pass</h1>
        <form id="happyForm" class="login-form" style="display: none" onsubmit="return false;">
            <input id="user" type="text" placeholder="username"/>
            <input id="pass" type="password" placeholder="password"/>
            <button id="login">login</button>
            <p></p>
            <img id="smile" width="426" height="320" src=""/>
        </form>
        <div id="video">
            <video style="display:none;"></video>
            <canvas id="canvas" style="display:none"></canvas>
            <canvas id="canvas-face" width="426" height="320"></canvas>
        </div>
    </div>
</div>

<div id="private" style="display: none;">
    <h1>Private page</h1>
</div>

<script src="https://code.jquery.com/jquery-3.2.1.min.js" integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4=" crossorigin="anonymous"></script>
<script src="https://unpkg.com/sweetalert/dist/sweetalert.min.js"></script>
<script type="text/javascript" src="/socket.io/socket.io.js"></script>
<script type="text/javascript" src="/js/app.js"></script>
</body>
</html>

Here we’ll connect to the websocket and emit the webcam frame to the server. We’ll also listen to an event called ‘response’, with which the server notifies us when an emotion has been detected.

let socket = io.connect(location.origin),
    img = new Image(),
    canvasFace = document.getElementById('canvas-face'),
    context = canvasFace.getContext('2d'),
    canvas = document.getElementById('canvas'),
    width = 640,
    height = 480,
    delay = 1000,
    jpgQuality = 0.6,
    isHappy = false;

socket.on('response', function (r) {
    let data = JSON.parse(r);
    if (data.length > 0 && data[0].hasOwnProperty('emotion')) {
        if (isHappy === false && data[0]['emotion'] === 'happy') {
            isHappy = true;
            swal({
                title: "Good!",
                text: "All is better with one smile!",
                icon: "success",
                buttons: false,
                timer: 2000,
            });

            $('#nonHappy').hide();
            $('#video').hide();
            $('#happyForm').show();
            $('#smile')[0].src = 'data:image/png;base64,' + data[0].image;
        }

        img.onload = function () {
            context.drawImage(this, 0, 0, canvasFace.width, canvasFace.height);
        };

        img.src = 'data:image/png;base64,' + data[0].image;
    }
});

navigator.getMedia = (navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia);

navigator.getMedia({video: true, audio: false}, (mediaStream) => {
    let video = document.getElementsByTagName('video')[0];
    video.src = window.URL.createObjectURL(mediaStream);
    video.play();
    setInterval(((video) => {
        return function () {
            let context = canvas.getContext('2d');
            canvas.width = width;
            canvas.height = height;
            context.drawImage(video, 0, 0, width, height);
            socket.emit('img', canvas.toDataURL('image/jpeg', jpgQuality));
        }
    })(video), delay)
}, error => console.log(error));
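
A side note: the snippet above uses the old callback-based getUserMedia API and window.URL.createObjectURL over a MediaStream, both of which current browsers have deprecated. A modern equivalent (not part of the original project, just a sketch) would look like this:

navigator.mediaDevices.getUserMedia({video: true, audio: false})
    .then((mediaStream) => {
        let video = document.getElementsByTagName('video')[0];
        // assign the stream directly instead of going through createObjectURL
        video.srcObject = mediaStream;
        video.play();
    })
    .catch(error => console.log(error));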

$(() => {
    $('#login').click(() => {
        $('#login-page').hide();
        $('#private').show();
    })
});
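
The socket.io relay server itself is not shown in this post. A minimal sketch of what it could look like, assuming the amqplib package and the standard RabbitMQ RPC pattern (exclusive reply queue plus correlation id) against the "image.check" queue the Python server below listens on. Ports are assumptions, and the real server in the repo also serves the static frontend:

const io = require('socket.io')(3000);  // port is an assumption
const amqp = require('amqplib');

amqp.connect('amqp://localhost')
    .then(conn => conn.createChannel())
    .then(ch => {
        io.on('connection', socket => {
            socket.on('img', dataUrl => {
                // strip the "data:image/jpeg;base64," prefix; the Python side expects base64 bytes
                const payload = Buffer.from(dataUrl.split(',')[1]);
                const correlationId = Math.random().toString(36).slice(2);

                ch.assertQueue('', {exclusive: true}).then(q => {
                    ch.consume(q.queue, msg => {
                        if (msg.properties.correlationId === correlationId) {
                            // the RPC answer is the JSON the frontend parses in its 'response' handler
                            socket.emit('response', msg.content.toString());
                        }
                    }, {noAck: true});

                    ch.sendToQueue('image.check', payload, {correlationId: correlationId, replyTo: q.queue});
                });
            });
        });
    });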

Backend
Finally we’ll work on the backend. Basically I’ve checked the examples available in the face_classification project and tuned them a bit according to my needs.

from rabbit import builder
import logging
import numpy as np
from keras.models import load_model
from utils.datasets import get_labels
from utils.inference import detect_faces
from utils.inference import draw_text
from utils.inference import draw_bounding_box
from utils.inference import apply_offsets
from utils.inference import load_detection_model
from utils.inference import load_image
from utils.preprocessor import preprocess_input
import cv2
import json
import base64

detection_model_path = 'trained_models/detection_models/haarcascade_frontalface_default.xml'
emotion_model_path = 'trained_models/emotion_models/fer2013_mini_XCEPTION.102-0.66.hdf5'
emotion_labels = get_labels('fer2013')
font = cv2.FONT_HERSHEY_SIMPLEX

# hyper-parameters for bounding boxes shape
emotion_offsets = (20, 40)

# loading models
face_detection = load_detection_model(detection_model_path)
emotion_classifier = load_model(emotion_model_path, compile=False)

# getting input model shapes for inference
emotion_target_size = emotion_classifier.input_shape[1:3]


def format_response(response):
    decoded_json = json.loads(response)
    return "Hello {}".format(decoded_json['name'])


def on_data(data):
    f = open('current.jpg', 'wb')
    f.write(base64.decodebytes(data))
    f.close()
    image_path = "current.jpg"

    out = []
    # loading images
    rgb_image = load_image(image_path, grayscale=False)
    gray_image = load_image(image_path, grayscale=True)
    gray_image = np.squeeze(gray_image)
    gray_image = gray_image.astype('uint8')

    faces = detect_faces(face_detection, gray_image)
    for face_coordinates in faces:
        x1, x2, y1, y2 = apply_offsets(face_coordinates, emotion_offsets)
        gray_face = gray_image[y1:y2, x1:x2]

        try:
            gray_face = cv2.resize(gray_face, (emotion_target_size))
        except:
            continue

        gray_face = preprocess_input(gray_face, True)
        gray_face = np.expand_dims(gray_face, 0)
        gray_face = np.expand_dims(gray_face, -1)
        emotion_label_arg = np.argmax(emotion_classifier.predict(gray_face))
        emotion_text = emotion_labels[emotion_label_arg]
        color = (0, 0, 255)

        draw_bounding_box(face_coordinates, rgb_image, color)
        draw_text(face_coordinates, rgb_image, emotion_text, color, 0, -50, 1, 2)
        bgr_image = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2BGR)

        cv2.imwrite('predicted.png', bgr_image)
        data = open('predicted.png', 'rb').read()
        encoded = base64.encodebytes(data).decode('utf-8')
        out.append({
            'image': encoded,
            'emotion': emotion_text,
        })

    return out

logging.basicConfig(level=logging.WARN)
rpc = builder.rpc("image.check", {'host': 'localhost', 'port': 5672})
rpc.server(on_data)

Here you can see the working prototype in action

Maybe we could do the same with other tools, perhaps even in a simpler way, but as I said before this example is just an excuse to play with those technologies:

  • Send webcam frames via websockets
  • Connect a web application to a Python application via RabbitMQ RPC
  • Play with the face classification script

Please don’t use this script in production. It’s just a proof of concept. With smiles, but a proof of concept 🙂

You can see the project in my github account


Real Time IoT in the cloud with SAP’s SCP, Cloud Foundry and WebSockets

Nowadays I’m involved in a cloud project based on SAP Cloud Platform (SCP). Side projects are the best way to master new technologies (at least for me), so I want to build something with SCP and my Arduino stuff. SCP comes with an IoT module. In fact every cloud platform has, in one way or another, an IoT module (Amazon, Azure, …). With SCP the IoT module is basically a Hana database where we can push our IoT values, and we’re able to retrieve the information via oData (the common way in the SAP world).

It’s pretty straightforward to configure the IoT module with the SAP Cloud Platform Cockpit (everything can be done with a Hana trial account).

NodeMcu

First I’m going to use a simple circuit with my NodeMcu connected to my wifi network. The prototype is a potentiometer connected to the analog input. I normally use this circuit because I can change the value just by turning the potentiometer wheel. I know it’s not very useful, but we can easily swap it for a sensor (temperature, humidity, light, …)

It will send the percentage (from 0 to 100) of the position of the potentiometer directly to the cloud.

#include <ESP8266WiFi.h>

const int potentiometerPin = 0;

// Wifi configuration
const char* ssid = "my-wifi-ssid";
const char* password = "my-wifi-password";

// SAP SCP specific configuration
const char* host = "mytenant.hanatrial.ondemand.com";
String device_id = "my-device-ide";
String message_type_id = "my-device-type-id";
String oauth_token = "my-oauth-token";

String url = "https://[mytenant].hanatrial.ondemand.com/com.sap.iotservices.mms/v1/api/http/data/" + device_id;

const int httpsPort = 443;

WiFiClientSecure clientTLS;

void wifiConnect() {
  Serial.println();
  Serial.print("Connecting to ");
  Serial.println(ssid);

  WiFi.begin(ssid, password);

  while (WiFi.status() != WL_CONNECTED) {
    delay(500);
    Serial.print(".");
  }
  Serial.println("");
  Serial.print("WiFi connected.");
  Serial.print("IP address: ");
  Serial.println(WiFi.localIP());
}

void sendMessage(int value) {
  String payload = "{\"mode\":\"async\", \"messageType\":\"" + message_type_id + "\", \"messages\":[{\"value\": " + (String) value + "}]}";
  Serial.print("connecting to ");
  Serial.println(host);
  if (!clientTLS.connect(host, httpsPort)) {
    Serial.println("connection failed");
    return;
  }

  Serial.print("requesting payload: ");
  Serial.println(url);

  clientTLS.print(String("POST ") + url + " HTTP/1.0\r\n" +
               "Host: " + host + "\r\n" +
               "Content-Type: application/json;charset=utf-8\r\n" +
               "Authorization: Bearer " + oauth_token + "\r\n" +
               "Content-Length: " + payload.length() + "\r\n\r\n" +
               payload + "\r\n\r\n");

  Serial.println("request sent");

  Serial.println("reply was:");
  while (clientTLS.connected()) {
    String line = clientTLS.readStringUntil('\n');
    Serial.println(line);
  }
}

void setup() {
  Serial.begin(9600);
  wifiConnect();

  delay(10);
}

int mem;
void loop() {

  int value = ((analogRead(potentiometerPin) * 100) / 1010);
  if (value < (mem - 1) or value > (mem + 1)) {
    sendMessage(value);
    Serial.println(value);
    mem = value;
  }

  delay(200);
}

SCP

SAP Cloud Platform allows us to create web applications easily using the SAPUI5 framework. It also allows us to create a destination (the way SAP’s cloud connects different modules) pointing to our IoT module. Every Hana table can also be exposed via oData, so we can retrieve the information easily within SAPUI5.

onAfterRendering: function () {
    var model = this.model;

    this.getView().getModel().read("/my-hana-table-odata-uri", {
        urlParameters: {
            $top: 1,
            $orderby: "G_CREATED desc"
        },
        success: function (oData) {
            model.setProperty("/value", oData.results[0].C_VALUE);
        }
    });
}

and display it in a view

<mvc:View controllerName="gonzalo123.iot.controller.Main" xmlns:html="http://www.w3.org/1999/xhtml" xmlns:mvc="sap.ui.core.mvc"
          displayBlock="true" xmlns="sap.m">
    <App>
        <pages>
            <Page title="{i18n>title}">
                <content>
                    <GenericTile class="sapUiTinyMarginBegin sapUiTinyMarginTop tileLayout" header="nodemcu" frameType="OneByOne">
                        <tileContent>
                            <TileContent unit="%">
                                <content>
                                    <NumericContent value="{view>/value}" icon="sap-icon://line-charts"/>
                                </content>
                            </TileContent>
                        </tileContent>
                    </GenericTile>
                </content>
            </Page>
        </pages>
    </App>
</mvc:View>

Cloud Foundry

The web application (with SCP and SAPUI5) can access the IoT values via oData. We could fetch the data again and again, but that’s not cool. We want real time updates in the web application, so we need WebSockets. SCP’s IoT module allows us to use WebSockets to push information, but not to get updates (afaik; let me know if I’m wrong). We could also connect our IoT module to an existing MQTT server, but in this prototype I only want to use websockets. So we’re going to create a simple WebSocket server with node and socket.io. This server will be listening for device updates (again and again with a setInterval function via oData) and when it detects a change it will emit a broadcast over the WebSocket.

SAP’s SCP also allows us to create services with Cloud Foundry, so we’ll create our nodejs server there.

var http = require('http'),
    io = require('socket.io'),
    request = require('request'),
    auth = "Basic " + new Buffer(process.env.USER + ":" + process.env.PASS).toString("base64"),
    url = process.env.IOT_ODATA,
    INTERVAL = process.env.INTERVAL,
    socket,
    value;

server = http.createServer();
server.listen(process.env.PORT || 3000);

socket = io.listen(server);

setInterval(function () {
    request.get({
        url: url,
        headers: {
            "Authorization": auth,
            "Accept": "application/json"
        }
    }, function (error, response, body) {
        var newValue = JSON.parse(body).d.results[0].C_VALUE;
        if (value !== newValue) {
            value = newValue;
            socket.sockets.emit('value', value);
        }
    });
}, INTERVAL);
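
On the browser side, the SAPUI5 application only needs to connect to that socket.io service and push the broadcast value into the view model. A minimal, hypothetical sketch (the service URL is made up, and it assumes the controller keeps a reference to the same JSON model bound to the NumericContent tile):

var socket = io.connect('https://my-websocket-service.cfapps.eu10.hana.ondemand.com');

socket.on('value', function (value) {
    // same model used in onAfterRendering above
    model.setProperty("/value", value);
});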

And that’s all. My NodeMcu device is connected to the cloud.

Full project available in my github

Playing with Docker, Silex, Python, Node and WebSockets

I’m learning Docker. In this post I want to share a little experiment that I have done. I know the code looks like over-engineering but it’s just an excuse to build something with docker and containers. Let me explain it a little bit.

The idea is to build a time clock in the browser. Something like this:

Clock

Yes, I know. We could do it with just js, css and html, but we want to hack a little bit more. The idea is to create:

  • A Silex/PHP frontend
  • A WebSocket server with socket.io/node
  • A Python script to obtain the current time

The WebSocket server will open two ports: one to serve WebSockets (socket.io) and another one as an http server (express). The Python script will get the current time and send it to the WebSocket server. Finally one frontend (Silex) will be listening for the WebSocket event and will render the current time.

That’s the WebSocket server (with socket.io and express)

var
    express = require('express'),
    expressApp = express(),
    server = require('http').Server(expressApp),
    io = require('socket.io')(server, {origins: 'localhost:*'})
    ;

expressApp.get('/tic', function (req, res) {
    io.sockets.emit('time', req.query.time);
    res.json('OK');
});

expressApp.listen(6400, '0.0.0.0');

server.listen(8080);

That’s our Python script

from time import gmtime, strftime, sleep
import httplib2

h = httplib2.Http()
while True:
    (resp, content) = h.request("http://node:6400/tic?time=" + strftime("%H:%M:%S", gmtime()))
    sleep(1)

And our Silex frontend

use Silex\Application;
use Silex\Provider\TwigServiceProvider;

$app = new Application(['debug' => true]);
$app->register(new TwigServiceProvider(), [
    'twig.path' => __DIR__ . '/../views',
]);

$app->get("/", function (Application $app) {
    return $app['twig']->render('index.twig', []);
});

$app->run();

using this twig template

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>Docker example</title>
    <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css" integrity="sha384-BVYiiSIFeK1dGmJRAkycuHAHRg32OmUcww7on3RYdg4Va+PmSTsz/K68vbdEjh4u" crossorigin="anonymous">
    <link href="css/app.css" rel="stylesheet">
    <script src="https://oss.maxcdn.com/html5shiv/3.7.3/html5shiv.min.js"></script>
    <script src="https://oss.maxcdn.com/respond/1.4.2/respond.min.js"></script>
</head>
<body>
<div class="site-wrapper">
    <div class="site-wrapper-inner">
        <div class="cover-container">
            <div class="inner cover">
                <h1 class="cover-heading">
                    <div id="display">
                        display
                    </div>
                </h1>
            </div>
        </div>
    </div>
</div>
<script src="//localhost:8080/socket.io/socket.io.js"></script>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js"></script>
<script>
var socket = io.connect('//localhost:8080');

$(function () {
    socket.on('time', function (data) {
        $('#display').html(data);
    });
});
</script>
</body>
</html>

The idea is to use one Docker container for each process. I like to have all the code in one place, so all the containers will share the same volume with the source code.

First the node container (WebSocket server)

FROM node:argon

RUN mkdir -p /mnt/src
WORKDIR /mnt/src/node

EXPOSE 8080 6400

Now the python container

FROM python:2

RUN pip install httplib2

RUN mkdir -p /mnt/src
WORKDIR /mnt/src/python

And finally the frontend container (apache2 with Ubuntu 16.04)

FROM ubuntu:16.04

RUN locale-gen es_ES.UTF-8
RUN update-locale LANG=es_ES.UTF-8
ENV DEBIAN_FRONTEND=noninteractive

RUN apt-get update -y
RUN apt-get install --no-install-recommends -y apache2 php libapache2-mod-php
RUN apt-get clean -y

COPY ./apache2/sites-available/000-default.conf /etc/apache2/sites-available/000-default.conf

RUN mkdir -p /mnt/src

RUN a2enmod rewrite
RUN a2enmod proxy
RUN a2enmod mpm_prefork

RUN chown -R www-data:www-data /mnt/src
ENV APACHE_RUN_USER www-data
ENV APACHE_RUN_GROUP www-data
ENV APACHE_LOG_DIR /var/log/apache2
ENV APACHE_LOCK_DIR /var/lock/apache2
ENV APACHE_PID_FILE /var/run/apache2/apache2.pid
ENV APACHE_SERVERADMIN admin@localhost
ENV APACHE_SERVERNAME localhost

EXPOSE 80

Now we’ve got the three containers, but we want to run them all together. We’ll use a docker-compose.yml file. The web container will expose port 80 and the node container port 8080. The node container also opens 6400, but this is an internal port; we don’t need to access it from outside. Only the Python container needs to reach it. Because of that, 6400 is not mapped to any host port in docker-compose.

version: '2'

services:
  web:
    image: gonzalo123/example_web
    container_name: example_web
    ports:
     - "80:80"
    restart: always
    depends_on:
      - node
    build:
      context: ./images/php
      dockerfile: Dockerfile
    entrypoint:
      - /usr/sbin/apache2
      - -D
      - FOREGROUND
    volumes:
     - ./src:/mnt/src

  node:
    image: gonzalo123/example_node
    container_name: example_node
    ports:
     - "8080:8080"
    restart: always
    build:
      context: ./images/node
      dockerfile: Dockerfile
    entrypoint:
      - npm
      - start
    volumes:
     - ./src:/mnt/src

  python:
      image: gonzalo123/example_python
      container_name: example_python
      restart: always
      depends_on:
        - node
      build:
        context: ./images/python
        dockerfile: Dockerfile
      entrypoint:
        - python
        - tic.py
      volumes:
       - ./src:/mnt/src

And that’s all. We only need to start our containers

docker-compose up --build -d

and open our browser at: http://localhost to see our Time clock

Full source code available within my github account

Encrypt Websocket (socket.io) communications

I’m a big fan of WebSockets and socket.io. I’ve written a lot about them. In recent posts I’ve written about socket.io and authentication. Today we’re going to speak about communications.

Imagine we’ve got a websocket server and we connect our application to this server (even using https/wss). If we open our browser’s console we can inspect our WebSocket communications. We can also enable debugging. This works in a similar way to enabling promiscuous mode on a network interface: we will see every packet, not only the packets that the server is sending to us.

If we send sensitive information over websockets, that means that one logged-in user can see another one’s information. We can separate namespaces in our socket.io server. We can also do another thing: encrypt the communications using crypto-js.

I’ve created a small wrapper to use it with socket.io.
We can install our server dependency

npm install g-crypt

And install our client dependency with bower

bower install g-crypt

And use it in our server

var io = require('socket.io')(3000),
    Crypt = require("g-crypt"),
    passphrase = 'super-secret-passphrase',
    crypter = Crypt(passphrase);

io.on('connection', function (socket) {
    socket.on('counter', function (data) {
        var decriptedData = crypter.decrypt(data);
        setTimeout(function () {
            console.log("counter status: " + decriptedData.id);
            decriptedData.id++;
            socket.emit('counter', crypter.encrypt(decriptedData));
        }, 1000);
    });
});

And now a simple HTTP application

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>Title</title>
</head>
<body>
Open console to see the messages

<script src="http://localhost:3000/socket.io/socket.io.js"></script>
<script src="assets/cryptojslib/rollups/aes.js"></script>
<script src="assets/g-crypt/src/Crypt.js"></script>
<script>
    var socket = io('http://localhost:3000/'),
        passphrase = 'super-secret-passphrase',
        crypter = Crypt(passphrase),
        id = 0;

    socket.on('connect', function () {
        console.log("connected! Let's start the counter with: " + id);
        socket.emit('counter', crypter.encrypt({id: id}));
    });

    socket.on('counter', function (data) {
        var decriptedData = crypter.decrypt(data);
        console.log("counter status: " + decriptedData.id);
        socket.emit('counter', crypter.encrypt({id: decriptedData.id}));
    });
</script>

</body>
</html>

Now our communications are encrypted and a logged-in user cannot read another user’s data.

The library is a simple wrapper

Crypt = function (passphrase) {
    "use strict";
    var pass = passphrase;
    var CryptoJSAesJson = {
        parse: function (jsonStr) {
            var j = JSON.parse(jsonStr);
            var cipherParams = CryptoJS.lib.CipherParams.create({ciphertext: CryptoJS.enc.Base64.parse(j.ct)});
            if (j.iv) cipherParams.iv = CryptoJS.enc.Hex.parse(j.iv);
            if (j.s) cipherParams.salt = CryptoJS.enc.Hex.parse(j.s);
            return cipherParams;
        },
        stringify: function (cipherParams) {
            var j = {ct: cipherParams.ciphertext.toString(CryptoJS.enc.Base64)};
            if (cipherParams.iv) j.iv = cipherParams.iv.toString();
            if (cipherParams.salt) j.s = cipherParams.salt.toString();
            return JSON.stringify(j);
        }
    };

    return {
        decrypt: function (data) {
            return JSON.parse(CryptoJS.AES.decrypt(data, pass, {format: CryptoJSAesJson}).toString(CryptoJS.enc.Utf8));
        },
        encrypt: function (data) {
            return CryptoJS.AES.encrypt(JSON.stringify(data), pass, {format: CryptoJSAesJson}).toString();
        }
    };
};

if (typeof module !== 'undefined' && typeof module.exports !== 'undefined') {
    CryptoJS = require("crypto-js");
    module.exports = Crypt;
} else {
    window.Crypt = Crypt;
}
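
Just to illustrate the round trip outside socket.io, here is a quick node usage sketch of the wrapper:

var Crypt = require("g-crypt"),
    crypter = Crypt('super-secret-passphrase');

var encrypted = crypter.encrypt({id: 1}); // a JSON string holding ct, iv and salt
console.log(crypter.decrypt(encrypted));  // { id: 1 }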

The library is available in my github and we can also install it via npm and bower.

Sharing authentication between socket.io and a PHP frontend (using JSON Web Tokens)

I’ve written a previous post about sharing authentication between socket.io and a PHP frontend, but after publishing it a colleague (hi @mariotux) told me that I could use JSON Web Tokens (jwt) to do this. I had never used jwt before, so I decided to study it a little bit.

JWTs are pretty straightforward. You only need to create the token and send it to the client. You don’t need to store this token in a database; the client can decode and validate it on its own. You can also use any programming language to encode and decode tokens (jwt libraries are available for the most common ones)
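
A minimal node sketch of that round trip, using the same jsonwebtoken package and secret that appear in the server below:

var jwt = require('jsonwebtoken'),
    secret = "my_super_secret_key";

// the backend signs the token...
var token = jwt.sign({user: {username: 'gonzalo'}}, secret);

// ...and anyone who knows the secret can validate and decode it,
// without looking anything up in a database
console.log(jwt.verify(token, secret).user.username); // gonzalo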

We’re going to create the same example as in the previous post. Today, with jwt, we don’t need to pass the PHP session around and perform an http request to validate it. We’ll only pass the token, and our nodejs server will validate it on its own.

var io = require('socket.io')(3000),
    jwt = require('jsonwebtoken'),
    secret = "my_super_secret_key";

// middleware to perform authorization
io.use(function (socket, next) {
    var token = socket.handshake.query.token,
        decodedToken;
    try {
        decodedToken = jwt.verify(token, secret);
        console.log("token valid for user", decodedToken.user);
        socket.connectedUser = decodedToken.user;
        next();
    } catch (err) {
        console.log(err);
        next(new Error("not valid token"));
        //socket.disconnect();
    }
});

io.on('connection', function (socket) {
    console.log('Connected! User: ', socket.connectedUser);
});

That’s the client:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>Title</title>
</head>
<body>
Welcome {{ user }}!

<script src="http://localhost:3000/socket.io/socket.io.js"></script>
<script src="/assets/jquery/dist/jquery.js"></script>

<script>
    var socket;
    $(function () {
        $.getJSON("/getIoConnectionToken", function (jwt) {
            socket = io('http://localhost:3000', {
                query: 'token=' + jwt
            });

            socket.on('connect', function () {
                console.log("connected!");
            });

            socket.on('error', function (err) {
                console.log(err);
            });
        });
    });
</script>

</body>
</html>

And here the backend: a simple Silex server, very similar to the one in the previous post. JWT also has several reserved claims, for example “exp” to set an expiration timestamp. It’s very useful: we set one value and the validator will reject tokens whose timestamp has passed. In this example I’m not using an expiration date, which means my token never expires. And never means never. In my first prototype I set up a small expiration time (10 seconds), so the token was only valid during 10 seconds. Sounds great: my backend generates tokens that are going to be used immediately. That’s the normal situation, but what happens if I restart the socket.io server? The client will try to reconnect using the token, but it’s expired, so we’d need to create a new jwt before reconnecting. Because of that I’ve removed the expiration date in this example, but remember: without an expiration date your generated tokens will be valid forever (and forever is a very long time)

<?php
include __DIR__ . "/../vendor/autoload.php";

use Firebase\JWT\JWT;
use Silex\Application;
use Silex\Provider\SessionServiceProvider;
use Silex\Provider\TwigServiceProvider;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Exception\AccessDeniedHttpException;

$app = new Application([
    'secret' => "my_super_secret_key",
    'debug' => true
]);
$app->register(new SessionServiceProvider());
$app->register(new TwigServiceProvider(), [
    'twig.path' => __DIR__ . '/../views',
]);

$app->get('/', function (Application $app) {
    return $app['twig']->render('home.twig');
});
$app->get('/login', function (Application $app) {
    $username = $app['request']->server->get('PHP_AUTH_USER', false);
    $password = $app['request']->server->get('PHP_AUTH_PW');
    if ('gonzalo' === $username && 'password' === $password) {
        $app['session']->set('user', ['username' => $username]);

        return $app->redirect('/private');
    }
    $response = new Response();
    $response->headers->set('WWW-Authenticate', sprintf('Basic realm="%s"', 'site_login'));
    $response->setStatusCode(401, 'Please sign in.');

    return $response;
});

$app->get('/getIoConnectionToken', function (Application $app) {
    $user = $app['session']->get('user');
    if (null === $user) {
        throw new AccessDeniedHttpException('Access Denied');
    }

    $jwt = JWT::encode([
        // I can use "exp" reserved claim. It's cool. My connection token is only available
        // during a period of time. The problem is if I restart the io server. Client will
        // try to re-connect using this token and it's expired.
        //"exp"  => (new \DateTimeImmutable())->modify('+10 second')->getTimestamp(),
        "user" => $user
    ], $app['secret']);

    return $app->json($jwt);
});

$app->get('/private', function (Application $app) {
    $user = $app['session']->get('user');

    if (null === $user) {
        throw new AccessDeniedHttpException('Access Denied');
    }

    $userName = $user['username'];

    return $app['twig']->render('private.twig', [
        'user'  => $userName
    ]);
});
$app->run();
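
About that expiration trade-off, here is a quick, hypothetical node check (not part of the project) of what the socket.io middleware would see when a token signed with “exp” is verified after it has expired:

var jwt = require('jsonwebtoken'),
    secret = "my_super_secret_key",
    shortLived = jwt.sign({user: {username: 'gonzalo'}}, secret, {expiresIn: 10}); // seconds

setTimeout(function () {
    try {
        jwt.verify(shortLived, secret);
    } catch (err) {
        // TokenExpiredError: a client reconnecting with this token gets "not valid token"
        console.log(err.name);
    }
}, 11000);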

Full project in my github.

Book review: Socket.IO Cookbook

Last summer I collaborated as a technical reviewer on the book “Socket.IO Cookbook”, written by Tyson Cadenhead, and finally I’ve got the book in my hands

I’m a big fan of real time technologies and I’m normally a Socket.io user. Because of that, when the people at Packt Publishing contacted me to join the project as a technical reviewer, my answer was yes. Nowadays I’ve got serious problems finding time for pet projects and extra activities, but if there are WebSockets inside I cannot resist.

The book is correct and it’s a good starting point for event-based communication with JavaScript. I normally don’t like beginners’ books (even if I’m a beginner in the technology); I don’t like books where the author explains how to do something that I can already see how to do on the project’s website. OK, this book isn’t one of those. The writer doesn’t assume the reader is a total newbie. Because of that, newbies may sometimes get lost in some chapters, but this is exactly the way we all learn new technologies. I like the way Tyson introduces concepts about socket.io.

The book is focused on JavaScript and also uses JavaScript on the backend (with node). Maybe I miss integration with non-JavaScript environments, but as socket.io is a JavaScript library I understand that using JavaScript across the whole application lifecycle is a good approach.


Also, in those days I was reading and playing a little bit with WebRTC, and the book has one chapter about it! #cool

Enclosing socket.io Websocket connection inside a HTML5 SharedWorker

I really like WebSockets. I’ve written several posts about them. Today we’re going to speak about something related to WebSockets. Let me explain it a little bit.

Imagine that we build a web application with WebSockets. That means that when we start the application, we need to connect to the WebSockets server. If our application is a single-page application, we’ll create one socket per application, but what happens if we open three tabs with the application in the browser? The answer is simple: we’ll create three sockets. Also, if we reload one tab (a full refresh) we’ll disconnect our socket and reconnect again. Maybe we can handle this situation, but we can easily bypass this disconnect-connect dance with an HTML5 feature called SharedWorkers.

Web Workers allow us to run JavaScript processes in the background. We can also create Shared Workers. SharedWorkers are shared within our browser session. That means that we can enclose our WebSocket connection inside a SharedWorker, and if we open various tabs in our browser we only need one socket (one socket per session instead of one socket per tab).
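
As a quick reminder of the basic mechanics (independent of the library shown below): the page creates the SharedWorker and talks to it through a message port, and the worker receives a ‘connect’ event for every tab that attaches to it. A minimal sketch:

// worker.js: every tab doing new SharedWorker('worker.js') triggers 'connect' here
addEventListener('connect', function (event) {
    var port = event.ports[0];
    port.start();
    port.addEventListener('message', function (e) {
        port.postMessage('echo: ' + e.data);
    });
});

// page side: each tab gets its own port to the single shared worker instance
var worker = new SharedWorker('worker.js');
worker.port.addEventListener('message', function (e) { console.log(e.data); });
worker.port.start();
worker.port.postMessage('hello');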

I’ve written a simple library called gio to perform this operation. gio uses socket.io to create the WebSockets. SharedWorker is a fairly new HTML5 feature and needs a modern browser, while socket.io also works with old browsers, so gio checks whether SharedWorkers are available: if they aren’t, it creates a plain socket.io connection instead of enclosing it in a SharedWorker.

We can watch a simple video to see how it works. In the video we can see how the sockets are created: only one socket is created even if we open more than one tab in our browser, but if we open a new session (an incognito session, for example), a new socket is created

Here we can see the SharedWorker code:

"use strict";

importScripts('socket.io.js');

var socket = io(self.name),
    ports = [];

addEventListener('connect', function (event) {
    var port = event.ports[0];
    ports.push(port);
    port.start();

    port.addEventListener("message", function (event) {
        for (var i = 0; i < event.data.events.length; ++i) {
            var eventName = event.data.events[i];

            socket.on(event.data.events[i], function (e) {
                port.postMessage({type: eventName, message: e});
            });
        }
    });
});

socket.on('connect', function () {
    for (var i = 0; i < ports.length; i++) {
        ports[i].postMessage({type: '_connect'});
    }
});

socket.on('disconnect', function () {
    for (var i = 0; i < ports.length; i++) {
        ports[i].postMessage({type: '_disconnect'});
    }
});

And here we can see the gio source code:

var gio = function (uri, onConnect, onDisConnect) {
    "use strict";
    var worker, onError, workerUri, events = {};

    function getKeys(obj) {
        var keys = [];

        for (var i in obj) {
            if (obj.hasOwnProperty(i)) {
                keys.push(i);
            }
        }

        return keys;
    }

    function onMessage(type, message) {
        switch (type) {
            case '_connect':
                if (onConnect) onConnect();
                break;
            case '_disconnect':
                if (onDisConnect) onDisConnect();
                break;
            default:
                if (events[type]) events[type](message);
        }
    }

    function startWorker() {
        worker = new SharedWorker(workerUri, uri);
        worker.port.addEventListener("message", function (event) {
            onMessage(event.data.type, event.data.message);

        }, false);

        worker.onerror = function (evt) {
            if (onError) onError(evt);
        };

        worker.port.start();
        worker.port.postMessage({events: getKeys(events)});
    }

    function startSocketIo() {
        var socket = io(uri);
        socket.on('connect', function () {
            if (onConnect) onConnect();
        });

        socket.on('disconnect', function () {
            if (onDisConnect) onDisConnect();
        });

        for (var eventName in events) {
            if (events.hasOwnProperty(eventName)) {
                socket.on(eventName, socketOnEventHandler(eventName));
            }
        }
    }

    function socketOnEventHandler(eventName) {
        return function (e) {
            onMessage(eventName, e);
        };
    }

    return {
        registerEvent: function (eventName, callback) {
            events[eventName] = callback;
        },

        start: function () {
            if (typeof SharedWorker === 'undefined') {
                startSocketIo();
            } else {
                startWorker();
            }
        },

        onError: function (cbk) {
            onError = cbk;
        },

        setWorker: function (uri) {
            workerUri = uri;
        }
    };
};

And here the application code:

(function (gio) {
    "use strict";

    var onConnect = function () {
        console.log("connected!");
    };

    var onDisConnect = function () {
        console.log("disconnect!");
    };

    var ws = gio("http://localhost:8080", onConnect, onDisConnect);
    ws.setWorker("sharedWorker.js");

    ws.registerEvent("message", function (data) {
        console.log("message", data);
    });

    ws.onError(function (data) {
        console.log("error", data);
    });

    ws.start();
}(gio));

I’ve also created a simple webSocket server with socket.io. In this small server there’s a setInterval function broadcasting one message per second to all clients, just to see the application working

var io, connectedSockets;

io = require('socket.io').listen(8080);
connectedSockets = 0;

io.sockets.on('connection', function (socket) {
    connectedSockets++;
    console.log("Socket connected! Conected sockets:", connectedSockets);

    socket.on('disconnect', function () {
        connectedSockets--;
        console.log("Socket disconnect! Conected sockets:", connectedSockets);
    });
});

setInterval(function() {
    io.emit("message", "Hola " + new Date().getTime());
}, 1000); 

Source code is available in my github account.

Yet Another example of WebSockets, socket.io and AngularJs working with a Silex backend

Remember my last post about WebSockets and AngularJs? Today we’re going to play with something similar. I want to create a key-value interface to play with websockets. Let me explain it a little bit.

First we’re going to see the backend: one Silex application with two routes, a GET one and a POST one:

<?php

include __DIR__ . '/../../vendor/autoload.php';
include __DIR__ . '/SqlLiteStorage.php';

use Silex\Application;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Silex\Provider\DoctrineServiceProvider;

$app = new Application([
    'debug'      => true,
    'ioServer'   => 'http://localhost:3000',
    'httpServer' => 'http://localhost:3001',
]);

$app->after(function (Request $request, Response $response) {
    $response->headers->set('Access-Control-Allow-Origin', '*');
});

$app->register(new G\Io\EmitterServiceProvider($app['httpServer']));
$app->register(new DoctrineServiceProvider(), [
    'db.options' => [
        'driver' => 'pdo_sqlite',
        'path'   => __DIR__ . '/../../db/app.db.sqlite',
    ],
]);
$app->register(new G\Io\Storage\Provider(new SqlLiteStorage($app['db'])));

$app->get('conf', function (Application $app, Request $request) {
    $chanel = $request->get('token');
    return $app->json([
        'ioServer' => $app['ioServer'],
        'chanel'   => $chanel
    ]);
});

$app->get('/{key}', function (Application $app, $key) {
    return $app->json($app['gdb.get']($key));
});

$app->post('/{key}', function (Application $app, Request $request, $key) {
    $content = json_decode($request->getContent(), true);

    $chanel = $content['token'];
    $app->json($app['gdb.post']($key, $content['value']));

    $app['io.emit']($chanel, [
        'key'   => $key,
        'value' => $content['value']
    ]);

    return $app->json(true);
});

$app->run();

As we can see we register one service provider:

$app->register(new G\Io\Storage\Provider(new SqlLiteStorage($app['db'])));

This provider needs an instance of StorageIface

namespace G\Io\Storage;

interface StorageIface
{
    public function get($key);

    public function post($key, $value);
}

Our implementation uses SQLite, but it’s pretty straightforward to change it to another database storage or even a NoSQL database.

use Doctrine\DBAL\Connection;
use G\Io\Storage\StorageIface;

class SqlLiteStorage implements StorageIface
{
    private $db;

    public function __construct(Connection $db)
    {
        $this->db = $db;
    }

    public function get($key)
    {
        $statement = $this->db->executeQuery('select value from storage where key = :KEY', ['KEY' => $key]);
        $data      = $statement->fetchAll();

        return isset($data[0]['value']) ? $data[0]['value'] : null;
    }

    public function post($key, $value)
    {
        $this->db->beginTransaction();

        $statement = $this->db->executeQuery('select value from storage where key = :KEY', ['KEY' => $key]);
        $data      = $statement->fetchAll();

        if (count($data) > 0) {
            $this->db->update('storage', ['value' => $value], ['key' => $key]);
        } else {
            $this->db->insert('storage', ['key' => $key, 'value' => $value]);
        }

        $this->db->commit();

        return $value;
    }
}

We also register another Service provider:

$app->register(new G\Io\EmitterServiceProvider($app['httpServer']));

This provider’s responsibility is to notify the websocket server when anything changes within the storage:

namespace G\Io;

use Pimple\Container;
use Pimple\ServiceProviderInterface;

class EmitterServiceProvider implements ServiceProviderInterface
{
    private $server;

    public function __construct($url)
    {
        $this->server = $url;
    }

    public function register(Container $app)
    {
        $app['io.emit'] = $app->protect(function ($chanel, $params) use ($app) {
            $s = curl_init();
            curl_setopt($s, CURLOPT_URL, "{$this->server}/emit/?" . http_build_query($params) . '&_chanel=' . $chanel);
            curl_setopt($s, CURLOPT_RETURNTRANSFER, true);
            $content = curl_exec($s);
            $status  = curl_getinfo($s, CURLINFO_HTTP_CODE);
            curl_close($s);

            if ($status != 200) throw new \Exception();

            return $content;
        });
    }
}

The WebSocket server is a simple socket.io server, together with an Express server to handle the backend’s triggers.

var
    express = require('express'),
    expressApp = express(),
    server = require('http').Server(expressApp),
    io = require('socket.io')(server, {origins: 'localhost:*'})
    ;

expressApp.get('/emit', function (req, res) {
    io.sockets.emit(req.query._chanel, req.query);
    res.json('OK');
});

expressApp.listen(3001);

server.listen(3000);

Our client application is an AngularJs application:

<!doctype html>
<html ng-app="app">
<head>
    <script src="//localhost:3000/socket.io/socket.io.js"></script>
    <script src="assets/angularjs/angular.js"></script>
    <script src="js/app.js"></script>
    <script src="js/gdb.js"></script>
</head>
<body>
 
<div ng-controller="MainController">
    <input type="text" ng-model="key">
    <button ng-click="change()">change</button>
</div>
 
</body>
</html>

angular.module('app', ['Gdb'])

    .run(function (Gdb) {
        Gdb.init({
            server: 'http://localhost:8080/gdb',
            token: '4b96716bcb3d42fc01ff421ea2cfd757'
        });
    })

    .controller('MainController', function ($scope, Gdb) {
        $scope.change = function () {
            Gdb.set('key', $scope.key).then(function() {
                console.log("Value set");
            });
        };

        Gdb.get('key').then(function (data) {
            $scope.key = data;
        });

        Gdb.watch('key', function (value) {
            console.log("Value updated");
            $scope.key = value;
        });
    })
;

As we can see, the AngularJs application uses a small library called Gdb to handle the communication with the backend and the WebSockets:

angular.module('Gdb', [])
    .factory('Gdb', function ($http, $q, $rootScope) {

        var socket,
            gdbServer,
            token,
            watches = {};

        var Gdb = {
            init: function (conf) {
                gdbServer = conf.server;
                token = conf.token;

                $http.get(gdbServer + '/conf', {params: {token: token}}).success(function (data) {
                    socket = io.connect(data.ioServer);
                    socket.on(data.chanel, function (data) {
                        watches.hasOwnProperty(data.key) ? watches[data.key](data.value) : null;
                        $rootScope.$apply();
                    });
                });
            },

            set: function (key, value) {
                var deferred = $q.defer();

                $http.post(gdbServer + '/' + key, {value: value, token: token}).success(function (data) {
                    deferred.resolve(data);
                });

                return deferred.promise;
            },

            get: function (key) {
                var deferred = $q.defer();

                $http.get(gdbServer + '/' + key, {params: {token: token}}).success(function (data) {
                    deferred.resolve(JSON.parse(data));
                });

                return deferred.promise;
            },

            watch: function (key, closure) {
                watches[key] = closure;
            }
        };

        return Gdb;
    });

And that’s all. You can see the whole project at github.

Playing with websockets, angularjs and socket.io

I’m a big fan of websockets. I’ve got various posts about them (here, here). In the last months I’ve been working on angularjs projects, and because of that I want to play a little bit with websockets (with socket.io) and angularjs.

I want to build an angular service.

angular.module('io.service', []).
    factory('io', function ($http) {
        var socket,
            apiServer,
            ioEvent,
            watches = {};

        return {
            init: function (conf) {
                apiServer = conf.apiServer;
                ioEvent = conf.ioEvent;

                socket = io.connect(conf.ioServer);
                socket.on(ioEvent, function (data) {
                    return watches.hasOwnProperty(data.item) ? watches[data.item](data) : null;
                });
            },

            emit: function (arguments) {
                return $http.get(apiServer + '/request', {params: arguments});
            },

            watch: function (item, closure) {
                watches[item] = closure;
            },

            unWatch: function (item) {
                delete watches[item];
            }
        };
    });

And now we can build the application

angular.module('myApp', ['io.service']).

    run(function (io) {
        io.init({
            ioServer: 'http://localhost:3000',
            apiServer: 'http://localhost:8080/api',
            ioEvent: 'io.response'
        });
    }).

    controller('MainController', function ($scope, io) {
        $scope.$watch('question', function (newValue, oldValue) {
            if (newValue != oldValue) {
                io.emit({item: 'question', newValue: newValue, oldValue: oldValue});
            }
        });

        io.watch('answer', function (data) {
            $scope.answer = data.value;
            $scope.$apply();
        });
    });

And this is the html

<!doctype html>
<html>

<head>
    <title>ws experiment</title>
</head>

<body ng-app="myApp">

<div ng-controller="MainController">

    <input type="text" ng-model="question">
    <hr>
    <h1>Hello {{answer}}!</h1>
</div>

<script src="assets/angular/angular.min.js"></script>
<script src="//localhost:3000/socket.io/socket.io.js"></script>

<script src="js/io.js"></script>
<script src="js/app.js"></script>

</body>
</html>

The idea of the application is to watch one of the model’s variables (‘question’ in this example); each time it changes we send the value to the websocket server and do something with it (in our example we convert the string to upper case)

As you can read in one of my previous posts, I don’t like to send messages from the web browser to the websocket server directly (due to the authentication issues commented there). I prefer to use one server in between (a Silex server in this example)

include __DIR__ . '/../../vendor/autoload.php';

use Silex\Application;
use Symfony\Component\HttpFoundation\Request;

$app = new Application(['debug' => true]);
$app->register(new G\Io\EmitterServiceProvider());

$app->get('/request', function (Application $app, Request $request) {

    $params = [
        'item'     => $request->get('item'),
        'newValue' => strtoupper($request->get('newValue')),
        'oldValue' => $request->get('oldValue'),
    ];

    try {
        $app['io.emit']($params);
        $params['status'] = true;
    } catch (\Exception $e) {
        $params['status'] = false;
    }

    return $app->json($params);
});

$app->run();
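
The node websocket server itself isn’t shown in this post. A minimal sketch, along the lines of the emitter server from the key-value example above (the /request path and the HTTP port 3001 are assumptions; the socket.io port 3000 and the ‘io.response’ event come from the angular config):

var express = require('express'),
    expressApp = express(),
    server = require('http').Server(expressApp),
    io = require('socket.io')(server)
    ;

// the Silex EmitterServiceProvider would hit this endpoint with the params
expressApp.get('/request', function (req, res) {
    io.sockets.emit('io.response', req.query);
    res.json('OK');
});

expressApp.listen(3001);

// socket.io port used by the angular io service (ioServer: 'http://localhost:3000')
server.listen(3000);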

You can see the code within my github account.

How to send the output of Symfony’s process Component to a node.js server in Real Time with Socket.io

Today, another crazy idea. Do you know Symfony’s Process Component? The Process Component is a simple component that executes commands in sub-processes. I like to use it when I need to execute commands in the operating system. The documentation is pretty straightforward. Normally, when I want to collect the output of the script (imagine we run those scripts from a crontab), I save the output in a log file and I can check it, even in real time, with the tail -f command.

This approach works, but I want to do it in a browser (call me crazy :)). I’ve written a couple of posts about something similar. What do I want to do now? The idea is simple:

First I want to execute custom commands with Process. Just follow the Process documentation:

<?php
use Symfony\Component\Process\Process;

$process = new Process('ls -lsa');
$process->setTimeout(3600);
$process->run();
if (!$process->isSuccessful()) {
    throw new \RuntimeException($process->getErrorOutput());
}

print $process->getOutput();

Process has one cool feature: we can get feedback in real time by passing an anonymous function to the run() method:

<?php
use Symfony\Component\Process\Process;

$process = new Process('ls -lsa');
$process->run(function ($type, $buffer) {
    if ('err' === $type) {
        echo 'ERR > '.$buffer;
    } else {
        echo 'OUT > '.$buffer;
    }
});

The idea now is to use this callback to send the output over a TCP socket to a node.js server

var LOCAL_PORT = 5600;
var server = require('net').createServer(function (socket) {
    socket.on('data', function (msg) {
        console.log(msg);
    });
}).listen(LOCAL_PORT);

server.on('listening', function () {
    console.log("TCP server accepting connection on port: " + LOCAL_PORT);
});

Now we change our php script to

<?php
include __DIR__ . "/../vendor/autoload.php";

use Symfony\Component\Process\Process;

function runProcess($command)
{
    $address = 'localhost';
    $port = 5600;
    $socket = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
    socket_connect($socket, $address, $port);

    $process = new Process($command);
    $process->setTimeout(3600);
    $process->run(
        function ($type, $buffer) use ($socket) {
            if ('err' === $type) {
                socket_write($socket, "ERROR\n", strlen("ERROR\n"));
                socket_write($socket, $buffer, strlen($buffer));
            } else {

                socket_write($socket, $buffer, strlen($buffer));
            }
        }
    );
    if (!$process->isSuccessful()) {
        throw new \RuntimeException($process->getErrorOutput());
    }
    socket_close($socket);
}

runProcess('ls -latr /');

Now, with the node.js server started, if we run the php script we will see the output of the process command in node’s terminal. But we want to show it in a browser. What can we do? Of course, socket.io. We change the node.js code to:

var LOCAL_PORT = 5600;
var SOKET_IO_PORT = 8000;
var ioClients = [];

var io = require('socket.io').listen(SOKET_IO_PORT);

var server = require('net').createServer(function (socket) {
    socket.on('data', function (msg) {
        ioClients.forEach(function (ioClient) {
            ioClient.emit('log', msg.toString().trim());
        });
    });
}).listen(LOCAL_PORT);

io.sockets.on('connection', function (socketIo) {
    ioClients.push(socketIo);
});

server.on('listening', function () {
    console.log("TCP server accepting connection on port: " + LOCAL_PORT);
});

and finally we create a simple web client:

<script src="http://localhost:8000/socket.io/socket.io.js"></script>
<script>
    var socket = io.connect('http://localhost:8000');
    socket.on('log', function (data) {
        console.log(data);
    });
</script>

Now if we open a browser we will see the output of our command line process within the console tab of the browser.

If you want something like a webConsole, I have also created an example with a Web UI that lets you send custom commands. You can see it on github.

Obviously this is only an experiment, with a lot of security issues we would need to take into account if we wanted to use it in production. What do you think?