Nosy Monster

Robin & I have been working on a rover for the land since his toy RC car broke. I opened it up to see if I could fix it, and as with many things, I quickly came to the conclusion that “I’ll just throw a Pi in there and do it myself”.

Here’s the supposedly amphibious piece of shit that broke within an hour of use. I have to take that back: the company replaced it, and the replacement has been sturdy through all the mods.

The motors still worked, so I bought a Raspberry Pi Zero with a Pi cam and some super cheap SunFounder relays.

From the ground up

Before anything else, we introduced the notion of a relay. In the past we used Lego motors and batteries to apply power directly to actuators and create little robots. I just snipped one of the wires and had Robin create contact manually so he could make the correlation between a closed circuit and the motor going.


With this “manual relay” in mind, we added a Pi controlled relay to show him that what the new gizmo does is what he was doing by hand.


OK, we have a web controlled Lego motor going. Let’s see if we can replicate that with the RC car’s motors.

First, the manual relay.


Then with the Pi controlled relays. Our first iteration looked like this and had a few issues. I separated the circuit powering the DC motors, and each was powered by only one AA battery. I also had many adjustments to make in the logic.

Eventually, by adding a DROK voltage regulator, I was able to power everything from a single USB charger and prevent the motors from affecting the rest of the circuits.

But the extra hardware is hard to fit in the Nosy Monster, so it’s unlikely that I will be able to fit the solar panel that would turn it into a completely autonomous robot. So I started googling for other potential frames and OH GOD I JUST STUMBLED INTO THE WORLD OF RC ROBOTICS. Oops…

In any case, I broke the controls down into a step by step process. Instead of pressing “Go” and “Stop”, pressing “Go” makes it go for one second. There are two reasons for this. First, web based control introduces delays which make for a shitty live driving experience. Second, I would like this to behave like an actual rover on another planet: it reports back its sensor status and humans decide on the next steps to follow. Heck, I’m even thinking the next steps could be something that is voted on online. This would not be possible with “live” control.
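A minimal sketch of that one-command-one-pulse idea in Python. The pin numbers, command names, and the injected set_pin callback are all hypothetical; on the actual rover you would pass in a function that drives the relay board through a GPIO library.

```python
import time

# hypothetical mapping of web commands to relay channels (BCM pin numbers)
RELAY_PINS = {"forward": 17, "backward": 27, "left": 22, "right": 23}
PULSE_SECONDS = 1.0  # each command runs the motors for exactly one second

def plan_pulse(command):
    """Translate a command into a (pin, duration) pulse, or None if unknown."""
    pin = RELAY_PINS.get(command)
    if pin is None:
        return None
    return (pin, PULSE_SECONDS)

def run_pulse(command, set_pin):
    """Close the relay, wait, open it again.

    `set_pin(pin, state)` is injected so the logic can be tested off the Pi;
    on the rover it would call into the GPIO library.
    """
    pulse = plan_pulse(command)
    if pulse is None:
        return False
    pin, duration = pulse
    set_pin(pin, True)   # closed circuit -> motor runs
    time.sleep(duration)
    set_pin(pin, False)  # open circuit -> motor stops
    return True
```

Because each press maps to one bounded pulse, queued or even crowd-voted commands stay safe: a lost “Stop” can never leave the motors running.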


Adding collaborative editing to the Ace web code editor with web sockets

Using Ace’s excellent API, it is relatively easy to enhance the editor to allow for live collaborative editing.

The gist of what we’re doing here is to use Ace’s API to extract and apply deltas when changes occur in the editor. Then we simply transmit them over a websocket to which all clients are connected. This example is functional but in no way comprehensive of what full collaborative code editing could be. It’s meant to be simple, thus understandable. It’s a great starting point for whatever other pieces of functionality you want to send across web sockets.

Loading Ace in a webpage with some custom Javascript

This is what your web page looks like: load Ace as instructed and add Javascript to handle interaction with the websocket server.

<!DOCTYPE html>
<html lang="en">
    <head>
        <title>Collaborative Ace Coding!</title>

        <style type="text/css" media="screen">
            #editor {
                position: absolute;
                top: 0;
                right: 0;
                bottom: 0;
                left: 0;
            }
        </style>

        <script src="https://<?=$_SERVER['HTTP_HOST']?>:1337/socket.io/socket.io.js"></script>
        <script src="ace-builds/src/ace.js" type="text/javascript" charset="utf-8"></script>
        <script src="ace-builds/src/ext-language_tools.js"></script>

        <script type="text/javascript">
            var session_id = null ;
            var editor = null ;
            var collaborator = null ;
            var buffer_dumped = false ;
            var last_applied_change = null ;
            var just_cleared_buffer = null ;

            function Collaborator( session_id ) {
                // connect to the websocket server on port 1337 of the same host
                this.collaboration_socket = io.connect( "https://" + window.location.hostname + ":1337", {query:'session_id=' + session_id} ) ;

                this.collaboration_socket.on( "change", function(delta) {
                    delta = JSON.parse( delta ) ;
                    last_applied_change = delta ;
                    editor.getSession().getDocument().applyDeltas( [delta] ) ;
                }.bind() ) ;

                this.collaboration_socket.on( "clear_buffer", function() {
                    just_cleared_buffer = true ;
                    console.log( "setting editor empty" ) ;
                    editor.setValue( "" ) ;
                }.bind() ) ;
            }

            Collaborator.prototype.change = function( delta ) {
                this.collaboration_socket.emit( "change", delta ) ;
            }

            Collaborator.prototype.clear_buffer = function() {
                this.collaboration_socket.emit( "clear_buffer" ) ;
            }

            Collaborator.prototype.dump_buffer = function() {
                this.collaboration_socket.emit( "dump_buffer" ) ;
            }

            function body_loaded() {

                session_id = "meow" ;

                editor = ace.edit( "editor" ) ;
                collaborator = new Collaborator( session_id ) ;

                // registering change callback
                editor.on( "change", function( e ) {
                    // TODO, we could make things more efficient and not likely to conflict by keeping track of change IDs
                    if( last_applied_change!=e && !just_cleared_buffer ) {
                        collaborator.change( JSON.stringify(e) ) ;
                    }
                    just_cleared_buffer = false ;
                }, false );

                editor.setTheme( "ace/theme/monokai") ;
                editor.$blockScrolling = Infinity ;

                collaborator.dump_buffer() ;

                document.getElementsByTagName('textarea')[0].focus() ;
                last_applied_change = null ;
                just_cleared_buffer = false ;
            }
        </script>
    </head>
    <body onLoad="body_loaded()">
        <div id="editor"></div>
    </body>
</html>

Parallel to this, run the following Node.js server script

Following is the Node.js websocket server, which must run on the same server that serves the web page above. It needs to be up for the page to work.

  1. Make sure port 1337 is open in the same capacity as ports 80 & 443; this is what the server listens on.
  2. Make sure to update the paths to the SSL certs; we use SSL on the websocket server so browsers can run the websocket Javascript regardless of whether their original context is SSL or not.
  3. You need to have Socket.IO installed
// config variables
verbose = false ;
session_directory = "/tmp" ; // it has to exist

/* https specific */
var https = require('https'),
    fs =    require('fs');

var options = {
    key:    fs.readFileSync('/path/to/your/ssl.key'),
    cert:   fs.readFileSync('/path/to/your/ssl.crt'),
    ca:     fs.readFileSync('/path/to/your/CA.crt')
} ;
var app = https.createServer(options);
io = require('socket.io').listen(app);     // server listens to https connections
app.listen(1337);

//io = require('socket.io').listen(2015) ; // non-SSL alternative
if( verbose ) { console.log( "> server launched" ) ; }

collaborations = [] ;
socket_id_to_session_id = [] ;

io.sockets.on('connection', function(socket) {
    var session_id = socket.manager.handshaken[socket.id].query['session_id'] ;

    socket_id_to_session_id[socket.id] = session_id ;

    if( verbose ) { console.log( session_id + " connected on socket " + socket.id ) ; }

    if( !(session_id in collaborations) ) {
        // not in memory, but is it on the filesystem?
        if( file_exists(session_directory + "/" + session_id) ) {
            if( verbose ) { console.log( "   session terminated previously, pulling back from filesystem" ) ; }
            var data = read_file( session_directory + "/" + session_id ) ;
            if( data!==false ) {
                collaborations[session_id] = {'cached_instructions':JSON.parse(data), 'participants':[]} ;
            } else {
                // something went wrong, we start from scratch
                collaborations[session_id] = {'cached_instructions':[], 'participants':[]} ;
            }
        } else {
            if( verbose ) { console.log( "   creating new session" ) ; }
            collaborations[session_id] = {'cached_instructions':[], 'participants':[]} ;
        }
    }

    collaborations[session_id]['participants'].push( socket.id ) ;

    socket.on('change', function( delta ) {
        if( verbose ) { console.log( "change " + socket_id_to_session_id[socket.id] + " " + delta ) ; }
        if( socket_id_to_session_id[socket.id] in collaborations ) {
            collaborations[socket_id_to_session_id[socket.id]]['cached_instructions'].push( ["change", delta, socket.id] ) ;
            for( var i=0 ; i<collaborations[session_id]['participants'].length ; i++ ) {
                if( socket.id!=collaborations[session_id]['participants'][i] ) {
                    io.sockets.socket(collaborations[session_id]['participants'][i]).emit( "change", delta ) ;
                }
            }
        } else {
            if( verbose ) { console.log( "WARNING: could not tie socket_id to any collaboration" ) ; }
        }
    }) ;

    socket.on('change_selection', function( selections ) {
        if( verbose ) { console.log( "change_selection " + socket_id_to_session_id[socket.id] + " " + selections ) ; }
        if( socket_id_to_session_id[socket.id] in collaborations ) {
            for( var i=0 ; i<collaborations[session_id]['participants'].length ; i++ ) {
                if( socket.id!=collaborations[session_id]['participants'][i] ) {
                    io.sockets.socket(collaborations[session_id]['participants'][i]).emit( "change_selection", selections ) ;
                }
            }
        } else {
            if( verbose ) { console.log( "WARNING: could not tie socket_id to any collaboration" ) ; }
        }
    }) ;

    socket.on('clear_buffer', function() {
        if( verbose ) { console.log( "clear_buffer " + socket_id_to_session_id[socket.id] ) ; }
        if( socket_id_to_session_id[socket.id] in collaborations ) {
            collaborations[socket_id_to_session_id[socket.id]]['cached_instructions'] = [] ;
            for( var i=0 ; i<collaborations[session_id]['participants'].length ; i++ ) {
                if( socket.id!=collaborations[session_id]['participants'][i] ) {
                    io.sockets.socket(collaborations[session_id]['participants'][i]).emit( "clear_buffer" ) ;
                }
            }
        } else {
            if( verbose ) { console.log( "WARNING: could not tie socket_id to any collaboration" ) ; }
        }
    }) ;

    socket.on('dump_buffer', function() {
        if( verbose ) { console.log( "dump_buffer " + socket_id_to_session_id[socket.id] ) ; }
        if( socket_id_to_session_id[socket.id] in collaborations ) {
            for( var i=0 ; i<collaborations[socket_id_to_session_id[socket.id]]['cached_instructions'].length ; i++ ) {
                socket.emit( collaborations[socket_id_to_session_id[socket.id]]['cached_instructions'][i][0], collaborations[socket_id_to_session_id[socket.id]]['cached_instructions'][i][1] ) ;
            }
        } else {
            if( verbose ) { console.log( "WARNING: could not tie socket_id to any collaboration" ) ; }
        }
        socket.emit( "buffer_dumped" ) ;
    }) ;

    socket.on('disconnect', function () {
        console.log( socket_id_to_session_id[socket.id] + " disconnected" ) ;
        var found_and_removed = false ;
        if( socket_id_to_session_id[socket.id] in collaborations ) {
            var index = collaborations[socket_id_to_session_id[socket.id]]['participants'].indexOf( socket.id ) ;
            if( index>-1 ) {
                collaborations[socket_id_to_session_id[socket.id]]['participants'].splice( index, 1 ) ;
                found_and_removed = true ;
                if( collaborations[socket_id_to_session_id[socket.id]]['participants'].length==0 ) {
                    if( verbose ) { console.log( "last participant in collaboration, committing to disk & removing from memory" ) ; }
                    // no one is left in this session, we commit it to disk & remove it from memory
                    write_file( session_directory + "/" + socket_id_to_session_id[socket.id], JSON.stringify(collaborations[socket_id_to_session_id[socket.id]]['cached_instructions']) ) ;
                    delete collaborations[socket_id_to_session_id[socket.id]] ;
                }
            }
        }
        if( !found_and_removed ) {
            console.log( "WARNING: could not tie socket_id to any collaboration" ) ;
        }
        console.log( collaborations ) ;
    }) ;
}) ;

function write_file( path, data ) {
    try {
        fs.writeFileSync( path, data ) ;
        return true ;
    } catch( e ) {
        return false ;
    }
}

function read_file( path ) {
    try {
        var data = fs.readFileSync( path ) ;
        return data ;
    } catch( e ) {
        return false ;
    }
}

function file_exists( path ) {
    try {
        var stats = fs.lstatSync( path ) ;
        if (stats.isFile()) {
            return true ;
        }
    } catch( e ) {
        return false ;
    }
    // we should not reach this point
    return false ;
}

Using Google's APIs with Python scripts

I was never able to find centralized, succinct, example-based documentation for doing domain-delegated API calls with Google. Hopefully this is exactly that documentation, assembled from all the pieces I gathered along the way.

Service Account Creation

  1. Go to the Google Developers Console (console.developers.google.com) and create a new project.
  2. Call it whatever you want
  3. Enable the APIs that this project will use. We’ll enable the Drive API for the purpose of this test.
  4. Go to the “Credentials” screen
  5. Create a “Service Account Key”
  6. Make it a “New service account” and give it a name.
  7. Download the JSON file that follows.
    This file contains the credentials for the account you just created. Treat it with care: anyone getting their hands on it can authenticate as the account. This is especially critical since we are about to grant domain delegation to the account we created. Anyone with access to this file is essentially able to run any API call masquerading as anyone in your Google Apps domain. This is, for all intents and purposes, a root account.

Domain Delegation

  1. Back on the “Credentials” screen, click “Manage service accounts”
  2. Edit the service account you just created
  3. Check the “Enable Google Apps Domain-wide Delegation” checkbox and click “Save”.
    Google at this point needs a product name for the consent screen, so be it.
  4. At this point, if everything went well, when you go back to the “Credentials” screen, you will notice that Google created an “OAuth 2.0 client ID” that is paired with the service account you created.

Domain delegation continued, configuring API client access

Granting domain delegation to the service account as we just did isn’t enough; we now need to specify the scopes for which the account can request delegated access.

  1. Go to your Google Apps domain’s Admin console.
  2. Select the Security tab.
  3. Click “Show more” -> “Advanced Settings”
  4. Click “Manage API client access”
  5. In the “Client Name” field, use the “client_id” field from the json file you downloaded earlier. You can get it via the following command:
    cat ~/Downloads/*.json | grep client_id | cut -d '"' -f4

    In the “One or More API Scopes” field, use the following scope:

    https://www.googleapis.com/auth/drive

    If you want to allow more scopes, comma-separate them. This interface is very finicky: only enter URLs, and don’t copy/paste the descriptions that show up for previous entries. There might also be a few minutes’ delay between you granting a scope and it taking effect.

  6. Click “Authorize”. You should get a new entry listing your client ID and its authorized scopes.
    If you need to find the URL for a scope, this link is helpful.

Scripting & OAuth 2.0 authentication

Okay! The account is all set up on the Google side of things; let’s write a Python script to use it. Here’s your starting point:

This script contains all the functions to get you started making API calls to Google with Python. It isn’t the simplest form it could be presented in, but it solves a few issues right off the bat:

  • All Google interactions are in the “google_api” class, which allows for efficient use of tokens. When “subbing as” a user in your domain, the class keeps track of users’ access tokens and only regenerates them when they expire.
  • Exponential back-off is baked in and generalized to anything unusual coming back from Google (based on observation).
  • SIGINT will get handled properly
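To make the back-off bullet concrete, here is a minimal, generic sketch of the idea; the call and looks_ok hooks are stand-ins, not the actual method names from the class.

```python
import time

def with_backoff(call, looks_ok, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Run `call()` until `looks_ok(result)` says the response is normal.

    On anything unusual, wait base_delay * 2**attempt (1s, 2s, 4s, ...) and
    retry. `sleep` is injectable so the logic can be tested without waiting.
    """
    for attempt in range(max_retries):
        result = call()
        if looks_ok(result):
            return result
        sleep(base_delay * (2 ** attempt))
    raise RuntimeError("gave up after %d attempts" % max_retries)
```

Generalizing the retry to "anything unusual" rather than specific HTTP codes matches the observation-driven approach described above: Google occasionally returns oddities that aren't plain 403/500 rate-limit errors.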

Before running the script, you may need to:

sudo apt-get update && sudo apt-get install python-pycurl

Run the script as follows:

./ /path/to/json/file/you/downloaded/earlier.json

It simply runs the “get about” Drive API call and prints the result. This should allow you to verify that the call was indeed executed as the account you specified in the arguments.

Once you’ve run this script, the sky is the limit: all the other Drive API calls can be added to it based on the get_about function.

Important note on scopes: just as you granted domain delegation to a comma-separated list of scopes in the Google Apps Admin Console earlier, this script needs to reflect the scopes being accessed: the same scopes, as a space-separated list, need to be part of your JWT claim set (line 78 of the script). So if you need to make calls against more than just Drive, make sure to update the scopes in both locations or your calls won’t work.
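For reference, here is a sketch of what such a claim set looks like. The field names come from Google’s OAuth 2.0 service-account flow; the email addresses below are placeholders, and the token endpoint may differ between API versions.

```python
import time

def build_claim_set(service_account_email, scopes, sub_user, now=None):
    """Build the claim set that gets signed into the JWT.

    `scopes` is a list; Google expects them space-separated in the `scope`
    claim. `sub_user` is the domain user being "subbed as" (masqueraded).
    """
    now = int(time.time()) if now is None else now
    return {
        "iss": service_account_email,  # the service account's email address
        "scope": " ".join(scopes),     # space separated, as noted above
        "aud": "https://www.googleapis.com/oauth2/v4/token",
        "exp": now + 3600,             # access tokens last at most one hour
        "iat": now,
        "sub": sub_user,               # the user to masquerade as
    }

claims = build_claim_set(
    "my-robot@my-project.iam.gserviceaccount.com",  # placeholder
    ["https://www.googleapis.com/auth/drive"],
    "someone@example.com",                          # placeholder
)
```

The dict is then signed with the private key from the JSON file you downloaded and exchanged at the `aud` endpoint for an access token.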

More scopes & more functions

Taking it one step further with the Google Enforcer. This is the project that led me down the path of writing my own class to handle Google API calls. While it is not quite ready for public use, I’m publishing the project here as it is an excellent reference for making all kinds of other Google API calls; some doing POSTs, PUTs and DELETEs, some implementing paging, et cetera.


The purpose of this project is to enforce permissions on a directory tree on the fly. There is an extravagant number of gotchas to figure out to do this. If you are interested in implementing it in your organization, please leave a comment and I can either help or get it ready for public use, depending on interest.

This project works towards the same end as AODocs: making Google Drive’s permissions less insane than they are by default.

Here are the scopes I have enabled for domain delegation for this project.

Problems addressed by this project:

  • domain account “subbing as” other users AKA masquerading
  • a myriad of Google Drive API calls focused on file permissions
  • watching for changes
  • crawling through directory hierarchy
  • threading of processes to quickly set the right permissions
  • disable re-sharing of files
  • access token refreshing and handling
  • exponential back-off