Category Archives: I.T.

  • Multiaxis symmetrical drawing – A Mandala maker that doesn’t suck

    I’ve had a terrible time finding a good piece of software to draw mandalas with. To be honest, I don’t care what mandalas are, but I’m obsessed with how cool it is to draw with replicated symmetry on multiple axes.

    Without further ado, here it is (drag your mouse to draw):

    I hope you find it addictive. Click to pop out.
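    For the technically curious, the heart of multiaxis symmetry is tiny: every stroke point is replicated by rotating it around the canvas center once per axis, plus an optional mirrored copy for kaleidoscope-style reflection. Here is a Python sketch of that replication (the tool itself is Javascript on a canvas, and these function names are just illustrative):

```python
import math

def replicas(x, y, cx, cy, axes, mirror=True):
    """Replicate point (x, y) around center (cx, cy) with n-fold symmetry."""
    dx, dy = x - cx, y - cy
    points = []
    for k in range(axes):
        a = 2 * math.pi * k / axes
        cos_a, sin_a = math.cos(a), math.sin(a)
        # rotated copy of the point
        points.append((cx + dx * cos_a - dy * sin_a,
                       cy + dx * sin_a + dy * cos_a))
        if mirror:
            # reflected copy, for kaleidoscope-style symmetry
            points.append((cx + dx * cos_a + dy * sin_a,
                           cy + dx * sin_a - dy * cos_a))
    return points
```

    Drawing a line segment between consecutive mouse positions, for every replica of the pair, is all it takes to get the mandala effect.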

    So wow… just wow, this blew up. This little tool ended up making the front page of Reddit in one amazing thread in which many people shared their mandalas. It was an amazing day in many ways. First of all, I’ve never seen so many positive comments in a single thread online; the number of people who seem to have been positively touched by this program is humbling. Drawing mandalas is apparently great stress relief for many, and I’ve received several personal notes on how much this program has done for them. I did not see that coming, to say the least. Then the fact that this tool was picked up by real artists pushed it to create things I didn’t even know it was capable of. Lastly, my solar powered Raspberry Pi handled hundreds of thousands of connections in a single day, which turned out to be a technical challenge on top of the overwhelming response. When I set out to create this program, I did not have the slightest idea that it would hit such a sweet spot; I mainly wanted to scratch an itch and couldn’t find any good apps out there. It is a true privilege to have had the chance to see so many people use a tool I made, and to have them report they were positively touched by it.

    Here are a few of the most amazing mandalas that were posted in the Reddit thread; this is what it looks like when real artists take over your tool :)


  • At the junction of I.T. & homesteading – continued


    Figuring out a good, repeatable & maintainable way to deploy Pi Zeros.

    My favorite project screws in action.

    The boxes I picked are very tight and leave no room for any other hardware.

    I made a hole for a cable gland, which is very helpful for cable strain relief, removing friction on sharp edges and making a proper cable entryway.

    This little guy is only monitoring temperature; I’ll need a bigger box for the greenhouse device as it needs a bit more hardware.

  • At the junction of I.T. & homesteading

    I started acquiring Raspberry Pi Zeros in order to figure out a consistent deployment scheme for the various automation related projects I envision for our homestead.

    For now I’ve simply deployed 2 DS18B20 temperature sensors: one on the existing Pi in the solar shed which serves this blog, and another on a Pi Zero in the house. It’s only sensing for now, which complements the data I’m gathering from the solar array.

    The Pi Zero consumes between 0.1 and 0.2 amps.

    Sample data being gathered.

    Here are my current install notes for the Pi Zero.

    To limit power consumption, add this to /etc/rc.local to turn off HDMI output:

    /usr/bin/tvservice -o

    To be able to read from the temperature probe, add the following line to /boot/config.txt (the standard 1-wire overlay for DS18B20 probes):

    dtoverlay=w1-gpio
    Get the python-w1thermsensor package

    sudo apt-get install python-w1thermsensor

    Reboot & make sure devices are listed in /sys/bus/w1/devices

    The Python code necessary to read the probe is:

    from w1thermsensor import W1ThermSensor
    # assuming only 1 sensor
    sensor = W1ThermSensor.get_available_sensors( [W1ThermSensor.THERM_SENSOR_DS18B20] )[0]
    temperature = sensor.get_temperature()
    if temperature is not None:
        print '%.1f' % (temperature)
    else:
        print "failed to get reading."
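From there, getting readings graphed starts with logging them somewhere; a hypothetical sketch (Python 3 style), where `read_temp` stands in for the sensor call above and `writer` for a file’s write method:

```python
import time

def log_readings(read_temp, writer, interval=60, samples=3):
    """Poll read_temp every `interval` seconds, appending "timestamp,temp" rows."""
    for _ in range(samples):
        # one CSV row per sample: unix timestamp, temperature to 1 decimal
        writer("%d,%.1f\n" % (time.time(), read_temp()))
        time.sleep(interval)
```

Pointed at a file on the Pi, this produces exactly the kind of time series the solar graphs are built from.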
  • Nosy Monster

    Robin & I have been working on a rover for the land since his toy RC car broke. I opened it up to see if I could fix it and, as with many things, I quickly came to the conclusion that “I’ll just throw a Pi in there and do it myself”.

    Here’s the supposedly amphibious piece of shit that broke within 1 hour of use.

    The motors still worked, so I bought a Raspberry Pi Zero with a Pi cam and some super cheap SunFounder relays.

    From the ground up

    Before anything else, we introduced the notion of a relay. In the past we used Lego motors and batteries to apply power directly to actuators and create little robots. I just snipped one of the wires and had Robin make contact manually so he could see the correlation between a closed circuit and the motor going.


    With this “manual relay” in mind, we added a Pi controlled relay to make him realize that what the new gizmos do is what he was doing by hand.


    Ok, we have a web controlled Lego motor going. Let’s see if we can replicate it with the RC car’s motors.

    First, the manual relay.


    Then with the Pi controlled relays. Our first iteration looked like this and had a few issues: I had separated the circuit powering the DC motors, and each was powered by only 1 AA battery. I also had many adjustments to make in the logic.

    Eventually, by adding a DROK voltage regulator, I was able to power everything from a single USB charger and prevent the motors from affecting the rest of the circuits.

    But the extra hardware is hard to fit in the Nosy Monster, so it’s unlikely that I will be able to fit the solar panel that would turn it into a completely autonomous robot. So I started googling for other potential frames and OH GOD I JUST STUMBLED INTO THE WORLD OF RC ROBOTICS. Oops…

    In any case, I broke the controls down into a step by step process. Instead of pressing “Go” and “Stop”, pressing “Go” will make it go for 1 second. There are 2 reasons for this. First, web based control introduces delays which make for a shitty live driving experience. Second, I would like this to behave like an actual rover on another planet: it reports back its sensor status and humans decide on the next steps to follow. Heck, I’m even thinking the next steps could be something that is voted on online. This would not be possible with “live” control.
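The “go for 1 second” behavior boils down to a timed relay pulse; a hypothetical sketch, with `set_relay` standing in for whatever actually toggles the GPIO pin:

```python
import time

def pulse(set_relay, duration=1.0):
    """Drive a motor relay for a fixed burst instead of live control.

    Each "Go" from the web UI translates into exactly one such burst,
    which sidesteps web latency and mimics rover-style command batches.
    """
    set_relay(True)       # close the relay: motor runs
    time.sleep(duration)  # fixed-length burst
    set_relay(False)      # open the relay: motor stops
```

A voted-on command queue would just call this once per approved command.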


  • Adding collaborative editing to the Ace web code editor with web sockets

    Using Ace‘s excellent API, it is relatively easy to enhance it to allow for live collaborative editing.

    The gist of what we’re doing here is to use Ace’s API to extract and apply deltas when changes occur in the editor. We then simply transmit them over a websocket to which all clients are connected. This example is functional but in no way comprehensive of what full collaborative code editing could be. It’s meant to be simple, thus understandable. It’s a great starting point for whatever other pieces of functionality you want to send across web sockets.

    Loading Ace in a webpage with some custom Javascript

    This is what your web page looks like: load Ace as instructed and add Javascript to handle interaction with the websocket server.

    <!DOCTYPE html>
    <html lang="en">
        <head>
            <title>Collaborative Ace Coding!</title>
            <style type="text/css" media="screen">
                #editor {
                    position: absolute;
                    top: 0;
                    right: 0;
                    bottom: 0;
                    left: 0;
                }
            </style>
            <script src="https://<?=$_SERVER['HTTP_HOST']?>:1337/socket.io/socket.io.js"></script>
            <script src="ace-builds/src/ace.js" type="text/javascript" charset="utf-8"></script>
            <script src="ace-builds/src/ext-language_tools.js"></script>
            <script type="text/javascript">
                var session_id = null ;
                var editor = null ;
                var collaborator = null ;
                var buffer_dumped = false ;
                var last_applied_change = null ;
                var just_cleared_buffer = null ;

                function Collaborator( session_id ) {
                    this.collaboration_socket = io.connect( "", {query:'session_id=' + session_id} ) ;

                    this.collaboration_socket.on( "change", function(delta) {
                        delta = JSON.parse( delta ) ;
                        last_applied_change = delta ;
                        editor.getSession().getDocument().applyDeltas( [delta] ) ;
                    }.bind() ) ;

                    this.collaboration_socket.on( "clear_buffer", function() {
                        just_cleared_buffer = true ;
                        console.log( "setting editor empty" ) ;
                        editor.setValue( "" ) ;
                    }.bind() ) ;
                }

                Collaborator.prototype.change = function( delta ) {
                    this.collaboration_socket.emit( "change", delta ) ;
                }

                Collaborator.prototype.clear_buffer = function() {
                    this.collaboration_socket.emit( "clear_buffer" ) ;
                }

                Collaborator.prototype.dump_buffer = function() {
                    this.collaboration_socket.emit( "dump_buffer" ) ;
                }

                function body_loaded() {
                    session_id = "meow" ;
                    editor = ace.edit( "editor" ) ;
                    collaborator = new Collaborator( session_id ) ;

                    // registering change callback
                    editor.on( "change", function( e ) {
                        // TODO: we could make this more efficient and less likely to conflict by keeping track of change IDs
                        if( last_applied_change!=e && !just_cleared_buffer ) {
                            collaborator.change( JSON.stringify(e) ) ;
                        }
                        just_cleared_buffer = false ;
                    }, false );

                    editor.setTheme( "ace/theme/monokai") ;
                    editor.$blockScrolling = Infinity ;
                    collaborator.dump_buffer() ;
                    document.getElementsByTagName('textarea')[0].focus() ;
                    last_applied_change = null ;
                    just_cleared_buffer = false ;
                }
            </script>
        </head>
        <body onLoad="body_loaded()">
            <div id="editor"></div>
        </body>
    </html>

    Parallel to this, run the following Node.js server script

    Following is the Node.js websocket server which must be instantiated on the same server serving the web page above. It needs to be up for the page above to work.

    1. Make sure to have port 1337 open in the same capacity as ports 80 & 443; this is what the server listens on.
    2. Make sure to update the paths to the SSL certs; we use SSL on the websocket server so browsers can run the websocket Javascript regardless of whether their original context is SSL or not.
    3. You need to have Socket.IO installed.
    // config variables
    verbose = false ;
    session_directory = "/tmp" ; // it has to exist

    /* https specific */
    var https = require('https'),
        fs =    require('fs');
    var options = {
        key:    fs.readFileSync('/path/to/your/ssl.key'),
        cert:   fs.readFileSync('/path/to/your/ssl.crt'),
        ca:     fs.readFileSync('/path/to/your/CA.crt')
    };
    var app = https.createServer(options);
    io = require('socket.io').listen(app);     // server listens to https connections
    app.listen(1337, "");

    // will use the following for file IO
    var fs = require( "fs" ) ;

    //io = require('socket.io').listen(2015) ;

    if( verbose ) { console.log( "> server launched" ) ; }

    collaborations = [] ;
    socket_id_to_session_id = [] ;

    io.sockets.on('connection', function(socket) {

        var session_id = socket.manager.handshaken[socket.id].query['session_id'] ;
        socket_id_to_session_id[socket.id] = session_id ;

        if( verbose ) { console.log( session_id + " connected on socket " + socket.id ) ; }

        if( !(session_id in collaborations) ) {
            // not in memory, but is it on the filesystem?
            if( file_exists(session_directory + "/" + session_id) ) {
                if( verbose ) { console.log( "   session terminated previously, pulling back from filesystem" ) ; }
                var data = read_file( session_directory + "/" + session_id ) ;
                if( data!==false ) {
                    collaborations[session_id] = {'cached_instructions':JSON.parse(data), 'participants':[]} ;
                } else {
                    // something went wrong, we start from scratch
                    collaborations[session_id] = {'cached_instructions':[], 'participants':[]} ;
                }
            } else {
                if( verbose ) { console.log( "   creating new session" ) ; }
                collaborations[session_id] = {'cached_instructions':[], 'participants':[]} ;
            }
        }
        collaborations[session_id]['participants'].push( socket.id ) ;

        socket.on('change', function( delta ) {
            if( verbose ) { console.log( "change " + socket_id_to_session_id[socket.id] + " " + delta ) ; }
            if( socket_id_to_session_id[socket.id] in collaborations ) {
                collaborations[socket_id_to_session_id[socket.id]]['cached_instructions'].push( ["change", delta] ) ;
                for( var i=0 ; i<collaborations[session_id]['participants'].length ; i++ ) {
                    if( socket.id!=collaborations[session_id]['participants'][i] ) {
                        io.sockets.socket(collaborations[session_id]['participants'][i]).emit( "change", delta ) ;
                    }
                }
            } else {
                if( verbose ) { console.log( "WARNING: could not tie socket_id to any collaboration" ) ; }
            }
        }) ;

        socket.on('change_selection', function( selections ) {
            if( verbose ) { console.log( "change_selection " + socket_id_to_session_id[socket.id] + " " + selections ) ; }
            if( socket_id_to_session_id[socket.id] in collaborations ) {
                for( var i=0 ; i<collaborations[session_id]['participants'].length ; i++ ) {
                    if( socket.id!=collaborations[session_id]['participants'][i] ) {
                        io.sockets.socket(collaborations[session_id]['participants'][i]).emit( "change_selection", selections ) ;
                    }
                }
            } else {
                if( verbose ) { console.log( "WARNING: could not tie socket_id to any collaboration" ) ; }
            }
        }) ;

        socket.on('clear_buffer', function() {
            if( verbose ) { console.log( "clear_buffer " + socket_id_to_session_id[socket.id] ) ; }
            if( socket_id_to_session_id[socket.id] in collaborations ) {
                collaborations[socket_id_to_session_id[socket.id]]['cached_instructions'] = [] ;
                for( var i=0 ; i<collaborations[session_id]['participants'].length ; i++ ) {
                    if( socket.id!=collaborations[session_id]['participants'][i] ) {
                        io.sockets.socket(collaborations[session_id]['participants'][i]).emit( "clear_buffer" ) ;
                    }
                }
            } else {
                if( verbose ) { console.log( "WARNING: could not tie socket_id to any collaboration" ) ; }
            }
        }) ;

        socket.on('dump_buffer', function() {
            if( verbose ) { console.log( "dump_buffer " + socket_id_to_session_id[socket.id] ) ; }
            if( socket_id_to_session_id[socket.id] in collaborations ) {
                // replay the whole cached instruction log to the requester
                for( var i=0 ; i<collaborations[socket_id_to_session_id[socket.id]]['cached_instructions'].length ; i++ ) {
                    socket.emit( collaborations[socket_id_to_session_id[socket.id]]['cached_instructions'][i][0], collaborations[socket_id_to_session_id[socket.id]]['cached_instructions'][i][1] ) ;
                }
            } else {
                if( verbose ) { console.log( "WARNING: could not tie socket_id to any collaboration" ) ; }
            }
            socket.emit( "buffer_dumped" ) ;
        }) ;

        socket.on('disconnect', function () {
            console.log( socket_id_to_session_id[socket.id] + " disconnected" ) ;
            var found_and_removed = false ;
            if( socket_id_to_session_id[socket.id] in collaborations ) {
                var index = collaborations[socket_id_to_session_id[socket.id]]['participants'].indexOf( socket.id ) ;
                if( index>-1 ) {
                    collaborations[socket_id_to_session_id[socket.id]]['participants'].splice( index, 1 ) ;
                    found_and_removed = true ;
                    if( collaborations[socket_id_to_session_id[socket.id]]['participants'].length==0 ) {
                        if( verbose ) { console.log( "last participant in collaboration, committing to disk & removing from memory" ) ; }
                        // no one is left in this session, we commit it to disk & remove it from memory
                        write_file( session_directory + "/" + socket_id_to_session_id[socket.id], JSON.stringify(collaborations[socket_id_to_session_id[socket.id]]['cached_instructions']) ) ;
                        delete collaborations[socket_id_to_session_id[socket.id]] ;
                    }
                }
            }
            if( !found_and_removed ) {
                console.log( "WARNING: could not tie socket_id to any collaboration" ) ;
            }
            console.log( collaborations ) ;
        }) ;
    }) ;

    function write_file( path, data ) {
        try {
            fs.writeFileSync( path, data ) ;
            return true ;
        } catch( e ) {
            return false ;
        }
    }

    function read_file( path ) {
        try {
            var data = fs.readFileSync( path ) ;
            return data ;
        } catch( e ) {
            return false ;
        }
    }

    function file_exists( path ) {
        try {
            stats = fs.lstatSync( path ) ;
            if (stats.isFile()) {
                return true ;
            }
        } catch( e ) {
            return false ;
        }
        // we should not reach that point
        return false ;
    }
  • Using Google’s APIs with Python scripts

    I was never able to find centralized, succinct and example based documentation for doing domain delegated API calls with Google. Hopefully what follows is exactly that documentation, assembled from all the pieces I gathered along the way.

    Service Account Creation

    1. Go to the Google Developers Console and create a new project.
    2. Call it whatever you want.
    3. Enable the APIs that this project will use. We’ll do the Drive API for the purpose of this testing.
    4. Go to the “Credentials” screen.
    5. Create a “Service Account Key”.
    6. Make it a “New service account” and give it a name.
    7. Download the JSON file that follows.
      This file contains the credentials for the account you just created; treat it with care, as anyone getting their hands on it can authenticate as the account. This is especially critical since we are about to grant domain delegation to the account we created: anyone with access to this file is essentially able to run any API call masquerading as anyone in your Google Apps domain. This is, for all intents and purposes, a root account.

    Domain Delegation

    1. Back on the “Credentials” screen, click “Manage service accounts”.
    2. Edit the service account you just created.
    3. Check the “Enable Google Apps Domain-wide Delegation” checkbox and click “Save”. Google at this point needs a product name for the consent screen, so be it.
    4. At this point, if everything went well, when you go back to the “Credentials” screen you will notice that Google created an “OAuth 2.0 client ID” that is paired with the service account you created.

    Domain delegation continued, configuring API client access

    Granting domain delegation to the service account as we just did isn’t enough; we now need to specify the scopes for which the account can request delegated access.

    1. Go to your Google Apps domain’s Admin console.
    2. Select the Security tab.
    3. Click “Show more” -> “Advanced Settings”.
    4. Click “Manage API client access”.
    5. In the “Client Name” field, use the “client_id” field from the JSON file you downloaded earlier. You can get it via the following command:
      cat ~/Downloads/*.json | grep client_id | cut -d '"' -f4

      In the “One or More API Scopes” field use the following scope (the standard Drive scope):

      https://www.googleapis.com/auth/drive

      If you want to allow more scopes, comma separate them. This interface is very finicky: only enter URLs, and don’t copy/paste the descriptions that show up for previous entries. There also might be a few minutes’ delay between you granting a scope and it taking effect.

    6. Click “Authorize”; you should get a new entry listing your client ID and its scopes. If you need to find the URL for a scope, Google’s OAuth 2.0 scopes reference is helpful.

    Scripting & OAuth 2.0 authentication

    Okay! The account is all set up on the Google side of things; let’s write a Python script to use it. Here’s your starting point:

    This script contains all the functions to get you started with making API calls to Google with Python. It isn’t the simplest form it could be presented in, but it solves a few issues right off the bat:

    • All Google interactions are in the “google_api” class, which allows for efficient use of tokens. When “subbing as” a user in your domain, the class will keep track of access tokens per user and only re-generate them when they expire.
    • Exponential back-off is baked in and generalized to anything unusual gotten back from Google (based on observation).
    • SIGINT is handled properly.
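For illustration, the back-off pattern amounts to something like this (a generic sketch, not the script’s actual code; the names here are made up):

```python
import random
import time

def with_backoff(call, max_retries=5, base=1.0):
    """Retry `call` on any failure, doubling the wait each attempt.

    Waits base * 2**attempt seconds plus random jitter, which is the
    standard exponential back-off Google recommends for its APIs.
    """
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries, surface the error
            time.sleep(base * 2 ** attempt + random.random() * base)
```

Wrapping every API call in something like this is what keeps batch scripts alive through rate limits and transient 5xx responses.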

    Before running the script, you may need to:

    sudo apt-get update && sudo apt-get install python-pycurl

    Running the script is done as such:

    ./ /path/to/json/file/you/downloaded/earlier.json

    It will simply run the “get about” Drive API call and print the result. This should allow you to verify that the call was indeed executed as the account you specified in the arguments.

    Once you’ve run this script once, the sky is the limit: all the Drive API calls can be added to it based on the get_about function.

    Important note on scopes: just as you granted domain delegation to certain comma separated scopes in the Google Apps Admin console earlier, this script needs to reflect the scopes being accessed, and the same space separated list of scopes needs to be part of your JWT claim set (line 78 of the script). So if you need to make calls against more than just Drive, make sure to update the scopes in both locations or your calls won’t work.
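For illustration, here is roughly what that claim set looks like (a sketch, not the script’s exact code; `make_claim_set` is a hypothetical helper, and the audience URL is the token endpoint Google documents for the service-account JWT flow):

```python
import time

def make_claim_set(service_account_email, sub_user, scopes):
    """JWT claim set for a Google service account acting as sub_user.

    scopes is a list; Google expects them space separated here,
    but comma separated in the Admin console grant.
    """
    now = int(time.time())
    return {
        "iss": service_account_email,
        "sub": sub_user,                  # the domain user we masquerade as
        "scope": " ".join(scopes),        # space separated, per the note above
        "aud": "https://www.googleapis.com/oauth2/v4/token",
        "iat": now,
        "exp": now + 3600,                # max lifetime Google accepts
    }
```

The signed form of this claim set is what gets exchanged for the per-user access tokens the class caches.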

    More scopes & more functions

    Taking it one step further with the Google Enforcer. This is the project that led me down the path of writing my own class to handle Google API calls. While it is not quite ready for public use, I’m publishing the project here as it is an excellent reference for making all kinds of other Google API calls: some doing POSTs, PUTs, DELETEs, some implementing paging, et cetera.


    The purpose of this project is to enforce on-the-fly permissions on a directory tree. There is an extravagant number of gotchas to figure out to do this. If you are interested in implementing it in your organization, please leave a comment and I can either help or get it ready for public use depending on interest.

    This project works towards the same end as AODocs: making Google Drive’s permissions not as completely insane as they are by default.

    Here are the scopes I have enabled for domain delegation for this project.

    Problems addressed by this project:

    • domain account “subbing as” other users AKA masquerading
    • a myriad of Google Drive API calls focused on file permissions
    • watching for changes
    • crawling through directory hierarchy
    • threading of processes to quickly set the right permissions
    • disable re-sharing of files
    • access token refreshing and handling
    • exponential back-off
  • IPv6 link-local address to MAC address online converter

    The converter

    It can also be addressed directly via its URL for all your API needs.


    This converter was implemented per Dave Russell’s suggestion as a follow up to the MAC address to IPv6 link-local address online converter. If you are interested in the steps behind this conversion, they are simply the reverse of the original MAC->IPv6 converter.

    Please note that of the various IPv6 notations, the one this script will expect is fe80::xxxx:xxxx:xxxx:xxxx.
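For reference, the reversal itself is only a handful of lines; a Python sketch of the same EUI-64 unwinding the converter performs (the online tool’s actual code is not shown here):

```python
import ipaddress

def linklocal_to_mac(addr):
    """Reverse EUI-64: fe80::xxxx:xxxx:xxxx:xxxx -> MAC address."""
    # interface identifier = the last 64 bits of the address
    iid = ipaddress.IPv6Address(addr).packed[8:]
    # EUI-64 inserts ff:fe in the middle; it must be there to reverse
    if iid[3:5] != b"\xff\xfe":
        raise ValueError("not an EUI-64 derived link-local address")
    eui = iid[0:3] + iid[5:8]
    # flip the universal/local bit (bit 7 of the first octet) back
    octets = [eui[0] ^ 0x02] + list(eui[1:])
    return ":".join("%02x" % o for o in octets)
```

For example, fe80::0211:22ff:fe33:4455 maps back to 00:11:22:33:44:55.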

  • Remove all Exif data from JPEGs recursively

    Because I always spend 20 minutes googling it:

    apt-get update && apt-get install libimage-exiftool-perl
    find /var/www -type f -iname '*.jpg' -exec exiftool -all= {} \;
  • IT rant

    I don’t know why it is that every time I want to download the simplest of modules, I’m asked to download this great new package manager to end all package managers. Which results in this glorious 3 fold install sequence: “Installing is easy, Composer will take care of all dependencies! Oops, it doesn’t, but PECL will! Oops, just kidding, I guess we’re back to apt.”

    Not to mention all 3 package managers are independent from one another, so changes in one don’t percolate to the others, thus throwing their mission to resolve dependencies out the window.

    I’m flabbergasted by this trend and its unquestioned acceptance.

  • A solar powered blog

    This blog is now powered by a Raspberry Pi using 100% solar energy. Nicole instrumented the Phidgets sensors so we would gain some visibility into our electricity production & consumption. This has already given us some great insights. We can see the effect that each device we use has on the system: how much the LED lamps take to charge, the hole that the inverter blasts through the battery when turned on. We can tell that not all sunny days are created equal in their ability to give a charge. We can even tell the increase in electricity consumption that rsyncing a whole bunch of data to the Pi has: 0.03A.

    The sensors

    • solar panels volts (a good indicator of sunlight)
    • input amps (indicates when the charge controller uses produced electricity)
    • output amps / load (what we consume with various devices)
    • battery volts (whether this blog will make it through the night or not)
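Those load samples lend themselves to simple consumption math; a small sketch (assuming samples come in as (seconds, amps) tuples, which is my framing, not the actual sensor format) integrating amps into amp-hours with the trapezoid rule:

```python
def amp_hours(samples):
    """Integrate (t_seconds, amps) samples into amp-hours (trapezoid rule)."""
    total = 0.0
    for (t0, a0), (t1, a1) in zip(samples, samples[1:]):
        # average current over the interval, times hours elapsed
        total += (a0 + a1) / 2.0 * (t1 - t0) / 3600.0
    return total
```

One hour at a steady 0.2 A comes out to 0.2 Ah, which is how a figure like the 0.03 A rsync bump can be turned into a battery-budget number.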

    For now I’m only graphing using the Gnuplot one-liner from Hell. More to come…


    It blows my mind way too hard that I have a system in which sunlight comes in and organized information comes out. And by organized information I mean lolcats.