Christmas Bauble Smoke Detector Prototype

At AnalogFolk, we love building hardware prototypes, especially with the Arduino open source hardware platform. These development boards are quickly getting smaller, lighter and cheaper, so it has become possible for us to build our own smoke detecting smart Christmas bauble.

The brain of our system is the Digispark ATtiny85 microcontroller board, which is currently available for only $1.22 with free shipping.


Our bauble uses the popular MQ-2 gas sensor. Apart from smoke, this sensor module is also sensitive to carbon monoxide, methane, propane, butane, LPG and higher concentrations of hydrogen and alcohol fumes. When the alarm goes off, the module uses light and sound to warn people that their Christmas tree might be on fire.

Here’s the complete shopping list for the Christmas Bauble Smoke Alarm:

  • Digispark ATtiny85 Arduino compatible development board
  • MQ-2 gas sensor module
  • Electronic buzzer
  • LED
  • 2x 220Ω resistors
  • 1000µF capacitor
  • Wires and a breadboard to prototype, then a blank board and solder to finalise the module

To build the circuit, follow this circuit diagram:


The MQ-2 sensor has a small heater inside which needs to warm up before the readings become accurate. For this reason, the smoke alarm starts in calibration mode when it powers up. While calibration is happening, the sensor readings slowly fall until they settle at an accurate value after a minute or so. The Arduino code waits until these values normalise and switches the LED on to let the user know that calibration is in progress. When the LED goes off, the sensor is active and fully calibrated.


For writing this application logic we will be using the Arduino language and the official Arduino IDE. To get started with this environment visit:

The Arduino code that performs the previously outlined behaviour starts by setting up the sensor, the LED and the buzzer and defining their pin numbers:

#define led 10
#define buzzer 8
#define sensor A0

int treshold = 0;
int alarmLength = 40;

void setup() {
  pinMode(led, OUTPUT);
  pinMode(buzzer, OUTPUT);
  pinMode(sensor, INPUT);

  // LED on: calibration in progress
  digitalWrite(led, HIGH);

  // Wait until the readings settle
  while(isCalibrating()){
  }

  treshold = analogRead(sensor) + 10;

  // LED off: the sensor is active and fully calibrated
  digitalWrite(led, LOW);
}

After setting up the components, the setup block enters a while() loop to check the state of the calibration. This is done in the isCalibrating() function:

bool isCalibrating(){
  int sample1 = analogRead(sensor);
  delay(500); // give the readings time to drift between samples
  int sample2 = analogRead(sensor);

  if(sample1 < sample2){
    return true;
  } else if(abs(sample1 - sample2) <= 2) {
    return false;
  }

  return true;
}
As soon as the sensor reading has normalised, the function returns false; we then exit the while loop, set the new threshold for the sensor and continue to the loop() block:

void loop() {
  if(analogRead(sensor) > treshold){
    while(alarmLength >= 0){
      alarmOn(100); // beep with a ~100ms on/off cycle
      alarmLength--;
    }
  } else {
    alarmLength = 40; // reset the alarm for the next event
    alarmOff();
  }
}
Here we simply check whether the current reading from the gas sensor is higher than the threshold and, if so, call the alarmOn() function; otherwise we reset the alarm and call alarmOff(). These alarm functions simply switch the buzzer and the LED on and off:

void alarmOn(int delayMs){
  digitalWrite(led, HIGH);
  analogWrite(buzzer, 100);
  delay(delayMs);

  digitalWrite(led, LOW);
  analogWrite(buzzer, 0);
  delay(delayMs);
}

void alarmOff(){
  digitalWrite(led, LOW);
  analogWrite(buzzer, 0);
}

Uploading this piece of code onto your Arduino will immediately start the smoke alarm.

Consider this as a little festive DIY project but please don't use it as a life saving device! Always make sure you have a professional and tested smoke alarm in your house.

HypeCast – Sports Action Prediction Engine

In Australia, sport is god. But there’s one problem that plagues every die-hard sporting fan down under – and we’re tired of it.

A (Sleepy) Sporting Nation

Australia is one of the most passionate international sporting nations in the world which is particularly impressive because we’re quite literally very far removed from a lot of the action. And so if you follow any of the Northern Hemisphere’s major sporting events – the Tour de France, NBA, UEFA Champions League etc – there’s a high chance you’ll have at some stage swapped sleeping for live sport streaming.

This doesn’t deter the hardcore fans among us but it does lead to a higher propensity for falling asleep in client meetings, groggy eyes and grumpy vibes the next day – especially if you got up at 3am to watch a game that ends up a total fizzer.

Take the 2015 Rugby World Cup.

Hosted in England and predominantly televised between 9pm and 5am (Australian Eastern Standard Time), the hefty percentage of Australian fans tuning in will face the following options: (a) stay up to watch the game and just hope it’s a cracker; or (b) catch the highlights the next day, getting their requisite Zs but missing out on the electricity of the live experience.

It’s Australia’s sleep-deprived sporting debacle.

But what if we could give our country’s sport lovers the best of both worlds?

Gettin’ Techy With It

We wanted to find the perfect middle ground: a way to ensure Australia’s sporting fans don’t miss incendiary sports moments by waking them up – but only when one of those moments is about to happen during a game. A smart alarm, if you will.

First, we dug into real-time data streaming, trawling through game data feeds with information around plays, scores, and game time. But our answer was in the audio feed.

Crowd noise and commentary intensity were excellent indicators of what was happening in a sporting event, where exciting moments were usually accompanied by rushes of noise and heightened levels of commentary.


From there, it was a matter of creating an automated way of analysing the audio from a game to identify the most thrilling moments, which would allow us to distinguish the exciting from the lacklustre without having to physically tune in. In stepped Sydney-based sound and tech gurus Uncanny Valley. We partnered with them to build an algorithm within MaxMSP that could analyse game audio and recognise excitement.

All we needed now was a name (and many hours of testing).

Next Stop: Rugby World Cup

We called it ‘HypeCast’ and first ran our new algorithm on a rugby international match from 2014 – and the results far exceeded our expectations.

HypeCast managed to pick up the critical line breaks and action-packed plays within the game, with the algorithm triggering before 80 per cent of the tries during the match – and on average alerting us a whopping 13 seconds before a try.

As we tested more games, HypeCast’s learning functionality was developed through listening to new commentators, crowds, and broadcasts.


The latest iteration of HypeCast uses five different functions that measure multiple variables within a game’s audio, acting as sub-triggers for the algorithm’s master alarm. Only when a specific combination of sub-triggers is activated does the master alarm go off (like we said, smart alarm).
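As a rough sketch of that idea (the real analysis runs in a MaxMSP patch; the sub-trigger names, thresholds and the three-of-five rule below are all assumptions for illustration, not the actual algorithm):

```javascript
// Hypothetical sketch of the sub-trigger / master-alarm combination.
// All measurement names and thresholds here are invented.
function masterAlarm(t) {
  const subTriggers = [
    t.crowdLoudness > 0.7,   // rush of crowd noise
    t.commentaryPitch > 0.6, // commentator's voice rising
    t.commentaryRate > 0.6,  // faster speech
    t.spectralFlux > 0.5,    // rapid change in the audio spectrum
    t.loudnessJump > 0.5,    // sudden step up in overall level
  ];
  // Only fire when a combination of sub-triggers is active at once
  // (three of five, in this sketch).
  return subTriggers.filter(Boolean).length >= 3;
}
```

In this shape, tuning the alarm becomes a matter of adjusting the thresholds and the number of simultaneous sub-triggers required.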

After the last round of testing, we were getting between 80 and 100 per cent try prediction and excitement accuracy. Sleepless nights no more – HypeCast was working.

For its grand (unofficial) unveiling, we integrated HypeCast into a dedicated Rugby World Cup app called Rugby Rouser. It’s like a rugby-loving mate who’ll alert you when any high-octane match moments are about to happen, plus it keeps you in the loop on live scores and upcoming fixtures. All you need to do is wake up just in time to see the best bits and not a minute sooner.

How does it work? We pass an optical audio feed of the sports broadcast into a Mac mini, where HypeCast lives. From there we send trigger alerts from a server to the Rugby Rouser app, where notifications appear based on user preferences.

Ultimately, the Rugby World Cup is a beta for this project. We’re going to be constantly testing and optimising our systems to increase accuracy, with plans to trial HypeCast across a swag of other Northern Hemisphere sporting events, and researching other potential applications for the technology.

But mostly, we’re looking forward to no more unwarranted early wake-up calls in the name of sport.

That’s it. Now it’s time for a pre-World Cup snooze.

Download the app here: iPhone Android


Digital Shoreditch 2015 – Hardware hacking: Talking plants and more

This post contains the supporting links for the ‘Hardware hacking: Talking plants and more’ MAKE session by AnalogFolk at Digital Shoreditch 2015.


By Des Holmes (@whodadada) & John Kilpatrick (@jjkilpatrick)



Make - Our hardware hacking and technical experiments blog


George the talking plant - Our demanding office plant
Built with Arduino, Node.js, and HTML5 Web Speech API

AnalogFolk Hack Day 2015 – 4th July
Register your interest for the 2015 Hack Day

Services and Platforms

Parse - BaaS (Backend as a Service)
SDKs for iOS, Android, JavaScript, Unity, PHP, Arduino Yún

Temboo - Code the Internet of Everything
Software Stack for Connected Devices, generate production-ready code.

IFTTT - Put the Internet to work for you
Create two types of recipes (If and Do) to control your devices.

Bitbucket - Source control
Free private repos for teams of up to 5 users.



littleBits - Plug and Play DIY electronics
Easiest way to get started with prototyping, build in a matter of minutes.

Tessel - Hardware hacking for Web Developers
Control your hardware with Node.js and the npm package manager.

Spark - Connected hardware
Hardware and Software for building IoT projects.

Register your interest for the AnalogFolk make hardware kits here.

Racing Wheel controlled RC Car

Our latest tech toy is a radio controlled car that can be driven with any USB game controller. We built this car to demonstrate what’s possible with HTML5 and Arduinos. We tested the car with an Xbox gamepad and a Logitech racing wheel. The raw gamepad control data is retrieved from the browser’s navigator object, so any other front-end trick could be used to drive the car, like a JavaScript joystick or voice control with the HTML5 Web Speech API.


The work started with taking the original radio and motor drive modules and the antenna out of the car. We were left with two motors, one for steering and one for acceleration, and the plan was to rebuild the whole architecture using Arduinos. We succeeded: the motors are now driven by a powerful H-bridge module (L298N) and the wireless connection is handled by two nRF24L01 antennas. The Arduino and the motors are both powered from the original 6V battery pack (4x AA).


The onboard Arduino Nano receives wireless messages from an Arduino UNO connected to a MacBook Pro. This laptop runs a Node.js server that has two main tasks in this workflow: firstly, it sets up a realtime connection with the browser using web sockets and the node module; secondly, it passes this control data on to the Arduino UNO through the USB port. Serial messages can very easily be sent through the port with the serial node module. Essentially, this server is the link between the browser and the Arduino UNO.
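To illustrate the serial half of that link, here’s a minimal sketch of how each control update could be packed into a message before being written to the USB port. The comma-separated “steering,throttle” format and the 0–255 value range are assumptions for this sketch, not the exact protocol of our build:

```javascript
// Pack one control update into a hypothetical serial message:
// "steering,throttle\n", both values clamped to the 0-255 range.
function toSerialMessage(steering, throttle) {
  const clamp = (v) => Math.max(0, Math.min(255, Math.round(v)));
  return `${clamp(steering)},${clamp(throttle)}\n`;
}

// With a serial port library the server would then write it out, e.g.:
// port.write(toSerialMessage(128, 200));
```

Keeping the message a single delimited line makes it trivial to parse on the Arduino side with a split on the comma.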


In the browser we don’t have to worry about too many things. The raw gamepad data is available natively from the navigator.getGamepads() method, which returns an array of JavaScript objects. Each element in the array is a gamepad data object, and each object holds the x, y, z analogue values of the joysticks, pedals or wheels and the binary values of all the buttons.
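A raw axis value from navigator.getGamepads() ranges from -1 to 1, so it needs remapping before it can drive a motor. A minimal sketch, assuming a 0–255 motor range and a small hypothetical dead zone:

```javascript
// Map a gamepad axis value (-1..1) to a 0-255 motor value, with a small
// dead zone around centre so the car doesn't creep when the wheel is idle.
function axisToMotorValue(axis, deadZone = 0.05) {
  if (Math.abs(axis) < deadZone) return 128;       // neutral position
  const clamped = Math.max(-1, Math.min(1, axis)); // guard out-of-range input
  return Math.round((clamped + 1) * 127.5);        // -1 -> 0, 1 -> 255
}
```

The same mapping works for the steering axis and the accelerator pedal alike, so one helper covers both motors.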

We put together a quick, one and a half minute video showing the build process and the first test drives.

This project is not yet finished. We are planning to install an onboard wireless camera that will be streamed to a monitor in front of the racing wheel for a true simulator experience, as if you were sitting in the seat of the toy car. Adding a pair of red brake lights and white headlamps, two temperature sensors for the motors or a buzzer for a horn is also in the plan.

Social Desk Lamp, with Spark Core and IFTTT

Following on from our article on Spark’s new product, the Electron, we’ve been busy creating a quick prototype with their existing product, the Spark Core. Combined with a 12V RGB LED strip, it lets us demonstrate what’s possible with a Spark Core and the ‘If This Then That’ (IFTTT) service.

A cheap IKEA lamp has been completely stripped to make space for the Spark Core and the LED lights. The lamp is connected to the Internet and to the Spark Cloud, where the Core receives function calls from IFTTT. The lamp works as a regular desk lamp but blinks red a couple of times when a Gmail message is received, blue when there’s a new Facebook notification, light blue at each new Twitter mention and green when a photo on Instagram has been liked:


The circuit is simple: power is supplied from a 12V adaptor to the LED strip and regulated down to 5V for the Core. Three 2N7000 transistors are used to switch the 12V for each colour with the 3.3V logic-level signal from the Core.


This signal can set each LED channel to 256 different brightness values, which means over 16 million different colours for the lamp. Plenty for any project!
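The arithmetic behind that figure is simply 256 levels per channel across three channels. A tiny sketch, with placeholder RGB values for the notification colours described above (the exact values we used aren’t shown here):

```javascript
// 8-bit PWM per channel: 256 levels each for red, green and blue.
const levelsPerChannel = 256;
const totalColours = levelsPerChannel ** 3; // 16,777,216

// Placeholder RGB values for each notification type.
const notificationColours = {
  gmail:     [255, 0,   0],   // blink red
  facebook:  [0,   0,   255], // blink blue
  twitter:   [0,   128, 255], // blink light blue
  instagram: [0,   255, 0],   // blink green
};
```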

Spark introduce the Electron, a cellular IoT development kit

Spark, the small startup from San Francisco, has always been exciting to follow. They debuted their first Wi-Fi enabled development board, the Core, on Kickstarter two years ago.

The Core was the first affordable and reliable tool to connect hardware projects to the Internet. Reading temperature sensors in your living room, or switching lights off from anywhere around the world has never been easier. With the code being stored in the cloud, you can even change your program while out and about. Being Arduino compatible, you don’t have to worry about learning a new language and can hit the ground running.


The Spark Photon, the second generation of the Core, has similar properties, except they halved the price to $19 and made it almost twice as fast.


This February Spark started a new Kickstarter campaign to fund their latest idea, the Electron. They decided to go back to Kickstarter to increase awareness of the product and to get the community excited about the new opportunities it holds.

Spark Electron

The new Electron aims to solve the obvious weakness of both the Core and the Photon: Wi-Fi isn’t available everywhere, and that’s quite restricting.


The Electron features an onboard cellular antenna to connect to the Internet wherever a 2G or 3G network is available. With the Electron you get a special M2M SIM card and an affordable data plan to go with it. M2M (Machine to Machine) communication has traditionally been available only to large companies dealing with fleet management, point of sale, vending machines, alarms, smart meters and much more, never to individuals. Clear benefits are reliability, consistency and increased security.


The Electron will be able to receive commands and send status updates or sensor readings to you or another device over the cellular network in the form of text messages. You will be able to connect your drone and its sensors to your phone, monitor your bird house or greenhouse, or track your stolen bike, all without being connected to Wi-Fi.


The other exciting piece of news from Spark is the ‘If This Then That’ support. IFTTT is an amazing service that allows you to create personal “recipes” consisting of a Trigger event and an Action. Triggers are notifications from popular services like Facebook, Instagram, Twitter, Google Calendar or Gmail, or events from physical devices like the Nest thermostat, the Netatmo personal weather station or all of WeMo’s smart home accessories. Actions can also be chosen from a list of dozens and dozens of services and devices, 167 channels in total.


For Spark developers this means that they can now connect their hardware projects to all 167 channels and set up hundreds of different tasks in a couple of minutes. Here are a few examples:

  • Send a notification to my Android smartwatch if my home temperature drops below 16 degrees
  • Email me when the dog’s water bowl is empty
  • Blink a green light when a new commit has been pushed to my GitHub repository, and orange after a fork
  • Create a calendar event or update my to-do list when the soil moisture sensor reads dry, reminding me to water my plants
  • Log home sensor readings to my Google Drive

George, the talking plant

George has light, temperature, soil moisture and two motion sensors for measuring environmental properties. He complains if any of the values from the sensors are excessive and is able to interact with people by answering simple questions. George has a simple face in the form of an 8×8 monochrome LED board that can display basic expressions as well as turn his eyes towards approaching people.
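George’s complaint logic can be sketched as a handful of threshold rules whose output is handed to the HTML5 SpeechSynthesis API. The thresholds and phrases below are invented for illustration:

```javascript
// Hypothetical threshold rules for George's complaints. The sensor names
// match the hardware list; the limits and phrases are made up.
function complaintFor(readings) {
  if (readings.soilMoisture < 20) return "I'm parched. Water me!";
  if (readings.temperature > 28)  return "It's far too hot in here.";
  if (readings.light < 10)        return "Who turned the lights off?";
  return null; // nothing excessive: George stays quiet
}

// In the browser, the returned phrase would be spoken with, e.g.:
// speechSynthesis.speak(new SpeechSynthesisUtterance(phrase));
```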

Hardware components used:

  • Arduino UNO
  • Light sensor
  • LM35 thermometer
  • Soil moisture sensor
  • 2x PIR motion sensors
  • 8×8 LED board with a MAX7219 control chip
  • Macbook Pro

Software technologies used:

  • Arduino IDE
  • Node / Express to run an HTTP server
  • USB Serialport library
  • library
  • HTML5 Web Speech API (SpeechSynthesis and SpeechRecognition)
  • JavaScript

iBeacon Prototype – AnalogFolk London office tour

We recently built a prototype iBeacon app with Apple’s new programming language, Swift. As a hack project, we used Estimote Bluetooth Beacons to create a three-zone setup which delivers location-aware content to the user. We’re now focusing on extending this functionality, and building a contextual app for the entire office that will leverage the full power of indoor location services.

Gesture controls with LEAP Motion

As part of our exploration into gesture control we introduced the ability to control the AnalogFolk website via a LEAP Motion controller.

LEAP Motion controls for the AnalogFolk website

Ensure that your LEAP Motion controller is plugged in and working, then:

  • Scroll up / down: Hold one hand over the controller and with two fingers extended move your hand up and down.
  • Carousel and pagination: Make a swiping gesture with one hand to flick between carousel slides.
  • Select an item: To select an item make a clockwise motion with your index finger, to move backwards through the items do the reverse gesture.
  • Open: To open a highlighted link make a tapping motion with your index finger.
  • Close: To close a page swipe your index finger left or right.