Sheesh! Making a wireless sensor has proven to be a lot harder than I had expected.
Like a lot of weekend hardware hackers I thought it would be fun to build a wireless temperature sensor. I could use it as a feedback mechanism for my Raspberry Pi heating controller, I could create graphs of temperatures in different rooms (everyone loves a graph right?), and with a little bit of forward planning I could make a fairly useful Arduino breakout board which could be used for lots of other fun wireless projects.
I went through four iterations of the design before I finally found something that worked. I could have avoided some of this if I’d spent more time testing on breadboard, but then I wouldn’t have got to order colourful PCBs!
In common with a lot of my other weekend projects, this is very likely to be a white elephant, so it needs to be cheap to start with. There are lots of excellent wireless sensors already available; the Moteino is well regarded. Using that as a baseline, could I build something similar for less than £5 per unit? The answer is “nearly” – see the Bill Of Materials below.
Use a ready made microcontroller board
I didn’t want to have to design a board which I would then have to solder an ATmega328/P to myself. Laying out that board would be hard, plus I would have to spec and solder all the supporting components. I could have done that with through-hole components I suppose, but then the board would have been massive, and I very much doubt it would have worked out any cheaper. Instead, I decided that a better idea was to solder on a ready-made Arduino clone, specifically the Pro Mini. I buy them from this supplier on eBay and I’ve never had any problems. They are 3V/5V switchable, very reliable and turn up quickly. Power consumption was going to be an important metric, and running them off a 3V supply would be essential.
Use a ready made radio module
As with the MCU, I didn’t want to be laying out a radio board or soldering surface-mount components. I needed a ready-made radio module which was cheap, readily available and had low power requirements. I started off with the NRF24L01. It had a small footprint, cost around 99p a unit, had a built-in antenna and was supported by the RadioHead library. Initial testing on breadboard showed that the range of these devices would be sufficient for my house. I designed the first two revisions of the sensor board around this radio (and a CR2032, see the battery section below). Initial current draw was a bit too high for the CR2032, but I could have worked around that by using AAA batteries instead. More annoying was that the range of the radios was not at all reliable. The 2.4GHz spectrum is very noisy, and the addition of two baby monitors since the original breadboard test hasn’t helped, but in testing on the PCB I found these radios to be pretty much useless. I also tried using the version with the power amp as the central transceiver; this helped a little, but they were still losing at least 50% of their transmissions, even over a couple of metres. Other people have reported really good range with these boards, but crucially those range tests were done outside. It’s my considered opinion that these radios and WiFi in the house do not co-exist.
So I gave up on those and tried the XL4432-SMT, which is based on the Si443x chipset from Silicon Labs. They’re readily available on eBay for around £1.70 each, so nearly twice the price of the NRF24L01. They’re well supported by the RadioHead library, can run down to 1.8V, have low current draw when on (virtually nothing in stand-by), support a wide frequency range around the 433MHz ISM band and, in range testing, they out-performed the NRF24L01 by a long way. The downside is that these pre-made boards use 1.27mm pitch castellated connections. I had to design an Eagle part to interface with them, but that wasn’t really too hard; see below for links to the parts I made. Another drawback was the antenna: a much lower frequency means a much longer antenna, so I would need to find a project box which could hold them.
The Si443x also has a temperature sensor and a wake-up timer (WUT) on board. However, reading the errata from Silicon Labs it seems that the WUT is actually broken, and the temperature sensor was returning very strange results. The datasheet says that you need to calibrate the temperature sensor, so I tried doing this but got nowhere fast, and so I opted to use a 1wire sensor instead.
The other option for a radio would have been an ESP8266. You can get ready-made boards cheaply on eBay, and I could have done away with a separate MCU altogether, but the power consumption of these devices is just too great for a project which needs to run off a couple of batteries for a year.
Run for a long time on batteries
What’s the point in a wireless sensor if you have to plug it in to the mains? This project must run from batteries, and those batteries need to last a long time – having to change the batteries every few months would quickly get boring and the sensors would be left doing nothing. Obviously having the MCU go in to a sleep state between runs is going to be necessary, plus a radio which has modest power requirements when running. We can further reduce the power needs by making sure that the “work” the sensor has to do is done as quickly as possible. The default 1wire temperature sensor code you find typically takes around 2700 ms per reading. That’s a very long time. I changed the code a bit by hard-coding the hardware address of the sensor on the board and by only reading two bytes of temperature data (good for 0.5 degree accuracy). More information can be found below.
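To see why shaving the active time matters so much, here’s a back-of-envelope battery life calculation. The numbers are illustrative, not measurements from this board:

```python
# Back-of-envelope battery life for a duty-cycled sensor.
def battery_life_days(capacity_mah, sleep_ua, active_ma, active_ms, interval_s):
    active_s = active_ms / 1000.0
    # Average current is the duty-cycle-weighted mix of active and sleep draw (mA).
    avg_ma = (active_ma * active_s
              + (sleep_ua / 1000.0) * (interval_s - active_s)) / interval_s
    return capacity_mah / avg_ma / 24.0

# A 2700 ms sensor read vs a trimmed ~100 ms read, once a minute, on a
# nominal 225 mAh CR2032 with 5 uA sleep and 10 mA active draw (made-up figures):
slow = battery_life_days(225, 5, 10, 2700, 60)
fast = battery_life_days(225, 5, 10, 100, 60)
```

With numbers in that ballpark, the trimmed read pushes the estimate from weeks to over a year – the active window dominates the average current.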
From a size perspective a CR2032 battery looks very appealing. Some early testing made me think that they would work fine, but in real life I had a lot of problems. In hindsight I think I can put most of the problems down to the Brown Out Detector on the Pro Mini being set to 2.8V; more on that in a moment.
UPDATE: Whoops. This post has been sat in drafts for nearly 2 years. Guess it’s not getting finished then. I’m posting this in case the above is interesting. Topics that I wanted to cover but haven’t are:
It’s clear that a lot of people develop software using Ubuntu. What’s less clear is exactly what sort of software is being built. We see reports of people developing Linux apps, Android apps, web services, self-driving cars… the list is huge. We need better clarity, to understand how that relates to the Ubuntu desktop.
When I was chatting with Barton George a few weeks back he expressed the same interest; what are people doing with the Sputnik machines from Dell? We want to learn more about the sorts of software projects that you’re working on so that we can make the Ubuntu developer experience as good as possible.
To that end we put together the Ubuntu Developer Desktop Survey to help us understand more about what you’re doing and how you’re doing it. This survey is aimed primarily at people who are using Ubuntu to develop software targeting any platform. It doesn’t matter if you do that at work, at home or at school – if you’re building software then we’re glad to hear from you. To be clear: this doesn’t mean we’re abandoning our mantra of Ubuntu being for human beings; software developers are human beings too. Right now I want to get a better view into what software developers are doing.
The survey will close on Friday 31st May 2019 and we will publish the results very shortly afterwards. We’ll then follow that up with some further analysis and some ideas as to how this will influence the desktop product roadmap in the future. Please take this opportunity to help shape Ubuntu for the better.
I’m going to start a weekly newsletter style update to keep people abreast of what’s been going on with Ubuntu Desktop as we move to GNOME Shell and build the foundations for 18.04 LTS. Here’s the first instalment:
Friday 19th May 2017
We’re on to the last few MIR (https://wiki.ubuntu.com/MainInclusionProcess) reviews for the packages needed to update the seeds in order to deliver the GNOME desktop by default.
We still have some security questions to answer about how we deal with updates to mozjs/gjs in an LTS release (where mozjs has a support period of 12 months but we need to offer support for a full five years). This is being looked at now, but for 17.10 we are set.
We are aiming to have the seeds updated next week, and this will be the first milestone on the road to a fantastic GNOME experience in 17.10 Artful.
We’ve also triaged over 400 GNOME Shell bugs in Launchpad to allow us to more easily focus on the important issues.
We have been working on removing Ubuntu’s custom “aptdaemon” plugin in GNOME Software in favour of the upstream solution which uses PackageKit. This allows us to share more code with other distributions.
LivePatch delivers essential kernel security updates to Ubuntu machines without having to reboot to apply them. As an Ubuntu user you can sign up for a free account.
We’re working on integrating LivePatch into the supported LTS desktops to provide a friendly way to set up and configure the service.
This week we started to investigate the APIs provided by the LivePatch services so we can report LivePatch activity to the user, obtain an API key on behalf of the user & set up the service. Work has also started on the software-properties-gtk dialogs (aka Software & Updates in System Settings) to add the options required for LivePatch.
Added upgrade tests from Zesty to Artful for Ubuntu and flavours. Working on making all these tests pass now so that everyone will have a solid and reliable upgrade path.
Work is being done on the installer tests. This will extend the current installer tests to check not only that the install completed successfully but also that the whole desktop environment is working as expected; this had previously been covered with manual tests.
GStreamer is now at 1.12 final in 17.10.
Chromium: stable 58.0.3029.110, beta 59.0.3071.47, dev 60.0.3095.5
LibreOffice 5.3.3 is being tested.
More GNOME applications are being packaged as Snaps. There is still some work to do to get them fully confined and fully integrated into the desktop. We’re working on adding Snap support to Gtk’s Portals to allow desktop Snaps to access resources outside their sandbox.
We will start tracking the Snaps here: https://wiki.ubuntu.com/DesktopTeam/GNOME/Snaps
In my previous post about How To Add OAUTH to your Alexa app in 10 minutes, a couple of people commented that they couldn’t actually access the user’s information once they had linked their account. I didn’t try to access any of the user information myself because the only user of my skill is me, and I already know my name and email address. Nevertheless, I had a quick play with it over the weekend, and here’s a simple skill to show you how to access the user’s profile information from a Python skill running in AWS Lambda.
First of all you need to make sure your skill is set up to use Login With Amazon. I’ve covered this for Smart Home skills here but it works just the same for normal skills.
You also need to make sure your skill is configured to use the scopes “profile” and “postal_code”. This is done in the Configuration tab in the developer console for your skill:
The Interaction Model for this skill is as follows:
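On the Lambda side, the handler pulls the Login With Amazon access token from the incoming request and uses it to fetch the profile. Here’s a minimal sketch using only the standard library (the helper names are mine; the endpoint is Login With Amazon’s documented profile URL):

```python
import json
import urllib.request

def get_access_token(event):
    # For a custom skill, the linked-account token arrives on the session.
    return event.get("session", {}).get("user", {}).get("accessToken")

def fetch_profile(token):
    # Login With Amazon serves the profile at this well-known endpoint.
    req = urllib.request.Request(
        "https://api.amazon.com/user/profile",
        headers={"Authorization": "Bearer " + token},
    )
    with urllib.request.urlopen(req) as resp:
        # With the "profile" and "postal_code" scopes this includes
        # user_id, name, email and postal_code.
        return json.load(resp)
```

If `get_access_token` returns None the account hasn’t been linked yet, and your skill should respond with a LinkAccount card.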
I got a Cisco 7941 off eBay. This is a phone which was £400 when new (some time around 2004) but can now be picked up for about £10. These phones went End Of Sale in January 2010, so even if mine was one of the last to roll off the production line it’s still about 7 years old, yet it’s working perfectly. A testament to the good build quality of these phones, and perhaps the previous owner’s careful handling.
Since these devices are no longer supported many companies will be getting rid of them (or probably already have) so there should be some bargains to be had for phone geeks.
Q: Does the Cisco 7941 work with Asterisk? A: Yes. You need to load the SIP firmware (the focus of this post) or chan-sccp (out of scope for this post but I’ll check it out at some point).
Q: Does the Cisco 7941 work with SIP? A: Yes. You need to flash the correct firmware though.
Q: Is it really hard to get working? A: No, provided you’re comfortable with Linux and a few command line tools, and assuming you already have Asterisk set up.
Q: Is a lot of the information on the web about how to set up the 7941 wrong? A: Yes. There is a lot of confusion about config files (the 7940 and 7941 use different ones).
Q: Will you tell us how you got your phone to work? A: Yes! However – this is what works for me. You will need to tweak the config in places.
The steps to getting this phone working as a SIP extension on Asterisk on Ubuntu / Raspberry Pi:
The phone will download its firmware and config via TFTP. It needs to download its config on every boot, so you will always need a TFTP server running. I think that if the TFTP server is unavailable it will just use the previous config, so it’s possible that you can get away without it, but I haven’t tried. My recommendation is that you install dnsmasq. It’s a small, full-featured DNS server which also includes a DHCP & TFTP server, both easy to configure, and it’s almost certainly packaged for your distro. You should also (temporarily) disable any other DHCP servers on your local network so that dnsmasq is the only thing offering DHCP addresses. This will simplify the process of getting the phone to find the TFTP server, since with dnsmasq it will all be automatic. If you later re-enable your original DHCP server, say on your router, then you will need to configure it to give out the address of the dnsmasq TFTP server and disable DHCP on dnsmasq. In my opinion, if you’re going to be running a Cisco IP phone on your network you’d be better off moving all DHCP to dnsmasq.
The full configuration of dnsmasq is out of scope for this doc, but in a nutshell you need these in your dnsmasq config:
Set up a DHCP range
Enable the TFTP server
Set the TFTP path
tftp-root=/home/<your user>/tftp (or whatever works for you)
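Put together, the relevant lines in /etc/dnsmasq.conf might look like this (the address range is an example; adjust for your network):

```
# Hand out DHCP leases on the local network
dhcp-range=192.168.1.100,192.168.1.200,12h
# Enable the built-in TFTP server
enable-tftp
# Serve files from this directory
tftp-root=/home/<your user>/tftp
```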
Download the SIP Firmware from Cisco
Usually Cisco require a valid support contract before you can download anything useful from their website, but it seems that since these phones are now out of support they have offered up the firmware free of charge. You do still need to register an account to download the files. At the time of writing the latest version is 9.4.2 SR 3 dated 14th February 2017 – so bang up to date, even though these phones are end-of-life. Bizarre, but good for us. Thanks Cisco!
This is everything you need to reflash your phone to the latest SIP firmware. Now you need to get the phone to reboot in to firmware download mode.
Flash the phone with the firmware via the TFTP server
Unplug the phone from the power. Make sure that the network cable is still connected (unless you’re using PoE).
Plug the power back in and hold down the # key
Eventually you will see the “line” lights start to flash orange. It might take a couple of minutes to get to this stage, don’t give up, just keep holding down #
When the line lights are flashing type 123456789*0# This will start firmware download mode.
The screen will go black for a moment and then go through the process of getting an IP address and connecting to the TFTP server
Once connected to the TFTP server the software download will start
The phone will reboot once download is complete and present you with an “Unprovisioned” message on the screen. This is good news! The phone firmware has now been updated.
I put together a video showing this process. It’s not very interesting but it will give you an idea of what to expect. The actual downloading of the firmware section has been sped up 3X.
Configure the SIP extension in Asterisk
Now you need to configure the SIP extension in Asterisk. Do this as per any other SIP extension, but bear this important piece of information in mind: The Cisco 7941 can only deal with 8 character passwords, so keep your SIP authentication secret to 8 characters.
While you’re in Asterisk configuration mode, take a moment to note down these bits of information as well (in Advanced SIP settings in FreePBX):
RTP Port range, start and end.
Bind Port (probably 5060)
Write the config files for the phone and upload them via the TFTP server
Please take the time to read this section fully; this is the part that is most troublesome. The Cisco 7941 is very picky about its config file and even a small mistake will stop the phone from working. These settings are specific to the 79×1 series of phones running at least version 8.x of the firmware. If your phone is not a 79×1 and/or is not running v8.x or later of the firmware then these settings are not for you.
Once the phone has loaded its firmware and booted, it will go looking for a file called SEP<PHONE MAC ADDRESS>.cnf.xml. So if the MAC address of your phone is 11:22:33:44:55:66 then the config file needs to be named SEP112233445566.cnf.xml. This file needs to be in the root of your TFTP server.
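That filename is easy to get wrong by hand, so here’s a tiny helper (just a sketch) to build it:

```python
def config_filename(mac):
    # The 7941 asks for SEP<MAC>.cnf.xml, with the letters of the MAC
    # in upper case and the extension in lower case.
    return "SEP" + mac.replace(":", "").upper() + ".cnf.xml"
```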
You will see mention of a file called XMLDefault.cnf.xml. If you’ve only got a few phones, don’t worry about this, you don’t need it.
So here is a config file which is about as minimal as I can make it:
Copy and paste this into a text editor and search and replace the following:
#IP ADDRESS OF AN NTP SERVER# – with – the IP address of an NTP server
#SIP PORT FROM YOUR ASTERISK SERVER# – with – the SIP port your Asterisk server is listening on. Probably 5060
#IP ADDRESS OF YOUR ASTERISK SERVER# – with – the IP address of your Asterisk server
#PHONE NAME# – with – the text you want to appear at the top right of the phone screen
#RTP START PORT# – with – the RTP port range start from the previous stage
#RTP END PORT# – with – the RTP port range end from the previous stage
#EXT NUM# – with – the Asterisk extension number as configured in the previous stage
#SIP PORT# – with – the SIP port of your Asterisk server. Probably 5060
#EXT NAME# – with – the name you want to give this extension
#SIP AUTH NAME# – with – the username for the SIP extension as configured in Asterisk
#8 CHAR PASSWORD# – with – the password for the SIP extension as configured in Asterisk
#VM NUM# – with – the number you dial for Voicemail. Probably *98
Note that this config file has two lines configured. If you just blindly search and replace you’ll end up with two extensions configured the same.
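If you’d rather script the substitutions than do them in an editor, something like this works. The placeholder names match the list above; the values are obviously examples, and you’ll still need to edit the second line section by hand so the two lines differ:

```python
# Example values only - substitute your own from the previous stages.
SETTINGS = {
    "#IP ADDRESS OF YOUR ASTERISK SERVER#": "192.168.1.2",
    "#SIP PORT#": "5060",
    "#EXT NUM#": "200",
    "#8 CHAR PASSWORD#": "s3cret12",
}

def fill_template(template, settings):
    # Plain search-and-replace, exactly as you would do in a text editor.
    for placeholder, value in settings.items():
        template = template.replace(placeholder, value)
    return template
```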
Some comments on what some of the XML tags do:
ipAddressMode – 0 is IP v4 only. But this seems to have little effect.
registerWithProxy – true – Registers the device with Asterisk, this allows incoming calls to be sent to the phone. If you’re getting “Unregistered” message on the screen, check you have this set.
featureId – 9 is SIP
autoAnswerEnabled – 2 – 2 seems to be “off”
webAccess – 0 – 0 is on (?!)
sshAccess – 0 – ditto
versionStamp – bump this up every time you make a change. Something like YYYYMMDD001, 002, 003 etc.
networkLocale – United_Kingdom – sets the tones to UK, see the optional extras section for more info.
transportLayerProtocol – 2 is UDP, 1 is TCP
dialToneSettings – 2 is “always use internal dialtone”. See the optional extras section for more info.
Edit this file as necessary and then save it to the root of your TFTP server with the filename SEP<MAC>.cnf.xml. If your phone’s MAC address was aa:bb:33:44:55:66 then the filename would be SEPAABB33445566.cnf.xml. Note that it’s case sensitive: letters in the MAC address should be in upper case and the extension should be in lower case. You can get the MAC address for the phone from the syslog on your dnsmasq server.
If your phone is still in “Unprovisioned” mode it will have been asking for this config file repeatedly. Once you save the file you should see the phone reboot shortly afterwards. It may download the firmware again for some reason, just leave it to get on with it.
Make a call!
If everything has worked you should see your extension listed on the right hand side of the screen near the buttons, and the name of the phone should appear at the top of the screen. If the icon next to the line buttons is that of a phone without an x through it, then you’re probably good to go! Press the line button and see if you get a dial tone. If not, then check the phone logs:
From these logs you should be able to tell if the phone has loaded your config correctly. Errors about “updating locale” or “no trust list installed” can be ignored. If there is a problem with the config file itself a generic error will be listed here. If the phone won’t load the config file the most likely reason is that there is a typo in your XML file. Good luck finding it. You can SSH in to the phone to get more detailed logs and debugging information, but I haven’t tried this yet. Google is your friend.
The dial plan tells the phone how to process the digits you type and when to start sending the call. Without a dial plan the phone simply waits a period of time for you to stop typing numbers before it decides you’re done and starts the call. By using a dial plan you can reduce the amount of time spent waiting after you’ve finished keying in the number. Here’s an example plan I’ve edited based on this post on Phil Lavin’s blog (Thanks Phil!) http://phil.lavin.me.uk/2012/11/united-kingdom-dial-plan-xml-for-cisco-phones/
Save this to the root of your TFTP server, named “dialplan.xml” (lowercase).
Everyone likes novelty ringtones. You can find plenty of ringtones in a format which is compatible with your phone (raw format, 8000 Hz sample rate, 8 bit, ulaw, max 2 seconds). These files need to be placed in to the root of your TFTP server. I tried putting them in a sub-directory but it didn’t work. Then you need to create a file called “ringlist.xml” also in the root of the server. The format of this file is:
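As far as I can tell this is Cisco’s standard ring list format; treat the sketch below as a starting point (the display names and filenames are examples) and check the phone logs if it isn’t picked up:

```xml
<CiscoIPPhoneRingList>
  <Ring>
    <DisplayName>Klaxon</DisplayName>
    <FileName>klaxon.raw</FileName>
  </Ring>
  <Ring>
    <DisplayName>Old Phone</DisplayName>
    <FileName>oldphone.raw</FileName>
  </Ring>
</CiscoIPPhoneRingList>
```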
Filenames are case sensitive. Once you’ve saved this file, copy it to “distinctiveringlist.xml” as well. This will allow you to set a ring tone for the default ringer and different rings for each line.
By default the 7941 will have a pseudo North American dial tone. This is annoyingly shrill (yes, it is). By specifying a NetworkLocale in the phone config we can get it to load a different set of informational tones from a file stored in (per the example XML above) United_Kingdom. In the root of the TFTP server create a directory called United_Kingdom. In this directory you need to create a file called g3-tones.xml. Bizarrely, Cisco require you to have a support contract in order to download the correct tones settings for your country, despite giving the phone firmware away for free. Go figure. So this means I’m not going to paste the XML here. If you search hard enough you’ll find an example g3-tones.xml file you can use as a base. In our phone configuration above we told the phone to always use the internal dialling tone, so this means we only need to change the idial section of the tones file. The magic numbers are:
The phone comes with a single default wallpaper with horizontal lines on it. This is easily replaced by your own designs with a simple PNG. Create a directory in the root of the TFTP server called Desktops. In here create another directory called 320x196x4.
In to this directory you need to place a “List.xml” file:
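A List.xml along these lines should work – this is, I believe, Cisco’s standard image list format, and “wallpaper” is just an example filename:

```xml
<CiscoIPPhoneImageList>
  <ImageItem Image="TFTP:Desktops/320x196x4/wallpaper-tn.png"
             URL="TFTP:Desktops/320x196x4/wallpaper.png"/>
</CiscoIPPhoneImageList>
```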
The “-tn” in the file is a smaller thumbnail version of the larger image. The PNGs need to be sized exactly 320×196 for the large and 80×49 for the thumbnail. Here’s something to get you started:
You will have noticed that the phone has a “Directories” button and a “Services” button. I haven’t managed to add an extra phone book to the Directories button yet; I think it’s certainly possible, but the XML file refuses to do anything. However, I have got a phone directory working on the Services button.
In the main phone config file there is a tag for “servicesURL”. Point this to a web server on your local network which will serve up an XML file. For example:
Assuming you are using Apache 2 to serve that XML file (or it could equally be a CGI script which generates the XML dynamically from a database such as the FreePBX phone book) the format looks like this:
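For a static file, something along these lines should do – this is Cisco’s IP phone directory format, and the names and numbers are obviously examples:

```xml
<CiscoIPPhoneDirectory>
  <Title>House Phone Book</Title>
  <Prompt>Select an entry</Prompt>
  <DirectoryEntry>
    <Name>Kitchen</Name>
    <Telephone>201</Telephone>
  </DirectoryEntry>
  <DirectoryEntry>
    <Name>Office</Name>
    <Telephone>202</Telephone>
  </DirectoryEntry>
</CiscoIPPhoneDirectory>
```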
Important note: You must tell Apache to serve those files as type “text/xml”. “application/xml” will not work.
You can do this via your CGI script, or if you want to serve a static file add something like this to your Apache config:
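For a static file called, say, directory.xml (the filename is an example), a Files block with ForceType does the job:

```apache
<Files "directory.xml">
    ForceType text/xml
</Files>
```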
Inside your VirtualHost section.
Watch /var/log/syslog on the machine running the TFTP server. You’ll be able to see exactly what files the phone is asking for. Bear in mind that it does ask for files it doesn’t strictly need, so don’t worry too much about file not found errors unless it’s one of the above.
Here’s a final video showing the boot up for a fully configured phone
Alexa smart home skills require you to provide OAUTH2 so that users can authorise a skill to access the assumed cloud service powering their lightbulbs or any number of other pointlessly connected devices. This makes sense since OAUTH2 is a standard and secure way to grant access for users from one system to the resources of another. However, with this come a few caveats which are potential blockers for casual skill developers like me. If you’re writing a skill for your own personal use, with no intention of adding it to the store, you still have to have a valid and recognised SSL certificate and a whole OAUTH2 server set up somewhere.
The SSL certificate is easy enough to implement, but it’s a bit of a faff (renewing Let’s Encrypt certs, or paying for a cert which needs you to deal with the certificate authorities, send in scans of your passport and other tedious red tape) but – in my opinion anyway – setting up an OAUTH server is even more of a faff. If only there was some way to avoid having to do either of these things…
Using “Login With Amazon” as your OAUTH provider
Since you already have an Amazon account you can use “Login With Amazon” as your skill’s OAUTH server and your normal everyday Amazon account as your credentials. You’re only sharing your Amazon account data with yourself, and even then we can restrict it to just your login ID. You don’t actually need to do anything with the OAUTH token once it’s returned since you’re the only user. I mean, you could if you wanted to, but this HOWTO assumes that you’re the only user and that you don’t care about that sort of thing. We are also going to assume that you have already created the Lambda function and the smart home skill or are familiar with how to do that. This is a bit tricky because you can’t test your smart home skill on a real device until you’ve implemented OAUTH, and you can’t complete the OAUTH set-up until you’ve got the IDs from your Lambda function and skill. If you haven’t written your skill yet, just create a placeholder Lambda function and smart home skill to be going on with.
Click “Create a New Security Profile”. Fill out the form along these lines:
and hit Save.
You should see a message along the lines of “Login with Amazon successfully enabled for Security Profile.”
Hover the mouse over the cog icon to the right of your new security profile and choose “Security Profile”.
Copy your “Client ID” and “Client Secret” and paste them into a notepad. You’ll need these again shortly.
2. Configure your skill to use Login With Amazon
Back in the Developer Console, navigate to the Configuration page for your skill. (Click on your skill, then click on Configuration). You need to enable “Account Linking” and this will then show the extra boxes discussed below.
In to the “Authorization URL” box you should put:
and then copy the Redirect URL from further down the page and append it to the end of the Authorization URL. For example:
As far as I can tell Layla is for UK/Europe and Pitangui is for the US. Use the appropriate one for you. Also, keep a note of the redirect URL in your notepad, you will need this again later.
In to the “Client Id” box paste your client id from step 1.
You can leave “Domain List” blank for now.
For “Scope” I suggest you use:
This will give your Alexa Skill access to a minimal amount of information about you from Amazon, in this case just a user_id. That user ID is unique to your app so can’t be used by other apps or to identify that user elsewhere. Since you don’t really have any customers for your skill, only you, there is no reason to provide access to any other information.
Further down the page you need to configure the Grant Type:
Select an “Auth Code Grant”
Set the “Access Token URI” to:
and in to “Client Secret” paste your secret from step 1.
A Python library for talking to some Sony Bravia TVs, and an accompanying Alexa Skill to let you control the TV by voice. If you like that sort of thing.
These scripts make use of the excellent Requests module. You’ll need to install that first.
This is a fairly simple library which allows you to “pair” the script with the TV and will then send cookie-authenticated requests to the TV’s own web API to control pretty much everything. You can:
Set up the initial pairing between the script and the TV
Retrieve system information from the TV (serial, model, mac addr etc)
Enumerate the available TV inputs (HDMI1,2,3,4 etc)
Switch to a given input
Enumerate the remote control buttons
Virtually press those buttons
Find out what Smart TV apps are available
Start those apps
Enumerate the available DVB-T channels
Switch to those channels
Send Wake On Lan packets to switch the TV on from cold (assuming you’ve enabled that)
A couple of convenience functions
The library tries to hide the complexity and pre-requisites and give you an easy to use API.
I built this for a couple of reasons: 1) because the TV had an undocumented API, and that tickles me; 2) I quite fancied hooking it up to Alexa for lols.
Most of the information about how to talk to the TV’s API came from looking at packet captures from the iPhone app “TV Sideview”.
There is a script called testit.py that will give you a few clues about how to use it, but it’s a bit of a mess. I’ve left a lot of comments in the code for the library which should help you.
Really, I think that this library should be imported into a long-running process rather than be called every time you want to press a remote control button. On a Raspberry Pi, Requests can take a while (a couple of seconds) to import, and then bravialib pre-populates a few data sources; all of that takes time, about 20 seconds, so you really don’t want to use this library if you just want to fire off a few remote control commands. Also, be aware that the TV takes a long time to boot and accept commands. From cold you’re talking about a minute, maybe two. It’s really annoying.
The aforementioned long running process – bravia_rest.py
As the main library takes a while to start and needs a certain amount of data from the TV to work properly, it really makes sense to start it up once and then leave it running as long as you can. The bravia_rest.py script does exactly that, and also exposes some of the functionality as a very crude REST interface so that you can easily hook it into various home automation systems.
First you need to add the IP address and MAC address of the TV (the MAC is needed to turn on the TV the first time the script is run; it can be discovered automatically if you just power the TV on for a few minutes before you run the script).
Then run bravia_rest.py.
If this is the first time you have run it you will need to pair with the TV. You will be told to point your browser at the IP address of the machine where the script is running, on port 8090 (by default). Doing this will make the script attempt to pair with the TV. If it works you will see a PIN on the TV screen; you will need to enter this into the box in your browser. After a few seconds, and with a bit of luck, pairing will complete. This shouldn’t take too long.
If you are now paired, in your browser go to /dumpinfo for a view in to what the script knows about the TV.
Once everything is running you can POST to these URLs for things to happen (no body is required):
/set/power/[on|off] – turns the telly on and off
/set/send/<button> – e.g. mute, play, pause, up, down. See dumpinfo for all the key names.
/set/volumeup/3 – turn the volume up 3 notches. You MUST pass a number, even if it’s just 1.
/set/volumedown/1 – as above.
/set/loadapp/<app name> – e.g. Netflix, iplayer. Again /dumpinfo will show you what apps are available.
/set/channel/<channel> – e.g. BBC ONE, BBC TWO
/set/input/<input label> – You need to have given your inputs labels on the TV, then pass the label here.
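To sketch how these endpoints might be driven from code – note that the host and port here are assumptions for illustration; use wherever your bravia_rest.py instance is actually listening:

```python
from urllib.request import Request, urlopen

# Assumed address of the machine running bravia_rest.py
BASE = "http://localhost:8090"

def bravia_post(path):
    """POST an empty body to a bravia_rest.py command endpoint.

    No body is required by the interface; an empty POST triggers the action.
    """
    req = Request(BASE + path, data=b"", method="POST")
    return urlopen(req)  # raises urllib.error.HTTPError on failure

# Hypothetical usage:
# bravia_post("/set/power/on")         # turn the telly on
# bravia_post("/set/send/mute")        # press a remote button
# bravia_post("/set/volumeup/3")       # volume up three notches
# bravia_post("/set/loadapp/Netflix")  # launch an app
```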
Hooking it up to Alexa
Now that we can poke the TV through a simplified REST interface, it’s much easier to hook it into other things – like Alexa, for example. Setting up a custom skill in AWS Lambda is beyond the scope of what I can write up at lunchtime, and I’m sure lots of other people have documented it better than I could. You’ll need to create a custom skill and upload a Python deployment package to Lambda containing my lambda_function.py script (see inside the Alexa directory), a secrets.py file with your info in it, a copy of the Requests library (you need to create a Python virtual environment – it’s quite easy) and possibly a copy of the PEM for a self-signed HTTPS certificate. I’ve also included the skill data, such as the utterances I’m using. These will need to be adjusted for your locale.
You issue the command to Alexa: “Alexa, tell The TV to change to channel BBC ONE.”
Your voice is sent to AWS (the green lines), decoded, and the utterances and intents etc. are sent to the Lambda script.
The Lambda script works out what the requested actions are and sends them back out (the orange lines) to a web server running in your home (in my case a Raspberry Pi running Apache and the bravia_proxy.py script). You need to make that Apache server accessible to the outside world so that AWS can POST data to it. I would recommend that you configure Apache to use SSL and put at least Basic Auth in front of the proxy script.
The bravia_proxy.py script receives the POSTed form from AWS and, after a quick bit of sanity checking and normalisation, in turn POSTs to the bravia_rest.py script. The proxy and REST scripts could live on different hosts (and probably should – there are no security considerations in either script, so ya know, don’t use them).
bravia_rest.py uses bravialib to poke the TV in the right way and returns a yes or a no, which then flows back (the blue lines) to AWS and your Lambda function.
If everything worked you should hear “OK” from Alexa and your TV should do what you told it.
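For the Apache hardening recommended above (SSL plus at least Basic Auth in front of the proxy script), a minimal sketch might look like this – the hostname, certificate paths and URL path are all assumptions for illustration:

```apache
# Hypothetical vhost: HTTPS plus Basic Auth in front of bravia_proxy.py
<VirtualHost *:443>
    ServerName tv.example.com
    SSLEngine on
    SSLCertificateFile    /etc/apache2/ssl/selfsigned.pem
    SSLCertificateKeyFile /etc/apache2/ssl/selfsigned.key

    # Protect the proxy endpoint; create the password file with
    # something like: htpasswd -c /etc/apache2/.htpasswd alexa
    <Location "/bravia">
        AuthType Basic
        AuthName "TV control"
        AuthUserFile /etc/apache2/.htpasswd
        Require valid-user
    </Location>
</VirtualHost>
```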
I could have put bravia_rest.py straight on the web and implemented some Basic Auth and SSL there – but I think that’s something better handled by Apache (or whichever server you prefer), not some hacked-up script that I wrote.
It doesn’t deal at all well with the TV being off at the moment.
I don’t know what happens when the cookies expire.
As an analytically minded person, is it worth getting Amazon Prime?
I signed up yesterday and spent the evening setting up the various services. It was on offer with £20 off, it was the day before The Grand Tour came out, and I recently bought an Amazon Dot – well played Amazon, well played.
As I’m sure you’re aware, Amazon Prime comes with a few bundled goodies to sweeten the deal. But are they actually useful? I think the real value is the next day delivery (if you order lots of things from Amazon), the online photo storage and perhaps the video. Everything else falls short of being good enough to be considered a real product in its own right. Yes, I know that’s on purpose – but the marketing material would have you believe otherwise. How about that.
The loan of a book a month from the Kindle Lending Library
Rating: 4/10 (based on going to an actual library being 8/10)
You’re not going to find many books you actually want to read in there. Browsing the catalogue from your desktop is virtually impossible and it’s been made that way deliberately. This is frustrating and annoying. Also annoying is that if you use the Kindle app on your phone you get nagged to upgrade to Kindle Unlimited all the bloody time. You can share this with someone else on your household account, but they won’t thank you for it.
Prime Music
Rating: 6.5/10 (based on Spotify Premium being 10/10)
Better than I expected, but still full of annoyances. And again with the constant nagging to upgrade to Music Unlimited.
The selection is ok. It feels like they’ve taken the time to make it just-good-enough that you will use it but not good enough that you won’t consider upgrading.
You can’t share this perk with anyone else in your household, and since there’s no point in two people in your household having a Prime account each, you end up setting up another Chrome profile just to access the music service for managing playlists etc. Also annoying is that you can’t play music on an Echo or Dot and in the browser at the same time. This raises the question of what happens if you have two Alexa devices on the same account and try to play music on both. If you can’t (and I don’t know yet – perhaps someone can comment if they do) then one of the good features of Alexa is somewhat spoiled. (Edit: some searching suggests that indeed you cannot listen to Prime Music on more than one device at a time.)
This service is a very good replacement for the kitchen radio but not a replacement for even a free Spotify subscription. If you already have a premium Spotify account then this will be of no interest to you at all.
Prime Video
Rating: 7.5/10 (based on Netflix being 10/10)
Not so many nag screens here, but guess what – you still need to spend more money for the full experience. It’s a bit galling to discover that some of the headline shows need to be paid for (Game Of Thrones, for example). But there is some genuinely good exclusive content here, The Grand Tour for example. I don’t see myself watching it more than Netflix, but it’s worth, say, 3 quid a month of the £6.50-a-month Prime subscription.
Installing on an iPhone was easy enough, but on Android it’s outrageously bad. You have to install Amazon’s “Underground” app store, and to do that you have to enable “Unknown Sources”. Unlol. The app works fine, and you can uninstall Underground once it’s on there. The app on my Sony TV is fine. Prime Video doesn’t support Chromecast, which isn’t a surprise but is, you guessed it, annoying.
Next Day Delivery
Rating: 10/10 (based on going to the shops being 1/10)
I don’t order that much from Amazon, which is why I’m wondering if this is all worth it, but when I do, having it turn up the next day for “free” is nice.
Unlimited Online Photo Storage
Rating: 9/10 (based on Google Drive being 7/10)
Google also offers unlimited photo storage, but critically they compress your photos. The Amazon offering does not (based on comparing MD5 sums for a picture selected at random). There is also an excellent Linux CLI utility called acd_cli (https://github.com/yadayada/acd_cli). You’ll want to exclude videos and other files from being uploaded because they will count towards your 5GB limit for other stuff. Something like this:
acd_cli --verbose ul -xe mov -xe mp4 -xe ini . /Pictures
Don’t be surprised if this has a storage limit applied in the future.
Prime Early Access
Rating: 0/10 (based on normal Amazon shopping being 10/10)
Early access to the electronic jumble sale that is Black Friday. Pointless.
Twitch Prime
Rating: 5/10 (based on not having it being 0/10)
Skip this if you don’t know what Twitch is. With Prime you can subscribe to a channel for a month for free. It’s a free way to support Twitch streamers you like, and you get some crappy game add-ons that you won’t ever use.
Amazon Prime is basically shareware from the late 90s. It does what they claim, but there are constant nags to spend money, on the basis that all the good stuff is just out of reach.
The new Amazon Dot in the kitchen was not warmly welcomed by everyone at Whizzy Towers but having it play music on demand has changed that perception quite a lot in the last day. I have saved the first episode of The Grand Tour to watch tonight, which I am looking forward to since the reviews have been good, and all my photos which had previously been backed up on to a USB HDD are now slowly making their way to the cloud. I also ordered a £5 book yesterday which arrived today as promised.
However, the shortcomings have the feeling of being deliberate. Which is more annoying than if the services were just a bit crap.
So all in all you’re getting what you pay for. It’s not amazing value, but neither is it a total rip off. Someone at Amazon has done their job well. I would recommend you get it.
As you’re probably aware Ubuntu 16.10 was released yesterday and brings with it the Unity 8 desktop session as a preview of what’s being worked on right now and a reflection of the current state of play.
You might have already logged in and kicked the proverbial tyres. If not I would urge you to do so. Please take the time to install a couple of apps as laid out here:
The main driver for getting Unity 8 in to 16.10 was the chance to get it in the hands of users so we can get feedback and bug reports. If you find something doesn’t work, please, log a bug. We don’t monitor every forum or comments section on the web so the absolute best way to provide your feedback to people who can act on it is a bug report with clear steps on how to reproduce the issue (in the case of crashes) or an explanation of why you think a particular behaviour is wrong. This is how you get things changed or fixed.
You can contribute to Ubuntu by simply playing with it.
TL;DR: There is no such thing as a “none” directive in Apache 2. If you’ve got “deny from none” or “allow from none” then you’re doing DNS lookups on each host that connects regardless of whether you want to or not.
I was experiencing a very annoying problem trying to serve static HTML pages and CGI scripts from Apache 2 recently. The problem manifested itself like this:
Running the scripts on the server hosting Apache shows they ran in well under a second
Connecting to the Apache server from the LAN, everything was fine and ran in under a second
Connecting to the Apache server from the Internet, but from a machine known to my network, ran fine
Connecting from an AWS Lambda script, suddenly there is a 20 second or more delay before getting data back
Connecting from Digital Ocean, there is a 20 second delay
Connecting from another computer on the internet, there is a 20 second delay
What the heck is going on here?
I spent time trying to debug my CGI scripts and adding lots more logging and finally convinced myself that it was a problem with the Apache config and not something like MTUs or routing problems.
But what was causing it? It started to feel like a DNS-related issue, since the machines where it ran fine were all known to me and so had corresponding entries in my local DNS server. But but but… I clearly had “HostnameLookups Off” in my apache2.conf file. When I looked at the logs again, I noticed that hostnames were indeed being looked up, even though I’d told Apache not to.
Why? Because I don’t know how to configure Apache servers properly. At some point in time I thought this was a good idea:
Order deny, allow
Deny from none
Allow from all
But there is no such thing as a “none” directive. Apache interprets “none” as a host name and so has to look it up to see if it’s supposed to be blocking it or not, which causes DNS lookup delays and hostnames to appear in your Apache logs.
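The fix is simply to drop the bogus “none” line. In the old Apache 2.2-style syntax used above, an allow-everything config is just:

```apache
# Nothing here for Apache to resolve, so no DNS lookups
Order allow,deny
Allow from all
```

On Apache 2.4 and later the Order/Allow/Deny directives are replaced by `Require all granted`, and hostname lookups stay off by default.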