How will machine learning change how we interact with computers? This is a question I attempted to answer at the inaugural CW Future Tech meeting. The talk was given in the Pecha Kucha style, so the slides don't stand up well on their own. Let me sketch the argument here.
The typical outcome of software development is a complex machine designed for a human to operate. It has lots of buttons to press, and maybe even some sliders and levers to pull. These machines amplify what a human can achieve, but the job of operating them is not so different from the job an assembly line worker performs.
With machine learning we can delegate more work to the machine. We give the machine a high-level goal and it gets on with the job, with minimal intervention. Domestic robots like the Roomba are an obvious example of this, but I think the real revolution will occur in the data centre. I see Myna as an early step on the road to marketing automation, and anomaly detection as a step towards devops automation.
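To make the "give it a goal and let it get on with it" idea concrete, here is a minimal sketch of the kind of automation I have in mind for marketing: an epsilon-greedy bandit that is told to maximise clicks and then decides for itself which variant to show, learning from feedback as it goes. The variant names, click rates, and simulated traffic are invented for illustration; this is not Myna's implementation, just the flavour of the technique.

```python
import random

# Goal: maximise clicks. The machine chooses which variant to show,
# learning from observed outcomes with no further human intervention.
# Variant names and the simulated click rates below are made up.

variants = ["headline-a", "headline-b", "headline-c"]
true_click_rates = {"headline-a": 0.04, "headline-b": 0.06, "headline-c": 0.05}

counts = {v: 0 for v in variants}     # times each variant was shown
rewards = {v: 0.0 for v in variants}  # total clicks per variant
epsilon = 0.1                         # fraction of traffic spent exploring

def observed_rate(v):
    return rewards[v] / counts[v] if counts[v] else 0.0

def choose_variant():
    """Mostly exploit the best-looking variant, occasionally explore."""
    if random.random() < epsilon:
        return random.choice(variants)
    return max(variants, key=observed_rate)

def record_outcome(variant, clicked):
    counts[variant] += 1
    rewards[variant] += 1.0 if clicked else 0.0

# Simulated visitors standing in for real traffic.
for _ in range(10_000):
    v = choose_variant()
    record_outcome(v, random.random() < true_click_rates[v])

for v in variants:
    print(f"{v}: shown {counts[v]} times, observed click rate {observed_rate(v):.3f}")
```

The point is the shape of the interaction: a human states the objective once, and the machine handles the button-pressing.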
That's the gist of it. There isn't much you can say in 6 minutes and 40 seconds, but take a look at the slides for a bit more colour.
This was the first time I've done a presentation in the Pecha Kucha style. It was way harder than any other talk I've given! Sticking to the strict timing required a lot of practice, and it was difficult to balance the content so I had a roughly equal amount to say for each slide. I ended up chasing the slides a bit towards the end of my talk, which is probably better than having pauses to fill. I was also the only presenter who had the 20s-per-slide limit enforced -- my slides were in PDF, while everyone else was using PowerPoint. I think they had a much easier time of it!