Inside Google: Developer Day

By Swav Kulinski under Insights 06 October 2016

Last year, Google launched an initiative aimed at bringing software development companies together to discuss the latest developments and APIs. At TAB, this gives us invaluable insight into what’s on the horizon for Google.

I joined other partners at the Google Developer Day for Agencies last month, which focused on delivering solutions for customers on both Android and iOS. Here are some of the key takeaways and learnings from the day.

Machine learning: who needs a PhD anyway?

If you needed convincing (and you really shouldn’t), the Google Developer Day was clear confirmation that machine learning is on the rise. In my last post, I outlined the basics of machine learning, with a simple definition and an overview of how it works. It’s a rapidly evolving space, and one we are particularly excited to experiment in, looking at the opportunities it can provide for our clients.

At the Developer Day itself, what I was most pleased to see was just how much progress is being made - in leaps and bounds - particularly in delivering not just theoretical use cases, but solutions for real-life commercial situations.

One example that springs to mind was a session run by Sharif Salah. Now, you might be tempted to think that machine learning requires a PhD, and is therefore the preserve of only a select few. It doesn’t have to be. Salah explored the processes to follow in order to implement machine learning in an app - and what was clear was that although machine learning itself may seem opaque, integrating it into your app doesn’t have to be.

Google is, essentially, trying to make things as easy as possible with readily available APIs. The Mobile Vision API, for example, recognises text, barcodes and faces - allowing it to answer questions like “Are the people in the picture smiling?” or even, with a bit of extra work, “Is that Swav Kulinski in the picture?”
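To give a sense of how little ceremony is involved, here’s a minimal Android sketch of the “smiling” question using the Mobile Vision face detector. The everyoneSmiling helper and the 0.5 threshold are my own choices for illustration, not anything the API prescribes.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import com.google.android.gms.vision.Frame
import com.google.android.gms.vision.face.FaceDetector

// Hypothetical helper: true if every face found in the photo is probably smiling.
fun everyoneSmiling(context: Context, photo: Bitmap): Boolean {
    val detector = FaceDetector.Builder(context)
        .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS) // enables smile probability
        .build()
    if (!detector.isOperational) return false // detector files not yet downloaded

    val frame = Frame.Builder().setBitmap(photo).build()
    val faces = detector.detect(frame) // SparseArray<Face>
    detector.release()

    if (faces.size() == 0) return false
    return (0 until faces.size()).all { i ->
        faces.valueAt(i).isSmilingProbability > 0.5f // 0.5 is an arbitrary cut-off
    }
}
```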

Things do, however, start to become more complex when you need to expand upon standard APIs and integrate them. If you wanted to categorise a book by its cover, say, then you would need both image and text recognition APIs. Whilst this type of integration is difficult, there are now ‘nanodegrees’ that provide an accessible way to enhance your engineering capabilities in order to take better advantage of machine learning opportunities.
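The text half of that book-cover idea is already well covered: the Mobile Vision TextRecognizer will pull the printed words off a photo, as in the rough sketch below. The readCoverText helper is hypothetical, and the step that would actually categorise the title and author strings is left out.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import com.google.android.gms.vision.Frame
import com.google.android.gms.vision.text.TextRecognizer

// Hypothetical helper: extract the blocks of text printed on a book cover,
// ready to be fed into whatever categorisation logic you build on top.
fun readCoverText(context: Context, cover: Bitmap): List<String> {
    val recognizer = TextRecognizer.Builder(context).build()
    if (!recognizer.isOperational) return emptyList() // recogniser files not yet downloaded

    val frame = Frame.Builder().setBitmap(cover).build()
    val blocks = recognizer.detect(frame) // SparseArray<TextBlock>
    recognizer.release()

    return (0 until blocks.size()).map { i -> blocks.valueAt(i).value }
}
```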

Integrating machine learning into apps fundamentally means that they will not only continue to actively assist you, but will get better and smarter at doing so.

Now, every app competes to be smarter, more intuitive and more valuable than its competitors. In the context of machine learning, however, getting smarter means the app learns how to be the most useful - and to understand you better than any other.

Beacons and the Nearby API

Besides machine learning, beacons were another key takeaway for me. That might seem strange, since beacons aren’t new - in fact, they are one of those things that have been talked up for years, but no one seems to have really nailed them. At least, not in a really practical, transformative way. The Developer Day provided a really great overview of why. Peter Lewis, a Google Product Manager who focuses on location and proximity, explained that, for him, there are two primary obstacles restricting the practical use of beacons:

► Low-level work: you need an ecosystem already in place to make them effective.

► Deployment: you also need a lot of beacons to make them effective - and they need to be constantly upgraded. This is costly and time-consuming.

So, can beacons live up to their promise?

A big, and previously untapped, use case is for government bodies to make beacons completely public. Beacons can now be used by anyone to contextualise their surroundings. Previously, you would have had to build a specific app with a custom back-end - time-consuming, costly and, ultimately, impractical for most.

However, by opening beacons up to public consumption - and therefore the information they hold - the possibilities become endless. In fact, the city of Amsterdam is experimenting with beacons by placing 200 of them at bus and tram stations around the inner city, and inviting business owners to test the network and experiment with possible opportunities.
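On Android, consuming those public beacon attachments comes down to subscribing through the Nearby Messages API. Here’s a minimal sketch, assuming a GoogleApiClient that has already been built with Nearby.MESSAGES_API and connected; the payload handling is purely illustrative.

```kotlin
import android.util.Log
import com.google.android.gms.common.api.GoogleApiClient
import com.google.android.gms.nearby.Nearby
import com.google.android.gms.nearby.messages.Message
import com.google.android.gms.nearby.messages.MessageListener
import com.google.android.gms.nearby.messages.Strategy
import com.google.android.gms.nearby.messages.SubscribeOptions

// Fired as the device moves in and out of range of beacon attachments.
val beaconListener = object : MessageListener() {
    override fun onFound(message: Message) {
        // The attachment is an opaque byte array; its meaning is defined by
        // whoever registered it against the beacon (a timetable, a URL, ...).
        val payload = String(message.content)
        Log.d("Beacons", "Found attachment: $payload")
    }

    override fun onLost(message: Message) {
        Log.d("Beacons", "Beacon attachment out of range")
    }
}

fun subscribeToBeacons(apiClient: GoogleApiClient) {
    val options = SubscribeOptions.Builder()
        .setStrategy(Strategy.BLE_ONLY) // only scan for Bluetooth Low Energy beacons
        .build()
    Nearby.Messages.subscribe(apiClient, beaconListener, options)
}
```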

Now, you might be thinking, “Does it really matter? How many users have their Bluetooth enabled anyway?” Well, according to the Nearby API gurus, the number is a lot higher than had been expected, around 40% - and this number is expected to grow as wireless headphones and wearables become ever more mainstream.

Fire(base) everything we've got

Finally, my last key takeaway concerned Firebase: a mobile platform designed to help businesses and developers create better apps. It has a clear, simple goal: to make setting up and running backend systems leaner and more effective in every way.

While that sounds grand and oversimplified at the same time, it is exactly what Google is aiming to do. With Firebase, you can cherry-pick the services you need and leave the others behind - and there is a lot to choose from: mobile-oriented analytics, storage, hosting, crash reporting, push notifications, mobile ads and more.
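That cherry-picking is visible in code, too: each service is its own dependency with its own entry point, so pulling in analytics doesn’t drag the rest along. As a small illustration, this is roughly what logging an event with Firebase Analytics looks like on Android - the bookshelf scenario and its parameters are made up for the example.

```kotlin
import android.app.Activity
import android.os.Bundle
import com.google.firebase.analytics.FirebaseAnalytics

class BookshelfActivity : Activity() {

    private lateinit var analytics: FirebaseAnalytics

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // One line of setup; the google-services Gradle plugin supplies the config.
        analytics = FirebaseAnalytics.getInstance(this)
    }

    // Hypothetical event: the user opened a book they scanned earlier.
    fun trackBookOpened(bookId: String, title: String) {
        val params = Bundle().apply {
            putString(FirebaseAnalytics.Param.ITEM_ID, bookId)
            putString(FirebaseAnalytics.Param.ITEM_NAME, title)
            putString(FirebaseAnalytics.Param.CONTENT_TYPE, "book")
        }
        analytics.logEvent(FirebaseAnalytics.Event.SELECT_CONTENT, params)
    }
}
```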

Importantly, everything Firebase offers runs on both Android and iOS - with the only exception being Test Lab for Android. It’s this notable flexibility, and range of services, that makes Firebase an interesting new toolkit to watch and something to bear in mind for developers embarking on a new build project.


In the recap above, I’ve really just scratched the surface of all the exciting things happening at Google right now. The great advantage of the Google Developer Agency Programme is the access it gives us at TAB to see new developments and new opportunities before they reach the wider world. If you’d like to learn more, get in touch.