Power Great Gaming with New Analytics from Play Games

By Ben Frenkel, Google Play Games team

A few weeks ago at the Game Developers Conference (GDC), we announced Play Games Player Analytics, a new set of free reports to help you manage your games business and understand in-game player behavior. Today, we’re excited to make these new tools available to you in the Google Play Developer Console.

Analytics is a key component of running a game as a service, which is increasingly becoming a necessity for running a successful mobile gaming business. When you take a closer look at large developers that do this successfully, you find that they do three things really well:

  • Manage their business to revenue targets
  • Identify hot spots in their business metrics so they can continuously focus on the game updates that will drive the most impact
  • Use analytics to understand how players are progressing, spending, and churning

“With player engagement and revenue data living under one roof, developers get a level of data quality that is simply not available to smaller teams without dedicated staff. As the tools evolve, I think Google Play Games Player Analytics will finally allow indie devs to confidently make data-driven changes that actually improve revenue.”

Kevin Pazirandeh
Developer of Zombie Highway 2

With Player Analytics, we wanted to make these capabilities available to the entire developer ecosystem on Google Play in a frictionless, easy-to-use way, freeing up your precious time to create great gaming experiences. Small studios, including the makers of Zombie Highway 2 and BombSquad, have already started to see the benefits and impact of Player Analytics on their business.

Further, if you integrate with Google Play game services, you get this set of analytics with no incremental effort. For a little extra work, you can also unlock another set of high-impact reports by integrating Google Play game services Events, starting with the Sources and Sinks report, which helps you balance your in-game economy.

If you already have a game integrated with Google Play game services, go check out the new reports in the Google Play Developer Console today. For everyone else, enabling Player Analytics is as simple as adding a handful of lines of code to your game to integrate Google Play game services.
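
To make that concrete, here is a minimal sketch (not from the original post) of what that integration typically looks like with the Google Play services SDK of that era; the activity name and callback wiring are illustrative only:

import android.app.Activity;
import android.os.Bundle;

import com.google.android.gms.common.ConnectionResult;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.games.Games;

public class MyGameActivity extends Activity implements
        GoogleApiClient.ConnectionCallbacks,
        GoogleApiClient.OnConnectionFailedListener {

    private GoogleApiClient mGoogleApiClient;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Connecting with the Games API signs the player in to Google Play
        // game services, which is what feeds Player Analytics.
        mGoogleApiClient = new GoogleApiClient.Builder(this)
                .addConnectionCallbacks(this)
                .addOnConnectionFailedListener(this)
                .addApi(Games.API)
                .addScope(Games.SCOPE_GAMES)
                .build();
    }

    @Override
    protected void onStart() {
        super.onStart();
        mGoogleApiClient.connect();
    }

    @Override
    protected void onStop() {
        super.onStop();
        mGoogleApiClient.disconnect();
    }

    @Override
    public void onConnected(Bundle connectionHint) {
        // Signed in; game services features and analytics are now active.
    }

    @Override
    public void onConnectionSuspended(int cause) {
        mGoogleApiClient.connect();
    }

    @Override
    public void onConnectionFailed(ConnectionResult result) {
        // Typically resolved by launching the sign-in UI; omitted in this sketch.
    }
}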

Manage your business to revenue targets

Set your spend target in Player Analytics by choosing a daily goal

To help you assess the health of your games business, Player Analytics lets you set a daily in-app purchase revenue target and then track how you’re doing against that goal through the Target vs Actual report depicted below. Learn more.

Identify hot spots using benchmarks with the Business Drivers report

Ever wonder how your game’s performance stacks up against other games? Player Analytics tells you exactly how well you are doing compared to similar games in your category.

Metrics highlighted in red are below the benchmark. Arrows indicate whether a metric is trending up or down, and any cell with a details icon can be clicked to see more about the underlying drivers of the change. Learn more.

Track player retention by new user cohort

In the Retention report, you can see the percentage of players who continued to play your game on each of the seven days after installing it.

Learn more.

See where players are spending their time, struggling, and churning with the Player Progression report

Measured by the number of achievements players have earned, the Player Progression funnel helps you identify where your players are struggling and churning, so you can refine your game and, ultimately, improve retention. Add more achievements to make progression tracking more precise.

Learn more.
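
For reference, progression tracking relies on the achievements your game reports through Google Play game services. A minimal sketch (not from the original post), with placeholder achievement IDs you would define in the Developer Console:

// Unlock a one-time achievement, e.g. when the player finishes the tutorial.
Games.Achievements.unlock(mGoogleApiClient, "placeholder_tutorial_achievement_id");

// Add progress toward an incremental achievement, e.g. "defeat 100 enemies".
Games.Achievements.increment(mGoogleApiClient, "placeholder_100_enemies_achievement_id", 1);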

Manage your in-game economy with the Sources and Sinks report

The Sources and Sinks report helps you balance your in-game economy by showing the relationship between how quickly players earn or buy resources and how quickly they use them.

For example, Eric Froemling, the one-man developer of BombSquad, used the Sources & Sinks report to help balance the rate at which players earned and spent tickets.

Read more about Eric’s experience with Player Analytics in his recent blog post.

To enable the Sources and Sinks report, you will need to create and integrate Play game services Events that track sources of premium currency (e.g., gold coins earned) and sinks of premium currency (e.g., gold coins spent on in-app items).
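
As a rough sketch (not from the original post), once the Events exist in the Developer Console and your GoogleApiClient is connected with the Games API, reporting a source and a sink could look like this; the event IDs and amounts are placeholders:

// Placeholder event IDs created in the Google Play Developer Console.
private static final String EVENT_COINS_EARNED_ID = "your_coins_earned_event_id";
private static final String EVENT_COINS_SPENT_ID = "your_coins_spent_event_id";

// Source: the player just earned 50 gold coins.
Games.Events.increment(mGoogleApiClient, EVENT_COINS_EARNED_ID, 50);

// Sink: the player just spent 20 gold coins on an in-app item.
Games.Events.increment(mGoogleApiClient, EVENT_COINS_SPENT_ID, 20);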

Google Analytics Introduces Product Release Notes

Ever feel like you just can’t keep up with all the new features in Google Analytics? We hear you! To help you keep track of everything that’s going on, we’ve started publishing Release Notes in our product Help Center.

Release notes will be updated periodically and will have the most comprehensive list of new features or changes to the Google Analytics product. So, if you see something new in your account and have questions, we recommend starting here. We’ll point you to the relevant documentation to get you up to speed on everything you need to know.

We’re happy to be adding another resource to keep our users informed. Check it out today!

Posted by Louis Gray, Analytics Advocate

Game Performance: Layout Qualifiers

Today, we want to share some best practices on using the OpenGL Shading Language (GLSL) that can optimize the performance of your game and simplify your workflow. Specifically, layout qualifiers make your code more deterministic and increase performance by reducing the work your code has to do at load time.

Let’s start with a simple vertex shader and change it as we go along.

This basic vertex shader takes position and texture coordinates, transforms the position and outputs the data to the fragment shader:

attribute vec4 vertexPosition;
attribute vec2 vertexUV;

uniform mat4 matWorldViewProjection;

varying vec2 outTexCoord;

void main()
{
  outTexCoord = vertexUV;
  gl_Position = matWorldViewProjection * vertexPosition;
}

Vertex Attribute Index

To draw a mesh onto the screen, you need to create a vertex buffer and fill it with vertex data, including positions and texture coordinates for this example.

In our sample shader, the vertex data may be laid out like this:

struct Vertex
{
  Vector4 Position;
  Vector2 TexCoords;
};

Therefore, we defined our vertex shader attributes like this:

attribute vec4 vertexPosition;
attribute vec2 vertexUV;

To associate the vertex data with the shader attributes, a call to glGetAttribLocation will get the handle of the named attribute. The attribute format is then detailed with a call to glVertexAttribPointer.

GLint handleVertexPos = glGetAttribLocation( myShaderProgram, "vertexPosition" );
glVertexAttribPointer( handleVertexPos, 4, GL_FLOAT, GL_FALSE, 0, 0 );

GLint handleVertexUV = glGetAttribLocation( myShaderProgram, "vertexUV" );
glVertexAttribPointer( handleVertexUV, 2, GL_FLOAT, GL_FALSE, 0, 0 );

But you may have multiple shaders with the vertexPosition attribute, and calling glGetAttribLocation for every shader wastes time and increases the loading time of your game.

Using layout qualifiers you can change your vertex shader attributes declaration like this:

layout(location = 0) in vec4 vertexPosition;
layout(location = 1) in vec2 vertexUV;

To do so, you also need to tell the shader compiler that your shader targets OpenGL ES Shading Language version 3.00, which corresponds to OpenGL ES 3.0. This is done by adding a version declaration:

#version 300 es

Let’s see how this affects our shader; the changed lines are the version declaration, the in attribute declarations, and the out variable:

#version 300 es

layout(location = 0) in vec4 vertexPosition;
layout(location = 1) in vec2 vertexUV;

uniform mat4 matWorldViewProjection;

out vec2 outTexCoord;

void main()
{
  outTexCoord = vertexUV;
  gl_Position = matWorldViewProjection * vertexPosition;
}

Note that we also changed outTexCoord from varying to out. The varying keyword is no longer available in GLSL ES 3.00, so this change is required for the shader to work.

Note that vertex attribute layout qualifiers and #version 300 es are supported from OpenGL ES 3.0 onward. The desktop equivalent is supported from OpenGL 3.3, using #version 330.

Now you know your position attribute is always at location 0 and your texture coordinates are at location 1, so you can bind your vertex format without using glGetAttribLocation:

const int ATTRIB_POS = 0;
const int ATTRIB_UV   = 1;

glVertexAttribPointer( ATTRIB_POS, 4, GL_FLOAT, GL_FALSE, 0, 0 );
glVertexAttribPointer( ATTRIB_UV, 2, GL_FLOAT, GL_FALSE, 0, 0 );

This simple change leads to a cleaner pipeline, simpler code, and better performance at load time.

To learn more about performance on Android, check out the Android Performance Patterns series.

Posted by Shanee Nishry, Games Developer Advocate

Solutions Guide for Implementing Google Analytics via Google Tag Manager

Marketers, developers, and practitioners of analytics depend on having the right data at the right time – but implementing analytics code or AdWords pixels can be a less than fun (or easy) experience. Google Tag Manager makes tagging simple and fast by letting you add tags with a simple UI instead of code, while also offering advanced tracking features used by some of the web’s top sites.
Today we’re excited to announce the launch of the Solutions Guide section on the Google Analytics and Google Tag Manager Help Centers. The Solutions Guide area is focused on providing actionable, hands-on, step-by-step instructions for implementing Google Analytics, AdWords, DoubleClick, and other third-party tags via Google Tag Manager.
In this guide, you’ll learn:
  • When and why to use Google Tag Manager
  • Best practices for naming conventions and setup tips
  • When to choose the Data Layer or the Tag Manager UI
  • How to implement GA event tracking, custom dimensions & cross-domain tracking
  • How to set up AdWords, DoubleClick, and Dynamic Remarketing tags in GTM

We’re thrilled to share this with you and hope you find it helpful as you implement Google Tag Manager.
Check out the new GTM Solutions Guide today!
Happy Tagging.

Posted by Krista Seiden, Analytics Advocate

Evolving Beyond The Conversion With Neil Hoyne

Measurement is constantly evolving, and while metrics by themselves each tell us something interesting, they do not necessarily tell the whole story or equally important, what to do next. In essence, our tools provide the what, but not always the why. As marketers and analysts, we need to put in the work and be able to take the next steps with our data: tell the whole story to our teams and stakeholders and be consultative in decision making and direction.
This is really important to get right, because the use of data is still new territory for many companies (frequently, decisions are still just based on how marketers feel). And while this may have been fine in a pre-digital age, the future of your company may very well depend on embracing analytics. With fragmentation of users and channels, there’s just too much for anyone to do. So it comes down to knowing what really works and why – these are the modern keys to success.
In this recent talk, Googler Neil Hoyne, Global Program Manager for Customer Analytics, shares how to embrace the above as well as take the next steps with your measurement.
A few key takeaways:
  • You need to evolve your measurement plan to better fit the state of the web & complex customer journey (see our recent measurement guide to help).
  • Question whether you have the right goals or need to adjust, and don’t be afraid to change goals if need be. Really make sure you have the right macro & micro conversions.
  • Build an attribution model (also see our guide) that works for your brand, considering the unique factors that make up your business and what messages make sense in each different context (for example, mobile, social, email, etc). 
  • Measure your customers in a user-centric way and move beyond the old session-based world.
Watch the whole talk embedded below:
And be sure to connect with Neil on Twitter and Google+.

Posted by the Google Analytics Team

Developing audio apps for Android Auto

Posted by Joshua Gordon, Developer Advocate

Have you ever wanted to develop apps for the car, but found the variety of OEMs and proprietary platforms too big of a hurdle? Now with Android Auto, you can target a single platform supported by vehicles coming soon from 28 manufacturers.

Using familiar Android APIs, you can easily add a great in-car user experience to your existing audio apps, with just a small amount of code. If you’re new to developing for Auto, watch this DevByte for an overview of the APIs, and check out the training docs for an end-to-end tutorial.

Playback and custom controls

Custom playback controls on NPR One and iHeartRadio.

The first thing to understand about developing audio apps on Auto is that you don’t draw your user interface directly. Instead, the framework has two well-defined UIs (one for playback, one for browsing) that are created automatically. This ensures consistent behavior across audio apps for drivers, and frees you from dealing with any car-specific functionality or layouts. Although the layout is predefined, you can customize it with artwork, color themes, and custom controls.

Both NPR One and iHeartRadio customize their UI. NPR One adds controls to mark a story as interesting, to view a list of upcoming stories, and to skip to the next story. iHeartRadio adds controls to favorite stations and to like songs. Both apps store user preferences across form factors.

Because the UI is drawn by the framework, playback commands need to be relayed to your app. This is accomplished with the MediaSession callback, which has methods like onPlay() and onPause(). All car-specific functionality is handled behind the scenes. For example, you don’t need to be aware whether a command came from the touch screen, the steering wheel buttons, or the user’s voice.
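
For illustration only (this sketch is not from the original post), a media session callback on Android 5.0 might look like the following; the playback helper methods are placeholders for your own player code:

// Inside your media playback service's onCreate(), using the Android 5.0 APIs.
MediaSession mSession = new MediaSession(this, "MyAudioService");
mSession.setCallback(new MediaSession.Callback() {
    @Override
    public void onPlay() {
        // Called for any play command: touch screen, steering wheel button, or voice.
        startPlayback();   // placeholder for your own player code
    }

    @Override
    public void onPause() {
        pausePlayback();   // placeholder
    }

    @Override
    public void onSkipToNext() {
        playNextItem();    // placeholder
    }
});
mSession.setActive(true);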

Browsing and recommendations

Browsing content on NPR One and iHeartRadio.

The browsing UI is likewise drawn by the framework. You implement the MediaBrowserService to share your content hierarchy with the framework. A content hierarchy is a collection of MediaItems that are either playable (e.g., a song, audio book, or radio station) or browsable (e.g., a favorites folder). Together, these form a tree used to display a browsable menu of your content.
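
To illustrate the idea (a sketch, not code from the original post), a minimal content hierarchy built on the Android 5.0 MediaBrowserService could look like this; the media IDs and titles are placeholders:

import android.media.MediaDescription;
import android.media.browse.MediaBrowser;
import android.os.Bundle;
import android.service.media.MediaBrowserService;

import java.util.ArrayList;
import java.util.List;

public class MyMusicService extends MediaBrowserService {

    @Override
    public BrowserRoot onGetRoot(String clientPackageName, int clientUid, Bundle rootHints) {
        // Returning a non-null root lets Android Auto browse this app's content.
        return new BrowserRoot("root", null);
    }

    @Override
    public void onLoadChildren(String parentId, Result<List<MediaBrowser.MediaItem>> result) {
        List<MediaBrowser.MediaItem> items = new ArrayList<>();
        if ("root".equals(parentId)) {
            // A browsable folder, e.g. a list of recommended stories.
            items.add(new MediaBrowser.MediaItem(
                    new MediaDescription.Builder()
                            .setMediaId("recommended")
                            .setTitle("Recommended for you")
                            .build(),
                    MediaBrowser.MediaItem.FLAG_BROWSABLE));
        } else {
            // A playable item inside that folder (placeholder content).
            items.add(new MediaBrowser.MediaItem(
                    new MediaDescription.Builder()
                            .setMediaId("story_1")
                            .setTitle("Morning news briefing")
                            .build(),
                    MediaBrowser.MediaItem.FLAG_PLAYABLE));
        }
        result.sendResult(items);
    }
}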

With both apps, recommendations are key. NPR One recommends a short list of in-depth stories that can be selected from the browsing menu. These improve over time based on user feedback. iHeartRadio’s browsing menu lets you pick from favorites and recommended stations, and their “For You” feature gives recommendations based on user location. The app also provides the ability to create custom stations from the browsing menu. Doing so is efficient, requiring only three taps (“Create Station” -> “Rock” -> “Foo Fighters”).

When developing for the car, it’s important to quickly connect users with content to minimize distractions while driving. It’s also important to note that design considerations on Android Auto are different than on a mobile device. If you imagine a typical media player on a phone, you may picture browsable menus of “all tracks” or “all artists”. These are not ideal in the car, where the primary focus should be on the road. Both NPR One and iHeartRadio provide good examples of this, because they avoid deep menu hierarchies and lengthy browsable lists.

Voice actions for hands free operation

Voice actions (e.g., “Play KQED”) are an important part of Android Auto. You can support voice actions in your app by implementing onPlayFromSearch() in the MediaSession.Callback. Voice actions may also be used to start your app from the home screen (e.g., “Play KQED on iHeartRadio”). To enable this functionality, declare the MEDIA_PLAY_FROM_SEARCH intent filter in your manifest. For an example, see this sample app.
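
As a sketch of the voice path (not from the original post), the same session callback can handle search-driven playback; the helper methods here are placeholders for your own catalog lookup:

@Override
public void onPlayFromSearch(String query, Bundle extras) {
    if (query == null || query.isEmpty()) {
        // "Play music on <your app>" with no specific query: pick something sensible.
        playDefaultStation();      // placeholder
    } else {
        // e.g. "Play KQED": resolve the spoken query against your catalog.
        playBestMatchFor(query);   // placeholder
    }
}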

Next steps

NPR One and iHeartRadio are just two examples of great apps for Android Auto today. They feel like a part of the car, and look and sound great. You can extend your apps to the car today, too, and developing for Auto is easy. The framework handles the car specific functionalities for you, so you’re free to focus on making your app special. Join the discussion at http://g.co/androidautodev if you have questions or ideas to share. To get started on your app, visit developer.android.com/auto.

Hello Places API for Android and iOS!

Posted by Jen Kovnats Harrington, Product Manager, Google Maps APIs

Originally posted to Google Geo Developers blog

People don’t think of their location in terms of coordinates on a map. They want context on what shops or restaurants they’re at, and what’s around them. To help your apps speak your users’ language, we’re launching the Places API for Android, as well as opening a beta program for the Places API for iOS.

The Places API web service and JavaScript library have been available for some time. With native support for Android and iOS devices, the new APIs let you optimize the mobile experience by taking advantage of the device’s location signals.

The Places APIs for Android and iOS bridge the gap between simple geographic locations expressed as latitude and longitude, and how people associate location with a known place. For example, you wouldn’t tell someone you were born at 25.7918359,-80.2127959. You’d simply say, “I was born in Jackson Memorial Hospital in Miami, Florida.” The Places API brings the power of Google’s global places database into your app, providing more than 100 million places, like restaurants, local businesses, hotels, museums, and other attractions.

Key features include:

  • Add a place picker: a drop-in UI widget that allows your users to specify a place
  • Get the place where the user is right now
  • Show detailed place information, including the place’s name, address, phone number, and website
  • Use autocomplete to save your users time and frustration typing out place names, by automatically completing them as they type
  • Make your app stand out by adding new places that are relevant to your users and seeing the places appear in Google’s Places database
  • Improve the map around you by reporting the presence of a device at a particular place.

To get started with the Places API for Android, watch this DevByte, check out the developer documentation, and play with the demos. To apply for the Places API for iOS beta program, go here.

Take your apps on the road with Android Auto

Posted by Wayne Piekarski, Developer Advocate

Starting today, anyone can take their apps for a drive with Android Auto using Android 5.0+ devices, connected to compatible cars and aftermarket head units. Android Auto lets you easily extend your apps to the car in an efficient way for drivers, allowing them to stay connected while still keeping their hands on the wheel and their eyes on the road. When users connect their phone to a compatible vehicle, they will see an Android experience optimized for the head unit display that seamlessly integrates voice input, touch screen controls, and steering wheel buttons. Moreover, Android Auto provides consistent UX guidelines to ensure that developers are able to create great experiences across many diverse manufacturers and vehicle models, with a single application available on Google Play.

With the availability of the Pioneer AVIC-8100NEX, AVIC-7100NEX, and AVH-4100NEX aftermarket systems in the US, the AVIC-F77DAB, AVIC-F70DAB, AVH-X8700BT in the UK, and in Australia the AVIC-F70DAB, AVH-X8750BT, it is now possible to add Android Auto to many cars already on the road. As a developer, you now have a way to test your apps in a realistic environment. These are just the first Android Auto devices to launch, and vehicles from major auto manufacturers with integrated Android Auto support are coming soon.

With the increasing adoption of Android Auto by manufacturers, your users are going to be expecting more support of their apps in the car, so now is a good time to get started with development. If you are new to Android Auto, check out our DevByte video, which explains more about how this works, along with some live demos.

The SDK for Android Auto was made available to developers a few months ago, and now Google Play is ready to accept your application updates. Your existing apps can take advantage of all these cool new Android Auto features with just a few small changes. You’ll need to add Android Auto support to your application, and then agree to the Android Auto terms in the Pricing & Distribution category in the Google Play Developer Console. Once the application is approved, it will be made available as an update to your users, and shown in the cars’ display.

Adding support for Android Auto is easy. We have created an extensive set of documentation to help you add support for messaging (sample), and audio playback (sample). There are also short introduction DevByte videos for messaging and audio as well. Stay tuned for a series of posts coming up soon discussing more details of these APIs and how to work with them. We also have simulators to help you test your applications right at your desk during development.

With the launch of Android Auto, a new set of possibilities are available for you to make even more amazing experiences for your users, providing them the right information for the road ahead. Come join the discussion about Android Auto on Google+ at http://g.co/androidautodev where you can share ideas and ask questions with other developers.

Android Developer Story: Outfit7 — Building an entertainment company with Google

Posted by Leticia Lago, Google Play team

Outfit7, creators of My Talking Tom and My Talking Angela, recently announced they’ve achieved 2.5 billion app downloads across their portfolio. The company now offers a complete entertainment experience to users spanning mobile apps, user generated and original YouTube content, and a range of toys, clothing, and accessories. They even have a silver screen project underway.

We caught up with Iza Login, Rok Zorko and Marko Štamcar, some of the co-founders, in Ljubljana, Slovenia, to learn best practices that helped them reach this milestone.

To learn about some of the Google and Google Play features used by Outfit7 to create their successful business, check out these resources:

  • Monetization — explore the options available for generating revenue from your apps and games.
  • Monetization with AdMob — learn how you can maximize your ad revenue.
  • YouTube for Developers — Whether you’re building a business on YouTube or want to enhance your app with video, a rich set of YouTube APIs can bring your products to life.

Creating Better User Experiences on Google Play

Posted by Eunice Kim, Product Manager for Google Play

Whether it’s a way to track workouts, chart the nighttime stars, or build a new reality and battle for world domination, Google Play gives developers a platform to create engaging apps and games and build successful businesses. Key to that mission is offering users a positive experience while searching for apps and games on Google Play. Today we have two updates to improve the experience for both developers and users.

A global content rating system based on industry standards

Today we’re introducing a new age-based rating system for apps and games on Google Play. We know that people in different countries have different ideas about what content is appropriate for kids, teens and adults, so today’s announcement will help developers better label their apps for the right audience. Consistent with industry best practices, this change will give developers an easy way to communicate familiar and locally relevant content ratings to their users and help improve app discovery and engagement by letting people choose content that is right for them.

Starting now, developers can complete a content rating questionnaire for each of their apps and games to receive objective content ratings. Google Play’s new rating system includes official ratings from the International Age Rating Coalition (IARC) and its participating bodies, including the Entertainment Software Rating Board (ESRB), Pan-European Game Information (PEGI), Australian Classification Board, Unterhaltungssoftware Selbstkontrolle (USK) and Classificação Indicativa (ClassInd). Territories not covered by a specific ratings authority will display an age-based, generic rating. The process is quick, automated and free to developers. In the coming weeks, consumers worldwide will begin to see these new ratings in their local markets.

To help maintain your apps’ availability on Google Play, sign in to the Developer Console and complete the new rating questionnaire for each of your apps. Apps without a completed rating questionnaire will be marked as “Unrated” and may be blocked in certain territories or for specific users. Starting in May, all new apps and updates to existing apps will require a completed questionnaire before they can be published on Google Play.

An app review process that better protects users

Several months ago, we began reviewing apps before they are published on Google Play to better protect the community and improve the app catalog. This new process involves a team of experts who are responsible for identifying violations of our developer policies earlier in the app lifecycle. We value the rapid innovation and iteration that is unique to Google Play, and will continue to help developers get their products to market within a matter of hours after submission, rather than days or weeks. In fact, there has been no noticeable change for developers during the rollout.

To assist in this effort and provide more transparency to developers, we’ve also rolled out improvements to the way we handle publishing status. Developers now have more insight into why apps are rejected or suspended, and they can easily fix and resubmit their apps for minor policy violations.

Over the past year, we’ve paid more than $7 billion to developers and are excited to see the ecosystem grow and innovate. We’ll continue to build tools and services that foster this growth and help the developer community build successful businesses.

Haystack TV Doubles Engagement with Android TV

Posted by Joshua Gordon, Developer Advocate

Haystack TV is a small six-person startup with an ambitious goal: personalize the news. Traditionally, watching news on TV means viewing a list of stories curated by the network. Wouldn’t it be better if you could watch a personalized news channel, based on interesting YouTube stories?

Haystack already had a mobile app, but entering the living room space seemed daunting. Although “Smart TVs” have been on the market for a while, they remain challenging for developers to work with. Many hardware OEMs have proprietary platforms, but Android TV is different. It’s an open ecosystem with great developer resources. Developers can reach millions of users with familiar Android APIs. If you have an existing Android app, it’s easy to bring it to the living room.

Two weeks was all it took for Haystack TV to bring their mobile app to Android TV. That includes building an immersive, cinematic UI (a task greatly simplified by the Android framework). Since launching on Android TV, Haystack TV’s viewership is growing at 40% per month. Previously, users were spending about 40 minutes watching content on mobile per week. Now that’s up to 80 minutes in the living room. Their longest engagements are through Chromecast and Android TV.

Hear from Daniel Barreto, CEO of Haystack TV, on developing for Android TV

Haystack TV’s success on Android TV is a great example of how the Android multi-form factor developer experience shines. Once you’ve learned the ropes of writing Android apps, developing for another form factor (Wear, Auto, TV) is simple.

Android TV helps you create cinematic UIs

Haystack TV’s UI is smooth and cinematic. How were they able to build a great one so quickly? Developing an immersive UI/UX with Android TV is surprisingly easy. The Leanback support library provides fragments for browsing content, showing a details screen, and search. You can use these to get transitions and animations almost for free. To learn more about building UIs for Android TV, watch the Using the Leanback Library DevByte and check out the code samples.
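
As an example of how little code a basic browse screen needs (a sketch, not Haystack’s actual code), a Leanback BrowseFragment can be populated like this; CardPresenter and the row contents are app-specific placeholders:

import android.os.Bundle;
import android.support.v17.leanback.app.BrowseFragment;
import android.support.v17.leanback.widget.ArrayObjectAdapter;
import android.support.v17.leanback.widget.HeaderItem;
import android.support.v17.leanback.widget.ListRow;
import android.support.v17.leanback.widget.ListRowPresenter;

public class MainFragment extends BrowseFragment {

    @Override
    public void onActivityCreated(Bundle savedInstanceState) {
        super.onActivityCreated(savedInstanceState);
        setTitle("My News App");  // placeholder title shown in the browse UI header

        // One row of cards; CardPresenter is your own Presenter subclass that
        // draws a card for each item (not shown here).
        ArrayObjectAdapter rowsAdapter = new ArrayObjectAdapter(new ListRowPresenter());
        ArrayObjectAdapter newsRow = new ArrayObjectAdapter(new CardPresenter());
        newsRow.add("Top story");  // placeholder item your presenter knows how to render
        rowsAdapter.add(new ListRow(new HeaderItem(0, "Recommended"), newsRow));
        setAdapter(rowsAdapter);
    }
}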

Browsing recommended stories

Your content, front and center

The recommendations row is a central feature of the Android TV home screen. It’s the first thing users see when they turn on their TVs. You can surface content to appear on the recommendations row by implementing the recommendation service. For example, your app can suggest videos your users will want to watch next (say, the next episode in a series, or a related news story). This is great for getting noticed and increasing engagement.
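
For reference (a sketch of the documented pattern, not Haystack’s code), a recommendation is posted as a notification from a background service; the bitmap, intent, drawable, and ID are placeholders:

// Build one recommendation card for the home screen's recommendations row.
Notification recommendation = new NotificationCompat.BigPictureStyle(
        new NotificationCompat.Builder(context)
                .setContentTitle("Top story for you")             // placeholder title
                .setContentText("My News App")                    // placeholder source line
                .setLargeIcon(artworkBitmap)                      // placeholder Bitmap
                .setSmallIcon(R.drawable.ic_recommendation)       // placeholder drawable
                .setContentIntent(openStoryPendingIntent)         // placeholder PendingIntent
                .setCategory(NotificationCompat.CATEGORY_RECOMMENDATION)
                .setLocalOnly(true)
                .setOngoing(true))
        .build();

NotificationManager manager =
        (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
manager.notify(RECOMMENDATION_ID, recommendation);                // placeholder int ID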

Make your content searchable

How can users find their favorite movie or show from a library of thousands? On Android TV, they can search for it using their voice. This is much faster and more relaxing than typing on the screen with a remote control! In addition to providing in-app search, your app can surface content to appear on the global search results page. The framework takes care of speech recognition for you and delivers the result to your app as a plain text string.

Next Steps

Android TV makes it possible for small startups to create apps for the living room. There are extensive developer resources. For an overview, watch the Introduction to Android TV DevByte. For details, see the developer training docs. Watch this episode of Coffee with a Googler to learn more about the vision for the platform. To get started on your app, visit developer.android.com/tv.

A new reference app for multi-device applications

It is now possible to bring the benefits of your app to your users wherever they happen to be, no matter what device they have near them. Today we’re releasing a reference sample that shows how to implement such a service with an app that works across multiple Android form-factors. This sample, the Universal Music Player, is a bare-bones but functional reference app that supports multiple devices and form factors in a single codebase. It is compatible with Android Auto, Android Wear, and Google Cast devices. Give it a try and easily adapt your own app for wherever your users are, be that a phone, watch, TV, car, or more!


Playback controls and album art in the lock screen.
On the application toolbar, the Google Cast icon.

Controlling playback through Android Auto

Controlling playback on an Android Wear watch

This sample uses a number of new features in Android 5.0 Lollipop, like MediaStyle notifications, MediaSession and MediaBrowserService. They make it easy to implement media browsing and playback on multiple devices with a single version of your app.
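
As one concrete example of those pieces fitting together (a sketch, not code from the sample itself), a MediaStyle notification on Android 5.0 ties playback controls to a MediaSession; the icons, intents, track fields, and ID are placeholders:

// Inside your playback Service, after the MediaSession (mSession) is created.
Notification notification = new Notification.Builder(this)
        .setSmallIcon(R.drawable.ic_notification)                // placeholder drawable
        .setContentTitle(currentTrackTitle)                       // placeholder String
        .setContentText(currentTrackArtist)                       // placeholder String
        .addAction(R.drawable.ic_prev, "Previous", prevIntent)    // placeholder PendingIntents
        .addAction(R.drawable.ic_pause, "Pause", pauseIntent)
        .addAction(R.drawable.ic_next, "Next", nextIntent)
        .setStyle(new Notification.MediaStyle()
                .setMediaSession(mSession.getSessionToken())
                .setShowActionsInCompactView(1))                  // show "Pause" when collapsed
        .build();
startForeground(NOTIFICATION_ID, notification);                   // placeholder int ID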

Check out the source code and let your users enjoy your app from wherever they like.

Posted by Renato Mangini, Senior Developer Platform Engineer, Google Developer Platform Team

Android 5.1 Lollipop SDK

style="border-radius: 6px;padding:0;margin:0;" />

By Jamal Eason, Product Manager, Android

Yesterday we announced Android 5.1, an updated version of the Android Lollipop platform that improves stability, provides better control of notifications, and increases performance. As a part of the Lollipop update, we are releasing the Android 5.1 SDK (API Level 22) which supports the new platform and lets you get started with developing and testing.

What’s new in Android 5.1?

For developers, Android 5.1 introduces a small set of new APIs. A key API addition is support for multiple SIM cards, which is important for many regions where Android One phones are being adopted. Consumers of Android One devices will have more flexibility to switch between carriers and manage their network activities in the way that works best for them. Therefore you, as a developer, can create new app experiences that take advantage of this new feature.

In addition to the new consumer features, Android 5.1 also enhances enterprise features to better support the launch of Android for Work.


Android 5.1 supports multiple SIM cards on compatible devices like Android One.

Updates for the Android SDK

To get you started with Android 5.1, we have updated the Android SDK tools to support the new platform and its new APIs. The SDK now includes Android 5.1 emulator system images that you can use to test your apps and develop using the latest capabilities and APIs. You can update your SDK through the Android SDK Manager in Android Studio.

For details on the new developer APIs, take a look at the API Overview.

Coming to Nexus devices soon

Over the next few weeks, we’ll be rolling out updates for Android 5.1 to the following Nexus devices: Nexus 4, Nexus 5, Nexus 6, Nexus 7 (2012), Nexus 7 (2012) (3G), Nexus 7 (2013), Nexus 7 (2013) (3G/LTE), Nexus 9, Nexus 9 (LTE), Nexus 10, and Nexus Player.

Next Steps

As with all Android releases, it’s a good idea to test your apps on the new platform as soon as possible. You can get started today using Android 5.1 system images with the emulator that’s included in the SDK, or you can download an Android 5.1 Nexus image and flash the system image to your Nexus device.

If you have not had a chance to update your app to material design, or explore how your app might work on Android Wear, Android TV, or even Android Auto, now is a good time to get started with the Android 5.1 SDK update.

Build a loyal user base with three new Mobile App Analytics reports

Successful developers understand that in order to have a popular app, focusing on retaining a loyal user base is just as important as driving new installs. Today at the Game Developers Conference in San Francisco, we introduced new reports that help you measure this in two meaningful ways. We’re happy to announce that Mobile App Analytics will now let you understand how users come back to your app day after day, and provide the rich insights you need in order to measure their value over time. Let’s take a look at how these new reports can help make your app a hit.
Active Users
The Active Users report displays your 1-day, 7-day, 14-day and 30-day trailing active users next to each other in one easy-to-view dashboard. The new overview gives immediate insight into how users interact with your app over time, along with drop-off rate comparisons. With this report, you can treat an app download as just the beginning of a potentially valuable relationship with your new users.

Benchmark active users at 1-7-14-30 days by selecting the segments you want.
While these metrics help you monitor your active user trends, when put into context they can answer important questions about your user acquisition strategies. For example, if you are investing in different campaigns, you can compare the cost of retaining users acquired via paid traffic versus organic to understand if you are attracting the right type of users. Not only can you measure your cost effectiveness, but you can also continue to monitor whether or not the users you paid for are still coming back after the campaign is over. This is particularly important when trying to keep your loyal user base engaged and happy with your app.

Lior Romano, Founder and CEO of Gentoo Labs (the makers of Contacts+ for iOS and Android), was one of the first customers to try out this new report during our beta test period. He found the Active Users report especially useful for seeing all their information at a glance: “We love the new Google Analytics Active Users feature — it’s a real time-saver! We get a quick overview of the 1/7/14/30-day active user trends side by side in a snap, which helps us to easily track our main metrics.”
Cohort Analysis
After learning how many users have opened your app, the next step in driving engagement is understanding when they come back. Cohort Analysis is a user analysis technique that allows you to analyze and compare your users by looking at their customer journey. Using Cohort Analysis, you can see when users come back to your app and how they behave over time after their first session, and you can filter the information by day, week or month. We’ve also added the ability to compare different segments of users based on the day of the first install.
In order to validate your user acquisition strategies, Cohort Analysis lets you compare different periods or campaigns. For example, you can compare different weeks or months to measure the retention effectiveness of a single channel and see if you continue to attract valuable users throughout a campaign. The flexibility of the report also allows you to see how much time users are spending in an app as they come back day after day. With these valuable insights, Mobile App Analytics users can tailor their acquisition campaigns or app experience, just as our partner E-Nor did: “Cohort analysis in GA made it easy for E-Nor to gauge the effectiveness of lead nurturing efforts during an app free-trial promotion campaign. The analysis clearly showed that many users responded well to email and in-app reminders, resulting in over 50% retention between the 3rd and 5th day post sign-up, as opposed to 30% on the first and 2nd day.”

See at a glance when users are coming back to your app.

Lifetime Value
Analyzing retention is a great way to ensure users stick with your app and come back day after day. With Lifetime Value reporting, you’ll get a full picture of these users’ value over time. To get the most out of this report, it’s important to start with a clear definition of what a user’s value means to you based on your business objectives. Once you’ve defined the value, you can access the report to measure certain variables such as revenue per user and number of screen views per user over a period of 90 days. For example, if the goal of your app is to get users to purchase virtual or material goods, you’ll want to use this report to get a clear view of when they make a purchase and how much they are spending in your app over time.

Lifetime Value is a key metric to use to measure the effectiveness of your acquisition campaigns. If your cost to acquire a new user is higher than that user’s average value over time, you might want to optimize your campaigns to match the lifetime revenue they generate. Lifetime Value is particularly valuable if you offer in-app purchases, but it can be applied to discovering many other useful insights, such as the number of times users open your app, total screen views, and goal completions.
Session duration per user compared to goal completions over a 60-day window.

How to get started
The Cohort Analysis report can be found under the ‘Audience’ section in your Google Analytics account, and is now available in beta. The Lifetime Value and Active Users reports are coming soon to all Analytics accounts.
To get started, log in to your Analytics account and look for the new reports under the Audience section.
Posted by Gene Chan and Rahul Oak on behalf of the Google Analytics Team

Google Play services 7.0 – Places Everyone!

Posted by Ian Lake, Developer Advocate

Today, we’re bringing you new tools to build better apps with the completion of the rollout of Google Play services 7.0. With this release, we’re delivering improvements to location settings experiences, a brand new API for place information, new fitness data, Google Play Games, and more.

Location Settings Dialog

While the FusedLocationProviderApi combines multiple sensors to give you the optimal location, the accuracy of the location your app receives still depends greatly on what settings are enabled on the device (e.g., GPS, Wi-Fi, airplane mode). In Google Play services 7.0, we’re introducing a standard mechanism to check that the necessary location settings are enabled for a given LocationRequest to succeed. If there are possible improvements, you can display a one-touch control for the user to change their settings without leaving your app.
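
To sketch how that check might look in code (illustrative, not from the original post; the request code and activity name are placeholders):

LocationRequest locationRequest = LocationRequest.create()
        .setPriority(LocationRequest.PRIORITY_HIGH_ACCURACY)
        .setInterval(10000);

LocationSettingsRequest settingsRequest = new LocationSettingsRequest.Builder()
        .addLocationRequest(locationRequest)
        .build();

PendingResult<LocationSettingsResult> result =
        LocationServices.SettingsApi.checkLocationSettings(mGoogleApiClient, settingsRequest);

result.setResultCallback(new ResultCallback<LocationSettingsResult>() {
    @Override
    public void onResult(LocationSettingsResult settingsResult) {
        Status status = settingsResult.getStatus();
        if (status.getStatusCode() == LocationSettingsStatusCodes.RESOLUTION_REQUIRED) {
            try {
                // Shows the one-touch dialog asking the user to fix their settings.
                status.startResolutionForResult(MyActivity.this, REQUEST_CHECK_SETTINGS);
            } catch (IntentSender.SendIntentException e) {
                // The dialog could not be shown; fall back gracefully.
            }
        }
    }
});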

This API provides a great opportunity to create a much better user experience, particularly if location information is critical to your app. For example, when Google Maps integrated the Location Settings dialog, they saw a dramatic increase in the number of users in a good location state.

Places API

Location can be so much more than a latitude and longitude: the new Places API makes it easy to get details from Google’s database of places and businesses. The built-in place picker makes it easy for the user to pick their current place and provides all the relevant place details including name, address, phone number, website, and more.
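
A minimal sketch of launching the place picker from an Activity (illustrative, not from the original post; the request code is arbitrary):

private static final int PLACE_PICKER_REQUEST = 1;

private void showPlacePicker() {
    try {
        PlacePicker.IntentBuilder builder = new PlacePicker.IntentBuilder();
        startActivityForResult(builder.build(this), PLACE_PICKER_REQUEST);
    } catch (GooglePlayServicesRepairableException | GooglePlayServicesNotAvailableException e) {
        // Google Play services is missing or out of date on this device.
    }
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == PLACE_PICKER_REQUEST && resultCode == RESULT_OK) {
        Place place = PlacePicker.getPlace(data, this);
        Log.d("PlacePicker", "Picked: " + place.getName() + ", " + place.getAddress());
    }
}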

If you prefer to provide your own UI, the getCurrentPlace() API returns places directly around the user’s current location. Autocomplete predictions are also provided to allow a low-latency search experience directly within your app.
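
And a sketch of getCurrentPlace() when you draw your own UI (illustrative; passing null means no place filter):

PendingResult<PlaceLikelihoodBuffer> result =
        Places.PlaceDetectionApi.getCurrentPlace(mGoogleApiClient, null);

result.setResultCallback(new ResultCallback<PlaceLikelihoodBuffer>() {
    @Override
    public void onResult(PlaceLikelihoodBuffer likelyPlaces) {
        for (PlaceLikelihood placeLikelihood : likelyPlaces) {
            Log.d("Places", String.format("Place '%s' has likelihood %g",
                    placeLikelihood.getPlace().getName(),
                    placeLikelihood.getLikelihood()));
        }
        likelyPlaces.release();  // buffers must be released when you are done
    }
});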

You can also manually add places with the addPlace() API and report that the user is at a particular place, ensuring that even the most explorative users can input and share their favorite new places.

The Places API will also be available cross-platform: in a few days, you’ll be able to apply for the Places API for iOS beta program to ensure a great and consistent user experience across mobile platforms.

Google Fit

Google Fit makes building fitness apps easier with fitness-specific APIs for retrieving sensor data like current location and speed, collecting and storing activity data in Google Fit’s open platform, and automatically aggregating that data into a single view of the user’s fitness data.

In Google Play services 7.0, the previous Fitness.API that you passed into your GoogleApiClient has now been replaced with a number of APIs, matching the high-level set of Google Fit Android APIs:

  • SENSORS_API to access raw sensor data via SensorsApi
  • RECORDING_API to record data via RecordingApi
  • HISTORY_API for inserting, deleting, or reading data via HistoryApi
  • SESSIONS_API for managing sessions via SessionsApi
  • BLE_API to interact with Bluetooth Low Energy devices via BleApi
  • CONFIG_API to access custom data types and settings for Google Fit via ConfigApi

This change significantly reduces the memory requirement for Google Fit enabled apps running in the background. Like always, apps built on previous versions of Google Play services will continue to work, but we strongly suggest you rebuild your Google Fit enabled apps to take advantage of this change.
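
As a sketch of what that rebuild looks like (illustrative, not from the original post), you now add only the Fit APIs your app actually uses; the callback objects here are placeholders:

GoogleApiClient client = new GoogleApiClient.Builder(this)
        .addApi(Fitness.SENSORS_API)        // raw sensor data
        .addApi(Fitness.RECORDING_API)      // background recording
        .addScope(new Scope(Scopes.FITNESS_ACTIVITY_READ_WRITE))
        .addConnectionCallbacks(connectionCallbacks)               // placeholder
        .addOnConnectionFailedListener(connectionFailedListener)   // placeholder
        .build();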

Having all the data can be an empowering part of making meaningful changes, and Google Fit is augmenting its existing data types with the addition of body fat percentage and sleep data.

Google Play Games

Announced at Game Developers Conference (GDC), we’re offering new tools to supercharge your games on Google Play. Included in Google Play services 7.0 is the Nearby Connections API, allowing games to seamlessly connect smartphones and tablets as second-screen controls to the game running on your TV.

App Indexing

App Indexing lets Google index apps just like websites, enabling Google search results to deep-link directly into your native app. We’ve simplified the App Indexing API to make this integration even easier for you by combining the existing view()/viewEnd() and action()/end() flows into a single start() and end() API.
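
A sketch of the simplified flow (illustrative only; the URLs and title are placeholders, and mClient is assumed to be a GoogleApiClient built with AppIndex.API):

// Placeholder web and app deep links for a piece of content.
Uri webUrl = Uri.parse("http://example.com/articles/123");
Uri appUri = Uri.parse("android-app://com.example.app/http/example.com/articles/123");

Action viewAction = Action.newAction(Action.TYPE_VIEW, "Example article title", webUrl, appUri);

// When the user starts viewing the content:
AppIndex.AppIndexApi.start(mClient, viewAction);

// ...and when they stop:
AppIndex.AppIndexApi.end(mClient, viewAction);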

Changes to GoogleApiClient

GoogleApiClient serves as the common entry point for accessing Google APIs. For this release, we’ve made retrieval of Google OAuth 2.0 tokens part of GoogleApiClient, making it much easier to request server auth codes to access Google APIs.

SDK Now Available!

You can get started developing today by downloading the Google Play services SDK from the Android SDK Manager.

To learn more about Google Play services and the APIs available to you through it, visit the Google Services section on the Android Developer site.