Get your hands on Android Studio 1.3

Posted by Jamal Eason, Product Manager, Android

Previewed earlier this summer at Google I/O, Android Studio 1.3 is now available on the stable release channel. We appreciate the early feedback from developers on our canary and beta channels, which helped us ship a great product.

Android Studio 1.3 is our biggest feature release for the year so far, which includes a new memory profiler, improved testing support, and full editing and debugging support for C++. Let’s take a closer look.

New Features in Android Studio 1.3

Performance & Testing Tools

  • Android Memory (HPROF) Viewer

    Android Studio now allows you to capture and analyze memory snapshots in the native Android HPROF format.

  • Allocation Tracker

    In addition to displaying a table of the memory allocations that your app uses, the updated allocation tracker now includes a visual way to view your app’s allocations.

  • APK Tests in Modules

    For more flexibility in app testing, you now have the option to place your tests in a separate module and use the new test plugin (‘com.android.test’) instead of keeping your tests right next to your app code. This feature requires your app project to use version 1.3 of the Android Gradle plugin.
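    As a rough sketch, a separate test module’s build.gradle might look like the following (the module name ‘:app’ and the version numbers are illustrative, not from this post):

```groovy
// build.gradle of a standalone test module (illustrative sketch)
apply plugin: 'com.android.test'

android {
    compileSdkVersion 22
    buildToolsVersion '22.0.1'

    // The module under test and the build variant to run tests against
    targetProjectPath ':app'
    targetVariant 'debug'
}
```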

Code and SDK Management

  • App permission annotations

    Android Studio now has inline code annotation support to help you manage the new app permissions model in the M release of Android. Learn more about code annotations.

  • Data Binding Support

    New data binding features allow you to create declarative layouts that minimize boilerplate code by binding your application logic to your layouts. Learn more about data binding.

  • SDK Auto Update & SDK Manager

    Managing Android SDK updates is now part of Android Studio. By default, Android Studio will now prompt you about new SDK and tool updates, and you can still adjust your preferences in the new, integrated Android SDK Manager.

  • C++ Support

    As part of the Android Studio 1.3 stable release, we included an Early Access Preview of C++ editing and debugging support, paired with an experimental build plugin. See the Android C++ Preview page for information on how to get started. Support for more complex projects and build configurations is in development, but in the meantime, let us know your feedback.

Time to Update

An important thing to remember is that updating Android Studio does not require you to change your Android app projects. Updating gets you the latest features while you keep control of which build tools and app dependency versions to use for your app.

For current developers on Android Studio, you can check for updates from the navigation menu. For new users, you can learn more about Android Studio on the product overview page or download the stable version from the Android Studio download site.

We are excited to launch this set of features in Android Studio, and we are hard at work developing the next set of tools to make Android development easier in Android Studio. As always, we welcome feedback on how we can help you. Connect with the Android developer tools team on Google+.

Iterate faster on Google Play with improved beta testing

Posted by Ellie Powers, Product Manager, Google Play

Today, Google Play is making it easier for you to manage beta tests and get your users to join them. Since we launched beta testing two years ago, developers have told us that it’s become a critical part of their workflow in testing ideas, gathering rapid feedback, and improving their apps. In fact, we’ve found that 80 percent of developers with popular apps routinely run beta tests as part of their workflow.

Improvements to managing a beta test in the Developer Console

Currently, the Google Play Developer Console lets developers release early versions of their app to selected users as an alpha or beta test before pushing updates to full production. The select user group downloads the app on Google Play as normal, but can’t review or rate it on the store. This gives you time to address bugs and other issues without negatively impacting your app listing.

Based on your feedback, we’re launching new features to more effectively manage your beta tests, and enable users to join with one click.

  • NEW! Open beta – Use an open beta when you want any user who has the link to be able to join your beta with just one click. One of the advantages of an open beta is that it allows you to scale to a large number of testers. However, you can also limit the maximum number of users who can join.
  • NEW! Closed beta using email addresses – If you want to restrict which users can access your beta, you have a new option: you can now set up a closed beta using lists of individual email addresses which you can add individually or upload as a .csv file. These users will be able to join your beta via a one-click opt-in link.
  • Closed beta with Google+ community or Google Group – This is the option that you’ve been using today, and you can continue to use betas with Google+ communities or Google Groups. You will also be able to move to an open beta while maintaining your existing testers.

How developers are finding success with beta testing

Beta testing is one of the fast iteration features of Google Play and Android that help drive success for developers like Wooga, the creators of hit games Diamond Dash, Jelly Splash, and Agent Alice. Find out more about how Wooga iterates on Android first from Sebastian Kriese, Head of Partnerships, and Pal Tamas Feher, Head of Engineering.

Kabam is a global leader in AAA-quality mobile games developed in partnership with Hollywood studios for franchises such as Fast & Furious, Marvel, Star Wars, and The Hobbit. Beta testing helps Kabam engineers perfect the gameplay for Android devices before launch. “The ability to receive pointed feedback and rapidly reiterate via alpha/beta testing on Google Play has been extremely beneficial to our worldwide launches,” said Kabam VP Rob Oshima.

Matt Small, Co-Founder of Vector Unit recently told us how they’ve been using beta testing extensively to improve Beach Buggy Racing and uncover issues they may not have found otherwise. You can read Matt’s blog post about beta testing on Google Play on Gamasutra to hear about their experience. We’ve picked a few of Matt’s tips and shared them below:

  1. Limit more sensitive builds to a closed beta where you invite individual testers via email addresses. Once glaring problems are ironed out, publish your app to an open beta to gather feedback from a wider audience before going to production.
  2. Set expectations early. Let users know about the risks of beta testing (e.g. the software may not be stable) and tell them what you’re looking for in their feedback.
  3. Encourage critical feedback. Thank people when their criticisms are thoughtful and clearly explained and try to steer less-helpful feedback in a more productive direction.
  4. Respond quickly. The more people see actual responses from the game developer, the more encouraged they are to participate.
  5. Enable Google Play game services. To let testers access features like Achievements and Leaderboards before they are published, go into the Google Play game services testing panel and enable them.

We hope this update to beta testing makes it easier for you to test your app and gather valuable feedback and that these tips help you conduct successful tests. Visit the Developer Console Help Center to find out more about setting up beta testing for your app.

L’Oréal Canada finds beauty in programmatic buying

Cross-posted on the DoubleClick Advertiser Blog


While global sales of L’Oréal Luxe makeup brand Shu Uemura were booming, reaching its target audience across North America proved challenging. By collaborating with Karl Lagerfeld (and his cat, Choupette) and using DoubleClick Bid Manager and Google Analytics Premium, the campaign delivered nearly double the anticipated revenue.
Goals
  • Re-introduce and raise awareness of the Shu Uemura cosmetics brand in North America
  • Drive North American sales of Karl Lagerfeld’s Shupette collection for Shu Uemura
  • Grow the Shu Uemura email subscriber list

Approach
  • Organized website audiences with Google Analytics Premium
  • Used programmatic buying to lead prospects down the path to purchase
  • Leveraged a range of audience data in DoubleClick Bid Manager to buy paid media in display and social channels

Results
  • Drove almost 2X the anticipated revenue
  • Exceeded CPA targets and achieved a 2,200% return on ad spend (ROAS)
  • Increased web traffic and email subscribers

To learn more about Shu Uemura’s approach, check out the full case study.

    Auto Backup for Apps made simple

    Posted by Wojtek Kaliciński, Developer Advocate, Android

    Auto Backup for Apps makes seamless app data backup and restore possible with zero lines of application code. This feature will be available on Android devices running the upcoming M release. All you need to do to enable it for your app is update the targetSdkVersion to 23. You can test it now on the M Developer Preview, where we’ve enabled Auto Backup for all apps regardless of targetSdkVersion.
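    Opting in is a one-line change in your module’s build.gradle. A minimal sketch (version numbers are illustrative):

```groovy
android {
    compileSdkVersion 23

    defaultConfig {
        // Setting targetSdkVersion to 23 enables Auto Backup for your app
        targetSdkVersion 23
    }
}
```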

    Auto Backup for Apps is provided by Google to both users and developers at no charge. Even better, the backup data stored in Google Drive does not count against the user’s quota. Please note that data transferred may still incur charges from the user’s cellular / internet provider.

    What is Auto-Backup for Apps?

    By default, for users that have opted in to backup, all of the data files of an app are automatically copied out to a user’s Drive. That includes databases, shared preferences and other content in the application’s private directory, up to a limit of 25 megabytes per app. Any data residing in the locations denoted by Context.getCacheDir(), Context.getCodeCacheDir() and Context.getNoBackupFilesDir() is excluded from backup. As for files on external storage, only those in Context.getExternalFilesDir() are backed up.

    How to control what is backed up

    You can customize what app data is available for backup by creating a backup configuration file in the res/xml folder and referencing it in your app’s manifest:

    
    <application
        android:fullBackupContent="@xml/mybackupscheme">
        ...
    </application>

    In the configuration file, specify the <include/> and <exclude/> rules you need to fine-tune the behavior of the default backup agent. A detailed explanation of the rule syntax is available in the documentation.
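    As an illustrative sketch (the file names and paths below are hypothetical, not from this post), a res/xml/mybackupscheme.xml might look like:

```xml
<?xml version="1.0" encoding="utf-8"?>
<full-backup-content>
    <!-- Back up all shared preferences... -->
    <include domain="sharedpref" path="." />
    <!-- ...except the file holding a device-specific GCM token -->
    <exclude domain="sharedpref" path="gcm_prefs.xml" />
    <!-- Never back up this device-specific directory -->
    <exclude domain="file" path="device_specific/" />
</full-backup-content>
```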

    What to exclude from backup

    You may not want to have certain app data eligible for backup. For such data, please use one of the mechanisms above. For example:

    • You must exclude any device-specific identifiers, whether issued by a server or generated on the device. This includes the Google Cloud Messaging (GCM) registration token, which, when restored to another device, can render your app on that device unable to receive GCM messages.
    • Consider excluding account credentials or other sensitive information, e.g., by asking the user to reauthenticate the first time they launch a restored app rather than storing such information in the backup.

    With such a diverse landscape of apps, it’s important that developers consider how to maximise the benefits to the user of automatic backups. The goal is to reduce the friction of setting up a new device, which in most cases means transferring over user preferences and locally saved content.

    For example, if you store the user’s account in shared preferences so that it can be restored on install, they won’t even have to think about which account they previously signed in with – they can enter their password and get going!

    If you support a variety of log-ins (Google Sign-In and other providers, username/password), it’s simple to keep track of which log-in method was used previously so the user doesn’t have to.

    Transitioning from key/value backups

    If you have previously implemented the legacy, key/value backup by subclassing BackupAgent and setting it in your Manifest (android:backupAgent), you’re just one step away from transitioning to full-data backups. Simply add the android:fullBackupOnly="true" attribute on <application/>. This is ignored on pre-M versions of Android, meaning onBackup/onRestore will still be called, while on M+ devices it lets the system know you wish to use full-data backups while still providing your own BackupAgent.
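    For example, the resulting manifest entry might look like this (the agent class name is hypothetical):

```xml
<application
    android:backupAgent=".MyBackupAgent"
    android:fullBackupOnly="true">
    ...
</application>
```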

    You can use the same approach even if you’re not using key/value backups, but want to do any custom processing in onCreate(), onFullBackup() or be notified when a restore operation happens in onRestoreFinished(). Just remember to call super.onFullBackup() if you want to retain the system implementation of XML include/exclude rules handling.

    What is the backup/restore lifecycle?

    The data restore happens as part of the package installation, before the user has a chance to launch your app. Backup runs at most once a day, when your device is charging and connected to Wi-Fi. If your app exceeds the data limit (currently set at 25 MB), no more backups will take place, and the last saved snapshot will be used for subsequent restores. Your app’s process is killed after a full backup happens, and before a restore when you invoke one manually through the bmgr command (more about that below).

    Test your apps now

    Before you begin testing Auto Backup, make sure you have the latest M Developer Preview on your device or emulator. After you’ve installed your APK, use the adb shell command to access the bmgr tool.

    Bmgr is a tool you can use to interact with the Backup Manager:

    • bmgr run schedules an immediate backup pass; you need to run this command once after installing your app on the device so that the Backup Manager has a chance to initialize properly
    • bmgr fullbackup <packagename> starts a full-data backup operation.
    • bmgr restore <packagename> restores previously backed up data

    If you forget to invoke bmgr run, you might see errors in Logcat when trying the fullbackup and restore commands. If you are still having problems, make sure you have Backup enabled and a Google account set up in system Settings -> Backup & reset.

    Learn more

    You can find a sample application that shows how to use Auto Backup on our GitHub. The full documentation is available on developer.android.com.

    Join the Android M Developer Preview Community on Google+ for more information on Android M features and remember to report any bugs you find with Auto Backup in the bug tracker.

    [New eBook] Download The No-nonsense Guide to App Growth

    Originally posted on the AdMob Blog.

    What’s the secret to rapid growth for your app?

    Play Store or App Store optimization? A sophisticated paid advertising strategy? A viral social media campaign?

    While all of these strategies could help you grow your user base, the foundation for rapid growth is much more basic and fundamental—you need an engaging app.

    This handbook will walk you through practical ways to increase your app’s user engagement to help you eventually transition to growth. You’ll learn how to:

    • Pick the right metric to represent user engagement
    • Look at data to audit your app and find areas to fix
    • Promote your app after you’ve reached a healthy level of user engagement

    Download a free copy here.

    For more tips on app monetization, be sure to stay connected on all things AdMob by following our Twitter and Google+ pages.

    Posted by Raj Ajrawat, Product Specialist, AdMob

    Lighting the way with BLE beacons

    Originally posted on the Google Developers blog.

    Posted by Chandu Thota, Engineering Director and Matthew Kulick, Product Manager

    Just like lighthouses have helped sailors navigate the world for thousands of years, electronic beacons can be used to provide precise location and contextual cues within apps to help you navigate the world. For instance, a beacon can label a bus stop so your phone knows to have your ticket ready, or a museum app can provide background on the exhibit you’re standing in front of. Today, we’re beginning to roll out a new set of features to help developers build apps using this technology. This includes a new open format for Bluetooth low energy (BLE) beacons to communicate with people’s devices, a way for you to add this meaningful data to your apps and to Google services, as well as a way to manage your fleet of beacons efficiently.

    Eddystone: an open BLE beacon format

    Working closely with partners in the BLE beacon industry, we’ve learned a lot about the needs and the limitations of existing beacon technology. So we set out to build a new class of beacons that addresses real-life use-cases, cross-platform support, and security.

    At the core of what it means to be a BLE beacon is the frame format—i.e., a language—that a beacon sends out into the world. Today, we’re expanding the range of use cases for beacon technology by publishing a new and open format for BLE beacons that anyone can use: Eddystone. Eddystone is robust and extensible: It supports multiple frame types for different use cases, and it supports versioning to make introducing new functionality easier. It’s cross-platform, capable of supporting Android, iOS or any platform that supports BLE beacons. And it’s available on GitHub under the open-source Apache v2.0 license, for everyone to use and help improve.

    By design, a beacon is meant to be discoverable by any nearby Bluetooth Smart device, via its identifier which is a public signal. At the same time, privacy and security are really important, so we built in a feature called Ephemeral Identifiers (EIDs) which change frequently, and allow only authorized clients to decode them. EIDs will enable you to securely do things like find your luggage once you get off the plane or find your lost keys. We’ll publish the technical specs of this design soon.

    Eddystone for developers: Better context for your apps

    Eddystone offers two key developer benefits: better semantic context and precise location. To support these, we’re launching two new APIs. The Nearby API for Android and iOS makes it easier for apps to find and communicate with nearby devices and beacons, such as a specific bus stop or a particular art exhibit in a museum, providing better context. And the Proximity Beacon API lets developers associate semantic location (i.e., a place associated with a lat/long) and related data with beacons, stored in the cloud. This API will also be used in existing location APIs, such as the next version of the Places API.

    Eddystone for beacon manufacturers: Single hardware for multiple platforms

    Eddystone’s extensible frame formats allow hardware manufacturers to support multiple mobile platforms and application scenarios with a single piece of hardware. An existing BLE beacon can be made Eddystone-compliant with a simple firmware update. At the core, we built Eddystone as an open and extensible protocol that’s also interoperable, so we’ll also introduce an Eddystone certification process in the near future, working closely with hardware manufacturing partners. We already have a number of partners that have built Eddystone-compliant beacons.

    Eddystone for businesses: Secure and manage your beacon fleet with ease

    As businesses move from validating their beacon-assisted apps to deploying beacons at scale in places like stadiums and transit stations, hardware installation and maintenance can be challenging: which beacons are working, broken, missing or displaced? So starting today, beacons that implement Eddystone’s telemetry frame (Eddystone-TLM) in combination with the Proximity Beacon API’s diagnostic endpoint can help deployers monitor their beacons’ battery health and displacement—common logistical challenges with low-cost beacon hardware.

    Eddystone for Google products: New, improved user experiences

    We’re also starting to improve Google’s own products and services with beacons. Google Maps launched beacon-based transit notifications in Portland earlier this year, to help people get faster access to real-time transit schedules for specific stations. And soon, Google Now will also be able to use this contextual information to help prioritize the most relevant cards, like showing you menu items when you’re inside a restaurant.

    We want to make beacons useful even when a mobile app is not available; to that end, the Physical Web project will be using Eddystone beacons that broadcast URLs to help people interact with their surroundings.

    Beacons are an important way to deliver better experiences for users of your apps, whether you choose to use Eddystone with your own products and services or as part of a broader Google solution like the Places API or Nearby API. The ecosystem of app developers and beacon manufacturers is important in pushing these technologies forward and the best ideas won’t come from just one company, so we encourage you to get some Eddystone-supported beacons today from our partners and begin building!

    Connect With the World Around You Through Nearby APIs

    Originally posted on the Google Developers blog.

    Posted by Akshay Kannan, Product Manager

    Mobile phones have made it easy to communicate with anyone, whether they’re right next to you or on the other side of the world. The great irony, however, is that those interactions can often feel really awkward when you’re sitting right next to someone.

    Today, it takes several steps — whether it’s exchanging contact information, scanning a QR code, or pairing via Bluetooth — to get a simple piece of information to someone right next to you. Ideally, you should be able to just turn to them and do so, the same way you do in the real world.

    This is why we built Nearby. Nearby provides a proximity API, Nearby Messages, for iOS and Android devices to discover and communicate with each other, as well as with beacons.

    Nearby uses a combination of Bluetooth, Wi-Fi, and inaudible sound (using the device’s speaker and microphone) to establish proximity. We’ve incorporated Nearby technology into several products, including Chromecast Guest Mode, Nearby Players in Google Play Games, and Google Tone.

    With the latest release of Google Play services 7.8, the Nearby Messages API becomes available to all developers across iOS and Android devices (Gingerbread and higher). Nearby doesn’t use or require a Google Account. The first time an app calls Nearby, users get a permission dialog to grant that app access.

    A few of our partners have built creative experiences to show what’s possible with Nearby.

    Edjing Pro uses Nearby to let DJs publish their tracklist to people around them. The audience can vote on tracks that they like, and their votes are updated in real time.

    Trello uses Nearby to simplify sharing. Share a Trello board to the people around you with a tap of a button.

    Pocket Casts uses Nearby to let you find and compare podcasts with people around you. Open the Nearby tab in Pocket Casts to view a list of podcasts that people around you have, as well as podcasts that you have in common with others.

    Trulia uses Nearby to simplify the house hunting process. Create a board and use Nearby to make it easy for the people around you to join it.

    To learn more, visit developers.google.com/nearby.

    M Developer Preview Gets Its First Update

    By Jamal Eason, Product Manager, Android

    Earlier this summer at Google I/O, we launched the M Developer Preview. The developer preview is an early access opportunity to test and optimize your apps for the next release of Android. Today we are releasing an update to the M Developer Preview that includes fixes and updates based on your feedback.

    What’s New

    The Developer Preview 2 update includes the latest M release platform code and near-final APIs for you to validate your app. To provide more testing support, we have refined the Nexus system images and emulator system images with the Android platform updates. In addition to platform updates, the system images also include Google Play services 7.6.

    How to Get the Update

    If you are already running the M developer preview launched at Google I/O (Build #MPZ44Q) on a supported Nexus device (e.g. Nexus 5, Nexus 6, Nexus 9, or Nexus Player), the update can be delivered to your device via an over-the-air update. We expect all devices currently on the developer preview to receive the update over the next few days. We also posted a new version of the preview system image on the developer preview website. (To view the preview website in a language other than English, select the appropriate language from the language selector at the bottom of the page).

    For those developers using the emulator, you can update your M preview system images via the SDK Manager in Android Studio.

    What are the Major Changes?

    We have addressed many issues brought up during the first phase of the developer preview. Check out the release notes for a detailed list of changes in this update. Some highlights of the update include:

    • Android Platform Changes:
      • Modifications to platform permissions including external storage, Wi-Fi & Bluetooth location, and changes to contacts/identity permissions. Device connections through the USB port are now set to charge-only mode by default. To access the device, users must explicitly grant permission.
    • API Changes:
      • Updated Bluetooth Stylus APIs with new callback events: use View.OnContextClickListener and GestureDetector.OnContextClickListener to listen for stylus button presses and perform secondary actions.
      • Updated Media API with the new InputDevice.hasMicrophone() method for determining whether a device microphone exists.
    • Fixes for developer-reported issues:
      • TextInputLayout doesn’t set hint for embedded EditText. (fixed issue)
      • Camera Permission issue with Legacy Apps (fixed issue)

    Next Steps

    With the final M release still on schedule for this fall, the platform features and API are near final. However, there is still time to report critical issues as you continue to test and validate your apps on the M Developer Preview. You can also visit our M Developer Preview community to share ideas and information.

    Thanks again for your support. We look forward to seeing your apps that are ready to go for the M release this fall.


    The App Developer Business Kit: Now available in 10 languages

    Posted by Sean Meng, a Product Marketing Manager on the AdMob team

    Today we’re excited to launch The App Developer Business Kit in 10 more languages. The website includes tips for new app developers on building, promoting, and monetizing your app. Check out the Business Kit in your language.

    To help you make decisions about growing your app business in other regions, we’ve added six new market reports providing insights about app users in Italy, Spain, Germany, Brazil, France, and Russia. Did you know that Brazilian smartphone users engage with ads more frequently than users in the US and Japan? Or that while nearly two-thirds of French users exclusively download free apps, only 31% of Brazilian smartphone users do? Check out statistics like these about regions around the world here.

    Stay connected on all things mobile apps by following us on Google+ and Twitter.

    Game Performance: Data-Oriented Programming

    Posted by Shanee Nishry, Game Developer Advocate

    To improve game performance, we’d like to highlight a programming paradigm that will help you maximize your CPU potential, make your game more efficient, and code smarter.

    Before we get into detail of data-oriented programming, let’s explain the problems it solves and common pitfalls for programmers.

    Memory

    The first thing a programmer must understand is that memory is slow, and the way you code affects how efficiently it is utilized. Inefficient memory layout and ordering of operations forces the CPU to sit idle, waiting for memory, before it can proceed with its work.

    The easiest way to demonstrate is by using an example. Take this simple code for instance:

    char data[1000000]; // One Million bytes
    unsigned int sum = 0;

    for ( int i = 0; i < 1000000; ++i )
    {
      sum += data[ i ];
    }

    An array of one million bytes is declared and iterated over one byte at a time. Now let's change things a little to illustrate the underlying hardware:

    char data[16000000]; // Sixteen Million bytes
    unsigned int sum = 0;
    
    for ( int i = 0; i < 16000000; i += 16 )
    {
      sum += data[ i ];
    }

    The array is enlarged to sixteen million bytes, and we still perform one million iterations, this time reading every sixteenth byte.

    A quick look suggests there shouldn’t be any effect on performance, as the code translates to the same number of instructions and runs the same number of times; however, that is not the case. Here is the difference graph. Note that it is on a logarithmic scale; if the scale were linear, the performance difference would be too large to display on any reasonably sized graph!


    Graph in logarithmic scale

    The simple change of making the loop skip 16 bytes at a time makes the program run five times slower!

    The average difference in performance is 5x, and it holds consistently when iterating over anywhere from 1,000 bytes up to a million bytes, sometimes growing to 7x. This is a serious change in performance.

    Note: The benchmark was run on multiple hardware configurations including a desktop with Intel 5930K 3.50GHz CPU, a Macbook Pro Retina laptop with 2.6 GHz Intel i7 CPU and Android Nexus 5 and Nexus 6 devices. The results were pretty consistent.

    If you wish to replicate the test, you may have to ensure the memory is out of the cache before running the loop, because some compilers will cache the array on declaration. Read on to understand how this works.

    Explanation

    What happens in the example is quite simply explained when you understand how the CPU accesses data. The CPU can’t operate on data in RAM directly; the data must first be copied into the cache, a smaller but extremely fast memory that resides near the CPU.

    When the program starts, the CPU is set to run an instruction on part of the array, but that data is not yet in the cache, causing a cache miss and forcing the CPU to wait while the data is copied into the cache.

    For simplicity’s sake, assume an L1 cache line size of 16 bytes; this means 16 bytes will be copied into the cache, starting from the address requested by the instruction.

    In the first code example, the loop next operates on the following byte, which was already copied into the cache by the initial cache miss, so execution continues smoothly. The same is true for the next 14 bytes. Sixteen bytes after the first cache miss, the loop encounters another cache miss, and the CPU again waits for data to operate on while the next 16 bytes are copied into the cache.

    In the second code sample, the loop skips 16 bytes at a time, but the hardware behaves exactly the same: the cache copies the 16 subsequent bytes on each cache miss. As a result, the loop triggers a cache miss on every iteration, and the CPU sits idle waiting for data each time!

    Note: Modern hardware implements cache prefetch algorithms to avoid incurring a cache miss on every iteration, but even with prefetching, more memory bandwidth is used and performance is lower in our example test.

    In reality, cache lines tend to be larger than 16 bytes; the program would run much slower if it waited for data at every iteration. The Krait-400 found in the Nexus 5 has an L0 data cache of 4 KB with 64 bytes per line.

    If you are wondering why cache lines are so small, the main reason is that making fast memory is expensive.

    Data-Oriented Design

    The way to solve such performance issues is to design your data to fit into the cache and have the program operate on all of that data contiguously.

    This can be done by organizing your game objects inside Structures of Arrays (SoA) instead of Arrays of Structures (AoS) and pre-allocating enough memory to contain the expected data.

    For example, a simple physics object in an AoS layout might look like this:

    struct PhysicsObject
    {
      Vec3 mPosition;
      Vec3 mVelocity;
    
      float mMass;
      float mDrag;
      Vec3 mCenterOfMass;
    
      Vec3 mRotation;
      Vec3 mAngularVelocity;
    
      float mAngularDrag;
    };

    This is a common way to represent an object in C++.

    On the other hand, using SoA layout looks more like this:

    class PhysicsSystem
    {
    private:
      size_t mNumObjects;
      std::vector< Vec3 > mPositions;
      std::vector< Vec3 > mVelocities;
      std::vector< float > mMasses;
      std::vector< float > mDrags;
    
      // ...
    };

    Let’s compare how a simple function to update object positions by their velocity would operate.

    For the AoS layout, a function would look like this:

    void UpdatePositions( PhysicsObject* objects, const size_t num_objects, const float delta_time )
    {
      for ( size_t i = 0; i < num_objects; ++i )
      {
        objects[ i ].mPosition += objects[ i ].mVelocity * delta_time;
      }
    }

    The entire PhysicsObject is loaded into the cache, but only the first two members are used. At 12 bytes each, they amount to just 24 bytes of the cache line being utilised per iteration, and because the struct is larger than a 64-byte cache line of a Nexus 5, every object causes a cache miss.

    Now let’s look at the SoA way. This is our iteration code:

    void PhysicsSystem::SimulateObjects( const float delta_time )
    {
      for ( size_t i = 0; i < mNumObjects; ++i )
      {
        mPositions[ i ] += mVelocities[ i ] * delta_time;
      }
    }

    With this code, we immediately incur 2 cache misses, but we are then able to run smoothly for about 5.3 iterations (64 bytes per line / 12 bytes per Vec3) before incurring the next 2 cache misses, resulting in a significant performance increase!

    The way data is sent to the hardware matters. Be aware of data-oriented design and look for places it will perform better than object-oriented code.

    We have barely scratched the surface. There is more to data-oriented programming than structuring your objects. For example, the cache also stores instructions and stack memory, so optimizing your functions and local variables affects cache misses and hits as well. We also did not cover the L2 cache, or how data-oriented design makes your application easier to multithread.

    Make sure to profile your code to find out where you might want to apply data-oriented design. You can use different profilers for different architectures, including the NVIDIA Tegra System Profiler, the ARM Streamline Performance Analyzer, and tools from Intel and PowerVR such as PVRMonitor.

    If you want to learn more about optimizing for your cache, read up on cache prefetching for various CPU architectures.


    How To Setup Enhanced Ecommerce Impressions Using Scroll Tracking

    A version of this post originally appeared on Google Analytics Certified Partner InfoTrust’s site.
    by Nate Denlinger, Web Developer at GACP InfoTrust, LLC

    One of our specialities here at InfoTrust is helping ecommerce businesses leverage their web analytics to make better data-driven marketing decisions. This typically starts with installing Google’s Universal Analytics web analytics software and utilizing all of the functionality that is offered with Enhanced Ecommerce tracking capabilities.
    Enhanced Ecommerce provides you with a complete picture of what customers on your site are seeing, interacting with and purchasing.
    One of the ways you track what your customers are seeing is with product impressions (whenever a user sees an image or description of your products on your website).
    Normally, you track which products users see (impressions) simply by adding an array of product objects to the DataLayer. These represent the products seen on the page: when any page loads with product images or descriptions, data is sent to Google Analytics indicating that the user saw those specific products. This works well.
    However, there is a major issue with this method: sometimes you are sending impressions for products that the user never actually sees. This can happen when your page scrolls vertically and some products are off the page, or “below the fold”.
    For example, let’s take a look at a page on Etsy.com:
    Sample page on Etsy.com (click for full size)
    Here are the results for the search term “Linens”. Currently, you can see sixteen products listed in the search results. However, with the normal method of sending product impressions, an impression would be sent for every product on the page.
    So, in reality this is what we are telling Google Analytics that the user is seeing (every single product on the page):
    Sample page of Etsy.com (click for full-size)

    Obviously, no one’s screen looks like this, but by sending all products as an impression, we are effectively saying that our customer saw all 63 products. What happens if the user never scrolls past the 16 products shown in the first screenshot?
    We are greatly skewing the impressions for products at the bottom of the page, because users often do not scroll the entire length of the page (and therefore never see the additional products).
    This could cause you to make incorrect assumptions about how well a product is selling based off of position.
    The solution: Scroll-based impression tracking!
    Here is how it works at a high level:
    1. Instead of automatically adding all product impressions to the DataLayer, we add them to a separate variable for temporary storage. That is, we do not send every product loaded on the page directly to Google Analytics; we simply record which products the page loaded.
    2. When the page loads, we actually see what products are visible on the page (ones “above the fold” or where the user can actually see them) and add only those products to the DataLayer for product impressions. Now we don’t send any other product impressions unless they are actually visible to the user.
    3. Once the user starts to scroll, we start capturing all the products that haven’t been seen before. We continue to capture these products until the user stops scrolling for a certain amount of time.
    4. We then batch all of those products together and send them to the DataLayer as product impressions. 
    5. If the user starts to scroll again, we start checking again. However, we never send the same product twice on the same page. If they scroll to the bottom then back up, we don’t send the first products twice.
    Using our example on the “Linen” search results, right away we would send product impressions for the first 16 products. Then, let’s say the user scrolled halfway down the page and stopped. We would then send product impressions for products 18 through 40. The user then scrolls to the bottom of the page so we would send product impressions for 41 through 63. Finally the user scrolls back to the top of the page before clicking on the first product. No more impressions would be sent as impressions for all products have already been sent.
    The result: Product impressions are only sent as users actually navigate through the pages and can see the products. This is a much more accurate form of product impression tracking since it reflects actual user navigation. 
    Next steps: for the technical how-to guide + code samples, please see this post on the InfoTrust site.

    An update on Eclipse Android Developer Tools

    Posted by Jamal Eason, Product Manager, Android

    Over the past few years, our team has focused on improving the development experience for building Android apps with Android Studio. Since the launch of Android Studio, we have been impressed with the excitement and positive feedback. As the official Android IDE, Android Studio gives you access to a powerful and comprehensive suite of tools to evolve your app across Android platforms, whether it’s on the phone, wrist, car or TV.

    To that end and to focus all of our efforts on making Android Studio better and faster, we are ending development and official support for the Android Developer Tools (ADT) in Eclipse at the end of the year. This specifically includes the Eclipse ADT plugin and Android Ant build system.

    Time to Migrate

    If you have not had the chance to migrate your projects to Android Studio, now is the time. To get started, download Android Studio. For many developers, migration is as simple as importing your existing Eclipse ADT projects into Android Studio with File → New → Import Project, as shown below:

    For more details on the migration process, check out the migration guide. Also, to learn more about Android Studio and the underlying build system, check out this overview page.

    Next Steps

    Over the next few months, we are migrating the rest of the standalone performance tools (e.g. DDMS, Trace Viewer) and building in additional support for the Android NDK into Android Studio.

    We are focused on Android Studio so that our team can deliver a great experience on a unified development environment. Android tools inside Eclipse will continue to live on in the open source community via the Eclipse Foundation. Check out the latest Eclipse Andmore project if you are interested in contributing or learning more.

    For those of you that are new to Android Studio, we are excited for you to integrate Android Studio into your development workflow. Also, if you want to contribute to Android Studio, you can also check out the project source code. To follow all the updates on Android Studio, join our Google+ community.

    Remarketing Lists for Search Ads, Powered by Google Analytics

    Today we’re excited to announce you can use audiences (previously remarketing lists) created in Google Analytics to reach your customers on Google Search, with no tagging changes required. 
    Remarketing Lists for Search Ads (RLSA) allows you to tailor your search ads based on your visitors’ past activity on your website. Now you can leverage more than 200 Google Analytics dimensions and metrics to create and activate your audiences for remarketing, then use those audiences to reach and re-engage your customers with a consistent message across both Google Search and Display.
    TransUnion cuts CPA in half with RLSA
    In order to find more customers while reducing waste in their search campaigns, TransUnion, a leading financial services provider, used the audience creation capabilities in Google Analytics to spend more efficiently on Google Search.
    TransUnion started by creating two audiences. The first was for new customers―those who had visited the site and started, but not completed a credit application. The other included customers who had already converted. Splitting the audience between new and existing customers allowed TransUnion to bid higher on Google search ads for new customers and spend less on converted customers.
    The new RLSA capabilities in Google Analytics yielded impressive conversion rates and cost efficiencies for TransUnion’s search campaigns. RLSA visitors had a lower bounce rate and viewed twice as many pages per session compared with regular visitors. 
    By using more tailored text with their remarketing lists, TransUnion increased their conversion rate by 65% and average transaction value by 58%. Meanwhile, CPCs for existing customers dropped 50%, resulting in a roughly 50% drop in their cost per transaction. Read the full case study here
    How to get started
    Getting started with RLSA is easier than ever before thanks to Instant Activation. Within the Admin tab, simply click Property, then Tracking Info, and finally Data Collection. Ensure that Remarketing is set to ‘ON.’


    Once you’ve enabled this setting, all your eligible audiences will begin to populate for RLSA.

    Building Audiences
    If you’d like to create new audiences, there are three ways to get started. 
    First, you can create a new audience using the Audience builder in the remarketing section of the Admin tab. Make sure you select the relevant AdWords account to share your audience with for remarketing.



    If you have an existing segment you’d like to turn into an audience, simply click on the segment options and select “Build Audience” right from within reporting. This option will take you directly to the audience builder as above.  


    Finally, you can get started quickly and easily by importing audiences from the Google Analytics Solutions Gallery.
    Activating audiences in AdWords
    Once you have shared an audience with AdWords, it will appear instantly in your AdWords Shared Library and will show eligible users in the List size (Google search) column. Keep in mind that an audience must accumulate a minimum of 1,000 users before you can use it for remarketing on Google Search. To get started, follow the instructions in the AdWords Help Center.

    Support for RLSA with Google Analytics is part of an ongoing investment to provide powerful ways to activate your customer insights in Google Analytics, along with recent features like Cohort Analysis, Lifetime Value Analysis, and Active User Reporting. Stay tuned for more announcements!
    Happy Analyzing,
    Lan Huang, Technical Lead, Google Analytics,
    Xiaorui Gan, Technical Lead, Google Search Ads

    Android Developer Story: Shifty Jelly drives double-digit growth with material design and expansion to the car and wearables

    Posted by Lily Sheringham, Google Play team

    Pocket Casts is a leading podcasting app on Google Play built by Australian-based mobile development company Shifty Jelly. The company recently achieved $1 million in sales for the first time, reaching more than 500K users.

    According to the co-founder Russell Ivanovic, the adoption of material design played a significant role in driving user engagement for Pocket Casts by streamlining the user experience. Moreover, users are now able to access the app beyond the smartphone — in the car with Android Auto, on a watch with Android Wear or on the TV with Google Cast. The rapid innovation of Android features helped Pocket Casts increase sales by 30 percent.

    We chatted with co-founders and Android developers Russell and Philip Simpson to learn more about how they are growing their business with Android.

    Here are some of the features Pocket Casts used:

    • Material Design: Learn more about material design and how it helps you create beautiful, engaging apps.
    • Android Wear: Extend your app to Android Wear devices with enhanced notifications or a standalone wearable app.
    • Android Auto: Extend your app to an interface that’s optimized for driving with Android Auto.
    • Google Cast: Let your users cast your app’s content to Google Cast devices like Chromecast, Android TV, and speakers with Google Cast built-in.

    And check out the Pocket Casts app on Google Play!

    Learn to optimize your tag implementation with Google Tag Manager Fundamentals

    We’re excited to announce that our next Analytics Academy course, Google Tag Manager Fundamentals, is now open for participation. Whether you’re a marketer, analyst, or developer, this course will teach you how Google Tag Manager can simplify the tag implementation and management process.

    You’ll join instructor Krista Seiden to explore topics through the lens of a fictional online retailer, The Great Outdoors and their Travel Adventures website. Using practical examples, she’ll show you how to use tools like Google Analytics and Google AdWords tags to improve your data collection process and advertising strategies.

    By participating in the course, you’ll explore:

    • the core concepts and principles of tag management using Google Tag Manager
    • how to create website tags and manage firing triggers
    • how to enhance your Google Analytics implementation
    • the importance of using the Data Layer to collect valuable data for analysis
    • how to configure other marketing tags, like AdWords Conversion Tracking and Dynamic Remarketing

    We’re looking forward to your participation in this course!

    Sign up for Google Tag Manager Fundamentals and start learning today.

    Happy tagging!

    Post By: Lizzie Pace & The Google Analytics Education Team

    Fitness Apps on Android Wear

    Posted by Joshua Gordon, Developer Advocate

    Go for a run, improve your game, and explore the great outdoors with Android Wear! Developers are creating a diverse array of fitness apps that provide everything from pace and heart rate while running, to golf tips on your favorite course, to trail maps for hiking. Let’s take a look at the features of the open and flexible Wear platform they use to create great user experiences.

    Always-on stats

    If your app supports always-on, users never have to touch or twist the watch to activate the display. Running and want to see your pace? Glance at your wrist and it’s there! Runtastic, Endomondo, and MapMyRun use always-on to keep your stats visible, even in ambient mode. When it’s time for golf, I use Golfshot, which uses always-on to continuously show yardage to the hole, so I never have to drop my club. Check out the doc, DevByte, and code sample to learn more.

    Runtastic automatically transitions to ambient mode to conserve battery. There, it reduces the frequency at which stats are updated to about once per 10 seconds.


    Maps, routes, and markers

    It’s encouraging to see how much ground I’ve covered when I go for a run or ride! Using the Maps API, you can show users their route, position, and place markers on the map they can tap to see more info you provide. All of this functionality is available to you using the same Maps API you’ve already worked with on Android. Check out the doc, DevByte, code sample, and blog post to learn more.

    Endomondo tracks your route while you run. You can pan and zoom the map.

    Google Fit

    Google Fit is an open platform designed to make it easier to write fitness apps. It provides APIs to help with many common tasks. For example, you can use the Recording API to estimate how many steps the user has taken and how many calories they’ve burned. You can make that data available to your app via the History API, and even access it over the web via REST, without having to write your own backend. Now, Google Fit can store data from a wide variety of exercises, from running to weightlifting. Check out the DevByte and code samples to learn more.

    Bluetooth Low Energy: pair with your watch

    With the latest release of Android Wear, developers can now pair BLE devices directly with the Wearable. This is a great opportunity for all fitness apps — and especially for running — where carrying both a phone and the Wearable can be problematic. Imagine if your users could pair their heart rate straps or bicycle cadence sensors directly to their Wear device, and leave their phones at home. BLE is now supported by all Wear devices, and is supported by Google Fit. To learn more about it, check out this guide and DevByte.

    Pack light with onboard GPS

    When I’m running, carrying both a phone and a wearable can be a bit much. If you’re using an Android Wear device that supports onboard GPS, you can leave your phone at home! Since not all Wear devices have an onboard GPS sensor, you can use the FusedLocationProviderApi to seamlessly retrieve GPS coordinates from the phone if not available on the wearable. Check out this handy guide for more about detecting location on Wear.

    RunKeeper supports onboard GPS if it’s available on your Wearable.


    Sync data transparently

    When I’m back home and ready for more details on my activity, I can see them by opening the app on my phone. My favorite fitness apps transparently sync data between my Wearable and phone. To learn more about syncing data between devices, watch this DevByte on the DataLayer API.

    Next Steps

    Android Wear gives you the tools and training you need to create exceptional fitness apps. To get started on yours, visit developer.android.com/wear and join the discussion at g.co/androidweardev.


    Growing Android TV engagement with search and recommendations

    Posted by Maru Ahues, Media Developer Advocate

    When it comes to TV, content is king. But to enjoy great content, you first need to find it. We created Android TV with that in mind: a truly smart TV should deliver interesting content to users. Today, EPIX® joins a growing list of apps that use the Android TV platform to make it easy to enjoy movies, TV shows, sports highlights, music videos and more.

    Making TV Apps Searchable

    Think of your favorite movie. Now try to locate it in one of your streaming apps. If you have a few apps to choose from, it might take some hunting before you can watch that movie. With Android TV, we want to make it easier to be entertained. Finding ‘Teenage Mutant Ninja Turtles’ should be as easy as picking up the remote, saying ‘Teenage Mutant Ninja Turtles’ and letting the TV find it.

    Searching for ‘Teenage Mutant Ninja Turtles’ shows results from Google Play and EPIX

    You can drive users directly to content within your app by making it searchable from the Android TV search interface. Join app developers like EPIX, Sky News, YouTube, and Hulu Plus who are already making content discovery a breeze.

    Recommending TV Content

    When users want suggestions for content, the recommendations row on Android TV helps them quickly access relevant content right from the home screen. Recommendations are based on the user’s recent and frequent usage behaviors, as well as content preferences.

    Recommendations from installed apps, like EPIX, appear in the Android TV home screen

    Android TV allows developers to create recommendations for movies, TV shows, music and other types of content. Your app can provide recommendations to users to help get your content noticed. As an example, EPIX shows Hollywood movies, NBA Game Time serves up basketball highlights, Washington Post offers video summaries of world events, and YouTube suggests videos based on your subscriptions and viewing history.

    With less than one year since the consumer launch of Android TV, we’re already building upon a simpler, smarter and more personalized TV experience, and we can’t wait to see what you create.