WO2014116542A1 - Activation of dormant features in native applications - Google Patents

Activation of dormant features in native applications

Info

Publication number
WO2014116542A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
input
user input
gesture
native application
Application number
PCT/US2014/012217
Other languages
French (fr)
Inventor
Jason Lap-Wing KOO
Charles Scott GLOMMEN
Original Assignee
Tealium Inc.
Application filed by Tealium Inc. filed Critical Tealium Inc.
Publication of WO2014116542A1 publication Critical patent/WO2014116542A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/20 Services signalling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
    • H04W 4/21 Services signalling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel, for social networking applications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L 41/08 Configuration management of networks or network elements
    • H04L 41/0803 Configuration setting
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L 41/22 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks comprising specially adapted graphical user interfaces [GUI]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/029 Location-based management or tracking services

Definitions

  • Some operators of content sites, such as websites, regularly obtain the results of analytics performed with regard to user interactions on their content sites.
  • User analytics can include any type of data regarding interactions of end users with content sites, among other types of data.
  • There are different approaches to gathering analytics data, one of which includes employing the use of tags.
  • Tags can include small pieces of website code that allow a website operator to measure traffic and visitor behavior, understand the impact of online advertising and social channels, use remarketing and audience targeting, or test and improve a content site, among other optional functions. Adding tags to a content site has typically required involving a developer to manually insert tag code into one or more pages of a website.
  • a method of presenting information about elements of a host application is disclosed.
  • the method can be performed under control of a physical computing device including digital logic circuitry.
  • the method can include: executing a host application; receiving a first user input indicative of a user shaking the physical computing device; in response to determining that the first user input matches a first activation input, executing a confirmation routine to process one or more additional user inputs to the physical computing device; receiving a second user input with the confirmation routine after said receiving the first user input, the second user input indicative of the user contacting a screen of the physical computing device; and in response to determining, using the confirmation routine, that the second user input matches a second activation input, displaying a configuration utility on the screen, the configuration utility configured to output information regarding trackable elements of the host application.
  • the method of the preceding paragraph can further include one or more of the following features:
  • the method can include (i) receiving a third user input indicative of selection of an interactive user interface element of the trackable elements of the host application after said receiving the second user input, the third user input indicative of the user contacting the screen, (ii) in response to determining that the third user input matches a configuration selection input, processing the third user input using the configuration utility to output a tracking identifier associated with the interactive user interface element, and (iii) in response to determining that the third user input does not match the configuration selection input, navigating within the host application based at least on the interactive user interface element.
  • the configuration utility can be configured for use by an administrator of the host application and not for use by an end user of the host application.
  • the method can include, in response to determining that the second user input has not been received within a timeout period, stopping said executing the confirmation routine.
  • the physical computing device can include a mobile phone or a tablet computer, and the host application can include the confirmation routine and the configuration utility.
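The two-stage activation described above (a shake gesture followed by a confirming touch input received before a timeout) can be sketched as a small state machine. This is an illustrative sketch only: the class name, the three-tap threshold, and the 5-second window are assumptions, not details of the disclosure.

```javascript
// Sketch of the two-stage activation: a shake gesture arms a confirmation
// routine; a second activation input (assumed here to be three consecutive
// taps) received before a timeout reveals the configuration utility.
class ActivationDetector {
  constructor({ tapsRequired = 3, timeoutMs = 5000, now = Date.now } = {}) {
    this.tapsRequired = tapsRequired; // assumed second activation input
    this.timeoutMs = timeoutMs;       // assumed confirmation window
    this.now = now;                   // injectable clock, handy for testing
    this.armedAt = null;              // set when the shake is detected
    this.tapCount = 0;
  }

  onShake() {
    // First activation input: start the confirmation routine.
    this.armedAt = this.now();
    this.tapCount = 0;
  }

  onTap() {
    if (this.armedAt === null) return false; // not armed; normal input
    if (this.now() - this.armedAt > this.timeoutMs) {
      this.armedAt = null; // timeout: stop the confirmation routine
      return false;
    }
    this.tapCount += 1;
    if (this.tapCount >= this.tapsRequired) {
      this.armedAt = null;
      return true; // second activation input matched: show config utility
    }
    return false;
  }
}
```

In use, `onShake` would be wired to an accelerometer event and `onTap` to touch events; `onTap` returning `true` would trigger display of the configuration utility.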
  • non-transitory physical computer storage including computer-executable instructions stored thereon.
  • the computer-executable instructions when executed by one or more processors, can implement a process.
  • the process can include: receiving configuration information for configuring a physical computing device; receiving a first user input from a user of the physical computing device, the first user input comprising a motion component; in response to determining that the first user input matches a first activation input, listening for a second user input to the physical computing device using confirmation instructions of the computer-executable instructions; receiving the second user input from the user; and in response to determining, using the confirmation instructions, that the second user input matches a second activation input, displaying a configuration utility interface on a display of the physical computing device, the configuration utility interface configured to display information indicative of the configuration information.
  • the computer-executable instructions of the preceding paragraph when executed by one or more processors, can further implement a process that includes one or more of the following features:
  • the first activation input can be different from the second activation input.
  • the process can include (i) receiving a third user input from the user, the third user input indicative of selection of an element of a user interface displayed on the display, (ii) in response to determining that the third user input matches a configuration selection input, displaying information corresponding to the third user input in the configuration utility interface, the configuration utility interface shown in juxtaposition to the user interface on the display, and (iii) in response to determining that the third user input does not match the configuration selection input, displaying information corresponding to the third user input in the user interface.
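The branching on the third user input described above can be sketched as a small dispatch function. The long-press threshold, the registry shape, and the `ui` callbacks are assumptions for illustration only.

```javascript
// Sketch of the third-input branching: while the configuration utility is
// shown, a touch matching the configuration selection input (assumed here
// to be a long press) surfaces the element's tracking identifier; any
// other touch falls through to the host application's normal handling.
function handleElementTouch(element, touch, trackingRegistry, ui) {
  const isConfigSelection = touch.durationMs >= 500; // assumed selection input
  if (isConfigSelection && trackingRegistry.has(element.id)) {
    ui.showTrackingInfo(element.id, trackingRegistry.get(element.id));
    return "inspected";
  }
  ui.navigate(element.target); // normal in-app behavior
  return "navigated";
}
```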
  • the process can include, in response to determining that the second user input has not been received within a timeout period, stopping said listening for the second input to the physical computing device using the confirmation instructions.
  • the configuration information can denote elements of a user interface to be tracked as the user interacts with the user interface.
  • the process can include transmitting, to a tracking server, data indicative of interactions of the user with the elements of the user interface denoted by the configuration information.
  • the elements of the user interface denoted by the configuration information can include links displayed in the user interface.
  • the configuration utility interface can be configured to display whether elements of a user interface are trackable as the user interacts with the user interface.
  • the configuration utility interface can be usable by the user to change the configuration information stored on the configuration information server when the user is an authenticated user.
  • the second user input can be an input indicative of consecutive taps on the display by the user.
  • the computer-executable instructions can include user interface instructions for displaying a user interface and configuration utility instructions for displaying the configuration utility interface, the confirmation and configuration utility instructions including third-party developed computer-executable instructions, the user interface instructions including first-party developed computer-executable instructions.
  • the configuration utility interface can be configured for use by an administrator of the computer-executable instructions, and the user interface can be configured for use by an end user of the computer-executable instructions.
  • a system for presenting information regarding elements of a host application can include a memory and a processor.
  • the memory can be configured to store a host application
  • the hardware processor can be configured to communicate with the memory.
  • the hardware processor configured to: execute the host application; listen for a motion input; in response to determining that the motion input matches an expected motion input, listen for a user input received before an end of a timeout period; and in response to determining that the user input matches an activation input, invoke an operation module.
  • the expected motion input can be different from the activation input.
  • the system of the preceding paragraph can further include one or more of the following features:
  • the processor can be configured to: in response to determining that a second user input matches a configuration selection input, process the second user input using the configuration utility; and in response to determining that the second user input does not match the configuration selection input, not process the second user input using the configuration utility.
  • the determination of whether the motion input matches the expected motion input and the determination of whether the user input matches the activation input are configured to provide a confirmation that a user intends to activate the configuration utility so that an end user of the host application does not accidentally encounter the configuration utility during routine use of the host application.
  • a system for providing access to a tag management application can include a mobile device.
  • the mobile device can include a processor and a memory device.
  • the memory device can be configured to store at least a tag management application and a gesture-to-display module.
  • the gesture-to-display module configured, when executed by the processor, to: listen for a shake gesture corresponding to a user shaking the mobile device; in response to identifying the shake gesture, determine whether a predetermined interaction with the mobile device has occurred; and in response to determining that the predetermined interaction with the mobile device has occurred, invoke the tag management application.
  • the system of the preceding paragraph can further include one or more of the following features:
  • the gesture-to-display module can be configured to listen for the shake gesture by hooking into a gesture application programming interface (API) of a host application stored in the memory.
  • the gesture-to-display module can be configured to output an invisible overlay over a host application interface.
  • the gesture-to-display module can be configured to detect screen activity via the invisible overlay to determine whether the predetermined interaction with the mobile device has occurred.
  • the predetermined interaction can include one or both of taps and swipes.
  • the gesture-to-display module can be configured to determine whether a predetermined interaction with the mobile device has occurred by activating a voice detection module of the mobile device to listen for a voice command.
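The invisible-overlay approach above can be sketched as an event interceptor that watches for the predetermined interaction while forwarding every event to the host interface so the application behaves normally. The event shape and the tap/tap/swipe pattern are assumptions of this sketch.

```javascript
// Sketch of the invisible overlay: it observes raw touch events to detect
// a predetermined interaction (assumed here to be tap, tap, swipe) while
// passing each event through to the host application interface unchanged.
class InvisibleOverlay {
  constructor(host, onMatch, pattern = ["tap", "tap", "swipe"]) {
    this.host = host;       // underlying host application interface
    this.onMatch = onMatch; // e.g., invoke the tag management application
    this.pattern = pattern; // assumed predetermined interaction
    this.seen = [];
  }

  handleEvent(event) {
    this.seen.push(event.type);
    if (this.seen.length > this.pattern.length) this.seen.shift();
    if (
      this.seen.length === this.pattern.length &&
      this.pattern.every((t, i) => this.seen[i] === t)
    ) {
      this.seen = [];
      this.onMatch();
    }
    this.host.handleEvent(event); // pass-through: the overlay stays invisible
  }
}
```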
  • a system including a processor and a memory device.
  • the memory device can be configured to store at least a first application and a gesture-to-display module.
  • the gesture-to-display module configured, when executed by the processor, to: listen for a shake gesture corresponding to a user shaking the mobile device; in response to identifying the shake gesture, determine whether a predetermined interaction with the mobile device has occurred; and in response to determining that the predetermined interaction with the mobile device has occurred, invoke the first application.
  • the system of the preceding paragraph can further include one or more of the following features:
  • the gesture-to-display module can be configured to listen for the shake gesture by hooking into a gesture application programming interface (API) of a host application stored in the memory.
  • the gesture-to-display module can be configured to output an invisible overlay over a host application interface.
  • the gesture-to-display module can be configured to detect screen activity via the invisible overlay to determine whether the predetermined interaction with the mobile device has occurred.
  • the predetermined interaction can include one or both of taps and swipes.
  • the gesture-to-display module can be configured to determine whether a predetermined interaction with the mobile device has occurred by activating a voice detection module of the mobile device to listen for a voice command.
  • a method is disclosed.
  • the method can be performed under control of a computing device comprising a processor.
  • the method can include: listening for a shake gesture corresponding to a user shaking the computing device; in response to identifying the shake gesture, determining whether a predetermined interaction with the computing device has occurred; and in response to determining that the predetermined interaction with the computing device has occurred, invoking the first application.
  • the listening for the shake gesture can include hooking into a gesture application programming interface (API) of a host application.
  • the method can include outputting an invisible overlay over a host application interface.
  • the method can include detecting screen activity via the invisible overlay to determine whether the predetermined interaction with the computing device has occurred.
  • the predetermined interaction can include one or both of taps and swipes.
  • the method can include determining whether a predetermined interaction with the computing device has occurred by activating a voice detection module of the computing device to listen for a voice command.
  • FIGURE 1 illustrates an example system on which may be implemented various embodiments of methods in accordance with the disclosure.
  • FIGURE 2 illustrates a device configuration for a portable device on which may be implemented various embodiments of systems and methods in accordance with the disclosure.
  • FIGURE 3 illustrates a host system configuration on which may be implemented various embodiments of systems and methods in accordance with the disclosure.
  • FIGURE 4 illustrates an example application architecture for the tag management system in accordance with the disclosure.
  • FIGURE 5 depicts an embodiment of a computing environment that provides access to an analytics system, a business intelligence system, and tag vendor systems.
  • FIGURE 6 depicts an embodiment of a native application configuration update process.
  • FIGURE 7 depicts an embodiment of a configuration utility activation process.
  • FIGURE 8 depicts an embodiment of a configuration utility activation process with user authentication.
  • FIGURE 9 depicts an embodiment of a configuration utility operation process.
  • FIGURES 10A-C and 11 depict embodiments of native application interfaces.
  • tags are commonly used to track user interactions with web sites
  • digital marketing users may also desire to manage tracking of end user interactions in native applications, including both desktop and mobile applications.
  • a native application can include links, images, or the like that may be viewed or selected by an end user of the native application.
  • a digital marketing user thus can beneficially control the gathering of information about the views or selections by the end user using a tag management system to collect information useful in making business decisions related to the native application or other promoted content.
  • a native application can be a locally installed application and deployed as a precompiled unit of executable code or executable code compiled at run-time
  • digital marketing users or other marketing users may have difficulty modifying configurations for tracking of user interactions and events that are coded in the native application after the native application has been developed.
  • some native applications, such as applications for mobile devices, may require advance approval by an organization before updates to the native applications can be released to the end users, thus further slowing the release of modifications to the configuration of the native application.
  • Digital marketing users and other marketing users additionally may desire to view tag information or manage tags associated with native applications using an easy to access and intuitive interface.
  • One such interface can be a user interface of a native application itself.
  • the user interface of the native application can desirably present tag information or enable management of tags in juxtaposition to, overlaid on, or otherwise together with the end user interface for the digital marketing users and other marketing users.
  • the digital marketing users and other marketing users may thus understand or control information relevant to end user interactions with the native application using a view similar to that of an end user of the native application.
  • the functionality to view tag information and manage the tags may desirably be unobtrusive and hidden from the end user. Hiding this functionality from the end user can be challenging, though, since usable space or features for hiding the functionality can be limited in some native application environments, such as applications for mobile devices.
  • a native application can be deployed that may obtain some configuration information for the native application at run-time.
  • Digital marketing users or other marketing users can then view or control the behavior of the native application by displaying or setting the configuration information of the native application obtained at run-time.
  • the native application can report tracked end user interactions and events in accordance with the configuration information to tag management systems for data compilation by the tag management systems.
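The run-time configuration flow above can be sketched as follows: configuration obtained at startup denotes which elements to track, and only interactions with those elements are reported. The config shape and the `sendBeacon` callback are assumptions standing in for a real fetch from a configuration server and a real report to a tag management system.

```javascript
// Sketch of configuration-driven tracking in a native application. The
// config object would be obtained at run-time (e.g. fetched at startup);
// report() forwards only interactions the configuration denotes.
function createTracker(config, sendBeacon) {
  const tracked = new Set(config.trackedElements);
  return {
    report(elementId, eventType) {
      if (!tracked.has(elementId)) return false; // not denoted for tracking
      sendBeacon({ elementId, eventType });      // e.g. POST to a tag server
      return true;
    },
  };
}
```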
  • the native application can enable digital marketing users or other marketing users to view or control the behavior of the native application from within the native application using an integrated configuration utility.
  • the configuration utility can be activated using a two-stage activation process or an activation and authorization process to prevent an end user from accidentally encountering the configuration utility during routine use of the native application.
  • the configuration utility can be considered an Easter egg since the two-stage activation process can activate a dormant configuration utility for the native application in response to one or more secret input commands.
  • the term "native application,” in addition to having its ordinary meaning, can refer to an application other than a web application or to an application that is not implemented entirely in a web browser.
  • the native application may be a hybrid native/web application in an embodiment.
  • a tag management system can enable companies to improve the way they manage the tags (sometimes referred to as pixels) which can be used on their web properties for an increasingly broad range of digital marketing technologies and vendors, ranging from site analytics and affiliate marketing to multivariate testing and retargeting tools. Waiting for IT departments to implement tags may often be a barrier to marketing agility, taking up tech bandwidth that could be more productively used on other areas of website development. Tag implementations are often incomplete and hard to keep track of. As such, digital marketing users want to be more self-sufficient in their management of these tags, so that they can remove the IT bottleneck and call on tech teams only for more value-adding improvements to their web properties.
  • Tag management systems can enable the placement of a JavaScript snippet on website pages. That code snippet may replace the tags that would otherwise have been individually deployed.
  • coding lines of HTML and JavaScript on pages can be replaced by a web interface where vendors, actions, and pages are unified and controlled.
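The single-snippet pattern above can be sketched as one configuration-driven loader that replaces individually hard-coded vendor tags. The config shape and the `injectScript` helper are assumptions; in a browser, `injectScript` would append a `<script>` element for each enabled vendor.

```javascript
// Sketch of a universal tag snippet: one loader, driven by configuration
// managed in the web interface, loads only the enabled vendor tags.
function loadVendorTags(config, injectScript) {
  const loaded = [];
  for (const tag of config.tags) {
    if (!tag.enabled) continue; // vendors can be toggled without code changes
    injectScript(tag.src);
    loaded.push(tag.name);
  }
  return loaded;
}
```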
  • a tag management solution may offer support for managing the tagging and/or tracking of individual elements of a web page, such as link clicks, image clicks, etc. This can often be provided via a tool which offers a number of convenience features for discovering the elements of the webpage which are desirable for tracking.
  • Such tools may not be a feature of the webpage itself, but rather, an external tool that can be capable of interacting with the webpage.
  • the methods of enabling these tools vary, but may utilize well-understood browser capabilities.
  • One such capability can be known as a "bookmarklet.”
  • a bookmarklet is a web browser bookmark item that can be capable of injecting content into the visible webpage, effectively integrating into the webpage in order to offer the aforementioned features.
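A bookmarklet of this kind can be as simple as a bookmark whose URL is a `javascript:` program that injects the tool's script into the current page. The tool URL below is a placeholder, not a real endpoint.

```javascript
// Minimal bookmarklet sketch: saved as a bookmark, clicking it injects a
// (hypothetical) element-picker script into the page being viewed.
const bookmarklet =
  "javascript:(function(){" +
  "var s=document.createElement('script');" +
  "s.src='https://example.com/element-picker.js';" + // placeholder tool URL
  "document.body.appendChild(s);" +
  "})();";
```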
  • the native mobile application environment brings a number of challenges for any third-party vendor desiring a similar level of integration. These challenges can exist in part because the delivery platform may not be a web browser. It is the web browser platform that, in some cases, can allow these aforementioned graceful, on-demand integrations, such as bookmarklets.
  • the native mobile application environment may offer no such on-demand integration capabilities.
  • Native applications can be deployed as precompiled units of executable code. It may be within the source code of these applications and at development time (versus on-demand) when these tools can be integrated. The challenge can then become ensuring the desired third-party vendor tool may be accessible to the application's digital marketer or marketing user (who can be managing the configuration of the application) while not accessible to the general user audience, and without the need of source code modification.
  • Systems and methods in accordance with some embodiments of the disclosure provide a means for a manager of a native mobile application to activate the user interface of a third party tool. This may be done without exposing the capability through a conventional user interface. In the general case, this can be useful since the desired third-party tool may not be intended to be utilized by the general user audience, but rather by the application manager.
  • an activation process can utilize a combination of phone shake gestures and interactions, such as taps or voice.
  • the third-party tool's user interface can be revealed to the user.
  • the sequence required may be complex enough to ensure no accidental activation occurs by a general user. This can be further secured with the use of a server-side mechanism in some implementations.
  • FIGURE 1 illustrates a system 100.
  • System 100 includes one or more portable or mobile devices 110 (also denoted herein for brevity as “devices 110”) such as cellular phones, PDAs, Wi-Fi (802.11) devices, or other portable devices.
  • the device may not be portable and the functionality herein may be implemented on more stationary devices, such as desktop or notebook computers or other types of fixed devices.
  • portable devices as described herein may include other types of devices that are mobile but may not be portable.
  • System 100 further includes a host processing system 140 (also denoted herein as “host system 140”) comprising one or more servers as well as other associated computer and data processing hardware (not shown in FIGURE 1) such as networking equipment, displays, monitors, I/O devices or other computer or data communication systems, hardware and/or software.
  • host system 140 may be provided by or operated by an associated host services company or host services supplier.
  • host system 140 includes one or more servers 370 that include one or more databases 390 (as shown in FIGURE 3) either internal or external to the servers 370. These databases may be used to store advertisements and data such as is further described herein. Host system 140 may also include one or more operating systems 362 associated with the servers, as well as one or more application programs to implement the various host service functionality as is described further herein. Host system 140 may be implemented at a centralized physical location, such as a network connected server farm or other similar facility, and/or may comprise a plurality of distributed servers connected by any of a variety of networking connections at different physical locations.
  • Network 130 may include wired or wireless networking elements such as Ethernet, LAN technologies, telephony networks, such as POTS phone networks, cellular networks, data networks, or other telephony networks as well as Wi-Fi or Wi-Max networks, other wired or wireless Internet network connections and/or other networks as are known or developed in the art.
  • a memory 260 of the device 110 may be provided with a tag management application or applications 264, and a gesture-to-display module 266 as shown in FIGURE 2 that may be installed on the user's device 110.
  • the tag management application 264 and gesture-to-display module 266 may be installed on a ROM (read only memory) 230 at a factory, thereby eliminating the need for the user to download the client application 264.
  • the user may be supplied with the client application 264 on a computer media such as a CD or DVD, a thumb drive, or via other media known or developed in the art.
  • FIGURE 2 illustrates additional details of an example configuration of a portable device 110 with example device elements that may be used to implement embodiments of the systems and methods in accordance with the disclosure.
  • device 110 may include one or more processors (CPUs) 210, which can include one or more specialized or dedicated portable device microprocessors or microcontrollers, an input/output device module 220 configured to allow users to input and output information and interact with applications installed on the device 110, such as the tag management application 264, as well as transfer and receive advertising data, one or more read only memory (ROM) devices 230 or equivalents to provide non-volatile storage of data and/or application or operating system programs, one or more display modules 250, such as an LCD or equivalent display device, as well as one or more memory spaces 260.
  • the device 110 also can include an acceleration sensor 225 that can be activated by a shaking motion in order to activate the tag management application 264 as described herein.
  • Memory space 260 may comprise DRAM, SRAM, FLASH, hard disk drives or other memory storage devices configured to store and access operating systems 262, the tag management application 264 and/or data 268.
  • Data 268 may include information such as advertisements received from an advertising source system.
  • FIGURE 3 illustrates additional details of one example of a host system 140 with example device elements that may be used to implement embodiments of the present disclosure.
  • host system 140 may include one or more processors (CPUs) 310, an input/output device module 320 configured to allow users to input and output information and interact with the host system 140 as well as transfer and receive data, one or more read only memory (ROM) devices 330 or equivalents to provide nonvolatile storage of data and/or programs, one or more display modules 350 such as a computer monitor or other display device, one or more network connections 340 and associated network interfaces 342 configured to allow host system 140 to connect to other systems, servers and/or portable devices, including other elements of system 140 in embodiments where the servers or other components are distributed at other physical locations, as well as one or more memory spaces 360 and one or more databases 390.
  • Database(s) 390 may be further divided or distributed as one or more sub-databases 390a-390n, with the sub-databases storing feature or function specific information associated with a particular feature or function.
• the various components shown in FIGURE 3 may be incorporated in one or more physical servers 370 comprising part of host system 140. It is noted that the various components shown in FIGURE 3, including database 390, are typically included as part of server(s) 370; however, they may be external to server(s) 370 in some embodiments. For example, in some embodiments database(s) 390 may be external to server(s) 370 and may comprise part of a separate database server system or networked database system.
  • Memory space 360 may comprise DRAM, SRAM, FLASH, hard disk drives or other memory storage devices, such as media drives 380, configured to store operating systems, application programs and/or data, and memory space 360 may be shared with, distributed with or overlap with the memory storage capacity of database 390.
  • memory space 360 may include database 390 or in some embodiments database 390 may include data 368 as shown in memory space 360.
  • Data stored in memory space 360 and/or database 390 may include information, such as tag management system information or other types of data.
  • memory space 360 may include a host system application or applications 364 stored in the memory space for execution on CPU 310 to perform the various host-side functionality described herein.
• the host application's source code is compiled with a third-party tool's source code that includes an integrated third-party library 415;
• This third-party tool's source code includes the source code 420 for at least performing the activation functionality described in this disclosure (the gesture-to-configure functionality).
• the third-party tool can exist within the host application in order to provide functionality intended to be utilized by the application's developer or manager rather than the general user audience.
• a gesture-to-display code module 266 can be utilized.
• the gesture-to-display module 266 can operate as follows:
  • the sequence required to activate can be determined by the gesture-to-display module 266. This determination may be accomplished through any of the following means, depending on the specific implementation:
  • a server call can be made to retrieve the activation configuration and sequence requirements
• the gesture-to-display module 266 can then "hook" into the host application's gesture API. This is a standard API provided by most modern smartphones or other devices. This "hook" can allow the gesture-to-display module 266 to listen for the expected "shake" gestures;
• Upon detecting the proper "shake" gesture, the gesture-to-display module 266 can then activate the next step of validation. This step can be used to prevent accidental activations that may otherwise occur if only relying on shake activity. Any of the following can comprise this next activation step:
  • the gesture-to-display module 266 can place an invisible overlay over the entirety or some portion of the host application's user interface.
  • This overlay can exist to "catch" screen activity, such as taps and swipes.
  • This overlay can exist for a brief amount of time, such as a few seconds, and
• a voice detection module of the host device can be activated, allowing it to listen for the proper voice command to activate.
• the gesture-to-display module 266 can then conclude that the intent of the user may be to activate the hidden tool, and display its user interface to the user.
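The flow above can be sketched as a small state machine: a hook counts "shake" gestures, and once the required number has been seen, an invisible overlay is armed for a few seconds to catch a confirming tap. The class, method, and constant names below are illustrative assumptions, not from any platform API; a real implementation would register these handlers with the device's gesture API rather than receive calls directly.

```python
import time

SHAKES_REQUIRED = 3      # e.g. three consecutive shakes (illustrative value)
CONFIRM_WINDOW_S = 5.0   # the overlay "catches" screen activity only briefly

class GestureToDisplayModule:
    def __init__(self, now=time.monotonic):
        self._now = now                 # injectable clock for testing
        self._shake_count = 0
        self._confirm_deadline = None   # set while the overlay is armed
        self.tool_visible = False

    def on_shake(self):
        """Called by the gesture API hook for each detected shake."""
        self._shake_count += 1
        if self._shake_count >= SHAKES_REQUIRED:
            # First stage satisfied: arm the invisible overlay to catch
            # taps or swipes for a brief amount of time.
            self._confirm_deadline = self._now() + CONFIRM_WINDOW_S
            self._shake_count = 0

    def on_overlay_tap(self):
        """Called when the invisible overlay catches a tap or swipe."""
        if self._confirm_deadline is None:
            return  # overlay not armed; ignore stray taps
        if self._now() <= self._confirm_deadline:
            # Second stage satisfied: reveal the hidden tool's interface.
            self.tool_visible = True
        self._confirm_deadline = None
```

In practice, `on_shake` would be wired to the platform's shake callback and `on_overlay_tap` to the transparent overlay view; taps arriving after the window expires leave the host application undisturbed.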
• Appendix A includes a code snippet, for inclusion within an implementation of the gesture-to-display module 266, which can be capable of detecting such sequences of gestures made with respect to devices based upon the Apple™ iOS™ platform.
• the functionality of this code snippet may be implemented on other platforms as well, including the Google™ Android™ platform, for example.
• some embodiments in accordance with the disclosure may include computer software and/or computer hardware/software combinations configured to implement one or more processes or functions, such as those described herein and/or in the related applications. These embodiments may be in the form of modules implementing functionality in software and/or hardware/software combinations. Embodiments may also take the form of a computer storage product with a computer-readable medium having computer code thereon for performing various computer-implemented operations, such as operations related to functionality as described herein.
  • the media and computer code may be those specially designed and constructed for the purposes of performing functionality described herein, or they may be of the kind well known and available to those having skill in the computer software arts, or they may be a combination of both.
  • Examples of computer-readable media within the spirit and scope of the present disclosure include, but are not limited to: magnetic media such as hard disks; optical media such as CD-ROMs, DVDs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store and execute program code, such as programmable microcontrollers, application- specific integrated circuits ("ASICs"), programmable logic devices ("PLDs”) and ROM and RAM devices.
  • Examples of computer code may include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter.
  • Computer code may be comprised of one or more modules executing a particular process or processes to provide useful results, and the modules may communicate with one another via means known in the art.
  • some embodiments may be implemented using assembly language, Java, C, C#, C++, or other programming languages and software development tools as are known in the art.
  • Other embodiments of the disclosure may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions.
• Referring to FIGURE 5, an embodiment of a computing environment 500 is shown for implementing various tag features, including some or all of the tag management features described above.
• one or more end user systems 502 such as end user systems 502-1, 502-2, 502-3, can communicate over a network 508 with a configuration information server 510 and an analytics system 530.
  • the end user system 502 can include any form of computing device and may be a desktop, laptop, smartphone, tablet, a virtualization of a smartphone or tablet, or the like.
  • the end user system 502 can further include a native application 503B, which can provide application content or functionality to the end user system 502, and a user input 503A that can receive user inputs for interacting with the native application 503B.
  • the native application 503B can present a user interface on a display and can request and receive configuration information from the configuration information server 510 for controlling or adjusting operation of the native application 503B.
  • the configuration information can include a directives file, such as a JavaScript file, loadable by the native application 503B.
  • the configuration information server 510 can be a server implemented in computer hardware and/or software and managed by the provider of the native application 503B or the provider of the analytics system 530, for example.
• the network 508 can include a local area network (LAN), a wide area network (WAN), a company intranet, the public Internet, combinations of the same, or the like.
  • the analytics system 530 is shown in communication with the configuration information server 510.
  • the analytics system 530 can be implemented in computer hardware and/or software.
• the analytics system 530 may be implemented in physical and/or virtual servers, which may be geographically dispersed or co-located.
  • the analytics system 530 includes the user processing system 540 and the tag management system 550, as well as a user profile data repository 560.
  • the user processing and tag management systems 540, 550 are shown separately for illustrative purposes, although their functionality may be implemented by a single system.
  • the analytics system 530 can also be implemented without the tag management system 550, and thus, the functionality of the user processing system 540 can be implemented independent of any tag management functionality. Further, the analytics system 530 can be implemented without the user processing system 540, and thus, the functionality of the tag management system 550 can be implemented independent of any user processing functionality.
  • One or more marketing user systems 504 can access the configuration information server 510, analytics system 530, or business intelligence system 580 via the network 508.
  • the marketing user system 504 can include a native application 505B, which can provide application content or functionality to the marketing user system 504, and a user input 505A that can receive user inputs for interacting with the native application 505B.
  • the marketing user system 504 can also be any type of computing device including, but not limited to, a desktop, laptop, tablet, smartphone, a virtualization of a smartphone or tablet, or the like.
  • the native application 505B can be a different instance of the same or a similar application as the native application 503B.
  • the user input 505A can include one or more of a motion sensor, touch screen sensor, microphone, button, or the like to receive user inputs.
  • the marketing user system 504 further can include a browser 505C.
  • the browser 505C or a configuration utility of the native application 505B can be used to access or change the configuration information stored on the configuration information server 510 via the analytics system 530.
• although the marketing user system 504 is illustrated as having both the native application 505B and browser 505C, some marketing user systems 504 may not include the native application 505B or browser 505C, depending on the implementation.
  • the marketing user system 504 can be operated by marketing users, such as digital marketing professionals, business users, providers of the native application 503B, or any other individual who uses tags or data obtained from tags. Marketing users may not be the primary intended end users of the native applications 503B, 505B in certain embodiments. Instead, a marketing user may use the marketing user system 504 to dynamically view or update the types of data tracked or analyzed for different users of the native application 503B. This data can be tracked by the user processing system 540 via updating the configuration information stored in the configuration information server 510 or updating processing by the user processing system 540 of data obtained from the native application 503B to build updated user profiles 560. In addition, marketing users can access the information stored in the business intelligence system 580 to obtain an understanding of particular end user system 502 for purposes such as evaluating the effectiveness of various marketing campaigns, for instance.
  • the user processing system 540 can enable marketing users to configure the types of data tracked for different users of a native application 503B, as well as analyze and report on this user data.
• the user processing system 540 can provide one or more user interfaces via the browser 505C that enable customization of collecting information about users of the native application 503B.
• the native application 503B can supply user data to the analytics system 530 (optionally through the configuration information server 510).
  • Such user data can be stored in user profiles in the user profile data repository 560, which may include physical computer storage. Marketing users can subsequently query the user profiles to obtain reports or other information about users of the native application 503B.
  • the tag management system 550 can be used to manage the tags provided by third-party vendors.
• the tag management system 550 can provide functionality for marketing users to select which third-party vendor tags to associate with a native application for a variety of vendor-specific processing purposes. These purposes can include obtaining analytics for data analysis or business intelligence, tracking affiliate activity with respect to the native application, obtaining user data for displaying targeted ads, obtaining user data for customizing search functionality or email campaigns targeted to the end users, obtaining user data for personalizing content of the native application, obtaining user data for integration with social networking functionality, obtaining user data for big data analysis, combinations of the same, or the like.
  • Tags for any of these vendor- specific processing purposes, among others, can be considered digital marketing tags.
  • Data collected by the tags from the native application 503B can be provided to tag vendor systems 570, which can perform any of this vendor-specific processing. The data or related data may additionally or alternatively be passed to the tag vendor systems 570 through the tag management system 550.
• the tag management system 550 provides functionality (such as one or more user interfaces through the browser 505C) for marketing users to map data sources in the native application 503B to data sources gathered by the tags. For instance, if a native application includes a shopping cart value named "cart value," the tag management system can provide a user interface that enables a user to tell the tag management system 550 to collect data on the "cart value" and map this data to a "cart_value" variable of one of the tags.
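The "cart value" example can be illustrated with a minimal mapping step, assuming a simple dictionary-based data layer; the mapping table and data names below are hypothetical, standing in for what a marketing user would configure through the tag management interface.

```python
# Hypothetical mapping from the native application's data-source names to
# the vendor-specific variable names expected by a tag.
variable_map = {
    "cart value": "cart_value",
    "page name": "page_name",
}

def map_data_layer(app_data, mapping):
    """Translate collected first-party data elements into a tag's
    variables, dropping anything the marketing user has not mapped."""
    return {mapping[key]: value
            for key, value in app_data.items()
            if key in mapping}
```

For example, `map_data_layer({"cart value": 42.5}, variable_map)` yields `{"cart_value": 42.5}`; unmapped elements are simply omitted, which mirrors the vendor-neutral gathering described below where no such mapping is needed.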
  • the tags can instead perform a greedy collection of some or all data available in the native application 503B. Since the tags and user processing system 540 can be provided by the same entity, the data obtained by the tags need not be mapped to third-party mappings like the data obtained by the third-party tags. Instead, some or all of the data available to the tags can be provided to the user processing system 540 for processing. Thus, the tags can facilitate vendor-neutral data gathering of some or all of the data elements in the native application 503B.
  • the data can be exported to business intelligence systems, such as business intelligence system 580, without a need to massage the data from its mapped form (which can be cumbersome) to its original, raw form.
  • the systems 540, 570 can provide the processed data to a business intelligence system 580, which may be owned, operated, or otherwise used by an operator of the native application 503B to analyze application user behavior.
  • the business intelligence system 580 can be implemented in computer hardware and/or software.
  • the business intelligence system 580 can receive the raw data or processed data from the systems 540, 570 and store and manage the data in a way that facilitates a meaningful presentation of information to those interested in the performance of the native application 503B.
  • the business intelligence system 580 is part of the user processing system 540 or the tag management system 550 rather than separate as illustrated in FIGURE 5.
• Referring to FIGURE 6, an embodiment of a native application configuration information update process 600 is shown.
  • the process 600 illustrates an example mode of operation of the computing environment 500 of FIGURE 5 and may be implemented by the various components shown in the computing environment 500 of FIGURE 5.
  • the process 600 is described in the context of the computing environment 500 but may instead be implemented by other systems described herein or other computing systems not shown.
  • the process 600 provides one example approach by which the end user system 502 or marketing user system 504 can obtain updated configuration information for the native application 503B or 505B upon start-up of the native application 503B or 505B.
  • the process 600 enables the marketing users to make digital marketing changes to the native application 503B without having to submit an updated version of the native application 503B to an application approval organization or without the end users having to re-download the updated version of the native application 503B.
  • the end user system 502 or marketing user system 504 can receive a command to begin executing the native application 503B or 505B.
  • the command can be received from an end user of the end user system 502 or marketing user of the marketing user system 504 via the user input 503A or 505A.
  • the native application 503B and 505B can include first-party code providing application functionality for the end user of the end user system 502.
  • the native application 503B and 505B can include integrated third-party code, such as third-party source code and/or a third-party code library.
  • the third-party code can perform functions intended to be hidden from the end user of the end user system 502.
• the end user system 502 or marketing user system 504 can request configuration information for the native application 503B or 505B from the configuration information server 510 via the network 508.
  • the native application 503B or 505B can cause the end user system 502 or marketing user system 504 to request the configuration information as part of a start-up process for the native application 503B or 505B.
• the end user system 502 or marketing user system 504 can receive the configuration information for the native application 503B or 505B from the configuration information server 510 via the network 508.
• the configuration information can include a directives file, such as a JavaScript file, loadable by the native application 503B or 505B as part of a start-up process for the native application 503B or 505B or usable as a reference file once the native application 503B or 505B may be running.
  • the configuration information can be stored to a memory of the end user system 502 or a memory of the marketing user system 504, in some implementations.
  • the configuration information can provide information on what to track within the native application 503B or 505B (for example, element identifiers associated with links or events in the native application 503B or 505B), how the information should be tracked, who should be tracked, or the like.
  • the configuration information can be provided by the provider of the analytics system 530 via the configuration information server 510.
  • the end user system 502 or marketing user system 504 can execute the native application 503B or 505B based at least on the configuration information from the configuration information server 510.
  • the end user system 502 or marketing user system 504 can activate listeners of the third-party code integrated in the native application 503B or 505B so that the activated listeners process user inputs to the user input 503A or 505A.
  • the activated listeners can include button listeners, scroll listeners, video tracking listeners, or the like, for instance.
  • Executing the native application 503B or 505B can cause the end user system 502 or marketing user system 504 to display an end user interface on a screen of the end user system 502 or marketing user system 504.
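The start-up sequence of process 600 can be sketched as follows. The directives-file fields and listener names are assumptions for illustration only, since the disclosure specifies just that the configuration identifies what to track, how it should be tracked, and who should be tracked.

```python
import json

# An illustrative directives file as it might arrive from the configuration
# information server (field names are hypothetical).
EXAMPLE_DIRECTIVES = json.dumps({
    "track_elements": ["buy_button", "promo_link"],
    "listeners": ["button", "scroll", "video"],
})

def fetch_configuration(fetch=lambda: EXAMPLE_DIRECTIVES):
    """Stand-in for the request to the configuration information server;
    the retrieved file can also be stored to the device's memory."""
    return json.loads(fetch())

def activate_listeners(config):
    """Name the third-party listeners to arm at start-up so they can
    process user inputs to the user input module."""
    return {name + "_listener" for name in config.get("listeners", [])}
```

Because the application fetches these directives each time it starts, a marketing user can change what is tracked without the application being resubmitted for approval or re-downloaded by end users.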
  • FIGURE 7 depicts an embodiment of a configuration utility activation process 700.
  • the process 700 illustrates an example mode of operation of the computing environment 500 of FIGURE 5 and may be implemented by the various components shown in the computing environment 500 of FIGURE 5. For convenience, the process 700 is described in the context of the computing environment 500 but may instead be implemented by other systems described herein or other computing systems not shown.
  • the process 700 provides one example approach by which the marketing user system 504 can activate a configuration utility as a result of user inputs by a marketing user.
  • the process 700 can provide a two-stage activation process to help prevent end users of the end user system 502 from accidentally encountering the configuration utility while operating the native application 503B.
  • the marketing user system 504 can execute the native application 505B based at least on the configuration information from the configuration information server 510.
  • the marketing user system 504 can activate listeners of the third-party code integrated in the native application 505B so that the activated listeners process user inputs to the user input 505A.
  • the activated listeners can include button listeners, scroll listeners, video tracking listeners, or the like, for instance.
  • Executing the native application 503B or 505B can cause the end user system 502 or marketing user system 504 to display an end user interface on a screen of the end user system 502 or marketing user system 504.
  • the configuration information can be obtained as described with respect to blocks 602, 604, and 606 of the process 600 of FIGURE 6. Additionally or alternatively, the configuration information can be obtained from configuration information loaded onto the marketing user system 504 by the marketing user or from configuration information previously loaded or saved by the native application 505B, for example.
  • the native application 505B can receive a user input via the user input 505A.
  • the user input 505A can receive inputs including one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, or a voice input to a microphone of the marketing user system 504.
  • the native application 505B can determine whether the user input received at block 704 matches an initial activation action.
• the initial activation action, in one implementation, can be three consecutive shakes of the marketing user system 504.
• if the native application 505B determines that the marketing user system 504 has been shaken three times, the native application 505B can determine that the user input matches the initial activation action.
  • the initial activation action can include one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, a voice input to a microphone of the marketing user system 504, or the like.
  • the initial activation action can be coded in the native application 505B or determined from the configuration information at run-time, for instance. In some embodiments, the native application 505B or the marketing user system 504 can provide feedback to indicate that the user input matches the initial activation action.
  • the feedback can include one or more or a combination of: a sound output from a speaker of the marketing user system 504, a vibration of the marketing user system 504, a translucent overlay on a screen (for instance, shadowing, coloring, bordering, opaquing, or gridding of one or more elements displayed on the screen) of the marketing user system 504, a dimming or brightening of at least part of a display on a screen of the marketing user system 504, a small icon displayed on a display (for instance, in a corner) on a screen of the marketing user system 504, or the like.
• if the native application 505B determines that the user input does not match the initial activation action, the process 700 moves to block 702 and the marketing user system 504 can continue execution of the native application 505B. On the other hand, if the native application 505B determines that the user input matches the initial activation action, the process 700 moves to block 708.
  • the native application 505B can activate a confirmation application to process subsequent user inputs.
  • the confirmation application can be a routine of the native application 505B that remains dormant during normal use by the end user but can activate upon satisfaction of the initial activation action.
  • the confirmation application for example, can be a part of the third-party code integrated in the native application 505B.
  • the confirmation application can divert or block some or all user inputs to the user input 505A from being processed by the first-party code of the native application 505B, such as by providing a visible or invisible overlay on the display of the marketing user system 504 to receive all touch inputs to the user input 505A.
  • the native application 505B or the marketing user system 504 can provide feedback to indicate that the confirmation application has been activated.
  • the feedback can include one or more or a combination of: a sound output from a speaker of the marketing user system 504, a vibration of the marketing user system 504, a translucent overlay on a screen (for instance, shadowing, coloring, bordering, opaquing, or gridding of one or more elements displayed on the screen) of the marketing user system 504, a dimming or brightening of at least part of a display on a screen of the marketing user system 504, a small icon displayed on a display (for instance, in a corner) on a screen of the marketing user system 504, or the like.
  • the native application 505B determines whether a timeout period for the confirmation application has expired.
  • the timeout period can be a relatively short duration (for instance, about 5, 10, or 20 seconds or more or less) during which the marketing user of the marketing user system 504 can enter another user input to confirm an intent to activate the configuration utility. If no user inputs are received during the timeout period, the process 700 moves to block 702 and the marketing user system 504 can continue normal execution of the native application 505B. On the other hand, if a user input is received during the timeout period, the process 700 moves to block 712.
  • the confirmation application can receive the user input via the user input 505A.
  • the user input 505A can receive inputs including one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, or a voice input to a microphone of the marketing user system 504.
  • the confirmation application can determine whether the user input received at block 712 matches a confirmation activation action.
• the confirmation activation action, in one implementation, can be three consecutive taps on a screen of the marketing user system 504.
  • the confirmation activation action can include one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a voice input to a microphone of the marketing user system 504, or the like.
  • the confirmation activation action can be coded in the native application 505B or determined from the configuration information at run-time, for instance. If the confirmation application determines that the user input does not match the confirmation activation action, the process 700 moves to block 702 and the marketing user system 504 can continue execution of the native application 505B. On the other hand, if the confirmation application determines that the user input matches the confirmation activation action, the process 700 moves to block 716.
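The matching steps of process 700 can be reduced to comparing the tail of the received input stream against a required sequence, which may be hard-coded in the application or taken from the configuration information at run-time; the list-of-strings input encoding below is an assumption for illustration.

```python
def matches_action(received, required):
    """True when the most recent user inputs end with the required
    activation sequence (e.g. three consecutive taps)."""
    n = len(required)
    return n > 0 and len(received) >= n and received[-n:] == required

# The required sequence could equally be loaded from the directives file
# at run-time rather than coded into the application.
confirmation_action = ["tap", "tap", "tap"]
```

A mismatch simply returns control to normal execution of the native application, so an end user who stumbles into the first stage is unlikely to reach the configuration utility.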
  • the native application 505B can display the configuration utility.
  • the configuration utility can be a routine of the native application 505B that remains dormant during normal use by the end user but can activate upon satisfaction of the initial and confirmation activation actions.
  • the configuration utility can be used by the marketing user to view the configuration information from the configuration information server 510 in juxtaposition to, overlaid on, or otherwise together with the end user interface of the native application 503B.
  • the configuration utility can be a part of the third-party code integrated in the native application 505B and present the configuration information about what tags are being leveraged at that moment by the first-party code of the native application 505B, what elements of the first-party code user interface can be tagged, or how to identify an element of the first-party application for tagging purposes, or the like.
  • the configuration utility can be used by the marketing user to change the configuration information stored at the configuration information server 510, such as to enable or remove tracking for one or more elements of the native application 503B.
• the configuration utility can provide read-only access to the configuration information since the initial and confirmation activation actions may not be considered sufficient security protections to permit editing of the configuration information stored in the configuration information server 510.
  • the configuration utility can provide read and write access to the configuration information.
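Whether the activated utility exposes read-only or read-and-write access can itself be treated as a policy decision, as sketched below; the policy field and authentication flag are hypothetical names, not part of the disclosure.

```python
def utility_access(config, authenticated=False):
    """Default to read-only access, since the activation gestures alone
    may not be sufficient security to permit edits; grant write access
    only under an explicit policy or after a stronger authentication
    step (as in process 800)."""
    if authenticated or config.get("allow_write_via_gesture", False):
        return "read-write"
    return "read-only"
```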
  • the native application 505B can additionally or alternatively invoke a particular routine or operation module.
  • the routine or operation module can include, for example, a function, method, script, or the like.
  • FIGURE 8 depicts an embodiment of a configuration utility activation process 800 with user authentication.
  • the process 800 illustrates an example mode of operation of the computing environment 500 of FIGURE 5 and may be implemented by the various components shown in the computing environment 500 of FIGURE 5.
  • the process 800 is described in the context of the computing environment 500 but may instead be implemented by other systems described herein or other computing systems not shown.
  • the process 800 provides one example approach by which the marketing user system 504 can activate a configuration utility as a result of a user input and user authentication by the marketing user.
  • the marketing user system 504 can execute the native application 505B based at least on the configuration information from the configuration information server 510.
  • the marketing user system 504 can activate listeners of the third-party code integrated in the native application 505B so that the activated listeners process user inputs to the user input 505A.
  • the activated listeners can include button listeners, scroll listeners, video tracking listeners, or the like, for instance.
  • Executing the native application 503B or 505B can cause the end user system 502 or marketing user system 504 to display an end user interface on a screen of the end user system 502 or marketing user system 504.
  • the configuration information can be obtained as described with respect to blocks 602, 604, and 606 of the process 600 of FIGURE 6.
  • the native application 505B can receive a user input via the user input 505A.
  • the user input 505A can receive inputs including one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, or a voice input to a microphone of the marketing user system 504.
  • the native application 505B can determine whether the user input received at block 804 matches activation instructions.
  • the activation instructions in one implementation can be three consecutive shakes of the marketing user system 504.
  • when the native application 505B determines that the marketing user system 504 has been shaken three consecutive times, the native application 505B can determine that the activation instructions have been received.
  • the activation instructions can be determined to be received in response to sensing one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a voice input to a microphone of the marketing user system 504, or the like.
  • the activation instructions can be coded in the native application 505B or determined from the configuration information at run-time, for instance.
  • the native application 505B or the marketing user system 504 can provide feedback to indicate that the user input matches the activation instructions.
  • the feedback can include one or more or a combination of: a sound output from a speaker of the marketing user system 504, a vibration of the marketing user system 504, a translucent overlay on a screen (for instance, shadowing, coloring, bordering, opaquing, or gridding of one or more elements displayed on the screen) of the marketing user system 504, a dimming or brightening of at least part of a display on a screen of the marketing user system 504, a small icon displayed on a display (for instance, in a corner) on a screen of the marketing user system 504, or the like.
  • if the native application 505B determines that the activation instructions have not been received, the process 800 moves to block 802 and the marketing user system 504 can continue execution of the native application 505B. On the other hand, if the native application 505B determines that the activation instructions have been received, the process 800 moves to block 808.
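The block 806 determination, in which a "three consecutive shakes" activation input is matched against accelerometer samples, can be sketched as follows (in Python rather than the Objective-C used elsewhere in this document; the threshold, sample format, and function names are illustrative assumptions, not the patent's implementation):

```python
# Illustrative sketch of matching a "three consecutive shakes" activation
# input (blocks 804-806). The 1.5 threshold and names are assumptions.

SHAKE_THRESHOLD = 1.5   # minimum per-axis acceleration delta to count as a shake
SHAKES_REQUIRED = 3     # activation instructions: three consecutive shakes

def is_shake(last, current, threshold=SHAKE_THRESHOLD):
    """Compare consecutive accelerometer samples, in the spirit of the
    deltaX/deltaY/deltaZ fragments in this document, and report whether
    any axis moved enough to count as one shake."""
    delta_x = abs(last[0] - current[0])
    delta_y = abs(last[1] - current[1])
    delta_z = abs(last[2] - current[2])
    return max(delta_x, delta_y, delta_z) > threshold

def matches_activation_instructions(samples):
    """Return True once three consecutive shakes are observed in a
    stream of (x, y, z) accelerometer samples."""
    shakes = 0
    for last, current in zip(samples, samples[1:]):
        if is_shake(last, current):
            shakes += 1
            if shakes >= SHAKES_REQUIRED:
                return True
        else:
            shakes = 0  # shakes must be consecutive
    return False
```

In practice the sampling rate and threshold would be tuned so that ordinary handling of the device does not trigger the activation input.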
  • the native application 505B can request a user authentication from the marketing user.
  • the native application 505B can request that the marketing user, for example, provide a username, a password, a fingerprint, or the like via the user input 505A.
  • the native application 505B can receive the user authentication from the user input 505A.
  • the native application 505B can determine whether the user authentication is confirmed to match the authentication for a marketing user of the native application 505B. If the user authentication is not confirmed to match the authentication for a marketing user of the native application 505B, the process 800 moves to block 802 and the marketing user system 504 can continue execution of the native application 505B. On the other hand, if the user authentication is confirmed to match the authentication for a marketing user of the native application 505B, the process 800 moves to block 814.
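The block 812 decision can be sketched as a small routing function (in Python; the block numbers follow the process 800 description, while the credential comparison and names are illustrative assumptions):

```python
# Sketch of the block 812 decision: compare the supplied credential
# against the stored marketing-user credential and choose the next block.
import hmac

def next_block_after_authentication(provided_password, expected_password):
    """Return 814 (display the configuration utility) when the user
    authentication is confirmed, else 802 (continue normal execution)."""
    # A constant-time comparison avoids leaking credential information
    # through timing; a real implementation might instead verify a
    # fingerprint or a server-side session.
    if hmac.compare_digest(provided_password, expected_password):
        return 814
    return 802
```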
  • the native application 505B can display the configuration utility.
  • the configuration utility can be a routine of the native application 505B that remains dormant during normal use by the end user but can activate upon satisfaction of the initial and confirmation activation actions.
  • the configuration utility can be used by the marketing user to view the configuration information from the configuration information server 510 in juxtaposition to, overlaid on, or otherwise together with the end user interface of the native application 503B.
  • the configuration utility can be a part of the third-party code integrated in the native application 505B and present the configuration information about what tags are being leveraged at that moment by the first-party code of the native application 505B, what elements of the first-party code user interface can be tagged, or how to identify an element of the first-party application for tagging purposes, or the like.
  • the configuration utility can be used by the marketing user to change the configuration information stored at the configuration information server 510, such as to enable or remove tracking for an element of the native application 503B.
  • the configuration utility can automatically provide read and write access to the configuration information since the marketing user has provided both the activation instructions and a confirmed user authentication, which may be considered sufficient security protections to permit editing of the configuration information stored in the configuration information server 510.
  • the native application 505B can additionally or alternatively invoke a particular routine or operation module.
  • the routine or operation module can include, for example, a function, method, script, or the like.
  • FIGURE 9 depicts an embodiment of a configuration utility operation process 900.
  • the process 900 illustrates an example mode of operation of the computing environment 500 of FIGURE 5 and may be implemented by the various components shown in the computing environment 500 of FIGURE 5.
  • the process 900 is described in the context of the computing environment 500 but may instead be implemented by other systems described herein or other computing systems not shown.
  • the process 900 provides one example approach by which the marketing user system 504 can operate the configuration utility in juxtaposition to the end user interface of the native application 505B.
  • the process 900 also can alter the end user facing features of the native application 505B so that the marketing user can access or change the configuration information while selectively enabling and viewing the end user facing features.
  • the marketing user system 504 can execute the native application 505B based at least on the configuration information from the configuration information server 510.
  • the marketing user system 504 can activate listeners of the third-party code integrated in the native application 505B so that the activated listeners process user inputs to the user input 505A.
  • the activated listeners can include button listeners, scroll listeners, video tracking listeners, or the like.
  • Executing the native application 503B or 505B can cause the end user system 502 or marketing user system 504 to display an end user interface on a screen of the end user system 502 or marketing user system 504.
  • the configuration information can be obtained as described with respect to blocks 602, 604, and 606 of the process 600 of FIGURE 6.
  • the native application 505B can receive activation instructions from the marketing user of the marketing user system 504.
  • the activation instructions in one implementation can be three consecutive shakes of the marketing user system 504.
  • if the native application 505B determines that the marketing user system 504 has been shaken three times, the native application 505B can determine that the activation instructions have been received.
  • the activation instructions can be determined to be received in response to sensing one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, a voice input to a microphone of the marketing user system 504, or the like.
  • the activation instructions can be coded in the native application 505B or determined from the configuration information at run-time, for instance.
  • the native application 505B can display the configuration utility.
  • the configuration utility can be a routine of the native application 505B that remains dormant during normal use by the end user but can activate upon satisfaction of the initial and confirmation activation actions.
  • the configuration utility can be displayed as a highlighting of one or more elements displayed on the end user interface, a logo indicative of activation of the configuration utility, a configuration display menu providing the configuration information from the configuration information server 510, or the like.
  • the configuration utility can be a part of the third-party code integrated in the native application 505B.
  • the configuration utility can present the configuration information about what tags are being leveraged at that moment by the first-party code of the native application 505B, what elements of the first-party code user interface can be tagged, or how to identify an element of the first-party application for tagging purposes, or the like.
  • the configuration utility can be used by the marketing user to change the configuration information stored at the configuration information server 510, such as to enable or remove tracking for an element of the native application 503B.
  • the configuration utility can receive a user selection via the user input 505A.
  • the user input 505A can receive inputs including one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, or a voice input to a microphone of the marketing user system 504.
  • the configuration utility can determine whether the user selection received at block 908 is a configuration selection. For example, in one implementation, when the user selection includes double-tapping on the screen over a displayed element, the user selection can be considered a configuration selection. In other implementations, the user selection can be considered a configuration selection in response to selection of menu displays of the configuration utility or one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, a voice input to a microphone of the marketing user system 504, or the like.
  • if the user selection is determined not to be a configuration selection, the process 900 moves to block 914 and the user selection can be processed using the end user features of the native application 505B. For example, if the user selection designated an element (such as a link or button) of the end user interface of the native application 505B, the first-party code of the native application 505B can process the selection as if an end user selected the element of the end user interface. The process 900 can then move to block 908 and await a further user selection. On the other hand, if the user selection is determined to be a configuration selection, the process 900 moves to block 912.
  • the user selection can be processed using the configuration utility. For example, if the user selection designated an element (such as a link or button) of the end user interface of the native application 505B, the configuration utility can display the configuration information, such as an element identifier, associated with the selected element in a configuration display menu. In another example, if the user selection designated an item in a configuration display menu of the configuration utility, the configuration utility can accept selection of the item and perform the function associated with the designated menu item.
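The block 910 routing described above can be sketched as follows (in Python; the double-tap rule follows the example in the text, and the handler wiring is an illustrative assumption):

```python
# Sketch of blocks 910-914: route a user selection either to the
# configuration utility (block 912) or to the end-user features of the
# native application (block 914).

def route_user_selection(selection, configuration_handler, end_user_handler):
    """Treat a double-tap over a displayed element as a configuration
    selection; pass everything else through to the first-party code."""
    if selection.get("gesture") == "double-tap" and "element" in selection:
        return configuration_handler(selection["element"])   # block 912
    return end_user_handler(selection)                       # block 914
```

In this way the first-party end user interface keeps working normally for the marketing user, with only the configuration gesture intercepted.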
  • FIGURES 10A-C and 11 depict embodiments of native application interfaces 1000A-C and 1100.
  • the interfaces 1000A-C and 1100 can be displayed by the native application 505B or by a separate application on a display of the marketing user system 504, for example.
  • the data included in the interfaces 1000A-C and 1100 may be supplied by the native application 505B, the configuration information from the configuration information server 510, the analytics system 530, the tag vendor systems 570, or the business intelligence system 580.
  • the native application 505B and the configuration information alone supply the data shown in the interfaces 1000A-C and 1 100.
  • the interfaces 1000A-C and 1100 can advantageously, in certain embodiments, display a configuration utility in juxtaposition to, overlaid on, or otherwise together with the end user interface of the native application 505B for the marketing user of the marketing user system 504.
  • FIGURE 10A depicts an example interface 1000A of a native application as viewed by an end user of the native application.
  • the interface 1000A includes four buttons: an edit button 1010, an add button 1020, a samples button 1030, and a video samples button 1040.
  • a marketing user of the interface 1000A can choose to activate the configuration utility as described with respect to the processes 700 or 800, for example.
  • FIGURE 10B depicts an example interface 1000B of the native application of the interface 1000A once the marketing user has activated the configuration utility.
  • the configuration utility is displayed in the interface 1000B as a highlighting of the four buttons included in the interface 1000A, as well as a logo 1050 indicative of activation of the configuration utility near the center of the interface 1000B.
  • the marketing user can select one of the four buttons 1010, 1020, 1030, 1040 or the logo 1050 to view the configuration information associated with the selected button or logo as described with respect to the process 900, for example.
  • the highlighting of the four buttons can indicate that the four buttons may be tracked by the native application.
  • FIGURE 10C depicts an example interface 1000C of the native application of the interface 1000B once the marketing user has selected the samples button 1030.
  • the selection of the samples button 1030 has triggered a configuration display menu 1060 to appear that presents the configuration information associated with the samples button 1030.
  • the highlighting of the samples button 1030 can cycle on and off to indicate that the samples button has been selected.
  • the marketing user can use the value of the accessibilityLabel or Ref for the samples button 1030 as an element identifier to change a tracking setting for tracking use of the samples button 1030 by end users of the native application.
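As an illustration of how such an element identifier might key a tracking setting, the following sketch (in Python; the schema and names are assumptions for illustration, not the patent's configuration format) toggles tracking for an element by its identifier:

```python
# Illustrative configuration-information structure keyed by element
# identifier (for instance, the value of a button's accessibilityLabel),
# with a per-element tracking flag the marketing user can toggle.

configuration_information = {
    "samples": {"trackable": True, "track": False},
    "video samples": {"trackable": True, "track": True},
}

def set_tracking(config, element_identifier, enabled):
    """Enable or remove tracking for a trackable element, as a marketing
    user might do through a configuration display menu."""
    entry = config.get(element_identifier)
    if entry is None or not entry["trackable"]:
        raise KeyError(element_identifier + " is not a trackable element")
    entry["track"] = enabled
    return entry
```

A change like this would then be written back to the configuration information server 510 so that deployed copies of the native application pick it up at run-time.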
  • the configuration display menu 1060 further can be minimized by selecting the switch 1068.
  • FIGURE 11 depicts an example interface 1100 of a native application once the configuration display menu 1160 for the native application has been activated.
  • the configuration display menu 1160 can be similar to the configuration display menu 1060 of FIGURE 10C; however, the configuration display menu 1160 notably also presents the current tracking status 1161 for the element for the highlighted text "Timeline: Egypt in Transition".
  • buttons, dropdown boxes, select boxes, text boxes, check boxes, slider controls, and other user interface controls shown may be substituted with other types of user interface controls that provide the same or similar functionality.
  • the user interface controls may be combined or divided into other sets of user interface controls such that similar functionality or the same functionality may be provided with very different looking user interfaces.
  • each of the user interface controls may be selected by a user using one or more input options, such as a mouse, touch screen input, or keyboard input, among other user interface input options.
  • the disclosure can further apply to other application environments.
  • the disclosure can apply to native application environments where a digital marketing user or native application provider can desire to flexibly change the way a native application behaves after deployment, such as by changing colors, background images, text, or the like of the native application.
  • the disclosure can apply to native application environments where a digital marketing user or native application provider may desire to use the native application itself as a medium for viewing or changing configurations for the native application for end users of the native application.
  • the disclosure can apply to native application environments where a digital marketing user or native application provider can desire to display or alter information related to third-party code embedded in a first-party native application or third-party configurations associated with the first-party native application.
  • the disclosure can apply to native application environments where a digital marketing user or native application provider can desire to activate third-party features embedded in a first-party native application where the third-party features are intended for use by the digital marketing user or native application provider and not for use by the end user of the first-party native application.
  • a general purpose processor can be a microprocessor, but in the alternative, the processor can be a controller, microcontroller, or state machine, combinations of the same, or the like.
  • a hardware processor can include electrical circuitry or digital logic circuitry configured to process computer-executable instructions.
  • a processor includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions.
  • a processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • a computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
  • a software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory computer-readable storage medium, media, or physical computer storage known in the art.
  • An example storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor.
  • the storage medium can be volatile or nonvolatile.
  • the processor and the storage medium can reside in an ASIC.
  • Conditional language used herein such as, among others, “can,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or states are included or are to be performed in any particular embodiment.
  • deltaX = fabs(last.x - current.x),
  • deltaY = fabs(last.y - current.y),
  • deltaZ = fabs(last.z - current.z);
  • _lastCMAcceleration = motion.userAcceleration;
  • deltaX = fabs(last.x - current.x),
  • deltaY = fabs(last.y - current.y),
  • deltaZ = fabs(last.z - current.z);
  • UIButton *button = (UIButton *)gesture.view;
  • id controller = [_overlayControllers objectAtIndex:button.tag];
  • CGRect frame = CGRectMake(0, controllerHeight, controllerWidth, controllerHeight);
  • _controllerButton = [[UIButton alloc] initWithFrame:frame];
  • UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(displayPropertiesView)];
  • viewController = window.rootViewController;
  • CGPoint translation = [gesture translationInView:view.window];
  • center = CGPointMake(view.center.x + translation.x,
  • double topBoundry = (view.window.frame.origin.y + (view.frame.size.height / 2));
  • double bottomBoundry = (view.window.frame.size.height - (view.frame.size.height / 2));
  • center = CGPointMake(newX, newY);
  • CGRect converted = [view convertRect:view.layer.frame toView:nil];
  • NSMutableDictionary *mDict = [NSMutableDictionary dictionary];
  • Class classObject = NSClassFromString(key);
  • id subObj = [self dictionaryWithPropertiesOfObject:[object valueForKey:key]];
  • NSDictionary *propertyDict = [NSDictionary dictionaryWithDictionary:mDict]; return propertyDict;
  • NSMutableDictionary *mDict = [NSMutableDictionary dictionary];
  • NSDictionary *dictionary = [NSDictionary dictionaryWithDictionary:mDict]; return [NSDictionary dictionaryWithObjectsAndKeys:dictionary,
  • NSMutableDictionary *viewData = [NSMutableDictionary
  • NSMutableDictionary *mDict = [NSMutableDictionary dictionary];
  • objectCopy = mpvc.moviePlayer.view;
  • NSDictionary *eventData = [[[TealiumiOSTagger sharedInstance] autoTracker] eventDataForObject:object];
  • NSDictionary *additionalEventData = [[TealiumiOSTagger sharedInstance] additionalEventDataFor:tealiumID];
  • additionalEventData = [[TealiumiOSTagger sharedInstance] additionalEventDataFor:object];
  • NSDictionary *objectInfo = [NSDictionary dictionaryWithObjectsAndKeys:mDict, @"Object Info", linkData, @"Utag Call Data (Link)", nil];
  • NSMutableArray *mArray = [NSMutableArray array];
  • NSArray *controllers = [[controller valueForKey:@"viewControllers"] copy]; for (id aController in controllers) {
  • object = playerViewController;
  • objectView = playerViewController.moviePlayer.view;
  • objectView = playerController.view;
  • UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(displayTrackableItemDetails:)];
  • UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
  • UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(displayTrackableItemDetails:)];
  • UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(passThru:)];
  • CGAffineTransform transform = CGAffineTransformIdentity;
  • button.transform = transform;
  • UIButton *button = (UIButton *)gesture.view;
  • id object = [_overlayTargets objectAtIndex:button.tag];
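The gesture-recognizer fragments above attach a double-tap that opens a trackable item's details and a single-tap that passes through to the underlying control. That dispatch can be sketched as follows (in Python; the handler names mirror the selectors in the fragments, while the class and wiring are illustrative assumptions):

```python
# Sketch of the overlay-button dispatch implied by the recognizer
# fragments: double-tap -> displayTrackableItemDetails:, single-tap -> passThru:.

class OverlayButton:
    def __init__(self, target_name):
        self.target_name = target_name
        self.events = []

    def display_trackable_item_details(self):   # cf. @selector(displayTrackableItemDetails:)
        self.events.append(("details", self.target_name))

    def pass_thru(self):                        # cf. @selector(passThru:)
        self.events.append(("pass-thru", self.target_name))

    def on_tap(self, taps):
        """Route a recognized gesture: a double-tap opens the trackable
        item's details; a single-tap passes through to the underlying
        first-party control."""
        if taps >= 2:
            self.display_trackable_item_details()
        else:
            self.pass_thru()
```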

Abstract

A native application can be deployed that obtains configuration information for the native application at run-time. Digital marketing users or other marketing users can view or control the behavior of the native application by displaying or setting the configuration information of the native application. The native application can report tracked end user interactions and events with the native application according to the configuration information to tag management systems for data compilation by the tag management systems. In addition, the native application can enable digital marketing users or other marketing users to view or control the behavior of the native application from within the native application using an integrated configuration utility. The configuration utility can be activated using a two- stage activation process or an activation and authorization process to prevent an end user from accidentally encountering the configuration utility during routine use of the native application.

Description

ACTIVATION OF DORMANT FEATURES IN NATIVE
APPLICATIONS
RELATED APPLICATIONS
[0001] This application claims priority under 35 U.S.C. § 119(e) as a nonprovisional application of the following U.S. Provisional Applications:
App. No.     Filing Date   Title                                                                   Attorney Docket
61/872,530   08/30/13      CONTENT SITE VISITOR PROCESSING SYSTEM                                  TEALM.001PR
61/889,876   10/11/13      COMBINED SYNCHRONOUS AND ASYNCHRONOUS TAG DEPLOYMENT                    TEALM.002PR
61/900,274   11/05/13      UNIVERSAL VISITOR IDENTIFICATION IN TAG MANAGEMENT SYSTEMS              TEALM.003PR
61/896,351   10/28/13      SYSTEM FOR PREFETCHING DIGITAL MARKETING TAGS                           TEALM.004PR
61/755,362   01/22/13      SYSTEMS AND METHODS FOR PROVIDING TAG MANAGEMENT FOR PORTABLE DEVICES   TEALM.006PR
[0002] In addition, this application is related to U.S. Application No. 14/149,717, filed January 7, 2014, titled "Content Site Visitor Processing System" and U.S. Application No. 14/151,700, filed January 9, 2014, titled "Combined Synchronous and Asynchronous Tag Deployment." The disclosures of each of the foregoing applications are hereby incorporated by reference in their entirety. Further, any subset of the embodiments described herein can be implemented in combination with any subset of the embodiments described in the foregoing applications.
COPYRIGHT NOTICE
[0003] A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
BACKGROUND
[0004] Some operators of content sites, such as websites, regularly obtain the results of analytics performed with regard to user interactions on their content sites. User analytics can include any type of data regarding interactions of end users with content sites, among other types of data. There are different approaches to gathering analytics data, one of which includes employing the use of tags.
[0005] Tags can include small pieces of website code that allow a website operator to measure traffic and visitor behavior, understand the impact of online advertising and social channels, use remarketing and audience targeting, or test and improve a content site, among optionally other functions. Adding tags to a content site has typically required involving a developer to manually insert tag code into one or more pages of a website.
SUMMARY
[0006] In one embodiment, a method of presenting information about elements of a host application is disclosed. The method can be performed under control of a physical computing device including digital logic circuitry. The method can include: executing a host application; receiving a first user input indicative of a user shaking the physical computing device; in response to determining that the first user input matches a first activation input, executing a confirmation routine to process one or more additional user inputs to the physical computing device; receiving a second user input with the confirmation routine after said receiving the first user input, the second user input indicative of the user contacting a screen of the physical computing device; and in response to determining, using the confirmation routine, that the second user input matches a second activation input, displaying a configuration utility on the screen, the configuration utility configured to output information regarding trackable elements of the host application.

[0007] The method of the preceding paragraph can further include one or more of the following features: The method can include (i) receiving a third user input indicative of selection of an interactive user interface element of the trackable elements of the host application after said receiving the second user input, the third user input indicative of the user contacting the screen, (ii) in response to determining that the third user input matches a configuration selection input, processing the third user input using the configuration utility to output a tracking identifier associated with the interactive user interface element, and (iii) in response to determining that the third user input does not match the configuration selection input, navigating within the host application based at least on the interactive user interface element.
The configuration utility can be configured for use by an administrator of the host application and not for use by an end user of the host application. The method can include, in response to determining that the second user input has not been received within a timeout period, stopping said executing the confirmation routine. The physical computing device can include a mobile phone or a tablet computer, and the host application can include the confirmation routine and the configuration utility.
[0008] In one embodiment, non-transitory physical computer storage including computer-executable instructions stored thereon is disclosed. The computer-executable instructions, when executed by one or more processors, can implement a process. The process can include: receiving configuration information for configuring a physical computing device; receiving a first user input from a user of the physical computing device, the first user input comprising a motion component; in response to determining that the first user input matches a first activation input, listening for a second user input to the physical computing device using confirmation instructions of the computer-executable instructions; receiving the second user input from the user; and in response to determining, using the confirmation instructions, that the second user input matches a second activation input, displaying a configuration utility interface on a display of the physical computing device, the configuration utility interface configured to display information indicative of the configuration information.

[0009] The computer-executable instructions of the preceding paragraph, when executed by one or more processors, can further implement a process that includes one or more of the following features: The first activation input can be different from the second activation input.
The process can include (i) receiving a third user input from the user, the third user input indicative of selection of an element of a user interface displayed on the display, (ii) in response to determining that the third user input matches a configuration selection input, displaying information corresponding to the third user input in the configuration utility interface, the configuration utility interface shown in juxtaposition to the user interface on the display, and (iii) in response to determining that the third user input does not match the configuration selection input, displaying information corresponding to the third user input in the user interface. The process can include, in response to determining that the second user input has not been received within a timeout period, stopping said listening for the second input to the physical computing device using the confirmation instructions. The configuration information can denote elements of a user interface to be tracked as the user interacts with the user interface. The process can include transmitting, to a tracking server, data indicative of interactions of the user with the elements of the user interface denoted by the configuration information. The elements of the user interface denoted by the configuration information can include links displayed in the user interface. The configuration utility interface can be configured to display whether elements of a user interface are trackable as the user interacts with the user interface. The configuration utility interface can be usable by the user to change the configuration information stored on the configuration information server when the user may be an authenticated user. The second user input can be an input indicative of consecutive taps on the display by the user. 
The computer-executable instructions can include user interface instructions for displaying a user interface and configuration utility instructions for displaying the configuration utility interface, the confirmation and configuration utility instructions including third-party developed computer-executable instructions, the user interface instructions including first-party developed computer-executable instructions. The configuration utility interface can be configured for use by an administrator of the computer-executable instructions, and the user interface can be configured for use by an end user of the computer-executable instructions.
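The two-stage activation recited above (a first activation input with a motion component, then a timed listen for a second, different confirming input) can be sketched as a small state machine. This is an illustrative sketch only; the class name, the event-based interface, and the input identifiers are assumptions rather than anything recited in the claims.

```javascript
// Illustrative sketch of the two-stage activation described above.
// All names are hypothetical; a real module would hook platform
// motion and touch APIs rather than receive events as plain calls.
class TwoStageActivator {
  constructor({ firstInput, secondInput, timeoutMs, onActivate }) {
    this.firstInput = firstInput;     // e.g. a shake gesture id
    this.secondInput = secondInput;   // e.g. a tap-sequence id
    this.timeoutMs = timeoutMs;       // confirmation window
    this.onActivate = onActivate;     // shows the configuration utility
    this.awaitingConfirmation = false;
  }

  // Called with every user input event (motion, tap, voice, ...).
  handleInput(input, now = Date.now()) {
    if (!this.awaitingConfirmation) {
      if (input === this.firstInput) {
        // Stage 1 matched: start listening for the confirmation input.
        this.awaitingConfirmation = true;
        this.deadline = now + this.timeoutMs;
      }
      return false;
    }
    if (now > this.deadline) {
      // Timeout period elapsed: stop listening for the second input.
      this.awaitingConfirmation = false;
      return this.handleInput(input, now); // re-check as a fresh stage-1 input
    }
    this.awaitingConfirmation = false;
    if (input === this.secondInput) {
      this.onActivate(); // both stages matched: reveal the hidden utility
      return true;
    }
    return false;
  }
}
```

The timeout branch is what keeps the utility dormant: a confirming input that arrives after the window simply falls through to ordinary input handling.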
[0010] In one embodiment, a system for presenting information regarding elements of a host application is disclosed. The system can include a memory and a hardware processor. The memory can be configured to store a host application, and the hardware processor can be configured to communicate with the memory. The hardware processor can be configured to: execute the host application; listen for a motion input; in response to determining that the motion input matches an expected motion input, listen for a user input received before an end of a timeout period; and in response to determining that the user input matches an activation input, invoke an operation module. The expected motion input can be different from the activation input.
[0011 ] The system of the preceding paragraph can further include one or more of the following features: The processor can be configured to: in response to determining that a second user input matches a configuration selection input, process the second user input using the configuration utility; and in response to determining that the second user input does not match the configuration selection input, not process the second user input using the configuration utility. The determination of whether the motion input matches the expected motion input and the determination of whether the user input matches the activation input are configured to provide a confirmation that a user intends to activate the configuration utility so that an end user of the host application does not accidentally encounter the configuration utility during routine use of the host application.
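Once the configuration utility is active, the routing recited above (inputs matching the configuration selection input are processed by the utility, all others by the normal interface) might be dispatched as follows. The predicate and the shapes of the objects are hypothetical illustrations.

```javascript
// Hypothetical input dispatch: inputs matching the configuration
// selection input go to the configuration utility; all other inputs
// fall through to the host application's ordinary user interface.
function routeInput(input, { isConfigSelection, configUtility, hostInterface }) {
  if (isConfigSelection(input)) {
    configUtility.show(input);   // e.g. show tag info for a selected element
    return 'configUtility';
  }
  hostInterface.handle(input);   // routine use of the host application
  return 'hostInterface';
}
```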
[0012] In one embodiment, a system for providing access to a tag management application is disclosed. The system can include a mobile device. The mobile device can include a processor and a memory device. The memory device can be configured to store at least a tag management application and a gesture-to-display module. The gesture-to-display module can be configured, when executed by the processor, to: listen for a shake gesture corresponding to a user shaking the mobile device; in response to identifying the shake gesture, determine whether a predetermined interaction with the mobile device has occurred; and in response to determining that the predetermined interaction with the mobile device has occurred, invoke the tag management application.
[0013] The system of the preceding paragraph can further include one or more of the following features: The gesture-to-display module can be configured to listen for the shake gesture by hooking into a gesture application programming interface (API) of a host application stored in the memory. The gesture-to-display module can be configured to output an invisible overlay over a host application interface. The gesture-to-display module can be configured to detect screen activity via the invisible overlay to determine whether the predetermined interaction with the mobile device has occurred. The predetermined interaction can include one or both of taps and swipes. The gesture-to-display module can be configured to determine whether a predetermined interaction with the mobile device has occurred by activating a voice detection module of the mobile device to listen for a voice command.
[0014] In one embodiment, a system including a processor and a memory device is disclosed. The memory device can be configured to store at least a first application and a gesture-to-display module. The gesture-to-display module can be configured, when executed by the processor, to: listen for a shake gesture corresponding to a user shaking the mobile device; in response to identifying the shake gesture, determine whether a predetermined interaction with the mobile device has occurred; and in response to determining that the predetermined interaction with the mobile device has occurred, invoke the first application.
[0015] The system of the preceding paragraph can further include one or more of the following features: The gesture-to-display module can be configured to listen for the shake gesture by hooking into a gesture application programming interface (API) of a host application stored in the memory. The gesture-to-display module can be configured to output an invisible overlay over a host application interface. The gesture-to-display module can be configured to detect screen activity via the invisible overlay to determine whether the predetermined interaction with the mobile device has occurred. The predetermined interaction can include one or both of taps and swipes. The gesture-to-display module can be configured to determine whether a predetermined interaction with the mobile device has occurred by activating a voice detection module of the mobile device to listen for a voice command.
[0016] In one embodiment, a method is disclosed. The method can be performed under control of a computing device comprising a processor. The method can include: listening for a shake gesture corresponding to a user shaking the computing device; in response to identifying the shake gesture, determining whether a predetermined interaction with the computing device has occurred; and in response to determining that the predetermined interaction with the computing device has occurred, invoking a first application.
[0017] The method of the preceding paragraph can further include one or more of the following features: The listening for the shake gesture can include hooking into a gesture application programming interface (API) of a host application. The method can include outputting an invisible overlay over a host application interface. The method can include detecting screen activity via the invisible overlay to determine whether the predetermined interaction with the computing device has occurred. The predetermined interaction can include one or both of taps and swipes. The method can include determining whether a predetermined interaction with the computing device has occurred by activating a voice detection module of the computing device to listen for a voice command.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] Throughout the drawings, reference numbers are re-used to indicate correspondence between referenced elements. The drawings are provided to illustrate embodiments of the features described herein and not to limit the scope thereof.
[0019] FIGURE 1 illustrates an example system on which may be implemented various embodiments of methods in accordance with the disclosure.
[0020] FIGURE 2 illustrates a device configuration for a portable device on which may be implemented various embodiments of systems and methods in accordance with the disclosure.

[0021] FIGURE 3 illustrates a host system configuration on which may be implemented various embodiments of systems and methods in accordance with the disclosure.
[0022] FIGURE 4 illustrates an example application architecture for the tag management system in accordance with the disclosure.
[0023] FIGURE 5 depicts an embodiment of a computing environment that provides access to an analytics system, a business intelligence system, and tag vendor systems.
[0024] FIGURE 6 depicts an embodiment of a native application configuration update process.
[0025] FIGURE 7 depicts an embodiment of a configuration utility activation process.
[0026] FIGURE 8 depicts an embodiment of a configuration utility activation process with user authentication.
[0027] FIGURE 9 depicts an embodiment of a configuration utility operation process.
[0028] FIGURES 10A-C and 11 depict embodiments of native application interfaces.
[0029] Various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used herein, the description can be applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
DETAILED DESCRIPTION
I. Tag Management for Native Applications
[0030] While tags are commonly used to track user interactions with web sites, digital marketing users may also desire to manage tracking of end user interactions in native applications, including both desktop and mobile applications. For example, a native application can include links, images, or the like that may be viewed or selected by an end user of the native application. In some implementations, a digital marketing user thus can beneficially control the gathering of information about the views or selections by the end user using a tag management system to collect information useful in making business decisions related to the native application or other promoted content. However, since a native application can be a locally installed application and deployed as a precompiled unit of executable code or executable code compiled at run-time, digital marketing users or other marketing users may have difficulty modifying configurations for tracking of user interactions and events that are coded in the native application after the native application has been developed. In addition, some native applications, such as applications for mobile devices, may require advance approval by an organization before updates to the native applications can be released to the end users, thus further slowing the release of modifications to the configuration of the native application.
[0031 ] Digital marketing users and other marketing users additionally may desire to view tag information or manage tags associated with native applications using an easy to access and intuitive interface. One such interface can be a user interface of a native application itself. The user interface of the native application can desirably present tag information or enable management of tags in juxtaposition to, overlaid on, or otherwise together with the end user interface for the digital marketing users and other marketing users. The digital marketing users and other marketing users may thus understand or control information relevant to end user interactions with the native application using a view similar to that of an end user of the native application. However, since the user interface of the native application can be intended for viewing by the end user, the functionality to view tag information and manage the tags may desirably be unobtrusive and hidden from the end user. Hiding this functionality from the end user can be challenging though since usable space or features for hiding the functionality can be limited in some native application environments, such as applications for mobile devices.
[0032] This disclosure describes embodiments, systems, methods, and non-transitory computer readable media that can address one or more of the preceding desires or concerns. For example, a native application can be deployed that may obtain some configuration information for the native application at run-time. Digital marketing users or other marketing users can then view or control the behavior of the native application by displaying or setting the configuration information of the native application obtained at run-time. The native application can report tracked end user interactions and events in accordance with the configuration information to tag management systems for data compilation by the tag management systems. Additionally, the native application can enable digital marketing users or other marketing users to view or control the behavior of the native application from within the native application using an integrated configuration utility. The configuration utility can be activated using a two-stage activation process or an activation and authorization process to prevent an end user from accidentally encountering the configuration utility during routine use of the native application. In this way, in some embodiments, the configuration utility can be considered an Easter egg since the two-stage activation process can activate a dormant configuration utility for the native application in response to one or more secret input commands.
[0033] As used herein, the term "native application," in addition to having its ordinary meaning, can refer to an application other than a web application or to an application that is not implemented entirely in a web browser. The native application may be a hybrid native/web application in an embodiment.
[0034] For purposes of summarizing the disclosure, certain aspects, advantages and novel features of several embodiments are described herein. It is to be understood that not necessarily all such advantages can be achieved in accordance with any particular embodiment of the embodiments disclosed herein. Thus, the embodiments disclosed herein can be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.

II. First Example Native Application Tag Management Systems and Methods
[0035] A tag management system (TMS) can enable companies to improve the way they manage the tags (sometimes referred to as pixels) which can be used on their web properties for an increasingly broad range of digital marketing technologies and vendors, ranging from site analytics and affiliate marketing to multivariate testing and retargeting tools. Waiting for IT departments to implement tags may often be a barrier to marketing agility, taking up tech bandwidth which could be more productively used on other areas of website development. Tag implementations are often incomplete and hard to keep track of. As such, digital marketing users want to be more self-sufficient in their management of these tags, so that they can remove the IT bottleneck and impose on tech teams only for more value-adding improvements to their web properties.
[0036] Tag management systems can enable the placement of a JavaScript snippet on website pages. That code snippet may replace the tags that would otherwise have been individually deployed. For TMS users, coding lines of HTML and JavaScript on pages can be replaced by a web interface where vendors, actions, and pages are unified and controlled.
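The snippet such a system places on each page is typically a short asynchronous script loader along the following lines. This is a generic sketch, not any particular vendor's actual snippet; the function name and script URL are hypothetical.

```javascript
// Generic sketch of a TMS page snippet: one asynchronous loader
// replaces the individually deployed vendor tags. The script URL
// passed in below is a hypothetical example.
function loadUniversalTag(doc, src) {
  var s = doc.createElement('script');
  s.src = src;
  s.async = true;                     // load without blocking page rendering
  var first = doc.getElementsByTagName('script')[0];
  first.parentNode.insertBefore(s, first);
  return s;
}

// On a real page this might be invoked as, for example:
// loadUniversalTag(document, '//tags.example.com/acct/main/utag.js');
```

The loaded script then deploys and sequences the individual vendor tags according to the configuration managed through the web interface.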
[0037] Further, a tag management solution may offer support for managing the tagging and/or tracking of individual elements of a web page, such as link clicks, image clicks, etc. This can often be provided via a tool which offers a number of convenience features for discovering the elements of the webpage which are desirable for tracking. Such tools may not be a feature of the webpage itself, but rather, an external tool that can be capable of interacting with the webpage. The methods of enabling these tools vary, but may utilize well-understood browser capabilities. One such capability can be known as a "bookmarklet." A bookmarklet is a web browser bookmark item that can be capable of injecting content into the visible webpage, effectively integrating into the webpage in order to offer the aforementioned features.
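A bookmarklet of the kind described above is a bookmark whose URL uses the javascript: scheme; when clicked, it runs in the context of the page being viewed and can inject a tool's script. The tool URL below is a hypothetical placeholder.

```javascript
// Body of a minimal bookmarklet that injects a third-party tool into
// the current page. In the bookmark itself, this function is minified
// onto one line and prefixed with "javascript:".
function injectTool(doc, src) {
  var s = doc.createElement('script');
  s.src = src;
  doc.body.appendChild(s);
  return s;
}

// Bookmark form (one line):
// javascript:(function(){var s=document.createElement('script');
//   s.src='//tools.example.com/inspector.js';document.body.appendChild(s);})();
```

It is exactly this on-demand injection path that a precompiled native application lacks, which motivates the activation mechanism described in the following sections.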
[0038] The native mobile application environment brings a number of challenges for any third-party vendor desiring a similar level of integration. These challenges can exist in part because the delivery platform may not be a web browser. It is the web browser platform that, in some cases, can allow these aforementioned graceful, on-demand integrations, such as bookmarklets.
[0039] Rather, the native mobile application environment may offer no such on-demand integration capabilities. Native applications can be deployed as precompiled units of executable code. It may be within the source code of these applications and at development time (versus on-demand) when these tools can be integrated. The challenge can then become ensuring the desired third-party vendor tool may be accessible to the application's digital marketer or marketing user (who can be managing the configuration of the application) while not accessible to the general user audience, and without the need of source code modification.
[0040] Systems and methods in accordance with some embodiments of the disclosure provide a means for a manager of a native mobile application to activate the user interface of a third party tool. This may be done without exposing the capability through a conventional user interface. In the general case, this can be useful since the desired third-party tool may not be intended to be utilized by the general user audience, but rather by the application manager.
[0041] In one aspect, an activation process can utilize a combination of phone shake gestures and interactions, such as taps or voice. When the user executes the proper sequence of shake gestures and interactions, the third-party tool's user interface can be revealed to the user. The sequence required may be complex enough to ensure no accidental activation occurs by a general user. This can be further secured with the use of a server-side mechanism in some implementations.
[0042] Attention is now directed to FIGURE 1, which illustrates a system 100. System 100 includes one or more portable or mobile devices 110 (also denoted herein for brevity as "devices 110") such as cellular phones, PDAs, Wi-Fi (802.11) devices, or other portable devices. In the system 100, three portable devices 110-1, 110-2 and 110-3 are shown, but the system 100 can include many more portable devices 110. It is further noted that, in some embodiments, the device may not be portable and the functionality herein may be implemented on more stationary devices, such as desktop or notebook computers or other types of fixed devices. In addition, portable devices as described herein may include other types of devices that are mobile but may not be portable.
[0043] System 100 further includes a host processing system 140 (also denoted herein as "host system 140") comprising one or more servers as well as other associated computer and data processing hardware (not shown in FIGURE 1) such as networking equipment, displays, monitors, I/O devices or other computer or data communication systems, hardware and/or software. In an embodiment, host system 140 may be provided by or operated by an associated host services company or host services supplier.
[0044] As noted previously, host system 140 includes one or more servers 370 that include one or more databases 390 (as shown in FIGURE 3) either internal or external to the servers 370. These databases may be used to store advertisements and data such as is further described herein. Host system 140 may also include one or more operating systems 362 associated with the servers, as well as one or more application programs to implement the various host service functionality as is described further herein. Host system 140 may be implemented at a centralized physical location, such as a network connected server farm or other similar facility, and/or may comprise a plurality of distributed servers connected by any of a variety of networking connections at different physical locations.
[0045] Devices 110 are typically configured to connect to each other and/or to host system 140 through network 130 as shown in FIGURE 1. Network 130 may include wired or wireless networking elements such as Ethernet, LAN technologies, telephony networks, such as POTS phone networks, cellular networks, data networks, or other telephony networks as well as Wi-Fi or Wi-Max networks, other wired or wireless Internet network connections and/or other networks as are known or developed in the art. These connections may be facilitated by one or more client applications 264 (as shown in FIGURE 2) running on devices 110 as well as one or more host system applications 364 running on one or more host system servers 370 included in host system 140, along with one or more network interfaces 342 and/or other networking hardware and/or software as is known or developed in the art (not shown).

[0046] In one aspect, a memory 260 of the device 110 may be provided with a tag management application or applications 264, and a gesture-to-display module 266 as shown in FIGURE 2 that may be installed on the user's device 110. The tag management application 264 and gesture-to-display module 266 may be installed on a ROM (read only memory) 230 at a factory, thereby negating the need for the user to download the client application 264. Alternately, the user may be supplied with the client application 264 on a computer media such as a CD or DVD, a thumb drive, or via other media known or developed in the art.
[0047] FIGURE 2 illustrates additional details of an example configuration of a portable device 110 with example device elements that may be used to implement embodiments of the systems and methods in accordance with the disclosure. As shown in FIGURE 2, device 110 may include one or more processors (CPUs) 210, which can include one or more specialized or dedicated portable device microprocessors or microcontrollers, an input/output device module 220 configured to allow users to input and output information and interact with applications installed on the device 110, such as the tag management application 264, as well as transfer and receive advertising data, one or more read only memory (ROM) devices 230 or equivalents to provide non-volatile storage of data and/or application or operating system programs, one or more display modules 250, such as an LCD or equivalent display device, as well as one or more memory spaces 260. The device 110 also can include an acceleration sensor 225 that can be activated by a shaking motion in order to activate the tag management application 264 as described herein.
[0048] Memory space 260 may comprise DRAM, SRAM, FLASH, hard disk drives or other memory storage devices configured to store and access operating systems 262, the tag management application 264 and/or data 268. Data 268 may include information such as advertisements received from an advertising source system.
[0049] FIGURE 3 illustrates additional details of one example of a host system 140 with example device elements that may be used to implement embodiments of the present disclosure. As shown in FIGURE 3, host system 140 may include one or more processors (CPUs) 310, an input/output device module 320 configured to allow users to input and output information and interact with the host system 140 as well as transfer and receive data, one or more read only memory (ROM) devices 330 or equivalents to provide nonvolatile storage of data and/or programs, one or more display modules 350 such as a computer monitor or other display device, one or more network connections 340 and associated network interfaces 342 configured to allow host system 140 to connect to other systems, servers and/or portable devices, including other elements of system 140 in embodiments where the servers or other components are distributed at other physical locations, as well as one or more memory spaces 360 and one or more databases 390. Database(s) 390 may be further divided or distributed as one or more sub-databases 390a-390n, with the sub-databases storing feature or function specific information associated with a particular feature or function. The various components shown in FIGURE 3 may be incorporated in one or more physical servers 370 comprising part of host system 140. It is noted that the various components shown in FIGURE 3, including database 390, are typically included as part of server(s) 370; however, they may be external to server(s) 370 in some embodiments. For example, in some embodiments database(s) 390 may be external to server(s) 370 and may comprise part of a separate database server system or networked database system.
[0050] Memory space 360 may comprise DRAM, SRAM, FLASH, hard disk drives or other memory storage devices, such as media drives 380, configured to store operating systems, application programs and/or data, and memory space 360 may be shared with, distributed with or overlap with the memory storage capacity of database 390. In some embodiments, memory space 360 may include database 390 or in some embodiments database 390 may include data 368 as shown in memory space 360.
[0051] Data stored in memory space 360 and/or database 390 may include information, such as tag management system information or other types of data. In particular, memory space 360 may include a host system application or applications 364 stored in the memory space for execution on CPU 310 to perform the various host-side functionality described herein.
[0052] In one embodiment, the following assumptions can be made regarding a source code topology 400 illustrated in FIGURE 4:
1. There exists a host application developed as a native mobile application (for example, not a web-page based application) that includes a user interface 405 and business logic code and libraries 410;
2. The host application's source code is compiled with the source code of a third-party tool's source code that includes an integrated third party library 415; and
3. This third party tool's source code includes the source code 420 for at least performing the activation functionality described in this disclosure (the gesture-to-configure functionality).
[0053] In one embodiment, the third party tool can exist within the host application in order to provide functionality that can be intended to be utilized by the application's developer or manager, not the general user audience. In order to achieve the activation of the tool without exposing the existence of the tool to the general user audience, a gesture-to-display code module 266 can be utilized.
[0054] The gesture-to-display module 266 can operate as follows:
• Upon the host application activating, the sequence required to activate can be determined by the gesture-to-display module 266. This determination may be accomplished through any of the following means, depending on the specific implementation:
o a server call can be made to retrieve the activation configuration and sequence requirements, and
o via pre-determined, "hardcoded" values;
• The gesture-to-display module 266 can then "hook" into the host application's gesture API. This is a standard API provided by most modern smart-phones or other devices. This "hook" can allow the gesture-to-display module 266 to listen for the expected "shake" gestures;
• Upon detecting the proper "shake" gesture, the gesture-to-display module 266 can then activate the next step of validation. This step can be used to prevent accidental activations that may otherwise occur if only relying on shake activity. Any of the following can comprise this next activation step:
o The gesture-to-display module 266 can place an invisible overlay over the entirety or some portion of the host application's user interface. This overlay can exist to "catch" screen activity, such as taps and swipes. This overlay can exist for a brief amount of time, such as a few seconds, and
o A voice detection module of the host device can be activated, allowing the module to listen for the proper voice command to activate.
• If the overlay detects the proper activation input during this step (such as the proper tap sequence, swipe path, or voice command), the gesture-to-display module 266 can then conclude that the intent of the user is to activate the hidden tool, and can display its user interface to the user.
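The overlay step in the sequence above can be sketched as follows: after the shake is recognized, the overlay collects taps for a short window and reports success when the expected count arrives in time. The class and field names are illustrative assumptions, not part of the module's actual interface.

```javascript
// Illustrative sketch of the confirmation overlay described above. The
// overlay "catches" taps for a short window after a shake is detected
// and checks them against the expected sequence. Names are hypothetical.
class ConfirmationOverlay {
  constructor(expectedTaps, windowMs) {
    this.expectedTaps = expectedTaps;  // e.g. 3 consecutive taps
    this.windowMs = windowMs;          // overlay lifetime, e.g. 3000 ms
    this.taps = [];
    this.openedAt = null;
  }

  open(now) {                          // called when the shake gesture matches
    this.openedAt = now;
    this.taps = [];
  }

  // Returns true once the proper tap sequence arrives inside the window.
  onTap(now) {
    if (this.openedAt === null || now - this.openedAt > this.windowMs) {
      return false;                    // overlay expired: ignore the tap
    }
    this.taps.push(now);
    return this.taps.length >= this.expectedTaps;
  }
}
```

Because the overlay is invisible and short-lived, a general user who shakes the device by accident sees no change and the window simply lapses.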
[0055] Appendix A includes a code snippet, for inclusion within an implementation of the gesture-to-display module 266, which can be capable of detecting such sequences of gestures made with respect to devices based upon the Apple™ iOS™ platform. The functionality of this code snippet may be implemented on other platforms as well, including the Google™ Android™ platform, for example.
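Appendix A itself is not reproduced here, but the kind of detection it performs can be sketched in platform-neutral terms: report a shake when the acceleration magnitude crosses a threshold several times within a short interval. The threshold, peak count, and interval below are illustrative values, not the values any particular platform or the appendix uses.

```javascript
// Platform-neutral sketch of shake detection: a shake is reported when
// the acceleration magnitude exceeds a threshold a minimum number of
// times within a sliding interval. All tuning values are illustrative.
function makeShakeDetector({ threshold = 2.0, requiredPeaks = 3, intervalMs = 1000 } = {}) {
  let peaks = [];
  // Feed one accelerometer sample (x, y, z in g) with its timestamp.
  return function onSample(x, y, z, now) {
    const magnitude = Math.sqrt(x * x + y * y + z * z);
    if (magnitude > threshold) {
      peaks.push(now);
      // Keep only peaks inside the sliding interval.
      peaks = peaks.filter((t) => now - t <= intervalMs);
      if (peaks.length >= requiredPeaks) {
        peaks = [];
        return true;                   // shake gesture recognized
      }
    }
    return false;
  };
}
```

On iOS or Android the samples would come from the platform motion APIs rather than direct function calls, but the thresholding logic is the same.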
[0056] As noted, some embodiments in accordance with the disclosure may include computer software and/or computer hardware/software combinations configured to implement one or more processes or functions, such as those described herein and/or in the related applications. These embodiments may be in the form of modules implementing functionality in software and/or hardware/software combinations. Embodiments may also take the form of a computer storage product with a computer-readable medium having computer code thereon for performing various computer-implemented operations, such as operations related to functionality as described herein. The media and computer code may be those specially designed and constructed for the purposes of performing functionality described herein, or they may be of the kind well known and available to those having skill in the computer software arts, or they may be a combination of both.
[0057] Examples of computer-readable media within the spirit and scope of the present disclosure include, but are not limited to: magnetic media such as hard disks; optical media such as CD-ROMs, DVDs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store and execute program code, such as programmable microcontrollers, application- specific integrated circuits ("ASICs"), programmable logic devices ("PLDs") and ROM and RAM devices. Examples of computer code may include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter. Computer code may be comprised of one or more modules executing a particular process or processes to provide useful results, and the modules may communicate with one another via means known in the art. For example, some embodiments may be implemented using assembly language, Java, C, C#, C++, or other programming languages and software development tools as are known in the art. Other embodiments of the disclosure may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions.
III. Second Example Native Application Tag Management Systems and Methods
[0058] The following example or one or more sub-features thereof may be implemented in any combination with any of the features described above. Turning to FIGURE 5, an embodiment of a computing environment 500 is shown for implementing various tag features, including some or all of the tag management features described above. In the computing environment 500, one or more end user systems 502, such as end user systems 502-1, 502-2, 502-3, can communicate over a network 508 with a configuration information server 510 and an analytics system 530. The end user system 502 can include any form of computing device and may be a desktop, laptop, smartphone, tablet, a virtualization of a smartphone or tablet, or the like. The end user system 502 can further include a native application 503B, which can provide application content or functionality to the end user system 502, and a user input 503A that can receive user inputs for interacting with the native application 503B. The native application 503B can present a user interface on a display and can request and receive configuration information from the configuration information server 510 for controlling or adjusting operation of the native application 503B. The configuration information can include a directives file, such as a JavaScript file, loadable by the native application 503B. The configuration information server 510 can be a server implemented in computer hardware and/or software and managed by the provider of the native application 503B or the provider of the analytics system 530, for example. Further, the network 508 can include a local area network (LAN), a wide area network (WAN), a company intranet, the public Internet, combinations of the same, or the like.
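A directives file of the sort described (configuration fetched and loaded at run-time) might look like the following sketch. Every field name here is a hypothetical illustration; the disclosure specifies only that configuration information is loadable by the native application at run-time.

```javascript
// Hypothetical shape of a run-time directives file served by the
// configuration information server. Field names, the endpoint URL,
// and element ids are all illustrative assumptions.
const directives = {
  version: 7,
  activation: {
    firstInput: 'shake',            // stage-1 activation gesture
    secondInput: { taps: 3 },       // stage-2 confirmation input
    timeoutMs: 3000                 // confirmation window
  },
  tracking: {
    server: 'https://collect.example.com/i', // hypothetical tracking endpoint
    elements: [
      { id: 'buy-button', events: ['tap'] },
      { id: 'promo-link', events: ['tap', 'view'] }
    ]
  }
};

// The native application might consult the directives to decide
// whether a given interface element is trackable:
function isTrackable(cfg, elementId) {
  return cfg.tracking.elements.some((e) => e.id === elementId);
}
```

Because the file is fetched at run-time, a marketing user can change which elements are tracked without recompiling or re-releasing the native application.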
[0059] The analytics system 530 is shown in communication with the configuration information server 510. The analytics system 530 can be implemented in computer hardware and/or software. For instance, the analytics system 530 may be implemented in physical and/or virtual servers, which may be geographically dispersed or co-located. In the depicted embodiment, the analytics system 530 includes the user processing system 540 and the tag management system 550, as well as a user profile data repository 560. The user processing and tag management systems 540, 550 are shown separately for illustrative purposes, although their functionality may be implemented by a single system. The analytics system 530 can also be implemented without the tag management system 550, and thus, the functionality of the user processing system 540 can be implemented independent of any tag management functionality. Further, the analytics system 530 can be implemented without the user processing system 540, and thus, the functionality of the tag management system 550 can be implemented independent of any user processing functionality.
[0060] One or more marketing user systems 504, such as marketing user systems 504-1, 504-2, 504-3, can access the configuration information server 510, analytics system 530, or business intelligence system 580 via the network 508. Like the end user system 502, the marketing user system 504 can include a native application 505B, which can provide application content or functionality to the marketing user system 504, and a user input 505A that can receive user inputs for interacting with the native application 505B. The marketing user system 504 can also be any type of computing device including, but not limited to, a desktop, laptop, tablet, smartphone, a virtualization of a smartphone or tablet, or the like. The native application 505B can be a different instance of the same or a similar application as the native application 503B. The user input 505A can include one or more of a motion sensor, touch screen sensor, microphone, button, or the like to receive user inputs. In addition, the marketing user system 504 further can include a browser 505C. The browser 505C or a configuration utility of the native application 505B can be used to access or change the configuration information stored on the configuration information server 510 via the analytics system 530. Although the marketing user system 504 is illustrated as having both the native application 505B and browser 505C, some marketing user systems 504 may not include the native application 505B or the browser 505C, depending on the implementation.
[0061] The marketing user system 504 can be operated by marketing users, such as digital marketing professionals, business users, providers of the native application 503B, or any other individual who uses tags or data obtained from tags. Marketing users may not be the primary intended end users of the native applications 503B, 505B in certain embodiments. Instead, a marketing user may use the marketing user system 504 to dynamically view or update the types of data tracked or analyzed for different users of the native application 503B. This data can be tracked by the user processing system 540 via updating the configuration information stored in the configuration information server 510 or updating processing by the user processing system 540 of data obtained from the native application 503B to build updated user profiles 560. In addition, marketing users can access the information stored in the business intelligence system 580 to obtain an understanding of particular end user systems 502 for purposes such as evaluating the effectiveness of various marketing campaigns, for instance.
[0062] In certain embodiments, the user processing system 540 can enable marketing users to configure the types of data tracked for different users of a native application 503B, as well as analyze and report on this user data. For instance, the user processing system 540 can provide one or more user interfaces via the browser 505C that enable customization of collecting information about users of the native application 503B. Upon execution of the native application 503B, the native application 503B can supply user data to the analytics system 530 (optionally through the configuration information server 510). Such user data can be stored in user profiles in the user profile data repository 560, which may include physical computer storage. Marketing users can subsequently query the user profiles to obtain reports or other information about users of the native application 503B.
[0063] The tag management system 550 can be used to manage the tags provided by third-party vendors. For instance, the tag management system 550 can provide functionality for marketing users to select which third-party vendor tags to associate with a native application for a variety of vendor-specific processing purposes. These purposes can include obtaining analytics for data analysis or business intelligence, tracking affiliate activity with respect to the native application, obtaining user data for displaying targeted ads, obtaining user data for customizing search functionality or email campaigns targeted to the end users, obtaining user data for personalizing content of the native application, obtaining user data for integration with social networking functionality, obtaining user data for big data analysis, combinations of the same, or the like. Tags for any of these vendor-specific processing purposes, among others, can be considered digital marketing tags. Data collected by the tags from the native application 503B can be provided to tag vendor systems 570, which can perform any of this vendor-specific processing. The data or related data may additionally or alternatively be passed to the tag vendor systems 570 through the tag management system 550.
[0064] In an embodiment, the tag management system 550 provides functionality (such as one or more user interfaces through the browser 505C) for marketing users to map data sources in the native application 503B to data sources gathered by the tags. For instance, if a native application includes a shopping cart value named "cart value," the tag management system can provide a user interface that enables a user to tell the tag management system 550 to collect data on the "cart value" and map this data to a "cart_value" variable of one of the tags.
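The mapping described in the preceding paragraph can be sketched in JavaScript (the language of the directives file noted above). This is an illustrative sketch, not the actual implementation; the function name applyTagMapping and the payload shape are assumptions, with only the "cart value" to "cart_value" mapping drawn from the example above.

```javascript
// Illustrative sketch: translate native application data source names into
// the variable names expected by a vendor tag, as configured by a marketing
// user. Data sources absent from the mapping are not sent to the tag.
function applyTagMapping(appData, mapping) {
  const mapped = {};
  for (const [appName, tagVariable] of Object.entries(mapping)) {
    if (appName in appData) {
      mapped[tagVariable] = appData[appName];
    }
  }
  return mapped;
}

// Example: the marketing user has mapped "cart value" to "cart_value".
const mapping = { "cart value": "cart_value" };
const payload = applyTagMapping({ "cart value": 42.5, page: "checkout" }, mapping);
// payload holds only the mapped variable, e.g. { cart_value: 42.5 }
```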
[0065] However, in some embodiments, the tags can instead perform a greedy collection of some or all data available in the native application 503B. Since the tags and user processing system 540 can be provided by the same entity, the data obtained by the tags need not be mapped to third-party mappings like the data obtained by the third-party tags. Instead, some or all of the data available to the tags can be provided to the user processing system 540 for processing. Thus, the tags can facilitate vendor-neutral data gathering of some or all of the data elements in the native application 503B. Since this data may not be mapped to a vendor-specific format in certain embodiments, the data can be exported to business intelligence systems, such as business intelligence system 580, without a need to massage the data from its mapped form (which can be cumbersome) to its original, raw form.
[0066] The systems 540, 570 can provide the processed data to a business intelligence system 580, which may be owned, operated, or otherwise used by an operator of the native application 503B to analyze application user behavior. The business intelligence system 580 can be implemented in computer hardware and/or software. The business intelligence system 580 can receive the raw data or processed data from the systems 540, 570 and store and manage the data in a way that facilitates a meaningful presentation of information to those interested in the performance of the native application 503B. In some embodiments, the business intelligence system 580 is part of the user processing system 540 or the tag management system 550 rather than separate as illustrated in FIGURE 5.
[0067] Turning to FIGURE 6, an embodiment of a native application configuration information update process 600 is shown. The process 600 illustrates an example mode of operation of the computing environment 500 of FIGURE 5 and may be implemented by the various components shown in the computing environment 500 of FIGURE 5. For convenience, the process 600 is described in the context of the computing environment 500 but may instead be implemented by other systems described herein or other computing systems not shown. The process 600 provides one example approach by which the end user system 502 or marketing user system 504 can obtain updated configuration information for the native application 503B or 505B upon start-up of the native application 503B or 505B. Advantageously, in certain embodiments, the process 600 enables the marketing users to make digital marketing changes to the native application 503B without having to submit an updated version of the native application 503B to an application approval organization or without the end users having to re-download the updated version of the native application 503B.
[0068] At block 602, the end user system 502 or marketing user system 504 can receive a command to begin executing the native application 503B or 505B. The command can be received from an end user of the end user system 502 or marketing user of the marketing user system 504 via the user input 503A or 505A. The native application 503B and 505B can include first-party code providing application functionality for the end user of the end user system 502. In addition, the native application 503B and 505B can include integrated third-party code, such as third-party source code and/or a third-party code library. The third-party code can perform functions intended to be hidden from the end user of the end user system 502.
[0069] At block 604, the end user system 502 or marketing user system 504 can request configuration information for the native application 503B or 505B from the configuration information server 510 via the network 508. The native application 503B or 505B can cause the end user system 502 or marketing user system 504 to request the configuration information as part of a start-up process for the native application 503B or 505B.
[0070] At block 606, the end user system 502 or marketing user system 504 can receive the configuration information for the native application 503B or 505B from the configuration information server 510 via the network 508. The configuration information can include a directives file, such as a JavaScript file, loadable by the native application 503B or 505B as part of a start-up process for the native application 503B or 505B or usable as a reference file once the native application 503B or 505B is running. The configuration information can be stored to a memory of the end user system 502 or a memory of the marketing user system 504, in some implementations. The configuration information can provide information on what to track within the native application 503B or 505B (for example, element identifiers associated with links or events in the native application 503B or 505B), how the information should be tracked, who should be tracked, or the like. In some embodiments, the configuration information can be provided by the provider of the analytics system 530 via the configuration information server 510.
[0071] At block 608, the end user system 502 or marketing user system 504 can execute the native application 503B or 505B based at least on the configuration information from the configuration information server 510. In one example, the end user system 502 or marketing user system 504 can activate listeners of the third-party code integrated in the native application 503B or 505B so that the activated listeners process user inputs to the user input 503A or 505A. The activated listeners can include button listeners, scroll listeners, video tracking listeners, or the like, for instance. Executing the native application 503B or 505B can cause the end user system 502 or marketing user system 504 to display an end user interface on a screen of the end user system 502 or marketing user system 504.
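The start-up flow of process 600 can be sketched as follows. This is a minimal sketch under stated assumptions: startNativeApplication, the injected fetchConfiguration function, and the listeners field of the configuration are all hypothetical names introduced for illustration, corresponding to blocks 604 through 608 above.

```javascript
// Illustrative sketch of process 600: request the directives (configuration)
// file at start-up, then activate the third-party listeners it names.
async function startNativeApplication(fetchConfiguration) {
  // Blocks 604-606: request and receive the configuration information
  // from the configuration information server.
  const config = await fetchConfiguration();
  // Block 608: activate the listeners named by the configuration, e.g.
  // button listeners, scroll listeners, or video tracking listeners.
  const activated = [];
  for (const listener of config.listeners || []) {
    activated.push(listener);
  }
  return { config, activated };
}
```

A marketing change then requires only updating the configuration on the server; the next start-up of the application picks it up without re-submission or re-download of the application itself, as described above.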
[0072] FIGURE 7 depicts an embodiment of a configuration utility activation process 700. The process 700 illustrates an example mode of operation of the computing environment 500 of FIGURE 5 and may be implemented by the various components shown in the computing environment 500 of FIGURE 5. For convenience, the process 700 is described in the context of the computing environment 500 but may instead be implemented by other systems described herein or other computing systems not shown. The process 700 provides one example approach by which the marketing user system 504 can activate a configuration utility as a result of user inputs by a marketing user. Advantageously, in certain embodiments, the process 700 can provide a two-stage activation process to help prevent end users of the end user system 502 from accidentally encountering the configuration utility while operating the native application 503B.

[0073] At block 702, the marketing user system 504 can execute the native application 505B based at least on the configuration information from the configuration information server 510. In one example, the marketing user system 504 can activate listeners of the third-party code integrated in the native application 505B so that the activated listeners process user inputs to the user input 505A. The activated listeners can include button listeners, scroll listeners, video tracking listeners, or the like, for instance. Executing the native application 503B or 505B can cause the end user system 502 or marketing user system 504 to display an end user interface on a screen of the end user system 502 or marketing user system 504. In some embodiments, the configuration information can be obtained as described with respect to blocks 602, 604, and 606 of the process 600 of FIGURE 6.
Additionally or alternatively, the configuration information can be obtained from configuration information loaded onto the marketing user system 504 by the marketing user or from configuration information previously loaded or saved by the native application 505B, for example.
[0074] At block 704, the native application 505B can receive a user input via the user input 505A. The user input 505A can receive inputs including one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, or a voice input to a microphone of the marketing user system 504.
[0075] At block 706, the native application 505B can determine whether the user input received at block 704 matches an initial activation action. For instance, the initial activation action, in one implementation, can be three consecutive shakes of the marketing user system 504. Thus, in this example, if the native application 505B determines that the marketing user system 504 has been shaken three times, the native application 505B can determine that the user input matches the initial activation action. In other implementations, the initial activation action can include one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, a voice input to a microphone of the marketing user system 504, or the like. Moreover, the initial activation action can be coded in the native application 505B or determined from the configuration information at run-time, for instance. In some embodiments, the native application 505B or the marketing user system 504 can provide feedback to indicate that the user input matches the initial activation action. The feedback can include one or more or a combination of: a sound output from a speaker of the marketing user system 504, a vibration of the marketing user system 504, a translucent overlay on a screen (for instance, shadowing, coloring, bordering, opaquing, or gridding of one or more elements displayed on the screen) of the marketing user system 504, a dimming or brightening of at least part of a display on a screen of the marketing user system 504, a small icon displayed on a display (for instance, in a corner) on a screen of the marketing user system 504, or the like.
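The matching at block 706 can be sketched as follows. This is an illustrative sketch, not the implementation: the function name matchesInitialActivation and the event strings are assumptions, while the default of three consecutive shakes follows the example in the paragraph above.

```javascript
// Illustrative sketch of block 706: decide whether a sequence of sensed
// user inputs matches the initial activation action (by default, three
// consecutive shakes of the marketing user system).
function matchesInitialActivation(events, action = { type: "shake", count: 3 }) {
  let consecutive = 0;
  for (const event of events) {
    if (event === action.type) {
      consecutive += 1;
      if (consecutive >= action.count) return true;
    } else {
      consecutive = 0; // any other input breaks the consecutive sequence
    }
  }
  return false;
}
```

Because the action object could be read from the configuration information rather than hard-coded, this sketch is consistent with determining the initial activation action at run-time, as the paragraph above notes.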
[0076] If the native application 505B determines that the user input does not match the initial activation action, the process 700 moves to block 702 and the marketing user system 504 can continue execution of the native application 505B. On the other hand, if the native application 505B determines that the user input matches the initial activation action, the process 700 moves to block 708.
[0077] At block 708, the native application 505B can activate a confirmation application to process subsequent user inputs. The confirmation application can be a routine of the native application 505B that remains dormant during normal use by the end user but can activate upon satisfaction of the initial activation action. The confirmation application, for example, can be a part of the third-party code integrated in the native application 505B. In some embodiments, the confirmation application can divert or block some or all user inputs to the user input 505A from being processed by the first-party code of the native application 505B, such as by providing a visible or invisible overlay on the display of the marketing user system 504 to receive all touch inputs to the user input 505A. Moreover, in some embodiments, the native application 505B or the marketing user system 504 can provide feedback to indicate that the confirmation application has been activated. The feedback can include one or more or a combination of: a sound output from a speaker of the marketing user system 504, a vibration of the marketing user system 504, a translucent overlay on a screen (for instance, shadowing, coloring, bordering, opaquing, or gridding of one or more elements displayed on the screen) of the marketing user system 504, a dimming or brightening of at least part of a display on a screen of the marketing user system 504, a small icon displayed on a display (for instance, in a corner) on a screen of the marketing user system 504, or the like.
[0078] At block 710, the native application 505B determines whether a timeout period for the confirmation application has expired. The timeout period can be a relatively short duration (for instance, about 5, 10, or 20 seconds or more or less) during which the marketing user of the marketing user system 504 can enter another user input to confirm an intent to activate the configuration utility. If no user inputs are received during the timeout period, the process 700 moves to block 702 and the marketing user system 504 can continue normal execution of the native application 505B. On the other hand, if a user input is received during the timeout period, the process 700 moves to block 712.
[0079] At block 712, the confirmation application can receive the user input via the user input 505A. The user input 505A can receive inputs including one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, or a voice input to a microphone of the marketing user system 504.
[0080] At block 714, the confirmation application can determine whether the user input received at block 712 matches a confirmation activation action. For instance, the confirmation activation action, in one implementation, can be three consecutive taps on a screen of the marketing user system 504. Thus, in this example, if the confirmation application determines that the screen of the marketing user system 504 has been tapped three times, the confirmation application can determine that the user input matches the confirmation activation action. In other implementations, the confirmation activation action can include one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a voice input to a microphone of the marketing user system 504, or the like. Moreover, the confirmation activation action can be coded in the native application 505B or determined from the configuration information at run-time, for instance. If the confirmation application determines that the user input does not match the confirmation activation action, the process 700 moves to block 702 and the marketing user system 504 can continue execution of the native application 505B. On the other hand, if the confirmation application determines that the user input matches the confirmation activation action, the process 700 moves to block 716.
[0081] At block 716, the native application 505B can display the configuration utility. The configuration utility can be a routine of the native application 505B that remains dormant during normal use by the end user but can activate upon satisfaction of the initial and confirmation activation actions. The configuration utility can be used by the marketing user to view the configuration information from the configuration information server 510 in juxtaposition to, overlaid on, or otherwise together with the end user interface of the native application 503B. For example, the configuration utility can be a part of the third-party code integrated in the native application 505B and present the configuration information about what tags are being leveraged at that moment by the first-party code of the native application 505B, what elements of the first-party code user interface can be tagged, or how to identify an element of the first-party application for tagging purposes, or the like. In addition, the configuration utility can be used by the marketing user to change the configuration information stored at the configuration information server 510, such as to enable or remove tracking for one or more elements of the native application 503B. In some embodiments, the configuration utility can provide read-only access to the configuration information since the initial and confirmation activation actions may not be considered sufficient security protections to permit editing of the configuration information stored in the configuration information server 510. In other embodiments, the configuration utility can provide read and write access to the configuration information. Moreover, in some embodiments, at block 716, the native application 505B can additionally or alternatively invoke a particular routine or operation module. The routine or operation module can include, for example, a function, method, script, or the like.
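The two-stage activation of process 700 as a whole can be sketched as a small state machine. This is an illustrative sketch under stated assumptions: the factory name, the action strings, the returned status strings, and the injected clock are all hypothetical, while the timeout window and the initial/confirmation sequence correspond to blocks 704 through 716 above.

```javascript
// Illustrative sketch of process 700: an initial activation action arms a
// confirmation window; only a confirming action received before the timeout
// expires causes the configuration utility to be displayed.
function createTwoStageActivator({ initial = "triple-shake", confirm = "triple-tap", timeoutMs = 10000, now = Date.now } = {}) {
  let armedAt = null; // timestamp of the matched initial action, if any
  return {
    handleInput(input) {
      if (armedAt === null) {
        // Blocks 704-708: match the initial activation action.
        if (input === initial) armedAt = now();
        return "dormant";
      }
      // Block 710: if the timeout period expired, return to normal execution.
      if (now() - armedAt > timeoutMs) {
        armedAt = null;
        return this.handleInput(input); // treat as a fresh input
      }
      // Blocks 712-716: match the confirmation activation action.
      armedAt = null;
      return input === confirm ? "show-configuration-utility" : "dormant";
    },
  };
}
```

Injecting the clock (now) rather than reading the system time directly keeps the sketch deterministic and testable; a real implementation would likely also emit the feedback (sound, vibration, overlay) described above when each stage matches.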
[0082] FIGURE 8 depicts an embodiment of a configuration utility activation process 800 with user authentication. The process 800 illustrates an example mode of operation of the computing environment 500 of FIGURE 5 and may be implemented by the various components shown in the computing environment 500 of FIGURE 5. For convenience, the process 800 is described in the context of the computing environment 500 but may instead be implemented by other systems described herein or other computing systems not shown. The process 800 provides one example approach by which the marketing user system 504 can activate a configuration utility as a result of a user input and user authentication by the marketing user.
[0083] At block 802, the marketing user system 504 can execute the native application 505B based at least on the configuration information from the configuration information server 510. In one example, the marketing user system 504 can activate listeners of the third-party code integrated in the native application 505B so that the activated listeners process user inputs to the user input 505A. The activated listeners can include button listeners, scroll listeners, video tracking listeners, or the like, for instance. Executing the native application 503B or 505B can cause the end user system 502 or marketing user system 504 to display an end user interface on a screen of the end user system 502 or marketing user system 504. In some embodiments, the configuration information can be obtained as described with respect to blocks 602, 604, and 606 of the process 600 of FIGURE 6.
[0084] At block 804, the native application 505B can receive a user input via the user input 505A. The user input 505A can receive inputs including one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, or a voice input to a microphone of the marketing user system 504.
[0085] At block 806, the native application 505B can determine whether the user input received at block 804 matches activation instructions. For instance, the activation instructions in one implementation can be three consecutive shakes of the marketing user system 504. Thus, in this example, if the native application 505B determines that the marketing user system 504 has been shaken three times, the native application 505B can determine that the activation instructions have been received. In other implementations, the activation instructions can be determined to be received in response to sensing one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a voice input to a microphone of the marketing user system 504, or the like. Moreover, the activation instructions can be coded in the native application 505B or determined from the configuration information at run-time, for instance. In some embodiments, the native application 505B or the marketing user system 504 can provide feedback to indicate that the user input matches the activation instructions. The feedback can include one or more or a combination of: a sound output from a speaker of the marketing user system 504, a vibration of the marketing user system 504, a translucent overlay on a screen (for instance, shadowing, coloring, bordering, opaquing, or gridding of one or more elements displayed on the screen) of the marketing user system 504, a dimming or brightening of at least part of a display on a screen of the marketing user system 504, a small icon displayed on a display (for instance, in a corner) on a screen of the marketing user system 504, or the like.
[0086] If the native application 505B determines that the activation instructions have not been received, the process 800 moves to block 802 and the marketing user system 504 can continue execution of the native application 505B. On the other hand, if the native application 505B determines that the activation instructions have been received, the process 800 moves to block 808.
[0087] At block 808, the native application 505B can request a user authentication from the marketing user. The native application 505B can request that the marketing user, for example, provide a username, a password, a fingerprint, or the like via the user input 505A. At block 810, the native application 505B can receive the user authentication from the user input 505A. At block 812, the native application 505B can determine whether the user authentication is confirmed to match the authentication for a marketing user of the native application 505B. If the user authentication is not confirmed to match the authentication for a marketing user of the native application 505B, the process 800 moves to block 802 and the marketing user system 504 can continue execution of the native application 505B. On the other hand, if the user authentication is confirmed to match the authentication for a marketing user of the native application 505B, the process 800 moves to block 814.
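The gated flow of blocks 806 through 814 can be sketched as follows. This is an illustrative sketch, not the implementation: the function name, the gesture string, the injected verify callback, and the returned status strings are hypothetical; the sequence (activation instructions, then confirmed authentication, then read-write access) follows the process 800 description.

```javascript
// Illustrative sketch of process 800: the configuration utility is revealed
// with read-write access only when the activation instructions are matched
// AND the supplied user authentication is confirmed.
function activateWithAuthentication(input, credentials, { activation = "triple-shake", verify } = {}) {
  // Block 806: match the user input against the activation instructions.
  if (input !== activation) return "continue-normal-execution";
  // Blocks 808-812: confirm the authentication for a marketing user.
  if (!verify(credentials)) return "continue-normal-execution";
  // Block 814: activation plus confirmed authentication can be considered
  // sufficient to permit editing of the configuration information.
  return "show-configuration-utility-read-write";
}
```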
[0088] At block 814, the native application 505B can display the configuration utility. The configuration utility can be a routine of the native application 505B that remains dormant during normal use by the end user but can activate upon satisfaction of the activation instructions and the user authentication. The configuration utility can be used by the marketing user to view the configuration information from the configuration information server 510 in juxtaposition to, overlaid on, or otherwise together with the end user interface of the native application 503B. For example, the configuration utility can be a part of the third-party code integrated in the native application 505B and present the configuration information about what tags are being leveraged at that moment by the first-party code of the native application 505B, what elements of the first-party code user interface can be tagged, or how to identify an element of the first-party application for tagging purposes, or the like. In addition, the configuration utility can be used by the marketing user to change the configuration information stored at the configuration information server 510, such as to enable or remove tracking for an element of the native application 503B. In some embodiments, the configuration utility can automatically provide read and write access to the configuration information since the marketing user has provided both the activation instructions and a confirmed user authentication, which may be considered sufficient security protections to permit editing of the configuration information stored in the configuration information server 510. Moreover, in some embodiments, at block 814, the native application 505B can additionally or alternatively invoke a particular routine or operation module. The routine or operation module can include, for example, a function, method, script, or the like.
[0089] FIGURE 9 depicts an embodiment of a configuration utility operation process 900. The process 900 illustrates an example mode of operation of the computing environment 500 of FIGURE 5 and may be implemented by the various components shown in the computing environment 500 of FIGURE 5. For convenience, the process 900 is described in the context of the computing environment 500 but may instead be implemented by other systems described herein or other computing systems not shown. The process 900 provides one example approach by which the marketing user system 504 can operate the configuration utility in juxtaposition to the end user interface of the native application 505B. Advantageously, in certain embodiments, the process 900 also can alter the end user facing features of the native application 505B so that the marketing user can access or change the configuration information while selectively enabling and viewing the end user facing features.
[0090] At block 902, the marketing user system 504 can execute the native application 505B based at least on the configuration information from the configuration information server 510. In one example, the marketing user system 504 can activate listeners of the third-party code integrated in the native application 505B so that the activated listeners process user inputs to the user input 505A. The activated listeners can include button listeners, scroll listeners, video tracking listeners, or the like, for instance. Executing the native application 503B or 505B can cause the end user system 502 or marketing user system 504 to display an end user interface on a screen of the end user system 502 or marketing user system 504. In some embodiments, the configuration information can be obtained as described with respect to blocks 602, 604, and 606 of the process 600 of FIGURE 6.
[0091] At block 904, the native application 505B can receive activation instructions from the marketing user of the marketing user system 504. For instance, the activation instructions in one implementation can be three consecutive shakes of the marketing user system 504. Thus, in this example, if the native application 505B determines that the marketing user system 504 has been shaken three times, the native application 505B can determine that the activation instructions have been received. In other implementations, the activation instructions can be determined to be received in response to sensing one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, a voice input to a microphone of the marketing user system 504, or the like. Moreover, the activation instructions can be coded in the native application 505B or determined from the configuration information at run-time, for instance.
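The "three consecutive shakes" check of block 904 amounts to counting shake events inside a sliding time window. The sketch below is a hypothetical illustration, not the patent's implementation; the class name, window length, and clock injection are assumptions for the example.

```python
import time

# Hypothetical sketch of the "three consecutive shakes" activation check
# described in block 904.

class ShakeActivationDetector:
    def __init__(self, required_shakes=3, window_seconds=5.0, clock=time.monotonic):
        self.required = required_shakes
        self.window = window_seconds
        self.clock = clock
        self._timestamps = []

    def record_shake(self):
        """Record one shake; return True once the activation instructions
        (required_shakes shakes inside the window) are satisfied."""
        now = self.clock()
        # Drop shakes that fell outside the sliding window.
        self._timestamps = [t for t in self._timestamps if now - t <= self.window]
        self._timestamps.append(now)
        return len(self._timestamps) >= self.required


# Simulate with a fake clock so the example is deterministic.
fake_time = [0.0]
detector = ShakeActivationDetector(clock=lambda: fake_time[0])
results = []
for t in (0.0, 1.0, 2.0):
    fake_time[0] = t
    results.append(detector.record_shake())
```

The same structure generalizes to the other activation inputs listed above (taps, drawn patterns, fingerprint, voice): each sensed event is recorded, and activation fires when the configured pattern is satisfied.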
[0092] At block 906, the native application 505B can display the configuration utility. The configuration utility can be a routine of the native application 505B that remains dormant during normal use by the end user but can activate upon satisfaction of the initial and confirmation activation actions. The configuration utility can be displayed as a highlighting of one or more elements displayed on the end user interface, a logo indicative of activation of the configuration utility, a configuration display menu providing the configuration information from the configuration information server 510, or the like. The configuration utility can be a part of the third-party code integrated in the native application 505B. Advantageously, in certain embodiments, the configuration utility can present the configuration information about what tags are being leveraged at that moment by the first-party code of the native application 505B, what elements of the first-party code user interface can be tagged, or how to identify an element of the first-party application for tagging purposes, or the like. In addition, the configuration utility can be used by the marketing user to change the configuration information stored at the configuration information server 510, such as to enable or remove tracking for an element of the native application 503B.
[0093] At block 908, the configuration utility can receive a user selection via the user input 505A. The user input 505A can receive inputs including one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, or a voice input to a microphone of the marketing user system 504.
[0094] At block 910, the configuration utility can determine whether the user selection received at block 908 is a configuration selection. For example, in one implementation, when the user selection includes double-tapping on the screen over a displayed element, the user selection can be considered a configuration selection. In other implementations, the user selection can be considered a configuration selection in response to selection of menu displays of the configuration utility or one or more or a combination of: a tap on a screen of the marketing user system 504, a pattern drawn on a screen of the marketing user system 504, a shake of the marketing user system 504, a fingerprint detected by a scanner of the marketing user system 504, a voice input to a microphone of the marketing user system 504, or the like.
[0095] If the user selection is determined not to be a configuration selection, the process 900 moves to block 914 and the user selection can be processed using the end user features of the native application 505B. For example, if the user selection designated an element (such as a link or button) of the end user interface of the native application 505B, the first-party code of the native application 505B can process the selection as if an end user selected the element of the end user interface. The process 900 can then move to block 908 and await a further user selection. On the other hand, if the user selection is determined to be a configuration selection, the process moves to block 912.
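The decision at block 910 and the dispatch to blocks 912 or 914 can be sketched as a simple router. The event-dictionary shape below (`"gesture"`, `"element"`) is an assumption made for illustration, not the patent's actual data model.

```python
# Minimal sketch of the block 910 decision and the block 912/914 dispatch.
# The selection-event shape is hypothetical.

def route_selection(selection):
    """Return which code path handles a user selection."""
    if selection.get("gesture") == "double_tap" and "element" in selection:
        # Configuration selection: handled by the configuration utility (block 912).
        return ("configuration_utility", selection["element"])
    # Otherwise the first-party end user features process the input (block 914).
    return ("end_user_features", selection.get("element"))


config_route = route_selection({"gesture": "double_tap", "element": "samples_button"})
end_user_route = route_selection({"gesture": "tap", "element": "samples_button"})
```

Routing this way lets the marketing user exercise the unchanged end-user features of the application while the configuration utility intercepts only the designated configuration gestures.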
[0096] At block 912, the user selection can be processed using the configuration utility. For example, if the user selection designated an element (such as a link or button) of the end user interface of the native application 505B, the configuration utility can display the configuration information, such as an element identifier, associated with the selected element in a configuration display menu. In another example, if the user selection designated an item in a configuration display menu of the configuration utility, the configuration utility can accept selection of the item and perform the function associated with the designated menu item.
[0097] FIGURES 10A-C and 11 depict embodiments of native application interfaces 1000A-C and 1100. The interfaces 1000A-C and 1100 can be displayed by the native application 505B or by a separate application on a display of the marketing user system 504, for example. The data included in the interfaces 1000A-C and 1100 may be supplied by the native application 505B, the configuration information from the configuration information server 510, the analytics system 530, the tag vendor systems 570, or the business intelligence system 580. In some embodiments, the native application 505B and the configuration information alone supply the data shown in the interfaces 1000A-C and 1100. The interfaces 1000A-C and 1100 can advantageously, in certain embodiments, display a configuration utility in juxtaposition to, overlaid on, or otherwise together with the end user interface of the native application 505B for the marketing user of the marketing user system 504.
[0098] FIGURE 10A depicts an example interface 1000A of a native application as viewed by an end user of the native application. The interface 1000A includes four buttons: an edit button 1010, an add button 1020, a samples button 1030, and a video samples button 1040. At this time, a marketing user of the interface 1000A can choose to activate the configuration utility as described with respect to the processes 700 or 800, for example.
[0099] FIGURE 10B depicts an example interface 1000B of the native application of the interface 1000A once the marketing user has activated the configuration utility. The configuration utility is displayed in the interface 1000B as a highlighting of the four buttons included in the interface 1000A, as well as a logo 1050 indicative of activation of the configuration utility near the center of the interface 1000B. The marketing user can select one of the four buttons 1010, 1020, 1030, 1040 or the logo 1050 to view the configuration information associated with the selected button or logo as described with respect to the process 900, for example. In some embodiments, the highlighting of the four buttons can indicate that the four buttons may be tracked by the native application.

[0100] FIGURE 10C depicts an example interface 1000C of the native application of the interface 1000B once the marketing user has selected the samples button 1030. As can be seen, the selection of the samples button 1030 has triggered a configuration display menu 1060 to appear that presents the configuration information associated with the samples button 1030. In the illustrated example, the properties for the samples button 1030 are: accessibilityLabel = "samples button", class = "UIRoundedRectButton", and Ref = "5170c". In addition, the highlighting of the samples button 1030 can cycle on and off to indicate that the samples button has been selected. Advantageously, in certain embodiments, the marketing user can use the value of the accessibilityLabel or Ref for the samples button 1030 as an element identifier to change a tracking setting for tracking use of the samples button 1030 by end users of the native application. The configuration display menu 1060 further can be minimized by selecting the switch 1068.
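Using an element identifier surfaced in the configuration display menu (for example, the accessibilityLabel or Ref value) to change a tracking setting can be sketched as a small update to the stored configuration information. The function name and configuration key below are hypothetical assumptions for illustration.

```python
# Hypothetical sketch: toggling a tracking setting in the stored
# configuration information, keyed by an element identifier such as the
# Ref value shown in the configuration display menu.

def set_tracking(config, element_id, enabled):
    """Return a copy of the configuration with tracking toggled for one element."""
    updated = dict(config)
    tracked = dict(updated.get("tracked_elements", {}))
    tracked[element_id] = enabled
    updated["tracked_elements"] = tracked
    return updated


config = {"tracked_elements": {"5170c": False}}
config = set_tracking(config, "5170c", True)
```

Returning an updated copy rather than mutating in place keeps the change easy to send back to the configuration information server as a whole.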
[0101] FIGURE 11 depicts an example interface 1100 of a native application once the configuration display menu 1160 for the native application has been activated. As can be seen by the highlighted text "Timeline: Egypt in Transition", the element for the highlighted text "Timeline: Egypt in Transition" has been selected by a marketing user and the configuration information associated with the element is being displayed in the configuration display menu 1160. The configuration display menu 1160 can be similar to the configuration display menu 1060 of FIGURE 10C; however, the configuration display menu 1160 notably also presents the current tracking status 1161 for the element for the highlighted text "Timeline: Egypt in Transition". As illustrated, the property for the element is Currently Tracking = "YES", indicating that use of the element is currently being tracked by the native application.
[0102] The user interface controls shown in FIGURES 10A-C and 11 are merely illustrative examples and can be varied in other embodiments. For instance, buttons, dropdown boxes, select boxes, text boxes, check boxes, slider controls, and other user interface controls shown may be substituted with other types of user interface controls that provide the same or similar functionality. Further, the user interface controls may be combined or divided into other sets of user interface controls such that similar functionality or the same functionality may be provided with very different looking user interfaces. Moreover, each of the user interface controls may be selected by a user using one or more input options, such as a mouse, touch screen input, or keyboard input, among other user interface input options.
IV. Additional Embodiments and Terminology
[0103] Many other variations than those described herein will be apparent from this disclosure. For example, depending on the embodiment, certain acts, events, or functions of any of the algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether (e.g., not all described acts or events are necessary for the practice of the algorithms). Moreover, in certain embodiments, acts or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially. In addition, different tasks or processes can be performed by different machines and/or computing systems that can function together.
[0104] Although some embodiments of the disclosure have been illustrated using tag management systems and associated devices and systems as examples, the disclosure can further apply to other application environments. In one example, the disclosure can apply to native application environments where a digital marketing user or native application provider can desire to flexibly change the way a native application behaves after deployment, such as by changing colors, background images, text, or the like of the native application. In another example, the disclosure can apply to native application environments where a digital marketing user or native application provider may desire to use the native application itself as a medium for viewing or changing configurations for the native application for end users of the native application. In a further example, the disclosure can apply to native application environments where a digital marketing user or native application provider can desire to display or alter information related to third-party code embedded in a first-party native application or third-party configurations associated with the first-party native application. In yet another example, the disclosure can apply to native application environments where a digital marketing user or native application provider can desire to activate third-party features embedded in a first-party native application where the third-party features are intended for use by the digital marketing user or native application provider and not for use by the end user of the first-party native application.
[0105] The various illustrative logical blocks, modules, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.
[0106] The various illustrative logical blocks and modules described in connection with the embodiments disclosed herein can be implemented or performed by a machine, such as a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor can be a microprocessor, but in the alternative, the processor can be a controller, microcontroller, or state machine, combinations of the same, or the like. A hardware processor can include electrical circuitry or digital logic circuitry configured to process computer-executable instructions. In another embodiment, a processor includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
[0107] The steps of a method, process, or algorithm described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module stored in one or more memory devices and executed by one or more processors, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory computer-readable storage medium, media, or physical computer storage known in the art. An example storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The storage medium can be volatile or nonvolatile. The processor and the storage medium can reside in an ASIC.
[0108] Conditional language used herein, such as, among others, "can," "might," "may," "e.g.," and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or states are included or are to be performed in any particular embodiment. The terms "comprising," "including," "having," and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term "or" is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term "or" means one, some, or all of the elements in the list. Further, the term "each," as used herein, in addition to having its ordinary meaning, can mean any subset of a set of elements to which the term "each" is applied.
[0109] While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As will be recognized, certain embodiments of the inventions described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others.
APPENDIX A
//
// TealiumiOSMobileCompanion.m
// TealiumiOSTagger
//
// Created by Jason Koo on 11/7/12.
// Copyright (c) 2012 Tealium, Inc. All rights reserved.
//
#import "TealiumiOSMobileCompanion.h"
@implementation TealiumiOSMobileCompanion
#define TealiumAnimationDuration 0.5
#define AnimationScaling 1.2
#define TealiumAnimationDelay 0.25
#define DefaultTriggerWindow 5.0
NSString * const TealiumMobileCompanionEnable = @"mobileCompanion_enable";
NSString * const TealiumMobileCompanionDisable = @"mobileCompanion_disable";
#pragma mark - SETUP
- (id) initWithData:(NSDictionary*)data{
    self = [super init];
    if (self){
        if ([data isKindOfClass:[NSDictionary class]]) _data = [[NSDictionary alloc] initWithDictionary:data];
        //default settings
        self.isAnimated = YES;
        self.displayUIElements = YES;
        self.displayPropertyElements = NO;
        self.overlayAlpha = 0.25;
        [self layEasterEgg];
    }
    return self;
}
#pragma mark - UNLOCK
// add listener for 2 stage mobile companion trigger
- (void) layEasterEgg{
    //using UIAccelerometer
    // [UIAccelerometer sharedAccelerometer].delegate = self;
    //using CoreMotion
    [self performSelectorInBackground:@selector(enableShakeDetection) withObject:nil];
}

- (void) popEasterEggs{
    [self removeSecondEasterEgg];
    //using UIAccelerometer
    // [UIAccelerometer sharedAccelerometer].delegate = nil;
    //using CoreMotion
    [self disableShakeDetection];
}
//USING CoreMotion to track shakes
// Ensures the shake is strong enough on at least two axes before declaring it a shake.
// "Strong enough" means "greater than a client-supplied threshold" in G's.
static BOOL CMAccelerationIsShaking(CMAcceleration last, CMAcceleration current, double threshold) {
    double
        deltaX = fabs(last.x - current.x),
        deltaY = fabs(last.y - current.y),
        deltaZ = fabs(last.z - current.z);
    return
        (deltaX > threshold && deltaY > threshold) ||
        (deltaX > threshold && deltaZ > threshold) ||
        (deltaY > threshold && deltaZ > threshold);
}

- (void) enableShakeDetection{
    _motionManager = [[CMMotionManager alloc] init];
    _motionManager.deviceMotionUpdateInterval = 0.01;
    [_motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                        withHandler:^(CMDeviceMotion *motion, NSError *error)
    {
        // if (_lastCMAcceleration) {
        if (!_shaking && CMAccelerationIsShaking(_lastCMAcceleration, motion.userAcceleration, 0.7)) {
            _shaking = YES;
            /* SHAKE DETECTED. Listen for second key in mobile companion trigger */
            [self laySecondEasterEgg];
        } else if (_shaking && !CMAccelerationIsShaking(_lastCMAcceleration, motion.userAcceleration, 0.3)) {
            _shaking = NO;
        }
        // }
        _lastCMAcceleration = motion.userAcceleration;
    }];
}
-(void) disableShakeDetection{
    _motionManager = nil;
}
//USING UIAccelerometer to track shakes - DEPRECATED in 5.0
// Ensures the shake is strong enough on at least two axes before declaring it a shake.
// "Strong enough" means "greater than a client-supplied threshold" in G's.
static BOOL UIAccelerationIsShaking(UIAcceleration* last, UIAcceleration* current, double threshold) {
    double
        deltaX = fabs(last.x - current.x),
        deltaY = fabs(last.y - current.y),
        deltaZ = fabs(last.z - current.z);
    return
        (deltaX > threshold && deltaY > threshold) ||
        (deltaX > threshold && deltaZ > threshold) ||
        (deltaY > threshold && deltaZ > threshold);
}

// triggered by motion - this is the first key in the mobile companion trigger
// Deprecated in 5.0
- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    if (_lastAcceleration) {
        if (!_shaking && UIAccelerationIsShaking(_lastAcceleration, acceleration, 0.7)) {
            _shaking = YES;
            /* SHAKE DETECTED. Listen for second key in mobile companion trigger */
            [self laySecondEasterEgg];
        } else if (_shaking && !UIAccelerationIsShaking(_lastAcceleration, acceleration, 0.2)) {
            _shaking = NO;
        }
    }
    _lastAcceleration = acceleration;
}
// the second lock in the mobile companion trigger listens for the second key for only a short time before dismissing itself
-(void) laySecondEasterEgg{
    NSLog(@"%s: second easter egg laid", __FUNCTION__);
    if (!_secondEasterEgg){
        _secondEasterEgg = [[UITapGestureRecognizer alloc]initWithTarget:self action:@selector(unlock)];
        [_secondEasterEgg setNumberOfTapsRequired:3];
        UIView *view = [[UIApplication sharedApplication].windows lastObject];
        [view addGestureRecognizer:_secondEasterEgg];
        if (!_mobileCompanionTriggerWindow) _mobileCompanionTriggerWindow = DefaultTriggerWindow;
        [self performSelector:@selector(removeSecondEasterEgg) withObject:nil afterDelay:_mobileCompanionTriggerWindow];
    }
}
//2nd lock re-engaged. First easter egg must be tripped again to unlock
-(void) removeSecondEasterEgg{
    // NSLog(@"%s", __FUNCTION__);
    UIView *view = [[UIApplication sharedApplication].windows lastObject];
    [view removeGestureRecognizer:_secondEasterEgg];
#if !(__has_feature(objc_arc))
    [_secondEasterEgg release];
#endif
    _secondEasterEgg = nil;
    _shaking = NO;
}
-(void) unlock{
    NSLog(@"%s", __FUNCTION__);
    self.enabled = YES;
    [self popEasterEggs];
    [[NSNotificationCenter defaultCenter]
        postNotificationName:TealiumMobileCompanionEnable object:self];
}
#pragma mark - GENERAL ACTIONS
-(void) setOverlayColor:(UIColor*)color{
    _overlayColor = color;
}

// popup in properties mode
-(void) displayPropertiesView{
    NSMutableDictionary *mDict = [NSMutableDictionary
        dictionaryWithDictionary:[self overviewData]];
    [mDict addEntriesFromDictionary:[self
        propertiesDataForViewController:_hostViewController]];
    [mDict addEntriesFromDictionary:[self outboundLog]];
    NSDictionary *popupData = [NSDictionary dictionaryWithDictionary:mDict];
    if (!_popup) _popup = [[TealiumiOSPopup alloc]
        initWithNibName:@"TealiumiOSPopup" bundle:nil];
    [_popup setDelegate:self];
    [_popup refreshInView:_hostViewController.view.window data:popupData openTab:TealiumPopupTab_Parent];
    [self matchRotation];
    [self hideObject:_controllerButton];
}
// popup in object details mode
-(void) displayTrackableItemDetails:(id) sender{
    //sender should always be the gesture recognizer attached
    if (![sender isKindOfClass:[UIGestureRecognizer class]]) return;
    UIGestureRecognizer *gesture = sender;
    UIButton *button = (UIButton*)gesture.view;
    NSMutableDictionary *mDict = [NSMutableDictionary
        dictionaryWithDictionary:[self overviewData]];
    [mDict addEntriesFromDictionary:[self objectDataForOverlayButton:button]];
    [mDict addEntriesFromDictionary:[self outboundLog]];
    id controller = [_overlayControllers objectAtIndex:button.tag];
    [mDict addEntriesFromDictionary:[self
        propertiesDataForViewController:controller]];
    NSDictionary *popupData = [NSDictionary dictionaryWithDictionary:mDict];
    if (!_popup) _popup = [[TealiumiOSPopup
        alloc]initWithNibName:@"TealiumiOSPopup" bundle:nil];
    [_popup setDelegate:self];
    [_popup refreshInView:_hostViewController.view.window data:popupData openTab:TealiumPopupTab_Item];
    [self matchRotation];
    [self hideObject:_controllerButton];
}
-(void) displayOverlays{
    // after scanning and overlay creation is complete, display the overlay buttons
    if ([_overlays respondsToSelector:@selector(count)]){
        for (int i = 0; i < [_overlays count]; i++){
            UIButton *button = [_overlays objectAtIndex:i];
            if (self.isAnimated){
                [self performSelector:@selector(revealOverlay:) withObject:button afterDelay:(i * TealiumAnimationDelay)];
            }
            else button.alpha = self.overlayAlpha;
        }
    }
}
//Purge overlay tracking data - Iterate through overlay array, dismissing all overlay buttons
-(void) purge{
    if ([_overlays respondsToSelector:@selector(count)]){
        for (int i = 0; i < [_overlays count]; i++){
            UIButton *button = [_overlays objectAtIndex:i];
            [button removeFromSuperview];
        }
        [_overlays removeAllObjects];
        [_overlayTargets removeAllObjects];
    }
    if (_overlays != nil) _overlays = nil;
    if (_overlayTargets != nil) _overlayTargets = nil;
    if (_overlayControllers != nil) _overlayControllers = nil;
}
- (void) matchRotation{
    UIViewController *viewController = _hostViewController;
    [UIView animateWithDuration:.25 animations:^{
        CGAffineTransform transform;
        BOOL transformOk = NO;
        if (viewController.interfaceOrientation ==
            UIInterfaceOrientationLandscapeLeft){
            transform = CGAffineTransformMakeRotation( M_PI_2 * -1 );
            transformOk = YES;
        }
        else if (viewController.interfaceOrientation ==
            UIInterfaceOrientationLandscapeRight){
            transform = CGAffineTransformMakeRotation(M_PI_2);
            transformOk = YES;
        }
        else if (viewController.interfaceOrientation ==
            UIInterfaceOrientationPortraitUpsideDown){
            transform = CGAffineTransformMakeRotation(M_PI);
            transformOk = YES;
        }
        else if (viewController.interfaceOrientation == UIInterfaceOrientationPortrait) {
            transform = CGAffineTransformIdentity;
            transformOk = YES;
        }
        if (transformOk){
            _controllerButton.transform = transform;
            _popup.view.transform = transform;
        }
    }];
}
- (void) lock{
    [self hideObject:_controllerButton];
    [self removeControllerTab];
    [self purge];
    [self layEasterEgg];
    self.enabled = NO;
    // [[NSNotificationCenter defaultCenter]
    //     postNotificationName:TealiumMobileCompanionDisable object:nil userInfo:nil];
}
#pragma mark - CONTROLLER BUTTON
// makes mobile companion an observer of device orientation changes
- (void) enableOrientationHook{
    UIDevice *device = [UIDevice currentDevice];
    [[NSNotificationCenter defaultCenter] removeObserver:self
        name:UIDeviceOrientationDidChangeNotification object:device];
    if (device.generatesDeviceOrientationNotifications){
        [[NSNotificationCenter defaultCenter] addObserver:self
            selector:@selector(matchRotation)
            name:UIDeviceOrientationDidChangeNotification object:device];
    } else {NSLog(@"%s: Device orientation notifications have been disabled. Enable to activate Mobile Companion's orientation tracking.", __FUNCTION__);}
}
// adds controller tab button to window of viewController passed in
-(void) addControllerTab:(UIViewController*) viewController{
    NSLog(@"%s: vC:%@", __FUNCTION__, viewController);
    if (!_controllerButton){
        //initial setup
        double controllerWidth = 50;
        double controllerHeight = 50;
        CGRect frame = CGRectMake(0, controllerHeight, controllerWidth, controllerHeight);
        _controllerButton = [[UIButton alloc]initWithFrame:frame];
        UIImage *buttonBG = [UIImage
            imageNamed:@"TealiumiOSMobileCompanion_icon.png"];
        [_controllerButton setImage:buttonBG forState:UIControlStateNormal];
        _controllerButton.alpha = 0.0;
        //dropshadow
        _controllerButton.layer.masksToBounds = NO;
        _controllerButton.layer.cornerRadius = 8;
        _controllerButton.layer.shadowOffset = CGSizeMake(3, 3);
        _controllerButton.layer.shadowRadius = 5;
        _controllerButton.layer.shadowOpacity = 0.5;
        //draggability
        UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self
            action:@selector(viewDragged:)];
        [_controllerButton addGestureRecognizer:panGesture];
#if !(__has_feature(objc_arc))
        [panGesture release];
#endif
        //activation trigger
        UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(displayPropertiesView)];
        [singleTap setNumberOfTapsRequired:1];
        [singleTap setNumberOfTouchesRequired:1];
        [singleTap requireGestureRecognizerToFail:panGesture];
        [_controllerButton addGestureRecognizer:singleTap];
#if !(__has_feature(objc_arc))
        [singleTap release];
#endif
        //track orientation
        [self enableOrientationHook];
        if (!viewController) {
            UIWindow *window = [[UIApplication sharedApplication].windows lastObject];
            viewController = window.rootViewController;
            NSLog(@"%s: no vc passed in, autolocated rootviewcontroller in windows:%@ root:%@", __FUNCTION__,
                [UIApplication sharedApplication].windows, viewController);
        }
        [viewController.view.window addSubview:_controllerButton];
    }
    if (_controllerButton.alpha < 1.0 && (!_popup || _popup.view.alpha < 1.0)) [self revealObject:_controllerButton];
    _hostViewController = viewController;
    [self matchRotation];
    [self popEasterEggs];
}
-(void) revealObject:(id)object{
    if (!object) return;
    //TODO: animate size
    UIView *view = (UIView*)object;
    if (view.alpha < 1.0){
        // CGAffineTransform transform = CGAffineTransformMakeScale(1.0/AnimationScaling, 1.0/AnimationScaling);
        // view.transform = transform;
        [UIView animateWithDuration:TealiumAnimationDuration
            animations:^{
                view.alpha = 1.0;
            } completion:nil
        ];
    }
}

-(void) hideObject:(id)object{
    if (!object) return;
    UIView *view = (UIView*)object;
    if (view.alpha > 0.0){
        [UIView animateWithDuration:TealiumAnimationDuration
            animations:^{
                view.alpha = 0.0;
            } completion:nil
        ];
    }
}
- (void)viewDragged:(UIPanGestureRecognizer *)gesture
{
    UIView *view = (UIView *)gesture.view;
    CGPoint translation = [gesture translationInView:view.window];
    view.center = CGPointMake(view.center.x + translation.x,
                              view.center.y + translation.y);
    //check & limit to window boundaries
    double newX = view.center.x;
    double newY = view.center.y;
    double leftBoundry = (view.window.frame.origin.x + (view.frame.size.width/2));
    double rightBoundry = (view.window.frame.size.width - (view.frame.size.width/2));
    double topBoundry = (view.window.frame.origin.y + (view.frame.size.height/2));
    //enable to prevent tab bar from being partially hidden underneath status bar. Currently good only if app is limited to portrait mode
    // if(![[UIApplication sharedApplication] isStatusBarHidden]) topBoundry = topBoundry + 20;
    double bottomBoundry = (view.window.frame.size.height - (view.frame.size.height/2));
    if (newX < leftBoundry) newX = leftBoundry;
    if (newX > rightBoundry) newX = rightBoundry;
    if (newY < topBoundry) newY = topBoundry;
    if (newY > bottomBoundry) newY = bottomBoundry;
    if (newX != view.center.x || newY != view.center.y){
        view.center = CGPointMake(newX, newY);
    }
    CGRect converted = [view convertRect:view.layer.frame toView:nil];
    [self tealiumiOSPopupMoved:converted];
    // reset translation
    [gesture setTranslation:CGPointZero inView:view];
}
-(void) removeControllerTab{
    [_controllerButton removeFromSuperview];
    _controllerButton = nil;
}

//resets mobile companion icon & dialog box but NOT overlays
- (void) refresh:(UIViewController*)viewController{
    NSLog(@"%s: vc:%@", __FUNCTION__, viewController);
    _hostViewController = viewController;
    @synchronized(self){
        if(!_controllerButton) [self addControllerTab:viewController];
    }
    if (_popup.view.alpha > 0.0) [self displayPropertiesView];
}
//TODO: update to use same method from autoTracker
// returns dict of propertyNames and class types
-(NSDictionary *) dictionaryWithPropertiesOfObject:(id)object{
    if([object isKindOfClass:[UIViewController class]]){
        NSMutableDictionary *mDict = [NSMutableDictionary dictionary];
        unsigned count;
        objc_property_t *properties = class_copyPropertyList([object class], &count);
        for (int i = 0; i < count; i++) {
            NSString *key = [NSString
                stringWithUTF8String:property_getName(properties[i])];
            Class classObject = NSClassFromString(key);
            if (classObject) {
                NSLog(@"%s: class object found:%@", __FUNCTION__, classObject);
                if ([object respondsToSelector:@selector(valueForKey:)]){
                    id subObj = [self dictionaryWithPropertiesOfObject:[object valueForKey:key]];
                    [mDict setObject:subObj forKey:key];
                }
            }
            else
            {
                if ([object respondsToSelector:@selector(valueForKey:)]){
                    id aObject = [object valueForKey:key];
                    if(aObject) [mDict setObject:aObject forKey:key];
                }
            }
        }
        free(properties);
        NSDictionary *propertyDict = [NSDictionary dictionaryWithDictionary:mDict];
        return propertyDict;
    } else return nil;
}
#pragma mark - POPUP DATA
- (NSDictionary*) overviewData{
NSMutableDictionary *mDict = [NSMutableDictionary dictionary];
if (_data){
[mDict setObject:_data forKey:@"Account Information"];
}
NSDictionary *utag_config = [[TealiumiOSTagger sharedInstance] utag_config];
if (!utag_config) utag_config = [NSDictionary dictionaryWithObjectsAndKeys:@"Tracking all objects", @"(No data loaded)", nil];
[mDict setObject:utag_config forKey:@"Current UTag Configuration"];
NSDictionary *dictionary = [NSDictionary dictionaryWithDictionary:mDict];
return [NSDictionary dictionaryWithObjectsAndKeys:dictionary, TealiumPopupTab_Overview, nil];
}
- (NSDictionary*) propertiesDataForViewController:(id)controller{
NSDictionary *propertiesData = [self dictionaryWithPropertiesOfObject:controller];
NSDictionary *typeData = [NSDictionary dictionaryWithObjectsAndKeys:NSStringFromClass([controller class]), @"class", nil];
//TODO: insert UTAG data dictionary here
//get global data
NSMutableDictionary *viewData = [NSMutableDictionary dictionaryWithDictionary:[TealiumiOSTagger sharedInstance].baseUTagVariables];
// TODO: add any additional event data
NSDictionary *additionalEventData = [[TealiumiOSTagger sharedInstance] additionalEventDataFor:controller];
if (additionalEventData) [viewData addEntriesFromDictionary:additionalEventData];
NSDictionary *parentData = [NSDictionary dictionaryWithObjectsAndKeys:viewData, @"Utag Call Data (View)", propertiesData, @"Properties", typeData, @"Info", nil];
return [NSDictionary dictionaryWithObjectsAndKeys:parentData, TealiumPopupTab_Parent, nil];
}
//sent and queued calls log
- (NSDictionary*) outboundLog{
NSDictionary *log = [[TealiumiOSTagger sharedInstance] outboundLog];
NSDictionary *queue = [[TealiumiOSTagger sharedInstance] outboundQueue];
NSDictionary *logData = [NSDictionary dictionaryWithObjectsAndKeys:log, @"Sent", queue, @"Queued", nil];
// NSLog(@"%s: log:%@", __FUNCTION__, callLogDict);
return [NSDictionary dictionaryWithObjectsAndKeys:logData, TealiumPopupTab_Log, nil];
}
- (NSDictionary*) objectDataForOverlayButton:(UIButton*)overlayButton{
id object = [_overlayTargets objectAtIndex:overlayButton.tag];
UIView *objectView = object;
NSMutableDictionary *mDict = [NSMutableDictionary dictionary];
//object class
NSString *objectClass = NSStringFromClass([object class]);
[mDict setObject:objectClass forKey:@"class"];
//accessibility label
if (!objectView.accessibilityLabel) [mDict setObject:@"(Not set)" forKey:@"accessibilityLabel"];
else [mDict setObject:objectView.accessibilityLabel forKey:@"accessibilityLabel"];
//tealium id
NSString *tealiumID = [[[TealiumiOSTagger sharedInstance] autoTracker].refTracker tealiumIdForObject:object];
if (tealiumID) [mDict setObject:tealiumID forKey:TealiumUTAG_TealiumRefKey];
else {
NSLog(@"%s: object has no tealium ID yet. Denying overlay request", __FUNCTION__);
return nil;
}
//index
UIViewController *parentObject = [_overlayControllers objectAtIndex:overlayButton.tag];
NSArray *subviews = [[parentObject.view subviews] copy];
__block int targetIndex = -1;
[subviews enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
//special processing for mpmovieplayerviewcontrollers - get underlying mpmovieplayer's view
id objectCopy = object;
if ([object isKindOfClass:[MPMoviePlayerViewController class]]){
MPMoviePlayerViewController *mpvc = object;
objectCopy = mpvc.moviePlayer.view;
}
if ([object isKindOfClass:[MPMoviePlayerController class]]){
MPMoviePlayerController *mpc = object;
objectCopy = mpc.view;
}
if (obj == objectCopy){
targetIndex = idx;
*stop = YES;
}
}];
#if !(__has_feature(objc_arc))
[subviews autorelease];
#endif
//Link call data
NSMutableDictionary *linkData = [NSMutableDictionary dictionaryWithDictionary:[TealiumiOSTagger sharedInstance].baseUTagVariables];
NSDictionary *eventData = [[[TealiumiOSTagger sharedInstance] autoTracker] eventDataForObject:object];
[linkData addEntriesFromDictionary:eventData];
//add any custom additional event data
NSDictionary *additionalEventData = [[TealiumiOSTagger sharedInstance] additionalEventDataFor:tealiumID];
if (!additionalEventData) additionalEventData = [[TealiumiOSTagger sharedInstance] additionalEventDataFor:object];
if (additionalEventData) [linkData addEntriesFromDictionary:additionalEventData];
//set object data to dict
NSDictionary *objectInfo = [NSDictionary dictionaryWithObjectsAndKeys:mDict, @"Object Info", linkData, @"Utag Call Data (Link)", nil];
NSDictionary *item = [NSDictionary dictionaryWithObjectsAndKeys:objectInfo, TealiumPopupTab_Item, nil];
return item;
}
#pragma mark - POPUP DELEGATE
- (void) tealiumiOSPopupMoved:(CGRect)popupFrame{
UIViewController *viewController = [self encompassingViewControllerForPopup:popupFrame];
if (viewController && viewController != _hostViewController) [self refresh:viewController];
}
// returns current view controller of view beneath popup
- (UIViewController*) encompassingViewControllerForPopup:(CGRect)popupFrame{
UIViewController *rootViewController = [[TealiumiOSTagger sharedInstance] rootController];
NSMutableArray *mArray = [NSMutableArray array];
mArray = [self extendArray:mArray withObjectsInViewController:rootViewController];
for (UIViewController *viewController in mArray){
CGRect converted = [viewController.view convertRect:viewController.view.frame toView:nil];
if (viewController != _hostViewController && CGRectIntersectsRect(converted, popupFrame)){
return viewController;
}
}
return nil;
}
// scans and returns viewControllers within a given controller object
// calls itself to dive deeper into navigation and tabbarcontrollers
- (NSMutableArray*) extendArray:(NSMutableArray*)originArray withObjectsInViewController:(id)controller{
if (controller == _hostViewController) return originArray;
if ([controller isKindOfClass:[UINavigationController class]]){
// grab only presentedViewController
UINavigationController *navC = controller;
originArray = [self extendArray:originArray withObjectsInViewController:navC.topViewController];
} else if ([controller isKindOfClass:[UITabBarController class]]){
// grab only selected tab
UITabBarController *tbc = controller;
originArray = [self extendArray:originArray withObjectsInViewController:tbc.selectedViewController];
} else if ([controller respondsToSelector:@selector(viewControllers)]){
//grab all views
NSArray *controllers = [[controller valueForKey:@"viewControllers"] copy];
for (id aController in controllers){
originArray = [self extendArray:originArray withObjectsInViewController:aController];
}
#if !(__has_feature(objc_arc))
[controllers autorelease];
#endif
} else if ([controller isKindOfClass:[UIViewController class]]){
[originArray addObject:controller];
}
return originArray;
}
- (void) tealiumiOSPopupClosed{
if (_hostViewController && !_controllerButton) [self addControllerTab:_hostViewController];
[self revealObject:_controllerButton];
}
//shutdown mobile companion
- (void) tealiumiOSPopupTurnOff{
[self lock];
}
#pragma mark - OVERLAY BUTTONS
//Adds a Tealium MC button over target UlObject
//param object is the target object to add overlay to
-(void) addOverlayButtonToObject:(id)object inRoadMap:(NSMutableArray*)roadMap{
// NSLog(@"%s: object:%@ viewController:%@", __FUNCTION__, object, viewController);
if (!object) object = [[[TealiumiOSTagger sharedInstance] autoTracker].refTracker lastObjectFrom:roadMap];
NSLog(@"%s: object:%@ roadmap:%@", __FUNCTION__, object, roadMap);
if ([self objectAlreadyOverlaid:object]) return;
// if (![[TealiumiOSTagger sharedInstance] isTrackable:object]) return;
if (object == _controllerButton) return;
NSLog(@"%s: object new.", __FUNCTION__);
//TODO: add new BOOL check for mobilecompanion within roadmap
// if ([viewController isKindOfClass:[TealiumiOSMobileCompanion class]]) return;
//get viewController of object for overlays
UIViewController *viewController = [[[TealiumiOSTagger sharedInstance] autoTracker].refTracker lastClass:@"UIViewController" from:roadMap];
UIView *objectView = nil;
//find view of mpmovieplayerViewcontroller
if ([object isKindOfClass:[MPMoviePlayerViewController class]]){
NSLog(@"%s: mpmovieplayerViewcontroller found", __FUNCTION__);
MPMoviePlayerViewController *playerViewController = object;
object = playerViewController;
objectView = playerViewController.moviePlayer.view;
}
else if ([object isKindOfClass:[MPMoviePlayerController class]]){
NSLog(@"%s: mpmovieplayercontroller found", __FUNCTION__);
MPMoviePlayerController *playerController = object;
objectView = playerController.view;
}
//find view of avplayer
//TODO: complete AVPlayerClass items
if ([object isKindOfClass:[AVPlayer class]]){
NSLog(@"%s: avplayer found", __FUNCTION__);
//TODO: hookup avplayer overlays
// AVPlayer *player = object;
// object = player;
// objectView = player.currentItem.view;
return;
}
else if ([object isKindOfClass:[AVPlayerItem class]]){
NSLog(@"%s: avplayeritem found", __FUNCTION__);
// MPMoviePlayerController *playerController = object;
// objectView = playerController.view;
return;
}
//all other views
else if ([object respondsToSelector:@selector(addSubview:)]){
//object view is the actual object to lay the overlay button over
objectView = object;
}
if (!_overlays) _overlays = [[NSMutableArray alloc] init];
if (!_overlayTargets) _overlayTargets = [[NSMutableArray alloc] init];
if (!_overlayControllers) _overlayControllers = [[NSMutableArray alloc] init];
// //get immediate viewController of object for overlays
// UIViewController *viewController = [[TealiumiOSTagger sharedInstance].refTracker lastClass:@"UIViewController" from:roadMap];//[[TealiumiOSTagger sharedInstance].refTracker lastViewControllerFrom:roadMap];
NSLog(@"%s: vc:%@", __FUNCTION__, viewController);
// special processing for segmented controls
if ([object isKindOfClass:[UISegmentedControl class]]){
// if ([object respondsToSelector:@selector(subviews)]){
//add double tap gesture recognizer to top segmentedController itself
if (!_overlayColor) _overlayColor = [UIColor blueColor];
UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(displayTrackableItemDetails:)];
[doubleTap setNumberOfTapsRequired:2];
[doubleTap setNumberOfTouchesRequired:1];
[object addGestureRecognizer:doubleTap];
#if !(__has_feature(objc_arc))
[doubleTap release];
#endif
//add single tap gesture recognizer to segment items
NSArray *subviews = [object subviews];
for (int i = 0; i < [subviews count]; i++){
id aSubview = [subviews objectAtIndex:i];
[self addOverlayToSubview:aSubview target:object controller:viewController];
}
}
else {
[self addOverlayToSubview:objectView target:object controller:viewController];
}
}
// Adds actual overlay
-(void) addOverlayToSubview:(id)subview target:(id)target controller:(id)controller{
NSLog(@"%s: subview:%@ target:%@ controller:%@", __FUNCTION__, subview, target, controller);
if (!subview || !target || !controller) return;
if (![subview respondsToSelector:@selector(addSubview:)]) return;
__block UIButton *button = [self createOverlayButtonForView:subview];
if (button){
if (!controller){
NSLog(@"%s: WARNING! object:%@'s overlay has no associated controller.", __FUNCTION__, target);
controller = [NSNull null];
}
[_overlayTargets addObject:target];
[_overlayControllers addObject:controller];
//tag of gesture rec should match index of target
[_overlayTargets enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
if (obj == target) {
[button setTag:idx];
*stop = YES;
}
}];
[_overlays addObject:button];
button.alpha = 0.0;
[subview addSubview:button];
}
}
//actual button creation
-(UIButton*) createOverlayButtonForView:(UIView*)view{
//ignore missing objects and objects that are not currently visible
if (!view || view.alpha == 0.0) return nil;
CGRect bounds = view.bounds;
UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
[button setFrame:bounds];
if (!_overlayColor) _overlayColor = [UIColor blueColor];
[button setBackgroundColor:_overlayColor];
[button setAlpha:self.overlayAlpha];
UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(displayTrackableItemDetails:)];
[doubleTap setNumberOfTapsRequired:2];
[doubleTap setNumberOfTouchesRequired:1];
[button addGestureRecognizer:doubleTap];
#if !(__has_feature(objc_arc))
[doubleTap release];
#endif
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(passThru:)];
[singleTap setNumberOfTapsRequired:1];
[singleTap setNumberOfTouchesRequired:1];
[singleTap requireGestureRecognizerToFail:doubleTap];
[button addGestureRecognizer:singleTap];
#if !(__has_feature(objc_arc))
[singleTap release];
#endif
return button;
}
-(BOOL) objectAlreadyOverlaid:(id)object{
__block BOOL answer = NO;
[_overlayTargets enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
if (obj == object) {
answer = YES;
*stop = YES;
}
}];
return answer;
}
-(void) revealOverlay:(UIButton*)button{
if(button.alpha < self.overlayAlpha){
CGAffineTransform transform = CGAffineTransformMakeScale(1.0/AnimationScaling, 1.0/AnimationScaling);
button.transform = transform;
[UIView animateWithDuration:TealiumAnimationDuration
animations:^{
CGAffineTransform transform = CGAffineTransformIdentity;
button.transform = transform;
button.alpha = self.overlayAlpha;
}];
}
}
-(void) passThru:(id)sender{
//Pass single tap through to underlying object
//Note: sender is the gesture recognizer, not the button attached to
if([sender isKindOfClass:[UIGestureRecognizer class]]){
UIGestureRecognizer *gesture = sender;
UIButton *button = (UIButton*)gesture.view;
id object = [_overlayTargets objectAtIndex:button.tag];
//buttons
if ([object respondsToSelector:@selector(sendActionsForControlEvents:)]){
UIButton *host = object;
[host sendActionsForControlEvents:UIControlEventTouchUpInside];
}
//switches
if ([object isKindOfClass:[UISwitch class]]){
UISwitch *host = object;
//changes visually
if (host.on) [host setOn:NO animated:YES];
else [host setOn:YES animated:YES];
//triggers any attached actions
[host sendActionsForControlEvents:UIControlEventTouchUpInside];
[host sendActionsForControlEvents:UIControlEventValueChanged];
}
//textfields
if ([object isKindOfClass:[UITextField class]]){
UITextField *host = object;
[host becomeFirstResponder];
}
//textviews
if ([object isKindOfClass:[UITextView class]]){
UITextView *host = object;
[host becomeFirstResponder];
}
//video
if ([object isKindOfClass:[MPMoviePlayerViewController class]]){
//tell underlying mpmoviePlayer to play
MPMoviePlayerViewController *playerViewController = object;
MPMoviePlayerController *player = playerViewController.moviePlayer;
//play
if (player.playbackState != MPMoviePlaybackStatePlaying){
[player play];
}
//pause
else {
[player pause];
}
}
if ([object isKindOfClass:[MPMoviePlayerController class]]){
MPMoviePlayerController *player = object;
//play
if (player.playbackState != MPMoviePlaybackStatePlaying){
[player play];
}
//pause
else {
[player pause];
}
}
//tableview cells
//...
//segmentedControls
//...
}
}
#pragma mark - MEMORY MANAGEMENT
#if !(__has_feature(objc_arc))
-(void)dealloc{
[self purge];
if (_data) { [_data release]; _data = nil; }
if (_motionManager) { [_motionManager release]; _motionManager = nil; }
//may not need this call
if (_controllerButton != nil){
[_controllerButton release];
_controllerButton = nil;
}
[super dealloc];
}
#endif
@end

Claims

WHAT IS CLAIMED IS:
1. A method of presenting information about elements of a host application, the method comprising:
under control of a physical computing device comprising digital logic circuitry:
executing a host application;
receiving a first user input indicative of a user shaking the physical computing device;
in response to determining that the first user input matches a first activation input, executing a confirmation routine to process one or more additional user inputs to the physical computing device;
receiving a second user input with the confirmation routine after said receiving the first user input, the second user input indicative of the user contacting a screen of the physical computing device; and in response to determining, using the confirmation routine, that the second user input matches a second activation input, displaying a configuration utility on the screen, the configuration utility configured to output information regarding trackable elements of the host application.
2. The method of claim 1, further comprising:
under control of the physical computing device:
receiving a third user input indicative of selection of an interactive user interface element of the trackable elements of the host application after said receiving the second user input, the third user input indicative of the user contacting the screen;
in response to determining that the third user input matches a configuration selection input, processing the third user input using the configuration utility to output a tracking identifier associated with the interactive user interface element; and
in response to determining that the third user input does not match the configuration selection input, navigating within the host application based at least on the interactive user interface element.
3. The method of claim 1, wherein the configuration utility is configured for use by an administrator of the host application and not for use by an end user of the host application.
4. The method of any of claims 1-3, further comprising:
under control of the physical computing device:
in response to determining that the second user input has not been received within a timeout period, stopping said executing the confirmation routine.
5. The method of any of claims 1-3, wherein the physical computing device comprises a mobile phone or a tablet computer, and the host application comprises the confirmation routine and the configuration utility.
6. Non-transitory physical computer storage comprising computer-executable instructions stored thereon that, when executed by one or more processors, are configured to implement a process comprising:
receiving configuration information for configuring a physical computing device;
receiving a first user input from a user of the physical computing device, the first user input comprising a motion component;
in response to determining that the first user input matches a first activation input, listening for a second user input to the physical computing device using confirmation instructions of the computer-executable instructions;
receiving the second user input from the user; and
in response to determining, using the confirmation instructions, that the second user input matches a second activation input, displaying a configuration utility interface on a display of the physical computing device, the configuration utility interface configured to display information indicative of the configuration information.
7. The non-transitory physical computer storage of claim 6, wherein the first activation input is different from the second activation input.
8. The non-transitory physical computer storage of claim 6, wherein the process further comprises:
receiving a third user input from the user, the third user input indicative of selection of an element of a user interface displayed on the display;
in response to determining that the third user input matches a configuration selection input, displaying information corresponding to the third user input in the configuration utility interface, the configuration utility interface shown in juxtaposition to the user interface on the display; and
in response to determining that the third user input does not match the configuration selection input, displaying information corresponding to the third user input in the user interface.
9. The non-transitory physical computer storage of claim 6, wherein the process further comprises, in response to determining that the second user input has not been received within a timeout period, stopping said listening for the second input to the physical computing device using the confirmation instructions.
10. The non-transitory physical computer storage of any of claims 6-9, wherein the configuration information denotes elements of a user interface to be tracked as the user interacts with the user interface.
11. The non-transitory physical computer storage of claim 10, wherein the process further comprises transmitting, to a tracking server, data indicative of interactions of the user with the elements of the user interface denoted by the configuration information.
12. The non-transitory physical computer storage of claim 11, wherein the elements of the user interface denoted by the configuration information comprise links displayed in the user interface.
13. The non-transitory physical computer storage of any of claims 6-9, wherein the configuration utility interface is further configured to display whether elements of a user interface are trackable as the user interacts with the user interface.
14. The non-transitory physical computer storage of any of claims 6-9, wherein the configuration utility interface is further usable by the user to change the configuration information stored on the configuration information server when the user is an authenticated user.
15. The non-transitory physical computer storage of any of claims 6-9, wherein the second user input comprises an input indicative of consecutive taps on the display by the user.
16. The non-transitory physical computer storage of any of claims 6-9, wherein the computer-executable instructions comprise user interface instructions for displaying a user interface and configuration utility instructions for displaying the configuration utility interface, the confirmation and configuration utility instructions comprising third-party developed computer-executable instructions, the user interface instructions comprising first-party developed computer-executable instructions.
17. The non-transitory physical computer storage of claim 16, wherein the configuration utility interface is configured for use by an administrator of the computer-executable instructions, and the user interface is configured for use by an end user of the computer-executable instructions.
18. A system for presenting information regarding elements of a host application, the system comprising:
a memory configured to store a host application; and
a hardware processor in communication with the memory, the hardware processor configured to:
execute the host application,
listen for a motion input,
in response to determining that the motion input matches an expected motion input, listen for a user input received before an end of a timeout period, and
in response to determining that the user input matches an activation input, invoke an operation module,
wherein the expected motion input is different from the activation input.
19. The system of claim 18, wherein the processor is further configured to: in response to determining that a second user input matches a configuration selection input, process the second user input using the configuration utility; and
in response to determining that the second user input does not match the configuration selection input, not process the second user input using the configuration utility.
20. The system of claim 18 or 19, wherein the determination of whether the motion input matches the expected motion input and the determination of whether the user input matches the activation input are configured to provide a confirmation that a user intends to activate the configuration utility so that an end user of the host application does not accidentally encounter the configuration utility during routine use of the host application.
21. A system for providing access to a tag management application, the system comprising a mobile device, the mobile device comprising:
a processor; and
a memory device configured to store at least a tag management application and a gesture-to-display module;
the gesture-to-display module configured to, when executed by the processor:
listen for a shake gesture corresponding to a user shaking the mobile device;
in response to identifying the shake gesture, determine whether a predetermined interaction with the mobile device has occurred; and
in response to determining that the predetermined interaction with the mobile device has occurred, invoke the tag management application.
22. The system of claim 21, wherein the gesture-to-display module is further configured to listen for the shake gesture by hooking into a gesture application programming interface (API) of a host application stored in the memory.
23. The system of claim 21, wherein the gesture-to-display module is further configured to output an invisible overlay over a host application interface.
24. The system of claim 23, wherein the gesture-to-display module is further configured to detect screen activity via the invisible overlay to determine whether the predetermined interaction with the mobile device has occurred.
25. The system of claim 23, wherein the predetermined interaction comprises one or both of taps and swipes.
26. The system of claim 21, wherein the gesture-to-display module is further configured to determine whether a predetermined interaction with the mobile device has occurred by activating a voice detection module of the mobile device to listen for a voice command.
27. A system comprising:
a processor; and
a memory device configured to store at least a first application and a gesture-to-display module;
the gesture-to-display module configured to, when executed by the processor:
listen for a shake gesture corresponding to a user shaking the mobile device;
in response to identifying the shake gesture, determine whether a predetermined interaction with the mobile device has occurred; and
in response to determining that the predetermined interaction with the mobile device has occurred, invoke the first application.
28. The system of claim 27, wherein the gesture-to-display module is further configured to listen for the shake gesture by hooking into a gesture application programming interface (API) of a host application stored in the memory.
29. The system of claim 27, wherein the gesture-to-display module is further configured to output an invisible overlay over a host application interface.
30. The system of claim 29, wherein the gesture-to-display module is further configured to detect screen activity via the invisible overlay to determine whether the predetermined interaction with the mobile device has occurred.
31. The system of claim 29, wherein the predetermined interaction comprises one or both of taps and swipes.
32. The system of claim 27, wherein the gesture-to-display module is further configured to determine whether a predetermined interaction with the mobile device has occurred by activating a voice detection module of the mobile device to listen for a voice command.
33. A method comprising:
under control of a computing device comprising a processor:
listening for a shake gesture corresponding to a user shaking the computing device;
in response to identifying the shake gesture, determining whether a predetermined interaction with the computing device has occurred; and
in response to determining that the predetermined interaction with the computing device has occurred, invoking the first application.
34. The method of claim 33, wherein said listening for the shake gesture comprises hooking into a gesture application programming interface (API) of a host application.
35. The method of claim 33, further comprising outputting an invisible overlay over a host application interface.
36. The method of claim 35, further comprising detecting screen activity via the invisible overlay to determine whether the predetermined interaction with the mobile device has occurred.
37. The method of claim 35, wherein the predetermined interaction comprises one or both of taps and swipes.
38. The method of claim 33, further comprising determining whether a predetermined interaction with the computing device has occurred by activating a voice detection module of the computing device to listen for a voice command.
PCT/US2014/012217 2013-01-22 2014-01-20 Activation of dormant features in native applications WO2014116542A1 (en)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US201361755362P 2013-01-22 2013-01-22
US61/755,362 2013-01-22
US201361872530P 2013-08-30 2013-08-30
US61/872,530 2013-08-30
US201361889876P 2013-10-11 2013-10-11
US61/889,876 2013-10-11
US201361896351P 2013-10-28 2013-10-28
US61/896,351 2013-10-28
US201361900274P 2013-11-05 2013-11-05
US61/900,274 2013-11-05

Publications (1)

Publication Number Publication Date
WO2014116542A1 true WO2014116542A1 (en) 2014-07-31

Family

ID=50102201

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/012217 WO2014116542A1 (en) 2013-01-22 2014-01-20 Activation of dormant features in native applications

Country Status (2)

Country Link
US (2) US8843827B2 (en)
WO (1) WO2014116542A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9973585B2 (en) 2015-04-11 2018-05-15 Evidon, Inc. Methods, apparatus and systems for providing notice of digital tracking technologies in mobile apps on mobile devices, and for recording user consent in connection with same
US10026098B2 (en) 2010-01-06 2018-07-17 Evidon, Inc. Systems and methods for configuring and presenting notices to viewers of electronic ad content regarding targeted advertising techniques used by Internet advertising entities
US10291492B2 (en) 2012-08-15 2019-05-14 Evidon, Inc. Systems and methods for discovering sources of online content

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012161707A1 (en) * 2011-05-25 2012-11-29 Hewlett-Packard Development Company, L.P. Implementation of network device components in network devices
US9348490B2 (en) 2012-09-14 2016-05-24 Ca, Inc. User interface with configuration, registration, and runtime selection of views
US9412115B2 (en) 2013-03-14 2016-08-09 Observepoint, Inc. Configuring tags to monitor other webpage tags in a tag management system
US9418170B2 (en) 2013-03-14 2016-08-16 Observepoint, Inc. Creating rules for use in third-party tag management systems
US10134095B2 (en) * 2013-06-05 2018-11-20 Brabble TV.com LLC System and method for media-centric and monetizable social networking
US9537964B2 (en) 2015-03-11 2017-01-03 Tealium Inc. System and method for separating content site visitor profiles
US20150066587A1 (en) 2013-08-30 2015-03-05 Tealium Inc. Content site visitor processing system
US11695845B2 (en) 2013-08-30 2023-07-04 Tealium Inc. System and method for separating content site visitor profiles
US8805946B1 (en) 2013-08-30 2014-08-12 Tealium Inc. System and method for combining content site visitor profiles
US9081789B2 (en) 2013-10-28 2015-07-14 Tealium Inc. System for prefetching digital tags
US8990298B1 (en) 2013-11-05 2015-03-24 Tealium Inc. Universal visitor identification system
US10373192B2 (en) 2014-08-18 2019-08-06 Google Llc Matching conversions from applications to selected content items
US9357366B2 (en) 2014-09-12 2016-05-31 Observepoint, Inc. Auditing of mobile applications
US9842133B2 (en) 2014-09-12 2017-12-12 Observepoint, Inc. Auditing of web-based video
US9363311B1 (en) 2014-12-05 2016-06-07 Tealium Inc. Delivery of instructions in host applications
US20160162168A1 (en) * 2014-12-05 2016-06-09 Microsoft Technology Licensing, Llc Interaction sensing and recording of a process to control a computer system
KR101620779B1 (en) * 2015-01-08 2016-05-17 네이버 주식회사 Method and system for providing retargeting search services
CN105094801B (en) 2015-06-12 2019-12-24 阿里巴巴集团控股有限公司 Application function activation method and device
CN105093580B (en) * 2015-08-06 2020-12-01 京东方科技集团股份有限公司 Peep-proof structure, display panel, backlight module and display device
US10656907B2 (en) 2015-11-03 2020-05-19 Observepoint Inc. Translation of natural language into user interface actions
CN107153498B (en) * 2016-03-30 2021-01-08 斑马智行网络(香港)有限公司 Page processing method and device and intelligent terminal
JP6683835B2 (en) * 2016-04-12 2020-04-22 グーグル エルエルシー Reduced waiting time when downloading electronic resources using multiple threads
US9753898B1 (en) 2016-05-02 2017-09-05 Tealium Inc. Deployable tag management in computer data networks
US9807184B1 (en) 2016-06-02 2017-10-31 Tealium Inc. Configuration of content site user interaction monitoring in data networks
US11295706B2 (en) * 2016-06-30 2022-04-05 Microsoft Technology Licensing, Llc Customizable compact overlay window
US10078708B2 (en) 2016-11-15 2018-09-18 Tealium Inc. Shared content delivery streams in data networks
US10268657B2 (en) 2017-06-06 2019-04-23 Tealium Inc. Configuration of content site user interaction monitoring in data networks
US10327018B2 (en) 2017-10-17 2019-06-18 Tealium Inc. Engagement tracking in computer data networks
CN108874450B (en) * 2018-05-28 2021-05-04 北京小米移动软件有限公司 Method and device for waking up voice assistant
US10289445B1 (en) 2018-12-11 2019-05-14 Fmr Llc Automatic deactivation of software application features in a web-based application environment
US10599434B1 (en) * 2018-12-14 2020-03-24 Raytheon Company Providing touch gesture recognition to a legacy windowed software application
US11095735B2 (en) 2019-08-06 2021-08-17 Tealium Inc. Configuration of event data communication in computer networks
US11146656B2 (en) 2019-12-20 2021-10-12 Tealium Inc. Feature activation control and data prefetching with network-connected mobile devices
US11841880B2 (en) 2021-08-31 2023-12-12 Tealium Inc. Dynamic cardinality-based group segmentation

Citations (3)

Publication number Priority date Publication date Assignee Title
US20030184452A1 (en) * 2002-03-28 2003-10-02 Textm, Inc. System, method, and computer program product for single-handed data entry
US20120154292A1 (en) * 2010-12-16 2012-06-21 Motorola Mobility, Inc. Method and Apparatus for Activating a Function of an Electronic Device
US20120169624A1 (en) * 2011-01-04 2012-07-05 Microsoft Corporation Staged access points

Family Cites Families (70)

Publication number Priority date Publication date Assignee Title
WO1999018514A1 (en) 1997-10-06 1999-04-15 Contact Dynamics, Inc. System enabling a salesperson to chat with a customer browsing the salesperson's web site
US6836799B1 (en) 1998-09-11 2004-12-28 L.V. Partners, L.P. Method and apparatus for tracking user profile and habits on a global network
JP4741153B2 (en) 2000-03-01 2011-08-03 ブリティッシュ・テレコミュニケーションズ・パブリック・リミテッド・カンパニー Data transfer method and apparatus
GB0019151D0 (en) 2000-08-07 2000-09-27 Pace Micro Tech Plc Deferred internet page reformatting
WO2002044869A2 (en) 2000-11-02 2002-06-06 Netiq Corporation System and method for generating and reporting cookie values at a client node
JP2004529428A (en) 2001-04-16 2004-09-24 Porto Ranelli, S.A. Method of integrating email and world wide web communication with users
US20030208594A1 (en) 2002-05-06 2003-11-06 Urchin Software Corporation System and method for tracking unique visitors to a website
US7389343B2 (en) 2002-09-16 2008-06-17 International Business Machines Corporation Method, system and program product for tracking web user sessions
US7334088B2 (en) 2002-12-20 2008-02-19 International Business Machines Corporation Page descriptors for prefetching and memory management
US9117217B2 (en) 2003-08-01 2015-08-25 Advertising.Com Llc Audience targeting with universal profile synchronization
US20050125290A1 (en) 2003-08-01 2005-06-09 Gil Beyda Audience targeting system with profile synchronization
US20050138143A1 (en) 2003-12-23 2005-06-23 Thompson Blake A. Pre-fetching linked content
US7483941B2 (en) 2004-01-13 2009-01-27 International Business Machines Corporation System and method for dynamically inserting prefetch tags by the web server
US7458019B2 (en) 2004-01-20 2008-11-25 International Business Machines Corporation System and method for creating and rendering client-side user interfaces via custom tags
JP4452533B2 (en) 2004-03-19 2010-04-21 株式会社日立製作所 System and storage system
US7792954B2 (en) 2004-04-02 2010-09-07 Webtrends, Inc. Systems and methods for tracking web activity
US20060031778A1 (en) 2004-07-01 2006-02-09 Microsoft Corporation Computing platform for loading resources both synchronously and asynchronously
US8131861B2 (en) 2005-05-20 2012-03-06 Webtrends, Inc. Method for cross-domain tracking of web site traffic
US8239882B2 (en) * 2005-08-30 2012-08-07 Microsoft Corporation Markup based extensibility for user interfaces
US7805670B2 (en) 2005-12-16 2010-09-28 Microsoft Corporation Partial rendering of web pages
US8775919B2 (en) 2006-04-25 2014-07-08 Adobe Systems Incorporated Independent actionscript analytics tools and techniques
US7493451B2 (en) 2006-06-15 2009-02-17 P.A. Semi, Inc. Prefetch unit
US7992135B1 (en) 2006-06-26 2011-08-02 Adobe Systems Incorporated Certification of server-side partner plug-ins for analytics and privacy protection
US20060271669A1 (en) 2006-07-13 2006-11-30 Cubicice(Pty) Ltd Method of collecting data regarding a plurality of web pages visited by at least one user
US8539345B2 (en) 2006-07-24 2013-09-17 International Business Machines Corporation Updating portlet interface controls by updating a hidden version of the control and then switching it with a displayed version
WO2008024706A2 (en) 2006-08-21 2008-02-28 Crazy Egg, Inc. Visual web page analytics
US7610276B2 (en) 2006-09-22 2009-10-27 Advertise.Com, Inc. Internet site access monitoring
US7685200B2 (en) 2007-03-01 2010-03-23 Microsoft Corp Ranking and suggesting candidate objects
US7680940B2 (en) 2007-03-28 2010-03-16 Scenera Technologies, Llc Method and system for managing dynamic associations between folksonomic data and resources
WO2009009109A1 (en) 2007-07-09 2009-01-15 Blaksley Ventures 108, Llc System and method for providing universal profiles for networked clusters
US7685168B2 (en) 2007-08-31 2010-03-23 International Business Machines Corporation Removing web application flicker using AJAX and page templates
US8429243B1 (en) 2007-12-13 2013-04-23 Google Inc. Web analytics event tracking system
US10664889B2 (en) 2008-04-01 2020-05-26 Certona Corporation System and method for combining and optimizing business strategies
CN101483651B (en) 2009-01-09 2012-04-25 南京联创科技集团股份有限公司 Data transmission method based on map queue
US8386599B2 (en) 2009-03-04 2013-02-26 Max Fomitchev Method and system for estimating unique visitors for internet sites
US8930818B2 (en) 2009-03-31 2015-01-06 International Business Machines Corporation Visualization of website analytics
US20100281008A1 (en) 2009-04-29 2010-11-04 Digital River, Inc. Universal Tracking Agent System and Method
US8713536B2 (en) 2009-06-11 2014-04-29 Webtrends, Inc. Method and system for constructing a customized web analytics application
US20110015981A1 (en) 2009-07-17 2011-01-20 Mahesh Subramanian Systems and methods to incentivize transactions to enhance social goodness
US8453059B2 (en) 2009-08-31 2013-05-28 Accenture Global Services Limited Traffic visualization across web maps
WO2011041465A1 (en) 2009-09-30 2011-04-07 Tracking.Net Enhanced website tracking system and method
US8671089B2 (en) 2009-10-06 2014-03-11 Brightedge Technologies, Inc. Correlating web page visits and conversions with external references
US20110119100A1 (en) 2009-10-20 2011-05-19 Jan Matthias Ruhl Method and System for Displaying Anomalies in Time Series Data
US8359313B2 (en) 2009-10-20 2013-01-22 Google Inc. Extensible custom variables for tracking user traffic
US8578010B2 (en) 2009-12-17 2013-11-05 Mastercard International Incorporated Methods and system for tracking web page analytics
US10185964B2 (en) 2009-12-23 2019-01-22 International Business Machines Corporation Unification of web page reporting and updating through a page tag
US20120084349A1 (en) 2009-12-30 2012-04-05 Wei-Yeh Lee User interface for user management and control of unsolicited server operations
US8432368B2 (en) * 2010-01-06 2013-04-30 Qualcomm Incorporated User interface methods and systems for providing force-sensitive input
US20130290480A1 (en) 2010-01-11 2013-10-31 Ensighten, Inc. Use of Method Overrides for Dynamically Changing Visible Page Content
US9158733B2 (en) 2010-03-07 2015-10-13 Sailthru, Inc. Computerized system and method for linking a user's E-mail that tracks a user's interest and activity
US9785722B2 (en) 2010-04-01 2017-10-10 Forsee Results, Inc. Systems and methods for remote replay of user interaction with a webpage
ITRM20100175A1 (en) 2010-04-13 2011-10-14 Andrea Buratti Persistent modular dynamic web applications with complex interfaces
US8407321B2 (en) 2010-04-21 2013-03-26 Microsoft Corporation Capturing web-based scenarios
US20110282739A1 (en) 2010-05-11 2011-11-17 Alex Mashinsky Method and System for Optimizing Advertising Conversion
US8560610B2 (en) 2010-06-16 2013-10-15 Brighttag Inc. Unified collection and distribution of data
US8949315B2 (en) 2010-06-30 2015-02-03 Nbcuniversal Media, Llc System and method for generating web analytic reports
US8498895B2 (en) 2010-07-19 2013-07-30 Accenture Global Services Limited Browser based user identification
US8904277B2 (en) 2010-08-31 2014-12-02 Cbs Interactive Inc. Platform for serving online content
JP5730407B2 (en) 2010-12-20 2015-06-10 ザ ニールセン カンパニー (ユーエス) エルエルシー Method and apparatus for determining media impressions using distributed demographic information
US20120221411A1 (en) 2011-02-25 2012-08-30 Cbs Interactive Inc. Apparatus and methods for determining user intent and providing targeted content according to intent
US20130191208A1 (en) 2012-01-23 2013-07-25 Limelight Networks, Inc. Analytical quantification of web-site communications attributed to web marketing campaigns or programs
US9165308B2 (en) 2011-09-20 2015-10-20 TagMan Inc. System and method for loading of web page assets
US9208470B2 (en) 2011-10-04 2015-12-08 Yahoo! Inc. System for custom user-generated achievement badges based on activity feeds
US20130091025A1 (en) 2011-10-06 2013-04-11 Yahoo! Inc. Methods and systems for measuring advertisement effectiveness
US20130124327A1 (en) 2011-11-11 2013-05-16 Jumptap, Inc. Identifying a same user of multiple communication devices based on web page visits
CN102693501A (en) 2012-05-31 2012-09-26 刘志军 Method for analyzing Internet advertisement popularizing effect
AU2013204865B2 (en) 2012-06-11 2015-07-09 The Nielsen Company (Us), Llc Methods and apparatus to share online media impressions data
US20140013203A1 (en) 2012-07-09 2014-01-09 Convert Insights, Inc. Systems and methods for modifying a website without a blink effect
US20140081981A1 (en) 2012-09-19 2014-03-20 Deep River Ventures, Llc Methods, Systems, and Program Products for Identifying a Matched Tag Set
US20140215050A1 (en) 2013-01-29 2014-07-31 Array Networks, Inc. Method and system for web analytics using a proxy

Cited By (3)

Publication number Priority date Publication date Assignee Title
US10026098B2 (en) 2010-01-06 2018-07-17 Evidon, Inc. Systems and methods for configuring and presenting notices to viewers of electronic ad content regarding targeted advertising techniques used by Internet advertising entities
US10291492B2 (en) 2012-08-15 2019-05-14 Evidon, Inc. Systems and methods for discovering sources of online content
US9973585B2 (en) 2015-04-11 2018-05-15 Evidon, Inc. Methods, apparatus and systems for providing notice of digital tracking technologies in mobile apps on mobile devices, and for recording user consent in connection with same

Also Published As

Publication number Publication date
US20140208216A1 (en) 2014-07-24
US20150143244A1 (en) 2015-05-21
US9116608B2 (en) 2015-08-25
US8843827B2 (en) 2014-09-23

Similar Documents

Publication Publication Date Title
US9116608B2 (en) Activation of dormant features in native applications
KR102490421B1 (en) Systems, devices, and methods for dynamically providing user interface controls at a touch-sensitive secondary display
AU2013215233B2 (en) Sharing services
CN103154856B (en) For the environmental correclation dynamic range control of gesture identification
US9158518B2 (en) Collaborative application development environment using a connected device
EP2207333B1 (en) Method and system for modifying the execution of a native application running on a portable eletronic device
EP2184668B1 (en) Method, system and graphical user interface for enabling a user to access enterprise data on a portable electronic device
US10481760B2 (en) Interactive dynamic push notifications
US20170024226A1 (en) Information processing method and electronic device
US9582139B1 (en) Multi-level mobile device profiles
US11868605B2 (en) Application bar display method and electronic device
JP2020510250A (en) Service processing method and device
US20140325195A1 (en) Method for unlocking a mobile device
WO2017059676A1 (en) Smart card read/write methods and devices
US20150180998A1 (en) User terminal apparatus and control method thereof
US20180101574A1 (en) Searching index information for application data
US11416319B1 (en) User interface for searching and generating graphical objects linked to third-party content
US20190369827A1 (en) Remote data input framework
US20210026913A1 (en) Web browser control feature
US10708391B1 (en) Delivery of apps in a media stream
US11295706B2 (en) Customizable compact overlay window
WO2016200715A1 (en) Transitioning command user interface between toolbar user interface and full menu user interface based on use context
US20230350967A1 (en) Assistance user interface for computer accessibility
AU2012258338B2 (en) Method and system for modifying the execution of a native application running on a portable electronic device
Pradhan Cross-platform mobile and tablet application

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 14704432

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 14704432

Country of ref document: EP

Kind code of ref document: A1