Introduction

Last updated: 30-11-2023

Learn how to use AppNava to allow everyone in your organization to get the business data they need without bothering technical teams.

What is AppNava?

AppNava was founded in 2019 to make apps and games richer and happier. We are a team of four tech and mobile data experts who aspire to bring a revolution to the mobile and digital world. Data-driven solutions and machine learning run in our blood! The team excels at modelling complex user behaviours for mobile apps and predicting the future of your customers. With our experience in data science and engineering, we put forth our best to make you thrive. After a year of model development, we are happy to be here for mobile apps and games with data-driven insights.

What makes AppNava different?

Know Your Players

Understand and analyze the what, when, and why of your players' behaviour: what happened in the past, and why. At this stage, the machine learning algorithms begin training.

Predict Next Behaviour

Start predicting how your players will behave in the future, even before they know it themselves. Our machine learning algorithms predict player behaviour, allowing you to understand your players far more deeply.

Take Right Actions

Everyone is different in many ways: tastes, lifestyles, expectations… Showing the same offer to everybody does not make sense. Personalize your campaigns and game based on this knowledge.

How Does AppNava Work?

Connect to the largest mobile ecosystem through our globally integrated partners. Simply select your analytics partners and measurement tools (AWS, Google Analytics, Game Analytics, etc.). Then, on the receiving side, choose the API that works for you to access your data at any time. Once the API connection is complete, our mapping system selects the required features/variables from your raw data; an 80% match is enough to continue to the next steps.

For the next steps, see the How to Nava page to learn more!

Dashboard

An introduction to the dashboard and information about the user interfaces.

What is a Dashboard?

A dashboard is a visual display of all of your data. While it can be used in all kinds of different ways, its primary intention is to provide information at a glance.

A dashboard usually sits on its own page and receives information from a linked database. In many cases it’s configurable, allowing you to choose which data you want to see and whether to include charts or graphs to visualize the numbers.

Why Are Dashboards Important?

Dashboards allow professionals of all kinds to monitor performance, create reports, and set estimates and targets for future work.

Other benefits include:

  • A visual representation of performance, such as with charts and graphs
  • The ability to identify trends
  • An easy way of measuring efficiency
  • The means to generate detailed reports with a single click
  • The capacity to make more informed decisions
  • Total visibility of all systems, campaigns, and actions
  • Quick identification of data outliers and correlations

Getting Started

How to Sign Up?

  • Go to appnava.com and navigate to the Sign Up page
  • Fill in the necessary information and click Sign Up
  • An activation email will be sent to you. Click Activate Account and you are done.

How to Create My Company?

  • Go to the Profile tab on the Company page
  • Click the Start Your Company! button
  • Fill in the necessary information and click Save

How to Join a Company?

  • First, get your token from the email sent by the company admin
  • Copy the link
  • Go to appnava.com and click the Profile tab on the Company page
  • Click the Join a Company! button
  • Enter the token you copied and click Join

How to Send an Invitation?

  • Go to the Users page on the dashboard and click Invite User
  • Enter the email address of the person you want to invite and click Send Invitation

How to Add My Game/App?

  • Go to the Products tab on the Company page
  • Click the Add Product button and fill in the form up to BigQuery Project ID
  • For the BigQuery Project ID, go to Google Cloud Platform
  • Find your BigQuery Project ID and type it into the field (e.g. focusshot-12345). It is all lowercase and uses hyphens instead of spaces.
  • Fill in the rest of the form and save your product

How to Use Machine Learning Models?

  • Go to the Data Operations page
  • We will be notified and start working on your models
  • When your models are ready, you will be notified by e-mail

    Note

    Usually a dataset of 1,000,000 events and 30,000 distinct users is needed for the models to work. We will find the time point after which these numbers are met and update the associated columns.
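    A minimal sketch for checking these thresholds yourself, assuming the standard GA4 BigQuery export schema; the project and dataset IDs below are placeholders:

      # Count total events and distinct users in a GA4 BigQuery export.
      # "your-project-id" and "analytics_123456789" are placeholders.
      from google.cloud import bigquery

      client = bigquery.Client(project="your-project-id")
      query = """
          SELECT
            COUNT(*) AS total_events,
            COUNT(DISTINCT user_pseudo_id) AS distinct_users
          FROM `your-project-id.analytics_123456789.events_*`
      """
      row = next(iter(client.query(query).result()))
      print(f"events: {row.total_events}, users: {row.distinct_users}")
      print("thresholds met:", row.total_events >= 1_000_000 and row.distinct_users >= 30_000)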

    How to Use Predictions?

    • You will be able to see the ready-to-use models in the Trained Models section
    • Go to the Firebase Console and define a User Property
    • Embed the sample code shown into your Unity scripts; it sets the user property value depending on the prediction
    • Go to the Google Cloud Console and create a new Cloud Function with the sample code shown (a sketch follows after the note below)
    • Use Firebase Remote Config to take actions according to predictions
    • Post data to the given endpoint
    • The prediction will be returned in JSON format
    • Take your actions according to the predictions

    Note

    You need to define a conversion event and log it right before calling the Google Cloud Function, so that not-yet-collected events are flushed to BigQuery and can then be used for prediction.
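    Below is a minimal Python Cloud Function sketch of this flow. It is not the exact sample code shown on the dashboard; the endpoint URL and payload fields are placeholders, so substitute the values provided for your product:

      # A hypothetical Cloud Function that forwards a player to the prediction
      # endpoint and returns the JSON result. Placeholders throughout.
      import functions_framework
      import requests

      PREDICTION_ENDPOINT = "https://example-appnava-endpoint/predict"  # placeholder URL

      @functions_framework.http
      def predict(request):
          body = request.get_json(silent=True) or {}
          user_id = body.get("user_id")
          if not user_id:
              return {"success": False, "message": "user_id is required"}, 400

          # The conversion event should already have been logged so that
          # not-yet-collected events are flushed to BigQuery (see the note above).
          resp = requests.post(PREDICTION_ENDPOINT, json={"user_id": user_id}, timeout=30)
          return resp.json(), resp.status_code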

    Integration

    In this section you can learn how to configure your database connection.

    BigQuery

    The steps below describe how AppNava works for Firebase / Google Cloud & Unity customers.

    Collect Data

    Allow us to pull your data

    • Go to Google Cloud IAM
    • Enter the e-mail addresses provided by AppNava in the menu that opens on the side
    • We need at least three roles to be able to read from BigQuery; search for the roles using the search box:
      • BigQuery Data Viewer
      • BigQuery Resource Viewer
      • BigQuery User
    • Click the Save button
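    A sketch for verifying the grants afterwards, assuming the google-cloud-resourcemanager client library; the project ID is a placeholder:

      # List the BigQuery role bindings on the project. Placeholder project ID.
      from google.cloud import resourcemanager_v3

      client = resourcemanager_v3.ProjectsClient()
      policy = client.get_iam_policy(request={"resource": "projects/your-project-id"})
      for binding in policy.bindings:
          if binding.role.startswith("roles/bigquery."):
              print(binding.role, list(binding.members))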

    Automatic prediction integration with MMPs and networks

    Follow these steps to set up and link Google Analytics 4 with BigQuery for Firebase/Google Cloud & Unity customers.

    Step 1: Create a Google-APIs-Console project and enable BigQuery

    Note: You must be an Editor or above to create a Google-APIs-Console project and enable BigQuery.

    1. Log in to the Google Cloud Console.
    2. Create a new Google Cloud Console project or select an existing project.
    3. Navigate to the APIs table: Open the Navigation menu in the top-left corner, click APIs & Services, then click Library.
    4. Activate BigQuery: Under Google Cloud APIs, click BigQuery API. On the following page, click Enable.
    5. If prompted, review and agree to the Terms of Service.

    Step 2: Prepare your project for BigQuery Export

    You can export Google Analytics data to the BigQuery sandbox free of charge (sandbox limits apply).

    Learn more about upgrading from the sandbox and BigQuery pricing.

    Step 3: Link a Google Analytics 4 property to BigQuery

    After you complete the first two steps, you can enable BigQuery Export from Analytics Admin.

    BigQuery Export is subject to the same collection and configuration limits as Google Analytics. If you need higher limits, you can upgrade your property to 360.

    1. In Admin, under Product Links, click BigQuery Links. Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector.
    2. You must be an Editor or above at the property level to link an Analytics property to BigQuery.
    3. You must also use an email address that has OWNER access to the BigQuery project (view Permissions below for detailed access requirements).
    4. Click Link.
    5. Click Choose a BigQuery project to display a list of projects for which you have access.
      • If you have linked Analytics and Firebase (or plan to), consider exporting to the same Cloud project, which will facilitate easier joins with other Firebase data.
    6. Select a project from the list, then click Confirm.
    7. Select a location for the data. (If your project already has a dataset for the Analytics property, you can't configure this option.)
    8. Click Next.
    9. Select Configure data streams and events to select which data streams to include with the export and specific events to exclude from the export. You can exclude events by either clicking Add to select from a list of existing events or by clicking Specify event by name to choose existing events by name or to specify event names that have yet to be collected on the property.
    10. Click Done.
    11. Select Include advertising identifiers for mobile app streams if you want to include advertising identifiers. Note: To ensure smooth functionality and avoid issues, make sure to enable Include Advertising Identifiers in Export in Firebase settings under Project > Settings > Integrations > BigQuery.
    12. Select either or both a Daily (once a day) or Streaming (continuous) export of data. For Analytics 360 properties, you may also select Fresh Daily.
    13. Click Next.
    14. Review your settings, then click Submit.

    Permissions

    You need project getIamPolicy/setIamPolicy rights and service get/enable rights; OWNER is a superset of these permissions.

    To create a BigQuery link the minimal permissions you need are:

    • resourcemanager.projects.get: To get the project
    • resourcemanager.projects.getIamPolicy: To get a list of permissions
    • resourcemanager.projects.setIamPolicy: To check if user has permission to create the link on this project
    • serviceusage.services.enable: To enable the BigQuery API
    • serviceusage.services.get: To check if the BigQuery API is enabled

    Verify the service account

    When you link Analytics and BigQuery, that process creates the following service account:

    [email protected]

    • Verify that the account has been added as a member of the project, and given the role of BigQuery User (roles/bigquery.user).
    • If you previously set up BigQuery Export to give your service account the Editor role for the Google Cloud project, you can reduce that role to BigQuery User. To change the role for the service account, you need to unlink and then relink Analytics to your BigQuery project. The first step is to unlink Analytics and BigQuery and remove the service account with the Editor role. Then, relink Analytics and BigQuery per the instructions above to create the new service account with the correct permission for the project.
    • After relinking, ensure that the service account has the Owner (bigquery.dataOwner) role on the existing export dataset. You can do this by viewing the access policy of the dataset, or programmatically as sketched below.
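    A sketch of granting that role on the dataset, assuming the google-cloud-bigquery client library; the project and dataset IDs are placeholders:

      # Grant the Analytics service account OWNER access on the export dataset.
      # "your-project-id" and "analytics_123456789" are placeholders.
      from google.cloud import bigquery

      client = bigquery.Client(project="your-project-id")
      dataset = client.get_dataset("analytics_123456789")

      entries = list(dataset.access_entries)
      entries.append(
          bigquery.AccessEntry(
              role="OWNER",  # dataset-level equivalent of bigquery.dataOwner
              entity_type="userByEmail",
              entity_id="firebase-measurement@system.gserviceaccount.com",
          )
      )
      dataset.access_entries = entries
      client.update_dataset(dataset, ["access_entries"])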

    Change regions

    If you choose the wrong region and need to change it after you've created the link:

    1. Delete the link to BigQuery (see below).
    2. Back up the data to another dataset in BigQuery (move or copy).
    3. Delete the original dataset. Take note of the name: you'll need it in the next step.
    4. Create a new dataset with the same name as the dataset you just deleted, and select the location for the data.
    5. Share the new dataset with [email protected] and give the service account the BigQuery Data Owner role.
    6. Copy the backup data into the new dataset.
    7. Repeat the procedure above to create a new link to BigQuery.

    After changing the location, you'll have a gap in your data: streaming and daily exports of data will not process between deletion of the existing link and creation of the new link.

    Delete a link to BigQuery

    1. In Admin, under Product Links, click BigQuery Links. Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector. You must be an Editor or above at the property level to delete a link to BigQuery.
    2. Click the row for the link.
    3. In the top right, click More > Delete.

    BigQuery Export limits

    • Standard GA4 properties have a BigQuery Export limit of 1 million events for Daily (batch) exports. There is no limit on the number of events for Streaming export. If your property consistently exceeds the export limit, the daily BigQuery export will be paused and previous days’ exports will not be reprocessed.
    • For Analytics 360 properties, the Fresh Daily export contains all data fields and columns understood to be in the daily export, including observed user attribution and Ad Impression data. Learn more about the Fresh Daily export.
    • Property editors and administrators will receive an email notification each time a property they manage exceeds the daily limit. That notification will indicate when their export will be paused if action is not taken. Additionally, if a standard property significantly exceeds the one-million-event daily limit, Analytics may pause daily exports immediately. If you receive a notification, please leverage the data-filtering options (data-stream export and event-exclusion) to decrease the volume of events exported each day and ensure the daily export continues to operate.
    • Learn more about the higher limits available with 360 properties.

    Data filtering

    You can exclude specific data streams and events from your export, either to limit the size of your export or to make sure you're exporting just the events you want in BigQuery.

    • Exclude data streams and events during linking process: During the linking process, when you select the data streams you want to export, you also have the option to select events to exclude from export. See Step 9 in the linking process.
    • Add or remove data streams or events after you've configured linking: You can add or remove data streams and add events to or remove events from the exclusion list after you've configured the BigQuery link.
    1. In Admin, under Product Links, click BigQuery Links. Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector.
    2. You must be an Editor or above at the property level to add or remove data streams or events.
    3. You must also use an email address that has OWNER access to the BigQuery project.
    4. Click the row for the project whose link you want to modify.
    5. Under Data streams and events, click View data streams and events.
    6. Under Data streams to export, you can select additional data streams to export or remove existing data streams from the list.
    7. On the Events to exclude list, click Add to select from a list of existing events or click Specify event by name to choose existing events by name or to specify event names that have yet to be collected on the property.
    8. To remove an event from the list, click the minus sign at the end of that row.

    Pricing and billing

    • BigQuery charges for usage with two pricing components: storage and query processing. You can review the pricing table and learn about the differences between interactive and batch queries.
    • You need to have a valid form of payment on file in Cloud in order for the export to proceed. If the export is interrupted due to an invalid payment method, we are not able to re-export data for that time.
    • You can also export Analytics data to the BigQuery sandbox free of charge but keep in mind that sandbox limits apply.

    When you start seeing data

    Once the linkage is complete, data should start flowing to your BigQuery project within 24 hours. If you enable daily export, then 1 file will be exported each day that contains the previous day’s data (generally, during early afternoon in the time zone you set for reporting).

    AWS

    The steps below describe how AppNava works for AWS customers.

    Collect Data

  • Use an AWS service of your choice to collect data into a bucket, for example:
    • Game Analytics Pipeline
    • Pinpoint

    Allow us to pull your data

    • Go to AWS S3 Buckets
    • Add the following bucket policy:

      {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Sid": "AppNavaDataPullPolicy",
                  "Effect": "Allow",
                  "Principal": {
                      "AWS": "arn:aws:iam::324464245602:root"
                  },
                  "Action": [
                      "s3:ListBucket",
                      "s3:GetObject"
                  ],
                  "Resource": [
                      "arn:aws:s3:::<your-bucket-name-here>/*",
                      "arn:aws:s3:::<your-bucket-name-here>"
                  ]
              }
          ]
      }
    • Click Save
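    The same policy can also be attached programmatically; a sketch using boto3, where the bucket name is a placeholder:

      # Attach the AppNava pull policy to your S3 bucket. Placeholder bucket name.
      import json
      import boto3

      bucket = "your-bucket-name-here"
      policy = {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Sid": "AppNavaDataPullPolicy",
                  "Effect": "Allow",
                  "Principal": {"AWS": "arn:aws:iam::324464245602:root"},
                  "Action": ["s3:ListBucket", "s3:GetObject"],
                  "Resource": [f"arn:aws:s3:::{bucket}/*", f"arn:aws:s3:::{bucket}"],
              }
          ],
      }

      s3 = boto3.client("s3")
      s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))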

    APIs

    When you use an application on your mobile phone, it connects to the Internet and sends data to a server. The server retrieves that data, interprets it, performs the necessary actions, and sends a response back to your phone. The application then interprets that response and presents the information you wanted in a readable way. All of this happens via an API. In this section you can learn about the APIs we use.

    Real-Time Integration

    Start predicting the future behavior of new players before even they know how they will behave. AppNava identifies groups of players who are likely to complete a specific action or conversion event, so you can engage with players before they churn, attract players who are likely to complete in-app purchases, and much more. With AppNava, you can also choose the right balance between segment size and accuracy.

    After the model has been trained, there are two ways to carry out predictions:

    • Dashboard
    • cURL Request

    Dashboard

    Predictions can be done directly through the dashboard by clicking on the Predict Now button. Upon successful completion, the results will be visible in the table on the Trained Models tab as shown below.

    After clicking the Predict Now button, you will see a modal (a slight delay might occur while fetching daily users) to select the following options:

    • Trained Models: A list of selectable models that you have trained.
    • Prediction Selection: The duration for the prediction that you will perform; a choice between First Session and First Day.
    • User Id: A multi-selectable list of users on which the predictions will be performed.

    After selecting these fields, click the Predict button to start the prediction; upon successful completion, the results should be available in the table below.

    Note: The Bulk Prediction button is an experimental feature and is subject to change. Clicking this button performs predictions for all users of the day and stores the results in our database (results are not visible on the dashboard). The fields Trained Models and Prediction Selection are mandatory, while the User Id field is optional for this feature.

    cURL Request

    Another alternative is to send a curl request to our endpoint and receive the results back in the form of a JSON response.

    In order to do that, first navigate to the Predictions tab and create a prediction using the Create Prediction button.

    After clicking the button, a modal will open prompting the user to select between two options:

    • API Call
    • Scheduled Prediction

    API Call

    Choosing API Call presents two options:

    • Through Firebase: Requires Firebase measurement API integration.
    • Direct Response to Game: The API can be invoked directly inside the game client and the results can be collected without any additional integration.

    Choosing Direct Response to Game prompts you to fill the following fields:

    • Prediction Name: A unique name for your prediction.
    • Model Name: A selectable list of models that you have trained.
    • Prediction Frequency: The time constraint for the prediction that you will perform, e.g. First Session.
    • Conversion Value: A checkbox that enables receiving numeric values for non-numeric results, for example in the case of Churn models.
    • Bulk Prediction: A checkbox to specify whether the prediction should be done for all of the day's users.

    After clicking Next, you will see a modal containing a sample curl request that can be adjusted accordingly.

    Finally, clicking the Create Prediction button will redirect the user back to the Predictions page and create an entry on the table in that page with the Prediction Type as API Call: Direct Access. The prediction type in the table will vary depending on what type of prediction was selected.

    For API Calls, the action tab will have a button to display the generated embed code as follows:

    This curl request can be used to hit our endpoint and retrieve the results directly inside the game client or from other sources.
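    The same call can be made from code; a sketch in Python, where the endpoint URL and payload fields are placeholders to be replaced with the values from your generated embed code:

      # Post a prediction request and read the JSON response. Placeholders throughout.
      import requests

      ENDPOINT = "https://example-appnava-endpoint/predict"  # placeholder URL

      payload = {
          "user_id": "user123",  # the player to predict for
          "prediction_name": "my-churn-prediction",  # placeholder name
      }

      resp = requests.post(ENDPOINT, json=payload, timeout=30)
      resp.raise_for_status()
      print(resp.json())  # e.g. {"success": true, "data": {...}}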

    API Response (Successful Responses)

    Classification Models
    
    {
      "success": true,
      "message": "Prediction succeeded",
      "data": {
        "prediction_result": "churn",
        "prediction_probability_for_positive_class": 0.85,
        "prediction_probability_for_negative_class": 0.15,
        "probability_group": "high"
      }
    }
    
    
    Regression Models
    
    {
      "success": true,
      "message": "Prediction succeeded",
      "data": {
        "prediction_result": 42.5,
        "prediction_probability": 42.5
      }
    }
    
    
    Notes:
    • All predictions are automatically saved to BigQuery for historical tracking.
    • The API supports both classification and regression models.
    • The positive class probability denotes the probability of the class the model is interested in; in the case of a churn model, the positive class is churn and the negative class is not-churn.
    • prediction_probability_for_tertiary_class is additionally sent, but only for the LTV C model, since it is a multi-class model.
    • prediction_result changes depending on the model; a churn model will give a result of either churn or not-churn, and a subscriber model will give a result of subscriber or non-subscriber.
    • Probability groups for classification models are categorized as follows (a sketch of this grouping follows after this list):
      • Low: ≤ 0.39
      • Mid: 0.40 - 0.60
      • High: > 0.60
    • As mentioned before, probability_group will be either "high", "mid" or "low" and depends on the positive class. It is classed automatically for general evaluation according to preset conditions and is an estimation. In regression models, prediction_result is calculated by checking the average LTV and serves a similar purpose.
    • Individual users are limited to 20 predictions.
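    A sketch of the grouping logic implied by the thresholds above (the exact server-side rules may differ):

      # Classify a positive-class probability into low / mid / high.
      def probability_group(p: float) -> str:
          if p <= 0.39:
              return "low"
          if p <= 0.60:
              return "mid"
          return "high"

      assert probability_group(0.85) == "high"  # matches the sample response above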

    API Response (Error Responses)

    The API returns different error response formats depending on the error type.

    No Data Available
    
    {
      "success": false,
      "error_code": "NO_DATA",
      "message": "No data available for user: user123",
      "user_id": "user123"
    }
    
    
    Too Many Predictions
    
    {
      "success": false,
      "error_code": "SEEN_TOO_MANY",
      "message": "User has exceeded maximum prediction limit",
      "user_id": "user123"
    }
    
    
    No Users Found
    
    {
      "success": false,
      "error_code": "NO_USERS_FOUND",
      "message": "No users found matching the criteria",
      "user_id": "N.A (Bulk Pred)"
    }
    
    

    Common Mistakes:

    • Often the curl requests are not constructed properly inside the game client, and hence the requests fail.
    • The --data-raw option specifies the data to be sent to the endpoint as a JSON object; make sure the JSON is properly formatted and valid.
    • Booleans should be lowercase in JSON, e.g. true is valid but True is not.
    • JSON should be sent in the body of an HTTP request and should not be nested inside another body key.
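    One way to avoid these formatting mistakes is to serialize the payload with a JSON library rather than building the string by hand; a small Python illustration (the field names are placeholders):

      import json

      payload = {"user_id": "user123", "bulk_prediction": True}  # placeholder fields
      print(json.dumps(payload))
      # {"user_id": "user123", "bulk_prediction": true}  <- lowercase true, valid JSON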

    Scheduled Prediction

    This option allows users to schedule predictions that run automatically every day for a specified duration. Upon clicking the option, a modal appears with the following fields:

    • Prediction Name: A unique name for the scheduled prediction.
    • Model Name: A selectable list of models that you have trained.
    • Prediction Frequency: The time constraint for the prediction that you will perform, e.g. First Session.
    • Start Date: When the scheduled prediction should start.
    • Schedule Date: The duration of the scheduled prediction in days, i.e. how long it should last.

    Lastly, after clicking Done, the user will be redirected to the Predictions page and the prediction will be created in the table with the status Scheduled Prediction.

    Visualization of Results

    AppNava offers an elegant solution for visualizing your data. All predictions made through curl requests can be visualized using the Prediction Results tab.

    The table and chart show an accumulation of daily predictions. The columns vary depending on the model (e.g. Churn, LTV). The real results are also calculated depending on the model and can be compared with the predicted results to evaluate the accuracy of the model.

    Resources

    In this section you can learn about our resources and how to use them.

    ROI and ROAS Calculator

    Learn what ROI and ROAS are and how to use the calculator!

    ROI definition

    ROI stands for “Return on Investment.” Essentially, it’s a measurement of the return on a particular investment, relative to the cost of the investment. In other words, it’s a ratio between your net profit and investment. There’s a simple formula that you can use to work out the ROI: ROI = (Net Profit / Net Spend) x 100

    ROAS definition

    ROAS stands for “Return on Ad Spend.” ROAS can help you determine the efficiency of your online advertising campaigns by calculating the amount of money your business earns for each pound it spends on advertising. You can use the following formula to calculate ROAS: ROAS = (Revenue Generated from Ads / Advertising Spend) x 100.
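    A worked example of both formulas, as a small Python sketch:

      # ROI and ROAS as defined above, in percent.
      def roi(net_profit: float, net_spend: float) -> float:
          return net_profit / net_spend * 100

      def roas(ad_revenue: float, ad_spend: float) -> float:
          return ad_revenue / ad_spend * 100

      print(roi(net_profit=500, net_spend=1_000))    # 50.0  -> 50% ROI
      print(roas(ad_revenue=4_000, ad_spend=1_000))  # 400.0 -> 400% ROAS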

    The difference between ROI and ROAS

    When it comes to ROI vs. ROAS, there are a couple of major differences. Firstly, ROAS looks at revenue, rather than profit. Secondly, ROAS only considers direct spend, rather than other costs associated with your online campaign. In a nutshell, ROAS is the best metric to look at for determining whether your ads are effective at generating clicks, impressions, and revenue. However, unlike ROI, it won’t tell you whether your paid advertising effort is actually profitable for the company.

    Should I use ROI or ROAS?

    When you consider ROI vs. ROAS, it’s important to remember that it isn’t an either/or situation. Whereas ROI can help you understand long-term profitability, ROAS may be more suited to optimising short-term strategy. To craft an effective digital marketing campaign, you’ll need to utilise both the ROI and ROAS formulas. ROI provides you with insight into the overall profitability of your advertising campaign, while ROAS can be used to identify specific strategies that can help you improve your online marketing efforts and generate clicks and revenue.

    Tip

    You can go to this page and try our ROI and ROAS Calculator!

    Do you want to know about the titles of the fields you need to enter into the calculator?

    Read below for more info!

    LTV

    LTV is lifetime value: the estimated value that you expect to extract from a player. It makes more sense to couple this lifetime value with a number of days during which the user interacts with the product (the game in this case). This enables us to study whether we are on the right track, and to reason about the product. So LTV365 is the expected (read: average) value or revenue we get from a player 365 days, or one year, after they first come into contact with the game.

    Retention

    Retention is a measure of how players keep interacting with the game. Day 1 retention (D1) is the percentage of players who returned to the game the day after launching it for the first time (D0). The higher the retention, the better: it means players keep coming back, so there is something about the product/game/app that motivates them to return.

    ARPDAU

    ARPDAU is average revenue per daily active user. By itself, this metric doesn’t say much: an ARPDAU of 2€ alone tells you very little. If you have a restaurant and DAU (daily active users) is the number of customers who walk in, a 2€ ARPDAU might leave you bankrupt. For a mobile game, if you create a compelling title where on average you get 2€ per daily active user, you might have struck gold.

    CPI

    CPI is the cost per install. User Acquisition (UA) has lately become an integral part of the business model of scaling F2P mobile games, and with more and more publishers paying to acquire users, the market is getting more competitive and the cost of acquiring a new player is going through the roof.

    UA

    User Acquisition works just like old-fashioned advertising: you pay upfront to get customers/players/users walking in, and hopefully those that convert (end up buying something) will make up for the price of the advertising and yield some extra revenue. It is an upfront investment, and to minimize the risks associated with it, we have to study and predict how we will make the money back with a profit.