Source: https://support.google.com/analytics/answer/10071301

[GA4] About Data Import - Analytics Help

Why use Data Import?

Each business system you use generates its own data. Your CRM might contain information like customer-loyalty ratings, lifetime values, and product preferences. If you're a web publisher, your content-management system might store dimensions like author and article category. If you run an ecommerce business, you store item attributes like price, style, and size.

And you use Analytics to measure traffic and performance for your websites and apps.

Typically, each body of data exists in its own silo, uninformed by the other data. Data Import lets you join all this data in Analytics on a defined schedule in order to take down these silos, unlock new insights, and democratize your data.

How Data Import works

Uploading data

You upload CSV files that contain external data to an Analytics property. You can export those CSV files from an offline business tool like your CRM or CMS system, or for smaller amounts of data, you can create the files manually in a text editor or spreadsheet.
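For small amounts of data created by hand, a short script can generate the CSV just as easily as a spreadsheet. This is only a sketch; the field names below (campaign_id, source, medium, cost) are illustrative placeholders, not a required Analytics schema — check your property's data-source mapping for the exact fields it expects.

```python
import csv

# Illustrative rows for a small, hand-built import file.
# Field names here are assumptions, not an official schema.
rows = [
    {"campaign_id": "123", "source": "newsletter", "medium": "email", "cost": "42.50"},
    {"campaign_id": "456", "source": "partner", "medium": "referral", "cost": "17.00"},
]

with open("cost_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0]))
    writer.writeheader()   # first line of the CSV is the field names
    writer.writerows(rows)
```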

Data Import joins the offline data you upload with the event data that Analytics collects. The imported data enhances your reports, comparisons, and audiences. The result is a more complete picture of online and offline activity.

Joining data

Data is joined in two different ways, depending on the type of imported data:

When you import data, previously imported data persists and the new data is appended. If the new data has the same set of keys as previously imported data, it overwrites the earlier data.
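The append/overwrite rule can be sketched as a keyed merge. This is an illustration of the described behavior, not Analytics internals; the user_id key and tier values are hypothetical.

```python
# Sketch of the import rule: rows with a new key are appended,
# rows with an existing key overwrite the earlier upload.
def merge_import(existing, new_rows, key="user_id"):
    merged = {row[key]: row for row in existing}
    for row in new_rows:
        merged[row[key]] = row  # same key -> overwrite, new key -> append
    return list(merged.values())

first = [{"user_id": "u1", "tier": "gold"}, {"user_id": "u2", "tier": "silver"}]
second = [{"user_id": "u2", "tier": "platinum"}, {"user_id": "u3", "tier": "bronze"}]
result = merge_import(first, second)
# u1 persists, u2 is overwritten, u3 is appended
```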

Note: Collection/processing-time data, such as user data and offline-event data, is joined at collection/processing time and is exported to BigQuery. Query-time data, such as cost, item, and custom event data, is joined at reporting/query time and can be joined within BigQuery.

Types of metadata you can import

Metadata

Importing metadata adds to the data already collected and processed by a property. Typically, metadata is stored in a custom dimension or metric, though in some cases you might want to overwrite the default information already gathered (for example, importing a product catalog with updated categories).

You can import the following data types:

Limits

Data-source size: 1 GB
Daily uploads: 120 uploads per property per day

Import data type    Data source limit per property    Storage limit per data type
Cost data           Up to 5                           1 GB across all import sources
Item data           Up to 5                           1 GB across all import sources
User data           Up to 10                          Not applicable
Offline events      Up to 10                          Not applicable
Custom event data   Up to 5                           1 GB across all import sources

You can find your current quota usage in the product via the Quota information button.

How to import data

When you import data, you create a data source. A data source is the combination of the CSV file you want to upload and a mapping of existing Analytics fields to the fields in your CSV.

Do not upload a file that includes duplicate keys (for example, two fields named user_id).
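A quick header check can catch duplicate keys before you upload. This is a sketch of a pre-upload sanity check, not an Analytics feature; the user_id and lifetime_value fields are hypothetical.

```python
import csv
import io
from collections import Counter

def duplicate_fields(csv_file):
    """Return field names that appear more than once in the CSV header."""
    header = next(csv.reader(csv_file))
    return [name for name, count in Counter(header).items() if count > 1]

# A header with user_id listed twice would be rejected by Data Import:
bad_header = io.StringIO("user_id,lifetime_value,user_id\nu1,10,u1\n")
dupes = duplicate_fields(bad_header)  # ["user_id"]
```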

To learn more about data sources, refer to the data-sources article.

Prerequisites for using SFTP to upload data

If you plan to use the SFTP option in Step 5, make sure your SFTP server supports the ssh-rsa and ssh-dss host-key algorithms. Learn more about verifying which host-key algorithms you use and how to format the SFTP-server URL.

Start the import process
  1. In Admin, under Data collection and modification, click Data import. Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector. You must be an Editor or above at the property level to start the import process.

  2. Create a new data source or select an existing data source. (Check the following sections.)
Create a new data source
  1. Click Create data source.
  2. Enter a name for your data source.
  3. Select the data type:
  4. Click Review terms if prompted. This prompt is displayed if you are importing device or user data.
  5. Do one of the following:
  6. Click Next to proceed to the mapping stage.
  7. Select the imported field names you want to map to the Analytics fields.
  8. Click Import.
Upload data to an existing data source
  1. In the row for an existing data source, click Import now.
  2. If the data source is configured for CSV import, then select the CSV file you want to import and click Open.

The CSV file must include the same fields, or a subset of the fields, of the original. If you want to import different fields for the same data type, you need to delete the existing data source and create a new one.
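The same-fields-or-subset rule above is easy to verify before uploading. This is a sketch of a local pre-check under the stated rule, not an Analytics API; the item fields are hypothetical.

```python
def is_valid_reupload(original_fields, new_fields):
    """A re-upload must use the same fields as the original, or a subset."""
    return set(new_fields) <= set(original_fields)

original = ["item_id", "price", "style", "size"]
ok = is_valid_reupload(original, ["item_id", "price"])    # True: subset
bad = is_valid_reupload(original, ["item_id", "color"])   # False: new field
```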

Data imported in the source property is automatically exported to both the roll-up property and subproperties.

Verify SFTP host-key algorithms and format the SFTP-server URL

Verify algorithms

There are different methods you can use to verify whether your SFTP server uses either the ssh-rsa or ssh-dss host-key algorithm. For example, you can use the OpenSSH remote-login client to check your server logs via the following command:

ssh -vv <your sftp server name>

If your server supports either of those algorithms, you should see a line like the following in your server log:

debug2: host key algorithms: rsa-sha2-512, rsa-sha2-256, ssh-rsa
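A small script can scan that debug line for the required algorithms. This is a sketch of checking the log output shown above; the parsing is an assumption about the OpenSSH debug format, which may vary between versions.

```python
# Sketch: check an "ssh -vv" debug line for ssh-rsa or ssh-dss support.
log_line = "debug2: host key algorithms: rsa-sha2-512, rsa-sha2-256, ssh-rsa"

def supports_required_algorithms(line):
    """Return True if the log line lists ssh-rsa or ssh-dss."""
    if "host key algorithms:" not in line:
        return False
    algos = line.split("host key algorithms:")[1].replace(",", " ").split()
    return "ssh-rsa" in algos or "ssh-dss" in algos
```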

Format SFTP-server URL

If your SFTP-server URL is badly formatted, your import setup will fail with an internal-error message.

An SFTP-server URL generally has three parts that you need to consider for uploading data-import files. For example:

sftp://example.com//home/jon/upload.csv has the following parts: the protocol (sftp://), the domain (example.com), and the file path (//home/jon/upload.csv).

In this example, the upload file is located in the home directory.

You can format the domain portion of the URL in a variety of ways, using the domain name or the IPv4 or IPv6 address of the server, with or without a port number:

If you do not include the port number, then the default port is 22.

You can format the URL to include or exclude the home directory. The following examples of correctly formatted URLs use different formats to identify the domain. They include port numbers, but the port number is optional.

If your upload file is located in a subdirectory of your home directory, your URL would look something like:

sftp://example.com//home/jon/data/upload.csv

In this case, you can use the following types of formats:

If your upload file is not stored in your home directory (//home/jon) or a subdirectory of your home directory (//home/jon/data), and is instead stored in the directory /foo/bar, then the properly formatted URL for your upload file would look something like:

sftp://example.com//foo/bar/upload.csv (//foo/bar replaces the home directory)
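The three parts of an SFTP-server URL, and the default-port rule described above, can be pulled apart with the standard library. This is a sketch for checking your own URL before entering it, not part of the Analytics setup:

```python
from urllib.parse import urlparse

# Split an SFTP-server URL into protocol, domain, port, and file path.
url = "sftp://example.com//foo/bar/upload.csv"
parsed = urlparse(url)

domain = parsed.hostname   # "example.com"
port = parsed.port or 22   # port 22 is the default when none is given
path = parsed.path         # "//foo/bar/upload.csv"

# A URL with an explicit port parses the same way:
with_port = urlparse("sftp://example.com:2222//home/jon/upload.csv")
```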

View data-source details, get your SFTP public key, import new data, delete a data source
  1. In Admin, under Data collection and modification, click Data import. Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector. You must be a Viewer or above at the property level to view data-source details.
  2. In the row for the data source, click .

You can view the name, data type, public key, and history of each upload.

To import new data:

Click Import now and choose the relevant CSV file on your computer.

To delete the data source:

  1. Click > Delete data source.
  2. Read the deletion notice, then click Delete data source.

You can delete collection/processing-time data, but if you want to remove previously uploaded data from all events processed by Analytics, you may also need to follow up with a user or user-property deletion. Deleting an already imported file does not remove the processed data associated with events collected since the import completed. To learn more, see the article on data-deletion requests.

Reserved names and prefixes

The following event names, event-parameter names, user-property names, and prefixes are reserved for use by Analytics. If you try to upload data that includes any of the reserved names or prefixes, that data will not be uploaded.

The reserved names fall into these groups:

Reserved event names
Reserved event-parameter names
Reserved user-property names
Reserved prefixes (applies to event parameters and user properties)
