Each business system you use generates its own data. Your CRM might contain information like customer-loyalty ratings, lifetime values, and product preferences. If you're a web publisher, your content-management system might store dimensions like author and article category. If you run an ecommerce business, you store item attributes like price, style, and size.
And you use Analytics to measure traffic and performance for your websites and apps.
Typically, each body of data exists in its own silo, uninformed by the other data. Data Import lets you join all this data in Analytics on a defined schedule to break down those silos, unlock new insights, and democratize your data.
How Data Import works

Uploading data

You upload CSV files that contain external data to an Analytics property. You can export those CSV files from an offline business tool like your CRM or CMS, or for smaller amounts of data, you can create the files manually in a text editor or spreadsheet.
Data Import joins the offline data you upload with the event data that Analytics collects. The imported data enhances your reports, comparisons, and audiences. The result is a more complete picture of online and offline activity.
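For instance, a CRM export might be produced like the following. This is a hedged sketch: the file name and columns (user_id, loyalty_tier, lifetime_value) are hypothetical placeholders, not a schema Analytics requires, and Python's csv module simply stands in for whatever tool generates your export.

```python
import csv

# Hypothetical CRM rows; the column names are placeholders for illustration only.
rows = [
    {"user_id": "U1001", "loyalty_tier": "gold", "lifetime_value": "1520.50"},
    {"user_id": "U1002", "loyalty_tier": "silver", "lifetime_value": "310.00"},
]

with open("crm_export.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["user_id", "loyalty_tier", "lifetime_value"])
    writer.writeheader()  # the header row names the fields you later map in Analytics
    writer.writerows(rows)
```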
Joining data

Data is joined in 2 different ways, depending on the type of imported data:

Collection/Processing time: user data and offline-event data are joined as event data is collected and processed.
Reporting/Query time: cost, item, and custom event data are joined when Analytics runs a report or query.
Reporting/Query-time data is not available when you create audiences in Analytics or segments in Explorations.
When you import data, new data is appended to the previously imported data, which persists. If the imported data has the same set of keys as previously imported data, the existing data is overwritten.
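As an analogy only (this is not Analytics code), the persist-and-overwrite behavior resembles a key-based dictionary update:

```python
# Rows keyed by user_id, as an analogy for key-based merging.
existing = {
    "U1001": {"loyalty_tier": "gold"},
    "U1002": {"loyalty_tier": "silver"},
}
new_upload = {
    "U1002": {"loyalty_tier": "platinum"},  # same key: overwrites the old row
    "U1003": {"loyalty_tier": "bronze"},    # new key: appended alongside the rest
}

existing.update(new_upload)
# existing now keeps U1001 unchanged, overwrites U1002, and adds U1003
```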
Note: User data and offline-event data are joined at collection/processing time and are exported to BigQuery. Cost, item, and custom event data are joined at reporting/query time and can be joined within BigQuery.
Types of data you can import

Metadata

Importing metadata adds to the data already collected and processed by a property. Typically, metadata is stored in a custom dimension or metric, though in some cases you might want to overwrite the default information already gathered (for example, importing a product catalog with updated categories).
You can import the following data types. Each property is limited to 120 uploads per day.

| Import data type | Data source limit per property | Storage limit per data type |
| --- | --- | --- |
| Cost data | Up to 5 | 1 GB across all import sources |
| Item data | Up to 5 | 1 GB across all import sources |
| User data | Up to 10 | Not applicable |
| Offline events | Up to 10 | Not applicable |
| Custom event data | Up to 5 | 1 GB across all import sources |

You can find the current quota usage in-product through the Quota information button.
How to import data

When you import data, you create a data source. A data source is the combination of the CSV file you want to upload and a mapping of existing Analytics fields to the fields in your CSV.
Do not upload a file that includes duplicate keys (for example, 2 fields named user_id); a simple pre-upload check is sketched below.
Refer to this article to learn more about data sources.
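Before uploading, you could verify the header row yourself. The following is a minimal sketch using only the Python standard library; crm_export.csv is a placeholder filename, not a required name.

```python
import csv
from collections import Counter

def find_duplicate_headers(path):
    """Return header names that appear more than once in the CSV's first row."""
    with open(path, newline="", encoding="utf-8") as f:
        header = next(csv.reader(f))
    return [name for name, count in Counter(header).items() if count > 1]

duplicates = find_duplicate_headers("crm_export.csv")
if duplicates:
    raise ValueError(f"Duplicate keys in upload file: {duplicates}")
```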
Prerequisites for using SFTP to upload data

If you plan to use the SFTP option in Step 5, make sure your SFTP server supports the ssh-rsa and ssh-dss host-key algorithms. Learn more below about verifying which host-key algorithms your server uses and how to format the SFTP-server URL.
Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector. You must be an Editor or above at the property level to successfully start the import process.
The CSV file must include the same fields as the original, or a subset of those fields. If you want to import different fields for the same data type, you need to delete the existing data source and create a new one.
Data imported in the source property is automatically exported to both the roll-up property and subproperties.
Verify SFTP host-key algorithms and format the SFTP-server URL

Verify algorithms

There are different methods you can use to verify whether your SFTP server uses either the ssh-rsa or ssh-dss host-key algorithm. For example, you can use the OpenSSH remote-login client to check your server logs via the following command:
ssh -vv <your sftp server name>
If your server supports either of those algorithms, you should see a line like the following in your server log:
debug2: host key algorithms: rsa-sha2-512, rsa-sha2-256, ssh-rsa
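If you'd rather script the check, one possible approach (an assumption for illustration, not a documented Analytics tool) wraps the same ssh -vv command and scans the debug output, which OpenSSH writes to stderr; sftp.example.com is a placeholder host.

```python
import subprocess

def server_host_key_algorithms(host):
    """Run `ssh -vv` against the server and pull the advertised host-key
    algorithms out of OpenSSH's debug output (written to stderr)."""
    proc = subprocess.run(
        ["ssh", "-vv", "-o", "BatchMode=yes", host, "exit"],
        capture_output=True, text=True, timeout=30,
    )
    for line in proc.stderr.splitlines():
        if "host key algorithms:" in line:
            algos = line.split("host key algorithms:", 1)[1]
            return [a.strip() for a in algos.split(",")]
    return []

algos = server_host_key_algorithms("sftp.example.com")  # placeholder host
print("ssh-rsa supported:", "ssh-rsa" in algos)
```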
If your SFTP-server URL is badly formatted, your import setup will fail with an internal-error message.
An SFTP-server URL generally has 3 parts that you need to consider for uploading data-import files. For example:

sftp://example.com//home/jon/upload.csv

has the following parts:

example.com (the domain)
//home/jon (the home directory)
/upload.csv (the file name)

In the example above, the upload file is located in the home directory.
You can format the domain portion of the URL in a variety of ways, using the domain name or the IPv4 or IPv6 address of the server, with or without a port number:
sftp://example.com
sftp://142.250.189.4:1234
sftp://142.250.189.4
sftp://[2607:f8b0:4007:817::2004]:1234
sftp://[2607:f8b0:4007:817::2004]
If you do not include the port number, then the default port is 22.
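As an illustration of how those parts decompose (this is a sketch using Python's standard library, not code Analytics runs internally):

```python
from urllib.parse import urlsplit

for url in [
    "sftp://example.com//home/jon/upload.csv",
    "sftp://142.250.189.4:1234//home/jon/upload.csv",
    "sftp://[2607:f8b0:4007:817::2004]/upload.csv",
]:
    parts = urlsplit(url)
    # urlsplit strips the brackets from IPv6 hosts; port is None when omitted,
    # in which case SFTP's default port 22 applies.
    print(parts.hostname, parts.port or 22, parts.path)
```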
You can format the URL to include or exclude the home directory. The following examples of correctly formatted URLs use different formats to identify the domain; some include port numbers, which you can omit.
sftp://example.com//home/jon/upload.csv (domain name)
sftp://142.250.189.4:1234//home/jon/upload.csv (IPv4 with port number)
sftp://example.com/upload.csv (domain name)
sftp://[2607:f8b0:4007:817::2004]:1234/upload.csv (IPv6 with port number)

If your upload file is located in a subdirectory of your home directory, your URL would look something like:
sftp://example.com//home/jon/data/upload.csv
In this case, you can use the following types of formats:
sftp://example.com//home/jon/data/upload.csv (domain name)
sftp://142.250.189.4:1234//home/jon/data/upload.csv (IPv4 with port number)
sftp://example.com/data/upload.csv (domain name)
sftp://[2607:f8b0:4007:817::2004]:1234/data/upload.csv (IPv6 with port number)

If your upload file is not stored in your home directory (//home/jon) or a subdirectory of your home directory (//home/jon/data), and is instead stored in the directory /foo/bar, then the properly formatted URL for your upload file would look something like:
sftp://example.com//foo/bar/upload.csv (//foo/bar replaces the home directory)
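To make the path convention concrete, here is a small hypothetical helper (an illustration, not part of Analytics): an absolute path keeps its leading slash after the domain, which is what produces the double slash.

```python
def sftp_upload_url(host, path, port=None):
    """Hypothetical helper: join domain, optional port, and file path.
    An absolute path such as /home/jon/upload.csv keeps its leading
    slash after the domain, producing the double-slash form."""
    netloc = f"{host}:{port}" if port else host
    return f"sftp://{netloc}/{path}"

print(sftp_upload_url("example.com", "/home/jon/upload.csv"))
# sftp://example.com//home/jon/upload.csv
print(sftp_upload_url("142.250.189.4", "/foo/bar/upload.csv", port=1234))
# sftp://142.250.189.4:1234//foo/bar/upload.csv
```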
Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector. You must be a Viewer or above at the property level to view data source details.
You can view the name, data type, public key, and history of each upload.
Note: % imported and match rate are relevant to cost, item, and custom event data import, but are not applicable to user data import or offline event data import.
To import new data:
Click Import now and choose the relevant CSV file on your computer.
To delete the data source:
You can delete collection/processing-time data, but deleting an already-imported file does not remove the processed data that has been associated with events collected since the import was completed. To remove previously uploaded data from all events processed by Analytics, you may also need to follow up with a user or user-property deletion. Refer to this article to learn more about data-deletion requests.
Reserved names and prefixesThe following event names, event-parameter names, user-property names, and prefixes are reserved for use by Analytics. If you try to upload data that includes any of the reserved names or prefixes, that data will not be uploaded.
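If you generate upload files programmatically, you could screen names before export. This sketch assumes you fill in RESERVED_NAMES and RESERVED_PREFIXES from the tables in this article; the collections are deliberately left empty here rather than guessed.

```python
# Populate these from the reserved-name and reserved-prefix tables in this article.
RESERVED_NAMES: set[str] = set()
RESERVED_PREFIXES: tuple[str, ...] = tuple()

def is_reserved(name: str) -> bool:
    """True if a name matches a reserved name or prefix, so the row would be dropped."""
    return name in RESERVED_NAMES or name.startswith(RESERVED_PREFIXES)

rows = [{"event_name": "store_visit"}, {"event_name": "checkout"}]
safe_rows = [row for row in rows if not is_reserved(row["event_name"])]
```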