The Cloud SQL Node.js Connector is a Cloud SQL connector designed for the Node.js runtime. It is a native alternative to the Cloud SQL Auth Proxy and provides the following benefits:
The Cloud SQL Node.js Connector is a package to be used alongside a database driver. Currently supported drivers are:

- `pg` (PostgreSQL)
- `mysql2` (MySQL)
- `tedious` (SQL Server)
You can install the library using `npm install`:

```shell
npm install @google-cloud/cloud-sql-connector
```
This library requires the following to successfully make Cloud SQL connections:

- An IAM principal (user, service account, etc.) with the Cloud SQL Client role
- The Cloud SQL Admin API to be enabled
This library uses the Application Default Credentials (ADC) strategy for resolving credentials. Please see these instructions for how to set your ADC (Google Cloud Application vs Local Development, IAM user vs service account credentials), or consult the Node.js google-auth-library.
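As an illustrative sketch (paths and project values below are placeholders), ADC can typically be configured locally either through the gcloud CLI or by pointing the `GOOGLE_APPLICATION_CREDENTIALS` environment variable at a service account key file:

```shell
# Option 1: use your own user credentials for local development
gcloud auth application-default login

# Option 2: point ADC at a service account key file
# (the path below is a placeholder)
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json
```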
The connector package is meant to be used alongside a database driver. The following examples show how to create a new connector and get valid options that can then be used when starting a new connection.
For even more examples, check the `examples/` folder.
Here is how to start a new `pg` connection pool:

```javascript
import pg from 'pg';
import {Connector} from '@google-cloud/cloud-sql-connector';

const {Pool} = pg;

const connector = new Connector();
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  ipType: 'PUBLIC',
});

const pool = new Pool({
  ...clientOpts,
  user: 'my-user',
  password: 'my-password',
  database: 'db-name',
  max: 5,
});

const {rows} = await pool.query('SELECT NOW()');
console.table(rows); // prints returned time value from server

await pool.end();
connector.close();
```
Here is how to start a new `mysql2` connection pool:

```javascript
import mysql from 'mysql2/promise';
import {Connector} from '@google-cloud/cloud-sql-connector';

const connector = new Connector();
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  ipType: 'PUBLIC',
});

const pool = await mysql.createPool({
  ...clientOpts,
  user: 'my-user',
  password: 'my-password',
  database: 'db-name',
});

const conn = await pool.getConnection();
const [result] = await conn.query(`SELECT NOW();`);
console.table(result); // prints returned time value from server

await pool.end();
connector.close();
```
Here is how to start a new `tedious` connection:

```javascript
const {Connection, Request} = require('tedious');
const {Connector} = require('@google-cloud/cloud-sql-connector');

const connector = new Connector();
const clientOpts = await connector.getTediousOptions({
  instanceConnectionName: process.env.SQLSERVER_CONNECTION_NAME,
  ipType: 'PUBLIC',
});

const connection = new Connection({
  // Please note that the `server` property here is not used and is only
  // defined due to a bug in the tedious driver
  // (ref: https://github.com/tediousjs/tedious/issues/1541)
  // With that in mind, do not try to change this value since it will have
  // no impact on how the connector works. This README will be updated to
  // remove this property declaration as soon as the tedious driver bug is
  // fixed.
  server: '0.0.0.0',
  authentication: {
    type: 'default',
    options: {
      userName: 'my-user',
      password: 'my-password',
    },
  },
  options: {
    ...clientOpts,
    // The `port` property is likewise unused and only defined because of
    // the same tedious driver bug (see link above); changing it has no
    // effect on how the connector works.
    port: 9999,
    database: 'my-database',
  },
});

connection.connect(err => {
  if (err) {
    throw err;
  }
  let result;
  const req = new Request('SELECT GETUTCDATE()', err => {
    if (err) {
      throw err;
    }
  });
  req.on('error', err => {
    throw err;
  });
  req.on('row', columns => {
    result = columns;
  });
  req.on('requestCompleted', () => {
    console.table(result);
  });
  connection.execSql(req);
});

connection.close();
connector.close();
```

### Using a Local Proxy Tunnel (Unix domain socket)
Another possible way to use the Cloud SQL Node.js Connector is by creating a local proxy server that tunnels to the secured connection, established using the `Connector.startLocalProxy()` method instead of `Connector.getOptions()`.
**Note:** The `startLocalProxy()` method is currently only supported for MySQL and PostgreSQL, as it uses a Unix domain socket, which SQL Server does not currently support.
This alternative approach enables use of the Connector library with otherwise unsupported drivers such as Prisma. Here is an example of how to use it with Prisma's PostgreSQL driver:
```javascript
import {Connector} from '@google-cloud/cloud-sql-connector';
import {PrismaClient} from '@prisma/client';

const connector = new Connector();
await connector.startLocalProxy({
  instanceConnectionName: 'my-project:us-east1:my-instance',
  listenOptions: {path: '.s.PGSQL.5432'},
});
const hostPath = process.cwd();

const datasourceUrl = `postgresql://my-user:password@localhost/dbName?host=${hostPath}`;
const prisma = new PrismaClient({datasourceUrl});

connector.close();
await prisma.$disconnect();
```
For examples on each of the supported Cloud SQL databases consult our Prisma samples.
### Specifying IP Address Type

The Cloud SQL Connector for Node.js can be used to connect to Cloud SQL instances using both public and private IP addresses, as well as Private Service Connect (PSC). The IP address type to connect to can be configured within `getOptions` through the `ipType` argument.

By default, connections are configured to `'PUBLIC'` and connect over public IP. To configure connections to use an instance's private IP, use `'PRIVATE'` for `ipType` as follows:
Note: If specifying Private IP or Private Service Connect, your application must be attached to the proper VPC network to connect to your Cloud SQL instance. For most applications this will require the use of a VPC Connector.
#### Example on how to use a Private IP

```javascript
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  ipType: 'PRIVATE',
});
```
#### Example on how to use PSC

```javascript
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  ipType: 'PSC',
});
```

#### Example on how to use `IpAddressTypes` in TypeScript

```typescript
import {Connector, IpAddressTypes} from '@google-cloud/cloud-sql-connector';

const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  ipType: IpAddressTypes.PSC,
});
```

### Automatic IAM Database Authentication
Connections using Automatic IAM database authentication are supported when using Postgres or MySQL drivers.
Make sure to configure your Cloud SQL Instance to allow IAM authentication and add an IAM database user.
A `Connector` can be configured to connect to a Cloud SQL instance using automatic IAM database authentication with `getOptions` through the `authType` argument.
```javascript
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  authType: 'IAM',
});
```
When configuring a connection for IAM authentication, the `password` argument can be omitted and the `user` argument should be formatted as follows:
- **Postgres:** For an IAM user account, this is the user's email address. For a service account, it is the service account's email without the `.gserviceaccount.com` domain suffix.
- **MySQL:** For an IAM user account, this is the user's email address without the `@` or domain name. For example, for `test-user@gmail.com`, set the `user` field to `test-user`. For a service account, this is the service account's email address without the `@project-id.iam.gserviceaccount.com` suffix.
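The naming rules above can be sketched as a small helper function (hypothetical, not part of the connector API; the emails shown are placeholders):

```javascript
// Hypothetical helper illustrating the IAM database user naming rules.
// Not part of the @google-cloud/cloud-sql-connector API.
function iamDatabaseUser(email, engine) {
  if (engine === 'postgres') {
    // Postgres keeps the email as-is for user accounts; service
    // accounts only drop the ".gserviceaccount.com" suffix.
    return email.replace(/\.gserviceaccount\.com$/, '');
  }
  if (engine === 'mysql') {
    // MySQL drops everything from the "@" onward.
    return email.split('@')[0];
  }
  throw new Error(`unsupported engine: ${engine}`);
}

console.log(iamDatabaseUser('sa-name@project-id.iam.gserviceaccount.com', 'postgres'));
// sa-name@project-id.iam
console.log(iamDatabaseUser('test-user@gmail.com', 'mysql'));
// test-user
```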
Examples using the `test-sa@test-project.iam.gserviceaccount.com` service account to connect can be found below.
#### Postgres Automatic IAM Authentication Example

```javascript
import pg from 'pg';
import {Connector} from '@google-cloud/cloud-sql-connector';

const {Pool} = pg;

const connector = new Connector();
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  authType: 'IAM',
});

const pool = new Pool({
  ...clientOpts,
  user: 'test-sa@test-project.iam',
  database: 'db-name',
  max: 5,
});

const {rows} = await pool.query('SELECT NOW()');
console.table(rows); // prints returned time value from server

await pool.end();
connector.close();
```

#### MySQL Automatic IAM Authentication Example
```javascript
import mysql from 'mysql2/promise';
import {Connector} from '@google-cloud/cloud-sql-connector';

const connector = new Connector();
const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  authType: 'IAM',
});

const pool = await mysql.createPool({
  ...clientOpts,
  user: 'test-sa',
  database: 'db-name',
});

const conn = await pool.getConnection();
const [result] = await conn.query(`SELECT NOW();`);
console.table(result); // prints returned time value from server

await pool.end();
connector.close();
```

#### Example on how to use `AuthTypes` in TypeScript
For TypeScript users, the `AuthTypes` type can be imported and used directly for automatic IAM database authentication.
```typescript
import {AuthTypes, Connector} from '@google-cloud/cloud-sql-connector';

const clientOpts = await connector.getOptions({
  instanceConnectionName: 'my-project:region:my-instance',
  authType: AuthTypes.IAM,
});
```

### Using With Google Auth Library: Node.js Client Credentials
One can use `google-auth-library` credentials with this library by providing an `AuthClient` or `GoogleAuth` instance to the `Connector`.
```shell
npm install google-auth-library
```
```javascript
import {GoogleAuth} from 'google-auth-library';
import {Connector} from '@google-cloud/cloud-sql-connector';

const connector = new Connector({
  auth: new GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/sqlservice.admin'],
  }),
});
```
This can be useful when configuring credentials that differ from Application Default Credentials. See the `google-auth-library` documentation for more information.
The custom Google Auth Library `auth` property can also be used to set auth-specific properties such as a custom quota project. Following on from the previous example, here is how you can set a custom quota project using a custom `auth` credential:
```javascript
import {GoogleAuth} from 'google-auth-library';
import {Connector} from '@google-cloud/cloud-sql-connector';

const connector = new Connector({
  auth: new GoogleAuth({
    clientOptions: {
      quotaProjectId: '<custom quota project>',
    },
  }),
});
```

### Additional customization via Environment Variables
It is possible to change some of the library's default behavior via environment variables. Here is a quick reference to supported values and their effect:
- `GOOGLE_APPLICATION_CREDENTIALS`: If defined, the connector will use this file as a custom credentials file to authenticate to Cloud SQL APIs. Should be a path to a JSON file. You can find more on how to get a valid credentials file here.
- `GOOGLE_CLOUD_QUOTA_PROJECT`: Used to set a custom quota project for Cloud SQL APIs when defined.

### Using DNS domain names to identify instances

The connector can be configured to use DNS to look up an instance. This allows you to configure your application to connect to a database instance via a domain name, and to centrally control which instance it resolves to from your DNS zone.
#### Configure your DNS Records

Add a DNS TXT record for the Cloud SQL instance to a private DNS server or a private Google Cloud DNS Zone used by your application.
Note: You are strongly discouraged from adding DNS records for your Cloud SQL instances to a public DNS server. This would allow anyone on the internet to discover the Cloud SQL instance name.
For example: suppose you wanted to use the domain name `prod-db.mycompany.example.com` to connect to your database instance `my-project:region:my-instance`. You would create the following DNS record:
- Record type: `TXT`
- Name: `prod-db.mycompany.example.com` – the domain name used by the application
- Value: `my-project:region:my-instance` – the instance connection name

Configure the connector as described above, replacing the instance connection name with the DNS name.
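As an illustrative sanity check (not part of the connector API), the TXT record's value must be a full instance connection name in `project:region:instance` form, while the record's name is the domain your application uses:

```javascript
// Illustrative only: checks that a TXT record value looks like a full
// instance connection name ("project:region:instance").
function looksLikeInstanceConnectionName(value) {
  return /^[^:\s]+:[^:\s]+:[^:\s]+$/.test(value);
}

console.log(looksLikeInstanceConnectionName('my-project:region:my-instance')); // true
console.log(looksLikeInstanceConnectionName('prod-db.mycompany.example.com')); // false
```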
Adapting the MySQL + `mysql2` example above:
```javascript
import mysql from 'mysql2/promise';
import {Connector} from '@google-cloud/cloud-sql-connector';

const connector = new Connector();
const clientOpts = await connector.getOptions({
  domainName: 'prod-db.mycompany.example.com',
  ipType: 'PUBLIC',
});

const pool = await mysql.createPool({
  ...clientOpts,
  user: 'my-user',
  password: 'my-password',
  database: 'db-name',
});

const conn = await pool.getConnection();
const [result] = await conn.query(`SELECT NOW();`);
console.table(result); // prints returned time value from server

await pool.end();
connector.close();
```

#### Automatic failover using DNS domain names
For example: suppose an application is configured to connect using the domain name `prod-db.mycompany.example.com`. Initially the private DNS zone has a TXT record with the value `my-project:region:my-instance`, so the application establishes connections to the `my-project:region:my-instance` Cloud SQL instance. Configure the connector using the `domainName` option, as in the example above.
Then, to reconfigure the application to use a different database instance, change the value of the `prod-db.mycompany.example.com` DNS record from `my-project:region:my-instance` to `my-project:other-region:my-instance-2`. The connector inside the application detects the change to this DNS record. Now, when the application connects to its database using the domain name `prod-db.mycompany.example.com`, it will connect to the `my-project:other-region:my-instance-2` Cloud SQL instance.
The connector will automatically close all existing connections to `my-project:region:my-instance`. This forces the connection pools to establish new connections, and may cause in-progress database queries to fail.
The connector polls for changes to the DNS name every 30 seconds by default. You may configure the polling frequency using the Connector's `failoverPeriod` option. When this is set to 0, the connector disables polling and only checks whether the DNS record has changed when creating a new connection.
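As a minimal sketch, assuming the `failoverPeriod` option is passed to the `Connector` constructor as described above:

```javascript
import {Connector} from '@google-cloud/cloud-sql-connector';

// Disable background DNS polling; the DNS record is then only
// re-checked when a new connection is created.
const connector = new Connector({failoverPeriod: 0});
```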
This project uses semantic versioning and the following lifecycle regarding support for a major version:

- **Active** - Active versions get all new features and security fixes (that wouldn't otherwise introduce a breaking change). New major versions are guaranteed to be "active" for a minimum of 1 year.
- **Deprecated** - Deprecated versions continue to receive security and critical bug fixes, but do not receive new features. Deprecated versions will be supported for 1 year.
- **Unsupported** - Any major version that has been deprecated for >=1 year is considered unsupported.
### Supported Node.js Versions

Our client libraries follow the Node.js release schedule. Libraries are compatible with all current active and maintenance versions of Node.js. If you are using an end-of-life version of Node.js, we recommend that you update as soon as possible to an actively supported LTS version.
Google's client libraries support legacy versions of Node.js runtimes on a best-efforts basis with the following warnings:
This project aims for a release on at least a monthly basis. If no new features or fixes have been added, a new PATCH version with the latest dependencies is released.
We welcome outside contributions. Please see our Contributing Guide for details on how best to contribute.
Apache Version 2.0
See LICENSE