triploc/dynq: AWS DynamoDB query library.

AWS DynamoDB datastore library that makes data access layers based on DynamoDB easier to develop and maintain. Call it "dink" if you like.

To use dynq, you need an AWS account. Once signed into AWS, go to the IAM Security Credentials section, click on Access Keys, and get an access key ID and a secret access key. Use these credentials to configure and connect dynq to AWS DynamoDB.

The flexibility of dynq comes from using metadata. A schema model uses table definitions and gets existing table metadata to facilitate more seamless programming.

var dynq = require("dynq");

// Configure using object or JSON file.
dynq.config({ accessKeyId: "xxx", secretAccessKey: "yyy", maxRetries: 5, region: "us-east-1" });

// Load a schema of tables from file or folder
var schema = dynq.connect().schema().require(path.join(__dirname, "model"), { 
    customize: "reuseable model", 
    configuration: "Setting", 
    enableFeature: true 
});
    
// Ensure tables exist and are 'active'
schema.create({ 
    prefix: "TABLE_PREFIX_",
    minReadCapacity: 25,
    minWriteCapacity: 20
}, (err) => { /* ready! */ });

// Easily backup and restore data
schema.backup(__dirname + "/directory", (err) => { });
schema.restore(__dirname + "/directory", (err) => { });

Connections and tables build on the basic putItem, deleteItem, and updateItem actions to provide a more comprehensive set of record-level mutation methods.

// Mutation operations
var table = schema.tables.table;
table.insert({ id: 1, range: 2 }, err => { ... });
table.write({ id: 1, range: 2 }, err => { ... });
table.upsert({ id: 1, range: 2 }, err => { ... });
table.update({ id: 1, range: 3 }, err => { ... });
table.delete(1, err => { ... });

// Query operations
table.exists(1, (err, exists) => { ... });
table.get(1, (err, item) => { ... });
table.getPart(1, [ "range" ], (err, item) => { ... });

// Bulk operations
table.writeAll([ ... ], err => { ... });
table.getAll([ ... ], err => { ... });
table.deleteAll([ ... ], err => { ... });

Queries

The most advanced mutation operations supported by DynamoDB are exposed through a series of builder classes available on a table.

// Query keys-only index and project rest of record
table.query({ id: 1 }).select.all((err, results) => { ... });

// Query and update data
table.query({ id: 1, range: [ "LT", 10 ] })
    .filter({ field: [ "BEGINS_WITH", "abc" ] })
    .update((edit, cb) => {
        edit.change({ ... }).add({ ... }).remove({ ... }).upsert(cb);
    }).all(err => { ... });

// Query and delete data
table.query({ id: 1, range: [ "LT", 10 ] })
    .filter({ field: [ "BEGINS_WITH", "abc" ] })
    .delete().all(err => { ... });

Configure the library with standard AWS configuration options.

// Create a simple connection
var cxn = new dynq.Connection({ region: "us-east-1" });

// Create a multi-master connection with an array of AWS regions.
cxn = dynq.connect({ regions: [ "us-east-1", "us-west-1" ], distribute: true });

Create connections with the builder method or constructor syntax.

Additional Options

Schemas put a programming model around DynamoDB tables using metadata and definition objects. The easiest way to define a table schema is within a JavaScript file loaded with the schema.require method.

Schema Example

// index.js
var schema = dynq.connect().schema().require("user.js", { 
    ... /* options */
}).create({ 
    ... /* options */
}, (err, schema) => {
    ... /* ready */
});

// user.js
module.exports = {
    name: "UsersTable",
    key: { id: "string" },
    read: 5,
    write: 5,
    sort: {
        ByUser: {
            columns: { user: "string" },
            project: "ALL"
        }
    },
    indices: {
        ByTimestamp: {
            columns: { timestamp: "number" },
            read: 5,
            write: 5,
            project: "ALL"
        }
    },
    members: function(table) {
        // These methods will be mixed-in with the table object
        this.foo = function(cb) {
            cb();
        };
    }
};

NOTE: The schema.require method uses the file name as the identifier within the programming model (i.e. schema.tables.user).

The schema.require method may take options, which allows development of reusable table schema generators, or components. Options passed to the schema.require method are available to a schema definition when it is defined as a function.

// to expose options use a function
module.exports = function(options) {
    return { 
        name: "Users",
        key: { id: "string" },
        read: 5,
        write: 5,
        ...
    };
}

Because a schema definition can be thought of as a flexible component, the schema.create method also takes options that support customizations like table name prefixes (prefix) and read/write capacities (minReadCapacity and minWriteCapacity).

Mixins and Behaviors

A big advantage of schemaless databases like Dynamo is that the data model is readily extensible. Methods added to the table can easily manipulate state, thereby implementing some semantically significant functionality. For example, an articles table might benefit from a publish method.

Table objects in a schema can be extended through mixins. A mixin takes the form of a function or class which will be initialized with the inheriting table. They are mounted on table definitions using the mixin, members, properties, or methods fields. The mixins field may be used to mount multiple mixins as an array, which is useful in the development and composition of components.

function(table) { 
    this.sample = cb => { cb(); };
}

// - or -

class mixin {
    constructor(table) { }
}
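The articles-table publish method mentioned above could be sketched as a mixin. This is an illustrative example, not code from the library: the `publishMixin` name and the `published` attribute are assumptions, and a stand-in table object is used here so the sketch is self-contained; in practice the mixin would be mounted through a definition's `members` field and receive a real dynq table.

```javascript
// Hypothetical mixin adding a publish method to an articles table.
// "table.update" follows the record-level API shown earlier in this README.
function publishMixin(table) {
    this.publish = (id, cb) => {
        // Stamp the record with a publication time (assumed attribute name).
        table.update({ id: id, published: Date.now() }, cb);
    };
}

// Stand-in table object that records update calls, so the sketch runs as-is.
var calls = [];
var fakeTable = { update: (item, cb) => { calls.push(item); cb(null); } };

var articles = {};
publishMixin.call(articles, fakeTable);
articles.publish(42, err => { /* published */ });
```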

A behavior is a mixin that also has access to, and may manipulate, the table definition itself. In the example of the articles table and publish method above, this functionality may be enhanced by a global secondary index on a published field. A behavior not only implements the functional logic, but might also add an index or query the existing schema.

// behavior.js
// behavior.js
module.exports = function(options) {
    // perform some logic based on options
    var indexName = options.indexName;
    return function(definition) {
        // inspect or alter the definition
        definition.indices[indexName] = { columns: { id: "string" } };
        return function(table) {
            // add mixin methods
            this.operation = (cb) => {
                // table exists within this scope
            };
        };
    };
};

// schema.js
// schema.js
module.exports = function(options) {
    // behaviors allow composition of table functionality
    return {
        name: "table",
        key: { id: "string" },
        behaviors: require("./behavior.js")(options)
    };
};

Programming Model

A dynq.schema provides table creation, modification, and deletion functionality and pairs this with table definition and description schemas.

Tables can be defined with the schema.define and schema.require methods, which merge into the overall schema.definition object. If you have tables already created, use schema.load to populate the schema from DynamoDB table descriptions.

To bring a schema to life, call schema.create, which enumerates through each table in schema.definition. Existing tables are loaded via schema.load, and those that do not exist are created. All tables can be deleted with schema.drop. The entire schema can be backed up to and restored from a folder of flat files with schema.backup and schema.restore.

Once a schema has been created, schema.tables contains the dynq.table objects to be read and written.

var schema = dynq.connect("us-east-1").schema();

State

Table Creation, Modification, and Deletion Methods

Schema Management Methods

Tables are accessed through the schema.tables object and use DynamoDB table description metadata to provide programming abstractions above the low-level DynamoDB API. With knowledge of the key attributes, a table provides conditional record-level methods like insert, update, and exists. Mass operations like writeAll, deleteAll, and getAll use smart batching logic to gracefully handle DynamoDB operation limitations.
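One such limitation is that DynamoDB's BatchWriteItem accepts at most 25 items per request, so a mass write must split its input into batches. The chunking below is a minimal illustration of the idea, not dynq's actual implementation:

```javascript
// Split an item array into request-sized batches (BatchWriteItem caps at 25).
function chunk(items, size) {
    var batches = [];
    for (var i = 0; i < items.length; i += size) {
        batches.push(items.slice(i, i + size));
    }
    return batches;
}

var records = Array.from({ length: 60 }, (_, i) => ({ id: i }));
var batches = chunk(records, 25); // 3 batches of 25, 25, and 10 items
```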

The table.query and table.edit methods are builder interfaces for more complicated actions. The edit interface allows for add and remove operations (in addition to change) on individual items. The query interface invokes index queries and table scans, whose data can either be directly returned, or repurposed as keys for a batch select, update, or delete operation.

table.mixin enables extension of table methods. Only methods whose names do not already exist on the table will be mixed in.

var schema = dynq.connect("us-east-1").schema();

schema.load(/PREFIX_.*/i, function(err) {
    var table = schema.tables["PREFIX_Users"];
    table.mixin(function(table) {
        this.foo = cb => { cb() };
    });
    
    table.foo(() => { });
});

Table-Level Members

File Methods

Record-Level Methods

Query Interface

Edit Interface

Conditions Composer

Calls that take a conditions object may build this object using the conditions composer interface by passing a function instead.

table.where(op => op.field("id").equals(1).and("timestamp").greaterThan(12341234));
// instead of
table.where({ id: 1, timestamp: [ "greater than", 12341234 ] });
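To make the fluent style above concrete, here is a minimal sketch of how such a composer could be implemented. This is illustrative of the builder pattern, not dynq's actual internals; the `build` method and the internal condition shape are assumptions.

```javascript
// Minimal fluent condition composer: each call either selects the current
// field or appends a condition, and returns the builder so calls chain.
function Composer() {
    var conditions = [];
    var current = null;
    this.field = (name) => { current = name; return this; };
    this.and = (name) => { current = name; return this; };
    this.equals = (value) => {
        conditions.push({ field: current, op: "EQ", value: value });
        return this;
    };
    this.greaterThan = (value) => {
        conditions.push({ field: current, op: "GT", value: value });
        return this;
    };
    this.build = () => conditions;
}

var op = new Composer();
var built = op.field("id").equals(1).and("timestamp").greaterThan(12341234).build();
// built: [{ field: "id", op: "EQ", value: 1 },
//         { field: "timestamp", op: "GT", value: 12341234 }]
```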

Expression Composer

The expression composer interface simplifies DynamoDB expressions by assembling the expression string while simultaneously populating the ExpressionAttributeNames and ExpressionAttributeValues parameters.

table.query().expression(op => op("id", "=", 1).and("timestamp", "between", 12341234, 23452345).and().exists("somefield").or("size(text)", ">", 2));

The comparator can be "in", "between", "<>", "<", "<=", ">", or ">=". The operand can be an attribute name, a path, or the "size" function. The value2 argument is only used with the "between" comparator.

Low-Level Connection Interface

The dynq module supports the logger and debug global configuration options. The logger defaults to console.log. If debug is set to true, all native DynamoDB operations are logged.
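The exact setter for these options is not shown in this document; the sketch below assumes they can be passed to dynq.config alongside the AWS options shown earlier, which may not match the library's actual API.

```javascript
var dynq = require("dynq");

// Route dynq's logging to a custom sink and enable operation logging.
// (Assumed configuration form; the README only names the logger and debug
// options without showing how they are set.)
dynq.config({
    logger: msg => process.stderr.write(msg + "\n"),
    debug: true  // log all native DynamoDB operations
});
```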

The Connection class provides access to native DynamoDB operations with multi-master support and throughput handling infrastructure. Some conditional and mass operations like insert and insertAll are built on top of the native calls to support higher-level table operations.

Arguments

These methods automatically encode parameters to and decode responses from the DynamoDB typed JSON format.
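DynamoDB's typed JSON wraps every attribute value in a type descriptor, e.g. `{ id: "abc", count: 5 }` becomes `{ id: { S: "abc" }, count: { N: "5" } }` on the wire, with numbers carried as strings. The encoder below is a minimal sketch of that mapping for common scalar types, written from the DynamoDB format itself rather than dynq's code:

```javascript
// Encode a plain object into DynamoDB typed JSON for scalar types.
// S = string, N = number (transmitted as a string), BOOL = boolean.
function encode(item) {
    var typed = {};
    Object.keys(item).forEach(key => {
        var value = item[key];
        if (typeof value === "string") typed[key] = { S: value };
        else if (typeof value === "number") typed[key] = { N: String(value) };
        else if (typeof value === "boolean") typed[key] = { BOOL: value };
    });
    return typed;
}

var typed = encode({ id: "abc", count: 5, active: true });
// typed: { id: { S: "abc" }, count: { N: "5" }, active: { BOOL: true } }
```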

Native DynamoDB Operation Proxies

The returned connections are compatible with the AWS DynamoDB API.

Arguments

These methods do not automatically decode responses in the DynamoDB typed JSON format.

