
DynamoDB

Fully managed NoSQL database service from AWS that offers predictable performance and automatic scaling for modern applications.

Updated on January 14, 2026

Amazon DynamoDB is a fully managed NoSQL key-value and document database service provided by AWS. It delivers single-digit millisecond performance at any scale, with built-in high availability and optional multi-region replication through global tables. DynamoDB eliminates the operational complexity of managing a distributed database while ensuring consistent latency.

Fundamentals

  • Serverless architecture with virtually unlimited automatic horizontal scaling
  • Flexible data model based on tables, items, and attributes without fixed schema
  • Two capacity modes: on-demand (pay-per-request) or provisioned with auto-scaling
  • Eventual consistency by default with strongly consistent read option available
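
The two capacity modes differ only in the request shape used to create the table. A minimal sketch of both, as the plain input objects you would pass to the SDK's CreateTableCommand (table and key names are assumptions for illustration):

```typescript
// On-demand: no capacity to declare up front, billed per request.
const onDemandTable = {
  TableName: 'Users',
  KeySchema: [
    { AttributeName: 'PK', KeyType: 'HASH' },  // partition key
    { AttributeName: 'SK', KeyType: 'RANGE' }, // sort key
  ],
  AttributeDefinitions: [
    { AttributeName: 'PK', AttributeType: 'S' },
    { AttributeName: 'SK', AttributeType: 'S' },
  ],
  BillingMode: 'PAY_PER_REQUEST',
};

// Provisioned: read/write capacity units declared up front; auto-scaling
// is configured separately through Application Auto Scaling.
const provisionedTable = {
  ...onDemandTable,
  BillingMode: 'PROVISIONED',
  ProvisionedThroughput: { ReadCapacityUnits: 5, WriteCapacityUnits: 5 },
};
```

Switching an existing table between modes is an UpdateTable call with the same BillingMode field.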

Benefits

  • Predictable performance with sub-10ms latency even at millions of requests per second
  • Zero infrastructure management: no servers to provision, patch, or maintain
  • Native high availability with automatic replication across three Availability Zones
  • Continuous backup and point-in-time recovery with no performance impact
  • Built-in security with encryption at rest, granular IAM access control, and VPC endpoints

Practical Example

user-service.ts
import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { DynamoDBDocumentClient, PutCommand, GetCommand, QueryCommand } from '@aws-sdk/lib-dynamodb';

const client = new DynamoDBClient({ region: 'eu-west-1' });
const docClient = DynamoDBDocumentClient.from(client);

// Create an item
async function createUser(userId: string, email: string) {
  const command = new PutCommand({
    TableName: 'Users',
    Item: {
      PK: `USER#${userId}`,
      SK: 'PROFILE',
      email,
      createdAt: new Date().toISOString(),
      status: 'active'
    }
  });
  await docClient.send(command);
}

// Read an item
async function getUser(userId: string) {
  const command = new GetCommand({
    TableName: 'Users',
    Key: { PK: `USER#${userId}`, SK: 'PROFILE' },
    ConsistentRead: true // strongly consistent read: always the latest data, at twice the read cost
  });
  const response = await docClient.send(command);
  return response.Item;
}

// Query a global secondary index (GSI reads are always eventually consistent)
async function getUsersByStatus(status: string) {
  const command = new QueryCommand({
    TableName: 'Users',
    IndexName: 'StatusIndex',
    KeyConditionExpression: '#status = :status',
    ExpressionAttributeNames: { '#status': 'status' },
    ExpressionAttributeValues: { ':status': status }
  });
  const response = await docClient.send(command);
  return response.Items;
}

Implementation

  1. Design the data model by identifying access patterns (single-table design recommended)
  2. Define the partition key (PK) to distribute data evenly and optional sort key (SK)
  3. Create the table via AWS Console, CLI, or Infrastructure as Code (CloudFormation, Terraform)
  4. Configure global secondary indexes (GSI) or local secondary indexes (LSI) for alternative queries
  5. Implement access logic with the AWS SDK for your language (e.g., the DynamoDBDocumentClient shown above)
  6. Enable DynamoDB Streams to capture changes and trigger downstream workflows
  7. Configure CloudWatch metrics and alarms to monitor consumption and performance
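
Steps 2, 4, and 6 can be sketched as a single CreateTable request shape, the same JSON you would pass to the SDK's CreateTableCommand or embed in CloudFormation/Terraform. The attribute and index names below are illustrative assumptions, chosen to match the StatusIndex queried in the example above:

```typescript
const createTableInput = {
  TableName: 'Users',
  AttributeDefinitions: [
    { AttributeName: 'PK', AttributeType: 'S' },
    { AttributeName: 'SK', AttributeType: 'S' },
    { AttributeName: 'status', AttributeType: 'S' },
  ],
  KeySchema: [
    { AttributeName: 'PK', KeyType: 'HASH' },  // partition key (step 2)
    { AttributeName: 'SK', KeyType: 'RANGE' }, // optional sort key
  ],
  BillingMode: 'PAY_PER_REQUEST',
  GlobalSecondaryIndexes: [ // step 4: alternative query path by status
    {
      IndexName: 'StatusIndex',
      KeySchema: [{ AttributeName: 'status', KeyType: 'HASH' }],
      Projection: { ProjectionType: 'ALL' }, // copy all attributes into the index
    },
  ],
  StreamSpecification: { // step 6: emit item-level changes for Lambda consumers
    StreamEnabled: true,
    StreamViewType: 'NEW_AND_OLD_IMAGES',
  },
};
```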

Pro Tip

Adopt the Single-Table Design pattern to reduce costs and improve performance. Use intelligent composite keys (PK#SK) to group related data and minimize the number of requests. Choose On-Demand mode for unpredictable workloads and Provisioned mode with auto-scaling for stable workloads to optimize costs.
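
A minimal sketch of the composite-key idea (entity prefixes like USER# and ORDER# are naming conventions for illustration, not DynamoDB features): small helpers build the keys, so one Query fetches all of a user's related items in a single request because they share the same partition key.

```typescript
// Illustrative single-table key helpers.
const userPK = (userId: string): string => `USER#${userId}`;
const orderSK = (orderId: string): string => `ORDER#${orderId}`;

// Query parameters that return every order belonging to one user:
// same partition key, sort key filtered by prefix.
const ordersForUser = (userId: string) => ({
  TableName: 'Users',
  KeyConditionExpression: 'PK = :pk AND begins_with(SK, :prefix)',
  ExpressionAttributeValues: { ':pk': userPK(userId), ':prefix': 'ORDER#' },
});
```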

Related Tools

  • AWS SDK (JavaScript, Python, Java, .NET): official clients to interact with DynamoDB
  • NoSQL Workbench: visual modeling and testing tool for DynamoDB
  • DynamoDB Streams + Lambda: event-driven architecture to react to changes
  • DAX (DynamoDB Accelerator): in-memory cache compatible with DynamoDB API for microsecond latency
  • PartiQL: SQL-like query language for DynamoDB
  • AWS Glue: managed ETL service for data migration and transformation (replacing the now-deprecated AWS Data Pipeline)
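
As a hedged sketch, the status lookup from the example above could also be written in PartiQL and sent through the SDK's ExecuteStatementCommand; the table and index names are the same assumptions as before:

```typescript
// PartiQL version of the StatusIndex query; '?' placeholders are
// bound positionally through the Parameters array.
const usersByStatus = (status: string) => ({
  Statement: 'SELECT * FROM "Users"."StatusIndex" WHERE "status" = ?',
  Parameters: [status],
});
```

PartiQL is convenient for ad hoc queries, but it does not bypass the data model: a SELECT without a key condition still runs as a full scan.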

DynamoDB represents a strategic choice for organizations seeking highly performant databases without operational overhead. Its ability to handle millions of requests per second with predictable latency makes it the ideal solution for web, mobile, gaming, IoT, and ad tech applications. Investment in mastering its specific data model translates into substantial savings in infrastructure and development time.
