Programmatic Schema Management & ZCQL DDL Support for Zoho Catalyst Data Store

1. Background

Zoho Catalyst’s Data Store and ZCQL are already powerful for building serverless applications on top of a fully managed relational database.

ZCQL offers a familiar SQL-like syntax for performing data operations (SELECT, INSERT, UPDATE, DELETE), while the Catalyst Console provides a convenient UI for defining tables and columns.

However, as applications grow in complexity and move through multiple environments (development, staging, production, tenant-specific workspaces, etc.), a console-only approach to schema management becomes a bottleneck.

2. Current Limitation

According to the current documentation and knowledge base:

  • ZCQL supports Data Manipulation Language (DML) operations only. There is no support for CREATE TABLE, ALTER TABLE, DROP TABLE, or other Data Definition Language (DDL) statements.

  • Catalyst support has confirmed that tables cannot be created via code; tables and their schemas must be created in the Catalyst Console, while data can then be inserted/queried via code.

  • Official tutorials consistently instruct developers to manually create tables and columns in the Data Store UI before wiring up SDKs and ZCQL in their application code. 

This means Catalyst currently does not support a “schema as code” or “migration” approach that many modern backend/serverless platforms provide.

3. Why this is a problem in real-world projects

For simple demos, manual table creation in the console is acceptable. But in real projects, this introduces major pain points:

  1. Multi-environment setups (dev / staging / production)

    • Every environment must have the same schema.

    • Right now, this means someone has to recreate tables and columns by hand, or rely on fragile manual documentation.

    • Any typo or mismatch can cause runtime failures that are hard to detect early.

  2. Team collaboration and onboarding

    • New developers or partners must be told: “Log in to Catalyst, click here, create this table, add these columns…”

    • This is error-prone, not repeatable, and not easily reviewed (no Git history or code review around schema changes).

  3. Continuous Delivery and Automated Deployments

    • CI/CD pipelines can deploy code and functions, but cannot reliably apply schema changes, because there is no first-class, scriptable mechanism to create or modify tables.

    • This blocks stronger DevOps practices such as “one-click environment setup” or automated test environment spin-up.

  4. Tenant-specific or dynamic schemas

    • Some applications need to provision tables at runtime (for example, per tenant, per customer space, or per module).

    • Today this is impossible without manual intervention, which breaks the idea of fully automated onboarding or self-service provisioning.

  5. Auditability and reproducibility

    • Schema changes done via UI are not easy to track or roll back.

    • There is no built-in way to version schema changes, apply them forward, or revert them, similar to how code migrations work in ORM frameworks.

In short: the lack of scriptable schema management is now one of the main constraints preventing larger, more complex systems from fully standardizing on Zoho Catalyst.

4. Proposed Feature Set

I’d like to propose a set of enhancements that would make Catalyst’s Data Store much more powerful for serious application development:


4.1. ZCQL DDL Support (simple, SQL-like schema creation)

Extend ZCQL to support a safe subset of DDL for Data Store tables, for example:

  • CREATE TABLE

    • Define table name

    • Define columns, data types, constraints (NOT NULL, UNIQUE, default values, etc.)

  • ALTER TABLE

    • Add, rename, or drop columns

    • Modify column data types where feasible

  • DROP TABLE

    • Optionally with safety flags or soft-deletion

These DDL commands could be:

  • Executed from:

    • Functions (Node.js, Java, Python, etc.), via the SDK’s executeZCQLQuery() method

    • The ZCQL Console (for quick testing)

  • Controlled by permissions/scopes:

    • Only allowed for project owners or for functions with specific roles/scopes.

    • Perhaps disabled by default in production unless explicitly enabled.

This would immediately allow developers to codify their schemas and apply them through deployment scripts.
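
As a rough illustration of what this could look like from a Node.js function, here is a sketch under the assumption that the proposed DDL subset is accepted by the existing executeZCQLQuery() method; the statement syntax and the table/column names are made up for the example:

    // Hypothetical sketch: executeZCQLQuery() exists in the current Node.js SDK,
    // but today it accepts DML only; accepting this safe DDL subset is the
    // proposed enhancement. Table and column names are illustrative.
    const catalyst = require('zcatalyst-sdk-node');

    async function createOrdersTable(req) {
      const app = catalyst.initialize(req); // existing SDK initialization

      // Proposed DDL statement (illustrative syntax only)
      const ddl = `CREATE TABLE Orders (
        OrderNumber VARCHAR(100) NOT NULL UNIQUE,
        Status      VARCHAR(50)  DEFAULT 'NEW',
        TotalAmount DOUBLE
      )`;

      return app.zcql().executeZCQLQuery(ddl);
    }

    module.exports = createOrdersTable;

The same statement could be pasted into the ZCQL Console for quick testing, subject to the permission scopes described above.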


4.2. Schema Management API & SDK Methods

In addition to, or as an alternative to, DDL in ZCQL, Catalyst could offer dedicated schema management APIs, for example:

  • REST endpoints such as:

    • POST /datastore/tables – create a table with a JSON schema definition

    • PATCH /datastore/tables/{table_name} – alter a table

    • DELETE /datastore/tables/{table_name} – drop a table (with safeguards)

  • SDK wrappers:

    • catalyst.dataStore.createTable(schemaDefinition)

    • catalyst.dataStore.alterTable(tableName, changes)

    • catalyst.dataStore.dropTable(tableName, options)

The schema definition could be a JSON/YAML structure (sketched after this list) describing:

  • Table name

  • Columns (name, data type, length, flags like mandatory/unique/encrypted)

  • Relationships (if/when Data Store supports foreign key-like references)

  • Indexes (now or in future)
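
A minimal sketch of such a definition, used with the proposed SDK wrappers, might look like the following. The field names (table_name, columns, max_length, etc.) and the createTable()/alterTable() methods are assumptions for illustration and do not exist in the current SDK; the same JSON shape could equally serve as the request body of the proposed POST /datastore/tables endpoint.

    // Hypothetical sketch: the schema-definition shape and the createTable()/
    // alterTable() wrappers are proposed here, not part of today's Catalyst SDK.
    const catalyst = require('zcatalyst-sdk-node');

    const ordersSchema = {
      table_name: 'Orders',
      columns: [
        { name: 'OrderNumber', data_type: 'varchar', max_length: 100, mandatory: true, unique: true },
        { name: 'Status',      data_type: 'varchar', max_length: 50,  default_value: 'NEW' },
        { name: 'TotalAmount', data_type: 'double' }
      ]
    };

    async function provisionSchema(req) {
      const app = catalyst.initialize(req); // existing SDK initialization

      // Proposed wrapper methods (do not exist today)
      await app.datastore().createTable(ordersSchema);
      await app.datastore().alterTable('Orders', {
        add_columns: [{ name: 'TenantId', data_type: 'varchar', max_length: 64, mandatory: true }]
      });
    }

    module.exports = provisionSchema;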

This would unlock:

  • Schema as code: store schema files in Git, review via pull requests.

  • Automated environment setup: a single script/command can spin up a new environment (dev, staging, test) with identical schema.

  • Programmatic multi-tenant provisioning: create per-tenant tables automatically on customer signup.


4.3. Migration & Versioning (Nice to Have)

On top of DDL or APIs, a simple migration mechanism would be extremely valuable:

  • A migration file format like:

    • 2025_01_01_001_create_users_table.zcql

    • 2025_01_10_002_add_status_to_orders.json

  • A small migration runner that:

    • Tracks which migrations have been applied.

    • Applies new migrations in order.

    • Rolls back where possible (or at least fails early with clear logs).

This doesn’t need to be as complex as a full ORM migration framework; even a minimal built-in migration runner, or a recommended pattern, would considerably improve the developer experience.
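
As a sketch of the kind of pattern this would enable (not a proposal for a specific implementation), a minimal runner could keep a pre-created Migrations tracking table and apply pending .zcql files in order. The table, column, and file-naming conventions below are assumptions, and the DDL execution relies on the proposed ZCQL support:

    // Minimal migration-runner sketch built on the proposed DDL support.
    // Assumes a pre-created "Migrations" tracking table with a "Name" column.
    const fs = require('fs');
    const path = require('path');
    const catalyst = require('zcatalyst-sdk-node');

    async function runMigrations(req, migrationsDir = path.join(__dirname, 'migrations')) {
      const app = catalyst.initialize(req);
      const zcql = app.zcql();

      // 1. Which migrations have already been applied? (plain DML, works today;
      //    result rows are keyed by table name)
      const applied = await zcql.executeZCQLQuery('SELECT Name FROM Migrations');
      const appliedNames = new Set(applied.map(row => row.Migrations.Name));

      // 2. Apply pending .zcql files in lexical (timestamp-prefixed) order
      const files = fs.readdirSync(migrationsDir).filter(f => f.endsWith('.zcql')).sort();
      for (const file of files) {
        if (appliedNames.has(file)) continue;

        const ddl = fs.readFileSync(path.join(migrationsDir, file), 'utf8');
        await zcql.executeZCQLQuery(ddl); // proposed DDL execution
        await zcql.executeZCQLQuery(`INSERT INTO Migrations (Name) VALUES ('${file}')`);
        console.log(`Applied migration: ${file}`);
      }
    }

    module.exports = runMigrations;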


5. Safety & Governance Considerations

Introducing schema changes via code naturally raises concerns about safety. Here are some ideas to mitigate risk:

  1. Role-based access control

    • Only project owners / admins can run schema-changing code in production.

    • Separate scopes for “DML only” vs “DDL allowed”.

  2. Environment protections

    • Ability to disable DDL in production by default and enable it only when required.

    • Optional approval step in the console for dangerous operations (like dropping tables).

  3. Audit logs

    • Log every schema change (who, when, what DDL/API) with before/after snapshots in the Catalyst Logs section.

  4. Dry-run mode

    • API/SDK support for a “dry run” mode that shows what would change without applying it.

These measures preserve the stability of production while still allowing scalable, automated schema management.
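
For instance, the dry-run idea could surface as an option on the proposed SDK methods. The { dryRun: true } option, the alterTable() method, and the returned change plan below are purely illustrative:

    // Hypothetical sketch: neither alterTable() nor the { dryRun: true } option
    // exists today; this shows the proposed behaviour only.
    const catalyst = require('zcatalyst-sdk-node');

    async function previewStatusColumn(req) {
      const app = catalyst.initialize(req);

      // Proposed: report what would change without applying anything
      const plan = await app.datastore().alterTable(
        'Orders',
        { add_columns: [{ name: 'Status', data_type: 'varchar', max_length: 50 }] },
        { dryRun: true }
      );

      console.log(plan); // the column additions that would be performed
    }

    module.exports = previewStatusColumn;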


6. Impact & Benefits for the Catalyst Ecosystem

Adding programmatic schema management will:

  • Reduce onboarding time for new projects and new developers.

  • Improve reliability of deployments across environments (less human error).

  • Enable more sophisticated architectures (multi-tenant apps, dynamically generated modules, rapid PoC → production flows).

  • Make Catalyst more competitive against other serverless / BaaS platforms that already support migrations and schema-as-code patterns.

  • Encourage partners and agencies to standardize on Catalyst for more of their backend workloads, since infrastructure can be fully automated.

In short, this feature would dramatically improve the developer experience, scalability, and maintainability of Catalyst-based solutions.


7. Closing

Right now, Catalyst’s Data Store and ZCQL are excellent for data operations, but the inability to create and evolve schemas via script is a key missing piece.

I hope you will consider:

  • Adding DDL support in ZCQL and/or

  • Providing schema management APIs & SDK methods, and eventually

  • Offering a lightweight migrations framework.

This would align Catalyst with modern “infrastructure as code” and “schema as code” practices and unlock a lot of advanced use cases for teams building serious production systems on Zoho Catalyst.
