Release Announcement: Godspeed V2 for Node.js/Bun.js

World's first meta-framework for API and event-driven services

Ayush Ghai · 10 min read

We're excited to announce the release of Godspeed version 2 (Github). To our knowledge, it is the world's first 4th-generation microservices "meta" framework: the first attempt at abstracting, standardising and democratising the development of modern API and event-driven services, independent of the framework, language, integrations and protocols used. We are currently shipping the port for Node.js and Bun.js; the Java port is under construction and expected to come out later this year.

And a promise: in alignment with our commitment to a fair way of doing things ("FOSS: From Free to Fair"), all our dev tools will forever be free to run in self-hosted fashion for not-for-profits, individuals, IT services teams, freelancers and businesses that are yet to make a profit or raise Series A funding. We invite developers to help us build this unique project together and bring a paradigm shift in the way teams build and maintain great products across diverse stacks.

The Node.js module is a paradigm shift from V1, with feather-light packaging, a decoupled architecture, and a multitude of new features and enhancements to take your development experience to new heights. Before we dive into the highlights, let's look at the major differences and enhancements between V2 and V1.

Differences from V1

1) Packaging: V2 ships as a Node.js module with a command-line utility. This makes it super lightweight and easy to adopt in new or even existing Node.js or Bun.js projects. V1 was packaged via a Remote Containers based approach, which proved bulky to set up, build and run on a developer machine. Despite the theoretical promise of better practices it holds, we realised that remote containers come at their own cost. Even though we had initial wins across our enterprise customers, we needed to shift to a feather-light, quick-to-get-started approach.

npm install -g @godspeedsystems/godspeed
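Once the CLI is installed, a typical first run is to scaffold a project and start the dev server. The commands below are a quick sketch of that flow; run godspeed --help for the authoritative list of commands and options.

godspeed create my-service
cd my-service
godspeed serve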

2) Modularity: The framework now ships eventsources and datasources as modular plugins, which can be installed via the command line or handcrafted by developers within their project's src folder. They can also take the help of Godspeed's GPT (available for private preview). But creating plugins is straightforward and free; anyone can do it!
The plugins and the functions (JS, TS, YAML) are imported dynamically by the rest of the code, so that the eventsource and datasource SDKs are not coupled with the business logic. Developers should be able to replace an integration like Prisma by overwriting a single file in their plugin, instead of changing their entire microservice! Check out this video on decoupled architecture.

godspeed plugin add

Creating your own plugins

You can create your own plugins as documented here. Feel free to browse the implementations of our plugins to understand how they are set up.
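To give a flavour of what a handcrafted datasource plugin looks like, here is a minimal TypeScript sketch. It assumes the GSDataSource base class with the initClient()/execute() hooks that our shipped plugins follow; the axios client and the base_url config key are hypothetical, so treat the shipped plugin implementations as the source of truth for the exact contract.

import { GSContext, GSDataSource, PlainObject } from "@godspeedsystems/core";
import axios, { AxiosInstance } from "axios";

export default class DataSource extends GSDataSource {
    // Called once at startup; whatever is returned is kept by the framework as this.client
    protected async initClient(): Promise<object> {
        return axios.create({ baseURL: this.config.base_url }); // hypothetical config key
    }

    // Every datasource.<name>.* call from YAML, or ctx.datasources.<name>.execute() from TS, lands here
    async execute(ctx: GSContext, args: PlainObject): Promise<any> {
        const { meta, ...request } = args; // `meta` can carry entityType/method style hints
        const client = this.client as AxiosInstance;
        const res = await client.request(request);
        return res.data;
    }
}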

3) Native language functions: In V1, developers could write workflows in YAML only. While YAML has a shorthand syntax for simpler things like datasource calls and combining or transforming data, it is not suited for complex business logic like nested loops. Hence, the developer's freedom to choose how to write business logic was given high importance in this release. V2 allows developers to write JS and TS functions, and plugins to be used within inline scripts in YAML. This gives them full freedom and control over their Node.js or Bun.js app, while still leveraging Godspeed's opinionated guardrails for developing services in a better way. Like YAML workflows, native workflows can be set directly as an event handler or called from other functions (aka workflows).

import { GSContext, GSStatus, GSCloudEvent } from "@godspeedsystems/core";

/**
 * Here we call a datasource and can apply any business logic
 */
export default async function (ctx: GSContext, args: any) {

    //Ctx object has the basics you need to write any business logic in TS/JS
    const {
        logger: pinoLogger, 
        childLogger: pinoChildLogger, 
        inputs, outputs, functions, datasources, 
        config, mappings, plugins: scriptFunctions, forAuth, exitWithStatus
    } = ctx;
    //Accessing deserialized inputs from the event source
    const {user, body, params, query, headers} = inputs.data;
    //Apply some business logic. Call another function or workflow etc.
    //Note: this is not stopping you from statically importing the same function modules
    const sum = await functions['com.biz.sum'](args.x,args.y);

    // Invoking any datasource method
    // The args can be completely custom as per the plugin implementation
    const mongooseCallArgs = {"name":"mastersilv3r"};

    // There is a `meta` key in args which can be used to send meta information to the plugin's execute() calls
    // The plugin knows how to handle these calls
    // In `meta` you can set your custom parameters on top of the arguments for the execute() method of the plugin
    // Prepare the args: add entityType, method name and authorization access rules to the datasource call

    const metaArgs = {
        entityType: 'Templates', 
        method: 'findOne', 
        authzPerms: { //If supported by the datasource plugin
            CollectionName: { //Collection Name
                can_access: ['name', 'email'], //Fields allowed to retrieve for this user
                where: { //Which rows are allowed to be returned to this user
                    tenant_id: user.tenantId, 
                    owner_id: user.ownerId
                }
            }
        }
    };
    const res = await ctx.datasources.mongoose.execute(ctx, {
        ...mongooseCallArgs,
        meta: metaArgs
    });
    if (!res.success) {
        return new GSStatus(false, res.code || 500, undefined, {message: "Internal Server Error", info: res.message})
    }
    return new GSStatus(true, 200, undefined, res);
}

4) Doubling down on schema-driven development: While we already supported schema-driven development in V1, we have now added Apollo GraphQL and multiple HTTP services (like Express and Fastify), all created and working with incoming and outgoing data deserialization, serialization, validations, authentication and authorization via the Godspeed + Swagger hybrid event schema. Developers work on one standard schema format and can expose their APIs via different protocols without writing or changing a single line of code in their business logic or schema validations! Check out this video on Schema Driven Development.

Let's check the difference (or similarity) between sample HTTP/REST and GraphQL event schemas.

http event

http.get./sample_api:
    summary: sample
    description: sample http event example
    fn: http_sample
    body: 
      content:
         application/json:
            schema:
               type: object 
               properties:
                  name:
                    type: string 
    responses:
       200: 
         content:
           application/json:
              schema:
                type: object

Graphql event

Apollo.post./sample:
    summary: sample
    description: sample graphql event example
    fn: graphql_sample
    body: 
      content:
         application/json:
            schema:
               type: object 
               properties:
                  name:
                    type: string 
    responses:
       200: 
         content:
           application/json:
              schema:
                type: object

They are the same! See how you can add Apollo GraphQL to your service or generate CRUD APIs in REST.
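Because both events deserialize into the same GSContext inputs, the handler behind them is protocol-agnostic. A minimal hypothetical handler for the sample events above could look like this:

import { GSContext, GSStatus } from "@godspeedsystems/core";

// The same handler works whether the event arrived over HTTP (Express/Fastify) or GraphQL (Apollo),
// because the eventsource has already validated and deserialized the body against the schema above.
export default async function sample(ctx: GSContext) {
    const { name } = ctx.inputs.data.body;
    return new GSStatus(true, 200, undefined, { greeting: `Hello ${name}` });
}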

Now, having seen the major highlights, let's dive deeper into the individual updates.

New Features

1) Introducing Eventsources: Eventsources capture and define entry points into the Godspeed framework, all following a standard schema syntax irrespective of the protocol, whether async or sync. Do check out the HTTP, Kafka, GraphQL, etc. eventsources in Godspeed's plugin directory.

Issue: Documentation

2) Datasource Enhancements: Datasources serve as central origins for querying and storing data from APIs or databases. Let's see examples from the AWS plugin.

Aws plugin scaffolding

├── src
│   ├── datasources
│   │   ├── types
│   │   │   └── aws.ts
│   │   └── aws.yaml
│   ├── events
│   ├── eventsources
│   └── functions
│       └── aws_list.yaml

AWS configuration file

  type: aws
  default_client_config: # uses env var declared in config/custom-environment-variables.yaml
    region: <%config.region%> 
    credentials:
      accessKeyId: <%config.accessKeyId%>
      secretAccessKey: <%config.secretAccessKey%>
  # service type is the name of the npm module, e.g. @aws-sdk/client-dynamodb or @aws-sdk/client-s3
  # The `types` key maps each service type to the SDK client class name used in code
  types:
    dynamodb: DynamoDB
    s3: S3
    lambda: Lambda
    ssm: SSM
    sqs: SQS
  services:
    s3:
      type: s3
      config: # overridden config for s3
        region: <%config.s3Region%> 
        credentials:
          accessKeyId: <%config.s3AccessKeyId%>
          secretAccessKey: <%config.s3SecretAccessKey%>
    s3_1: #another S3 service instance
      type: s3
    dynamodb:
      type: dynamodb
    sqs:
      type: sqs
    ssm:
      type: ssm
    lambda:
      type: lambda
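The <%config.*%> placeholders above resolve through the project's config. A hypothetical config/custom-environment-variables.yaml mapping those keys to environment variables (the variable names here are illustrative) could look like:

region: AWS_REGION
accessKeyId: AWS_ACCESS_KEY_ID
secretAccessKey: AWS_SECRET_ACCESS_KEY
s3Region: AWS_S3_REGION
s3AccessKeyId: AWS_S3_ACCESS_KEY_ID
s3SecretAccessKey: AWS_S3_SECRET_ACCESS_KEY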

Sample AWS function call

id: aws_workflow 
tasks: 
  - id: aws_list 
    fn: datasource.aws.s3.listObjects
    args: <% inputs.body %>
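The same call can also be made from a TS/JS workflow. The sketch below assumes the AWS plugin follows the execute()/meta convention shown in the mongoose example earlier (entityType for the service and method for the SDK call); treat it as illustrative rather than the plugin's exact surface.

import { GSContext, GSStatus } from "@godspeedsystems/core";

export default async function awsList(ctx: GSContext) {
    // Rough equivalent of fn: datasource.aws.s3.listObjects with args: <% inputs.body %>
    const res = await ctx.datasources.aws.execute(ctx, {
        ...ctx.inputs.data.body,
        meta: { entityType: 's3', method: 'listObjects' }
    });
    return new GSStatus(true, 200, undefined, res);
}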

Issue: Documentation

3) Authorization:

  • Authorization is now implemented in Godspeed V2. Adopt any RBAC or ABAC policies with rule engines and distributed systems, using any databases or policy engines. We provide an approach-agnostic way to apply custom authz at the event and workflow/task levels, including very fine-grained database access that limits access not just to tables but to rows and columns! You can customise the implementation to your org's use case.

    Issue: Documentation

  • Event level authz

        "http.get./helloworld":
          authn: true
          fn: helloworld
          authz:
            - fn: com.gs.transform
              id: try_auth_2_authz
              args: |
                <js%
                  if (inputs.user.role === 'admin') {
                    return {
                      success: true,
                      message: "Authorization passed"
                    }
                  } else {
                    return {
                      success: false,
                      message: "Authorization failed"
                    }
                  }
                %>
    
  • Task level authz

        id: task_authz
        tasks:
          - fn: com.gs.transform
            id: try_auth_3
            args:
              success: true
              data: {
                  "tableABC": {
                    "where": {ownerId: <%user.user_id%>, departmentId: <%user.department_id%>},
                    "can_access": ["column1", "column2"],
                    "no_access": ["column1"]
                  }
              }
    

    4) Direct JS/TS Workflow Invocation:

  • Allow native language workflows to be invoked directly from events.

  • Example event

        http.post./helloworld:
          fn: test
          body:
            content:
              application/json:
                schema:
                  type: object
          responses:
            200:
              content:
                application/json:
                  schema:
                    type: number
    
  • test.js

        const { GSStatus } = require('@godspeedsystems/core');

        module.exports = async (ctx) => {
            const x = parseInt(ctx.inputs.data.body.x);
            const y = parseInt(ctx.inputs.data.body.y);
            const responseData = x + y;
            return new GSStatus(true, 200, undefined, responseData, undefined);
        };
    

    5) Pluggable Observability in Godspeed v2:

  • Pluggable, OTEL-standard based monitoring of your services, capturing logs, traces and metrics via simple configuration.

  • Issue: Documentation

    6) Godspeed GS-Kit:

  • AI-Powered React Front-End Starter Kit for Rapid Development from OpenAPI/Swagger Specifications. This is a small but useful utility for initializing a React project with API/data/store connectivity generated from the Swagger spec of your backend service, along with Tailwind setup. A great place to get started with a UI around your API!

  • Issue: Documentation

    7) Automated Swagger Generation:

    Automated Swagger Generation for Smooth API Exploration.

    Issue: Documentation

    8) Godspeed Plugin - graphql-as-eventsource:

  • Run a GraphQL service on top of your database model in 5 minutes. Use Godspeed's standard event schema syntax, but instead of http, say graphql, Apollo, or a keyword of your choice.

  • Sample GraphQL event (create_category.yaml)

      Apollo.post./mongo/category:
        summary: Create a new Category
        description: Create a Category in the database
        fn: create
        body:
          content:
            application/json:
              schema:
                type: object
                properties:
                  name:
                    type: string
        responses:
          content:
            application/json:
              schema:
                type: object
    
  • Issue: Documentation

    9) Godspeed Language Tools:

  • Enhanced development experience with a VS Code extension and language server. Autocompletion and validation of events, workflows and tasks happen through this extension.

  • Issue: Documentation

    10) Godspeed Plugin - AWS as Datasource:

  • Introducing a Godspeed plugin for AWS as a datasource, exposing all AWS services via simple configuration files.

Issue: Documentation

11) Added Plugin Feature for Inline Scripts

You can write functions and use them within the inline scripts of your YAML DSL, in YAML workflows and in eventsource and datasource configurations. These plugins are synchronous JS/TS utility functions designed to enhance inline scripting capabilities. You can write any piece of code in a plugin and access it inside your inline scripts at any time. Please note, these are not to be confused with datasource or eventsource plugins.

plugins/epoch/convertEpoch.ts

    import format from 'date-fns/format';

    export default function convertEpoch(inputTimestamp: string) {
        const newDateTime = new Date(inputTimestamp);
        return format(newDateTime, 'yyyy-MM-dd HH:mm:ss');
    }

Usage in a YAML workflow task:

  - id: httpbinCof_step1
    description: Hit http bin with some dummy data. It will send back same as response
    fn: datasource.api.post./anything
    args:
      data:
        default_date: <% epoch_convertEpoch(inputs.body.datetimestamp) %>

Improvements

1) Simplifying Return Values in TS/JS Functions and Event Handlers:

  • Improved handling of return values in TS/JS functions and event handlers (see the sketch after this item).

  • Issue
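As a sketch of what this simplification enables, assuming (per the linked issue) that plain return values are wrapped into a successful GSStatus by the framework:

import { GSContext } from "@godspeedsystems/core";

// No explicit GSStatus needed on the happy path; the returned value is treated
// as the response data with a 200 status (assumption based on the linked issue).
export default async function add(ctx: GSContext) {
    const { x, y } = ctx.inputs.data.body;
    return Number(x) + Number(y);
}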

Bug fixes

1) Failure to Read Some Environment Variables from .env File:

  • Fixed the issue with reading environment variables from .env files.

  • Issue

2) Fixed Return Bugs in V2:

  • Addressed return bugs for smoother execution.

  • Issue

3) Request Body Schema Reference Issue:

  • Fixed the issue where the request body schema, $ref, couldn't take reference from the definitions.

  • Issue

4) Fix Inline JS Script in EventSources YAML:

  • Resolved the issue with inline JS script not working during loading of EventSources YAML.

  • Issue

5) Response Schema Validation Issue:

  • Fixed the issue where response schema validation was not working.

  • Issue

6) Inline Scripts Not Supported by EventSource YAML:

  • Resolved the issue where inline scripts were not supported by EventSource YAML (http.yaml).

  • Issue

7) Config Module Environment Variable Issue:

  • Fixed the issue where the config module was unable to read environment variables from the .env file.

  • Issue

Do watch these videos on Security and Configure Over Code. Here is a getting started video and the getting started documentation link. Feel free to join our Discord for any discussions.

What do you think about our philosophy behind Godspeed's meta design?

Be sure to check the provided links for detailed information on each feature, improvement and bug fix.

Thank you for being a part of the Godspeed community! Your contributions and feedback drive us to new heights.

Happy coding! 🚀