

Getting Started

Automate your healthcare workflows by connecting Morf to Google BigQuery. BigQuery is a cloud-based data warehouse for analyzing and querying large datasets; Morf's Save to BigQuery action saves event data from a workflow to a BigQuery table.

Grant Morf access to BigQuery

  1. Create a new dataset. Creating a dedicated dataset is the recommended approach because it ensures Morf only has access to the necessary tables. Follow Google’s instructions on creating a new dataset.
  2. Share your dataset with Morf. Follow Google’s instructions on granting access to a dataset.
    • Role: BigQuery Data Editor (roles/bigquery.dataEditor)
    • Email: primary-workload-sa@morf-prod-5f7d.iam.gserviceaccount.com
This role should be granted at the dataset level so Morf doesn’t have unnecessary access to other data.
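The grant in step 2 can also be scripted. The sketch below builds the dataset-level access entry for Morf's service account in the JSON shape that a BigQuery dataset's "access" list uses (as shown by tools like `bq show --format=prettyjson`); it is an illustration of the entry you are adding, not Morf's or Google's tooling, and the helper name is hypothetical.

```python
# Sketch: adding Morf's service account to a BigQuery dataset's "access" list.
# The entry shape mirrors the dataset JSON representation; applying the updated
# list back to the dataset (e.g. via the BigQuery API) is out of scope here.

MORF_SA = "primary-workload-sa@morf-prod-5f7d.iam.gserviceaccount.com"

def grant_data_editor(access_entries, service_account=MORF_SA):
    """Return a new access list with a dataset-level Data Editor grant
    for the given service account, skipping duplicates."""
    entry = {
        "role": "roles/bigquery.dataEditor",
        "userByEmail": service_account,
    }
    if entry in access_entries:
        return list(access_entries)
    return list(access_entries) + [entry]

# Existing access list (owner only), as it might appear in the dataset JSON:
access = [{"role": "OWNER", "userByEmail": "admin@example.com"}]
updated = grant_data_editor(access)
```

Because the grant is added at the dataset level rather than the project level, Morf can write to tables in this dataset but cannot see other data in your project.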
Use Morf’s Save to BigQuery action to store event data directly in BigQuery from a workflow.

Parameters

Example result object:
{
  "$result_object_key": {
    "estimated_row_count": 1000,
    "partition": "2025-03-17",
    "schema_differences": [
      {
        "description": "Number of visits of this patient",
        "field": "number_of_visits",
        "field_addition": {},
        "field_type": "INTEGER",
        "field_update": {
          "new_description": "Number of visits of this patient",
          "new_field_type": "STRING",
          "new_repeated": true,
          "new_required": false
        },
        "nested_field_changes": null,
        "repeated": false,
        "required": true
      }
    ],
    "table_name": "healthie_patient_updated",
    "table_schema": [
      {
        "description": "Number of visits of this patient",
        "fields": null,
        "name": "number_of_visits",
        "repeated": false,
        "required": true,
        "type": "INTEGER"
      }
    ]
  }
}
Field descriptions:

{
  "$result_object_key": {
    "estimated_row_count": "Estimated total number of rows including the data just inserted",
    "partition": "Date partition the data was saved to",
    "schema_differences": {
      "description": "Field description",
      "field": "Field name",
      "field_addition": {},
      "field_type": "Field type",
      "field_update": {
        "new_description": "New description",
        "new_field_type": "New field type",
        "new_repeated": "New value for whether the field is a list",
        "new_required": "New value for whether the field is required"
      },
      "nested_field_changes": null,
      "repeated": "If the field is a list",
      "required": "If the field is required"
    },
    "table_name": "Table name used to save the data",
    "table_schema": {
      "description": "Field description",
      "fields": null,
      "name": "Field name",
      "repeated": "If the field is a list",
      "required": "If the field is required",
      "type": "Field type"
    }
  }
}
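Each entry in schema_differences describes how a field in the existing table differs from the incoming data; the nested field_update holds the new type, description, and repeated/required flags. The sketch below shows one way a consumer of this response might apply such an update to a local copy of table_schema. The field names follow the response shape shown above, but the helper itself is illustrative, not Morf's internal logic.

```python
# Sketch: applying a field_update from a schema_differences entry to a local
# copy of table_schema (field names follow the response shape shown above).

def apply_field_update(table_schema, diff):
    """Return table_schema with one field rewritten per a schema_differences entry."""
    update = diff.get("field_update") or {}
    out = []
    for field in table_schema:
        if field["name"] == diff["field"] and update:
            field = dict(field)  # copy so the input schema is untouched
            field["type"] = update.get("new_field_type", field["type"])
            field["description"] = update.get("new_description", field["description"])
            field["repeated"] = update.get("new_repeated", field["repeated"])
            field["required"] = update.get("new_required", field["required"])
        out.append(field)
    return out

# Using the example response above: INTEGER -> repeated, optional STRING.
schema = [{"description": "Number of visits of this patient",
           "fields": None, "name": "number_of_visits",
           "repeated": False, "required": True, "type": "INTEGER"}]
diff = {"field": "number_of_visits",
        "field_update": {"new_description": "Number of visits of this patient",
                         "new_field_type": "STRING",
                         "new_repeated": True,
                         "new_required": False}}
updated_schema = apply_field_update(schema, diff)
```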

Result Object Field Details

You can use the action’s result data as input to downstream workflow actions. Each fetch action requires a result object key, which nests the action’s result data inside the downstream data context of the workflow. The examples above refer to this data using the prefix $result_object_key.
The response object is SaveToBigQueryResponse, a Fetch Action Response Object.
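To make the nesting concrete, the sketch below resolves a $result_object_key-prefixed reference against a downstream data context. The dotted-path syntax and the resolver are illustrative assumptions, not Morf's actual templating engine; consult the workflow reference documentation for the exact syntax.

```python
# Sketch: resolving a "$result_object_key"-prefixed reference against the
# downstream data context. Dotted-path syntax here is an assumption for
# illustration only.

def resolve(context, reference):
    """Look up a dotted reference like '$result_object_key.table_name'."""
    value = context
    for part in reference.split("."):
        value = value[part]
    return value

# Downstream data context after the Save to BigQuery action runs,
# using values from the example response above:
context = {"$result_object_key": {"table_name": "healthie_patient_updated",
                                  "partition": "2025-03-17"}}
table = resolve(context, "$result_object_key.table_name")
# table -> "healthie_patient_updated"
```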