Can we run export files from an Ontology Function?

Can we currently run an export sync of a file from an ontology function?

Source: Google Cloud Storage.

Function: Python.

My assumption: an export sync of a file can only be run on datasets, and cannot be triggered from an Ontology function.

In general, the answer seems to be no, but it depends on the source type.

For GCS -

Follow the GCS docs on multipart uploads to export files: https://docs.cloud.google.com/storage/docs/multipart-uploads

You can construct the GCS client from within your function.
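As a rough sketch of what that client call looks like (not Foundry-specific): assuming you already have a GCP OAuth access token and network egress to storage.googleapis.com, a simple upload via the GCS JSON API can be assembled with just the standard library. The bucket and object names below are placeholders.

```python
import urllib.parse
import urllib.request

GCS_UPLOAD_ENDPOINT = "https://storage.googleapis.com/upload/storage/v1"


def build_gcs_upload_request(bucket: str, object_name: str, data: bytes,
                             access_token: str,
                             content_type: str = "application/octet-stream") -> urllib.request.Request:
    """Build a GCS JSON API simple-upload request (no network I/O here)."""
    url = (
        f"{GCS_UPLOAD_ENDPOINT}/b/{urllib.parse.quote(bucket, safe='')}/o"
        f"?uploadType=media&name={urllib.parse.quote(object_name, safe='')}"
    )
    return urllib.request.Request(
        url,
        data=data,
        method="POST",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": content_type,
        },
    )


# To actually send it (requires real credentials and egress):
# req = build_gcs_upload_request("my-bucket", "exports/data.csv", b"a,b\n1,2\n", token)
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```

For larger files you would switch `uploadType=media` for a resumable or multipart upload per the docs linked above; the request-building pattern is the same.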

To construct the GCS client you need credentials, but accessing the session credentials of a GCS source is not supported yet (coming soon).

Until then, you can create a proxy BigQuery source, which does support accessing the session credentials.

Sample code:

from functions.api import function, String
from functions.sources import get_source



SOURCE_NAME = "your-bigquery-proxy-source"  # placeholder: API name of your proxy source

@function(sources=[SOURCE_NAME])
def get_alchemy_gcp_access_token() -> String:
    """
    Retrieves a GCP OAuth access token via a BigQuery proxy source.
    
    This function connects to a BigQueryProxyForGCS source, 
    retrieves refreshable OAuth session credentials, and returns the 
    current access token as a string.
    
    The access token can be used to authenticate with GCP services like:
    - Google Cloud Storage (GCS)
    - BigQuery
    - Other GCP APIs
    
    Returns:
        String: The GCP OAuth access token
        
    Raises:
        Exception: If unable to retrieve credentials from the source
    """
    source = get_source(SOURCE_NAME)
    refreshable_creds = source.get_session_credentials()
    gcp_creds = refreshable_creds.get()
    
    return gcp_creds.access_token