Python packages with native C++ bindings crash in published Functions (work in preview)

Python Function with OR-Tools CP-SAT works in live preview but consistently fails when called from a Workshop function-backed variable

I have a Python Function that uses Google’s OR-Tools CP-SAT solver to optimize assignment of personnel to requirements (classic assignment problem). The function works perfectly in live preview in platform VSCode every time, but consistently fails when called from a Workshop function-backed string variable with this error:

RawClientError(hyper_util::client::legacy::Error(SendRequest, hyper::Error(Io, 
Custom { kind: UnexpectedEof, error: "peer closed connection without sending TLS 
close_notify" })))

It has worked a couple of times from Workshop, but very intermittently. The vast majority of calls fail.

What I’ve verified:

  • Function works consistently in live preview in platform VSCode (tested dozens of times)

  • Function consistently fails when called from a Workshop function-backed variable, works very rarely

  • The error doesn’t hit my try/except block, suggesting the process dies before Python can handle it

  • Replacing the OR-Tools optimizer with a pure Python greedy algorithm makes the function work reliably every time from Workshop

  • The issue is specifically OR-Tools – all other code (ontology queries, scoring, data loading) works fine from Workshop

  • OR-Tools is installed via PyPI in the Libraries panel

  • Python 3.10 environment

  • Problem size is small (18 requirements, ~60 eligible candidates after filtering)

My theory is that OR-Tools’ native C++ bindings are crashing the function executor process. The fact that the error bypasses try/except suggests a segfault or other process-level crash rather than a Python exception.
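For context, the pure-Python greedy fallback mentioned above can be sketched roughly like this (names and data shapes are illustrative, not the actual implementation): sort requirement–candidate pairs by score, descending, and assign each requirement to the best candidate that is still free.

```python
def greedy_assign(scores):
    """Greedy one-to-one assignment.

    scores maps (requirement_id, candidate_id) -> float (higher is better).
    Returns a dict mapping requirement_id -> candidate_id.
    """
    assignments = {}
    used_candidates = set()
    # Walk pairs from highest score to lowest
    for (req, cand), _score in sorted(scores.items(), key=lambda kv: -kv[1]):
        if req not in assignments and cand not in used_candidates:
            assignments[req] = cand
            used_candidates.add(cand)
    return assignments
```

This is not optimal in general (a high-scoring early pick can block two better assignments later), which is why we wanted CP-SAT in the first place, but it never touches native code.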

Questions:

  1. Has anyone successfully used OR-Tools (or other packages with C++ native bindings) in published Python Functions called from Workshop?

  2. Is there a known limitation with native extensions in the Python function executor when called from Workshop?

  3. Is there a way to configure the function executor to isolate or restart between calls?

  4. Any alternative solver recommendations that work reliably in published Python Functions called from Workshop? The solver needs to handle:

    • Classic assignment problem (maximize total score, one-to-one matching)

    • Hard constraints (qualification gates, capacity caps)

    • Multi-objective optimization (quality vs. fairness/distribution balance)

    • Scale to 500+ requirements × 1,000+ candidates

    • Extensible for additional constraint types in the future

Any guidance appreciated.

Update: Found a workaround

We got OR-Tools CP-SAT working reliably in published Python Functions by running the solver in a subprocess using Python’s multiprocessing module.

The approach:

  • Define the solver function at module level (required for pickling)
  • Import ortools inside the subprocess function, not at module level
  • Call it via multiprocessing.Pool(1).apply()

import multiprocessing

def _run_solver(args):
    # Import OR-Tools inside the subprocess, not at module level
    from ortools.sat.python import cp_model

    scores = args["scores"]
    eligible = args["eligible"]

    model = cp_model.CpModel()
    # ... build model from scores/eligible, add constraints, set objective ...

    solver = cp_model.CpSolver()
    solver.parameters.max_time_in_seconds = 30
    solver.parameters.num_workers = 4
    status = solver.Solve(model)

    # Return plain dicts (must be serializable)
    return {"assignments": [...], "status": solver.StatusName(status)}

def run_solver(scores, eligible):
    solver_input = {"scores": scores, "eligible": eligible}
    with multiprocessing.Pool(1) as pool:
        result = pool.apply(_run_solver, (solver_input,))
    return result

Key details:

  • The subprocess isolates OR-Tools’ C++ runtime from the function executor process
  • If the native code crashes, it only kills the subprocess, not the executor
  • All data passed in and out must be JSON-serializable (no tuple dict keys; we convert them to string keys)
  • CP-SAT multithreading (num_workers = 4) works fine inside the subprocess
  • Successfully handling 2,800+ candidates and 166+ requirements with this approach
  • Had to increase memory and CPU limits via the function Configuration tab
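
The isolation claim is easy to verify without OR-Tools: simulate a native crash by killing the child with os._exit, which, like a segfault, bypasses Python exception handling entirely, and observe that the parent survives. A minimal sketch using multiprocessing.Process:

```python
import multiprocessing
import os

def _crashing_worker():
    # Simulate a native-code crash: the process exits abruptly,
    # with no Python exception for any try/except to catch
    os._exit(1)

def run_isolated():
    proc = multiprocessing.Process(target=_crashing_worker)
    proc.start()
    proc.join()
    # The parent is still alive; a nonzero exitcode signals the child died
    return proc.exitcode
```

In the real function we check the result for completeness and fall back to the greedy algorithm if the subprocess died.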

This went from consistently crashing to consistently working. The overhead of subprocess creation is negligible compared to the solver time.
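
On the serialization point above: the tuple-to-string key conversion is a one-liner each way. A sketch of the scheme (the delimiter choice is ours; anything guaranteed not to appear in your IDs works):

```python
def flatten_keys(scores, sep="|"):
    # (req_id, cand_id) -> "req_id|cand_id"
    return {sep.join(key): value for key, value in scores.items()}

def restore_keys(flat_scores, sep="|"):
    # "req_id|cand_id" -> (req_id, cand_id)
    return {tuple(key.split(sep)): value for key, value in flat_scores.items()}
```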

Hope this helps anyone else trying to use packages with native C++ bindings in published Python Functions.