Using TF-DF library with Inference Sidecar #98

@avttiwari

Description

Aim: load a custom model that applies an NN and a TF-DF GBDT model one after the other.

When I save a custom model that stacks an NN model and a GBDT model and then try to register it on the inference sidecar, I get the error: "Status: fail: NOT_FOUND: Op type not registered 'SimpleMLCreateModelResource' in binary running on 3bdaf418bc27".
When a YDF model is saved, some YDF-specific operations are written into the model graph; one of them is SimpleMLCreateModelResource, which appears in the error trace. The problem is that the TensorFlow runtime in the inference sidecar does not contain the kernels for those operations; it only has a stripped-down subset of vanilla TF operations.
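For anyone hitting the same trace: you can confirm that the saved graph references the YDF op without loading TensorFlow at all. Op type names are stored as plain strings inside the serialized GraphDef, so a byte search over `saved_model.pb` is a crude but useful first-pass signal. A minimal stdlib-only sketch (the model path in the usage comment is hypothetical):

```python
def graph_mentions_op(saved_model_path: str, op_name: str) -> bool:
    """Crude first-pass check: scan the serialized SavedModel for an op
    type name. Node op types appear as plain strings in the GraphDef
    protobuf, so a byte search indicates whether the graph references
    the op at all (this is a heuristic, not a proto parse)."""
    with open(saved_model_path, "rb") as f:
        return op_name.encode("utf-8") in f.read()

# Hypothetical usage:
# graph_mentions_op("model/saved_model.pb", "SimpleMLCreateModelResource")
```

If this returns True for `SimpleMLCreateModelResource`, the graph genuinely needs the YDF kernels and the fix has to happen on the serving binary's side.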

One solution is to include the right inference kernels and ops from YDF as dependencies in the inference service's Bazel files and rebuild the inference sidecar. The ops are registered statically when the TF binary starts, so there is no need to change the inference sidecar's source code as long as the dependencies link correctly. TF-DF is supported natively in TF Serving 2.11+. Would using one of those TF Serving versions solve the issue?
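A rough sketch of what the Bazel-side fix would look like. The repository name and target paths below are illustrative only and need to be checked against the actual tensorflow_decision_forests workspace; this is not a verified build file:

```starlark
# BUILD (sketch): link the TF-DF inference kernels and op registrations
# into the sidecar binary so the SimpleML* ops register at startup.
cc_binary(
    name = "inference_sidecar",
    deps = [
        # ... existing sidecar deps ...
        # Illustrative TF-DF targets; verify the real labels in the
        # tensorflow_decision_forests repo before depending on them.
        "@org_tensorflow_decision_forests//tensorflow_decision_forests/tensorflow/ops/inference:kernels",
        "@org_tensorflow_decision_forests//tensorflow_decision_forests/tensorflow/ops/inference:op",
    ],
)
```

Because TF op registration happens via static initializers, merely linking these targets into the binary should be enough for the runtime to find the kernels, with no sidecar code changes.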

Are there any alternatives to this?
