ModelManager


public class ModelManager
extends Object

java.lang.Object
   ↳ android.adservices.ondevicepersonalization.ModelManager


Handles model inference. Only TFLite model inference is currently supported. See IsolatedService.getModelManager(RequestToken) for how to obtain an instance.
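
A minimal sketch of how an instance is typically obtained, assuming a hypothetical IsolatedService subclass; the class name is illustrative, and only IsolatedService, IsolatedWorker, RequestToken, ModelManager, and getModelManager(RequestToken) are platform APIs here.

import android.adservices.ondevicepersonalization.IsolatedService;
import android.adservices.ondevicepersonalization.IsolatedWorker;
import android.adservices.ondevicepersonalization.ModelManager;
import android.adservices.ondevicepersonalization.RequestToken;

// Hypothetical service subclass; names other than the platform types are illustrative.
public class MyIsolatedService extends IsolatedService {
    @Override
    public IsolatedWorker onRequest(RequestToken requestToken) {
        // Obtain a ModelManager bound to this request.
        ModelManager modelManager = getModelManager(requestToken);
        // A real service would return a worker that uses modelManager in its
        // callbacks; an empty anonymous worker keeps this sketch compilable.
        return new IsolatedWorker() {};
    }
}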

Summary

Public methods

void run(InferenceInput input, Executor executor, OutcomeReceiver<InferenceOutput, Exception> receiver)

Run a single model inference.

Inherited methods

Public methods

run

public void run (InferenceInput input, 
                Executor executor, 
                OutcomeReceiver<InferenceOutput, Exception> receiver)

Runs a single model inference. Only TFLite model inference is currently supported.
This method may take several seconds to complete, so it should only be called from a worker thread.

Parameters
input InferenceInput: contains all the information needed for a run of model inference. This value cannot be null.

executor Executor: the Executor on which to invoke the callback. This value cannot be null. Callback and listener events are dispatched through this Executor, providing an easy way to control which thread is used. To dispatch events through the main thread of your application, you can use Context.getMainExecutor(). Otherwise, provide an Executor that dispatches to an appropriate thread.

receiver OutcomeReceiver: returns an InferenceOutput containing the model inference result on success, or an Exception on failure. This value cannot be null.
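
A minimal sketch of invoking run(), assuming an InferenceInput has already been built for the target model; the InferenceHelper class, the runInference method, and the executor choice are illustrative, while ModelManager.run, InferenceInput, InferenceOutput, and OutcomeReceiver are the platform APIs described above.

import android.adservices.ondevicepersonalization.InferenceInput;
import android.adservices.ondevicepersonalization.InferenceOutput;
import android.adservices.ondevicepersonalization.ModelManager;
import android.os.OutcomeReceiver;

import java.util.concurrent.Executor;
import java.util.concurrent.Executors;

// Hypothetical helper; only the platform types and ModelManager.run are from the API.
final class InferenceHelper {
    // Executor on which the OutcomeReceiver callbacks are dispatched.
    private static final Executor CALLBACK_EXECUTOR = Executors.newSingleThreadExecutor();

    static void runInference(ModelManager modelManager, InferenceInput input) {
        modelManager.run(
                input,
                CALLBACK_EXECUTOR,
                new OutcomeReceiver<InferenceOutput, Exception>() {
                    @Override
                    public void onResult(InferenceOutput output) {
                        // Inference succeeded; read the model outputs here,
                        // e.g. output.getDataOutputs().
                    }

                    @Override
                    public void onError(Exception e) {
                        // Inference failed; handle or log the error.
                    }
                });
    }

    private InferenceHelper() {}
}

Because run() can take several seconds, the call itself should be made from a worker thread; the receiver is then invoked on the supplied executor.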