Oracle Analytics Cloud and Server

OML Model Monitoring equivalent in OAS?

Hi there,

In OAS there is functionality to create machine learning models using the built-in algorithms, but as I understand it there is currently no way to monitor model performance once the model is productionized. It would be a really great feature to be able to detect drift (concept or data drift) and act on it within OAS, e.g. as a Data Flow or similar.

I'm aware this functionality exists within Oracle Machine Learning Services (Model Monitoring) -> Model Monitoring with Oracle Machine Learning Services, but that requires the Autonomous Database.

Are there any plans for something similar to be made available within OAS? Or is there a way to achieve this with a custom script?

Thanks

Phil

Answers

  • RajeshPolavarapu-Oracle Rank 6 - Analytics Lead
    edited Mar 6, 2024 3:02PM

    Hi @User_CKFJD

    This is an enhancement request for the product. Please post it in the Analytics Idea Lab community below:

    https://community.oracle.com/products/oracleanalytics/categories/idealab-oracle-analytics-cloud-server


    Here is the document on how to create an Idea Lab request for this:

    How To Create An Enhancement Request In Idea Lab For Oracle Analytics (Doc ID 2662737.1)

  • Hi Phil (@User_CKFJD),

    Not an exact equivalent of OML Model Monitoring, but OAS does have these related concepts:

    Assess a Predictive Model's Quality

    and

    Evaluate Machine Learning Models Using Lift and Gain Charts
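
    For context, a gain chart measures what share of all positives a model captures in its top-scored records, and lift compares that to random selection. Just as a generic illustration of that computation (not OAS's implementation), a short sketch in Python:

    ```python
    import numpy as np

    def gain_and_lift(y_true, y_score, deciles=10):
        """Rank records by predicted score, then compute per-decile cumulative
        gain (share of all positives captured) and lift (gain vs. random)."""
        order = np.argsort(y_score)[::-1]            # highest scores first
        y_sorted = np.asarray(y_true)[order]
        total_pos = y_sorted.sum()
        slices = np.array_split(y_sorted, deciles)   # ~equal-size deciles
        cum_pos = np.cumsum([s.sum() for s in slices])
        gain = cum_pos / total_pos                   # fraction of positives found
        depth = np.arange(1, deciles + 1) / deciles  # fraction of records scored
        lift = gain / depth                          # >1 beats random selection
        return gain, lift
    ```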


    @Philippe Lions-Oracle or @Lalitha Venkataraman-Oracle - may have some additional comments.

  • Thanks @SteveF-Oracle .

    Hi @User_CKFJD, OAS and OAC both support two aspects of designing and monitoring ML models (OAS may lag behind some recent OAC capabilities, but it catches up about once a year):

    • OA native ML models: these have no dependency on OML (ADW) and provide metadata/quality metrics on created models right away. See the Inspect tab, or enable the lift option when applying classification models.
    • OA integration with OML: you can design models directly in OML and consume them in OA, or even use the AutoML node in OA to trigger the OML AutoML process behind the scenes. OML models offer the same metadata/quality/lift information to OA users as native OA ML models, but in addition they provide explainability outputs when applied in OA: a per-record explanation of why a given record was predicted a certain way.

    For drift, we don't have a native feature in OA for this yet. Your best option for monitoring model drift is to leverage the OML capability directly, and possibly visualize the results in OA if OA users need them. Let us know if this doesn't answer your question.
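
    In the meantime, one generic way to approximate data/score drift outside the product is the Population Stability Index (PSI) over a model's score column. The sketch below is a plain Python illustration, not an OA or OML API; the 10 bins and the 0.2 alert cutoff are common conventions, not product defaults:

    ```python
    import numpy as np

    def psi(baseline_scores, current_scores, bins=10):
        """Population Stability Index between the score distribution a model
        was validated on and the one seen in production. By convention,
        PSI > 0.2 is read as significant drift."""
        edges = np.percentile(baseline_scores, np.linspace(0, 100, bins + 1))
        # Clip both samples into the baseline range so every value lands in a bin
        base = np.clip(baseline_scores, edges[0], edges[-1])
        curr = np.clip(current_scores, edges[0], edges[-1])
        base_pct = np.histogram(base, edges)[0] / len(base)
        curr_pct = np.histogram(curr, edges)[0] / len(curr)
        # Guard against empty bins before taking the log
        base_pct = np.clip(base_pct, 1e-6, None)
        curr_pct = np.clip(curr_pct, 1e-6, None)
        return float(np.sum((curr_pct - base_pct) * np.log(curr_pct / base_pct)))
    ```

    Computed on a schedule, values like these could then be loaded into OA as a dataset and visualized, per the suggestion above.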

  • philipgodfrey Rank 6 - Analytics Lead

    Thanks so much for the swift reply @SteveF-Oracle and @Philippe Lions-Oracle - and for the helpful information, I really appreciate it.

    Having the ability to assess the quality of a predictive model within OAS will certainly be helpful.

    What I was particularly looking for is:

    a) whether there is a way to identify that a machine learning model is underperforming in production

    and

    b) if possible, a way to alert an end user in that case, similar to the alert generated by Model Monitoring in OML (ADW).

    But within OAS, if I'm not mistaken, you could do something quite similar (a rough script sketch follows the list below):

    • Record the baseline accuracy of your model on the data it was trained on
    • Regularly check the model's score over time as newer data is fed in
    • If the relevant score drops below a chosen threshold, re-train the model on more recent data and repeat the process
    • Then regularly inspect the quality of the newer model as new data is fed in, and so on
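
    As a rough illustration of that loop in script form (assuming a scikit-learn style model with fit/score, a hypothetical load_recent_data() helper returning the latest data window, and an arbitrary 5% tolerance; none of this is an OAS API):

    ```python
    from sklearn.model_selection import train_test_split

    TOLERANCE = 0.05  # acceptable drop from the baseline score (example value)

    def check_and_retrain(model, baseline_score, load_recent_data):
        """Score the model on fresh data; retrain and re-baseline if it degrades."""
        X_new, y_new = load_recent_data()          # hypothetical data feed
        current_score = model.score(X_new, y_new)  # e.g. accuracy for a classifier

        if current_score < baseline_score - TOLERANCE:
            # Score dropped below the threshold: this is where an alert to the
            # end user would be raised, then the model re-trained on recent data.
            X_tr, X_val, y_tr, y_val = train_test_split(X_new, y_new, test_size=0.2)
            model.fit(X_tr, y_tr)
            baseline_score = model.score(X_val, y_val)  # new baseline
        return model, baseline_score
    ```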

    I'm not sure if there is a way to flag this to an end-user, but even just having the ability to regularly assess the quality of the model within OAS would be perfect.

    Hopefully that makes sense.

    Thanks

    Phil

  • Thanks @User_CKFJD for that feedback. We are interested in following up with you on this, as it is obviously a very legitimate enhancement request. Could you email me directly at philippe.lions@oracle.com? We could use a few more of your pointers so the team and I can start thinking about how to better address it. Thanks in advance.

    Philippe