Early Access
Bring Your Own Model (BYOM)
Note: This feature is in early access and not yet generally available. Early access gives select customers the opportunity to try new features, share feedback, and help shape their final release. Contact your account team if you're interested in participating in Maro's early access program.

Maro supports the option for users to leverage models of their choice to perform analysis of social engineering and gen AI usage. Some reasons users may want to use this feature:

- The device running Maro does not meet the device specifications necessary to run SLMs locally.
- The organization has enterprise access to a sanctioned LLM provider.
- The user wants to leverage a locally running LLM (for instance, via Ollama) instead of Maro's language model.

Options

Gemini Nano (default)

OpenAI
You will need to provide:
- Your OpenAI API key
- The OpenAI model to use

Anthropic
You will need to provide:
- Your Anthropic API key
- The Claude model to use for detections
- The maximum tokens to generate in the response

Ollama
You will need to provide:
- The base URL of your Ollama instance
- The model to use for inference
- The request timeout in seconds

Instructions

1. Open the Maro side panel.
2. Click your user portrait in the top right corner.
3. Select Settings.
4. Choose one of the language model providers in the drop-down box.
5. Fill out the details described above.
6. Press Save.
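Before selecting Ollama in the settings above, it can help to confirm that the instance at your base URL is reachable and that the model name you plan to enter is actually installed. A minimal sketch, assuming Ollama's standard REST API (the /api/tags endpoint and the default port 11434 are Ollama conventions, not Maro settings):

```python
# Sketch: check an Ollama instance before entering its details in Maro.
# Assumes Ollama's public REST API; nothing here is Maro-specific.
import json
import urllib.request


def tags_url(base_url: str) -> str:
    """Build the URL that lists the models installed on an Ollama instance."""
    return base_url.rstrip("/") + "/api/tags"


def installed_models(base_url: str, timeout: int = 10) -> list[str]:
    """Return the names of models the Ollama instance can serve.

    base_url is the same value you would enter in Maro's settings,
    e.g. "http://localhost:11434" for a default local install.
    """
    with urllib.request.urlopen(tags_url(base_url), timeout=timeout) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]


# Example (requires a running Ollama instance):
#   installed_models("http://localhost:11434")
```

If the model you intend to use is not in the returned list, pull it with the Ollama CLI first; otherwise requests made through Maro will fail regardless of the timeout you configure.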