
Striim Platform 5.0 documentation

Configuring and selecting AI engines

Striim AI uses large language models to govern sensitive data, generate vector embeddings, and run Striim CoPilot. Striim's default engine, the Striim AI engine, is based on the open-source Microsoft Presidio.

If you are already using OpenAI or Vertex AI, you can use either as an AI engine for Striim AI. See Using an OpenAI or Vertex AI engine.

You can set up multiple AI engines in Striim. The AI engines configured for Striim AI features are listed in the following locations:

  • In Vector Embeddings Generator, available from the Striim AI menu.

  • In Sensitive Data Governance, in the Settings tab.

Using the Striim AI engine

The following are required for the Striim AI engine (a command-line check is sketched after this list):

  • A minimum of 16GB of RAM.

  • Docker Desktop installed and running.

  • 40GB of free disk space available in Docker (in Docker Desktop, see Preferences > Resources > Disk image size).
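
If you prefer to check these resources from a terminal, the following is a minimal sketch using standard Docker commands (the reported values depend on your Docker Desktop configuration; it shows memory available to the daemon and the disk already consumed, not free space):

    # Memory available to the Docker daemon, in bytes (should correspond to at least 16GB)
    docker info --format 'Memory available to Docker: {{.MemTotal}} bytes'

    # Disk already used by existing images, containers, and volumes
    docker system df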

To install Striim AI:

  1. In a terminal or command prompt, pull the Docker image from Docker Hub.

    docker pull striim/striimai:latest
  2. Run the Docker image.

    docker run -d -p 9000:9000 striim/striimai:latest

    If the machine has GPUs, include the --gpus option.

    docker run -d -p 9000:9000 --gpus all striim/striimai:latest

    By default the image uses port 9000. If necessary, you can map it to a different host port (in this example, 9001):

    docker run -d -p 9001:9000 striim/striimai:latest
  3. If you didn't use the default port 9000, or the Docker container is not running on localhost, edit startUp.properties, change StriimAIServiceAddress to the correct IP address and port, and restart Striim. For example, for port 9001 on localhost:

    ############################
    # Striim AI Settings
    ############################
    StriimAIServiceAddress = localhost:9001

    You can also add multiple servers as follows:

    StriimAIServiceAddress = localhost:9000,localhost:9001
  4. Look in the Docker Desktop Dashboard and make sure the Striim AI container is running (it may take a few minutes to start). You can also verify from the command line, as sketched below. Once it is running, you can use Striim's AI features.
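
As an alternative to the Docker Desktop Dashboard, a quick command-line check might look like the following; it assumes you pulled striim/striimai:latest as above, and <container_id> is a placeholder for the ID that docker ps reports:

    # List running containers started from the Striim AI image
    docker ps --filter "ancestor=striim/striimai:latest"

    # Tail the container logs to confirm the service has finished starting
    docker logs --tail 50 <container_id>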

If you move Striim AI to a different IP address or port, enter the following command in the Striim console to update the Striim AI connection profile:

CREATE OR REPLACE CONNECTIONPROFILE System$PIIGlobal.striimAICP type striimAICP
  ( striimaiserver: 'localhost:9002'; );
Using an OpenAI or Vertex AI engine

To configure a custom OpenAI or Vertex AI engine:

  1. In the Striim AI menu select Sensitive Data Governance.

  2. Select the Settings tab. The default engine is Striim AI engine on my Striim server.

  3. Select Custom AI engine and select New.

  4. Choose OpenAI or Vertex AI as the AI model provider.

  5. Configure the following settings:

    • For OpenAI:

      • Object name

      • Model name

      • API key

    • For Vertex AI:

      • Object name

      • Model name

      • Project ID

      • Service account key

      • Region

      • Publisher

  6. Click Save.

    You can now select the engine from the Custom AI menu for sensitive data discovery.
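
Before saving, you may want to confirm outside Striim that the credentials you are about to enter are valid. The following is a minimal sketch run from a terminal (not in Striim); replace the environment variable and key file name, which are placeholders, with your own values:

    # OpenAI: list the models the API key can access (the model name you configure should appear in the response)
    curl -s https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY"

    # Vertex AI: confirm the service account key file is usable with the Google Cloud CLI
    gcloud auth activate-service-account --key-file=service-account-key.json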