Overview
Boiler Room offers two primary model classes:

- BlackBoxModel: For interacting with API-based models like OpenAI GPT models, Anthropic Claude models, and Together.ai hosted models.
- WhiteBoxModel: For loading and interacting with locally-hosted models using Hugging Face Transformers.
Supported Models
Boiler Room supports a wide range of models, including:

OpenAI Models
- GPT-4o, GPT-4o-mini, GPT-4-turbo
- GPT-4.5-preview-2025-02-27
- GPT-4-0125-preview, GPT-4-0613
- GPT-3.5-turbo
- o1, o1-mini, o3-mini, o3-mini-2025-01-31
- Text embedding models (text-embedding-3-small, text-embedding-3-large, text-embedding-ada-002)
Anthropic Models
- claude-3-7-sonnet-20250219
- claude-3-5-sonnet-20241022, claude-3-5-sonnet-20240620
- claude-3-5-haiku-20241022
- claude-3-sonnet-20240229
Together.ai Hosted Models
- meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo
- meta-llama/Llama-3.3-70B-Instruct-Turbo
- meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo
- mistralai/Mistral-Small-24B-Instruct-2501
- mistralai/Mixtral-8x22B-Instruct-v0.1
- deepseek-ai/DeepSeek-R1, deepseek-ai/DeepSeek-R1-Distill-Llama-70B
- databricks/dbrx-instruct
- Qwen/Qwen2.5-7B-Instruct-Turbo
- google/gemma-2-27b-it
Basic Usage
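The exact constructor and method names of BlackBoxModel aren't shown on this page, so the sketch below is a stand-in: a model name passed at construction and a single `query` method, with the network call replaced by a canned response so the snippet runs offline.

```python
# Minimal stand-in for BlackBoxModel. The real class wraps a provider
# API client; here the API call is replaced by a canned response so
# this runs offline. All names are illustrative, not the actual
# Boiler Room API.
class BlackBoxModel:
    def __init__(self, model_name: str):
        self.model_name = model_name

    def query(self, prompt: str) -> str:
        # A real implementation would forward the prompt to the
        # provider's API and return the completion text.
        return f"[{self.model_name}] echo: {prompt}"

model = BlackBoxModel("gpt-4o-mini")
print(model.query("Hello!"))
```

With a real API client, `query` would send the prompt to the provider identified by `model_name` and return the generated text.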
Parallel Querying
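API calls are I/O-bound, so a thread pool is a natural way to overlap many requests. The sketch below assumes a hypothetical single-call function `query_model` (a stub here, standing in for a real API call) and fans prompts out with `concurrent.futures`:

```python
from concurrent.futures import ThreadPoolExecutor

# Stub for a single blocking model call; a real version would hit
# the provider's API. The name and signature are illustrative only.
def query_model(prompt: str) -> str:
    return f"response to: {prompt}"

prompts = ["prompt A", "prompt B", "prompt C"]

# Threads suit API-bound workloads: requests overlap while each
# thread waits on the network. pool.map preserves input order.
with ThreadPoolExecutor(max_workers=4) as pool:
    responses = list(pool.map(query_model, prompts))

print(responses)
```

Because `pool.map` yields results in input order, responses line up with their prompts even though the underlying calls complete out of order.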
For batch processing or efficiency, you can query models in parallel.

Working with Locally-Hosted Models
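What white-box access buys you is direct visibility into parameters and gradients, which no remote API exposes. As a dependency-free illustration (this is not the WhiteBoxModel API; in practice the model would be a Hugging Face Transformers checkpoint and gradients would come from autograd), the toy below holds its two weights locally and differentiates a loss through them by finite differences:

```python
# Toy stand-in for white-box access: because we hold the weights
# locally, we can compute gradients through the model, something
# black-box API access never allows.
def loss(w):
    # Squared error of a tiny linear model on one data point.
    x, target = 3.0, 7.0
    pred = w[0] * x + w[1]
    return (pred - target) ** 2

def grad(f, w, eps=1e-6):
    # Forward-difference gradient estimate, one parameter at a time.
    g = []
    for i in range(len(w)):
        bumped = list(w)
        bumped[i] += eps
        g.append((f(bumped) - f(w)) / eps)
    return g

weights = [1.0, 0.5]
print(grad(loss, weights))  # roughly [-21.0, -7.0]
```

With a real locally-hosted model, the same idea scales up: autograd replaces finite differences, and the gradient flows into token embeddings or prompts for gradient-based techniques.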
For gradient-based techniques or direct model access, use the WhiteBoxModel.

Error Handling
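The specific exceptions Boiler Room retries on aren't listed here, so the sketch below shows the general pattern — retry with exponential backoff around a flaky call — using a simulated transient `ConnectionError` in place of a real provider error:

```python
import time

# Simulated transient failure: the first two calls raise, the third
# succeeds. Real code would catch the provider SDK's rate-limit and
# timeout errors instead of ConnectionError.
attempts = {"n": 0}

def flaky_query(prompt: str) -> str:
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("simulated transient API failure")
    return f"ok: {prompt}"

def query_with_retries(prompt: str, retries: int = 5, base_delay: float = 0.01) -> str:
    for attempt in range(retries):
        try:
            return flaky_query(prompt)
        except ConnectionError:
            if attempt == retries - 1:
                raise  # Out of retries: surface the error to the caller.
            # Exponential backoff between attempts.
            time.sleep(base_delay * (2 ** attempt))

print(query_with_retries("hello"))
```

Exponential backoff keeps retries cheap for momentary blips while spacing out requests when the provider is rate-limiting.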
The module includes robust error handling for API failures.

Next Steps
- Learn about the BlackBoxModel for API-based interactions
- Explore the WhiteBoxModel for local model interactions
- See how to use these models with Adversarial Candidate Generators

