
Built-in AI Provider Configuration

Learn how to configure and use Read Frog's built-in AI providers, including OpenAI, DeepSeek, Google, and other mainstream platforms.

What is Built-in AI Provider Configuration?

Read Frog has built-in support for multiple mainstream large language model providers, so you can choose the model best suited to translation and reading comprehension. Default request addresses, models, and other settings are pre-configured for each provider; you only need to add an API key to start using them.
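
To picture what "pre-configured" means, here is a hedged sketch of what a built-in provider preset conceptually bundles. The property names are illustrative only and do not reflect Read Frog's actual configuration schema:

```ts
// Illustrative only: the endpoint and default model ship with Read Frog,
// and the API key is the single field you supply. Property names here are
// hypothetical, not Read Frog's real configuration schema.
const deepseekPreset = {
  baseURL: "https://api.deepseek.com", // pre-configured request address
  defaultModel: "deepseek-chat",       // pre-configured default model
  apiKey: "<your API key>",            // the only value you need to fill in
};
```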

Provider Overview

| Provider Category | Representative Providers | Features |
| --- | --- | --- |
| Major AI Companies | OpenAI | Provides GPT-4o and other models |
| | Google Gemini | Google's flagship AI models with advanced reasoning capabilities |
| | Anthropic | Claude models known for safety and reasoning capabilities |
| | DeepSeek | Recommended for use in China |
| | xAI | Elon Musk's AI company, providing Grok models with real-time data access |
| Cloud Inference Platforms | Groq | Specialized hardware focused on ultra-fast LLM inference speed |
| | Together AI | Collaborative platform for running and fine-tuning open-source models |
| | DeepInfra | Cost-effective cloud inference for popular open-source models |
| | Fireworks | Production-ready inference platform for open-source models |
| | Cerebras | Ultra-fast AI inference, suitable for quick translation and text analysis |
| | Replicate | Access to diverse open-source models, suitable for specialized translation tasks |
| Enterprise Services | Amazon Bedrock | AWS managed service for enterprise-grade foundation models |
| | Cohere | Enterprise-grade AI with strong multilingual and RAG capabilities |
| Aggregation Platforms | OpenRouter | Unified interface for multiple LLM providers, pay-as-you-go |
| European AI | Mistral | European AI company focused on efficient multilingual models |
| Featured Platforms | Perplexity | Real-time knowledge-enhanced AI providing contextual translation and reading assistance |
| | Vercel | AI models optimized for web content translation and analysis |

Detailed Configuration Guide

Step 1: Access Settings Page

  1. Click the Read Frog extension icon in your browser toolbar
  2. Click the "Options" button in the popup window
  3. Alternatively, right-click the extension icon and select "Options"

Step 2: Select Provider

API Providers Page

Click the "Add Provider" button, select the "Built-in LLM Providers" section, and choose the provider you want to use.

Step 3: Get API Key

Add the API Key in the corresponding field. Here's how to get API keys for major providers:

| Provider | API Key Location | Special Notes |
| --- | --- | --- |
| OpenAI | platform.openai.com/api-keys | Requires a payment method |
| Google | aistudio.google.com/app/apikey | Has a free quota |
| DeepSeek | platform.deepseek.com/api_keys | Very cost-effective |
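
If you want to confirm a key works before adding it to Read Frog, a quick script outside the extension can help. The sketch below lists the models visible to an OpenAI-style key; it assumes Node 18+ (built-in `fetch`) and a key in the `OPENAI_API_KEY` environment variable. DeepSeek and most other OpenAI-compatible providers accept the same request shape with a different base URL.

```ts
// Quick sanity check for an OpenAI-style API key, run outside the extension
// (Node 18+, built-in fetch). For DeepSeek, pass "https://api.deepseek.com"
// as the base URL; other OpenAI-compatible providers work the same way.
const apiKey = process.env.OPENAI_API_KEY ?? "";

async function checkKey(baseURL = "https://api.openai.com/v1"): Promise<void> {
  const res = await fetch(`${baseURL}/models`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (res.ok) {
    const body = await res.json();
    console.log(`Key accepted, ${body.data.length} models visible`);
  } else {
    console.error(`Request failed: ${res.status} ${res.statusText}`);
  }
}

checkKey();
```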

Step 4: Configure Model Selection

Read Frog has two usage scenarios, translation and reading, and we recommend choosing a model suited to each one.

| Scenario | Description | Requirements |
| --- | --- | --- |
| Translation | Only used to translate text from one language to another | A faster, smaller model is recommended |
| Reading | Reads articles and performs analysis and summarization | Requires a model that supports structured output |

Read Frog's reading feature requires the AI model to output a JSON object in a specific format. Not all models support this; see the request sketch after the list below.

  • ✅ Full Support: OpenAI GPT series, Google Gemini series, Anthropic Claude series
  • ⚠️ Partial Support: DeepSeek, most open-source models (may need adjustment)
  • ❌ Not Supported: Older models or models that don't support JSON output
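
"Structured output" here simply means asking the model for a parseable JSON object instead of free text. The snippet below is a rough illustration using OpenAI's JSON mode (`response_format: { type: "json_object" }`); the prompt and the `summary`/`keyPoints` fields are invented for the example and are not Read Frog's actual reading schema.

```ts
// Rough illustration of "structured output": ask the model for a JSON object
// instead of free text, using OpenAI's JSON mode. The prompt and the
// summary/keyPoints fields are invented for this example; they are not
// Read Frog's actual reading schema.
async function summarizeAsJson(article: string) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      response_format: { type: "json_object" }, // model must return valid JSON
      messages: [
        {
          role: "system",
          content:
            'Return a JSON object with keys "summary" (string) and "keyPoints" (array of strings).',
        },
        { role: "user", content: article },
      ],
    }),
  });
  const data = await res.json();
  // In JSON mode the message content is a JSON string that parses directly.
  return JSON.parse(data.choices[0].message.content);
}
```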

Step 5: Verify Configuration

After configuration, test it:

  1. Click the "Test Connection" button
  2. The system sends a test request
  3. Check the returned result to confirm the configuration is correct
  4. If it fails, check the API key and your network connection
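
Conceptually, a connection test boils down to sending a tiny request and checking that a completion comes back. The sketch below shows that idea for an OpenAI-compatible endpoint; it is illustrative and not Read Frog's actual test code.

```ts
// A minimal sketch of what a connection test does conceptually: send a tiny
// chat request and confirm a completion comes back. Illustrative only; this
// is not Read Frog's actual test implementation.
async function testConnection(baseURL: string, apiKey: string, model: string): Promise<boolean> {
  try {
    const res = await fetch(`${baseURL}/chat/completions`, {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: "ping" }],
        max_tokens: 5,
      }),
    });
    if (!res.ok) {
      console.error(`Provider returned ${res.status}: check key, balance, or model name`);
      return false;
    }
    console.log("Connection OK");
    return true;
  } catch (err) {
    console.error("Network error:", err);
    return false;
  }
}

// Example call (DeepSeek's OpenAI-compatible endpoint):
// await testConnection("https://api.deepseek.com/v1", process.env.DEEPSEEK_API_KEY!, "deepseek-chat");
```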

Troubleshooting Guide

Common Configuration Issues

| Issue Type | Symptoms | Solutions | Verification Method |
| --- | --- | --- | --- |
| API Key Error | "Invalid API Key" | Re-copy the key and check for stray spaces | Test Connection button |
| Insufficient Balance | "Quota exceeded" | Top up the account or switch providers | Check the provider console |
| Network Issues | Connection timeout | Check the network or try a proxy | Ping the provider domain |
| Model Restrictions | Model unavailable | Check permissions or choose another model | Review the provider documentation |
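
For OpenAI-compatible providers, the HTTP status code of a failed request usually maps onto the issue types above. The helper below encodes that rough mapping (401 = bad key, 429 = quota or rate limit, 404 = unknown model); exact codes vary by provider, so treat it as a heuristic rather than a specification.

```ts
// Rough mapping from HTTP status codes to the issue types above, for
// OpenAI-compatible providers. Exact codes vary by provider, so treat this
// as a heuristic rather than a specification.
function diagnose(status: number): string {
  if (status === 401) return "API Key Error: re-copy the key and remove stray spaces";
  if (status === 429) return "Insufficient Balance or rate limit: check the provider console";
  if (status === 404) return "Model Restriction: the model name may be wrong or not enabled";
  if (status >= 500) return "Provider outage: retry later";
  return `Unexpected status ${status}: see the provider documentation`;
}

// Usage with the testConnection sketch above:
//   const res = await fetch(...);
//   if (!res.ok) console.log(diagnose(res.status));
```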

Summary

With this guide, you should be able to:

  1. ✅ Understand the features and applicable scenarios of each provider
  2. ✅ Successfully configure the required AI providers
  3. ✅ Choose appropriate models for translation and reading
  4. ✅ Solve common configuration and usage issues

If you encounter other issues during use, we recommend checking the provider's official documentation or seeking help in the Read Frog Discord community.