Migration Guide
Migrate to Skimly v2.0+ for production-grade streaming, tool calling, and Anthropic-compatible interfaces. Switch from provider SDKs or upgrade from Skimly v1.x in minutes.
v2.0 Features: Streaming, tool calling, TypeScript types, Anthropic-compatible interfaces.
Provider flexibility: Switch between OpenAI and Anthropic with one line change.
Cost savings: Smart Truncation + blob storage reduces token usage by 35-85%.
Skimly v1.x → v2.0+
Already using Skimly v1.x? Upgrade to v2.0+ for streaming, enhanced TypeScript support, and Anthropic-compatible interfaces:
v1.x (Legacy)
```javascript
// Node.js v1
import { fromEnv } from '@skimly/sdk'

const client = fromEnv()
await client.chat({
  provider: 'openai',
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: 'Hello!' }]
})
```

```python
# Python v1
from skimly import Skimly

client = Skimly.from_env()
client.chat({
    'provider': 'anthropic',
    'model': 'claude-3-sonnet',
    'messages': [{'role': 'user', 'content': 'Hello!'}]
})
```
v2.0+ (Current)
```javascript
// Node.js v2.0+
import { SkimlyClient } from '@skimly/sdk'

const client = new SkimlyClient({ apiKey: process.env.SKIMLY_KEY! })
await client.messages.create({
  provider: 'openai',
  model: 'gpt-4o-mini',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello!' }]
})
```

```python
# Python v2.0+
import asyncio

from skimly import AsyncSkimlyClient

async def main():
    async with AsyncSkimlyClient.from_env() as client:
        await client.messages.create({
            'provider': 'anthropic',
            'model': 'claude-3-sonnet',
            'max_tokens': 1024,
            'messages': [{'role': 'user', 'content': 'Hello!'}]
        })

asyncio.run(main())
```
🚀 v2.0 New Features
- Streaming: `client.messages.stream()` with real-time SSE
- Anthropic Interface: `messages.create()` matching the official SDK
- Smart Truncation: up to 99% savings on verbose AI responses
- Full TypeScript: comprehensive types and error handling
- Async Python: `AsyncSkimlyClient` with context managers
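As a sketch of the streaming workflow, the helper below accumulates text from an async-iterable event stream such as the one `client.messages.stream()` returns. The `content_block_delta` event shape is an assumption modeled on Anthropic-style SSE events, not a confirmed Skimly type:

```typescript
// Hypothetical event shape, assuming Skimly mirrors Anthropic-style SSE events.
type StreamEvent = { type: string; delta?: { text?: string } }

// Accumulate streamed text deltas from any async iterable of events,
// e.g. the iterator returned by client.messages.stream(...).
async function collectText(events: AsyncIterable<StreamEvent>): Promise<string> {
  let out = ""
  for await (const ev of events) {
    if (ev.type === "content_block_delta" && ev.delta?.text) {
      out += ev.delta.text
    }
  }
  return out
}
```

Because the helper only depends on `AsyncIterable`, you can unit-test your stream handling with a fake generator before pointing it at the live API.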
Provider SDKs → Skimly
Migration Overview
The migration involves three main changes:
1. API Key: replace `OPENAI_API_KEY` or `ANTHROPIC_API_KEY` with `SKIMLY_KEY`
2. SDK Import: replace provider SDKs with `@skimly/sdk` (Node) or `skimly` (Python)
3. API Calls: replace `.chat.completions.create()` (OpenAI) or `.messages.create()` (Anthropic) with `client.messages.create()`
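The three changes above can be sketched as a pure payload transform. The request interfaces here are illustrative assumptions trimmed to a few fields, not the SDKs' actual types:

```typescript
// Minimal OpenAI-style request shape (only the fields this sketch handles).
interface OpenAIChatRequest {
  model: string
  messages: { role: string; content: string }[]
  max_tokens?: number
}

// Hypothetical Skimly request shape: same messages, plus an explicit
// provider field and a required max_tokens.
interface SkimlyMessagesRequest extends OpenAIChatRequest {
  provider: "openai" | "anthropic"
  max_tokens: number
}

// Rewrite an openai.chat.completions.create() payload into a
// client.messages.create() payload. Defaults max_tokens to 1024 when
// the original request did not set one.
function toSkimlyRequest(
  req: OpenAIChatRequest,
  provider: "openai" | "anthropic" = "openai"
): SkimlyMessagesRequest {
  return { ...req, provider, max_tokens: req.max_tokens ?? 1024 }
}
```

Spreading the original request keeps any extra fields intact, so only the two new fields differ between the before and after payloads.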
Node.js Migration
OpenAI → Skimly
Replace direct OpenAI SDK calls with Skimly's unified interface:
Before (OpenAI SDK)
```javascript
import OpenAI from "openai"

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })

const resp = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Summarize this email" }]
})
```
After (Skimly SDK)
```javascript
import { SkimlyClient } from "@skimly/sdk"

const client = new SkimlyClient({ apiKey: process.env.SKIMLY_KEY! })

const resp = await client.messages.create({
  provider: "openai",
  model: "gpt-4o-mini",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Summarize this email" }]
})
```
Anthropic → Skimly
Replace direct Anthropic SDK calls with Skimly's unified interface:
Before (Anthropic SDK)
```javascript
import Anthropic from "@anthropic-ai/sdk"

const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY })

// max_tokens is required by Anthropic's Messages API
const resp = await anthropic.messages.create({
  model: "claude-3-sonnet-20240229",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Summarize this email" }]
})
```
After (Skimly SDK)
```javascript
import { SkimlyClient } from "@skimly/sdk"

const client = new SkimlyClient({ apiKey: process.env.SKIMLY_KEY! })

const resp = await client.messages.create({
  provider: "anthropic",
  model: "claude-3-5-sonnet-20241022",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Summarize this email" }]
})
```
Python Migration
OpenAI → Skimly
Replace direct OpenAI SDK calls with Skimly's unified interface:
Before (OpenAI SDK)
```python
import os

from openai import OpenAI

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize this email"}]
)
```
After (Skimly SDK)
```python
import asyncio

from skimly import AsyncSkimlyClient

async def main():
    async with AsyncSkimlyClient.from_env() as client:
        resp = await client.messages.create({
            'provider': 'openai',
            'model': 'gpt-4o-mini',
            'max_tokens': 1024,
            'messages': [{'role': 'user', 'content': 'Summarize this email'}]
        })

asyncio.run(main())
```
Anthropic → Skimly
Replace direct Anthropic SDK calls with Skimly's unified interface:
Before (Anthropic SDK)
```python
import os

import anthropic

client = anthropic.Anthropic(api_key=os.getenv("ANTHROPIC_API_KEY"))

# max_tokens is required by Anthropic's Messages API
resp = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Summarize this email"}]
)
```
After (Skimly SDK)
```python
import asyncio

from skimly import AsyncSkimlyClient

async def main():
    async with AsyncSkimlyClient.from_env() as client:
        resp = await client.messages.create({
            'provider': 'anthropic',
            'model': 'claude-3-5-sonnet-20241022',
            'max_tokens': 1024,
            'messages': [{'role': 'user', 'content': 'Summarize this email'}]
        })

asyncio.run(main())
```
What Changed
API Key: replace `OPENAI_API_KEY` or `ANTHROPIC_API_KEY` with `SKIMLY_KEY`
API Calls: use `client.messages.create()` (Anthropic-style interface) for all providers
Provider Flexibility: switch between OpenAI and Anthropic with a one-line change: `provider: "openai"` → `provider: "anthropic"`
Blob Storage: upload large content once and reference it in chat requests to save 60-90% on tokens
Migration Checklist
1. Install Skimly SDK v2.0+: `npm install @skimly/sdk@^2.0.0` (Node) or `pip install "skimly>=2.0.0"` (Python)
2. Update Environment Variables: replace `OPENAI_API_KEY` or `ANTHROPIC_API_KEY` with `SKIMLY_KEY`
3. Update SDK Imports: replace provider SDK imports with Skimly SDK imports
4. Update API Calls: use `client.messages.create()` (Anthropic-style interface) and add `max_tokens`
5. Add Provider Field: add `provider: "openai"` or `provider: "anthropic"` to all chat calls
6. Test Migration: verify responses match your original provider calls
7. (Optional) Remove Old Dependencies: uninstall `openai` and/or `@anthropic-ai/sdk` if no longer needed
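For the "Test Migration" step, a small helper can compare the assistant text from a pre-migration response against the post-migration one. This sketch assumes OpenAI-style responses carry text in `choices[0].message.content` and Anthropic-style responses carry it in a `content` block list; the trimmed interfaces are illustrative, not the SDKs' exported types:

```typescript
// Response shapes trimmed to the text fields this check needs.
interface OpenAIStyleResponse { choices: { message: { content: string } }[] }
interface AnthropicStyleResponse { content: { type: string; text?: string }[] }

// Pull the assistant text out of either response shape.
function textOf(resp: OpenAIStyleResponse | AnthropicStyleResponse): string {
  if ("choices" in resp) return resp.choices[0]?.message.content ?? ""
  return resp.content.find((b) => b.type === "text")?.text ?? ""
}

// Confirm the migrated call produced the same assistant text as the
// original provider call. Exact match only makes sense on deterministic
// prompts; real model output can vary run to run.
function sameText(
  a: OpenAIStyleResponse | AnthropicStyleResponse,
  b: OpenAIStyleResponse | AnthropicStyleResponse
): boolean {
  return textOf(a) === textOf(b)
}
```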
Bonus: Blob Storage for Big Context
Once migrated, you can start using Skimly's blob storage to dramatically reduce token usage for large, rarely-changing content like policies, documentation, or conversation threads.
Upload Content Once
```javascript
// Node.js
const blob = await client.createBlob(
  '...very long text content...',
  'text/plain'
)
console.log(blob.blob_id) // -> "b_abc123..."
```

```python
# Python
blob_id = client.create_blob(
    "...very long text content...",
    "text/plain"
)
print(blob_id)  # -> "b_abc123..."
```
Reference in Chat
```javascript
// Node.js
const resp = await client.messages.create({
  provider: 'openai',
  model: 'gpt-4o-mini',
  max_tokens: 1024,
  messages: [{
    role: 'user',
    content: [
      { type: 'text', text: 'Summarize this:' },
      { type: 'pointer', blob_id: blob.blob_id }
    ]
  }]
})
```

```python
# Python
resp = client.messages.create(
    provider="anthropic",
    model="claude-3-sonnet",
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Summarize this:"},
            {"type": "pointer", "blob_id": blob_id}
        ]
    }]
)
```
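The mixed text-plus-pointer message shown above can be built with a small helper. The `ContentBlock` type here is an assumption inferred from the examples, not an exported SDK type:

```typescript
// Content block shapes from the examples above: plain text plus a
// blob pointer that Skimly expands server-side.
type ContentBlock =
  | { type: "text"; text: string }
  | { type: "pointer"; blob_id: string }

// Build a user message that references a previously uploaded blob by id,
// so the large content is not resent on every request.
function userMessageWithBlob(
  prompt: string,
  blobId: string
): { role: "user"; content: ContentBlock[] } {
  return {
    role: "user",
    content: [
      { type: "text", text: prompt },
      { type: "pointer", blob_id: blobId },
    ],
  }
}
```

Centralizing the block construction keeps every call site consistent if the pointer shape ever changes.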
Follow the checklist above step by step. Migration to v2.0+ typically takes 15-30 minutes and unlocks streaming, enhanced TypeScript support, and up to 99% token savings with Smart Truncation.
Next Steps
Quickstart Guide
Learn how to use Skimly's features like blob storage and provider switching.
View quickstart →
API Reference
Complete endpoint documentation with all available options and parameters.
View API docs →