LocalStack Bedrock alternative

fakecloud is a free, open-source alternative to LocalStack's Ultimate-tier Bedrock: 111 Bedrock operations, a full control plane, and deterministic responses for tests. Not backed by Ollama. AGPL-3.0.

Install and run:

curl -fsSL https://raw.githubusercontent.com/faiscadev/fakecloud/main/install.sh | bash
fakecloud

Point any AWS SDK at http://localhost:4566. Same setup as LocalStack.
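You don't even need an SDK: Bedrock's runtime API is plain HTTP. A minimal TypeScript sketch, assuming fakecloud mirrors the real Bedrock REST paths (`POST /model/{modelId}/invoke`) and that dummy credentials mean no request signing is enforced locally:

```typescript
// Sketch: hit the Bedrock runtime endpoint directly with fetch (Node 18+).
// The URL shape follows the real Bedrock runtime REST API; skipping SigV4
// signing against the local endpoint is an assumption.
const ENDPOINT = "http://localhost:4566";

// Build the InvokeModel URL for a given model ID.
function invokeModelUrl(modelId: string): string {
  return `${ENDPOINT}/model/${encodeURIComponent(modelId)}/invoke`;
}

// POST a JSON body to the model; call this from your test or app code.
async function invokeModel(modelId: string, body: unknown): Promise<unknown> {
  const res = await fetch(invokeModelUrl(modelId), {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`InvokeModel failed: ${res.status}`);
  return res.json();
}
```

Usage would be `await invokeModel("anthropic.claude-3-haiku-20240307-v1:0", { prompt: "..." })` with fakecloud running.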

How LocalStack Bedrock compares

As of April 2026, per docs.localstack.cloud/aws/services/bedrock:

| | LocalStack Bedrock | fakecloud Bedrock |
| --- | --- | --- |
| Tier | Ultimate (top paid plan) | Free, AGPL-3.0 |
| Operations | 4: InvokeModel, Converse, ListFoundationModels, CreateModelInvocationJob | 111 across runtime + full control plane |
| Backend | Ollama (real local LLM) | Configurable responses per prompt rule |
| Determinism | No: real inference, different output each run | Yes: returns exactly what you configured |
| Speed | 1-30 s per call (Ollama CPU inference) | Milliseconds |
| Disk | GBs (Ollama model weights) | ~19 MB binary |
| Guardrails | Not supported | Full CRUD + versioning + ApplyGuardrail content evaluation |
| Custom models | Not supported | CreateModelCustomizationJob, imported models, model copy jobs |
| Async invoke / batch | Partial (CreateModelInvocationJob only) | Full flow, S3-backed |
| Prompt management | Not supported | Prompts + prompt routers |
| Fault injection | No | Yes (ThrottlingException, ValidationException, etc. on demand) |
| Call history introspection | No | /_fakecloud/bedrock/calls endpoint |
| GPU required | Usually | No |
| Persistence | "Not supported" | Yes (built-in state management) |

If you came to LocalStack Bedrock for testing, fakecloud covers every test case better. If you came for real local inference, that's Ollama's job, not ours; Ollama runs fine standalone.
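The call-history endpoint in the table above supports assertions like "my code invoked the model exactly once." A sketch, assuming /_fakecloud/bedrock/calls returns a JSON array and that each entry carries an `operation` field (that field name is an assumption for illustration, not documented here):

```typescript
// Sketch: assert against fakecloud's recorded Bedrock calls.
// Endpoint path is from the comparison table; the response shape
// (array of { operation: string }) is an assumed example.
type RecordedCall = { operation: string };

// Pure helper: count recorded calls for one operation name.
function countOps(calls: RecordedCall[], op: string): number {
  return calls.filter((c) => c.operation === op).length;
}

// Fetch the call history from a running fakecloud instance.
async function fetchCalls(endpoint = "http://localhost:4566"): Promise<RecordedCall[]> {
  const res = await fetch(`${endpoint}/_fakecloud/bedrock/calls`);
  return (await res.json()) as RecordedCall[];
}

// In a test: if (countOps(await fetchCalls(), "InvokeModel") !== 1) throw ...
```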

Migration from LocalStack

Env vars: the same ones work. fakecloud accepts AWS_ENDPOINT_URL, dummy credentials, any region.

# before (LocalStack)
export AWS_ENDPOINT_URL=http://localhost:4566
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test
export AWS_REGION=us-east-1
localstack start

# after (fakecloud)
export AWS_ENDPOINT_URL=http://localhost:4566
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test
export AWS_REGION=us-east-1
fakecloud

Model IDs: LocalStack's ollama.<id> convention does not exist here. Use real Bedrock model IDs (anthropic.claude-3-haiku-20240307-v1:0, amazon.nova-lite-v1:0, meta.llama3-8b-instruct-v1:0, etc.). fakecloud accepts any of them and returns whatever response your configured rule specifies.

Response config: LocalStack's BEDROCK_PREWARM and DEFAULT_BEDROCK_MODEL go away. Replace with explicit per-test fixtures:

import { FakeCloud } from "fakecloud";
const fc = new FakeCloud();

await fc.bedrock.setResponseRule({
  whenPromptContains: "summarize",
  respond: { completion: "deterministic summary" },
});

Tests that depend on real inference output: these were never good tests on LocalStack either (Ollama's Llama ≠ Claude). Rewrite assertions to check that your code handled the response shape correctly, not that the model "got the right answer."
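One way to do that is a shape guard. A hypothetical TypeScript sketch; the `completion` field mirrors the fixture from the setResponseRule example above, and the guard itself is illustrative, not part of fakecloud's API:

```typescript
// Illustrative shape check: assert your code got a response with the fields
// it consumes, instead of asserting on model wording.
type SummaryResponse = { completion: string };

function isSummaryResponse(x: unknown): x is SummaryResponse {
  return (
    typeof x === "object" &&
    x !== null &&
    typeof (x as Record<string, unknown>).completion === "string"
  );
}

// In a test, assert the shape, not the wording:
const body: unknown = JSON.parse('{"completion":"deterministic summary"}');
if (!isSummaryResponse(body)) throw new Error("unexpected Bedrock response shape");
```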

What fakecloud Bedrock actually runs

Full operation surface across 27 modules: