# Local S3 for integration tests
Run S3 locally for integration tests with fakecloud. 107 S3 operations, versioning, lifecycle, notifications, multipart, replication. Any AWS SDK, no Docker required, free.
Need a local S3 for integration tests? Use fakecloud.

```shell
curl -fsSL https://raw.githubusercontent.com/faiscadev/fakecloud/main/install.sh | bash
fakecloud
```

Point your AWS SDK at `http://localhost:4566`.
## Why fakecloud for S3
- 107 S3 operations at 100% conformance — buckets, objects, multipart uploads, versioning, lifecycle, CORS, notifications, object lock, replication, website hosting.
- Real cross-service triggers. S3 notifications fire SNS, SQS, and Lambda for real. `PutObject` -> Lambda invocation happens end-to-end.
- Any AWS SDK in any language. Real HTTP server on port 4566 — Python boto3, Node aws-sdk, Go aws-sdk-go-v2, Java, Kotlin, Rust, PHP, and the AWS CLI all work.
- No Docker required for S3 itself (the binary runs the storage engine in-process).
- Validated against AWS's own Smithy models on every commit. CI also runs the upstream hashicorp/terraform-provider-aws `TestAcc*` suites against fakecloud.
- No account, no auth token, no paid tier. AGPL-3.0.
## Quick examples
### Python (boto3)
```python
import boto3

s3 = boto3.client(
    's3',
    endpoint_url='http://localhost:4566',
    aws_access_key_id='test',
    aws_secret_access_key='test',
    region_name='us-east-1',
)
s3.create_bucket(Bucket='uploads')
s3.put_object(Bucket='uploads', Key='hello.txt', Body=b'hello world')
obj = s3.get_object(Bucket='uploads', Key='hello.txt')
print(obj['Body'].read())
```

### Node.js (AWS SDK v3)
```js
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

// The SDK reads these when the client is constructed.
process.env.AWS_ENDPOINT_URL = 'http://localhost:4566';
process.env.AWS_ACCESS_KEY_ID = 'test';
process.env.AWS_SECRET_ACCESS_KEY = 'test';
process.env.AWS_REGION = 'us-east-1';

const s3 = new S3Client({ forcePathStyle: true });
await s3.send(new PutObjectCommand({
  Bucket: 'uploads',
  Key: 'hello.txt',
  Body: 'hello world',
}));
```

### Go
```go
// Assumes ctx plus the aws, config, credentials, and s3 packages
// from aws-sdk-go-v2 are imported.
cfg, _ := config.LoadDefaultConfig(ctx,
	config.WithRegion("us-east-1"),
	config.WithCredentialsProvider(credentials.NewStaticCredentialsProvider("test", "test", "")),
	config.WithEndpointResolverWithOptions(aws.EndpointResolverWithOptionsFunc(
		func(service, region string, options ...interface{}) (aws.Endpoint, error) {
			return aws.Endpoint{URL: "http://localhost:4566"}, nil
		},
	)),
)
client := s3.NewFromConfig(cfg, func(o *s3.Options) { o.UsePathStyle = true })
```

### AWS CLI
```shell
aws --endpoint-url http://localhost:4566 s3 mb s3://uploads
echo "hello" | aws --endpoint-url http://localhost:4566 s3 cp - s3://uploads/hello.txt
aws --endpoint-url http://localhost:4566 s3 ls s3://uploads/
```

## Versioning + lifecycle
```shell
aws --endpoint-url http://localhost:4566 s3api put-bucket-versioning \
  --bucket uploads --versioning-configuration Status=Enabled
aws --endpoint-url http://localhost:4566 s3api put-bucket-lifecycle-configuration \
  --bucket uploads --lifecycle-configuration file://lifecycle.json
```

With versioning enabled, puts return `VersionId`s. Lifecycle transitions run on fakecloud's schedule.
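The command above reads lifecycle rules from `lifecycle.json`. For reference, a hypothetical rule that expires objects under `tmp/` after 30 days (any valid S3 lifecycle configuration works):

```json
{
  "Rules": [
    {
      "ID": "expire-old-uploads",
      "Status": "Enabled",
      "Filter": { "Prefix": "tmp/" },
      "Expiration": { "Days": 30 }
    }
  ]
}
```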
## Multipart uploads
```python
mpu = s3.create_multipart_upload(Bucket='uploads', Key='large.bin')
parts = []
# chunks: any iterable of bytes; on AWS every part except the last must be >= 5 MiB
for i, chunk in enumerate(chunks, start=1):
    resp = s3.upload_part(Bucket='uploads', Key='large.bin', UploadId=mpu['UploadId'],
                          PartNumber=i, Body=chunk)
    parts.append({'PartNumber': i, 'ETag': resp['ETag']})
s3.complete_multipart_upload(Bucket='uploads', Key='large.bin', UploadId=mpu['UploadId'],
                             MultipartUpload={'Parts': parts})
```

Full multipart API. ETags match AWS.
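On AWS, a multipart ETag is the MD5 of the concatenated per-part MD5 digests, suffixed with `-<part count>`. Since ETags match AWS, you can compute the expected value yourself; a sketch of that computation (standard S3 behavior, nothing fakecloud-specific):

```python
import hashlib

def multipart_etag(parts):
    """Expected S3 ETag for a multipart upload: md5 of part-md5 digests + "-<count>"."""
    digests = b"".join(hashlib.md5(p).digest() for p in parts)
    return f'"{hashlib.md5(digests).hexdigest()}-{len(parts)}"'
```

Compare this against the `ETag` returned by `complete_multipart_upload` to assert byte-for-byte parity with AWS in your tests.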
## Notifications: S3 -> Lambda end-to-end
```shell
aws --endpoint-url http://localhost:4566 s3api put-bucket-notification-configuration \
  --bucket uploads \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [{
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:000000000000:function:on-upload",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'
```

Now `PutObject` fires the Lambda for real — not a stub. fakecloud runs Lambda code in real runtime containers across 13 runtimes.
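The invoked function receives the standard S3 notification event shape (`Records[].s3.bucket/object`), with object keys URL-encoded. A minimal handler sketch for the `on-upload` function above (illustrative; any supported runtime works):

```python
import urllib.parse

def handler(event, context=None):
    """Collect (bucket, key) pairs from an S3 ObjectCreated notification event."""
    uploads = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 URL-encodes object keys in notification payloads
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        uploads.append((bucket, key))
    return uploads
```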
## How it differs from alternatives
| Tool | Multi-language | Versioning | Notifications | Real Lambda triggers | Lifecycle |
|---|---|---|---|---|---|
| fakecloud | Any | Yes | Yes | Yes (real code runs) | Yes |
| adobe/S3Mock | Any | Yes | No | No (no Lambda service) | Partial |
| MinIO | Any | Yes | Yes | N/A (no AWS Lambda service) | Yes |
| gofakes3 | Any | Limited | No | No | No |
| Moto (mock_s3) | Python only | Yes | Stubbed | Stubbed | Partial |
| DynamoDB Local | N/A | N/A | N/A | N/A | N/A (wrong service) |
If you need S3 + cross-service wiring (S3 -> SNS/SQS/Lambda actually fires), pick fakecloud.
## Links
- Install: `curl -fsSL https://raw.githubusercontent.com/faiscadev/fakecloud/main/install.sh | bash`
- Repo: github.com/faiscadev/fakecloud
- S3 docs: fakecloud.dev/docs/services
- Related: Fake AWS server for tests, DynamoDB emulator, Test Lambda locally