An open-source CRM system designed for startups to manage customers, contacts, and companies with ease.
Status: Early development
Open CRM provides a straightforward way to organize and maintain business relationships. It focuses on the core needs of small teams: keeping track of companies, contacts, and customer records without the complexity of enterprise CRM solutions.
- Company Management — Create and manage company records with relevant business details
- Contact Management — Track individual contacts and their associations with companies
- Image Upload — Upload and manage company logos (SVG, PNG, JPEG) and contact photos (JPEG) with thumbnails in list views
- Customer Management — Manage customer relationships and history
- Single Sign-On (SSO) — Integration with Authentik for authentication and user management
- Brevo Sync — Automatic synchronization of customers and companies with Brevo for marketing and communication workflows
- Backend: Spring Boot 3.4 (Java 21)
- Frontend: Next.js 15 (React 19, TypeScript, Tailwind CSS, shadcn/ui)
- Database: PostgreSQL 17
- Authentication: Authentik (SSO via OpenID Connect) — planned
- External Integrations: Brevo API — planned
```
┌────────────┐     ┌──────────────┐     ┌────────────┐
│  Frontend  │────▶│ Spring Boot  │────▶│ PostgreSQL │
└────────────┘     │   Backend    │     └────────────┘
                   └──────┬───────┘
                          │
                ┌─────────┼─────────┐
                ▼                   ▼
         ┌────────────┐      ┌────────────┐
         │ Authentik  │      │   Brevo    │
         │   (SSO)    │      │  (Sync)    │
         └────────────┘      └────────────┘
```
- Java 21 — recommended via SDKMAN! (see `backend/.sdkmanrc`)
- Node.js 22 — recommended via nvm (see `frontend/.nvmrc`)
- pnpm — package manager for the frontend
- Docker & Docker Compose — for running the full stack
- PostgreSQL 17 — for standalone backend development (or use Docker)
- Clone the repository
- Copy the environment file:

  ```shell
  cp .env.example .env
  ```

- Adjust values in `.env` if needed (defaults work for local development)
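As a sketch, a minimal local `.env` might look like this — the variable names are taken from the deployment section further down, and the values are placeholders for local development only:

```shell
# Sketch of a local .env — values are placeholders, not real credentials
DB_NAME=opencrm
DB_USER=opencrm
DB_PASSWORD=change-me
```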
Start all services (PostgreSQL, backend, frontend):

```shell
docker compose up --build
```

- Frontend: http://localhost:4001
- Backend API: http://localhost:9081/api/health
- Swagger UI: http://localhost:9081/swagger-ui.html
Port bindings are defined in `docker-compose.override.yml`, which Docker Compose merges automatically. The main `docker-compose.yml` contains no `ports` entries — see Docker Compose & Coolify for details.
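For reference, the override file is roughly shaped like this — a sketch only: the service names and container-internal ports (3000 for the frontend dev server, 8080 for the backend) are assumptions based on the local development setup described in this README:

```yaml
# docker-compose.override.yml — sketch; service names are assumptions
services:
  frontend:
    ports:
      - "4001:3000"   # host 4001 -> container 3000
  backend:
    ports:
      - "9081:8080"   # host 9081 -> container 8080
```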
Stop services:

```shell
docker compose down
```

Stop services and remove database data:

```shell
docker compose down -v
```

Start the database and mock OAuth2 server via Docker Compose:

```shell
docker compose up -d db mock-oauth2
```

Then run backend and frontend locally:
Backend:

```shell
cd backend
sdk env install        # install Java 21 via SDKMAN!
./mvnw spring-boot:run # starts on port 8080
```

The default values in `application.yml` already point to `localhost:5432` (database) and `localhost:8888` (mock OAuth2), so no extra configuration is needed.

For verbose SQL logging and debug output, activate the `local` profile:

```shell
./mvnw spring-boot:run -Dspring-boot.run.profiles=local
```

Frontend:
```shell
cd frontend
nvm install  # install Node.js 22 via nvm
pnpm install
pnpm dev     # starts on port 3000
```

The default values in `frontend/.env.local` already point to `localhost:8080` (backend) and `localhost:8888` (mock OAuth2), so no extra configuration is needed.
Stop the infrastructure services when done:

```shell
docker compose down     # keeps data
docker compose down -v  # removes database data
```

Backend:

```shell
cd backend && ./mvnw clean verify
```

Frontend:

```shell
cd frontend && pnpm install && pnpm test && pnpm build
```

The project uses a split Docker Compose setup to support both local development and Coolify deployment:
| File | Purpose |
|---|---|
| `docker-compose.yml` | Service definitions without port bindings (used by Coolify) |
| `docker-compose.override.yml` | Host port bindings for local development (merged automatically) |
This split is necessary because Coolify uses Traefik as a reverse proxy to route traffic via FQDNs. Host port bindings are not needed in Coolify and cause port allocation conflicts during deployment.
For more details, see DOCKER-COMPOSE-COOLIFY.md.
- In Coolify, create a new application and connect the GitHub repository
- Select Docker Compose as the build method
- Set the branch to `main`
- Configure the FQDNs for the services in Coolify:
  - frontend — e.g., `crm.example.com`
  - backend — e.g., `crm-backend.example.com`
- Add the required environment variables in Coolify:
  - `DB_NAME` — PostgreSQL database name (e.g., `opencrm`)
  - `DB_USER` — PostgreSQL user (e.g., `opencrm`)
  - `DB_PASSWORD` — PostgreSQL password
- Deploy — Coolify builds the Docker images and starts the services
- Traefik automatically handles TLS certificates and routes traffic to the containers

After deployment:

- Frontend: `https://crm.example.com`
- Backend / Swagger UI: `https://crm-backend.example.com/swagger-ui.html`
Open CRM uses OpenID Connect (OIDC) for authentication. Locally, a mock-oauth2-server runs automatically via Docker Compose. For production, you connect a real Authentik instance.
No configuration needed — `docker compose up` starts the mock-oauth2-server automatically. The default `.env` values point to the local mock server. The mock server shows an interactive login form where you can enter any username and claims.
- In Authentik, go to Applications → Providers → Create
- Select OAuth2/OpenID Provider
- Configure:
  - Name: `Open CRM`
  - Authorization flow: Select your default authorization flow
  - Client type: Confidential
  - Client ID: Note this value (e.g., `open-crm`)
  - Client Secret: Note this value
  - Redirect URIs: `https://crm.example.com/api/auth/callback/oidc`
  - Logout URI: `https://crm.example.com` (required for the logout redirect back to the app)
  - Scopes: `openid`, `profile`, `email`, `offline_access`
- Save the provider

Important: The `offline_access` scope is required for refresh tokens. Without it, users will be logged out when the access token expires (typically after a few minutes).
- Go to Applications → Applications → Create
- Configure:
  - Name: `Open CRM`
  - Slug: `open-crm`
  - Provider: Select the provider created above
- Save the application
Set these variables in your deployment environment (e.g., Coolify):

```shell
# Frontend OIDC configuration
OIDC_ISSUER_URI=https://auth.example.com/application/o/open-crm/
OIDC_CLIENT_ID=open-crm
OIDC_CLIENT_SECRET=your-client-secret-from-authentik
AUTH_SECRET=generate-a-random-secret-here
AUTH_URL=https://crm.example.com

# Backend JWT validation
OIDC_JWK_SET_URI=https://auth.example.com/application/o/open-crm/jwks/
```

- `OIDC_ISSUER_URI` — The Authentik OpenID issuer URL for your application
- `OIDC_CLIENT_ID` / `OIDC_CLIENT_SECRET` — From the Authentik provider configuration
- `AUTH_SECRET` — A random string for encrypting session cookies (generate with `openssl rand -base64 32`)
- `AUTH_URL` — The public URL of your frontend
- `OIDC_JWK_SET_URI` — The JWKS endpoint for JWT signature validation (usually `{issuer}/jwks/`)
After deploying with the new variables:
- Open the frontend URL — you should be redirected to the Authentik login page
- Log in with an Authentik user
- You should be redirected back to the CRM with your name displayed in the sidebar
API keys allow external services to access the CRM REST API without an interactive OIDC login. Keys grant read-only access (GET endpoints only) — write operations (POST, PUT, DELETE) are rejected with 403 Forbidden.
API keys can be managed via the frontend (sidebar: "API Keys") or the REST API:
```shell
# Create an API key (requires JWT auth — use the Bearer token from the Admin page)
curl -X POST https://crm-backend.example.com/api/api-keys \
  -H "Authorization: Bearer <your-jwt-token>" \
  -H "Content-Type: application/json" \
  -d '{"name": "CI Pipeline"}'
```

The response includes the raw key (`crm_...`) — copy it immediately; it is shown only once.
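The create response is roughly shaped like this — a sketch only: apart from `name` and the documented `crm_` key prefix, the field names here are assumptions (check the Swagger UI for the actual shape):

```json
{
  "id": "<uuid>",
  "name": "CI Pipeline",
  "key": "crm_...",
  "createdAt": "<timestamp>"
}
```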
```shell
# List API keys
curl https://crm-backend.example.com/api/api-keys \
  -H "Authorization: Bearer <your-jwt-token>"

# Delete an API key
curl -X DELETE https://crm-backend.example.com/api/api-keys/<key-id> \
  -H "Authorization: Bearer <your-jwt-token>"
```

Pass the key in the `X-API-Key` header:
```shell
# List companies
curl https://crm-backend.example.com/api/companies \
  -H "X-API-Key: crm_your_key_here"

# Get a specific contact
curl https://crm-backend.example.com/api/contacts/<contact-id> \
  -H "X-API-Key: crm_your_key_here"

# Paginated contacts
curl "https://crm-backend.example.com/api/contacts?page=0&size=10" \
  -H "X-API-Key: crm_your_key_here"
```
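The paginated endpoint can also be consumed from a script. A minimal Python sketch, assuming a Spring Data-style page response with `content` and `last` fields (an assumption — check the Swagger UI for the actual shape):

```python
import json
import urllib.request

def fetch_page(base_url: str, api_key: str, page: int, size: int = 50) -> dict:
    """Fetch one page of contacts using an API key in the X-API-Key header."""
    req = urllib.request.Request(
        f"{base_url}/api/contacts?page={page}&size={size}",
        headers={"X-API-Key": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def collect_all(get_page) -> list:
    """Walk pages until the server reports the last one.
    get_page(page) returns the parsed JSON for that page number."""
    items, page = [], 0
    while True:
        data = get_page(page)
        items.extend(data["content"])
        if data.get("last", True):
            return items
        page += 1
```

Injecting the fetcher (`get_page`) keeps the paging logic testable without a live backend.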
```shell
# This will be rejected (write access not allowed):
curl -X POST https://crm-backend.example.com/api/companies \
  -H "X-API-Key: crm_your_key_here" \
  -H "Content-Type: application/json" \
  -d '{"name": "Test"}'
# -> 403 Forbidden: "API keys only grant read-only access"
```

Webhooks notify external services when data changes in the CRM. When a company, contact, task, tag, comment, or user is created, updated, or deleted, an HTTP POST is sent to all registered webhook URLs with a JSON payload describing the event.
Webhooks can be managed via the frontend (sidebar: "Webhooks") or the REST API:
```shell
# Create a webhook
curl -X POST https://crm-backend.example.com/api/webhooks \
  -H "Authorization: Bearer <your-jwt-token>" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://your-service.example.com/webhook"}'

# List webhooks (shows lastStatus and lastCalledAt)
curl https://crm-backend.example.com/api/webhooks \
  -H "Authorization: Bearer <your-jwt-token>"

# Ping a webhook to test connectivity
curl -X POST https://crm-backend.example.com/api/webhooks/<webhook-id>/ping \
  -H "Authorization: Bearer <your-jwt-token>"

# Delete a webhook
curl -X DELETE https://crm-backend.example.com/api/webhooks/<webhook-id> \
  -H "Authorization: Bearer <your-jwt-token>"
```

Every webhook call sends a JSON POST with this structure:
```json
{
  "eventId": "a1b2c3d4-e5f6-...",
  "eventType": "COMPANY_CREATED",
  "timestamp": "2026-04-05T14:30:00Z",
  "entityId": "550e8400-...",
  "data": {
    "id": "550e8400-...",
    "name": "Acme Corp"
  }
}
```

For delete events, `data` is `null`. The `eventId` (UUID) can be used for idempotency.
Event types: `COMPANY_CREATED`, `COMPANY_UPDATED`, `COMPANY_DELETED`, `CONTACT_CREATED`, `CONTACT_UPDATED`, `CONTACT_DELETED`, `COMMENT_CREATED`, `COMMENT_UPDATED`, `COMMENT_DELETED`, `TAG_CREATED`, `TAG_UPDATED`, `TAG_DELETED`, `TASK_CREATED`, `TASK_UPDATED`, `TASK_DELETED`, `USER_CREATED`, `USER_UPDATED`, `USER_DELETED`, `PING`
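Since a delivery can arrive more than once (e.g., after a retry), consumers can deduplicate on `eventId`. A minimal Python sketch of a payload handler — the in-memory set is illustrative only; a real consumer would persist processed event ids:

```python
import json

# In-memory dedup store — illustrative; use a persistent store
# (e.g., a database table keyed by eventId) in production.
seen_events = set()

def handle_event(raw_body: bytes) -> bool:
    """Process one webhook delivery; return False if it was a duplicate."""
    event = json.loads(raw_body)
    if event["eventId"] in seen_events:
        return False  # retry or duplicate delivery — safe to ignore
    seen_events.add(event["eventId"])
    if event["data"] is None:
        # Delete events carry no payload beyond the entity id
        print(f"{event['eventType']}: entity {event['entityId']} deleted")
    else:
        print(f"{event['eventType']}: {event['data'].get('name')}")
    return True
```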
To test webhooks on your local machine, you need a tool that creates a temporary HTTP endpoint and shows incoming requests. Here are some options for Mac/Linux:
Option 1: Python one-liner (no install needed)

```shell
# Start a simple HTTP server that logs all incoming requests
python3 -c "
from http.server import HTTPServer, BaseHTTPRequestHandler
import json
class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get('Content-Length', 0))
        body = self.rfile.read(length)
        print('\n--- Webhook received ---')
        print(json.dumps(json.loads(body), indent=2))
        self.send_response(200)
        self.end_headers()
HTTPServer(('0.0.0.0', 9999), Handler).serve_forever()
"
```

Then register `http://host.docker.internal:9999` as the webhook URL (if the CRM runs in Docker) or `http://localhost:9999` (if running standalone).
Option 2: netcat (quick & dirty)

```shell
# Listen for webhook calls and print each raw request
while true; do nc -l 9999; done
```

Option 3: webhook.site (no local setup)

Visit webhook.site to get a temporary public URL. Register that URL as a webhook and see incoming requests in the browser.
Each webhook tracks the result of its last call:

| `lastStatus` | Meaning |
|---|---|
| `null` | Never called |
| `200` | OK |
| `0` | Connection error |
| `-1` | Timeout (10s limit) |
| `4xx`/`5xx` | HTTP error |

Use the PING endpoint or check `lastStatus` in the webhook list to verify connectivity.
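For scripting health checks, the table above can be mirrored in a small helper — a sketch that only assumes the `lastStatus` field from the list endpoint:

```python
def describe_status(last_status):
    """Translate a webhook's lastStatus into the meaning from the table above."""
    if last_status is None:
        return "never called"
    if last_status == 200:
        return "OK"
    if last_status == 0:
        return "connection error"
    if last_status == -1:
        return "timeout (10s limit)"
    return f"HTTP error {last_status}"
```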
See LICENSE for details.