Universal project initialization toolkit with development infrastructure
This repository provides a comprehensive project initialization system that sets up new development projects with standardized configurations, development tools, and infrastructure. Following best practices from frameworks like Create React App and Angular CLI, this toolkit ensures consistent development environments across all your projects.
Best Practice Pattern: Major frameworks use initialization toolkits:
- Create React App - React project scaffolding
- Angular CLI - Angular project initialization
- Rails new - Ruby on Rails setup
- Yeoman generators - Web development scaffolding
Our toolkit extends this pattern for any project type with:
- ✅ Standardized IDE configurations (Cursor, VSCode)
- ✅ Development infrastructure (Memory, tools, backups)
- ✅ Project templates (React, Node.js, Python, ProcOS, etc.)
- ✅ Port management (Unusual ports to avoid conflicts)
- ✅ Team consistency (Same setup across all developers)
- Single Neo4j instance serves all projects
- Project-specific metadata keeps memories organized
- Cross-project knowledge sharing and references
- Unified backup/restore system for all development context
- One-command setup for new projects
- Automatic Cursor configuration for MCP integration
- Standardized memory patterns across all projects
- Cursor rules for consistent memory usage
- Automated backups with timestamp and compression
- Easy restore from any backup point
- Health monitoring and diagnostics
- Production-ready container configuration
```shell
cd /path/to/mcp-memory-infrastructure
docker-compose up -d
```

Enhanced initialization with DADMS-derived configurations:

```shell
./scripts/init-project-enhanced.sh procos /home/user/procos
./scripts/init-project-enhanced.sh my-ai-project /path/to/my-project
```

Enhanced features include:
- 3 MCP Servers: neo4j-memory, neo4j-cypher, huggingface-mcp
- 7 Advanced Rules: Development standards, architecture, testing, code quality
- Automatic Memory: Intelligent storage during development
- Adaptive Templates: Project-specific configurations
- Enterprise Standards: DADMS-derived best practices
```shell
# Basic initialization (memory only)
./scripts/init-project.sh my-project /path/to/my-project
```

- Restart Cursor to load the enhanced MCP configuration
- Ask the AI to remember something about your project
- Memories will be automatically stored with advanced rules active
- Benefit from automatic code quality and architecture guidance
```mermaid
graph TB
    subgraph "Universal Project Initializer"
        subgraph "Project Templates"
            REACT[React/TypeScript Template]
            NODE[Node.js/Express Template]
            PYTHON[Python/FastAPI Template]
            PROCOS[ProcOS Template]
            GENERIC[Generic Template]
        end
        subgraph "Development Infrastructure"
            NEO[Neo4j Memory Service<br/>Port 9475/9688]
            TOOLS[Development Tools<br/>Port 9xxx Range]
            BACKUP[Backup System]
        end
        subgraph "Standard Configurations"
            CURSOR[.cursor/mcp.json<br/>.cursor/rules/]
            VSCODE[.vscode/settings.json<br/>.vscode/extensions.json]
            COPILOT[copilot-instructions.md]
            RULES[.cursorrules]
            GIT[.gitignore<br/>.gitattributes]
            ENV[.env.example<br/>docker-compose.yml]
        end
        subgraph "Initialization Process"
            DETECT[Detect Project Type]
            TEMPLATE[Apply Template]
            CONFIG[Configure Development Tools]
            CONNECT[Connect to Infrastructure]
            TEST[Validate Setup]
        end
        DETECT --> TEMPLATE
        TEMPLATE --> CONFIG
        CONFIG --> CONNECT
        CONNECT --> TEST
        REACT -.-> TEMPLATE
        NODE -.-> TEMPLATE
        PYTHON -.-> TEMPLATE
        PROCOS -.-> TEMPLATE
        GENERIC -.-> TEMPLATE
        CONFIG --> CURSOR
        CONFIG --> VSCODE
        CONFIG --> COPILOT
        CONFIG --> RULES
        CONFIG --> GIT
        CONFIG --> ENV
        CONNECT --> NEO
        CONNECT --> TOOLS
        CONNECT --> BACKUP
    end

    classDef template fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
    classDef infra fill:#e8f5e8,stroke:#388e3c,stroke-width:2px
    classDef config fill:#fff3e0,stroke:#f57c00,stroke-width:2px
    classDef process fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
    class REACT,NODE,PYTHON,PROCOS,GENERIC template
    class NEO,TOOLS,BACKUP infra
    class CURSOR,VSCODE,COPILOT,RULES,GIT,ENV config
    class DETECT,TEMPLATE,CONFIG,CONNECT,TEST process
```
Development infrastructure uses unusual ports (9xxx range) to avoid conflicts with application ports (3000-8000):
| Service | Production Port | Development Port | Purpose |
|---|---|---|---|
| Neo4j Web UI | 7474 | 9475 | Memory database interface |
| Neo4j Bolt | 7687 | 9688 | MCP memory connection |
| Development Tools | Various | 9100-9999 | Isolated development services |
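The port convention above can be checked mechanically. The helper below is a small illustrative sketch (the function name and range bounds are assumptions, not part of the toolkit's scripts):

```shell
# Hypothetical helper: check that a port falls inside the 9xxx
# development range before assigning it to a new service.
in_dev_range() {
    port="$1"
    [ "$port" -ge 9100 ] && [ "$port" -le 9999 ]
}

in_dev_range 9475 && echo "9475 is a development port"
in_dev_range 7474 || echo "7474 is a production port"
```

A check like this could run at the top of an init script to reject port assignments that would collide with application ports.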
```mermaid
graph TB
    subgraph "Centralized MCP Memory Solution"
        subgraph "Infrastructure Repository"
            REPO[mcp-memory-infrastructure]
            REPO --> DOCKER[Docker Compose<br/>Neo4j Container]
            REPO --> SCRIPTS[Management Scripts]
            REPO --> INIT[Project Initialization]
        end
        subgraph "Cursor MCP Integration"
            CURSOR[Cursor IDE] --> MCP[MCP Server]
            MCP --> NEO[(Neo4j Memory<br/>Port 9688)]
        end
        subgraph "Project Usage"
            DADMS[DADMS Project]
            PROCOS[ProcOS Project]
            FUTURE[Future Projects]
            DADMS --> CONFIG1[.cursor/mcp.json]
            PROCOS --> CONFIG2[.cursor/mcp.json]
            FUTURE --> CONFIG3[.cursor/mcp.json]
            CONFIG1 -.->|points to| NEO
            CONFIG2 -.->|points to| NEO
            CONFIG3 -.->|points to| NEO
        end
        subgraph "One-Command Setup"
            SETUP[./scripts/init-project.sh]
            SETUP --> CONFIG1
            SETUP --> CONFIG2
            SETUP --> CONFIG3
        end
    end

    classDef infra fill:#e3f2fd,stroke:#1976d2,stroke-width:3px
    classDef mcp fill:#e8f5e8,stroke:#388e3c,stroke-width:2px
    classDef projects fill:#fff3e0,stroke:#f57c00,stroke-width:2px
    classDef setup fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
    class REPO,DOCKER,SCRIPTS,INIT infra
    class CURSOR,MCP,NEO mcp
    class DADMS,PROCOS,FUTURE,CONFIG1,CONFIG2,CONFIG3 projects
    class SETUP setup
```
```mermaid
graph LR
    subgraph "Neo4j Memory Database"
        subgraph "Project Separation"
            P1[DADMS Memories<br/>project_id: 'dadms']
            P2[ProcOS Memories<br/>project_id: 'procos']
            P3[Future Memories<br/>project_id: 'future-x']
        end
        subgraph "Memory Categories"
            ARCH[Architecture Decisions]
            BUGS[Bug Solutions]
            PATTERNS[Design Patterns]
            CONFIGS[Configuration Notes]
        end
        subgraph "Cross-Project Links"
            P1 -.->|references| P2
            P2 -.->|shares solution| P3
            P1 -.->|common pattern| P3
        end
    end
    subgraph "Memory Metadata"
        META[Each Memory Node]
        META --> PROJ[project_id]
        META --> CAT[category]
        META --> TIME[timestamps]
        META --> TAGS[tags]
        META --> CONTENT[content]
    end

    classDef project fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
    classDef category fill:#e8f5e8,stroke:#388e3c,stroke-width:2px
    classDef metadata fill:#fff3e0,stroke:#f57c00,stroke-width:2px
    class P1,P2,P3 project
    class ARCH,BUGS,PATTERNS,CONFIGS category
    class META,PROJ,CAT,TIME,TAGS,CONTENT metadata
```
The initialization system supports multiple project types with standardized configurations:
| Template | Description | Includes |
|---|---|---|
| React/TypeScript | Modern React application | Vite, TypeScript, ESLint, Prettier |
| Node.js/Express | Backend API server | Express, TypeScript, Jest, Nodemon |
| Python/FastAPI | Python web API | FastAPI, Pydantic, pytest, uvicorn |
| ProcOS | Process-oriented system | Microkernel, BPMN, Camunda integration |
| Generic | Basic development setup | Standard IDE configs, memory integration |
Every initialized project gets:
- `.cursor/mcp.json` - MCP server configuration
- `.cursor/rules/` - Project-specific Cursor rules
- `.vscode/settings.json` - VSCode configuration
- `.vscode/extensions.json` - Recommended extensions
- `copilot-instructions.md` - AI pair programming guidelines
- `.cursorrules` - Memory usage and development guidelines
- `.gitignore` - Language/framework-specific ignores
- `.env.example` - Environment variable template
- `docker-compose.yml` - Development infrastructure (if applicable)
Templates can be customized by:
- Modifying template files in the `templates/` directory
- Adding custom rules in `.cursor/rules/`
- Applying project-specific overrides during initialization
```
dev-project-init/
├── docker-compose.yml        # Development memory service
├── README.md                 # This file
│
├── scripts/                  # Management utilities
│   ├── init-project.sh       # Universal project initializer
│   ├── backup-memory.sh      # Create memory backups
│   ├── restore-memory.sh     # Restore from backups
│   └── check_memory.sh       # Health diagnostics
│
├── templates/                # Project templates
│   ├── react-typescript/     # React + TypeScript template
│   ├── node-express/         # Node.js + Express template
│   ├── python-fastapi/       # Python + FastAPI template
│   ├── procos/               # ProcOS system template
│   └── generic/              # Basic development template
│
├── backups/                  # Backup storage
│   └── dev-memory/           # Development memory backups
│
├── config/                   # Base configurations
│   ├── cursor/               # Cursor IDE configurations
│   ├── vscode/               # VSCode configurations
│   └── git/                  # Git configurations
└── docs/                     # Additional documentation
```
```shell
# Start memory service
docker-compose up -d

# Stop memory service
docker-compose down

# View logs
docker-compose logs -f neo4j-memory

# Check service status
docker-compose ps
```

The initialization script handles all configuration automatically:
```mermaid
flowchart TD
    START([Run init-project.sh]) --> CHECK{Memory Service Running?}
    CHECK -->|No| ERROR[Error: Start service first]
    CHECK -->|Yes| BACKUP[Backup existing mcp.json]
    BACKUP --> EXISTS{mcp.json exists?}
    EXISTS -->|Yes| HASNEO{Has neo4j-memory config?}
    EXISTS -->|No| CREATE[Create new mcp.json]
    HASNEO -->|Yes| VERIFY{Correct URL?}
    HASNEO -->|No| ADD[Add neo4j-memory config]
    VERIFY -->|Yes| SKIP[Skip - already configured]
    VERIFY -->|No| UPDATE[Update URL to localhost:9688]
    CREATE --> RULES[Create .cursorrules]
    ADD --> RULES
    UPDATE --> RULES
    SKIP --> RULES
    RULES --> TEST[Test connection]
    TEST --> SUCCESS[Setup complete!]

    classDef start fill:#e8f5e8,stroke:#388e3c,stroke-width:2px
    classDef process fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
    classDef decision fill:#fff3e0,stroke:#f57c00,stroke-width:2px
    classDef error fill:#ffebee,stroke:#d32f2f,stroke-width:2px
    classDef success fill:#e8f5e8,stroke:#4caf50,stroke-width:3px
    class START,SUCCESS start
    class BACKUP,CREATE,ADD,UPDATE,RULES,TEST process
    class CHECK,EXISTS,HASNEO,VERIFY decision
    class ERROR error
    class SKIP success
```
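The "has neo4j-memory config?" decision in the flow above can be sketched as a simple text search over an existing config file. The snippet below runs against a scratch file rather than a real project, and is an illustration of the decision logic, not the actual script:

```shell
# Sketch of the config-detection step from the flowchart, exercised
# against a scratch mcp.json (paths and messages are illustrative).
tmp=$(mktemp -d)
printf '{\n  "mcpServers": {}\n}\n' > "$tmp/mcp.json"

cp "$tmp/mcp.json" "$tmp/mcp.json.bak"   # backup before touching config

if grep -q '"neo4j-memory"' "$tmp/mcp.json"; then
    status="already configured"
else
    status="needs neo4j-memory entry"
fi
echo "$status"

rm -rf "$tmp"
```

A real implementation would likely parse the JSON properly (e.g. with `jq`) instead of grepping, but the branch structure is the same.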
Usage Examples:
```shell
# Initialize new project
./scripts/init-project.sh <project-name> <project-path>

# Setup ProcOS
./scripts/init-project.sh procos /home/user/procos

# Setup DADMS
./scripts/init-project.sh dadms /home/user/dadms
```

```shell
# Create backup
./scripts/backup-memory.sh

# List available backups
ls -la backups/mcp-memory/

# Restore latest backup
./scripts/restore-memory.sh latest

# Restore specific backup
./scripts/restore-memory.sh mcp-memory-backup-20240101_120000.cypher.gz
```

```shell
# Check memory service health
./scripts/check_memory.sh

# Access Neo4j web interface
open http://localhost:9475
```

| Service | URL | Purpose |
|---|---|---|
| Neo4j Web UI | http://localhost:9475 | Visual database interface (unusual port) |
| Bolt Connection | neo4j://localhost:9688 | MCP protocol connection (unusual port) |
| Authentication | neo4j / devmemorypass | Database credentials |
Note: Using unusual ports (9xxx range) to avoid conflicts with application development ports (3000-8000).
When you run ./scripts/init-project.sh, it creates:
- `.cursor/mcp.json` - MCP server configuration
- `.cursorrules` - Memory usage guidelines for the project
- Connection test - Verifies the memory service is accessible
Each memory entry includes:
- Project ID: `dadms`, `procos`, `my-project`
- Category: `architecture`, `bug`, `decision`, `pattern`
- Content: The actual knowledge/solution
- Relationships: Links to related memories
- Timestamps: Creation and update times
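Assuming a plain node-per-memory model, one entry might map to Cypher like this. The `Memory` label and property names are illustrative assumptions, not the actual mcp-neo4j-memory schema; the snippet only builds the statement string:

```shell
# Build (but do not execute) a Cypher statement for one memory entry.
# Label and property names are assumptions, not the real MCP schema.
PROJECT_ID="procos"
CATEGORY="bug"
CONTENT="Dev Bolt port clashed with another service; moved to 9688"

CYPHER="CREATE (m:Memory {project_id: '$PROJECT_ID', category: '$CATEGORY', content: '$CONTENT', created_at: datetime()})"
echo "$CYPHER"
```

A statement like this could be piped to `cypher-shell` inside the container once the service is running.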
- Shared Solutions: Bug fixes discovered in one project help others
- Architecture Patterns: Design decisions can be referenced across projects
- Team Knowledge: Institutional knowledge persists across project boundaries
- Context Preservation: Development history maintains continuity
If you currently have DADMS-specific memory:
- Create backup of existing DADMS memory
- Start centralized service with this infrastructure
- Restore DADMS backup to centralized service
- Update DADMS MCP configuration to point to centralized service
```shell
# From DADMS project directory
cd /path/to/dadms
./scripts/backup-memory.sh

# Copy backup to centralized infrastructure
cp backups/mcp-memory/mcp-memory-backup-*.cypher.gz \
   /path/to/mcp-memory-infrastructure/backups/mcp-memory/

# Start centralized service
cd /path/to/mcp-memory-infrastructure
docker-compose up -d

# Restore migrated backup
./scripts/restore-memory.sh mcp-memory-backup-*.cypher.gz
```

- Timestamped backups: Each backup includes date/time
- Compressed storage: All backups are gzipped to save space
- Retention policy: Automatically keeps last 7 backups
- Easy restore: Simple commands to restore any backup point
```
backups/mcp-memory/
├── mcp-memory-backup-20240101_120000.cypher.gz
├── mcp-memory-backup-20240101_180000.cypher.gz
└── mcp-memory-backup-20240102_090000.cypher.gz
```
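The keep-last-7 retention policy can be sketched in a few lines of shell. The snippet below exercises the idea against a scratch directory with nine fake backup files (filenames and the deletion approach are illustrative, not the toolkit's actual script):

```shell
# Sketch of a keep-last-7 retention policy against a scratch directory.
tmp=$(mktemp -d)
for day in 01 02 03 04 05 06 07 08 09; do
    : > "$tmp/mcp-memory-backup-202401${day}_120000.cypher.gz"
done

# Newest first; every entry past the 7th is deleted.
ls -1t "$tmp" | tail -n +8 | while read -r old; do
    rm -- "$tmp/$old"
done

kept=$(ls -1 "$tmp" | wc -l | tr -d ' ')
echo "kept $kept backups"
rm -rf "$tmp"
```

In production this would sort by the timestamp embedded in the filename rather than by mtime, since restored or copied files can have misleading modification times.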
- Default credentials: `neo4j` / `devmemorypass` (change for production)
- Local network: Service bound to localhost only
- No external access: Firewall-friendly configuration
- Change passwords: Update `NEO4J_AUTH` in docker-compose.yml
- Persistent volumes: Data survives container restarts
- Health checks: Automatic service monitoring
- Resource limits: Uncomment memory settings for production loads
```shell
# Check if ports are available
lsof -i :9475 -i :9688

# Check Docker is running
docker ps

# View service logs
docker-compose logs neo4j-memory
```

```shell
# Test network connectivity
nc -z localhost 9688

# Check MCP configuration
cat .cursor/mcp.json

# Restart Cursor to reload MCP config
```

```shell
# Verify container is running
docker ps | grep mcp-neo4j-memory

# Check container logs
docker logs mcp-neo4j-memory

# Test APOC is available
docker exec mcp-neo4j-memory cypher-shell -u neo4j -p devmemorypass \
  "RETURN apoc.version() AS version"
```

- Shared service: One memory service per team/environment
- Project isolation: Memories tagged by project for organization
- Collaborative knowledge: Team members share discovered solutions
- Consistent setup: Same initialization process for all team members
- Start memory service (once per development session)
- Work on any project (DADMS, ProcOS, etc.)
- Memories automatically stored in centralized service
- Knowledge shared across all projects and team members
- Regular backups preserve all development context
- Start the service: `docker-compose up -d`
- Initialize your projects: `./scripts/init-project.sh <project> <path>`
- Test memory creation: Ask Cursor AI to remember something
- Set up backups: Schedule regular `./scripts/backup-memory.sh` runs
- Share with team: Point team members to this repository
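One way to schedule the backup script is a cron entry like the one below. The path and schedule are illustrative; the snippet prints the entry rather than installing it:

```shell
# Illustrative cron entry: daily 02:00 backup (adjust the path to
# your checkout). Printed here; pipe to `crontab -` to apply it.
CRON_LINE='0 2 * * * cd /path/to/mcp-memory-infrastructure && ./scripts/backup-memory.sh'
echo "$CRON_LINE"
```

Combined with the 7-backup retention policy, this keeps roughly a week of development context without manual cleanup.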
Your centralized development memory service is ready! All projects can now share knowledge and development context seamlessly.
Following industry best practices from major frameworks and development tools
Understanding MCP: Model Context Protocol servers are specialized tools that provide AI assistants with enhanced capabilities through standardized protocols. Unlike regular infrastructure services, MCP servers extend AI functionality with context-aware operations.
Current Setup: We use mcp-neo4j-memory for development memory management.
Reference: Awesome MCP Servers - Comprehensive list of production-ready MCP servers.
```mermaid
graph TB
    subgraph "File & Project Management"
        FILESYSTEM["mcp-filesystem<br/>File operations"]
        GIT["mcp-git<br/>Git commands"]
        GITHUB["mcp-github<br/>GitHub API"]
        PROJECT["mcp-project-manager<br/>Project scaffolding"]
    end
    subgraph "Database & Data"
        SQLITE["mcp-sqlite<br/>Local databases"]
        POSTGRES["mcp-postgres<br/>PostgreSQL queries"]
        MEMORY["mcp-neo4j-memory<br/>AI memory (current)"]
    end
    subgraph "Development Tools"
        DOCKER["mcp-docker<br/>Container management"]
        SHELL["mcp-shell<br/>Terminal commands"]
        BROWSER["mcp-browser-automation<br/>Web automation"]
    end
    subgraph "Research & Learning"
        SEARCH["mcp-brave-search<br/>Web search"]
        FETCH["mcp-fetch<br/>URL content"]
        YOUTUBE["mcp-youtube<br/>Video transcripts"]
    end

    classDef current fill:#e8f5e8,stroke:#4caf50,stroke-width:3px
    classDef mcp fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
    class MEMORY current
    class FILESYSTEM,GIT,GITHUB,PROJECT,SQLITE,POSTGRES,DOCKER,SHELL,BROWSER,SEARCH,FETCH,YOUTUBE mcp
```
| Category | MCP Server | Purpose | Use Cases |
|---|---|---|---|
| File Management | `mcp-filesystem` | File system operations | Read/write files, directory management |
| Version Control | `mcp-git` | Git repository operations | Commits, branches, repository analysis |
| GitHub Integration | `mcp-github` | GitHub API interactions | Issues, PRs, repository management |
| Database Operations | `mcp-sqlite` | SQLite database queries | Local database development and testing |
| Container Management | `mcp-docker` | Docker operations via MCP | Container lifecycle, image management |
| Web Research | `mcp-brave-search` | Web search capabilities | Research, documentation lookup |
| Content Fetching | `mcp-fetch` | URL content retrieval | API documentation, web scraping |
| Development Memory | `mcp-neo4j-memory` | AI memory management | Currently implemented |
| Category | MCP Server | Purpose | Use Cases |
|---|---|---|---|
| Team Communication | `mcp-slack` | Slack API integration | Send messages, read channels, team notifications |
| Project Management | `mcp-linear` | Linear project management | Issue tracking, project planning, sprint management |
| Documentation | `mcp-confluence` | Confluence integration | Read/write documentation, knowledge management |
| Task Management | `mcp-asana` | Asana task management | Task creation, project tracking, team coordination |
| Code Reviews | `mcp-github` | GitHub code reviews | PR management, code review automation |
| Category | MCP Server | Purpose | Use Cases |
|---|---|---|---|
| AI Integration | `mcp-openai` | OpenAI API access | Chat with AI models, embeddings, completions |
| Data Analysis | `mcp-pandas` | Data manipulation | DataFrame operations, data analysis, CSV processing |
| Testing | `mcp-playwright` | Browser automation | E2E testing, web scraping, UI automation |
| Security | `mcp-1password` | Password management | Secure credential access, secret management |
| Monitoring | `mcp-prometheus` | Metrics and monitoring | System metrics, performance monitoring |
| Calendar | `mcp-google-calendar` | Calendar integration | Schedule management, meeting coordination |
Custom MCP Servers for ProcOS (Process-Oriented Operating System):
| MCP Server | Purpose | ProcOS Integration |
|---|---|---|
| mcp-bpmn-analyzer | BPMN process analysis | Validate process models, detect patterns |
| mcp-camunda-client | Camunda engine integration | Deploy processes, monitor instances |
| mcp-process-simulator | Process simulation | Test workflows before deployment |
| mcp-workflow-optimizer | Performance analysis | Identify bottlenecks, suggest improvements |
Note: These would be custom-built MCP servers specifically for ProcOS development workflow.
React/TypeScript Projects:
- `mcp-filesystem` - Component file management
- `mcp-git` - Version control for React components
- `mcp-browser-automation` - Component testing automation
- `mcp-fetch` - API integration for React apps
Node.js/Express Projects:
- `mcp-sqlite` - Database operations for Node.js
- `mcp-docker` - Container management for Node services
- `mcp-shell` - Build script automation
- `mcp-github` - Repository management for Node projects
- `mcp-filesystem` - File system operations for project management
- `mcp-git` - Version control integration
- `mcp-sqlite` - Local database development and testing
- `mcp-github` - Repository management and code reviews
- `mcp-slack` - Team communication integration
- `mcp-docker` - Container management for development
- `mcp-browser-automation` - Web testing and automation
- `mcp-fetch` - API integration and web scraping
- `mcp-shell` - Advanced terminal operations
- ProcOS-specific MCP servers - Custom BPMN and process tools
- `mcp-openai` - AI integration for enhanced development
- Custom project-specific MCP servers
```json
{
  "$schema": "https://json.schemastore.org/mcp.json",
  "description": "Development Memory Configuration",
  "mcpServers": {
    "neo4j-memory": {
      "command": "uvx",
      "args": [
        "mcp-neo4j-memory",
        "--db-url", "neo4j://localhost:9688",
        "--username", "neo4j",
        "--password", "devmemorypass"
      ]
    }
  }
}
```

```json
{
  "$schema": "https://json.schemastore.org/mcp.json",
  "description": "Enhanced Development MCP Configuration",
  "mcpServers": {
    "neo4j-memory": {
      "command": "uvx",
      "args": [
        "mcp-neo4j-memory",
        "--db-url", "neo4j://localhost:9688",
        "--username", "neo4j",
        "--password", "devmemorypass"
      ]
    },
    "filesystem": {
      "command": "uvx",
      "args": ["mcp-filesystem", "--read-write"]
    },
    "git": {
      "command": "uvx",
      "args": ["mcp-git"]
    },
    "github": {
      "command": "uvx",
      "args": ["mcp-github", "--auth-token", "${GITHUB_TOKEN}"]
    },
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-sqlite", "--db-path", "./dev.db"]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-fetch"]
    }
  }
}
```

- Awesome MCP Servers - Comprehensive MCP server directory
- glama.ai/mcp/servers - Browse and discover MCP servers
- MCP Documentation - Official Model Context Protocol docs
- r/mcp Reddit - Community discussions and support
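When hand-editing configs like the mcp.json examples shown earlier, a quick validity check before restarting Cursor avoids silent MCP load failures. The check below uses Python's stdlib JSON parser against a scratch file (in a real project, point it at `.cursor/mcp.json`):

```shell
# Validate an MCP config with the stdlib JSON parser. A scratch file
# stands in for a real .cursor/mcp.json here (path is illustrative).
tmp=$(mktemp)
printf '{"mcpServers": {"neo4j-memory": {"command": "uvx"}}}\n' > "$tmp"

if python3 -m json.tool "$tmp" > /dev/null 2>&1; then
    result="valid"
else
    result="invalid"
fi
echo "config is $result JSON"
rm -f "$tmp"
```

This catches trailing commas and unbalanced braces, the most common breakage after a manual merge of server entries.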
- Build custom MCP servers for specialized workflows
- Share project templates that integrate useful MCP servers
- Document MCP server configurations for different development scenarios
- Create tutorials for MCP server setup and usage
This Universal Project Initializer bridges the gap between AI capabilities and development workflows by:
- Standardizing MCP server configurations across projects
- Providing templates that include proven MCP server combinations
- Enabling rapid onboarding with pre-configured AI development tools
- Maintaining consistency across development environments
The future of development is AI-enhanced workflows. This toolkit provides the foundation for integrating Model Context Protocol servers into every project, making AI assistance a natural part of the development process.