Nova Direct

Nova Direct provides a complete browser-based platform that runs entirely on your local machine via Docker. Everything you need is accessible through the web UI: no configuration files, no editor plugins, just your browser.

🚀 The Easiest Way to Deploy Nova: Just run Docker Compose and open your browser. Everything else is automated!

Overview

🌐 Browser-Based UI

Everything works in your browser - no editors, plugins, or complex configurations needed.

⚡ Fully Automated

All features are pre-configured and ready to use. Just open your browser and start.

Local Infrastructure

Keep all AI interactions within your local infrastructure for enhanced security.

Self-Hosted Control

Run Nova on your own machines with complete control over your data and environment.

MCP Marketplace Integration

🚀 Nova Direct includes a built-in MCP Marketplace that allows you to dynamically extend Nova's capabilities with additional tools and integrations.

What is the MCP Marketplace?

The MCP (Model Context Protocol) Marketplace is integrated directly into Nova Direct, providing a user-friendly interface to manage MCP servers. This powerful feature allows you to:

  • Browse Available Servers: Discover MCP servers for various integrations (GitHub, databases, monitoring tools, etc.)
  • Dynamic Management: Add or remove MCP servers on the fly, without restarting Nova
  • Visual Configuration: Configure server settings through an intuitive UI instead of editing config files
  • Real-time Status: Monitor the health and connection status of all active MCP servers
  • Secure Credentials: Store API keys and credentials securely within your local Nova instance

Key Features

Self-Hosted Marketplace

The entire marketplace runs locally on port 8000, ensuring your credentials never leave your infrastructure.

Dynamic Extension

Add capabilities like GitHub integration, database access, or monitoring tools without code changes.

Visual Management

User-friendly interface to browse, add, configure, and remove MCP servers with just a few clicks.

Instant Updates

Changes take effect immediately - Nova can use new tools as soon as you add them.

How MCP Servers Make Nova Your Platform Engineer

With MCP servers from the marketplace, Nova becomes an intelligent platform engineer focused on observability, monitoring, and incident management:

| MCP Server Combination | Platform Engineering Capabilities | Example Commands to Nova |
| --- | --- | --- |
| DataDog + Slack | Real-time monitoring alerts with team notifications | "Monitor our API error rates and alert the #platform-oncall channel if errors exceed 5%" |
| PagerDuty + Slack + Jira | Incident management and tracking | "Create a P1 incident for the database outage, page the DBA team, and create a Jira ticket for RCA" |
| Prometheus + Grafana | Metrics visualization and alerting | "Create a dashboard showing pod memory usage trends and set up alerts for OOM risks" |
| Sentry + Slack | Error tracking and notification | "Monitor JavaScript errors in production and notify #frontend-team of any new error patterns" |
| CloudWatch + SNS | AWS resource monitoring | "Set up alarms for RDS CPU usage and send SMS alerts to on-call when it exceeds 80%" |

Real Platform Engineering Scenarios

Production Incident Response

MCP Servers: DataDog + Kubernetes + PagerDuty + Slack

Nova can: "Check DataDog for errors, identify the failing pods, restart them, update PagerDuty incident, and notify team on Slack with RCA"

Using the Marketplace

Once Nova Direct is running, access the marketplace at http://localhost:10000 to:

  1. Browse available MCP servers
  2. Click "Add" on any server you want to install
  3. Configure the server with required credentials
  4. Save to activate the server
  5. Use the new capabilities immediately in Nova chat

Prerequisites

| Requirement | Details |
| --- | --- |
| Docker | Docker and Docker Compose installed on your system |
| Memory | 8GB+ RAM recommended (16GB+ preferred for optimal performance) |
| Disk Space | 20GB+ free disk space |
| Operating System | Linux, macOS, or Windows with WSL2 |
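Before starting the stack, it can help to confirm that the ports Nova Direct publishes (10000-10002 in this guide) are free. A minimal sketch, assuming bash (it relies on bash's /dev/tcp pseudo-device, so no extra tools are required):

```shell
# Return 0 if something is already listening on the given local port.
# A refused connection means the port is free.
port_in_use() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}

for port in 10000 10001 10002; do
  if port_in_use "$port"; then
    echo "Port $port is already in use; change the mapping in docker-compose.yml"
  fi
done
```

If any port is reported in use, either stop the conflicting service or remap the host port in docker-compose.yml before running the stack.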

Setup with Docker Compose

Use the Setup Wizard

  1. Navigate to the AI Agent section: Open the Astro Platform Dashboard and go to the AI Agent section
  2. Select Nova Direct: Click on the Nova Direct card
  3. Start the setup process: Click "Set up Nova with Docker" in the Nova Chat view
  4. Configure your instance: Enter your admin email and password, and add your AstroPulse API key
  5. Download the bundle: Click "Download Nova Platform Bundle (ZIP)"
  6. Extract and run: Extract the ZIP contents to a directory and run docker compose up -d

The downloaded ZIP file contains:

  • docker-compose.yml - Pre-configured Nova services
  • nova.env - Environment file with your credentials
  • README.md - Additional setup instructions
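For orientation, the bundled docker-compose.yml likely looks something like the sketch below. The service names and images are inferred from the startup output later in this guide; the container-side ports are assumptions, and the file in your downloaded bundle is authoritative:

```yaml
# Hypothetical sketch of the bundled compose file; your downloaded
# docker-compose.yml is the source of truth. Container ports are assumed.
services:
  nova-ui:
    image: ghcr.io/astropulseinc/nova-ui:latest
    env_file: nova.env
    ports:
      - "10001:8080"
  nova-backend:
    image: ghcr.io/astropulseinc/nova-backend:latest
    env_file: nova.env
  nova-proxy:
    image: ghcr.io/astropulseinc/nova-proxy:latest
    env_file: nova.env
```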

Getting Started

After downloading and extracting the Nova bundle:

Step 1: Start Nova Direct

Launch the Nova Direct services:

Docker Compose
$ docker compose up -d
Creating network nova-platform_default with the default driver
Pulling nova-ui (ghcr.io/astropulseinc/nova-ui:latest)...
Pulling nova-backend (ghcr.io/astropulseinc/nova-backend:latest)...
Pulling nova-proxy (ghcr.io/astropulseinc/nova-proxy:latest)...
Creating nova-backend ... done
Creating nova-proxy ... done
Creating nova-ui ... done

The first time you run this command, Docker will pull the necessary images and may take several minutes to complete depending on your internet connection.
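If you script the startup, you can wait for the UI to answer before opening the browser. A small helper along these lines works; the attempt count and 2-second interval are illustrative, and it assumes curl is installed:

```shell
# Poll a URL until it responds successfully or the attempts run out.
wait_for() {
  local url=$1 tries=${2:-30}
  for _ in $(seq "$tries"); do
    curl -fsS "$url" >/dev/null 2>&1 && return 0
    sleep 2
  done
  return 1
}
```

For example, `wait_for http://localhost:10001` returns once the chat UI responds, so you can chain it with opening your browser.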

Step 2: Access the Nova Chat Interface

Once the containers are running, you can access the Nova Direct interface at:

http://localhost:10001

Use the credentials you specified in the nova.env file to log in. If you used the setup wizard, these will be the email and password you entered during setup.
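For reference, the credential entries in nova.env look something like the following. The variable names and values here are placeholders, so treat the file generated by the setup wizard as authoritative:

```shell
# Illustrative nova.env entries - the exact variable names come from the
# setup wizard, so treat these keys and values as placeholders.
ADMIN_EMAIL=admin@example.com
ADMIN_PASSWORD=change-me
ASTROPULSE_API_KEY=your-api-key-here
```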

🌐 Alternative Access: After deploying with Docker Compose, you can also access the Nova UI through the AstroPulse console at https://astropulse.io/console/nova

Step 3: Access the MCP Marketplace

Nova Direct includes a built-in MCP Marketplace interface at:

http://localhost:10000

The MCP Marketplace allows you to:

  • Browse available MCP servers
  • Dynamically add new MCP servers to extend Nova's capabilities
  • Remove MCP servers you no longer need
  • Configure server-specific settings and credentials
  • See real-time status of connected servers

Step 4: Access the Kubernetes Analyzer

The Kubernetes Analyzer interface is available at:

http://localhost:10002

The K8s Analyzer provides:

  • Real-time cluster health monitoring
  • Pod and deployment analysis
  • Resource usage visualization
  • Automated troubleshooting recommendations
  • Integration with Nova AI for intelligent insights

Configuration Options

Data Persistence

The Docker Compose setup mounts data directories for persistence:

Directory Structure
~/nova-platform/
├── ui/        # Stores UI settings, conversations, and user data
├── backend/   # Stores backend services data
└── proxy/     # Stores proxy configuration and caches

This ensures you don't lose your data when restarting or upgrading the containers.
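In compose terms, this kind of persistence is typically a bind mount per service. A hypothetical fragment, with host paths taken from the directory listing above and container-side paths assumed:

```yaml
# Illustrative volume mappings; the container-side paths are assumptions,
# so copy the real ones from your bundled docker-compose.yml.
services:
  nova-ui:
    volumes:
      - ~/nova-platform/ui:/app/data
  nova-backend:
    volumes:
      - ~/nova-platform/backend:/app/data
  nova-proxy:
    volumes:
      - ~/nova-platform/proxy:/app/data
```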

Security Considerations

| Consideration | Recommendation |
| --- | --- |
| Admin Credentials | Change the default admin credentials immediately |
| API Keys | Use strong, unique passwords and API keys |
| Network Security | Consider limiting network access to the Nova ports |
| Updates | Keep your Docker installation and Nova images updated |
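One way to act on the network-security recommendation is to bind the published ports to the loopback interface so they are unreachable from other machines. A sketch of a compose override (the service name matches the startup output earlier in this guide, but the container-side port is an assumption):

```yaml
# docker-compose.override.yml - publish the chat UI on localhost only.
# The container port (8080) is an assumption; copy the real value from
# your bundled docker-compose.yml.
services:
  nova-ui:
    ports:
      - "127.0.0.1:10001:8080"
```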

Resource Requirements

Nova Direct runs efficiently on most systems, but performance depends on your hardware:

| Resource | Minimum | Recommended | Notes |
| --- | --- | --- | --- |
| CPU | 2 cores | 4+ cores | More cores improve response times, especially with local models |
| RAM | 8GB | 16GB+ | Local models require significant RAM; cloud models need less |
| Disk | 20GB | 50GB+ | Additional space needed if using local models |

💡 Resource Tip: Since Nova Direct runs on your local machine, ensure you have adequate resources available. For production use with local models, we recommend at least 16GB RAM and a modern multi-core CPU.

Troubleshooting

| Issue | Solution |
| --- | --- |
| Error starting userland proxy | Another service is using one of the required ports. Change the port mapping in docker-compose.yml |
| Connection refused | Check whether the containers are running with docker compose ps |
| Authentication errors | Verify your credentials in the nova.env file |
| Container stops unexpectedly | Check the logs with docker compose logs |
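For the "Error starting userland proxy" case, resolving the conflict means changing the host side of the affected port mapping in docker-compose.yml. A hypothetical example (the container-side port is an assumption; keep whatever your bundled file uses):

```yaml
# Example: move the chat UI from host port 10001 to 20001.
services:
  nova-ui:
    ports:
      - "20001:8080"
```

After editing, run docker compose up -d again to recreate the affected container with the new mapping.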

Updating Nova Direct

To update to the latest version of Nova Direct:

Terminal
$ docker compose pull
Pulling nova-ui ... done
Pulling nova-backend ... done
Pulling nova-proxy ... done
Terminal
$ docker compose down
[+] Running 4/4
 ✔ Container nova-ui       Removed   0.1s
 ✔ Container nova-proxy    Removed   0.7s
 ✔ Container nova-backend  Removed  10.2s
 ✔ Network nova-network    Removed   0.1s
Terminal
$ docker compose up -d
[+] Running 4/4
 ✔ Network nova-network    Created  0s
 ✔ Container nova-proxy    Started  3s
 ✔ Container nova-backend  Started  3s
 ✔ Container nova-ui       Started  4s

Advanced Usage

Custom Model Configurations

Via Nova UI

Navigate to /admin/functions in the Nova interface and click the settings icon next to each model to add your API key.

Via nova.env File

Add the appropriate environment variables for your preferred models (e.g., OPENAI_API_KEY, ANTHROPIC_API_KEY).
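For example, the nova.env additions for cloud providers would look like this (placeholder values; substitute your real keys):

```shell
# Placeholder values - substitute your real API keys.
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
```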

Model Requirements for Tool Calls

⚠️ Important: For optimal tool call support, we strongly recommend using Anthropic or OpenAI models over local models. Local models often have limited or inconsistent tool call capabilities.

Recommended Models for Tool Calls:

  • Anthropic Models (Claude 4 Sonnet, Claude 4.1 Opus) - Excellent tool call support
  • OpenAI Models (GPT-5, GPT-4) - Reliable tool call functionality
  • Google Gemini - Good tool call support

Local Model Considerations:

  • Local models may have limited tool call support
  • Performance depends heavily on your hardware resources
  • Ensure you have sufficient RAM (16GB+ recommended) and CPU cores
  • Install local models from Ollama library:
    • deepseek-r1:latest (recommended for local)
    • llama3.2:latest (basic tool support)
    • qwen2.5-coder:latest (code-focused tool support)
    • gpt-oss:latest (OpenAI's open-weight model)
    • Browse more models at https://ollama.com/library

Why Nova Direct is Easier

🎯 Everything in Your Browser - No Configuration Hassles!

Nova Direct revolutionizes infrastructure management by providing everything you need in a single browser interface:

  • Fully Automated Setup: Just run docker compose up -d and access http://localhost:10001
  • Integrated UI: All features, settings, and configurations are available directly in your browser
  • No Manual Configuration: Unlike editor integrations, you don't need to configure endpoints or manage config files
  • One-Click Model Management: Switch between AI models with a simple dropdown in the UI
  • Built-in MCP Marketplace: Add integrations visually without editing any configuration files
  • Real-time Updates: See changes instantly without restarting services

Kubernetes Analyzer - Automated & Ready:

  • The Kubernetes analyzer works out of the box
  • AI models are pre-configured and ready to use
  • Use local models from Ollama library for privacy
  • Switch between local and cloud models seamlessly through the UI
  • No need to access separate configuration endpoints
  • Everything is managed through the intuitive web interface

Perfect for Teams Who Want:

  • ✅ Quick setup without technical configuration
  • ✅ Browser-based access from anywhere
  • ✅ Visual management of all features
  • ✅ No dependency on specific editors or tools
  • ✅ Simplified onboarding for team members