Getting Started

Installation

Install and set up the Cortex analytics platform.

Quick Installation

Install Cortex using your preferred package manager:

pip install "telescope-cortex[api]"

Requirements

  • Python: 3.12+ (required)
  • Operating System: Linux, macOS, or Windows
  • Database: PostgreSQL, MySQL, SQLite

Installation Options

Core Package Only

Install just the semantic layer and core functionality:

pip install telescope-cortex

Full Package with API

Install with API server capabilities:

pip install "telescope-cortex[api]"

Verify Installation

After installation, verify that Cortex is working:

python -c "import cortex; print('Cortex installed successfully!')"

Environment Setup

Database

Cortex requires a SQL database for storing metadata. Configure using environment variables:

# Set database configuration
export CORTEX_DB_TYPE="postgresql"          # Database type
export CORTEX_DB_USERNAME="username"        # Database username
export CORTEX_DB_PASSWORD="password"        # Database password
export CORTEX_DB_HOST="localhost"           # Database host
export CORTEX_DB_PORT="5432"                # Database port
export CORTEX_DB_NAME="cortex"              # Database name
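You can sanity-check these variables from Python before starting the server. A minimal sketch: the SQLAlchemy-style URL format below is an illustrative assumption for demonstration, not necessarily how Cortex represents the connection internally.

```python
import os

def build_db_url(env=os.environ) -> str:
    """Assemble a database URL from the CORTEX_DB_* variables.

    Defaults match the example values above; the URL shape is an
    assumption for illustration, not Cortex's internal format.
    """
    db_type = env.get("CORTEX_DB_TYPE", "postgresql")
    user = env.get("CORTEX_DB_USERNAME", "username")
    password = env.get("CORTEX_DB_PASSWORD", "password")
    host = env.get("CORTEX_DB_HOST", "localhost")
    port = env.get("CORTEX_DB_PORT", "5432")
    name = env.get("CORTEX_DB_NAME", "cortex")
    return f"{db_type}://{user}:{password}@{host}:{port}/{name}"

print(build_db_url())
```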

Supported Database Types:

  • postgresql - PostgreSQL (recommended for production)
  • mysql - MySQL (recommended for production)
  • sqlite - SQLite (development only)

Development vs Production:

  • Development: SQLite is supported for quick setup and testing
  • Production: PostgreSQL or MySQL are strongly recommended for better performance and reliability

For more information about storage configuration and database options, see the Storage Documentation.

SQLite In-Memory Mode (Development Only)

For quick testing and development, you can run Cortex entirely in memory using SQLite:

# Enable in-memory mode
export CORTEX_DB_MEMORY="true"
export CORTEX_DB_TYPE="sqlite"

# Start the API server
python -m cortex.api

Important Notes:

  • ⚠️ Data is not persisted - all data is lost when the server stops
  • ⚠️ Not recommended for production - use PostgreSQL or MySQL instead
  • Perfect for development - quick setup without database installation
  • Testing - ideal for running tests and experiments

API Server (Optional)

If you installed with API support, you can start the API server:

# Use this command when Cortex is installed via pip or poetry
python -m cortex.api

The command starts the API server on http://0.0.0.0:9002 by default.

Next Steps

Quick Start

Get up and running with Cortex in minutes by following these essential steps.

Prerequisites

  • Cortex installed (pip install "telescope-cortex[api]" or poetry add "telescope-cortex[api]")
  • PostgreSQL database running
  • Sample data source (PostgreSQL, MySQL, or BigQuery)

Step 1: Installation

# Install Cortex with API support
pip install "telescope-cortex[api]"

# Verify installation
python -c "import cortex; print('Cortex installed successfully!')"

Start Studio

Start the Cortex API server and Studio interface:

# Set database configuration
export CORTEX_DB_TYPE="postgresql"
export CORTEX_DB_USERNAME="username"
export CORTEX_DB_PASSWORD="password"
export CORTEX_DB_HOST="localhost"
export CORTEX_DB_PORT="5432"
export CORTEX_DB_NAME="cortex"

# Start the API server
python -m cortex.api

# API will be available at http://localhost:9002

Studio Configuration:

Make sure Studio is configured with the correct API base URL via its environment variables:

export API_BASE_URL="http://localhost:9002"

Step 2: Configure Workspaces & Environments

Create New Workspace

Studio Method: Creating a new workspace in the Studio interface

  1. Open Studio at http://localhost:3000
  2. Click "Create Workspace"
  3. Enter workspace name and description
  4. Click "Create"

API Method:

curl -X POST "http://localhost:9002/api/v1/workspaces" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "My Analytics Workspace",
    "description": "Main workspace for analytics"
  }'

Example Response:

{
  "id": "550e8400-e29b-41d4-a716-446655440000",
  "name": "My Analytics Workspace", 
  "description": "Main workspace for analytics",
  "created_at": "2024-01-15T10:30:00Z",
  "updated_at": "2024-01-15T10:30:00Z"
}
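The same call can be made from Python with the standard library. A sketch mirroring the curl example above (the endpoint and payload fields are taken from that example; `create_workspace` is a hypothetical helper, not part of the Cortex package):

```python
import json
import urllib.request

def workspace_payload(name: str, description: str) -> bytes:
    """Encode the request body exactly as in the curl example above."""
    return json.dumps({"name": name, "description": description}).encode("utf-8")

def create_workspace(base_url: str, name: str, description: str) -> dict:
    """POST a new workspace and return the decoded JSON response."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/workspaces",
        data=workspace_payload(name, description),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Requires a running API server:
# ws = create_workspace("http://localhost:9002",
#                       "My Analytics Workspace",
#                       "Main workspace for analytics")
```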

Create New Environment

Studio Method: Setting up development and production environments

  1. Select your workspace
  2. Click "Create Environment"
  3. Choose environment type (Development/Production)
  4. Enter environment details
  5. Click "Create"

API Method:

curl -X POST "http://localhost:9002/api/v1/environments" \
  -H "Content-Type: application/json" \
  -d '{
    "workspace_id": "550e8400-e29b-41d4-a716-446655440000",
    "name": "Development",
    "description": "Development environment"
  }'

Example Response:

{
  "id": "123e4567-e89b-12d3-a456-426614174000",
  "workspace_id": "550e8400-e29b-41d4-a716-446655440000",
  "name": "Development",
  "description": "Development environment",
  "created_at": "2024-01-15T10:35:00Z",
  "updated_at": "2024-01-15T10:35:00Z"
}

Step 3: Add Data Sources

Add Data Source

Studio Method: Connecting to your database through Studio

  1. Navigate to "Data Sources" in Studio
  2. Click "Add Data Source"
  3. Select database type (PostgreSQL, MySQL, or BigQuery)
  4. Enter connection details
  5. Test connection
  6. Save data source

API Method:

For PostgreSQL:

curl -X POST "http://localhost:9002/api/v1/data-sources" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Sales Database",
    "description": "Primary sales data",
    "type": "postgresql",
    "connection_details": {
    "host": "localhost",
      "port": 5432,
      "database": "sales_db",
      "username": "sales_user",
    "password": "secure_password"
  }
  }'

For BigQuery:

curl -X POST "http://localhost:9002/api/v1/data-sources" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Analytics BigQuery",
    "description": "BigQuery analytics data",
    "type": "bigquery", 
    "connection_details": {
      "project_id": "my-analytics-project",
      "dataset_id": "sales_data",
      "service_account_key": "{ ... service account JSON ... }"
    }
  }'

Example Response:

{
  "id": "987f6543-21ba-98dc-76fe-543210987654",
  "name": "Sales Database",
  "type": "postgresql",
  "environment_id": "123e4567-e89b-12d3-a456-426614174000",
  "status": "connected",
  "created_at": "2024-01-15T10:40:00Z"
}
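Connection details differ per source type, so it can help to validate a payload client-side before POSTing it. A sketch: the required-field sets below are inferred from the curl examples above, not from a published Cortex schema.

```python
# Fields each source type needs, as seen in the examples above (assumed).
REQUIRED_FIELDS = {
    "postgresql": {"host", "port", "database", "username", "password"},
    "mysql": {"host", "port", "database", "username", "password"},
    "bigquery": {"project_id", "dataset_id", "service_account_key"},
}

def validate_data_source(source: dict) -> None:
    """Raise ValueError if the payload is missing connection fields."""
    stype = source.get("type")
    if stype not in REQUIRED_FIELDS:
        raise ValueError(f"unsupported type: {stype!r}")
    missing = REQUIRED_FIELDS[stype] - set(source.get("connection_details", {}))
    if missing:
        raise ValueError(f"missing connection_details fields: {sorted(missing)}")
```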

Step 4: Add Data Models

Create New Data Model

Studio Method: Creating a data model to organize your metrics

  1. Go to "Data Models" section
  2. Click "Create Data Model"
  3. Enter model name and description
  4. Save model

API Method:

curl -X POST "http://localhost:9002/api/v1/data-models" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Sales Analytics",
    "description": "Sales performance metrics and KPIs", 
    "version": "1.0.0"
  }'

Example Response:

{
  "id": "456d78ef-90ab-12cd-34ef-567890123456",
  "name": "Sales Analytics",
  "description": "Sales performance metrics and KPIs",
  "environment_id": "123e4567-e89b-12d3-a456-426614174000", 
  "version": "1.0.0",
  "created_at": "2024-01-15T10:45:00Z"
}

Step 5: Add Metrics

Create New Metric

Studio Method: Using the metric builder in Studio

  1. Navigate to your data model
  2. Click "Add Metric"
  3. Use the metric builder to configure:
    • Basic information (name, description)
    • Measures (quantitative data)
    • Dimensions (categorical data)
    • Filters (optional)
    • Joins (for cross-table references)

API Method:

curl -X POST "http://localhost:9002/api/v1/metrics" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Monthly Revenue",
    "description": "Total revenue by month",
  "table_name": "sales",
    "measures": [
      {
        "name": "total_revenue",
        "query": "SUM(amount)",
        "type": "number",
        "format": "currency"
      }
    ],
    "dimensions": [
      {
        "name": "month",
        "query": "DATE_TRUNC('\''month'\'', sale_date)",
        "type": "date"
      }
    ]
  }'

Configure Measures

Studio Method: Setting up quantitative measures in the metric builder

Use the measures builder to define:

  • Measure name and calculation
  • Data type and formatting
  • Aggregation method

API Example:

{
  "measures": [
    {
      "name": "total_revenue", 
      "query": "SUM(amount)",
      "type": "number",
      "format": "currency",
      "description": "Total sales revenue"
    },
    {
      "name": "avg_order_value",
      "query": "AVG(amount)", 
      "type": "number",
      "format": "currency",
      "description": "Average order value"
    }
  ]
}

Configure Dimensions

Studio Method: Setting up categorical dimensions for data grouping

Use the dimensions builder to define:

  • Dimension name and field
  • Data type and formatting
  • Grouping behavior

API Example:

{
  "dimensions": [
    {
      "name": "month",
      "query": "DATE_TRUNC('month', sale_date)",
      "type": "date",
      "description": "Sales month"
    },
    {
      "name": "product_category",
      "query": "category",
      "type": "string", 
      "description": "Product category"
    }
  ]
}
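The measure and dimension fragments above plug into a single metric payload. A minimal sketch of assembling one programmatically: field names mirror the curl examples, not an authoritative schema, and the uniqueness check is an illustrative safeguard.

```python
def build_metric(name, description, table_name, measures, dimensions,
                 filters=None, joins=None):
    """Assemble a metric payload in the shape used by the API examples above."""
    fields = [m["name"] for m in measures] + [d["name"] for d in dimensions]
    if len(fields) != len(set(fields)):
        raise ValueError("measure and dimension names must be unique")
    metric = {
        "name": name,
        "description": description,
        "table_name": table_name,
        "measures": measures,
        "dimensions": dimensions,
    }
    # Filters and joins are optional, as in the sections that follow.
    if filters:
        metric["filters"] = filters
    if joins:
        metric["joins"] = joins
    return metric
```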

Add Filters (Optional)

Studio Method: Setting up optional filters for data subsetting

API Example:

{
  "filters": [
    {
      "name": "date_range",
      "query": "sale_date",
      "operator": "between",
      "type": "date",
      "description": "Date range filter"
    },
    {
      "name": "active_customers",
      "query": "customer_status", 
      "operator": "equals",
      "value": "active",
      "type": "string"
    }
  ]
}

Add Joins (For Cross-Table References)

Studio Method: Configuring table joins for cross-table metrics

API Example:

{
  "joins": [
    {
      "name": "customer_join",
      "left_table": "sales",
      "right_table": "customers", 
      "join_type": "inner",
      "conditions": [
        {
          "left_field": "customer_id",
          "right_field": "id",
          "operator": "equals"
        }
      ]
    }
  ]
}
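To see how the join fields map onto SQL, here is a sketch that renders one join entry as a JOIN clause. Illustrative only: Cortex generates its SQL internally, and the operator mapping below is an assumption based on the example.

```python
# Assumed mapping from the "operator" field to SQL comparison operators.
OPERATORS = {"equals": "="}

def join_clause(join: dict) -> str:
    """Render one join entry (shape as in the example above) as SQL."""
    conds = " AND ".join(
        f"{join['left_table']}.{c['left_field']} "
        f"{OPERATORS[c['operator']]} "
        f"{join['right_table']}.{c['right_field']}"
        for c in join["conditions"]
    )
    return f"{join['join_type'].upper()} JOIN {join['right_table']} ON {conds}"
```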

Execute the Metric

Studio Method: Testing metric execution and viewing results

  1. Click "Preview" in the metric builder
  2. Adjust parameters if needed
  3. View results in charts and tables
  4. Save metric when satisfied

API Method:

curl -X POST "http://localhost:9002/api/v1/metrics/567e89ab-12cd-34ef-9abc-789012345678/execute" \
  -H "Content-Type: application/json" \
  -d '{
    "filters": {
      "date_range": {
        "start": "2024-01-01",
        "end": "2024-12-31"
      }
    },
    "limit": 100
  }'

Example Response:

{
  "data": [
    {
      "month": "2024-01-01",
      "total_revenue": 125000.50,
      "avg_order_value": 85.25
    },
    {
      "month": "2024-02-01", 
      "total_revenue": 142000.75,
      "avg_order_value": 92.15
    }
  ],
  "metadata": {
    "total_rows": 12,
    "execution_time_ms": 245,
    "query": "SELECT DATE_TRUNC('month', sale_date) as month, SUM(amount) as total_revenue, AVG(amount) as avg_order_value FROM sales WHERE sale_date BETWEEN '2024-01-01' AND '2024-12-31' GROUP BY month ORDER BY month"
  }
}
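The response rows can be consumed like any JSON payload. For example, aggregating over the `data` array of the response shown above:

```python
# The execution response from the example above, as a Python dict.
response = {
    "data": [
        {"month": "2024-01-01", "total_revenue": 125000.50, "avg_order_value": 85.25},
        {"month": "2024-02-01", "total_revenue": 142000.75, "avg_order_value": 92.15},
    ],
    "metadata": {"total_rows": 12, "execution_time_ms": 245},
}

# Sum a measure across all returned rows.
total = sum(row["total_revenue"] for row in response["data"])
print(f"Revenue across {len(response['data'])} months: {total:.2f}")
# → Revenue across 2 months: 267001.25
```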

Supported Databases

Cortex currently supports the following SQL databases:

  • PostgreSQL - Full support with advanced features
  • MySQL - Complete SQL operations and schema introspection
  • BigQuery - Google Cloud BigQuery integration
  • SQLite - Local development and testing

Note: NoSQL database support is planned for future releases.

Next Steps

Now that you have Cortex set up with your first metric:

  1. Explore Studio: Learn more about the Studio interface for visual configuration
  2. API Reference: Review the API documentation for programmatic access
  3. Semantic Layer: Understand semantic configuration options
  4. Architecture: Learn about Cortex's architecture

Getting Help

If you encounter issues: