Frontend

Data Flow

Understanding how data flows through Studio's reactive architecture and component hierarchy.

Overview

Studio's data flow architecture manages state reactively across components while maintaining performance and type safety. Understanding these patterns is essential for building scalable analytics interfaces.

High-level data flow architecture in Studio

Reactive State Architecture

Studio uses Vue's Composition API to create reactive state that automatically updates the UI when data changes.

State Layers

Different layers of state management in Studio

1. Global State (Composables)

Shared state managed through composables for application-wide data.

// Module-scoped refs, so every caller of useWorkspaces() shares the same state
const workspaces = ref<Workspace[]>([])
const selectedWorkspaceId = ref<string | null>(null)
const loading = ref(false)

// Global workspace state
export function useWorkspaces() {
  // Computed derived state
  const selectedWorkspace = computed(() =>
    workspaces.value.find(w => w.id === selectedWorkspaceId.value)
  )

  // State mutations
  function selectWorkspace(id: string) {
    selectedWorkspaceId.value = id
    // Trigger side effects
    fetchEnvironments(id)
  }

  return {
    workspaces: readonly(workspaces),
    selectedWorkspaceId,
    selectedWorkspace,
    loading: readonly(loading),
    selectWorkspace
  }
}
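
Because the refs above live at module scope, every component that calls useWorkspaces() reads and mutates the same state. A brief usage sketch; the component names are illustrative:

// WorkspaceSwitcher.vue - changes the selection
const { workspaces, selectWorkspace } = useWorkspaces()

// WorkspaceHeader.vue - sees the same selection without any prop passing
const { selectedWorkspace } = useWorkspaces()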

2. Component State (Local Refs)

Local component state for UI-specific data and interactions.

// Component-local state
const activeTab = ref('basic')
const editMode = ref(false)
const formData = reactive({
  name: '',
  description: '',
  config: {}
})

// Local computed properties
const isFormValid = computed(() => 
  formData.name.length > 0 && !errors.value.length
)

// Local watchers for side effects
watch(activeTab, (newTab) => {
  // Tab-specific logic
  if (newTab === 'preview') {
    generatePreview()
  }
})

3. Derived State (Computed Properties)

Automatically calculated state based on other reactive data.

// Derived state examples
const filteredMetrics = computed(() => 
  metrics.value.filter(metric => 
    metric.name.toLowerCase().includes(searchQuery.value.toLowerCase()) &&
    (!selectedModel.value || metric.data_model_id === selectedModel.value.id)
  )
)

const chartConfig = computed(() => ({
  type: selectedChartType.value,
  data: processedData.value,
  options: {
    responsive: true,
    plugins: {
      legend: { display: showLegend.value }
    }
  }
}))

Component Communication Patterns

Studio implements several patterns for component communication and data sharing.

Props Down, Events Up

Props down, events up communication pattern

// Parent component
<template>
  <MetricBuilder
    :metric="currentMetric"
    :data-models="availableModels"
    :loading="isLoading"
    @save="handleMetricSave"
    @cancel="handleCancel"
    @validate="handleValidation"
  />
</template>

<script setup lang="ts">
const currentMetric = ref<SemanticMetric | null>(null)
const isLoading = ref(false)

function handleMetricSave(updatedMetric: SemanticMetric) {
  // Handle save logic
  updateMetric(updatedMetric)
}

function handleValidation(isValid: boolean) {
  // Handle validation state
  formValid.value = isValid
}
</script>

// Child component
<script setup lang="ts">
interface Props {
  metric: SemanticMetric | null
  dataModels: DataModel[]
  loading?: boolean
}

interface Emits {
  (e: 'save', metric: SemanticMetric): void
  (e: 'cancel'): void
  (e: 'validate', isValid: boolean): void
}

const props = defineProps<Props>()
const emit = defineEmits<Emits>()

function handleSave() {
  if (isValid.value) {
    emit('save', formData.value)
  }
}
</script>

Provide/Inject Pattern

For deeply nested component hierarchies and shared context.

Provide/Inject pattern for context sharing

// Root component provides context
import { provide } from 'vue'

// Dashboard context
const dashboardContext = reactive({
  dashboard: currentDashboard,
  selectedView: activeView,
  editMode: isEditing,
  executionResults: results
})

provide('dashboardContext', dashboardContext)

// Deep child component injects context
import { inject, toRefs } from 'vue'

const dashboardContext = inject('dashboardContext')

// Destructure through toRefs so the individual pieces stay reactive
const { dashboard, selectedView, editMode } = toRefs(dashboardContext)
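
String keys work, but they give up type inference on inject(). If stricter typing is wanted, Vue's InjectionKey can carry the context type from provider to consumer; a minimal sketch, where the DashboardContext interface and its field types are assumptions based on the context object above:

import type { InjectionKey } from 'vue'

interface DashboardContext {
  dashboard: Dashboard
  selectedView: DashboardView
  editMode: boolean
  executionResults: DashboardExecutionResult | null
}

// Shared key, typically exported from a small module that both sides import
export const dashboardContextKey: InjectionKey<DashboardContext> = Symbol('dashboardContext')

// Provider
provide(dashboardContextKey, dashboardContext)

// Consumer - inject() is now fully typed, with an explicit failure if no provider exists
const context = inject(dashboardContextKey)
if (!context) throw new Error('dashboardContext was not provided')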

Composable Sharing

Shared business logic through composables.

Sharing logic through composables across components

// Shared metric validation logic
export function useMetricValidation() {
  const errors = ref<ValidationError[]>([])
  const isValid = computed(() => errors.value.length === 0)

  function validateMetric(metric: SemanticMetric) {
    errors.value = []
    
    if (!metric.name?.trim()) {
      errors.value.push({ field: 'name', message: 'Name is required' })
    }
    
    if (!metric.measures?.length) {
      errors.value.push({ field: 'measures', message: 'At least one measure is required' })
    }
    
    return isValid.value
  }

  return {
    errors: readonly(errors),
    isValid,
    validateMetric
  }
}

// Used in multiple components
const { errors, isValid, validateMetric } = useMetricValidation()

API Data Flow

How data flows from APIs through composables to components.

Request Flow

API request flow from components through composables

// 1. Component triggers action
async function loadMetrics() {
  await fetchMetrics(selectedModel.value.id)
}

// 2. Composable handles API call
export function useMetrics() {
  const metrics = ref<SemanticMetric[]>([])
  const loading = ref(false)
  const error = ref<string | null>(null)

  async function fetchMetrics(modelId: string) {
    loading.value = true
    error.value = null
    
    try {
      // 3. API call with error handling
      const response = await $fetch<{metrics: SemanticMetric[]}>(
        apiUrl(`/api/v1/data-models/${modelId}/metrics`)
      )
      
      // 4. Update reactive state
      metrics.value = response.metrics
    } catch (err: any) {
      error.value = err.message
    } finally {
      loading.value = false
    }
  }

  return {
    metrics: readonly(metrics),
    loading: readonly(loading),
    error: readonly(error),
    fetchMetrics
  }
}

// 5. Component reacts to state changes
watchEffect(() => {
  if (metrics.value.length > 0) {
    // Update UI automatically
    updateMetricsList()
  }
})

Response Processing

How API responses are processed and normalized

// Response transformation pipeline
export function useMetrics() {
  async function fetchMetrics(modelId: string) {
    const response = await $fetch(apiUrl(`/api/v1/data-models/${modelId}/metrics`))
    
    // 1. Validate response structure
    const validatedResponse = MetricsResponseSchema.parse(response)
    
    // 2. Transform data for UI
    const transformedMetrics = validatedResponse.metrics.map(metric => ({
      ...metric,
      // Add computed properties
      displayName: metric.title || metric.name,
      statusVariant: getStatusVariant(metric.validation_status),
      lastUpdated: formatDate(metric.updated_at)
    }))
    
    // 3. Update reactive state
    metrics.value = transformedMetrics
    
    // 4. Cache for performance
    cacheMetrics(modelId, transformedMetrics)
  }
}
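
The pipeline above assumes a MetricsResponseSchema whose parse() call validates the raw response. If that validation is done with Zod (which the parse() style suggests), the schema might look roughly like this; the field list follows the metric properties used elsewhere on this page and is an assumption, not the real contract:

import { z } from 'zod'

const MetricSchema = z.object({
  id: z.string(),
  name: z.string(),
  title: z.string().optional(),
  data_model_id: z.string(),
  validation_status: z.string(),
  updated_at: z.string()
})

export const MetricsResponseSchema = z.object({
  metrics: z.array(MetricSchema)
})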

Form Data Flow

Complex configuration interfaces such as the metric builder rely on structured form data flow patterns.

Form State Management

Form state management and validation flow

// Form data flow in metric builder
export function useMetricForm() {
  // 1. Form schema definition
  const formSchema = reactive({
    name: '',
    description: '',
    table_name: '',
    measures: [] as Measure[],
    dimensions: [] as Dimension[],
    joins: [] as Join[],
    filters: [] as Filter[]
  })

  // 2. Validation state
  const { errors, validate } = useFormValidation(formSchema)

  // 3. Dirty state tracking
  const isDirty = ref(false)
  const originalData = ref({})

  // 4. Auto-save functionality
  const { save: autoSave } = useAutoSave(formSchema, {
    interval: 30000, // 30 seconds
    key: 'metric-draft'
  })

  // 5. Real-time validation
  watchDebounced(
    formSchema,
    () => {
      validate()
      isDirty.value = hasChanges(formSchema, originalData.value)
    },
    { debounce: 500, deep: true }
  )

  return {
    formSchema,
    errors,
    isDirty,
    validate,
    save: autoSave
  }
}
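
The useAutoSave helper referenced above is not shown here. A minimal sketch of what it could look like, assuming drafts are persisted to localStorage under the given key; the real implementation may differ:

import { onUnmounted } from 'vue'

interface AutoSaveOptions {
  interval: number // milliseconds between saves
  key: string      // storage key for the draft
}

export function useAutoSave<T extends object>(form: T, options: AutoSaveOptions) {
  function save() {
    localStorage.setItem(options.key, JSON.stringify(form))
  }

  function restore(): Partial<T> | null {
    const draft = localStorage.getItem(options.key)
    return draft ? (JSON.parse(draft) as Partial<T>) : null
  }

  // Save on a fixed interval and stop when the owning component unmounts
  const timer = setInterval(save, options.interval)
  onUnmounted(() => clearInterval(timer))

  return { save, restore }
}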

Builder Component Flow

How builder components share and synchronize form state

// MetricSchemaBuilder.vue - Parent builder
const schema = reactive({
  basic: { name: '', description: '' },
  measures: [],
  dimensions: [],
  joins: []
})

// In the template, child builders receive slices of the schema via v-model
<BasicInfoBuilder v-model="schema.basic" />
<MeasuresBuilder v-model="schema.measures" />
<DimensionsBuilder v-model="schema.dimensions" />
<JoinsBuilder v-model="schema.joins" />

// Child component (MeasuresBuilder.vue)
interface Props {
  modelValue: Measure[]
}

interface Emits {
  (e: 'update:modelValue', value: Measure[]): void
}

const props = defineProps<Props>()
const emit = defineEmits<Emits>()

// Local working copy
const localMeasures = ref([...props.modelValue])

// Sync changes back to parent
watch(localMeasures, (newMeasures) => {
  emit('update:modelValue', newMeasures)
}, { deep: true })

// Handle external changes
watch(() => props.modelValue, (newValue) => {
  localMeasures.value = [...newValue]
}, { deep: true })

Chart Data Flow

How data flows from metrics to visualizations through the chart system.

Data Processing Pipeline

Data processing pipeline from raw data to chart visualization

// Chart data flow
export function useChartData() {
  const rawData = ref<any[]>([])
  const chartConfig = ref<Partial<ChartConfig>>({})

  // 1. Data transformation
  const transformedData = computed(() => {
    if (!rawData.value.length) return []
    
    return rawData.value.map(row => ({
      ...row,
      // Type conversions
      value: Number(row.value) || 0,
      date: new Date(row.date),
      // Formatting
      displayValue: formatNumber(row.value),
      displayDate: formatDate(row.date)
    }))
  })

  // 2. Chart-specific processing
  const chartData = computed(() => {
    const { type, xField, yFields, groupField } = chartConfig.value
    
    switch (type) {
      case 'bar':
        return processBarChartData(transformedData.value, xField, yFields)
      case 'line':
        return processLineChartData(transformedData.value, xField, yFields)
      case 'pie':
        return processPieChartData(transformedData.value, groupField, yFields[0])
      default:
        return transformedData.value
    }
  })

  // 3. ECharts option generation
  const echartsOptions = computed(() => {
    return generateEChartsOptions({
      type: chartConfig.value.type,
      data: chartData.value,
      theme: isDark.value ? 'dark' : 'light',
      responsive: true,
      ...chartConfig.value.options
    })
  })

  return {
    rawData,
    transformedData,
    chartData,
    echartsOptions,
    updateData: (data: any[]) => { rawData.value = data },
    updateConfig: (config: ChartConfig) => { chartConfig.value = config }
  }
}

Real-time Updates

Real-time data updates and chart synchronization

// Real-time chart updates
export function useRealtimeChart() {
  const { chartData, updateData, updateConfig } = useChartData()
  const wsConnection = ref<WebSocket | null>(null)

  function connectRealtime(metricId: string) {
    wsConnection.value = new WebSocket(
      wsUrl(`/api/v1/metrics/${metricId}/realtime`)
    )

    wsConnection.value.onmessage = (event) => {
      const update = JSON.parse(event.data)
      
      switch (update.type) {
        case 'data_update':
          // Update chart data reactively
          updateData(update.data)
          break
        case 'config_update':
          // Update chart configuration
          updateConfig(update.config)
          break
      }
    }
  }

  return {
    chartData,
    connectRealtime,
    disconnect: () => wsConnection.value?.close()
  }
}

Dashboard Data Flow

Complex data orchestration for dashboard execution and widget management.

Dashboard Execution Flow

Dashboard execution and widget data flow

// Dashboard execution orchestration
export function useDashboardExecution() {
  const executionResults = ref<DashboardExecutionResult | null>(null)
  const widgetResults = ref<Map<string, WidgetExecutionResult>>(new Map())
  const executionStatus = ref<'idle' | 'executing' | 'complete' | 'error'>('idle')

  async function executeDashboard(dashboard: Dashboard, viewId?: string) {
    executionStatus.value = 'executing'
    
    try {
      // 1. Execute all widgets in parallel
      const view = dashboard.views.find(v => v.alias === viewId) || dashboard.views[0]
      const widgets = view.sections.flatMap(s => s.widgets)
      
      const widgetPromises = widgets.map(widget => 
        executeWidget(dashboard.id, view.alias, widget.alias)
      )
      
      // 2. Wait for all widgets to complete
      const results = await Promise.allSettled(widgetPromises)
      
      // 3. Process results
      results.forEach((result, index) => {
        const widget = widgets[index]
        if (result.status === 'fulfilled') {
          widgetResults.value.set(widget.alias, result.value)
        } else {
          // Handle widget execution errors
          console.error(`Widget ${widget.alias} failed:`, result.reason)
        }
      })
      
      executionStatus.value = 'complete'
    } catch (error) {
      executionStatus.value = 'error'
      throw error
    }
  }

  return {
    executionResults: readonly(executionResults),
    widgetResults: readonly(widgetResults),
    executionStatus: readonly(executionStatus),
    executeDashboard
  }
}
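
The executeWidget call above is not defined in this snippet. A hypothetical sketch, following the $fetch/apiUrl style used elsewhere on this page; the endpoint path and response type are assumptions:

async function executeWidget(
  dashboardId: string,
  viewAlias: string,
  widgetAlias: string
): Promise<WidgetExecutionResult> {
  // Hypothetical execution endpoint - adjust to the actual API
  return await $fetch<WidgetExecutionResult>(
    apiUrl(`/api/v1/dashboards/${dashboardId}/views/${viewAlias}/widgets/${widgetAlias}/execute`),
    { method: 'POST' }
  )
}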

Widget Data Synchronization

Widget data synchronization and update patterns

// Widget data synchronization
export function useWidgetData() {
  const widgetData = ref<Map<string, any>>(new Map())
  const loadingWidgets = ref<Set<string>>(new Set())
  const errorWidgets = ref<Map<string, string>>(new Map())

  function updateWidgetData(widgetId: string, data: any) {
    widgetData.value.set(widgetId, data)
    loadingWidgets.value.delete(widgetId)
    errorWidgets.value.delete(widgetId)
  }

  function setWidgetLoading(widgetId: string, loading: boolean) {
    if (loading) {
      loadingWidgets.value.add(widgetId)
    } else {
      loadingWidgets.value.delete(widgetId)
    }
  }

  function setWidgetError(widgetId: string, error: string) {
    errorWidgets.value.set(widgetId, error)
    loadingWidgets.value.delete(widgetId)
  }

  // Reactive getters for components
  const getWidgetData = (widgetId: string) => 
    computed(() => widgetData.value.get(widgetId))
  
  const isWidgetLoading = (widgetId: string) => 
    computed(() => loadingWidgets.value.has(widgetId))
  
  const getWidgetError = (widgetId: string) => 
    computed(() => errorWidgets.value.get(widgetId))

  return {
    updateWidgetData,
    setWidgetLoading,
    setWidgetError,
    getWidgetData,
    isWidgetLoading,
    getWidgetError
  }
}

Performance Optimization

Studio implements several patterns to optimize data flow performance.

Computed Caching

Computed property caching and dependency tracking

// Expensive computations with caching
const expensiveComputation = computed(() => {
  // This only re-runs when dependencies change
  return complexDataProcessing(
    rawData.value,
    filters.value,
    aggregations.value
  )
})

// Shallow reactive for large arrays
const largeDataset = shallowRef([])

// Computed with custom caching (getCacheKey derives a stable key from the inputs)
const cache = new Map()

const processedData = computed(() => {
  const cacheKey = getCacheKey(filters.value, aggregations.value)
  
  if (cache.has(cacheKey)) {
    return cache.get(cacheKey)
  }
  
  const result = processLargeDataset(largeDataset.value)
  cache.set(cacheKey, result)
  return result
})

Debounced Updates

Debouncing expensive operations and API calls

// Debounced search (refDebounced is the VueUse debounced-ref helper)
const searchQuery = ref('')
const debouncedSearch = refDebounced(searchQuery, 300)

watchEffect(() => {
  if (debouncedSearch.value) {
    performSearch(debouncedSearch.value)
  }
})

// Debounced auto-save (useDebounceFn is the VueUse debounced-function helper)
const formData = reactive({})
const debouncedSave = useDebounceFn(() => {
  autoSave(formData)
}, 2000)

watch(formData, debouncedSave, { deep: true })

Virtual Scrolling

Virtual scrolling for large datasets

// Virtual list for large metric collections
const { list, containerProps, wrapperProps } = useVirtualList(
  metrics,
  {
    itemHeight: 60,
    overscan: 5,
  }
)
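
containerProps and wrapperProps are meant to be bound onto the scroll container and the inner wrapper, with each row sized to match itemHeight. A minimal template sketch; the row markup is illustrative:

<template>
  <div v-bind="containerProps" style="height: 300px; overflow: auto">
    <div v-bind="wrapperProps">
      <div v-for="{ data, index } in list" :key="index" style="height: 60px">
        {{ data.name }}
      </div>
    </div>
  </div>
</template>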

Error Boundaries

How Studio handles errors in the data flow.

Error Propagation

Error handling and recovery patterns

// Error boundary composable
export function useErrorBoundary() {
  const errors = ref<Map<string, Error>>(new Map())
  const hasErrors = computed(() => errors.value.size > 0)

  function handleError(context: string, error: Error) {
    console.error(`Error in ${context}:`, error)
    errors.value.set(context, error)
    
    // Report to error tracking
    reportError(error, { context })
  }

  function clearError(context: string) {
    errors.value.delete(context)
  }

  function retry(context: string, retryFn: () => Promise<void>) {
    clearError(context)
    return retryFn().catch(error => handleError(context, error))
  }

  return {
    errors: readonly(errors),
    hasErrors,
    handleError,
    clearError,
    retry
  }
}

Best Practices

Data Flow Design

  1. Unidirectional Flow: Maintain clear data flow direction from parent to child
  2. Single Source of Truth: Keep state in one place and derive other values
  3. Reactive Updates: Use computed properties for derived state
  4. Error Boundaries: Implement proper error handling at each layer
  5. Performance: Optimize expensive operations with caching and debouncing

State Management

  1. Local First: Keep state as local as possible to the components that need it
  2. Composable Abstraction: Extract complex state logic to reusable composables
  3. Type Safety: Maintain strong typing throughout the data flow
  4. Immutability: Prefer immutable updates for predictable state changes
  5. Cleanup: Properly clean up watchers and subscriptions, as shown in the sketch below
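
Watchers created during component setup are stopped automatically when the component unmounts, but manually created watchers and non-Vue resources need explicit teardown. A brief sketch reusing names from earlier snippets on this page:

import { onUnmounted, watch } from 'vue'

// watch() returns a stop handle that can be called manually
const stop = watch(selectedWorkspaceId, (id) => {
  if (id) fetchEnvironments(id)
})

// Timers, sockets, and event listeners are tied to the component lifecycle explicitly
const refreshTimer = setInterval(() => fetchMetrics(selectedModel.value.id), 60_000)

onUnmounted(() => {
  stop()
  clearInterval(refreshTimer)
})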

API Integration

  1. Centralized Logic: Keep API calls in composables rather than components
  2. Error Handling: Implement comprehensive error states and recovery
  3. Loading States: Provide visual feedback during async operations
  4. Caching: Cache responses appropriately to reduce network requests
  5. Optimistic Updates: Use optimistic updates for better perceived performance, as in the sketch below
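
An optimistic update applies the change to local state first and rolls back if the request fails. A minimal sketch using the metric shape and the error ref from the metrics composable shown earlier; the PATCH endpoint is an assumption:

async function renameMetric(metric: SemanticMetric, newName: string) {
  const previousName = metric.name

  // 1. Update local state immediately for instant feedback
  metric.name = newName

  try {
    // 2. Persist in the background (endpoint path is illustrative)
    await $fetch(apiUrl(`/api/v1/metrics/${metric.id}`), {
      method: 'PATCH',
      body: { name: newName }
    })
  } catch (err) {
    // 3. Roll back and surface the error on failure
    metric.name = previousName
    error.value = err instanceof Error ? err.message : 'Failed to rename metric'
  }
}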

Next Steps

Understanding Studio's data flow patterns provides the foundation for building responsive, performant analytics interfaces that handle complex state management gracefully.