Adaptive Scaling #11

@csmangum

Description

To scale the input dynamically and adaptively in your ContinuousEvolution project, you can consider several approaches:

  1. Adaptive Normalization:

    • Use techniques like Min-Max Scaling or Standardization, but update the scaling parameters periodically based on the most recent data.
    • Keep a rolling window of data to compute the scaling parameters. This way, the scaling adapts to the latest trends.
  2. Online Normalization:

    • Implement an online algorithm to update the mean and variance (or min and max) of the input data as new data arrives.
    • Algorithms like Welford's method can be used to update the mean and variance incrementally.
  3. Quantile Transformation:

    • Transform the data based on quantiles, which can adapt to the changing distribution of the data.
    • Use an online version of the QuantileTransformer to adjust to new data points.
  4. Z-Score Normalization with Decay:

    • Apply Z-Score normalization but introduce a decay factor to give more weight to recent data.
    • This method ensures that old data has less influence, allowing the scaling to adapt over time.
  5. Adaptive Binning:

    • Divide the data into bins and scale within each bin.
    • Update the bin boundaries periodically based on the distribution of the most recent data.
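Item 2 above can be made concrete with Welford's method, which updates the running mean and variance in a single pass over the stream. This is a sketch under my own naming (the `OnlineStandardizer` class and the epsilon in `transform` are illustrative choices, not part of the project):

```python
import numpy as np

class OnlineStandardizer:
    """Z-score scaling with mean/variance updated incrementally (Welford's method)."""

    def __init__(self, n_features):
        self.count = 0
        self.mean = np.zeros(n_features)
        self.m2 = np.zeros(n_features)  # running sum of squared deviations

    def update(self, x):
        # One Welford step per incoming sample.
        self.count += 1
        delta = x - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (x - self.mean)

    def transform(self, x):
        var = self.m2 / max(self.count - 1, 1)  # sample variance
        # Epsilon keeps the division stable for near-constant features.
        return (x - self.mean) / np.sqrt(var + 1e-6)

# Example usage: feed samples one at a time as they arrive.
standardizer = OnlineStandardizer(n_features=3)
for sample in np.random.randn(1000, 3):
    standardizer.update(sample)
scaled = standardizer.transform(np.random.randn(3))
```

Because each update is O(number of features) and stores no history, this works even when keeping a rolling window of raw data is too expensive.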

Here is a simple implementation of adaptive normalization using a rolling window approach in Python:

import numpy as np
from collections import deque

class AdaptiveScaler:
    """Min-max scaler whose parameters track a rolling window of recent data."""

    def __init__(self, window_size):
        self.window_size = window_size
        # Fixed-length window: the oldest samples are dropped automatically.
        self.data_window = deque(maxlen=window_size)
        self.min_val = None
        self.max_val = None

    def update(self, new_data):
        # Append each row of the new batch to the rolling window.
        self.data_window.extend(new_data)
        current_data = np.array(self.data_window)
        # Recompute per-feature min/max over the current window only.
        self.min_val = current_data.min(axis=0)
        self.max_val = current_data.max(axis=0)

    def transform(self, data):
        if self.min_val is None:
            raise RuntimeError("Call update() at least once before transform().")
        # Small epsilon avoids division by zero for constant features.
        return (data - self.min_val) / (self.max_val - self.min_val + 1e-6)

# Example usage:
scaler = AdaptiveScaler(window_size=100)
new_data = np.random.randn(10, 5)  # new batch: 10 samples, 5 features
scaler.update(new_data)
scaled_data = scaler.transform(new_data)

This approach updates the scaling parameters (min and max values) based on the most recent window_size data points, ensuring that the scaling adapts to changes in the input data over time.
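The same rolling-window idea extends to the quantile transformation from item 3. The sketch below is a simplified stand-in for an online QuantileTransformer, not an exact streaming-quantile algorithm; the `RollingQuantileScaler` name is my own:

```python
import numpy as np
from collections import deque

class RollingQuantileScaler:
    """Maps each value to its empirical quantile within a rolling window,
    approximating an online quantile transformation for a single feature."""

    def __init__(self, window_size):
        self.window = deque(maxlen=window_size)

    def update(self, values):
        # Add new observations; the oldest fall out of the window.
        self.window.extend(values)

    def transform(self, values):
        if not self.window:
            raise RuntimeError("Call update() before transform().")
        ref = np.sort(np.asarray(self.window))
        # Fraction of windowed samples <= each value, giving outputs in [0, 1].
        ranks = np.searchsorted(ref, values, side="right")
        return ranks / len(ref)

# Example usage:
qscaler = RollingQuantileScaler(window_size=500)
qscaler.update(np.random.randn(500))
uniformized = qscaler.transform(np.random.randn(100))  # values land in [0, 1]
```

Because the output is a rank within the recent window, this transform automatically tracks shifts in the data's distribution, at the cost of storing the window.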

Feel free to adjust the window_size or experiment with other techniques mentioned above to best suit your project's requirements.
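For completeness, the decay idea from item 4 can be sketched by replacing the rolling window with exponentially weighted moments. The `alpha` smoothing factor here is an assumed illustrative value, and the class name is my own:

```python
import numpy as np

class DecayingStandardizer:
    """Z-score scaling where mean and variance are exponential moving averages,
    so recent data dominates and old data's influence decays away."""

    def __init__(self, n_features, alpha=0.01):
        self.alpha = alpha  # higher alpha -> faster adaptation, noisier estimates
        self.mean = np.zeros(n_features)
        self.var = np.ones(n_features)

    def update(self, x):
        delta = x - self.mean
        # Exponentially weighted updates to the first and second moments.
        self.mean += self.alpha * delta
        self.var = (1 - self.alpha) * (self.var + self.alpha * delta**2)

    def transform(self, x):
        # Epsilon guards against division by zero when variance collapses.
        return (x - self.mean) / np.sqrt(self.var + 1e-6)

# Example usage:
dscaler = DecayingStandardizer(n_features=5, alpha=0.05)
for sample in np.random.randn(200, 5):
    dscaler.update(sample)
scaled = dscaler.transform(np.random.randn(5))
```

Like Welford's method this stores no history, but unlike it the estimates continually forget, which is the behavior you want when the input distribution drifts.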
