jjfantini opened 5 months ago
```sql
-- Symbols table (updated)
CREATE TABLE symbols (
symbol_id SERIAL PRIMARY KEY,
symbol VARCHAR(10) UNIQUE NOT NULL
);
-- Indicators table (renamed from data_points)
CREATE TABLE indicators (
indicator_id SERIAL PRIMARY KEY,
name VARCHAR(50) UNIQUE NOT NULL
);
-- Logic conditions table
CREATE TABLE logic_conditions (
logic_id SERIAL PRIMARY KEY,
condition VARCHAR(50) UNIQUE NOT NULL
);
-- Actions table
CREATE TABLE actions (
action_id SERIAL PRIMARY KEY,
name VARCHAR(20) UNIQUE NOT NULL
);
-- Alerts table (updated to reference indicators)
CREATE TABLE alerts (
alert_id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL,
symbol_id INTEGER NOT NULL,
indicator_id INTEGER NOT NULL,
logic_id INTEGER NOT NULL,
value DECIMAL(18, 6) NOT NULL,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (user_id) REFERENCES auth.users(id),
FOREIGN KEY (symbol_id) REFERENCES symbols(symbol_id),
FOREIGN KEY (indicator_id) REFERENCES indicators(indicator_id),
FOREIGN KEY (logic_id) REFERENCES logic_conditions(logic_id)
);
-- Alert actions table (for multiple actions per alert)
CREATE TABLE alert_actions (
alert_id UUID,
action_id INTEGER,
PRIMARY KEY (alert_id, action_id),
FOREIGN KEY (alert_id) REFERENCES alerts(alert_id),
FOREIGN KEY (action_id) REFERENCES actions(action_id)
);
-- Alert history table
CREATE TABLE alert_history (
history_id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
alert_id UUID NOT NULL,
fired_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (alert_id) REFERENCES alerts(alert_id)
);
-- Indexes
CREATE INDEX idx_alerts_user_id ON alerts(user_id);
CREATE INDEX idx_alert_history_alert_id ON alert_history(alert_id);
```
This graph is a 🚧 WORK IN PROGRESS 🚧
```mermaid
graph TD
subgraph User Interface
A[User creates alert]
end
subgraph Database
B[(public.alerts)]
C[(active_tracking_alerts)]
D[(alerts_history)]
end
subgraph Alert System
E[create_alert_tracker]
F[Add to alert_tracker logic]
G[Handle INSERT, UPDATE, DELETE]
H[Run every 30 mins from 9am - 4pm ET]
I[Create tracker for each indicator]
J[Collects & runs all alert trackers in active_tracking_alerts]
end
subgraph Backend
K[celery/FastAPI apscheduler with background tasks]
end
A -->|saves| B
E -->|scrapes table| B
E -->|save new alert to| C
E -->|add to| F
F --> G
G --> H
H --> I
I --> J
J -->|fires alert & saves to| D
J --> K
K -->|Notifies User| A
```
I have thought about using FastAPI (with cron jobs) or Celery for distributed work to run an `alert_tracker()` Python function that scrapes the `public.alerts` table and collects all new alerts made by users. This function will read the alert's input parameters and create the logic to fetch those data points and compare the data values. This would happen every 30 minutes from 9am-4pm ET. Once the function creates an alert, it adds it to the list of alerts whose fetch-and-compare logic runs inside the scheduled function. To keep track of the alerts that should be included in the cron job, their parameters should be saved in a `public.active_tracking_alerts` table, perhaps also hashing the parameter combinations to identify unique ones (or something similar to the same effect). The reason is so that the function scraping the `public.alerts` table to create the logic function (a function factory) can see if it already has an alert running with the same parameters; the scraping function should only track NEW and unique alerts. These alerts can then save their results back to a dedicated DB table, `public.alerts_history`, under the alert function's hashed parameter ID. Users can then subscribe to each alert's changes with Supabase Realtime listening, so all users with this unique alert ID toggled as active will receive these alerts.
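As a rough sketch of the scheduling side (assuming APScheduler, which the diagram above mentions; `alert_tracker` here is just a stub for the function described above):

```python
from apscheduler.schedulers.blocking import BlockingScheduler

def alert_tracker():
    # Hypothetical body: scrape public.alerts for new rows, register
    # trackers for unseen parameter combinations, then run each
    # tracker's fetch-and-compare logic.
    ...

scheduler = BlockingScheduler(timezone="America/New_York")
# Fire every 30 minutes during US market hours, weekdays only.
# hour="9-15" with minute="*/30" yields 9:00, 9:30, ..., 15:30 ET;
# widen the range if a 16:00 run at the close is also wanted.
scheduler.add_job(alert_tracker, "cron", day_of_week="mon-fri",
                  hour="9-15", minute="*/30")
scheduler.start()
```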
There should be one more table, `public.user_alerts`, that keeps track of each user and their active alerts, with foreign key columns to `public.profiles.id` and to `public.alerts.id`.
Include a combination of user ID and alert parameters to create a unique hash. This will help in identifying and avoiding duplicate alerts.
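A minimal sketch of that hashing idea (field names follow the schema below; standard-library `hashlib`/`json` only, nothing Supabase-specific):

```python
import hashlib
import json

def alert_param_hash(user_id: str, symbol_id: int, indicator_id: int,
                     logic_id: int, value: str) -> str:
    """Deterministic digest of user ID + alert parameters.

    json.dumps with sort_keys makes the serialization canonical, so
    identical parameter combinations always hash to the same value and
    duplicates can be caught with a set or unique-index lookup.
    """
    canonical = json.dumps(
        {"user_id": user_id, "symbol_id": symbol_id,
         "indicator_id": indicator_id, "logic_id": logic_id, "value": value},
        sort_keys=True, separators=(",", ":"),
    )
    return hashlib.sha256(canonical.encode()).hexdigest()

# The scraping function would skip any alert whose hash it already tracks:
tracked = {alert_param_hash("u1", 1, 2, 3, "100.5")}
new_hash = alert_param_hash("u1", 1, 2, 3, "100.5")
if new_hash in tracked:
    pass  # same user + parameters: do not create a second tracker
```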
Updated SQL Schema:
```sql
-- Existing tables with minor modifications
CREATE TABLE public.alerts (
alert_id UUID NOT NULL DEFAULT gen_random_uuid(),
user_id UUID NOT NULL,
symbol_id INTEGER NOT NULL,
indicator_id INTEGER NOT NULL,
logic_id INTEGER NOT NULL,
value VARCHAR NOT NULL,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
is_active BOOLEAN NOT NULL DEFAULT TRUE,
CONSTRAINT alerts_pkey PRIMARY KEY (alert_id),
CONSTRAINT alerts_indicator_id_fkey FOREIGN KEY (indicator_id) REFERENCES indicators (indicator_id),
CONSTRAINT alerts_logic_id_fkey FOREIGN KEY (logic_id) REFERENCES logic_conditions (logic_id),
CONSTRAINT alerts_symbol_id_fkey FOREIGN KEY (symbol_id) REFERENCES all_symbols (symbol_id) ON UPDATE CASCADE ON DELETE CASCADE,
CONSTRAINT alerts_user_id_fkey FOREIGN KEY (user_id) REFERENCES auth.users (id) ON DELETE CASCADE
);
CREATE INDEX idx_alerts_user_id ON public.alerts USING btree (user_id);
CREATE INDEX idx_alerts_is_active ON public.alerts USING btree (is_active);
CREATE TABLE public.alert_history (
history_id UUID NOT NULL DEFAULT gen_random_uuid(),
alert_id UUID NOT NULL,
fired_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
indicator_value NUMERIC, -- Add this column to store the value when the alert fired
CONSTRAINT alert_history_pkey PRIMARY KEY (history_id),
CONSTRAINT alert_history_alert_id_fkey FOREIGN KEY (alert_id) REFERENCES alerts (alert_id) ON DELETE CASCADE
);
CREATE INDEX idx_alert_history_alert_id ON public.alert_history USING btree (alert_id);
CREATE INDEX idx_alert_history_fired_at ON public.alert_history USING btree (fired_at);
CREATE TABLE public.alert_actions (
alert_id UUID NOT NULL,
action_id INTEGER NOT NULL,
CONSTRAINT alert_actions_pkey PRIMARY KEY (alert_id, action_id),
CONSTRAINT alert_actions_action_id_fkey FOREIGN KEY (action_id) REFERENCES actions (action_id),
CONSTRAINT alert_actions_alert_id_fkey FOREIGN KEY (alert_id) REFERENCES alerts (alert_id) ON UPDATE CASCADE ON DELETE CASCADE
);
-- New tables
CREATE TABLE public.active_tracking_alerts (
tracking_id UUID NOT NULL DEFAULT gen_random_uuid(),
alert_id UUID NOT NULL,
last_checked TIMESTAMP WITH TIME ZONE,
next_check TIMESTAMP WITH TIME ZONE,
CONSTRAINT active_tracking_alerts_pkey PRIMARY KEY (tracking_id),
CONSTRAINT active_tracking_alerts_alert_id_fkey FOREIGN KEY (alert_id) REFERENCES alerts (alert_id) ON DELETE CASCADE
);
CREATE INDEX idx_active_tracking_alerts_next_check ON public.active_tracking_alerts USING btree (next_check);
CREATE TABLE public.user_alerts (
user_id UUID NOT NULL,
alert_id UUID NOT NULL,
is_active BOOLEAN NOT NULL DEFAULT TRUE,
last_notified TIMESTAMP WITH TIME ZONE,
CONSTRAINT user_alerts_pkey PRIMARY KEY (user_id, alert_id),
CONSTRAINT user_alerts_user_id_fkey FOREIGN KEY (user_id) REFERENCES auth.users (id) ON DELETE CASCADE,
CONSTRAINT user_alerts_alert_id_fkey FOREIGN KEY (alert_id) REFERENCES alerts (alert_id) ON DELETE CASCADE
);
CREATE INDEX idx_user_alerts_user_id ON public.user_alerts USING btree (user_id);
CREATE INDEX idx_user_alerts_is_active ON public.user_alerts USING btree (is_active);
CREATE TABLE public.alert_parameters (
parameter_id UUID NOT NULL DEFAULT gen_random_uuid(),
alert_id UUID NOT NULL,
param_name VARCHAR(50) NOT NULL,
param_value VARCHAR(255) NOT NULL,
CONSTRAINT alert_parameters_pkey PRIMARY KEY (parameter_id),
CONSTRAINT alert_parameters_alert_id_fkey FOREIGN KEY (alert_id) REFERENCES alerts (alert_id) ON DELETE CASCADE
);
CREATE INDEX idx_alert_parameters_alert_id ON public.alert_parameters USING btree (alert_id);
```
Implement a notification system using Supabase Realtime to notify users when their subscribed alerts are triggered. A high-level approach:

a. When an alert is triggered, insert a record into the `alert_history` table.
b. Use Supabase Realtime to listen for changes in the `alert_history` table.
c. When a new record is inserted, query the `user_alerts` table to find users who have subscribed to that alert and have it set as active.
d. Send notifications to those users using Supabase Realtime channels.
```sql
CREATE OR REPLACE FUNCTION process_alert(p_alert_id UUID, p_indicator_value NUMERIC)
RETURNS VOID AS $$
BEGIN
-- Insert into alert_history
INSERT INTO alert_history (alert_id, indicator_value)
VALUES (p_alert_id, p_indicator_value);
-- Update last_notified for subscribed users
UPDATE user_alerts
SET last_notified = CURRENT_TIMESTAMP
WHERE alert_id = p_alert_id AND is_active = TRUE;
-- Here you would typically trigger a notification to subscribed users
-- This could be done through a separate service or using database triggers
END;
$$ LANGUAGE plpgsql;
```
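A backend worker could invoke this function through Supabase's RPC endpoint; here is a minimal supabase-py sketch (the project URL, key, and UUID are placeholders):

```python
from supabase import create_client

supabase = create_client("https://<project>.supabase.co", "<service-role-key>")

# Fires the alert server-side: inserts the alert_history row (which the
# Realtime listener below picks up) and stamps last_notified for every
# active subscriber in user_alerts.
supabase.rpc(
    "process_alert",
    {"p_alert_id": "00000000-0000-0000-0000-000000000000",
     "p_indicator_value": 101.25},
).execute()
```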
On the client, listening for `alert_history` inserts (step b) looks like:

```js
const channel = supabase
.channel('alert-notifications')
.on(
'postgres_changes',
{
event: 'INSERT',
schema: 'public',
table: 'alert_history',
},
(payload) => {
// Process the notification
console.log('Alert triggered:', payload.new);
// Here you would typically update the UI or send a push notification
}
)
.subscribe();
```
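Step (c) on the server could be a simple lookup, sketched here with supabase-py (reusing the `supabase` client from the RPC sketch above):

```python
def subscribers_for(alert_id: str) -> list[str]:
    # Find every user who subscribed to this alert and still has it
    # active; these are the users to notify over Realtime channels (step d).
    rows = (
        supabase.table("user_alerts")
        .select("user_id")
        .eq("alert_id", alert_id)
        .eq("is_active", True)
        .execute()
    )
    return [row["user_id"] for row in rows.data]
```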
## Main Goal

Add an `Alerts` dashboard page to the `Alerts` sidebar section. This page will allow for the CRUD'ing of alerts based on user selection criteria. Users will create alerts based on symbols and values from columns in their portfolios. The users will be able to select the symbol, logic, and value for the alert logic, and they will have a choice of receiving email, SMS, or push notifications (which will appear on their dashboard in the bell icon).

## Tasks
- Create `Create` / `Active` / `History` tabs
- `Create` tab: add the logic execution (check references below)
- `Active` tab: show all currently active alerts
- `all_symbols`: the table of symbols available when creating an alert
- `Active Alerts` toggle: this toggle will set the `public.alerts` `is_active` column to True/False to determine whether the user wants notifications from this alert or not
- `NotificationDropdown`: push notifications appear here, each with a little red `x` on the right-hand side
- `logic_conditions` table: equal_to, not_equal_to, greater_than, less_than, greater_than_or_equal_to, less_than_or_equal_to, and, or, not
- `indicators` table: price, mandelbrot_channel_buy, mandelbrot_channel_sell, humbl_ratio

## Notes
Start simple: get the logic working with a simple example. I am still wondering how and where the data will be monitored to fire the alerts; this can be a discussion between the developers to come to the best solution. My best idea so far is a serverless Python function in the `/api` route that runs a cron job every 10 minutes and analyzes the PortfolioTable data for all active alerts for each user. This can be done in bulk, since the function would have access to all user alerts and the symbols they are on.
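A rough sketch of that bulk pass (again with supabase-py, reusing the `supabase` client from earlier; `fetch_indicator_values` and `condition_holds` are hypothetical helpers standing in for the data fetch and the comparison logic):

```python
from collections import defaultdict

def check_all_active_alerts():
    """One cron-job pass: evaluate every active alert, grouped by symbol
    so each symbol's data is fetched once rather than once per user."""
    alerts = (
        supabase.table("alerts")
        .select("alert_id, symbol_id, indicator_id, logic_id, value")
        .eq("is_active", True)
        .execute()
        .data
    )
    by_symbol = defaultdict(list)
    for alert in alerts:
        by_symbol[alert["symbol_id"]].append(alert)

    for symbol_id, symbol_alerts in by_symbol.items():
        latest = fetch_indicator_values(symbol_id)  # hypothetical data fetch
        for alert in symbol_alerts:
            if condition_holds(alert, latest):  # hypothetical comparison
                supabase.rpc("process_alert", {
                    "p_alert_id": alert["alert_id"],
                    "p_indicator_value": latest[alert["indicator_id"]],
                }).execute()
```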
## Resources

The video below shows a raw execution of what should be achieved. The UI should be a bit nicer (NextUI is being used), and custom values should be inputtable in the same drop-down rather than a new text box. But you can see how alerts should be CRUD'd.
https://github.com/humblFINANCE/humblFINANCE-frontend/assets/63276164/ef5ae020-5dbc-4769-83ee-4c86397a10fd