# Compare commits

12 commits: `fcc745f09d...5fa87a712e`

| Author | SHA1 | Date |
|---|---|---|
| | 5fa87a712e | |
| | 274473c544 | |
| | cebcf74ba2 | |
| | 8867be9d3d | |
| | f7cc48cc6a | |
| | 20d3ddb841 | |
| | 6ed4d3a1e2 | |
| | 2ca3d2f021 | |
| | 75d3417a2b | |
| | f9834564ab | |
| | 2131faf8c6 | |
| | 6a0ceb78dd | |
37 changed files with 3890 additions and 301 deletions
## `.forgejo/workflows/ci-static.yml` (new file, +23)

```yaml
name: Static Analysis

on: [push, pull_request]

jobs:
  static:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v4

      - name: Install Python
        run: pip install ruff mypy psycopg2-binary requests fastapi uvicorn python-multipart

      - name: Lint with ruff
        run: ruff check .

      - name: Type check with mypy
        run: >
          mypy
          ts_shared_rev.py
          ingest_movement_rev.py
          ingest_events_rev.py
          webhook_receiver_rev.py
```
## `.forgejo/workflows/ci-tests.yml` (new file, +40)

```yaml
name: Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: self-hosted
    services:
      timescaledb:
        image: timescale/timescaledb-ha:pg16-ts2.15
        env:
          POSTGRES_PASSWORD: test
          POSTGRES_DB: tracksolid_test
          POSTGRES_USER: postgres
        ports:
          - 5433:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - uses: actions/checkout@v4

      - name: Install dependencies
        run: |
          pip install pytest pytest-asyncio httpx psycopg2-binary requests \
            fastapi uvicorn python-multipart

      - name: Run tests
        run: pytest tests/ -v --tb=short
        env:
          TRACKSOLID_APP_KEY: test_key
          TRACKSOLID_APP_SECRET: test_secret
          TRACKSOLID_USER_ID: test_user
          TRACKSOLID_PWD_MD5: test_md5
          DATABASE_URL: postgresql://postgres:test@localhost:5433/tracksolid_test
          TEST_DATABASE_URL: postgresql://postgres:test@localhost:5433/tracksolid_test
          JIMI_WEBHOOK_TOKEN: ""
```
## `.forgejo/workflows/scheduled-audit.yml` (new file, +20)

```yaml
name: DB Audit

on:
  schedule:
    - cron: "0 3 * * *"  # 03:00 UTC = 06:00 EAT daily
  workflow_dispatch:     # Also runnable manually from Forgejo UI

jobs:
  audit:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v4

      - name: Install dependencies
        run: pip install psycopg2-binary

      - name: Run DB audit
        run: python db_audit/run_audit.py
        env:
          DATABASE_URL: ${{ secrets.DATABASE_URL }}
```
```diff
@@ -1 +1 @@
-3.13
+3.12.0
```
@@ -6,20 +6,67 @@

## Table of Contents

0. [How to Use This Document](#0-how-to-use-this-document)
1. [Data Foundation Summary](#1-data-foundation-summary)
2. [Fleet Utilisation](#2-fleet-utilisation)
3. [Driver Behaviour](#3-driver-behaviour)
4. [Real-Time Dispatch & Field-Service SLAs](#4-real-time-dispatch--field-service-slas)
5. [Distance per Driver per Day](#5-distance-per-driver-per-day)
6. [Business Questions Now Answerable](#6-business-questions-now-answerable)
7. [Grafana Dashboard Blueprint](#7-grafana-dashboard-blueprint)
8. [What Unlocks the Remaining 30%](#8-what-unlocks-the-remaining-30)
9. [Fleet Readiness Scorecard](#9-fleet-readiness-scorecard)
10. [Service-Interval Forecaster](#10-service-interval-forecaster)

---

## 0. How to Use This Document

Every query in this document is tagged by intended consumption cadence. Build Grafana panels, alert rules, and scheduled reports against the tag — not the SQL text — so that moving a metric between dashboard and alert is a one-line change.

| Tag | Meaning | Typical cadence | Owner |
|---|---|---|---|
| `[DASHBOARD]` | Live or near-live panel | Refresh 30 s – 5 min | Ops / Dispatch |
| `[ALERT]` | Trigger a page or ticket | Evaluate 1 – 15 min | On-call / Fleet Manager |
| `[MONTHLY]` | Management / exec reporting | Run on 1st of month | Finance / Ops Lead |
| `[AD-HOC]` | Investigation, audit, one-off | On demand | Analyst / Ops |

**Reading a query block**: each section lead-in states the tag(s). If a query has no tag, it is reference material (schema, benchmark tables, appendix).

**Thresholds are starting points, not gospel**. Every red/amber/green band in this document must be re-calibrated against your own 30-day distribution once data matures. See [Appendix B — Threshold Calibration Guide](#appendix-b--threshold-calibration-guide).
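The re-calibration itself can be mechanised. A minimal sketch, assuming metric values (one per vehicle-day over the trailing 30 days) are pulled into Python by the BI layer; the P85/P97 cut points are illustrative defaults, not Fireside policy:

```python
from statistics import quantiles

def calibrate_bands(values, amber_q=0.85, red_q=0.97):
    """Derive amber/red cut points from an observed distribution.

    values  -- one observation per vehicle-day over the trailing 30 days
    amber_q -- fraction of observations that should land in green
    red_q   -- fraction that should land in green + amber
    Returns (amber_threshold, red_threshold).
    """
    if len(values) < 30:
        raise ValueError("need at least ~30 observations to calibrate")
    # quantiles(n=100) returns the 1st..99th percentiles as 99 cut points.
    pct = quantiles(sorted(values), n=100)
    return pct[int(amber_q * 100) - 1], pct[int(red_q * 100) - 1]

def band(value, amber, red):
    """Map a single observation onto the calibrated bands."""
    return "red" if value >= red else "amber" if value >= amber else "green"
```

The same function serves every metric in this document; only `values` and, where a metric is "lower is worse", the comparison direction change.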
**City-cohort cuts**. Fireside operates in Nairobi, Mombasa, and Kampala. Traffic, fuel prices, and shift norms differ materially between them. Any fleet-level metric should be sliceable by `devices.assigned_city` once that column is populated (see §3.7).

---
## 1. Data Foundation Summary

### 1.1 Current Deployment State *(as of 18 Apr 2026)*

> **⚠ New stack not yet live.** The refactored ingestion pipeline (`ingest_movement_rev.py` v2.2) targets the `tracksolid` schema, which is currently empty. All live data sits in the legacy `tracksolid_2` schema populated by the prior codebase. The queries in this document are written for the target schema (`tracksolid`) and will produce results once the new stack is deployed and the device sync has run.

| Metric | Observed value | Source |
|---|---|---|
| Devices registered | **63** (AT4-series, `353549*` IMEIs) | `tracksolid_2.devices` |
| Driver names populated | **0 / 63** | `tracksolid_2.devices` |
| Vehicle numbers populated | **0 / 63** | `tracksolid_2.devices` |
| SIM numbers populated | **14 / 63** | `tracksolid_2.devices` |
| Live positions (stale) | **19** | `tracksolid_2.live_positions` |
| Position history rows | **208** | `tracksolid_2.position_history` |
| Trips recorded | **5** (12.8 km total) | `tracksolid_2.trips` |
| Parking / alarms / OBD | **0** each | `tracksolid_2.*` |
| Last pipeline run | **6 Apr 2026 13:20 EAT** | `tracksolid_2.ingestion_log` |
| Pipeline failure rate | **41%** (277/668 runs, all 401 auth errors) | `tracksolid_2.ingestion_log` |

**Why the pipeline stopped (6 Apr):** 276 consecutive `401 Unauthorized` errors against `eu-open.tracksolidpro.com`. The API token expired and was not refreshed — the prior codebase lacked the auto-refresh logic that `ts_shared_rev.py` now includes. Deploying the new stack resolves this permanently.
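The fix is conceptually a retry-once-on-401 wrapper around every API call. A sketch of that pattern — the class, callback names, and signatures here are illustrative, not the actual `ts_shared_rev.py` API:

```python
class TokenSession:
    """Refresh-and-retry wrapper: on a 401, renew the token and replay once.

    `fetch` and `refresh` are injected callables so the pattern is testable
    without the real Tracksolid API; in the pipeline they wrap HTTP calls.
    """

    def __init__(self, fetch, refresh):
        self._fetch = fetch      # fetch(token) -> (status_code, body)
        self._refresh = refresh  # refresh() -> new token string
        self._token = refresh()  # acquire an initial token up front

    def get(self):
        status, body = self._fetch(self._token)
        if status == 401:  # token expired mid-flight: refresh, retry once
            self._token = self._refresh()
            status, body = self._fetch(self._token)
        if status != 200:
            raise RuntimeError(f"API call failed with {status}")
        return body
```

The crucial property is that a stale token causes exactly one refresh and one retry, never the 276-error pile-up the legacy pipeline produced.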
|
||||
**CSV fleet (144 devices, X3/JC400P series):** The `20260414_FS__Logistics - final_fixed.csv` file contains a separate, newer batch of devices (`865135*`, `862798*` IMEIs) with full driver names and plates. **These 144 devices are not yet registered in the DB at all** — they will be synced by `sync_driver_audit.py` after the new stack is deployed, then enriched by `import_drivers_csv.py`.
|
||||
|
||||
---
|
||||
|
||||
### 1.2 Target Data Architecture
|
||||
|
||||
Once deployed, the ingestion stack populates the following data sources:
|
||||
|
||||
| Table | Content | Frequency |
|
||||
|---|---|---|
|
||||
|
|
@ -32,7 +79,7 @@ The ingestion stack currently populates the following data sources, each feeding
|
|||
| `tracksolid.devices` | Vehicle and driver registry | Daily at 02:00 |
|
||||
| `dwh_gold.fact_daily_fleet_metrics` | Daily KPI aggregates per vehicle | Nightly ETL |
|
||||
|
||||
**Position history density** increased significantly with the addition of `poll_track_list` (POLL-01):
|
||||
**Position history density** improvement with `poll_track_list` (POLL-01):
|
||||
|
||||
| Before | After |
|
||||
|---|---|
|
||||
|
|
@@ -119,6 +166,8 @@ WHERE day >= DATE_TRUNC('month', CURRENT_DATE);

### 2.3 Vehicles That Did Not Move Today

`[DASHBOARD]` `[ALERT]` — alert if a vehicle has not moved for ≥ 2 consecutive working days.

```sql
SELECT
    d.imei,
```

@@ -139,6 +188,73 @@ ORDER BY d.imei;
---

### 2.4 Cost-per-Ticket and Cost-per-Km

`[MONTHLY]` — the single most actionable finance metric: *what does one completed field-service job actually cost in fuel?* Pairs the trip table with the ticketing system (replace `ops.tickets` with the actual source — Zoho Desk, Freshdesk, or the Fireside job-management export).

Requires `devices.fuel_100km` (see §8 Step 2). Diesel price is parameterised so this query works across Nairobi / Mombasa / Kampala without editing.

```sql
WITH fuel_rates AS (
    SELECT
        'NBO'::TEXT AS city, 180.0::NUMERIC AS price_per_litre  -- Nairobi diesel KES
    UNION ALL SELECT 'MBA', 175.0
    UNION ALL SELECT 'KLA', 5200.0  -- Kampala UGX → convert in BI layer
),
daily_cost AS (
    SELECT
        t.imei,
        DATE(t.start_time AT TIME ZONE 'Africa/Nairobi') AS working_day,
        SUM(t.distance_km) AS km,
        SUM(t.distance_km) * (d.fuel_100km / 100.0) AS litres,
        SUM(t.distance_km) * (d.fuel_100km / 100.0) * f.price_per_litre AS fuel_cost
    FROM tracksolid.trips t
    JOIN tracksolid.devices d ON d.imei = t.imei
    LEFT JOIN fuel_rates f ON f.city = d.assigned_city
    WHERE t.start_time >= DATE_TRUNC('month', CURRENT_DATE)
      AND t.end_time IS NOT NULL
    GROUP BY t.imei, working_day, d.fuel_100km, f.price_per_litre
),
tickets AS (
    SELECT
        assigned_imei AS imei,
        DATE(closed_at AT TIME ZONE 'Africa/Nairobi') AS working_day,
        COUNT(*) FILTER (WHERE status = 'resolved') AS tickets_closed
    FROM ops.tickets
    WHERE closed_at >= DATE_TRUNC('month', CURRENT_DATE)
    GROUP BY assigned_imei, working_day
)
SELECT
    dc.imei,
    d.driver_name,
    d.vehicle_number,
    SUM(dc.km) AS km_month,
    ROUND(SUM(dc.fuel_cost), 0) AS fuel_cost_kes_month,
    COALESCE(SUM(tk.tickets_closed), 0) AS tickets_closed,
    ROUND(SUM(dc.fuel_cost) / NULLIF(SUM(tk.tickets_closed), 0), 0) AS cost_per_ticket_kes,
    ROUND(SUM(dc.fuel_cost) / NULLIF(SUM(dc.km), 0), 2) AS cost_per_km_kes
FROM daily_cost dc
JOIN tracksolid.devices d ON d.imei = dc.imei
LEFT JOIN tickets tk
       ON tk.imei = dc.imei
      AND tk.working_day = dc.working_day
GROUP BY dc.imei, d.driver_name, d.vehicle_number
ORDER BY cost_per_ticket_kes DESC NULLS LAST;
```

**Interpretation bands** — driver-level cost-per-ticket (van fleet, Nairobi baseline):

| KES / ticket | Signal | Typical cause |
|---|---|---|
| < 400 | Efficient | Dense route, minimal backtracking |
| 400 – 900 | Normal | Mixed urban route |
| 900 – 1500 | Review | Scattered geography or low ticket throughput |
| > 1500 | Investigate | Idle time, off-route driving, or single-ticket days |

> **Dependency:** requires ticket data joined on IMEI or driver ID. If only driver-level data is available, swap `assigned_imei` for a driver→imei lookup.

---

## 3. Driver Behaviour

### 3.1 Speeding
@@ -421,7 +537,169 @@ ORDER BY t.imei, week_start;

---

### 3.6 Alarm-While-Parked — Tamper and Theft Signal

`[ALERT]` — an alarm event on a vehicle that has been stationary for > 10 minutes is qualitatively different from an alarm mid-drive. Stationary alarms are the strongest signal for tamper, battery disconnect, unauthorised ignition, or geofence breach by a *parked* vehicle being loaded. Fires the highest-priority page.

```sql
SELECT
    a.imei,
    d.driver_name,
    d.vehicle_number,
    a.alarm_name,
    a.alarm_time AT TIME ZONE 'Africa/Nairobi' AS event_time,
    ROUND(
        EXTRACT(EPOCH FROM (a.alarm_time - p.start_time)) / 60.0, 1
    ) AS minutes_parked_before_alarm,
    p.address AS park_location,
    a.lat, a.lng
FROM tracksolid.alarms a
JOIN tracksolid.devices d ON d.imei = a.imei
JOIN LATERAL (
    -- the parking event that was in progress at alarm time
    SELECT pe.start_time, pe.address
    FROM tracksolid.parking_events pe
    WHERE pe.imei = a.imei
      AND pe.start_time <= a.alarm_time
      AND (pe.end_time IS NULL OR pe.end_time >= a.alarm_time)
    ORDER BY pe.start_time DESC
    LIMIT 1
) p ON TRUE
WHERE a.alarm_time > NOW() - INTERVAL '24 hours'
  AND a.alarm_type IN ('vibration', 'power_cut', 'geofence_enter', 'geofence_exit', 'unauthorized_ignition')
ORDER BY a.alarm_time DESC;
```

> **Page rule:** any row where `alarm_type IN ('power_cut', 'unauthorized_ignition')` AND `minutes_parked_before_alarm > 10` pages the on-call operations lead immediately. Other stationary alarms ticket to the fleet manager for next-day review.

---
### 3.7 Geographic Drift — Vehicles Operating Outside Assigned City

`[MONTHLY]` `[ALERT]` — detects vehicles running outside their assigned operating territory. Protects against unauthorised inter-city trips, fuel tourism, and route fraud.

**Prerequisite** — add an `assigned_city` column to the devices table:

```sql
ALTER TABLE tracksolid.devices ADD COLUMN IF NOT EXISTS assigned_city TEXT;
-- Example back-fill:
UPDATE tracksolid.devices SET assigned_city = 'NBO' WHERE imei IN (...);
UPDATE tracksolid.devices SET assigned_city = 'MBA' WHERE imei IN (...);
UPDATE tracksolid.devices SET assigned_city = 'KLA' WHERE imei IN (...);
```

City bounding boxes (approximate; widen as needed for suburban coverage):

| City | Code | min lat | max lat | min lng | max lng |
|---|---|---|---|---|---|
| Nairobi metro | NBO | -1.45 | -1.15 | 36.65 | 37.05 |
| Mombasa metro | MBA | -4.15 | -3.90 | 39.55 | 39.80 |
| Kampala metro | KLA | 0.20 | 0.45 | 32.50 | 32.75 |

```sql
WITH city_box AS (
    SELECT * FROM (VALUES
        ('NBO', -1.45, -1.15, 36.65, 37.05),
        ('MBA', -4.15, -3.90, 39.55, 39.80),
        ('KLA', 0.20, 0.45, 32.50, 32.75)
    ) AS c(code, min_lat, max_lat, min_lng, max_lng)
),
out_of_zone AS (
    SELECT
        ph.imei,
        d.assigned_city,
        DATE(ph.gps_time AT TIME ZONE 'Africa/Nairobi') AS day,
        COUNT(*) AS fixes_outside_zone
    FROM tracksolid.position_history ph
    JOIN tracksolid.devices d ON d.imei = ph.imei
    JOIN city_box c ON c.code = d.assigned_city
    WHERE ph.gps_time > NOW() - INTERVAL '30 days'
      AND (
           ph.lat < c.min_lat OR ph.lat > c.max_lat
        OR ph.lng < c.min_lng OR ph.lng > c.max_lng
      )
    GROUP BY ph.imei, d.assigned_city, day
)
SELECT
    o.imei,
    d.driver_name,
    d.vehicle_number,
    o.assigned_city,
    o.day,
    o.fixes_outside_zone
FROM out_of_zone o
JOIN tracksolid.devices d ON d.imei = o.imei
WHERE o.fixes_outside_zone > 20  -- ~10 minutes of continuous out-of-zone driving
ORDER BY o.day DESC, o.fixes_outside_zone DESC;
```

> **Alert threshold:** > 50 fixes outside zone in a single day = escalate. Expected legitimate cases: cross-city service trips, or a driver taking a vehicle home across a city boundary (policy decision).
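For alerting outside SQL (for example in the n8n layer), the same containment test is a few lines of Python. A sketch using the bounding boxes from the table above; the function name is illustrative:

```python
# Approximate metro bounding boxes from the table above:
# code -> (min_lat, max_lat, min_lng, max_lng)
CITY_BOX = {
    "NBO": (-1.45, -1.15, 36.65, 37.05),
    "MBA": (-4.15, -3.90, 39.55, 39.80),
    "KLA": (0.20, 0.45, 32.50, 32.75),
}

def in_assigned_zone(city, lat, lng):
    """True if a GPS fix lies inside the city's bounding box."""
    min_lat, max_lat, min_lng, max_lng = CITY_BOX[city]
    return min_lat <= lat <= max_lat and min_lng <= lng <= max_lng
```

A bounding box deliberately errs on the side of false negatives at the city edge; the SQL threshold of > 20 fixes per day absorbs those.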
---
### 3.8 Odometer Divergence — Tracker vs Physical Reading

`[MONTHLY]` — compares cumulative distance recorded by the tracker against the vehicle's physical odometer (captured at service or fuel-card events). Divergence > 10% suggests sensor drift, GPS gaps, or unauthorised driving with the tracker disabled.

```sql
WITH tracker_km AS (
    SELECT
        imei,
        SUM(distance_km) AS trips_km_30d
    FROM tracksolid.trips
    WHERE start_time > NOW() - INTERVAL '30 days'
      AND end_time IS NOT NULL
    GROUP BY imei
),
physical_readings AS (
    -- Replace with actual odometer log source (service records, fuel card, manual entry)
    SELECT
        imei,
        reading_km,
        reading_date,
        LAG(reading_km) OVER (PARTITION BY imei ORDER BY reading_date) AS prev_reading_km,
        LAG(reading_date) OVER (PARTITION BY imei ORDER BY reading_date) AS prev_reading_date
    FROM ops.odometer_readings
    WHERE reading_date > NOW() - INTERVAL '60 days'
),
physical_delta AS (
    SELECT
        imei,
        reading_km - prev_reading_km AS physical_km,
        EXTRACT(DAY FROM (reading_date - prev_reading_date)) AS period_days
    FROM physical_readings
    WHERE prev_reading_km IS NOT NULL
      -- the period_days alias is not visible in WHERE; repeat the expression
      AND EXTRACT(DAY FROM (reading_date - prev_reading_date)) BETWEEN 20 AND 40
)
SELECT
    p.imei,
    d.driver_name,
    d.vehicle_number,
    ROUND(p.physical_km, 0) AS odometer_km_period,
    ROUND(tk.trips_km_30d, 0) AS tracker_km_30d,
    ROUND(
        (p.physical_km - tk.trips_km_30d) / NULLIF(p.physical_km, 0) * 100,
        1
    ) AS divergence_pct
FROM physical_delta p
JOIN tracker_km tk ON tk.imei = p.imei
JOIN tracksolid.devices d ON d.imei = p.imei
WHERE ABS(
        (p.physical_km - tk.trips_km_30d) / NULLIF(p.physical_km, 0)
      ) > 0.10
ORDER BY ABS(p.physical_km - tk.trips_km_30d) DESC;
```
**Interpretation:**

| Divergence | Likely cause | Action |
|---|---|---|
| Tracker < physical (> 10%) | GPS outage, tracker powered off, engine driven with no fix | Audit device uptime; inspect for tamper |
| Tracker > physical (> 10%) | Duplicate trip records, distance-correction bug | Run migration check; review `trips.distance_km` distribution |
| Divergence growing month-over-month | Sensor drift, antenna degradation | Replace device or antenna |
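The same triage can be expressed procedurally for a report generator. A sketch mirroring the table's thresholds; the labels and function name are illustrative, not part of the committed pipeline:

```python
def classify_divergence(physical_km, tracker_km, threshold=0.10):
    """Triage label for tracker-vs-odometer divergence over one period."""
    if physical_km <= 0:
        return "no-data"
    divergence = (physical_km - tracker_km) / physical_km
    if abs(divergence) <= threshold:
        return "ok"
    # tracker under-reports -> GPS outage / tamper; over-reports -> data bug
    return "audit-device-uptime" if divergence > 0 else "check-trip-duplication"
```

Running this per vehicle each month gives the fleet manager a worklist rather than a raw percentage column.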
---

## 4. Real-Time Dispatch & Field-Service SLAs

### 4.1 Find the 5 Closest Available Vehicles

@@ -512,6 +790,148 @@ ORDER BY lp.imei;
---

### 4.4 Dispatch Log Schema

A persistent record of every dispatch decision, needed for every SLA and cost metric that follows. Create once:

```sql
CREATE TABLE IF NOT EXISTS tracksolid.dispatch_log (
    dispatch_id        BIGSERIAL PRIMARY KEY,
    ticket_id          TEXT NOT NULL,
    imei               TEXT NOT NULL REFERENCES tracksolid.devices(imei),
    driver_name        TEXT,
    job_lat            DOUBLE PRECISION NOT NULL,
    job_lng            DOUBLE PRECISION NOT NULL,
    job_geom           GEOMETRY(POINT, 4326),
    assigned_at        TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    first_movement_at  TIMESTAMPTZ,  -- populated when vehicle leaves depot
    on_site_at         TIMESTAMPTZ,  -- vehicle enters 150 m radius of job
    resolved_at        TIMESTAMPTZ,  -- ticket closed in ops system
    cancelled_at       TIMESTAMPTZ,
    distance_km        NUMERIC(8, 2),
    created_at         TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_dispatch_log_ticket ON tracksolid.dispatch_log(ticket_id);
CREATE INDEX IF NOT EXISTS idx_dispatch_log_imei_assigned
    ON tracksolid.dispatch_log(imei, assigned_at DESC);
CREATE INDEX IF NOT EXISTS idx_dispatch_log_assigned_at
    ON tracksolid.dispatch_log(assigned_at DESC);
```

**Population plan:** n8n or the ops integration layer writes one row per dispatch at assignment. A nightly job back-fills `first_movement_at` / `on_site_at` by joining `trips` and `live_positions` against `job_geom`.
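The `on_site_at` back-fill reduces to a distance test between each position fix and the job location. A haversine sketch in plain Python, so the nightly job can run it without PostGIS; the 150 m default matches the schema comment, and `first_on_site` assumes fixes are already time-sorted:

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lng1, lat2, lng2):
    """Great-circle distance in metres between two WGS84 points."""
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lng2 - lng1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def first_on_site(fixes, job_lat, job_lng, radius_m=150):
    """Timestamp of the first fix within radius_m of the job, else None.

    fixes -- iterable of (timestamp, lat, lng), sorted by timestamp.
    """
    for ts, lat, lng in fixes:
        if haversine_m(lat, lng, job_lat, job_lng) <= radius_m:
            return ts
    return None
```

The equivalent PostGIS expression is `ST_DWithin(position::geography, job_geom::geography, 150)`; the Python version exists so the back-fill logic can be unit-tested offline.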
---
### 4.5 Field-Service SLA Metrics

`[DASHBOARD]` `[ALERT]` `[MONTHLY]` — the operational heartbeat of a field-services business. Four timings per ticket, each a discrete SLA with its own band.

```
ticket_created ─► assigned ─► first_movement ─► on_site ─► resolved
                  (dispatch    (depot depart    (vehicle    (job done)
                   latency)     latency)         arrived)
```

**(a) Dispatch latency** — from ticket creation to vehicle assignment:

```sql
SELECT
    t.ticket_id,
    EXTRACT(EPOCH FROM (dl.assigned_at - t.created_at)) / 60 AS dispatch_latency_min
FROM ops.tickets t
JOIN tracksolid.dispatch_log dl ON dl.ticket_id = t.ticket_id
WHERE t.created_at > NOW() - INTERVAL '7 days';
```

**(b) Dispatch-to-depart** — from assignment to the vehicle actually leaving the depot:

```sql
SELECT
    dl.ticket_id,
    dl.imei,
    d.driver_name,
    EXTRACT(EPOCH FROM (dl.first_movement_at - dl.assigned_at)) / 60 AS depart_delay_min
FROM tracksolid.dispatch_log dl
JOIN tracksolid.devices d ON d.imei = dl.imei
WHERE dl.assigned_at > NOW() - INTERVAL '7 days'
  AND dl.first_movement_at IS NOT NULL
ORDER BY depart_delay_min DESC;
```

**(c) Time-to-site** — from assignment to arrival at the job location (vehicle within 150 m):

```sql
SELECT
    dl.ticket_id,
    dl.imei,
    ROUND(dl.distance_km, 1) AS distance_km,
    EXTRACT(EPOCH FROM (dl.on_site_at - dl.assigned_at)) / 60 AS time_to_site_min,
    ROUND(
        dl.distance_km /
        NULLIF(EXTRACT(EPOCH FROM (dl.on_site_at - dl.assigned_at)) / 3600, 0),
        1
    ) AS avg_transit_kmh
FROM tracksolid.dispatch_log dl
WHERE dl.assigned_at > NOW() - INTERVAL '7 days'
  AND dl.on_site_at IS NOT NULL;
```

**(d) On-site to resolution** — wrench time at the job:

```sql
SELECT
    dl.ticket_id,
    dl.imei,
    EXTRACT(EPOCH FROM (dl.resolved_at - dl.on_site_at)) / 60 AS wrench_time_min
FROM tracksolid.dispatch_log dl
WHERE dl.on_site_at IS NOT NULL
  AND dl.resolved_at IS NOT NULL
  AND dl.assigned_at > NOW() - INTERVAL '30 days';
```

**Monthly SLA attainment per driver:**

```sql
SELECT
    dl.imei,
    d.driver_name,
    COUNT(*) AS tickets,
    ROUND(AVG(
        EXTRACT(EPOCH FROM (dl.first_movement_at - dl.assigned_at))
    ) / 60, 1) AS avg_depart_min,
    ROUND(AVG(
        EXTRACT(EPOCH FROM (dl.on_site_at - dl.assigned_at))
    ) / 60, 1) AS avg_time_to_site_min,
    ROUND(AVG(
        EXTRACT(EPOCH FROM (dl.resolved_at - dl.on_site_at))
    ) / 60, 1) AS avg_wrench_min,
    ROUND(
        100.0 * COUNT(*) FILTER (
            WHERE EXTRACT(EPOCH FROM (dl.on_site_at - dl.assigned_at)) / 60 <= 90
        ) / NULLIF(COUNT(*), 0),
        1
    ) AS pct_on_site_within_90min
FROM tracksolid.dispatch_log dl
JOIN tracksolid.devices d ON d.imei = dl.imei
WHERE dl.assigned_at >= DATE_TRUNC('month', CURRENT_DATE)
  AND dl.on_site_at IS NOT NULL
GROUP BY dl.imei, d.driver_name
ORDER BY pct_on_site_within_90min DESC;
```

**Target bands** (baseline — recalibrate after 90 days of data):

| SLA | Green | Amber | Red |
|---|---|---|---|
| Dispatch latency (ops → driver) | < 10 min | 10 – 25 min | > 25 min |
| Depart delay (assigned → moving) | < 15 min | 15 – 35 min | > 35 min |
| Time-to-site (assigned → on-site) | < 60 min | 60 – 120 min | > 120 min |
| Wrench time (on-site → resolved) | < 90 min | 90 – 180 min | > 180 min |
| % on-site within 90 min (monthly) | ≥ 85% | 70 – 85% | < 70% |
---

## 5. Distance per Driver per Day

### 5.1 Today's Summary

@@ -586,28 +1006,32 @@ ORDER BY k.total_km DESC;
## 6. Business Questions Now Answerable

Status key: **✅ Ready** = answerable once new stack deployed | **⚙ Needs data** = additional setup required | **🔴 Blocked** = pending action before any data

| Business Question | Primary Data Source | Status |
|---|---|---|
| Which vehicles are moving right now? | `live_positions` | ✅ Ready (deploy stack) |
| Who started work latest today? | `fact_daily_fleet_metrics.day_start_time` | ✅ Ready (deploy stack) |
| Who drove the most km this week? | `trips` + `devices` | ✅ Ready (deploy + CSV import) |
| Which vehicle spent the most time idling? | `trips.idle_time_s` | ✅ Ready (deploy stack) |
| How much fuel was wasted on idle today? | `trips.idle_time_s` × rate | ⚙ Needs `fuel_100km` set per vehicle |
| Which driver triggered the most alarms this month? | `alarms` + `devices` | ✅ Ready (deploy stack) |
| What is total fleet distance this month? | `trips` | ✅ Ready (deploy stack) |
| Which vehicles did not move at all today? | `trips` LEFT JOIN `devices` | ✅ Ready (deploy stack) |
| Who is nearest to a new job right now? | `live_positions` + PostGIS | ✅ Ready (deploy + CSV import for names) |
| Did any vehicle leave depot after hours? | `trips` time filter | ✅ Ready (deploy stack) |
| What is the speeding rate per driver per week? | `position_history` speed filter | ✅ Ready (needs 1 week data) |
| Which driver has the harshest driving style? | `position_history` delta query | ✅ Ready (needs 2 weeks `track_list`) |
| What does one field ticket cost in fuel? | `trips` + `ops.tickets` + `fuel_100km` | ⚙ Needs `fuel_100km` + ticket feed wired |
| Which vehicles are running outside assigned city? | `position_history` + `assigned_city` | ⚙ Needs `assigned_city` set (CSV import) |
| How many km to next service interval? | `devices.current_mileage` + `ops.service_log` | ⚙ Needs first service-log entry per vehicle |
| Are vehicles on approved routes? | `position_history` + `geofences` | ⚙ Pending geofence population (Step 4) |
| Is cold chain in temperature range? | `temperature_readings` | 🔴 Pending webhook registration (Step 1) |
| How much fuel is consumed per route? | `fuel_readings` + `trips` | 🔴 Pending fuel sensor webhook (Step 1) |
| Did any vehicle enter a restricted zone? | `alarms` + `geofences` | 🔴 Pending geofence setup (Step 4) |
| What percentage of the fleet was utilised today? | `trips` + `devices` count | ✅ Ready (deploy stack) |
| Alarm while parked — tamper / theft signal | `alarms` + `parking_events` | ✅ Ready (deploy stack) |
| Odometer divergence — tracker vs physical | `trips` + `ops.odometer_readings` | ⚙ Needs first odometer reading entry |
---

@@ -659,7 +1083,48 @@ Ranked by aggression index (harsh events per 100 km), speeding events, and late
## 8. What Unlocks the Remaining 30%

The data foundation is in place. The following steps activate the remaining analytics capabilities, in priority order.

### Step 0 — Deploy New Ingestion Stack *(Current Blocker — do first)*

All analytics in this document are blocked until the new stack is live. The legacy pipeline stopped on **6 Apr 2026** due to 401 token-expiry errors. The refactored code fixes this permanently.

```bash
# On the Coolify server / inside the repo directory:

# 1. Pull latest code (includes all revisions through cebcf74)
git pull

# 2. Apply schema migrations (01 through 06 in order)
TS_DB=$(docker ps --filter "name=timescale_db" --format "{{.Names}}" | head -1)
for f in 01_tracksolid_base.sql 02_tracksolid_full_schema_rev.sql \
         03_webhook_schema_migration.sql 04_bug_fix_migration.sql \
         05_enhancement_migration.sql 06_business_analytics_migration.sql; do
    echo "Applying $f..."
    docker exec -i "$TS_DB" psql -U postgres -d tracksolid_db < "$f"
done

# 3. Rebuild and start new ingestion containers
docker compose up -d --build ingest_movement ingest_events webhook_receiver

# 4. Run initial device sync (populates tracksolid.devices from API)
docker exec -it ingest_movement python sync_driver_audit.py

# 5. Import driver/vehicle details from CSV
docker exec -it ingest_movement python import_drivers_csv.py          # dry-run
docker exec -it ingest_movement python import_drivers_csv.py --apply  # commit

# 6. Schedule nightly ETL
# Add to cron or n8n: SELECT dwh_gold.refresh_daily_metrics(CURRENT_DATE - 1);
```

**Expected state after Step 0:**

- `tracksolid.devices`: 144+ rows with driver names, plates, departments, assigned_city
- `tracksolid.live_positions`: positions refreshing every 60 seconds
- `tracksolid.trips` / `position_history`: accumulating from first pipeline run
- All analytics in this document begin producing results within 15 minutes of container start
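That expected state can be asserted mechanically rather than eyeballed. A sketch of a post-deploy smoke check: the minimum row counts come from the checklist above, and the `psycopg2` wiring that gathers `SELECT COUNT(*)` per table is left to the caller so the comparison logic stays plain, testable Python:

```python
# Minimum healthy state after Step 0, per the checklist above.
EXPECTED_MIN_ROWS = {
    "tracksolid.devices": 144,       # CSV fleet synced and enriched
    "tracksolid.live_positions": 1,  # refreshing within minutes of start
    "tracksolid.trips": 0,           # accumulates from the first run
}

def smoke_failures(observed):
    """Compare observed row counts (table -> count) against minimums.

    Returns a list of human-readable failure strings; empty means healthy.
    """
    failures = []
    for table, minimum in EXPECTED_MIN_ROWS.items():
        count = observed.get(table)
        if count is None:
            failures.append(f"{table}: count missing (query failed?)")
        elif count < minimum:
            failures.append(f"{table}: {count} rows, expected >= {minimum}")
    return failures
```

Wired into the `scheduled-audit.yml` job, a non-empty return value would fail the workflow and surface the regression the morning after a bad deploy.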
---
|
||||
|
||||
### Step 1 — Register Webhooks in Tracksolid Pro Account *(Blocker)*
|
||||
Without registration, the following tables remain empty regardless of code:
|
||||

@ -693,16 +1158,24 @@ UPDATE tracksolid.devices SET fuel_100km = 9.0 WHERE vehicle_category = 'car';
### Step 3 — Populate Vehicle Names and Driver Names

Currently all 63 devices show blank fields. Reports display IMEI numbers instead of human-readable identities.

**Automated:** `import_drivers_csv.py` (committed to the repo) reads `20260414_FS__Logistics - final_fixed.csv` (144 devices) and sets `driver_name`, `vehicle_number`, `vehicle_models`, `cost_centre`, `assigned_city`, `sim`, `iccid`, `imsi` in a single pass. Run after Step 0 device sync.

```bash
docker exec -it ingest_movement python import_drivers_csv.py --apply
```

CSV coverage after import: 140 vehicles with plates, 144 with driver names, 138 with SIM, `assigned_city` inferred (NBO=136, KLA=4). The 4 "Identification" spare units are skipped automatically.
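The skip-and-infer rule reduces to a few lines. A sketch of the logic as described above — a hypothetical helper, not the actual `import_drivers_csv.py` code; the city inference assumes Ugandan plates (prefix `U`) map to KLA and Kenyan plates (prefix `K`) to NBO, which matches the NBO/KLA split reported:

```python
def classify_row(driver_name: str, plate: str):
    """Return (skip, assigned_city) for one CSV row.

    Rows whose Driver Name is 'Identification' are unassigned spare units
    and are skipped. Otherwise the city is inferred from the plate prefix:
    Ugandan plates start with 'U' (KLA), Kenyan plates with 'K' (NBO).
    """
    if driver_name.strip() == "Identification":
        return True, None
    plate = plate.strip().upper()
    if plate.startswith("U"):
        return False, "KLA"
    if plate.startswith("K"):
        return False, "NBO"
    return False, None  # unknown prefix — leave assigned_city NULL
```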

**Manual top-up** for any device not in the CSV:

```sql
-- Update individually or import from CSV via COPY
UPDATE tracksolid.devices
SET vehicle_name = 'KBZ 123A',
    vehicle_number = 'KBZ 123A',
    driver_name = 'John Kamau',
    driver_phone = '+254700000001',
    vehicle_category = 'van',
    assigned_city = 'NBO'
WHERE imei = '352093080000001';
```

@ -731,29 +1204,233 @@ VALUES (
### Step 5 — Run Migrations and Deploy Updated Containers

```bash
# Resolve container name dynamically (survives Coolify redeployments)
TS_DB=$(docker ps --filter "name=timescale_db" --format "{{.Names}}" | head -1)

# 1. Run distance correction migration (fixes historical data)
docker exec -i "$TS_DB" psql -U postgres -d tracksolid_db \
  < /migrations/04_bug_fix_migration.sql

# 2. Run schema enhancement migration (new tables + columns)
docker exec -i "$TS_DB" psql -U postgres -d tracksolid_db \
  < /migrations/05_enhancement_migration.sql

# 3. Rebuild and restart ingestion containers with updated code
docker compose up -d --build ingest_movement ingest_events webhook_receiver

# 4. Schedule nightly ETL
# Add to cron or n8n:
#   SELECT dwh_gold.refresh_daily_metrics(CURRENT_DATE - 1);
```

See **Step 0** above for the full deployment sequence. All six migrations (01–06) must be applied in order before starting the new containers; Step 0 includes the complete command block.

---

## 9. Fleet Readiness Scorecard

`[DASHBOARD]` `[MONTHLY]` — a single composite number per vehicle, useful as a morning briefing and a monthly fleet health report. Runs against only the tables you already have — no new DDL required — so this is the fastest concrete win in this document.

Five sub-scores (0 – 100), averaged with weights:
| Sub-score | Weight | Signal |
|---|---|---|
| **Freshness** | 25% | GPS fix age vs. a 5-minute target |
| **Coverage** | 20% | Active days in the last 7 |
| **Silence** | 15% | Tracker went dark > 30 min during working hours |
| **Alarm pressure** | 20% | Alarms per 100 km over 30 days |
| **Driver behaviour** | 20% | Aggression + speeding index |

```sql
WITH freshness AS (
    SELECT
        imei,
        EXTRACT(EPOCH FROM (NOW() - gps_time)) / 60 AS minutes_since_fix
    FROM tracksolid.live_positions
),
coverage AS (
    SELECT
        imei,
        COUNT(DISTINCT DATE(start_time AT TIME ZONE 'Africa/Nairobi')) AS days_active_7d
    FROM tracksolid.trips
    WHERE start_time > NOW() - INTERVAL '7 days'
    GROUP BY imei
),
silence AS (
    -- Gaps > 30 min during 07:00 – 19:00 EAT in the last 7 days
    SELECT
        imei,
        COUNT(*) AS silence_events_7d
    FROM (
        SELECT
            imei,
            gps_time,
            LAG(gps_time) OVER (PARTITION BY imei ORDER BY gps_time) AS prev_time
        FROM tracksolid.position_history
        WHERE gps_time > NOW() - INTERVAL '7 days'
          AND EXTRACT(HOUR FROM gps_time AT TIME ZONE 'Africa/Nairobi') BETWEEN 7 AND 19
    ) gaps
    WHERE EXTRACT(EPOCH FROM (gps_time - prev_time)) > 1800
    GROUP BY imei
),
alarm_pressure AS (
    -- Aggregate alarms and distance separately before joining; joining the raw
    -- tables first would multiply each alarm by every matching trip row.
    SELECT
        al.imei,
        al.alarms_30d,
        km.km_30d
    FROM (
        SELECT imei, COUNT(*) AS alarms_30d
        FROM tracksolid.alarms
        WHERE alarm_time > NOW() - INTERVAL '30 days'
        GROUP BY imei
    ) al
    LEFT JOIN (
        SELECT imei, SUM(distance_km) AS km_30d
        FROM tracksolid.trips
        WHERE start_time > NOW() - INTERVAL '30 days'
        GROUP BY imei
    ) km ON km.imei = al.imei
),
behaviour AS (
    -- Window functions are not allowed inside an aggregate FILTER,
    -- so compute LAG() in a subquery first.
    SELECT
        imei,
        COUNT(*) FILTER (WHERE speed > 100) AS over_100,
        COUNT(*) FILTER (WHERE ABS(speed - prev_speed) > 30) AS harsh_events
    FROM (
        SELECT
            ph.imei,
            ph.speed,
            LAG(ph.speed) OVER (PARTITION BY ph.imei ORDER BY ph.gps_time) AS prev_speed
        FROM tracksolid.position_history ph
        WHERE ph.gps_time > NOW() - INTERVAL '30 days'
          AND ph.source = 'track_list'
    ) sp
    GROUP BY imei
)
SELECT
    d.imei,
    d.driver_name,
    d.vehicle_number,
    ROUND(
        GREATEST(0, 100 - COALESCE(f.minutes_since_fix, 999) / 5.0 * 20)
    ) AS freshness_score,
    ROUND(
        LEAST(100, COALESCE(c.days_active_7d, 0) / 5.0 * 100)
    ) AS coverage_score,
    ROUND(
        GREATEST(0, 100 - COALESCE(s.silence_events_7d, 0) * 10)
    ) AS silence_score,
    ROUND(
        GREATEST(0, 100 - COALESCE(
            ap.alarms_30d::NUMERIC / NULLIF(ap.km_30d, 0) * 100 * 20, 0
        ))
    ) AS alarm_score,
    ROUND(
        GREATEST(0, 100 - COALESCE(b.over_100, 0) * 2 - COALESCE(b.harsh_events, 0) * 3)
    ) AS behaviour_score,
    ROUND(
        GREATEST(0, 100 - COALESCE(f.minutes_since_fix, 999) / 5.0 * 20) * 0.25
      + LEAST(100, COALESCE(c.days_active_7d, 0) / 5.0 * 100) * 0.20
      + GREATEST(0, 100 - COALESCE(s.silence_events_7d, 0) * 10) * 0.15
      + GREATEST(0, 100 - COALESCE(
            ap.alarms_30d::NUMERIC / NULLIF(ap.km_30d, 0) * 100 * 20, 0
        )) * 0.20
      + GREATEST(0, 100 - COALESCE(b.over_100, 0) * 2 - COALESCE(b.harsh_events, 0) * 3) * 0.20
    ) AS readiness_score
FROM tracksolid.devices d
LEFT JOIN freshness f ON f.imei = d.imei
LEFT JOIN coverage c ON c.imei = d.imei
LEFT JOIN silence s ON s.imei = d.imei
LEFT JOIN alarm_pressure ap ON ap.imei = d.imei
LEFT JOIN behaviour b ON b.imei = d.imei
WHERE d.enabled_flag = 1
ORDER BY readiness_score ASC NULLS FIRST;
```

**Interpretation:**

| Score | Band | Action |
|---|---|---|
| 85 – 100 | Green — ready | Dispatch freely |
| 60 – 84 | Amber — monitor | Review the lowest sub-score; fix trackers or coach driver |
| < 60 | Red — unreliable | Do not dispatch for priority jobs; service or replace |
| NULL | Silent | Vehicle never reported — investigate install / commission |

The scorecard is also the cleanest Panel 2 replacement for the Grafana Fleet Status Summary.
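For dashboards or alert routing that consume the query's output, the banding collapses to one lookup — a sketch using the band edges from the table above (the function name is illustrative):

```python
def readiness_band(score):
    """Map a composite readiness score (0–100, or None for a vehicle
    that has never reported) to the action band from the table."""
    if score is None:
        return "silent"   # investigate install / commission
    if score >= 85:
        return "green"    # dispatch freely
    if score >= 60:
        return "amber"    # review the lowest sub-score
    return "red"          # do not dispatch for priority jobs
```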

---

## 10. Service-Interval Forecaster

`[MONTHLY]` `[ALERT]` — predicts when each vehicle will hit its next service interval (default 10,000 km), based on its trailing 30-day km rate. This lets ops pre-book workshop slots and avoid fleet-wide scheduling conflicts.

Requires a service-log table (create once):
```sql
CREATE TABLE IF NOT EXISTS ops.service_log (
    service_id   BIGSERIAL PRIMARY KEY,
    imei         TEXT NOT NULL REFERENCES tracksolid.devices(imei),
    service_date DATE NOT NULL,
    odometer_km  INTEGER NOT NULL,
    service_type TEXT,          -- 'scheduled', 'repair', 'tyre', etc.
    cost_kes     INTEGER,
    notes        TEXT,
    created_at   TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE INDEX IF NOT EXISTS idx_service_log_imei_date
    ON ops.service_log(imei, service_date DESC);
```

**Forecaster query** — km until next service, projected service date:

```sql
WITH last_service AS (
    SELECT DISTINCT ON (imei)
        imei,
        service_date,
        odometer_km
    FROM ops.service_log
    WHERE service_type = 'scheduled'
    ORDER BY imei, service_date DESC
),
current_odometer AS (
    SELECT imei, current_mileage_km
    FROM tracksolid.devices
),
trailing_rate AS (
    SELECT
        imei,
        SUM(distance_km) / 30.0 AS km_per_day_30d
    FROM tracksolid.trips
    WHERE start_time > NOW() - INTERVAL '30 days'
      AND end_time IS NOT NULL
    GROUP BY imei
)
SELECT
    d.imei,
    d.driver_name,
    d.vehicle_number,
    ls.service_date AS last_service_date,
    ls.odometer_km AS last_service_odo,
    co.current_mileage_km AS current_odo,
    (co.current_mileage_km - COALESCE(ls.odometer_km, 0)) AS km_since_service,
    GREATEST(
        0,
        10000 - (co.current_mileage_km - COALESCE(ls.odometer_km, 0))
    ) AS km_to_next_service,
    ROUND(tr.km_per_day_30d, 1) AS km_per_day_30d,
    CASE
        WHEN tr.km_per_day_30d > 0 THEN
            CURRENT_DATE + (
                GREATEST(0, 10000 - (co.current_mileage_km - COALESCE(ls.odometer_km, 0)))
                / tr.km_per_day_30d
            )::INT
        ELSE NULL
    END AS projected_service_date
FROM tracksolid.devices d
LEFT JOIN last_service ls ON ls.imei = d.imei
LEFT JOIN current_odometer co ON co.imei = d.imei
LEFT JOIN trailing_rate tr ON tr.imei = d.imei
WHERE d.enabled_flag = 1
ORDER BY projected_service_date NULLS LAST;
```

**Weekly booking view** — how many vehicles need service in each of the next 8 weeks:

```sql
WITH forecast AS (
    -- (same CTE body as above; wrap as subquery or view `ops.vw_service_forecast`)
    SELECT imei, projected_service_date
    FROM ops.vw_service_forecast
    WHERE projected_service_date IS NOT NULL
)
SELECT
    DATE_TRUNC('week', projected_service_date)::DATE AS week_start,
    COUNT(*) AS vehicles_due
FROM forecast
WHERE projected_service_date BETWEEN CURRENT_DATE AND CURRENT_DATE + INTERVAL '8 weeks'
GROUP BY week_start
ORDER BY week_start;
```

> **Alert:** any vehicle with `km_to_next_service < (7 × km_per_day_30d)` fires an amber ticket to the fleet manager. Any vehicle already overdue (`km_to_next_service = 0`) fires red.
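The alert rule is plain arithmetic over the forecaster's output columns — a sketch using the thresholds from the note above (column names follow `ops.vw_service_forecast`; the function itself is illustrative):

```python
def service_alert(km_to_next_service, km_per_day_30d):
    """Classify one row of the service forecast.

    red:   already at or over the interval (km_to_next_service == 0)
    amber: less than 7 days of runway at the trailing 30-day rate
    ok:    otherwise (also when the rate is unknown/zero)
    """
    if km_to_next_service <= 0:
        return "red"
    if km_per_day_30d and km_to_next_service < 7 * km_per_day_30d:
        return "amber"
    return "ok"
```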

---

## Appendix A — Key Metric Thresholds Reference

| Metric | Green | Amber | Red |
|---|---|---|---|

@ -765,8 +1442,57 @@ docker compose up -d --build ingest_movement ingest_events webhook_receiver

| Days vehicle not used (per month) | 0–2 | 3–5 | > 5 |
| GPS fix age (live_positions) | < 2 min | 2–10 min | > 10 min |
| Alarm rate per vehicle per week | 0–2 | 3–7 | > 7 |
| Readiness score (§9) | ≥ 85 | 60–84 | < 60 |
| Cost per ticket (van, NBO baseline) | < 400 KES | 400–900 KES | > 900 KES |
| On-site within 90 min (§4.5) | ≥ 85% | 70–85% | < 70% |

---

*Document generated: 2026-04-10 · Stack: TimescaleDB 2.15 + PostGIS + Tracksolid Pro Open Platform API*

## Appendix B — Threshold Calibration Guide

Every threshold in Appendix A is a **starting point**. They are drawn from general field-services norms and three Fireside incident reviews — not from Fireside's own distribution. After ~30 days of clean data, recalibrate each one against your own observed p50 / p90 / p99.

**The principle:** green should catch ≥ 50% of vehicle-days, amber ≥ 30%, red ≤ 20%. If red is firing on more than 25% of the fleet every day, the alert is noise and will be ignored.
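Whether a candidate pair of cut-points actually meets the 50/30/20 target can be checked mechanically against any window of per-vehicle-day observations — a small sketch (hypothetical helper, independent of the SQL recipe; assumes higher values are better, as with utilisation %):

```python
def band_shares(values, red_cut, green_cut):
    """Fraction of observations in each band for candidate cut-points:
    red below red_cut, green at or above green_cut, amber in between.
    Target: green >= 0.50, amber >= 0.30, red <= 0.20."""
    n = len(values)
    red = sum(v < red_cut for v in values) / n
    green = sum(v >= green_cut for v in values) / n
    return {"red": red, "amber": 1.0 - red - green, "green": green}
```

Run it on last month's metric values with the current Appendix A edges; if the red share exceeds 0.20–0.25, the band edges need to move before the alert is trusted.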

**Calibration recipe** — run monthly for each threshold-backed metric:

```sql
-- Example: utilisation % — recompute green/amber/red cut-points from the live distribution
WITH daily AS (
    SELECT
        t.imei,
        DATE(t.start_time AT TIME ZONE 'Africa/Nairobi') AS day,
        SUM(t.driving_time_s) / (10.0 * 3600) * 100 AS utilisation_pct
    FROM tracksolid.trips t
    WHERE t.start_time > NOW() - INTERVAL '30 days'
      AND t.end_time IS NOT NULL
    GROUP BY t.imei, day
)
SELECT
    PERCENTILE_CONT(0.25) WITHIN GROUP (ORDER BY utilisation_pct) AS p25_red_cut,
    PERCENTILE_CONT(0.50) WITHIN GROUP (ORDER BY utilisation_pct) AS p50_amber_cut,
    PERCENTILE_CONT(0.75) WITHIN GROUP (ORDER BY utilisation_pct) AS p75_green_cut,
    PERCENTILE_CONT(0.90) WITHIN GROUP (ORDER BY utilisation_pct) AS p90_stretch
FROM daily;
```

Replace the Appendix A band edges with the returned percentiles. Repeat for idle %, speeding rate, harsh driving index, alarms per week. Document the recalibration date and the previous values in a changelog so band drift is visible.

**City-cohort cuts.** Nairobi traffic, Mombasa port runs, and Kampala cross-border routes produce genuinely different distributions. Group the recalibration by `devices.assigned_city` so you end up with three threshold sets, not one fleet-average compromise:

```sql
-- Apply the same percentile function grouped by city
-- (reuses the `daily` CTE from the recipe above — run both in one statement)
SELECT
    d.assigned_city,
    PERCENTILE_CONT(0.50) WITHIN GROUP (ORDER BY utilisation_pct) AS p50,
    PERCENTILE_CONT(0.75) WITHIN GROUP (ORDER BY utilisation_pct) AS p75
FROM daily
JOIN tracksolid.devices d ON d.imei = daily.imei
GROUP BY d.assigned_city;
```

---

*Document updated: 2026-04-18 · Stack: TimescaleDB 2.15 + PostGIS + Tracksolid Pro Open Platform API*
*Ingestion pipeline: `ingest_movement_rev.py` v2.2 · `ingest_events_rev.py` · `webhook_receiver_rev.py`*
*DB state verified: 18 Apr 2026 — live data in `tracksolid_2` (63 devices, pipeline stopped 6 Apr). New stack targets `tracksolid` schema — pending deployment.*
225
06_business_analytics_migration.sql
Normal file

@ -0,0 +1,225 @@
-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
-- Migration 06 — Business Analytics Schema Support
-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
-- Adds the schema objects referenced by 01_BusinessAnalytics.md:
--   • tracksolid.devices.assigned_city  (§3.7 Geographic Drift)
--   • tracksolid.dispatch_log           (§4.4, §4.5 Field-Service SLAs)
--   • ops schema                        (external ops integration namespace)
--   • ops.service_log                   (§10 Service-Interval Forecaster)
--   • ops.odometer_readings             (§3.8 Odometer Divergence)
--   • ops.tickets                       (§2.4 Cost-per-Ticket — skeleton)
--   • ops.vw_service_forecast           (§10 weekly booking view)
--
-- Run after migration 05. Safe to re-run (uses IF NOT EXISTS / DO NOTHING /
-- CREATE OR REPLACE).
-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

BEGIN;

-- ── 1. City cohort column (§3.7) ─────────────────────────────────────────────

ALTER TABLE tracksolid.devices
    ADD COLUMN IF NOT EXISTS assigned_city TEXT;

COMMENT ON COLUMN tracksolid.devices.assigned_city
    IS 'Operating territory code: NBO (Nairobi) | MBA (Mombasa) | KLA (Kampala). '
       'Used for city-cohort analytics and geographic drift detection.';

CREATE INDEX IF NOT EXISTS idx_devices_assigned_city
    ON tracksolid.devices (assigned_city)
    WHERE assigned_city IS NOT NULL;

-- ── 2. Dispatch log (§4.4, §4.5) ──────────────────────────────────────────────
-- One row per ticket dispatch. Populated by n8n / ops integration at
-- assignment; back-filled by nightly job using trips + live_positions.

CREATE TABLE IF NOT EXISTS tracksolid.dispatch_log (
    dispatch_id        BIGSERIAL PRIMARY KEY,
    ticket_id          TEXT NOT NULL,
    imei               TEXT NOT NULL REFERENCES tracksolid.devices(imei),
    driver_name        TEXT,
    job_lat            DOUBLE PRECISION NOT NULL,
    job_lng            DOUBLE PRECISION NOT NULL,
    job_geom           geometry(Point, 4326),
    assigned_at        TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    first_movement_at  TIMESTAMPTZ,
    on_site_at         TIMESTAMPTZ,
    resolved_at        TIMESTAMPTZ,
    cancelled_at       TIMESTAMPTZ,
    distance_km        NUMERIC(8, 2),
    created_at         TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_dispatch_log_ticket
    ON tracksolid.dispatch_log (ticket_id);
CREATE INDEX IF NOT EXISTS idx_dispatch_log_imei_assigned
    ON tracksolid.dispatch_log (imei, assigned_at DESC);
CREATE INDEX IF NOT EXISTS idx_dispatch_log_assigned_at
    ON tracksolid.dispatch_log (assigned_at DESC);
CREATE INDEX IF NOT EXISTS idx_dispatch_log_job_geom
    ON tracksolid.dispatch_log USING GIST (job_geom);

COMMENT ON TABLE tracksolid.dispatch_log
    IS 'Persistent record of every dispatch decision. Powers SLA metrics: '
       'dispatch latency, depart delay, time-to-site, wrench time.';
COMMENT ON COLUMN tracksolid.dispatch_log.first_movement_at
    IS 'First trip start after assigned_at. Back-filled nightly from trips.';
COMMENT ON COLUMN tracksolid.dispatch_log.on_site_at
    IS 'Time vehicle entered 150 m radius of job_geom. Back-filled nightly.';
COMMENT ON COLUMN tracksolid.dispatch_log.resolved_at
    IS 'Ticket close time from the ops system (ops.tickets.closed_at).';

-- ── 3. ops schema namespace ───────────────────────────────────────────────────
-- Separates Fireside operations domain (tickets, services, odometers) from
-- the tracksolid telematics namespace so ownership / grants can diverge.

CREATE SCHEMA IF NOT EXISTS ops;

COMMENT ON SCHEMA ops
    IS 'Fireside operations domain: tickets, service logs, odometer readings. '
       'Distinct from tracksolid.* which holds telematics data.';

-- ── 4. Service log (§10) ──────────────────────────────────────────────────────

CREATE TABLE IF NOT EXISTS ops.service_log (
    service_id    BIGSERIAL PRIMARY KEY,
    imei          TEXT NOT NULL REFERENCES tracksolid.devices(imei),
    service_date  DATE NOT NULL,
    odometer_km   INTEGER NOT NULL,
    service_type  TEXT,
    cost_kes      INTEGER,
    notes         TEXT,
    created_at    TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_service_log_imei_date
    ON ops.service_log (imei, service_date DESC);

COMMENT ON TABLE ops.service_log
    IS 'Workshop service history. Powers §10 Service-Interval Forecaster.';
COMMENT ON COLUMN ops.service_log.service_type
    IS 'scheduled | repair | tyre | bodywork | inspection | other';
COMMENT ON COLUMN ops.service_log.odometer_km
    IS 'Physical odometer reading at service time (integer km).';

-- ── 5. Odometer readings (§3.8) ───────────────────────────────────────────────
-- Periodic physical odometer captures from service events, fuel card receipts,
-- or manual driver entry. Divergence vs tracker-computed distance flags
-- sensor drift or tamper.

CREATE TABLE IF NOT EXISTS ops.odometer_readings (
    reading_id    BIGSERIAL PRIMARY KEY,
    imei          TEXT NOT NULL REFERENCES tracksolid.devices(imei),
    reading_date  DATE NOT NULL,
    reading_km    INTEGER NOT NULL,
    source        TEXT,
    recorded_by   TEXT,
    created_at    TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    UNIQUE (imei, reading_date)
);

CREATE INDEX IF NOT EXISTS idx_odometer_readings_imei_date
    ON ops.odometer_readings (imei, reading_date DESC);

COMMENT ON TABLE ops.odometer_readings
    IS 'Physical odometer captures from service, fuel card, or manual entry. '
       'Powers §3.8 Odometer Divergence audit.';
COMMENT ON COLUMN ops.odometer_readings.source
    IS 'service | fuel_card | driver_manual | workshop_form';

-- ── 6. Tickets skeleton (§2.4) ───────────────────────────────────────────────
-- MINIMAL skeleton so the Cost-per-Ticket query is runnable. In production,
-- this table is expected to be populated by the Fireside ticketing system
-- (Zoho/Freshdesk/job-management export) via n8n or a direct feed. Schema
-- is intentionally narrow — extend with columns specific to your source.

CREATE TABLE IF NOT EXISTS ops.tickets (
    ticket_id      TEXT PRIMARY KEY,
    assigned_imei  TEXT REFERENCES tracksolid.devices(imei),
    driver_name    TEXT,
    customer       TEXT,
    job_type       TEXT,
    priority       TEXT,
    status         TEXT NOT NULL DEFAULT 'open',
    created_at     TIMESTAMPTZ NOT NULL,
    assigned_at    TIMESTAMPTZ,
    closed_at      TIMESTAMPTZ,
    job_lat        DOUBLE PRECISION,
    job_lng        DOUBLE PRECISION,
    job_geom       geometry(Point, 4326),
    ingested_at    TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_tickets_status_created
    ON ops.tickets (status, created_at DESC);
CREATE INDEX IF NOT EXISTS idx_tickets_assigned_imei
    ON ops.tickets (assigned_imei)
    WHERE assigned_imei IS NOT NULL;
CREATE INDEX IF NOT EXISTS idx_tickets_closed_at
    ON ops.tickets (closed_at DESC NULLS LAST);

COMMENT ON TABLE ops.tickets
    IS 'Skeleton for ticket data sourced from the Fireside ops system. '
       'Replace or extend to match the actual feed (Zoho Desk, Freshdesk, etc).';
COMMENT ON COLUMN ops.tickets.status
    IS 'open | assigned | in_progress | resolved | cancelled';

-- ── 7. Service forecast view (§10) ────────────────────────────────────────────
-- Wraps the §10 forecaster CTE so the weekly booking query in
-- 01_BusinessAnalytics.md references a stable object.

CREATE OR REPLACE VIEW ops.vw_service_forecast AS
WITH last_service AS (
    SELECT DISTINCT ON (imei)
        imei,
        service_date,
        odometer_km
    FROM ops.service_log
    WHERE service_type = 'scheduled'
    ORDER BY imei, service_date DESC
),
current_odometer AS (
    SELECT imei, current_mileage_km
    FROM tracksolid.devices
),
trailing_rate AS (
    SELECT
        imei,
        SUM(distance_km) / 30.0 AS km_per_day_30d
    FROM tracksolid.trips
    WHERE start_time > NOW() - INTERVAL '30 days'
      AND end_time IS NOT NULL
    GROUP BY imei
)
SELECT
    d.imei,
    d.driver_name,
    d.vehicle_number,
    ls.service_date AS last_service_date,
    ls.odometer_km AS last_service_odo,
    co.current_mileage_km AS current_odo,
    (co.current_mileage_km - COALESCE(ls.odometer_km, 0)) AS km_since_service,
    GREATEST(
        0,
        10000 - (co.current_mileage_km - COALESCE(ls.odometer_km, 0))
    ) AS km_to_next_service,
    ROUND(tr.km_per_day_30d, 1) AS km_per_day_30d,
    CASE
        WHEN tr.km_per_day_30d > 0 THEN
            CURRENT_DATE + (
                GREATEST(0, 10000 - (co.current_mileage_km - COALESCE(ls.odometer_km, 0)))
                / tr.km_per_day_30d
            )::INT
        ELSE NULL
    END AS projected_service_date
FROM tracksolid.devices d
LEFT JOIN last_service ls ON ls.imei = d.imei
LEFT JOIN current_odometer co ON co.imei = d.imei
LEFT JOIN trailing_rate tr ON tr.imei = d.imei
WHERE d.enabled_flag = 1;

COMMENT ON VIEW ops.vw_service_forecast
    IS 'Projected next-service date per vehicle based on 30-day km rate. '
       'Service interval default 10,000 km — override at query time if needed.';

COMMIT;
145
20260414_FS__Logistics - final_fixed.csv
Normal file

@ -0,0 +1,145 @@
Account,Customer Name,Device Name,IMEI,Model,Activated Date,Sales Time,SIM,MAC,Subscription Expiration,User Expiration Date,Battery replacement date,Group,ICCID,IMSI,Driver Name,Telephone,License Plate No.,ID Number,Department,VIN,Engine Number,Vehicle Brand,Vehicle Model,Fuel/100km,Installation Time
fireside,Fireside Group HQ,UMA 382EK_UG,865135061569479,X3,2026-02-26,2025-09-08,+256792997079,,2036-02-27,2036-02-27,,Default Group,8925610001837573419F,641101970467667,UG,,UMA 382EK,,MTN,,,,,,
fireside,Fireside Group HQ,UMA 418EK_UG,865135061569131,X3,2026-02-26,2025-09-08,+256792997053,,2036-02-27,2036-02-27,,Default Group,8925610001837573385F,641101970467664,UG,,UMA 418EK,,MTN,,,,,,
fireside,Fireside Group HQ,John Mbugua/OSP-KDW 573B_CAM,862798052707896,JC400P,2026-01-30,2025-06-11,,,2036-01-31,2036-01-31,,Default Group,89254021414206816725,639021410681672,John Mbugua,,KDW 573B,,OSP,,,,Probox,,
fireside,Fireside Group HQ,JOEL NTUMBA/ISP-UMA 826AB_UG,865135061563423,X3,2026-01-28,2025-09-08,0119051036,,2036-01-29,2036-01-29,,Default Group,89254021414206652690,639021410665269,Joel Ntumba,,UMA 826AB,,MTN,,,,Motorbike,,
fireside,Fireside Group HQ,RODIN KIBERU/ISP-UMA 011EK_UG,865135061564280,X3,2026-01-28,2025-09-08,0118081642,,2036-01-29,2036-01-29,,Default Group,89254021414206817244,639021410681724,Rodin Kiberu,,UMA 011EK,,MTN,,,,Motorbike,,
fireside,Fireside Group HQ,Wambua/ROLLOUT-KDV 683Z_CAM,862798052708068,JC400P,2026-01-24,2025-06-11,0758048043,,2036-01-25,2036-01-25,,Default Group,89254021414206816964,639021410681696,Dominic Wambua,,KDV 683Z,,ROLLOUT,,,,Probox,,
fireside,Fireside Group HQ,Levine/OSP-KDV 439_CAM,862798052708167,JC400P,2025-12-13,2025-06-11,0758046738,,2035-12-14,2035-12-14,,Default Group,89254021414206816741,639021410681674,Levine Wasike,,KDV 439W,,FDS,,,,Probox,,
fireside,Fireside Group HQ,Benjamin/PLAN-KDV 438W_Track,865135061563639,X3,2025-12-13,2025-09-08,0758047065,,2035-12-14,2035-12-14,,Default Group,89254021414206816683,639021410681668,Benjamin Ananda,,KDV 438W,,PLANNING,,,,Probox,,
fireside,Fireside Group HQ,Albert/FDS-KDV 437W_Track,865135061569123,X3,2025-12-13,2025-09-08,0758047101,,2035-12-14,2035-12-14,,Default Group,89254021414206816881,639021410681688,Albert Mutwiri,,KDV 437W,,FDS,,,,Probox,,
fireside,Fireside Group HQ,Silvanus/FDS-KDV 064S_Track,865135061564470,X3,2025-11-21,2025-09-08,0113669866,,2035-11-22,2035-11-22,,Default Group,89254021414206378718,639021410637871,Silvanus Kipkorir,,KDV 064S,,AIRTEL,,,,Probox,,
fireside,Fireside Group HQ,Robbert/FDS-KDV 072L_Track,865135061581904,X3,2025-11-21,2025-09-08,0114149576,,2035-11-22,2035-11-22,,Default Group,89254021264261503993,639021266150399,Robert Kipruto,,KDV 072L,,FDS,,,,Probox,,
fireside,Fireside Group HQ,Benard Kimutai/KDN 759G_CAM,862798052713779,JC400P,2025-08-23,2025-06-11,0752143258,,2035-08-24,2035-08-24,,Default Group,89254035061001753860,639035060175386,Benard Kimutai,,KDN 759G,,OSP,,,,Probox,,
fireside,Fireside Group HQ,Geoffrey/Rider-KMGS 239H,865135061043426,X3,2025-08-22,2025-06-11,0768696658,,2035-08-23,2035-08-23,,Default Group,89254021394274518926,639021397451892,Geoffrey Karanja,,KMGS 239H,,OSP-PATROL,,,,Motorbike,,
fireside,Fireside Group HQ,Samuel Kihara/Rider_KMEL 225X,865135061053714,X3,2025-08-02,2025-06-11,0768696832,,2035-08-03,2035-08-03,,Default Group,89254021394274518934,639021397451893,Samuel Kihara,,KMEL 225X,,OSP-PATROL,,,,Motorbike,,
fireside,Fireside Group HQ,Brian Njenga/Rider-KMFF 113Z,865135061036164,X3,2025-07-31,2025-06-11,0768696705,,2035-08-01,2035-08-01,,Default Group,89254021394274518850,639021397451885,Brian Njenga,,KMFF 113Z,,OSP-PATROL,,,,Motorbike,,
fireside,Fireside Group HQ,KMGK 596V,865135061049001,X3,2025-07-31,2025-06-11,0768697064,,2035-08-01,2035-08-01,,Default Group,89254021394274518884,639021397451888,Parked,,KMGK 596V,,DELIVERIES,,,,Motorbike,,
fireside,Fireside Group HQ,Rofas/General-KDT 728R_CAM,862798052715220,JC400P,2025-07-16,2025-06-11,0704573658,,2035-07-17,2035-07-17,,Default Group,89254021334258495873,639021335849587,Rofas Njagi,,KDT 728R,,REGIONAL,,,,Probox,,
fireside,Fireside Group HQ,Emmanuel/Gen-KDS 453Y_Track,865135061037980,X3,2025-07-15,2025-06-11,0790176734,,,2035-07-15,,Default Group,89254021394215205856,639021391520585,Emmanuel Luseno,,KDS 453Y,,GENERAL,,,,Pick-Up,,
fireside,Fireside Group HQ,Kimeria/Crane-KDS 525D_Track,865135061035778,X3,2025-07-11,2025-06-11,0790176738,,2035-07-12,2035-07-12,,Default Group,89254021394215205922,639021391520592,John Kimeria,,KDS 525D,,GENERAL,,,,Crane,,
fireside,Fireside Group HQ,Rashid/ISP-KDM 840V_Track,865135061053748,X3,2025-07-10,2025-06-11,0768445963,,2035-07-11,2035-07-11,,Default Group,89254021334212352574,639021331235257,Rashid Hassan,,KDM 840V,,ISP,,,,Probox,,
fireside,Fireside Group HQ,Wambugu/FDS-KDR 592N_Track,865135061042261,X3,2025-07-10,2025-06-11,0797680464,,2035-07-11,2035-07-11,,Default Group,89254021334258159693,639021335815969,Kelvin Wambugu,,KDR 592N,,FDS,,,,Probox,,
fireside,Fireside Group HQ,James Onyango-KDU 613B__CAM,862798052713811,JC400P,2025-07-09,2025-06-11,0790176542,,2035-07-10,2035-07-10,,Default Group,89254021394215205880,639021391520588,James Onyango,,KDU 613B,,ISP,,,,Probox,,
fireside,Fireside Group HQ,Mazda-KDU 613A_Track,865135061047435,X3,2025-07-09,2025-06-11,0790175971,,2035-07-10,2035-07-10,,Default Group,89254021394215205971,639021391520597,Management_Mazda,,KDU 613A,,MGT,,,,Mazda,,
fireside,Fireside Group HQ,Charles Nyambane/ISP-KCB 711C_CAM,862798050522743,JC400P,2023-12-22,2024-11-08,0768657106,,2033-12-23,2033-12-23,,Default Group,,,Charles Nyambane,,KCB 711C,,ISP,,,,Probox,,
fireside,Fireside Group HQ,Sadique/GEN-KDC 490Q_CAM,862798050525225,JC400P,2023-12-22,2024-11-08,0768652386,,2043-12-22,2043-12-22,,Default Group,,,Sadique Wakayula,,KDC 490Q,,GENERAL,,,,Crane,,
fireside,Fireside Group HQ,Samuel Nganga/ISP-KDE 264M_CAM,862798050525068,JC400P,2023-12-22,2024-11-08,0768658564,,2033-12-23,2033-12-23,,Default Group,,,Samuel Ng'ang'a,,KDE 264M,,ISP,,,,Probox,,
fireside,Fireside Group HQ,Kennedy Ondieki/ISP-KCU 237Z_CAM,862798050525837,JC400P,2023-12-21,,0113669852,,2033-12-22,2033-12-22,,Default Group,,,Kennedy Ondieki,,KCU 237Z,,ISP,,,,Probox,,
fireside,Fireside Group HQ,Geoffrey Too/OSP-KDM 308S_CAM,862798050523618,JC400P,2023-08-15,2023-08-22,0701211625,,2033-08-16,2033-08-16,,Default Group,,,Geoffrey Too,,KDM 308S,,OSP,,,,Probox,,
fireside,Fireside Group HQ,Job Ngare/ISP Coast-KDM309S_CAM,862798050523816,JC400P,2023-08-15,2023-08-22,0707936781,,2033-08-16,2033-08-16,,Default Group,,,Job Ngare,,KDM 309S,,ISP,,,,Probox,,
fireside,Fireside Group HQ,Daudi Jaoko/OSP-KDK 815R_Track,359857082912239,GT06E,2023-06-21,2023-07-27,0706392117,,2033-06-22,2033-06-22,,Default Group,89254021234296021287,639021239602128,Dickson Jaoko,,KDK 815R,,OSP,,,,Probox,,
fireside,Fireside Group HQ,Peter Mbugua/ISP-KDK 728K_Track,359857082897091,GT06E,2022-12-14,2022-12-16,0790262984,,2042-12-15,2042-12-15,,Default Group,89254021234222500396,639021232250039,Peter Mbugua,,KDK 728K,,ISP,,,,Probox,,
fireside,Fireside Group HQ,Peter Mbugua/KDK 728K_CAM,862798050524608,JC400P,2022-12-03,2022-12-15,0706742413,,2042-12-04,2042-12-04,,Default Group,,,Peter Mbugua,,KDK 728K,,ISP,,,,Probox,,
fireside,Fireside Group HQ,JC400P-24368,862798050524368,JC400P,2022-10-29,2022-12-17,,,2042-10-30,2042-10-30,,Default Group,,,Identification,,,,,,,,,,
fireside,Fireside Group HQ,Mutuku/FDS-KDC 739F_CAM,862798050524558,JC400P,2022-01-22,2022-01-25,0100858817,,2042-01-23,2042-01-23,,Default Group,,,Mutuku Joseph,,KDC 739F,,FDS,,,,Probox,,
fireside,Fireside Group HQ,Cornelius/FDS-KCU 938R_CAM,862798050524897,JC400P,2022-01-22,2022-01-25,0114924404,,2042-01-23,2042-01-23,,Default Group,,,Cornelius Kimutai,,KCU 938R,,FDS,,,,Van,,
fireside,Fireside Group HQ,Cassius/OSP-KDB 323M_CAM,862798050522107,JC400P,2022-01-22,2022-01-25,0114149576,,2042-01-23,2042-01-23,,Default Group,,,Cassius Wakiyo,,KDB 323M,,OSP,,,,Probox,,
fireside,Fireside Group HQ,Richardson/ISP/Coast-KDC 207R _CAM,862798050524657,JC400P,2022-01-22,2022-01-25,0758689195,,2042-01-23,2042-01-23,,Default Group,,,Felix Andole,,KDC 207R,,ISP,,,,Probox,,
fireside,Fireside Group HQ,George/OSP KDD 684Y-CAM,862798050523386,JC400P,2022-01-22,2022-01-27,0785586834,,2042-01-23,2042-01-23,,Default Group,,,George Ochieng',,KDD 684Y,,OSP,,,,Probox,,
fireside,Fireside Group HQ,Hamis Pande/ISP-KDD 689Y_CAM,862798050524384,JC400P,2022-01-22,2022-01-27,0701211744,,2042-01-23,2042-01-23,,Default Group,,,Hamisi Pande,,KDD 689Y,,ISP,,,,Probox,,
fireside,Fireside Group HQ,Simon Kamau/ISP-KCE 090R_CAM,862798050525589,JC400P,2022-01-19,2022-01-17,0796276387,,2042-01-20,2042-01-20,,Default Group,,,Simon Kamau,,KCE 090R,,ISP,,,,Probox,,
fireside,Fireside Group HQ,Makori John/PLAN-KDB 585E_CAM,862798050525423,JC400P,2022-01-15,2022-01-17,0701211724,,2042-01-16,2042-01-16,,Default Group,,,Makori John,,KDB 585E,,PLANNING,,,,Probox,,
fireside,Fireside Group HQ,Oseko/OSP-KCG 668W_CAM,862798050525951,JC400P,2022-01-15,2022-01-17,0741943212,,2042-01-16,2042-01-16,,Default Group,,,Wright Oseko,,KCG 668W,,OSP,,,,Probox,,
fireside,Fireside Group HQ,Garage/OSP-KCH 167M_CAM,862798050522859,JC400P,2022-01-15,2022-01-17,0706740252,,2042-01-16,2042-01-16,,Default Group,,,Garage,,KCH 167M,,OSP,,,,Probox,,
fireside,Fireside Group HQ,Garage/ROLL-KCE 699F_CAM,862798050524707,JC400P,2022-01-15,2022-01-17,0110525751,,2042-01-16,2042-01-16,,Default Group,,,Garage,,KCE 699F,,ROLLOUT,,,,Probox,,
fireside,Fireside Group HQ,Dan Watila/ISP-KDE 638J_CAM,862798050522883,JC400P,2022-01-15,2022-01-17,0112615393,,2042-01-16,2042-01-16,,Default Group,,,Dan Watila,,KDE 638J,,ISP,,,,Probox,,
fireside,Fireside Group HQ, Samuel Kamau/ROLL-KCA 542Q_CAM,862798050525605,JC400P,2022-01-15,2022-01-17,0110526783,,2042-01-16,2042-01-16,,Default Group,,,John Ondego,,KCA 542Q,,ISP,,,,Probox,,
fireside,Fireside Group HQ,Brian Ngetich/ISP-KDA 717B_CAM,862798050288360,JC400P,2021-11-05,2021-11-08,0717867861,,2041-11-06,2041-11-06,,Default Group,,,Brian Ngetich,,KDA 717B,,ISP,,,,Probox,,
fireside,Fireside Group HQ,Patric Bet/OSP-KDA 609E_CAM,862798050288261,JC400P,2021-10-23,2021-10-25,0790176509,,2041-10-24,2041-10-24,,Default Group,,,Patric Bett,0112693340,KDA 609E,,OSP,,,,Probox,,
fireside,Fireside Group HQ,Gabriel/ROLL-KCE 690F_Track,359857082042052,GT06E,2020-04-03,2020-04-16,0110094466,,2040-04-04,2040-04-04,,Default Group,89254021164215938024,639021161593802,Gabriel Musumba,,KCE 690F,,OSP,,,,Probox,,
fireside,Fireside Group HQ,Allan Owana/ISP-KDK780K_Track,359857081885410,GT06E,2019-06-19,2019-07-01,0703616117,,2039-06-20,2039-06-20,,Default Group,89254021234222499854,639021232249985,Allan Owana,,KDK 780K,,ISP,,,,Probox,,
fireside,Fireside Group HQ, Garage/OSP-KCH 167M,359857081891798,GT06E,2019-06-16,2019-07-01,0746760102,,2039-06-17,2039-06-17,,Default Group,89254021084186499493,639021088649949,Garage,,KCH 167M,,OSP,,,,Probox,,
fireside,Fireside Group HQ,John Ondego/ISP-KCA 542Q_Track,359857081891632,GT06E,2019-06-15,2019-07-01,0746760038,,2039-06-16,2039-06-16,,Default Group,89254021084186499485,639021088649948,John Ondego,,KCA 542Q,,ISP,,,,Probox,,
fireside,Fireside Group HQ,JC400P-08035,862798052708035,JC400P,Inactive,2025-06-11,,,120Month,——,,Default Group,,,Identification,,,,,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Wambua/ROLLOUT-KDV 683Z_Track,865135061563597,X3,2026-01-30,2026-02-24,0758052405,,2036-01-31,2036-01-31,,Default Group,89254021414206816733,639021410681673,Dominic Wambua,,KDV 683Z,,ROLLOUT,,,,Probox,,
Fireside@HQ,Fireside Telematics ,John Mbugua/OSP-KDW 573B_Track,865135061562722,X3,2026-01-30,2026-02-24,0758052508,,2036-01-31,2036-01-31,,Default Group,89254021414206816832,639021410681683,John Mbugua,,KDW 573B,,OSP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Godffrey Nandwa/ISP-KCN 496A_CAM,862798052708282,JC400P,2026-01-25,2026-02-20,0758047934,,2036-01-26,2036-01-26,,Default Group,89254021414206816865,639021410681686,Godffrey Nandwa,,KCN 496A,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Benjamin/PLAN-KDV 438W_CAM,862798052707888,JC400P,2025-12-15,2026-02-20,0758047312,,2035-12-16,2035-12-16,,Default Group,89254021414206816980,639021410681698,Benjamin Ananda,,KDV 438W,,PLANNING,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Albert/FDS-KDV 437W_CAM,862798052708076,JC400P,2025-12-13,2026-02-20,0758047094,,2035-12-14,2035-12-14,,Default Group,89254021414206816782,639021410681678,Albert Mutwiri,,KDV 437W,,FDS,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Levine/OSP-KDV 439W_Track,865135061562847,X3,2025-12-13,2026-02-24,0758047032,,2035-12-14,2035-12-14,,Default Group,89254021414206816840,639021410681684,Levine Wasike,,KDV 439W,,OSP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,JC400P-14066,862798052714066,JC400P,2025-11-21,2025-06-11,,,2035-11-22,2035-11-22,,Default Group,89254021414206378684,639021410637868,Identification,,,,,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Kennedy Ondieki/ISP-KCU 237Z_CAM,862798052713837,JC400P,2025-10-08,2026-02-20,0113669852,,2035-10-09,2035-10-09,,Default Group,89254021414206327855,639021410632785,Kennedy Ondieki,,KCU 237Z,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,JC400P-13696,862798052713696,JC400P,2025-09-02,2025-06-11,,,2035-09-03,2035-09-03,,Default Group,89254021394215205906,639021391520590,Identification,,,,,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Gitau/Regional-KDT 916R_CAM,862798052713985,JC400P,2025-08-02,2026-02-20,0768696668,,2035-08-03,2035-08-03,,Default Group,89254021394274518892,639021397451889,Timothy Gitau,,KDT 916R,,REGIONAL,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Richardson Komu-KDT 923R_Track,865135061035653,X3,2025-08-02,2026-02-24,0768697292,,2035-08-03,2035-08-03,,Default Group,89254021394274518942,639021397451894,Richardson Komu,,KDT 923R,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Muriithi/Huawei-KDR 594N_Track,865135061048466,X3,2025-07-24,2026-02-24,0797680395,,2035-07-25,2035-07-25,,Default Group,89254021334258159628,639021335815962,Samuel Muriithy,,KDR 594N,,ROLLOUT,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Rofas/General-KDT 728R_Track,865135061054555,X3,2025-07-16,2026-02-24,0790176726,,2035-07-17,2035-07-17,,Default Group,89254021394215205823,639021391520582,Rofas Njagi,,KDT 728R,,REGIONAL,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Mazda-KDU 613A_CAM,862798052713761,JC400P,2025-07-09,2026-02-20,0790176786,,2035-07-10,2035-07-10,,Default Group,89254021394215205955,639021391520595,Management_Mazda,,KDU 613A,,MGT,,,,Mazda,,
Fireside@HQ,Fireside Telematics ,James Onyango-KDU 613B_Track,865135061054548,X3,2025-07-09,2026-02-24,0790175997,,2035-07-10,2035-07-10,,Default Group,89254021394215205948,639021391520594,James Onyango,,KDU 613B,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Rashid/ISP-KDM 840V_CAM,862798050526231,JC400P,2023-12-22,2026-02-20,0790175526,,2043-12-23,2043-12-23,,Default Group,,,Rashid Hassan,,KDM 840V,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Mike Wanaswa/FDS-KDT 724R_CAM,862798050523139,JC400P,2023-12-22,2026-02-20,0790175045,,2043-12-23,2043-12-23,,Default Group,,,Mike Wanaswa,,KDT 724R,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Wambugu/FDS-KDR 592N_CAM,862798050523063,JC400P,2023-12-22,2026-02-20,0701211876,,2043-12-22,2043-12-22,,Default Group,,,Kelvin Wambugu,,KDR 594N,,FDS,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Major Simiyu/FDS-KDS949Y_CAM,862798050523626,JC400P,2023-12-22,2026-02-20,0701211892,,2033-12-23,2033-12-23,,Default Group,,,Major Simiyu,,KDS 949Y,,FDS,,,,Probox,,
Fireside@HQ,Fireside Telematics , VICTOR/OSP-KDS919Y_CAM ,862798050523337,JC400P,2023-12-22,2026-02-20,0700242527,,2043-12-22,2043-12-22,,Default Group,,,Victor Kimutai,,KDS 919Y,,OSP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Emmanuel/Gen-KDS 453Y_CAM,862798050523295,JC400P,2023-12-22,2026-02-20,0700242474,,2033-12-23,2033-12-23,,Default Group,,,Emmanuel Luseno,,KDS 453 Y,,GENERAL,,,,Pick-Up,,
Fireside@HQ,Fireside Telematics ,Muriithi/Huawei-KDR 594N_CAM,862798050523014,JC400P,2023-12-21,2026-02-20,0790175423,,2033-12-22,2033-12-22,,Default Group,,,Samuel Muriithy,,KDR 594N,,ROLLOUT,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Kimeria-General-KDS 525D_CAM,862798050521521,JC400P,2023-11-26,2026-02-20,0752958416,,2033-11-27,2033-11-27,,Default Group,,,John Kimeria,,KDS 525D,,GENERAL,,,,Crane,,
Fireside@HQ,Fireside Telematics ,Leonard/ISP-KDM 306S _CAM,862798050524533,JC400P,2023-08-21,2026-02-20,0703487162,,2033-08-22,2033-08-22,,Default Group,,,Leonard Nzai,,KDM 306S,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Job Ngare/ISP Coast-KDM309S_Track,359857082898016,GT06E,2023-08-15,2026-02-24,0706895756,,2033-08-16,2033-08-16,,Default Group,89254021324273007563,639021327300756,Job Ngare,,KDM 309S,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Dickson Jaoko/OSP-KDK 815R_CAM,862798050525266,JC400P,2023-06-21,2026-02-20,0706665867,,2033-06-22,2033-06-22,,Default Group,,,Dickson Jaoko,,KDK 815R,,OSP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Alan Owana/ISP-KDK 780K_CAM,862798050523527,JC400P,2022-12-03,2026-02-20,0792375024,,2042-12-04,2042-12-04,,Default Group,,,Allan Owana,,KDK 780K,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Amani Sulubu/ISP-KCY 090X_CAM,862798050524426,JC400P,2022-01-16,2026-02-20,0113823350,,2042-01-17,2042-01-17,,Default Group,,,Amani Sulubu,,KCY 090X,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Gideon/ISP-KCQ 215F_CAM,862798050522065,JC400P,2022-01-16,2026-02-20,0113343715,,2042-01-17,2042-01-17,,Default Group,,,Gideon Kiprono,,KCQ 215F,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Gabriel/OSP-KCE 690F_CAM,862798050525670,JC400P,2022-01-15,2026-02-20,0701211996,,2042-01-16,2042-01-16,,Default Group,,,Gabriel Musumba,,KCE 690F,,OSP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Santoes/OSP-KCZ 181P_CAM D-Max,862798050288345,JC400P,2021-11-06,2026-02-20,0768446105,,2041-11-07,2041-11-07,,Default Group,,,Santoes Omondi,,KCZ 181P,,OSP,,,,Pick-Up,,
Fireside@HQ,Fireside Telematics ,Elias Baya/FDS-KCZ 476E_CAM,862798050288303,JC400P,2021-11-06,2026-02-20,0115870439,,2041-11-07,2041-11-07,,Default Group,,,Elias Baya,,KCZ 476E,,FDS,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Nicholas Erastus /ISP-KCQ 581M_CAM,862798050288212,JC400P,2021-11-02,2026-02-20,0746979531,,2041-11-03,2041-11-03,,Default Group,,,Nicholas Erastus,,KCQ 581M,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Samuel Ng'ang'a/ISP-KDE 264M_Track,359857082898008,GT06E,2021-10-28,2026-02-24,0711731539,,2041-10-29,2041-10-29,,Default Group,89254021264260342245,639021266034224,Samuel Ng'ang'a,,KDE 264M,,ISP ,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Dan Watila/ISP-KDE 638J,359857082898487,GT06E,2021-10-21,2026-02-24,0116242996,,2041-10-22,2041-10-22,,Default Group,89254021334258404214,639021335840421,Dan Watila,,KDE 638J,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Geoffrey Too/ISP-KDM 308S,359857082900358,GT06E,2021-10-21,2026-02-24,0796527601,,2041-10-22,2041-10-22,,Default Group,89254021264260126572,639021266012657,Geoffrey Too,,KDM 308S,,OSP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Hamisi/ISP-KDD 689Y,359857082896911,GT06E,2021-09-17,2026-02-24,0112714612,,2041-09-18,2041-09-18,,Default Group,89254021214211314660,639021211131466,Hamisi Pande,,KDD 689Y,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,George/OSP-KDD 684Y_Track,359857082900697,GT06E,2021-09-17,2026-02-24,0114879518,,2041-09-18,2041-09-18,,Default Group,89254021214211314678,639021211131467,George Ochieng',,KDD 684Y,,OSP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Cassius/OSP-KDB 323M_Track,359857082897257,GT06E,2021-08-29,2026-02-24,0746428882,,2041-08-29,2041-08-29,,Default Group,89254021234222500818,639021232250081,Cassius Wakiyo,,KDB 323M,,OSP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,John Makori/PLAN-KDB 585E,359857082897737,GT06E,2021-08-29,2026-02-24,0114596734,,2041-08-29,2041-08-29,,Default Group,89254021214211145262,639021211114526,John Makori,,KDB 585E,,PLANNING,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Kelvin Gichea/ISP-KDA 717B,359857082911983,GT06E,2021-08-29,2026-02-24,0795188807,,2041-08-29,2041-08-29,,Default Group,89254021214211145288,639021211114528,Brian Ngetich,0795188807,KDA 717B,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Sadique/GEN-KDC 490Q_Track,359857082902461,GT06E,2021-05-22,2026-02-24,0757556468,,2041-05-22,2041-05-22,,Default Group,89254021154296722488,639021159672248,Sadique Wakayula,,KDC 490Q,,GENERAL,,,,Crane,,
Fireside@HQ,Fireside Telematics ,Andrew Makanda/ISP/Coast-KDC 207R ,359857082902503,GT06E,2021-05-15,2026-02-24,0794820817,,2041-05-15,2041-05-15,,Default Group,89254021224270993254,639021227099325,Felix Andole,,KDC 207R,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Mutuku Joseph/FDS-KDC 739F ,359857082897794,GT06E,2021-04-10,2026-02-24,0115019037,,2041-04-10,2041-04-10,,Default Group,89254021224222632356,639021222263235,Mutuku Joseph,0115019037,KDC 739F,,FDS,,,,Probox,,
Fireside@HQ,Fireside Telematics , Patric Bet/OSP-KDA 609E_Track,359857082910589,GT06E,2020-10-26,2026-02-24,0797622637,,2040-10-27,2040-10-27,,Default Group,89254021154296722496,639021159672249,Patric Bett,,KDA 609E,,OSP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Charles Nyambane/ISP-KCB 711C_Track,359857082918012,GT06E,2020-09-21,2026-02-24,0793704231,,2040-09-22,2040-09-22,,Default Group,89254021154287138363,639021158713836,Charles Nyambane,,KCB 711C,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Oseko Wright/OSP-KCG 668W_Track,359857081887069,GT06E,2019-06-30,2026-02-24,0746763106,,2039-07-01,2039-07-01,,Default Group,89254021084186499915,639021088649991,Wright Oseko,,KCG 668W,,OSP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,KCE 699F,359857081891590,GT06E,2019-06-16,2026-02-24,0746760215,,2039-06-17,2039-06-17,,Default Group,89254021084186499519,639021088649951,Garage,,KCE 699F,,ROLLOUT,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Simon Kamau/ISP-KCE 090R,359857081891566,GT06E,2019-06-16,2026-02-24,0746760404,,2039-06-17,2039-06-17,,Default Group,89254021084186499527,639021088649952,Simon Kamau,,KCE 090R,,ISP,,,,Probox,,
Fireside@HQ,Fireside Telematics ,Cornelius/FDS-KCU 938R VAN,359857081892101,GT06E,2019-06-12,2026-02-24,0746759919,,2039-06-13,2039-06-13,,Default Group,89254021084186499451,639021088649945,Cornelius Kimutai,,KCU 938R,,FDS,,,,Van,,2019-06-12
Fireside@HQ,Fireside Telematics ,Nicholas Erastus/ISP-KCQ581M,359857081892309,GT06E,2019-06-09,2026-02-24,0700023776,,2039-06-10,2039-06-10,,Default Group,89254021084178504672,639021087850467,Nicholas Erastus,,KCQ 581M,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Barack_Personal-KDW 781E,865135061563415,X3,2026-01-13,2025-09-08,0758052541,,2036-01-14,2036-01-14,,Default Group,89254021414206816931,639021410681693,Barack Orwa,,KDW 781E,,MGT,,,,Vazel,,
Fireside_MSA,Fireside Group MSA,Major Simiyu-KDS 949Y_Track,865135061035133,X3,2025-08-02,2025-06-11,0768696642,,2035-08-03,2035-08-03,,Default Group,89254021394274518918,639021397451891,Major Simiyu,,KDS 949Y,,FDS,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Harisson/KDT 724R_Track,865135061043079,X3,2025-08-02,2025-06-11,0768696664,,2035-08-03,2035-08-03,,Default Group,89254021394274518959,639021397451895,Mike Wanaswa,,KDT 724R,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Gitau/Regional-KDT 916R_Track,865135061048953,X3,2025-08-02,2025-06-11,0768697056,,2035-08-03,2035-08-03,,Default Group,89254021394274518967,639021397451896,Timothy Gitau,,KDT 916R,,REGIONAL,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Victor/OSP-KDS 919Y_Track,865135061048276,X3,2025-08-02,2025-06-11,0768696755,,2035-08-03,2035-08-03,,Default Group,89254021394274518900,639021397451890,Victor Kimutai,,KDS 919Y,,OSP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Ian Dancan-KDT 923R_CAM,862798050526256,JC400P,2023-12-22,,0794873610,,2043-12-22,2043-12-22,,Default Group,,,Ian Dancun,,KDT 923R,,QEHS,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Wilfred/Gen-KCU 729C_CAM,862798050526165,JC400P,2023-11-26,2024-11-08,0790564929,,2033-11-27,2033-11-27,,Default Group,,,Wilfred Kinyanjui,,KCU 729C,,GENERAL,,,,Crane,,
Fireside_MSA,Fireside Group MSA,Denis Kazungu/KDM 794R_Track,359857082916826,GT06E,2023-08-21,2023-08-22,0705700971,,2033-08-22,2033-08-22,,Default Group,89254021324273006854,639021327300685,Denis Kazungu,,KDM 794R,,FDS,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Mutuku Anthony-KDK 732K_Track,359857082898073,GT06E,2022-12-20,2022-12-20,0793026954,,2042-12-21,2042-12-21,,Default Group,89254021234222387539,639021232238753,Mutuku Antony,,KDK 732K,,FDS,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Anthon/KDK 732K_CAM,862798050524681,JC400P,2022-12-06,2022-12-16,0796275746,,2042-12-07,2042-12-07,,Default Group,,,Mutuku Antony,,KDK 732K,,FDS,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Makanda-KCZ 155P_CAM,862798050524566,JC400P,2022-01-22,2025-02-24,0758781444,,2042-01-23,2042-01-23,,Default Group,,,Makanda Andrew,,KCZ 155P,,OSP,,,,Pick-Up,,
Fireside_MSA,Fireside Group MSA,Dennis Kazungu/-KDM 794R_CAM,862798050521612,JC400P,2022-01-22,2024-11-19,0704113731,,2042-01-23,2042-01-23,,Default Group,,,Denis Kazungu,,KDM 794R,,FDS,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Mbuvi Kioko/OSP-KCZ 199P_CAM,862798050522719,JC400P,2022-01-16,2022-12-16,0768218655,,2042-01-17,2042-01-17,,Default Group,,,Mbuvi Kioko,,KCZ 199P,,OSP,,,,Pick-Up,,
Fireside_MSA,Fireside Group MSA,Felix Muema-KCZ 223P_CAM D-Max,862798050524087,JC400P,2022-01-16,2024-12-30,0113973875,,2042-01-17,2042-01-17,,Default Group,,,Felix Muema,,KCZ 223P,,OSP,,,,Pick-Up,,
Fireside_MSA,Fireside Group MSA,Lawrence Kijogi/ROLL-KCY 080X_CAM,862798050522891,JC400P,2022-01-16,2022-12-16,0113287191,,2042-01-17,2042-01-17,,Default Group,,,Lawrence Kijogi,,KCY 080X,,ROLLOUT,,,,Pick-Up,,
Fireside_MSA,Fireside Group MSA,Ndegwa Duncan/PM-KCG 669W_CAM,862798050524392,JC400P,2022-01-16,2022-12-16, 0113799173,,2042-01-17,2042-01-17,,Default Group,,,Ndegwa Dancun,,KCG 669W,,OSP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Simon Munda-KCZ 154S_CAM,862798050521752,JC400P,2022-01-16,2022-12-16,0113805921,,2042-01-17,2042-01-17,,Default Group,,,Simon Munda,,KCZ 154S,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Moses Wambua-KCZ 751V_CAM,862798050524012,JC400P,2022-01-16,2022-12-16,0113313797,,2042-01-17,2042-01-17,,Default Group,,,Moses Wambua,,KCZ 751V,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Amani Kazungu-KCY 084X_CAM,862798050523204,JC400P,2022-01-16,2022-12-16,0707892547,,2042-01-17,2042-01-17,,Default Group,,,Amani Kazungu,,KCY 084X,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Joseph Kabandi-KCY 076X_CAM,862798050523949,JC400P,2022-01-16,2022-12-16, 0113288492,,2042-01-17,2042-01-17,,Default Group,,,Joseph Kabandi,,KCY 076X,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Kennedy Chege-KCQ 618K_CAM,862798050525613,JC400P,2022-01-16,2022-12-19,0729994247,,2042-01-17,2042-01-17,,Default Group,,,Kennedy Chege,,KCQ 618K,,OSP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Noel/FDS/VOI-KCY 838X_CAM,862798050525753,JC400P,2022-01-15,2023-08-23,,,2042-01-16,2042-01-16,,Default Group,,,Noel Merengeni,,KCY 838X,,FDS,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Noel/VOI-KCY 838X_Track,359857082925330,GT06E,2020-10-26,2023-08-22,0794873610,,2040-10-27,2040-10-27,,Default Group,89254021154296723429,639021159672342,Noel Merengeni,,KCY 838X,,FDS,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Simon Munda-KCZ 154S_Track,359857082900341,GT06E,2020-09-23,2022-12-16,0757236135,,2040-09-24,2040-09-24,,Default Group,89254021154296723312,639021159672331,Simon Munda,,KCZ 154S,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA, Michael Odongo-KCZ 751V,359857082912486,GT06E,2020-09-23,2022-12-16,0792756503,,2040-09-24,2040-09-24,,Default Group,89254021154296723437,639021159672343,Moses Wambua,,KCZ 751V,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Daniel Omondi/Rider_KMFF 099Z,353549090553685,AT4,2020-09-23,2022-12-16,0759336150,,2040-09-24,2040-09-24,,Default Group,89254021334258404099,639021335840409,Daniel Omondi,0112794067,KMFF 099Z,,OSP-PATROL,,,,Motorbike,,
Fireside_MSA,Fireside Group MSA,Daniel Kipkirui/Rider-KMFF 162Z,353549090567685,AT4,2020-09-23,2022-12-16,0742532058,,2040-09-24,2040-09-24,,Default Group,89254021264260388966,639021266038896,Daniel Kipkirui,0112795498,KMFF 162Z,,OSP-PATROL,,,,Motorbike,,
Fireside_MSA,Fireside Group MSA,Makanda/OSP-KCZ155P D-Max,359857082910886,GT06E,2020-08-23,2025-02-24,0745067338,,2040-08-24,2040-08-24,,Default Group,89254021154287138397,639021158713839,Makanda Andrew,,KCZ 155P,,OSP,,,,Pick-Up,,
Fireside_MSA,Fireside Group MSA,Santos/OSP-KCZ 181P D-Max,359857082908500,GT06E,2020-08-23,2022-12-16,0701211974,,2040-08-24,2040-08-24,,Default Group,89254021374215155087,639021371515508,Santoes Omondi,,KCZ 181P,,OSP,,,,Pick-Up,,
Fireside_MSA,Fireside Group MSA,Mbuvi Kioko-KCZ 199P D-Max,359857082918038,GT06E,2020-08-22,2022-12-16,0797318126,,2040-08-23,2040-08-23,,Default Group,89254021154287138389,639021158713838,Mbuvi Kioko,,KCC 199P,,OSP,,,,Pick-Up,,
Fireside_MSA,Fireside Group MSA,Felix Muema-KCZ 223P D-Max,359857082907973,GT06E,2020-08-22,2024-12-30,0757843826,,2040-08-23,2040-08-23,,Default Group,89254021154287138371,639021158713837,Felix Muema,,KCZ 223P,,OSP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Elias KCZ 476E,359857082042854,GT06E,2020-08-09,2022-12-16,0110941187,,2040-08-10,2040-08-10,,Default Group,89254021164224352993,639021162435299,Elias Baya,,KCZ 476E,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Lawrence Kijogi/ROLL-KCY 080X,359857082044280,GT06E,2020-07-13,2022-12-16,0708155933,,2040-07-13,2040-07-13,,Default Group,89254029851005131222,639029850513122,Lawrence Kijogi,,KCY 080X,,ROLLOUT,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Amani Kazungu/ISP-KCY 084X,359857082037185,GT06E,2020-07-13,2022-12-16,0757338522,,2040-07-14,2040-07-14,,Default Group,89254021154287000597,639021158700059,Amani Kazungu,,KCY 084X,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Joseph kabandi-KCY 076X,359857082046145,GT06E,2020-07-13,2022-12-16,0110850007,,2040-07-14,2040-07-14,,Default Group,89254021164223447158,639021162344715,Joseph Kabandi,,KCY 076X,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Rashid Musa-KCY 090X,359857082040981,GT06E,2020-07-13,2022-12-16,0793375853,,2040-07-14,2040-07-14,,Default Group,89254021064168004164,639021066800416,Amani Sulubu,,KCY 090X,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Wilfred/Gen-KCU 729C_Track,359857082038977,GT06E,2020-04-05,2022-12-16,0110094469,,2040-04-06,2040-04-06,,Default Group,89254021164215938057,639021161593805,Wilfred Kinyanjui,,KCU 729C,,GENERAL,,,,Crane,,
Fireside_MSA,Fireside Group MSA,Amani Kazungu/ISP-KCQ 215F_Track,359857081886467,GT06E,2019-06-30,2022-12-16,0746763076,,2039-07-01,2039-07-01,,Default Group,89254021084186499865,639021088649986,Gideon Kiprono,,KCQ 215F,,ISP,,,,Probox,,
Fireside_MSA,Fireside Group MSA, Kennedy Chege/OSP-KCQ 618K,359857081886905,GT06E,2019-06-30,2022-12-16,0746763132,,2039-07-01,2039-07-01,,Default Group,89254021084186499923,639021088649992,Kennedy Chege,,KCQ 618K,,OSP,,,,Probox,,
Fireside_MSA,Fireside Group MSA,Ndegwa Duncan/PM-KCG 669W_Track,359857081887192,GT06E,2019-06-15,2022-12-16,0746760191,,2039-06-16,2039-06-16,,Default Group,89254021084186499501,639021088649950,Ndegwa Dancun,,KCG 669W,,OSP,,,,Probox,,
315
260412_baseline_report.md
Normal file
@@ -0,0 +1,315 @@
# Fireside Communications — Fleet Baseline Report

**Date:** 2026-04-12 · **Time of queries:** ~00:15 EAT
**Database:** tracksolid_db on TimescaleDB
**Container:** timescale_db-bo3nov2ija7g8wn9b1g2paxs-192322642108
**Report scope:** All 63 registered devices · All tables · Post-migration 04 + 05

---

## 1. Migration Status

All four schema migrations applied and tracked:

| Migration File | Applied (EAT) | Status |
|---|---|---|
| `02_tracksolid_full_schema_rev.sql` | 2026-04-11 22:25:37 | ✓ Applied |
| `03_webhook_schema_migration.sql` | 2026-04-11 22:25:37 | ✓ Applied |
| `04_bug_fix_migration.sql` | 2026-04-11 22:25:37 | ✓ Applied — `distance_km` renamed & corrected |
| `05_enhancement_migration.sql` | 2026-04-11 22:25:37 | ✓ Applied — new tables + columns |

Schema is fully current. No pending migrations.

---
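The ledger above invites an automated pending-check before each baseline run. A minimal sketch in pure Python (no DB access; `pending_migrations` and the future-migration filename are illustrative, only the `NN_` prefix convention comes from the table above):

```python
from pathlib import PurePath

def pending_migrations(available: list[str], applied: set[str]) -> list[str]:
    """Return migration files not yet recorded as applied, in numeric-prefix order."""
    pending = [m for m in available if PurePath(m).name not in applied]
    # Sort by the leading "NN_" prefix so 02 always runs before 10.
    return sorted(pending, key=lambda m: int(PurePath(m).name.split("_", 1)[0]))

applied = {
    "02_tracksolid_full_schema_rev.sql",
    "03_webhook_schema_migration.sql",
    "04_bug_fix_migration.sql",
    "05_enhancement_migration.sql",
}
available = sorted(applied) + ["06_future_migration.sql"]  # hypothetical next file
print(pending_migrations(available, applied))  # ['06_future_migration.sql']
```

An empty result corresponds to the "No pending migrations" statement above.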
## 2. Table Row Counts (as of 00:15 EAT)

| Table | Rows | Δ vs 260410 | Notes |
|---|---|---|---|
| `tracksolid.devices` | **63** | — | Full fleet registry |
| `tracksolid.live_positions` | **19** | — | 19 devices with a known position (30% of fleet) |
| `tracksolid.position_history` | **101** | −36 | New container; accumulating since 22:25 EAT Apr 11 |
| `tracksolid.position_history` (`track_list`) | **70** | +57 | High-res trail density growing strongly |
| `tracksolid.alarms` | **3** | +1 | ACC_ON/ACC_OFF events from evening movement |
| `tracksolid.trips` | **3** | +3 | **First real trips recorded — FIX-M16 distance fix confirmed** |
| `tracksolid.parking_events` | **0** | — | Fix deployed; will populate with completed park cycles |
| `tracksolid.obd_readings` | **0** | — | Awaiting webhook registration |
| `tracksolid.device_events` | **0** | — | Awaiting `/pushevent` registration |
| `tracksolid.fuel_readings` | **0** | — | Awaiting `/pushoil` registration |
| `tracksolid.temperature_readings` | **0** | — | Awaiting `/pushtem` registration |
| `tracksolid.lbs_readings` | **0** | — | Awaiting `/pushlbs` registration |
| `tracksolid.geofences` | **0** | — | Not yet configured |
| `tracksolid.heartbeats` | **0** | — | Awaiting heartbeat webhook |
| `tracksolid.fault_codes` | **0** | — | Awaiting fault code data |
| `tracksolid.ingestion_log` | **43** | — | New container; fresh audit trail |
| `dwh_gold.fact_daily_fleet_metrics` | **0** | — | ETL not yet run |
| `dwh_gold.dim_vehicles` | **0** | — | Awaiting population |

---
## 3. Fleet Composition

**63 devices across 4 device models — unchanged:**

| Model | Count | Typical Use |
|---|---|---|
| AT4 | 23 | Asset / cargo hardwired tracker |
| JC400P | 23 | Camera-capable tracker (larger vehicles) |
| X3 | 10 | Compact vehicle tracker |
| GT06E | 7 | OBD-port tracker |
| **Total** | **63** | |

---
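The composition table can be recomputed directly from the device registry with a counter. A sketch, assuming the registry is available as a flat list of model strings (`fleet_composition` is illustrative):

```python
from collections import Counter

def fleet_composition(models: list[str]) -> list[tuple[str, int]]:
    """Device count per model, largest first — the shape of the table above."""
    return Counter(models).most_common()

# Synthetic registry matching the counts in the table above.
models = ["AT4"] * 23 + ["JC400P"] * 23 + ["X3"] * 10 + ["GT06E"] * 7
comp = fleet_composition(models)
assert sum(n for _, n in comp) == 63  # the Total row
```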
## 4. Full Device Registry
|
||||
|
||||
> All 63 devices. `driver_name` is blank for every device — confirmed root cause: no drivers assigned in Tracksolid Pro account (not a DB sync issue). `vehicle_number` also unpopulated.
|
||||
|
||||
| Device Name | Model | SIM | Odometer (km) | Expires | Status |
|
||||
|---|---|---|---|---|---|
|
||||
| AT4-51820 | AT4 | — | — | — | No position |
|
||||
| AT4-53099 | AT4 | — | — | — | No position |
|
||||
| AT4-54246 | AT4 | — | — | — | No position |
|
||||
| AT4-55029 | AT4 | — | — | — | No position |
|
||||
| AT4-55235 | AT4 | — | — | — | No position |
|
||||
| AT4-57389 | AT4 | — | — | — | No position |
|
||||
| AT4-61860 | AT4 | — | — | — | No position |
|
||||
| AT4-64815 | AT4 | — | 0 | 2036-02-05 | Inactive (1,573h) |
|
||||
| AT4-64823 | AT4 | — | — | — | No position |
|
||||
| AT4-64880 | AT4 | — | — | — | No position |
|
||||
| AT4-64989 | AT4 | — | — | — | No position |
|
||||
| AT4-65010 | AT4 | — | — | — | No position |
|
||||
| AT4-65135 | AT4 | — | — | — | No position |
|
||||
| AT4-65341 | AT4 | — | — | — | No position |
|
||||
| AT4-65598 | AT4 | — | — | — | No position |
|
||||
| AT4-65648 | AT4 | — | — | — | No position |
|
||||
| AT4-66158 | AT4 | — | — | — | No position |
|
||||
| AT4-67271 | AT4 | — | — | — | No position |
|
||||
| AT4-67693 | AT4 | — | — | — | No position |
|
||||
| KCE 690F | AT4 | — | 0 | 2039-07-01 | Inactive (57,329h) |
|
||||
| KCS 903Y JK SUB | AT4 | 0700024569 | 4 | 2039-06-09 | Inactive (15,229h) |
|
||||
| KCU 865Q Vanguard Sub | AT4 | 0757270804 | 10 | 2039-12-20 | Inactive (15,445h) |
|
||||
| KMEH 692C KAWASAKI | AT4 | 0110094467 | 3 | 2040-04-03 | Inactive (24,709h) |
|
||||
| Belta KCU-647D | GT06E | 0110094465 | 235 | 2040-04-03 | Inactive (7,584h) — SERVICE FLAG |
|
||||
| GT06E-85428 | GT06E | — | — | — | No position |
|
||||
| GT06E-86319 | GT06E | — | — | — | No position |
|
||||
| JK Subaru KCS 903Y | GT06E | 0746759925 | 73 | 2039-06-12 | Very stale (670h) |
| KCU 145Q Solo Xtrail | GT06E | 0757270810 | 53 | 2039-12-20 | Inactive (7,546h) |
| KCU 865Q Vanguard | GT06E | 0757270763 | 62 | 2039-12-20 | Stale (79h) |
| KDK 829A GP | GT06E | 0707923872 | 239 | 2042-10-29 | **Recent (2h) — DEPOT** — SERVICE FLAG |
| JC400P-07904 | JC400P | — | — | — | No position |
| JC400P-85041 | JC400P | — | — | — | No position |
| JC400P-85058 | JC400P | — | — | — | No position |
| JC400P-85751 | JC400P | — | 0 | 2036-03-11 | Inactive (746h) |
| JC400P-86270 | JC400P | — | — | — | No position |
| JC400P-86403 | JC400P | — | — | — | No position |
| JC400P-87625 | JC400P | — | — | — | No position |
| JC400P-87831 | JC400P | — | — | — | No position |
| JC400P-89431 | JC400P | — | — | — | No position |
| JC400P-89530 | JC400P | — | — | — | No position |
| JC400P-89563 | JC400P | — | — | — | No position |
| JC400P-89662 | JC400P | — | — | — | No position |
| JC400P-89977 | JC400P | — | — | — | No position |
| JC400P-90108 | JC400P | — | — | — | No position |
| JC400P-90199 | JC400P | — | — | — | No position |
| JC400P-90678 | JC400P | — | — | — | No position |
| JC400P-91619 | JC400P | — | — | — | No position |
| JC400P-92278 | JC400P | — | — | — | No position |
| JC400P-92716 | JC400P | — | — | — | No position |
| JC400P-92732 | JC400P | — | — | — | No position |
| JC400P-94233 | JC400P | — | — | — | No position |
| KDU 878T_CAM | JC400P | 0708351897 | 2 | 2035-08-18 | Inactive (3,081h) |
| KDW 632M HL Cam | JC400P | 300002396032 IoT | 0 | 2036-03-11 | Inactive (756h) |
| FRED KMGW 538W HULETI | X3 | 0119867174 | 2 | 2036-02-08 | **Active (0.1h) — MOVED** |
| KDU 878T_Track | X3 | 0708352823 | 5 | 2035-08-18 | Stale (79h) |
| KDW 632M HL Tracker | X3 | 300002396033 IoT | 0 | 2036-02-09 | Inactive (744h) |
| KMGR 409U HENRY JAZZ | X3 | 0768697302 | 7 | 2035-07-31 | Recent (7h) |
| X3-59405 | X3 | — | — | — | No position |
| X3-63282 | X3 | — | 4 | 2036-02-14 | **Active (0.2h) — UGANDA ANOMALY PERSISTS** |
| X3-64223 | X3 | — | — | — | No position |
| X3-68968 | X3 | — | 0 | 2036-03-11 | Inactive (744h) |
| X3-69172 | X3 | — | — | — | No position |
| X3-78553 | X3 | — | — | — | No position |

---

## 5. Live Position Coverage

**19 of 63 devices (30%)** have a position in `live_positions` — same count as 260410.

**44 devices (70%)** have no position at all — offline, SIM not installed, or never activated.

### Freshness Bands

| Band | Count | Devices |
|---|---|---|
| < 2 hours (active) | 2 | FRED KMGW 538W HULETI, X3-63282 |
| 2–24 hours (recent) | 2 | KDK 829A GP (2h), KMGR 409U HENRY JAZZ (7h) |
| 1–7 days (stale) | 2 | KCU 865Q Vanguard (79h), KDU 878T_Track (79h) |
| 1–12 months (very stale) | 3 | JK Subaru KCS 903Y (670h), KCU 145Q (7,546h), Belta KCU-647D (7,584h) |
| > 1 year (inactive) | 10 | KDU 878T_CAM, KCS 903Y JK SUB, KCU 865Q Vanguard Sub, KMEH 692C KAWASAKI, KCE 690F, etc. |
### Full Live Position Detail

| Device | Model | Lat | Lng | Speed (km/h) | ACC | GPS Signal | Satellites | Last Fix (EAT) |
|---|---|---|---|---|---|---|---|---|
| FRED KMGW 538W HULETI | X3 | -1.24444 | 36.72321 | 0 | Off | 4 | 13 | 2026-04-12 00:02:54 |
| X3-63282 | X3 | 0.19566 | 32.54004 | 0 | Off | 4 | 11 | 2026-04-11 23:58:38 |
| KDK 829A GP | GT06E | -1.23850 | 36.72677 | 0 | Off | 4 | 9 | 2026-04-11 22:09:59 |
| KMGR 409U HENRY JAZZ | X3 | -1.23743 | 36.72663 | 3 | Off | 3 | 3 | 2026-04-11 16:47:20 |
| KCU 865Q Vanguard | GT06E | -1.23748 | 36.72641 | 5 | Off | 0 | 5 | 2026-04-08 17:17:45 |
| KDU 878T_Track | X3 | -1.23528 | 36.72871 | 0 | Off | 4 | 10 | 2026-04-08 17:16:55 |
| JK Subaru KCS 903Y | GT06E | -1.23560 | 36.72868 | 0 | Off | 1 | 6 | 2026-03-15 01:52:33 |
| X3-68968 | X3 | -1.23799 | 36.72615 | 0 | Off | 4 | 15 | 2026-03-11 23:59:28 |
| KDW 632M HL Tracker | X3 | -1.24087 | 36.72839 | 0 | Off | 4 | 6 | 2026-03-11 23:53:44 |
| JC400P-85751 | JC400P | -1.23796 | 36.72611 | 0 | Off | 4 | 15 | 2026-03-11 22:15:44 |
| KDW 632M HL Cam | JC400P | -1.24115 | 36.72847 | 0 | Off | 4 | 0 | 2026-03-11 11:52:01 |
| AT4-64815 | AT4 | -1.24136 | 36.72872 | 0 | Off | 4 | 4 | 2026-02-05 11:19:55 |
| KDU 878T_CAM | JC400P | -1.06900 | 37.01436 | 12 | Off | 4 | 15 | 2025-12-04 15:27:42 |
| KCU 145Q Solo Xtrail | GT06E | -1.29728 | 36.88850 | 0 | Off | 4 | 7 | 2025-06-01 14:04:47 |
| Belta KCU-647D | GT06E | -1.15151 | 36.63857 | 0 | Off | 4 | 11 | 2025-05-30 23:53:22 |
| KCS 903Y JK SUB | AT4 | -1.23529 | 36.72875 | 0 | Off | 4 | 3 | 2024-07-16 10:41:42 |
| KCU 865Q Vanguard Sub | AT4 | -1.23522 | 36.73104 | 0 | Off | 4 | 5 | 2024-07-07 10:43:21 |
| KMEH 692C KAWASAKI | AT4 | -1.23849 | 36.72460 | 0 | Off | 4 | 11 | 2023-06-17 10:41:18 |
| KCE 690F | AT4 | -1.24008 | 36.74522 | 31 | Off | 4 | 6 | 2019-09-27 07:20:08 |

---
## 6. Geographic Clustering

| Cluster | Area | Coords | Active Devices | Δ vs 260410 |
|---|---|---|---|---|
| **Primary depot** | Nairobi West / Kikuyu Rd corridor | -1.235 to -1.244, 36.722 to 36.731 | 14 devices | KDK 829A GP moved here from secondary cluster |
| **Secondary** | Nairobi East / Thika Rd | -1.297, 36.888 | 1 device | KDK 829A GP departed — now only KCU 145Q Solo (stale) |
| **Outlier** | Thika / Ruiru | -1.069, 37.014 | 1 device (KDU 878T_CAM) | Unchanged |
| **CRITICAL** | **Uganda — Kampala region** | **0.196, 32.540** | **1 device (X3-63282)** | **Persists — no change** |

> **KDK 829A GP position change confirmed:** was at -1.328, 36.900 (Nairobi East) in the 260410 report; now at -1.238, 36.727 (primary depot). Vehicle drove from the secondary cluster to the main yard between the two report windows.
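
The cluster assignment above can be sketched as a simple coordinate test. The bounding values below are read off the table's coordinate ranges and are illustrative only, not the pipeline's actual geofencing logic.

```python
def classify_cluster(lat: float, lng: float) -> str:
    """Rough cluster assignment using the coordinate ranges in the table (illustrative)."""
    # Primary depot bounding box: Nairobi West / Kikuyu Rd corridor
    if -1.244 <= lat <= -1.235 and 36.722 <= lng <= 36.731:
        return "primary depot"
    # Anything far north of Nairobi is the Kampala-region anomaly
    if lat > -0.5:
        return "uganda anomaly"
    # East of ~36.95 longitude: the Thika / Ruiru outlier
    if lng > 36.95:
        return "thika outlier"
    return "secondary (Nairobi East)"
```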
---
## 7. Position History

**Total fixes: 101** across two ingestion sources (new container; accumulating since 22:25 EAT Apr 11):

| Source | Fixes | Method | Frequency |
|---|---|---|---|
| `poll` | 31 | Fleet-wide 60s sweep | Every 60 seconds |
| `track_list` | 70 | Per-device high-res trail (POLL-01) | Every 30 minutes |
| **Total** | **101** | | |

### Per-Device Fixes — Last 24 Hours

| Device | Model | Source | Fixes | First Fix (EAT) | Last Fix (EAT) | Avg Speed | Max Speed |
|---|---|---|---|---|---|---|---|
| FRED KMGW 538W HULETI | X3 | track_list | 69 | 2026-04-11 21:52:44 | 2026-04-11 23:50:12 | 18.1 km/h | 53 km/h |
| FRED KMGW 538W HULETI | X3 | poll | 7 | 2026-04-11 22:25:12 | 2026-04-12 00:02:54 | 5.0 km/h | 35 km/h |
| X3-63282 | X3 | poll | 4 | 2026-04-11 22:13:38 | 2026-04-11 23:58:38 | 0.0 km/h | 0 km/h |
| X3-63282 | X3 | track_list | 1 | 2026-04-11 21:58:38 | 2026-04-11 21:58:38 | 0.0 km/h | 0 km/h |
| KDK 829A GP | GT06E | poll | 1 | 2026-04-11 22:09:59 | 2026-04-11 22:09:59 | 0.0 km/h | 0 km/h |
| KMGR 409U HENRY JAZZ | X3 | poll | 1 | 2026-04-11 16:47:20 | 2026-04-11 16:47:20 | 3.0 km/h | 3 km/h |

> **FRED KMGW 538W HULETI** generated 69 high-resolution track_list waypoints with avg 18.1 km/h and peak 53 km/h — confirming real road movement during the evening. This is the first active driving data since pipeline deployment.

---
## 8. Alarms

**Total alarms: 3** — all on FRED KMGW 538W HULETI, corresponding to evening trips.

| # | Device | Alarm Type | Alarm Name | Time (EAT) | Lat | Lng | Speed |
|---|---|---|---|---|---|---|---|
| 1 | FRED KMGW 538W HULETI | ACC_ON | ACC ON | 2026-04-11 22:07:27 | -1.23950 | 36.73979 | 0 |
| 2 | FRED KMGW 538W HULETI | ACC_OFF | ACC OFF | 2026-04-11 22:28:23 | -1.24441 | 36.72324 | 0 |
| 3 | FRED KMGW 538W HULETI | ACC_OFF | ACC OFF | 2026-04-11 23:35:13 | -1.24428 | 36.72300 | 0 |

**Key findings:**

- ACC_ON at 22:07 → vehicle started; ACC_OFF at 22:28 → parked briefly; ACC_OFF at 23:35 → final park. Consistent with the 3 trips recorded.
- ACC_ON event at -1.23950, 36.73979 — slightly east of the primary depot, consistent with trip 3 start coordinates.
- No vibration alerts this window (2 vibration alerts overnight on 260410). Quieter night.
- No speeding, geofence, or power alarms.

---
## 9. Trips

**Trips recorded: 3** — all FRED KMGW 538W HULETI on the evening of 2026-04-11.

> **FIX-M16 confirmed working:** distances are physically consistent with duration and speed.

| # | Device | Start (EAT) | End (EAT) | Distance (km) | Drive Time (s) | Implied Speed | Avg Speed (API) |
|---|---|---|---|---|---|---|---|
| 1 | FRED KMGW 538W HULETI | 21:47:05 | 21:49:44 | 1.430 km | 159s | 32.4 km/h | 32.41 km/h ✓ |
| 2 | FRED KMGW 538W HULETI | 23:13:05 | 23:20:22 | 2.600 km | 437s | 21.4 km/h | 21.38 km/h ✓ |
| 3 | FRED KMGW 538W HULETI | 23:27:36 | 23:35:13 | 2.910 km | 457s | 22.9 km/h | 22.93 km/h ✓ |
| **Total** | | | | **6.940 km** | **1,053s (17.6 min)** | | |

**Notes:**

- Pre-fix, these trips were stored as 1,432 km / 2,596 km / 2,910 km — corrected in place by a DB update, with the code fix deployed.
- `max_speed_kmh` not yet populated for these trips — the API field `maxSpeed` is not returned by `jimi.device.track.mileage` for this device/window.
- Short urban trips (1.4–2.9 km) at 21–32 km/h — consistent with Nairobi city driving near the Kikuyu Rd depot.
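
The implied-speed column is plain arithmetic: distance over driving time. A quick sketch reproducing the check from the table's values (a verification aid, not pipeline code):

```python
def implied_speed_kmh(distance_km: float, drive_time_s: int) -> float:
    """Average speed implied by trip distance and driving time."""
    return distance_km / (drive_time_s / 3600)

# (distance_km, drive_time_s, API avgSpeed) from the trips table above
trips = [(1.430, 159, 32.41), (2.600, 437, 21.38), (2.910, 457, 22.93)]
for dist_km, secs, api_avg in trips:
    # Each implied speed should agree with the API's avgSpeed to ~0.1 km/h
    assert abs(implied_speed_kmh(dist_km, secs) - api_avg) < 0.1
```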
---
## 10. Parking Events

**Parking events: 0**

POLL-02 fix deployed (`acc_type=0`, corrected `durSecond` mapping). API responding cleanly (14 calls, 0 rows). Events will populate once a complete park-stop-drive cycle is observed within the poller window.

---
## 11. Ingestion Pipeline Health

Container uptime: ~1h 50min at time of queries (restarted 22:25 EAT Apr 11 for FIX-M16 deployment).

| Endpoint | Calls | Rows Upserted | Rows Inserted | Avg Duration | Failures | First Call (EAT) | Last Call (EAT) |
|---|---|---|---|---|---|---|---|
| `jimi.user.device.location.list` | 21 | 399 | 399 | 575ms | 0 | 2026-04-11 22:25:45 | 2026-04-12 00:10:58 |
| `jimi.open.platform.report.parking` | 14 | 0 | 0 | 10,863ms | 0 | 2026-04-11 22:25:55 | 2026-04-12 00:04:34 |
| `jimi.device.track.list` | 4 | 0 | 77 | 237,175ms | 0 | 2026-04-11 22:26:10 | 2026-04-12 00:04:41 |
| `jimi.device.alarm.list` | 3 | 0 | 4 | 344ms | 0 | 2026-04-11 22:30:39 | 2026-04-12 00:04:34 |
| `jimi.user.device.list+detail` | 2 | 126 | 0 | 5,768ms | 0 | 2026-04-11 22:25:39 | 2026-04-12 00:04:12 |
| **Total** | **44** | | | | **0** | | |

**Observations:**

- **Zero failures across all 44 API calls** — pipeline stable after restart.
- Location polling: 21 calls, 575ms avg — consistent with 260410 (493ms). Slightly slower, within normal variance.
- Track list: 4 calls, 77 waypoints at 237s avg — slower per call than 260410 (137s). Likely due to FRED KMGW 538W HULETI generating dense waypoints during active driving.
- Alarm poll: 344ms avg — fast and clean.
- Device sync: 2 runs since restart; all 63 device records updated with full field sync (FIX-M17 now active).

---
## 12. Changes Since 260410 Baseline

| Area | 260410 | 260412 | Assessment |
|---|---|---|---|
| Trips recorded | 0 | **3** | First real trip data — pipeline validated end-to-end |
| Trip distance accuracy | Broken (km stored as m) | **Fixed (FIX-M16)** | Implied speed matches API avgSpeed exactly |
| `sync_devices` ON CONFLICT | 5 fields only | **26 fields** (FIX-M17) | Driver/phone/SIM will now update on each daily sync |
| `track_list` fixes (24h) | 13 | **70** | FRED KMGW 538W HULETI drove → dense trail captured |
| FRED KMGW 538W HULETI | Parked at depot | **Active — 6.94 km driven** | First confirmed driving data |
| KDK 829A GP | Secondary cluster (Nairobi East) | **Primary depot** | Returned to main yard overnight |
| `sync_driver_audit.py` | Not present | **Added** | One-shot tool for API↔DB driver/IMEI gap reporting |
| Driver names in DB | 0 | **0** | Root cause confirmed: not assigned in Tracksolid Pro UI |
| Uganda anomaly (X3-63282) | Active at 0.196, 32.540 | **Persists — no change** | Requires investigation |
| Service flags | Belta KCU-647D (234,546 km), KDK 829A GP (239,264 km) | **Same — odometers approaching 240k** | Maintenance overdue |

---
## 13. Open Items

| Priority | Item | Owner |
|---|---|---|
| HIGH | Assign driver names + vehicle numbers in Tracksolid Pro UI | Operations |
| HIGH | Investigate X3-63282 in Uganda (Kampala region) — legitimate deployment or stolen? | Management |
| HIGH | Service KDK 829A GP (239,264 km) and Belta KCU-647D (235,000 km) | Fleet maintenance |
| MEDIUM | Register webhooks in Tracksolid Pro: `/pushobd`, `/pushoil`, `/pushtem`, `/pushlbs`, `/pushevent` | DevOps |
| MEDIUM | Set `fuel_100km` per vehicle type to activate fuel cost analytics | Operations |
| MEDIUM | Investigate 44 devices with no GPS fix — deployed? SIM installed? | Fleet ops |
| LOW | Define geofences — depot boundary, approved route corridors | Operations |
| LOW | Run nightly ETL: `SELECT dwh_gold.refresh_daily_metrics(CURRENT_DATE - 1)` | DevOps |

---

*Report generated: 2026-04-12 ~00:15 EAT · Stack: TimescaleDB 2.15 + PostGIS + Tracksolid Pro Open Platform API*
*Pipeline: `ingest_movement_rev.py` v2.2 (FIX-M16, FIX-M17) · `ingest_events_rev.py` · `webhook_receiver_rev.py`*
161
CLAUDE.md
Normal file
@ -0,0 +1,161 @@
# CLAUDE.md — Fireside Communications · Tracksolid Fleet Intelligence

## 1. What This Project Is

Fleet telematics ingestion and analytics stack for a **telco first-line support client** operating in Nairobi, Mombasa, and Kampala. The client dispatches field technicians to install, repair, and maintain home and business broadband, handle LOS signal faults and service migrations, and maintain outside plant infrastructure. The fleet is ~80 vehicles across three cities, all tracked via Tracksolid Pro (Jimi IoT API).

This repository ingests data from the Tracksolid Pro API into a TimescaleDB/PostGIS database and visualises fleet and operational KPIs in Grafana. The pipeline is deployed on Coolify at `stage.rahamafresh.com`.

**Repository:** `https://repo.rahamafresh.com/kianiadee/tracksolid_timescale_grafana_prod.git`

---

## 2. Tech Stack

| Layer | Technology |
|---|---|
| Ingestion | Python 3.12 — `ingest_movement_rev.py`, `ingest_events_rev.py`, `webhook_receiver_rev.py` |
| Shared utils | `ts_shared_rev.py` — token cache, DB pool, API signing, clean helpers |
| Database | PostgreSQL 16 + TimescaleDB 2.15 + PostGIS 3 (`tracksolid_db`) |
| Orchestration | Docker Compose on Coolify |
| Visualisation | Grafana (provisioned via custom image) |
| Workflow automation | n8n |
| API source | Tracksolid Pro / Jimi IoT Open Platform (`eu-open.tracksolidpro.com/route/rest`) |
| Version control | Forgejo at `repo.rahamafresh.com` |

---

## 3. Instance & Connection Parameters

See `docs/CONNECTIONS.md` for the full details. Summary:

- **SSH:** `ssh -i ~/.ssh/id_ed25519 kianiadee@stage.rahamafresh.com`
- **DB name:** `tracksolid_db` · **DB user:** `postgres` (internal) · `tracksolid_owner` (app) · `grafana_ro` (read-only)
- **DB schema:** `tracksolid` (operational) · `infrastructure` · `dwh_gold` (aggregates)
- **Container naming:** Coolify appends a random suffix. Always resolve with:
  ```bash
  docker ps --filter name=<service_name> --format "{{.Names}}" | head -1
  ```
  e.g. `docker ps --filter name=timescale_db --format "{{.Names}}" | head -1`
- **Env vars:** loaded from `.env` via `env_file` in `docker-compose.yaml`. See `docs/CONNECTIONS.md` for variable names. Never hardcode secrets.

---

## 4. Codebase Map

```
ts_shared_rev.py                  # Shared: config, signing, DB pool, token cache, clean helpers
ingest_movement_rev.py            # GPS positions, trips, parking, track-list (high-res trail), device sync
ingest_events_rev.py              # Alarm events polling (fallback for webhook push)
webhook_receiver_rev.py           # FastAPI push receiver: /pushobd /pushevent /pushtripreport etc.
sync_driver_audit.py              # One-shot: API↔DB driver/IMEI gap report + full upsert
run_migrations.py                 # Applies SQL migrations in order at container startup
docker-compose.yaml               # Services: timescale_db, ingest_movement, ingest_events,
                                  #           webhook_receiver, grafana
grafana/                          # Grafana provisioning (baked into image)
n8n-workflows/                    # n8n workflow exports
docs/                             # Reference docs (connections, API, KPIs, project context)
02_tracksolid_full_schema_rev.sql # Full schema bootstrap
03..05_*.sql                      # Incremental migrations
01_BusinessAnalytics.md           # SQL analytics library (reference before writing queries)
tracksolidApiDocumentation.md     # API endpoint reference
260412_baseline_report.md         # Latest fleet state snapshot
```

---

## 5. Database Schema — Key Tables

```sql
tracksolid.devices                 -- Master device registry (63 devices, imei PK)
tracksolid.live_positions          -- Current position per device (1 row per IMEI, upserted)
tracksolid.position_history        -- All GPS fixes (hypertable, partitioned by gps_time)
                                   --   source: 'poll' (60s sweep) | 'track_list' (30m high-res)
tracksolid.trips                   -- Trip summaries: distance_km, driving_time_s, avg/max speed
tracksolid.parking_events          -- Stop events with duration and address
tracksolid.alarms                  -- Alarm events (alarm_type, alarm_name, alarm_time)
tracksolid.obd_readings            -- OBD diagnostics (push only, awaiting webhook registration)
tracksolid.device_events           -- Power on/off and tamper events
tracksolid.ingestion_log           -- API call audit trail per endpoint
dwh_gold.fact_daily_fleet_metrics  -- Nightly ETL aggregates per vehicle per day
```

Full DDL: `02_tracksolid_full_schema_rev.sql` + migrations `03`–`05`.

---

## 6. API Critical Facts

**Always read `tracksolidApiDocumentation.md` before adding a new endpoint call.**

| Fact | Detail |
|---|---|
| Auth | OAuth2 — token cached in `tracksolid.api_token_cache`, refreshed via `jimi.oauth.token.refresh` |
| Signing | MD5: `secret + sorted(k+v pairs) + secret` — see `build_sign()` in `ts_shared_rev.py` |
| Batch limit | Max 50 IMEIs per call for most endpoints |
| `distance` field | **Returns METRES, not km** despite docs. Always divide by 1000. (FIX-M16) |
| `driverName`/`driverPhone` | From `jimi.user.device.list` — will be NULL if not set in Tracksolid Pro UI |
| `alarm_type` field | API polling returns `alertTypeId`/`alarmTypeName` — NOT `alarmType`/`alarmName` (FIX-E06) |
| `durSecond` | Parking endpoint returns `durSecond`, not `seconds` (FIX-M13) |
| `jimi.device.track.mileage` | `startMileage`/`endMileage` are cumulative odometer in **metres** |
| Rate limit | Code 1006 — back off and retry with re-sign (handled in `api_post()`) |
| OBD data | Push only via `/pushobd` webhook — no polling endpoint exists |
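
The signing row above can be illustrated with a minimal sketch. This mirrors only the documented scheme; parameter filtering and digest casing in the canonical `build_sign()` in `ts_shared_rev.py` may differ, and the function name here is illustrative.

```python
import hashlib


def build_sign_sketch(secret: str, params: dict[str, str]) -> str:
    # Concatenate key+value pairs sorted by key, wrap in the secret on both
    # sides, then MD5 the whole string — per the documented scheme above.
    joined = "".join(k + v for k, v in sorted(params.items()))
    return hashlib.md5((secret + joined + secret).encode("utf-8")).hexdigest()
```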
---
## 7. Fix History (do not regress)

| Fix ID | File | What it fixed |
|---|---|---|
| FIX-M11 | `ingest_movement_rev.py` | Removed erroneous ×1000 on distance (was storing km as mm) |
| FIX-M13 | `ingest_movement_rev.py` | Parking: added `acc_type=0`, `account`; mapped `durSecond` |
| FIX-M14 | `ingest_movement_rev.py` | `poll_track_list()` — high-res GPS trail every 30m |
| FIX-M15 | `ingest_movement_rev.py` | `get_device_locations()` — on-demand precision refresh |
| FIX-M16 | `ingest_movement_rev.py` | `distance` from API is metres → divide by 1000 before storing |
| FIX-M17 | `ingest_movement_rev.py` | `sync_devices()` ON CONFLICT now updates all 26 fields (was 5) |
| FIX-E06 | `ingest_events_rev.py` | Alarm field mapping: `alertTypeId`/`alarmTypeName`/`alertTime` |
| BUG-02 | Migration 04 | Historical `distance_m` rows ÷1,000,000 → renamed to `distance_km` |

---

## 8. Working Rules

1. **No prod push without explicit user confirmation.** Always state what you are about to push and wait.
2. **Never rewrite a migration that is already applied.** Check `tracksolid.schema_migrations` first. Add a new numbered migration file for any schema change.
3. **Read before writing.** Before suggesting any code change, read the relevant source file. Before writing a query, check `01_BusinessAnalytics.md` for an existing pattern.
4. **Reuse shared utilities.** All DB access via `get_conn()`, all API calls via `api_post()`, all cleaning via `clean()` / `clean_num()` / `clean_int()` / `clean_ts()` in `ts_shared_rev.py`. Do not reinvent these.
5. **Resolve container names dynamically.** Never hardcode the Coolify suffix. Use `docker ps --filter name=<service>`.
6. **SSH only when asked.** Default workflow is local code → commit → push. SSH into the instance only when explicitly asked to test or run something live.
7. **Secrets from env only.** Connection strings, API keys, and passwords live in `.env`. Reference variable names from `docs/CONNECTIONS.md`, never values.
8. **Two developers, one incoming.** Write code and docs that a second developer (mixed technical/operations background) can follow without prior context.

---

## 9. Fleet State (as of 2026-04-12 baseline)

| Metric | Value |
|---|---|
| Total registered devices | 63 (growing to 80) |
| Devices with GPS fix < 2h | 2 |
| Devices never reported | 44 |
| Driver names populated | 0 — must be set in Tracksolid Pro UI first |
| Cities active | Nairobi (primary), Mombasa (deploying), Kampala (1 device confirmed) |
| Uganda anomaly | X3-63282 at 0.196, 32.540 — under investigation |
| Service flags | KDK 829A GP (239,264 km), Belta KCU-647D (235,000 km) |

Latest full snapshot: `260412_baseline_report.md`

---

## 10. Open Items (update as resolved)

| Priority | Item |
|---|---|
| HIGH | Assign driver names + vehicle numbers in Tracksolid Pro UI |
| HIGH | Register webhooks: `/pushobd` `/pushoil` `/pushtem` `/pushlbs` `/pushevent` |
| HIGH | Investigate X3-63282 in Kampala — legitimate or unauthorised? |
| MEDIUM | Set `fuel_100km` per vehicle type to activate fuel cost calculations |
| MEDIUM | Investigate 44 silent devices — SIM installed? Activated? |
| MEDIUM | Co-develop client KPI framework (see `docs/KPI_FRAMEWORK.md`) |
| LOW | Populate geofences — depot boundaries, city zones |
| LOW | Run nightly ETL: `SELECT dwh_gold.refresh_daily_metrics(CURRENT_DATE - 1)` |
0
db_audit/__init__.py
Normal file
19
db_audit/checks/data_gaps.sql
Normal file
@ -0,0 +1,19 @@
-- Data gaps: enabled devices with no position_history or trips in last 7 days
SELECT
    d.imei,
    d.device_name,
    d.enabled_flag,
    MAX(ph.gps_time) AS last_position,
    MAX(t.start_time) AS last_trip
FROM tracksolid.devices d
LEFT JOIN tracksolid.position_history ph
    ON ph.imei = d.imei
    AND ph.gps_time > NOW() - INTERVAL '7 days'
LEFT JOIN tracksolid.trips t
    ON t.imei = d.imei
    AND t.start_time > NOW() - INTERVAL '7 days'
WHERE d.enabled_flag = 1
GROUP BY d.imei, d.device_name, d.enabled_flag
HAVING MAX(ph.gps_time) IS NULL
    AND MAX(t.start_time) IS NULL
ORDER BY d.imei;
14
db_audit/checks/distance_outliers.sql
Normal file
@ -0,0 +1,14 @@
-- Distance outliers: trips with impossible or suspicious distance in last 7 days
SELECT
    imei,
    start_time,
    end_time,
    distance_km,
    source
FROM tracksolid.trips
WHERE start_time > NOW() - INTERVAL '7 days'
  AND (
    distance_km < 0
    OR distance_km > 500
  )
ORDER BY distance_km DESC;
11
db_audit/checks/duplicate_positions.sql
Normal file
@ -0,0 +1,11 @@
-- Duplicate (imei, gps_time) pairs in position_history
-- Should always return 0 rows if ON CONFLICT DO NOTHING is working correctly
SELECT
    imei,
    gps_time,
    COUNT(*) AS duplicate_count
FROM tracksolid.position_history
WHERE gps_time > NOW() - INTERVAL '7 days'
GROUP BY imei, gps_time
HAVING COUNT(*) > 1
ORDER BY duplicate_count DESC;
34
db_audit/checks/enum_drift.sql
Normal file
@ -0,0 +1,34 @@
-- Enum drift: unexpected values in source and other constrained columns
-- position_history.source should be: poll, push, track_list
SELECT
    'position_history.source' AS check_column,
    source AS unexpected_value,
    COUNT(*) AS occurrences
FROM tracksolid.position_history
WHERE source NOT IN ('poll', 'push', 'track_list')
  AND source IS NOT NULL
GROUP BY source

UNION ALL

-- trips.source should be: poll, push
SELECT
    'trips.source',
    source,
    COUNT(*)
FROM tracksolid.trips
WHERE source NOT IN ('poll', 'push')
  AND source IS NOT NULL
GROUP BY source

UNION ALL

-- alarms.source should be: poll, push
SELECT
    'alarms.source',
    source,
    COUNT(*)
FROM tracksolid.alarms
WHERE source NOT IN ('poll', 'push')
  AND source IS NOT NULL
GROUP BY source;
30
db_audit/checks/null_integrity.sql
Normal file
@ -0,0 +1,30 @@
-- NULL integrity check across telemetry tables
SELECT
    'position_history.imei_null' AS check_field,
    COUNT(*) AS null_count
FROM tracksolid.position_history
WHERE imei IS NULL
UNION ALL
SELECT
    'position_history.gps_time_null',
    COUNT(*)
FROM tracksolid.position_history
WHERE gps_time IS NULL
UNION ALL
SELECT
    'alarms.imei_null',
    COUNT(*)
FROM tracksolid.alarms
WHERE imei IS NULL
UNION ALL
SELECT
    'alarms.alarm_type_null',
    COUNT(*)
FROM tracksolid.alarms
WHERE alarm_type IS NULL
UNION ALL
SELECT
    'obd_readings.imei_null',
    COUNT(*)
FROM tracksolid.obd_readings
WHERE imei IS NULL;
14
db_audit/checks/stale_devices.sql
Normal file
@ -0,0 +1,14 @@
-- Stale devices: enabled devices with no GPS fix in last 2 hours
SELECT
    d.imei,
    d.device_name,
    lp.gps_time AS last_gps_time,
    EXTRACT(EPOCH FROM (NOW() - lp.gps_time)) / 3600 AS hours_since_fix
FROM tracksolid.devices d
LEFT JOIN tracksolid.live_positions lp ON lp.imei = d.imei
WHERE d.enabled_flag = 1
  AND (
    lp.gps_time IS NULL
    OR lp.gps_time < NOW() - INTERVAL '2 hours'
  )
ORDER BY hours_since_fix DESC NULLS FIRST;
161
db_audit/run_audit.py
Normal file
@ -0,0 +1,161 @@
"""
|
||||
db_audit/run_audit.py — Fireside Communications Fleet Telemetry DB Audit
|
||||
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
|
||||
Runs six health checks against the production TimescaleDB.
|
||||
Writes results to tracksolid.health_checks for Grafana monitoring.
|
||||
Exits with code 1 if any critical finding is detected.
|
||||
|
||||
Usage:
|
||||
DATABASE_URL=postgresql://... python db_audit/run_audit.py
|
||||
|
||||
Checks:
|
||||
stale_devices - Enabled devices with no GPS fix in >2h
|
||||
null_integrity - NULL imei/gps_time in telemetry tables
|
||||
distance_outliers - Trip distances <0 or >500 km in last 7 days
|
||||
duplicate_positions - Duplicate (imei, gps_time) in position_history
|
||||
data_gaps - Enabled devices with zero data in last 7 days
|
||||
enum_drift - Unexpected values in source/severity columns
|
||||
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
|
||||
"""
|
||||
|
||||
from __future__ import annotations

import json
import os
import sys
import logging
from pathlib import Path

import psycopg2
import psycopg2.extras

# ── Config ────────────────────────────────────────────────────────────────────

DATABASE_URL = os.environ.get("DATABASE_URL")
if not DATABASE_URL:
    print("ERROR: DATABASE_URL environment variable is required.", file=sys.stderr)
    sys.exit(1)

CHECKS_DIR = Path(__file__).parent / "checks"
SCHEMA_FILE = Path(__file__).parent / "schema" / "health_checks_table.sql"

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s [%(levelname)s] %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
)
log = logging.getLogger("db_audit")

# ── Status Logic ──────────────────────────────────────────────────────────────

# Checks that produce CRITICAL status if they return any rows
CRITICAL_CHECKS = {"null_integrity", "duplicate_positions"}

# Checks that produce WARNING status if they return any rows
WARNING_CHECKS = {"stale_devices", "distance_outliers", "data_gaps", "enum_drift"}


def _determine_status(check_name: str, rows: list[dict]) -> str:
    if not rows:
        return "ok"
    # null_integrity returns counts — critical if any count > 0
    if check_name == "null_integrity":
        has_nulls = any(row.get("null_count", 0) > 0 for row in rows)
        return "critical" if has_nulls else "ok"
    if check_name in CRITICAL_CHECKS:
        return "critical"
    if check_name in WARNING_CHECKS:
        return "warning"
    return "ok"

# ── Core Runner ───────────────────────────────────────────────────────────────

def run_checks() -> bool:
    """Run all checks. Returns True if any critical finding found."""
    conn = psycopg2.connect(DATABASE_URL, options="-c client_encoding=UTF8")
    conn.autocommit = False

    try:
        with conn.cursor() as cur:
            # Ensure health_checks table exists
            cur.execute(SCHEMA_FILE.read_text())
        conn.commit()
        log.info("health_checks table verified.")

        has_critical = False
        results = []

        for sql_file in sorted(CHECKS_DIR.glob("*.sql")):
            check_name = sql_file.stem
            sql = sql_file.read_text()

            log.info("Running check: %s ...", check_name)

            with conn.cursor(cursor_factory=psycopg2.extras.RealDictCursor) as cur:
                cur.execute(sql)
                rows = [dict(r) for r in cur.fetchall()]

            status = _determine_status(check_name, rows)
            row_count = len(rows)

            # Serialize rows (convert non-JSON-serializable types)
            detail = _safe_json(rows[:50])  # Cap at 50 rows to keep detail manageable

            with conn.cursor() as cur:
                cur.execute("""
                    INSERT INTO tracksolid.health_checks
                        (check_name, status, detail, row_count)
                    VALUES (%s, %s, %s, %s)
                """, (check_name, status, json.dumps(detail), row_count))
            conn.commit()

            icon = "✅" if status == "ok" else ("⚠️ " if status == "warning" else "🔴")
            log.info("  %s %s: %s (%d rows)", icon, check_name, status.upper(), row_count)
            results.append((check_name, status, row_count))

            if status == "critical":
                has_critical = True

        # Summary
        print("\n" + "=" * 60)
        print("DB AUDIT SUMMARY")
        print("=" * 60)
        for name, status, count in results:
            indicator = "OK" if status == "ok" else ("WARN" if status == "warning" else "CRIT")
            print(f"  [{indicator:4s}] {name:<30} ({count} rows)")
        print("=" * 60)

        if has_critical:
            print("RESULT: CRITICAL findings detected. Exit code 1.")
        else:
            print("RESULT: No critical findings. Exit code 0.")
        print()

        return has_critical

    finally:
        conn.close()

def _safe_json(rows: list[dict]) -> list[dict]:
    """Convert non-JSON-serializable values (datetime → ISO string, Decimal → float)."""
    import decimal
    from datetime import datetime, date

    def convert(v):
        if isinstance(v, (datetime, date)):
            return v.isoformat()
        if isinstance(v, decimal.Decimal):
            return float(v)
        return v

    return [{k: convert(v) for k, v in row.items()} for row in rows]

# ── Entry Point ───────────────────────────────────────────────────────────────
|
||||
|
||||
if __name__ == "__main__":
|
||||
log.info("Starting DB audit...")
|
||||
has_critical = run_checks()
|
||||
sys.exit(1 if has_critical else 0)
|
||||
13
db_audit/schema/health_checks_table.sql
Normal file

@ -0,0 +1,13 @@
-- Idempotent: safe to run on every audit start
CREATE TABLE IF NOT EXISTS tracksolid.health_checks (
    id          BIGSERIAL PRIMARY KEY,
    checked_at  TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    check_name  TEXT NOT NULL,
    status      TEXT NOT NULL CHECK (status IN ('ok', 'warning', 'critical')),
    detail      JSONB,
    row_count   INT
);

-- Index for Grafana time-range queries
CREATE INDEX IF NOT EXISTS health_checks_checked_at_idx
    ON tracksolid.health_checks (checked_at DESC);
111
docs/CONNECTIONS.md
Normal file

@ -0,0 +1,111 @@
# Connection Parameters Reference

**No secrets are stored here. All values come from `.env` at runtime.**

---

## SSH

```
Host: stage.rahamafresh.com
User: kianiadee
Key:  ~/.ssh/id_ed25519
```

```bash
ssh -i ~/.ssh/id_ed25519 kianiadee@stage.rahamafresh.com
```

---

## Database

| Parameter | Value |
|---|---|
| Database | `tracksolid_db` |
| Host (internal) | `timescale_db` (Docker service name) |
| Port | `5432` |
| App user | `tracksolid_owner` |
| Read-only user | `grafana_ro` |
| Superuser | `postgres` |

### `.env` variable names

```
POSTGRES_DB=tracksolid_db
POSTGRES_USER=...
POSTGRES_PASSWORD=...
DATABASE_URL=postgresql://tracksolid_owner:<password>@timescale_db:5432/tracksolid_db
GRAFANA_DB_RO_PASSWORD=...
```

### Run a query from host

```bash
DB=$(docker ps --filter name=timescale_db --format "{{.Names}}" | head -1)
docker exec $DB psql -U postgres -d tracksolid_db -c "SELECT COUNT(*) FROM tracksolid.devices;"
```

### Run a query file

```bash
docker exec -i $DB psql -U postgres -d tracksolid_db < migration.sql
```

---

## Tracksolid Pro API

| Parameter | Env var |
|---|---|
| App key | `TRACKSOLID_APP_KEY` |
| App secret | `TRACKSOLID_APP_SECRET` |
| User ID | `TRACKSOLID_USER_ID` |
| Target account | `TRACKSOLID_TARGET_ACCOUNT` (defaults to USER_ID) |
| Password MD5 | `TRACKSOLID_PWD_MD5` |
| Base URL | `TRACKSOLID_API_URL` (default: `https://eu-open.tracksolidpro.com/route/rest`) |

---

## Container Name Resolution

Coolify appends a random suffix to all container names. Never hardcode. Always resolve:

```bash
# Pattern
docker ps --filter name=<service_name> --format "{{.Names}}" | head -1

# Examples
docker ps --filter name=timescale_db --format "{{.Names}}" | head -1
docker ps --filter name=ingest_movement --format "{{.Names}}" | head -1
docker ps --filter name=webhook_receiver --format "{{.Names}}" | head -1
docker ps --filter name=grafana --format "{{.Names}}" | head -1
```

Current suffix (may change on redeploy): `bo3nov2ija7g8wn9b1g2paxs-19xxxxxxxxxx`

---

## Forgejo

```
Host:   https://repo.rahamafresh.com
Repo:   kianiadee/tracksolid_timescale_grafana_prod
Remote: https://repo.rahamafresh.com/kianiadee/tracksolid_timescale_grafana_prod.git
```

---

## Grafana

- Deployed as Docker service `grafana`
- Provisioning baked into image (datasources + dashboards via `grafana/Dockerfile`)
- Admin password: `GF_SECURITY_ADMIN_PASSWORD` from `.env`
- Default dashboard: NOC Fleet Dashboard

---

## n8n

- Deployed as separate Coolify service (`n8n-usoksgg8o40044g0cw08s8wc`)
- Workflows exported to `n8n-workflows/`
101
docs/KPI_FRAMEWORK.md
Normal file

@ -0,0 +1,101 @@
# KPI Framework — Telco Field Service Fleet
## Fireside Communications · Co-developed with client

> **Status:** Draft — pending client review and validation.
> Update this file after each client feedback session. Move KPIs from Proposed → Active → Retired as the programme matures.

---

## How to Use This Document

1. **Proposed** — KPI defined, not yet validated with client
2. **Active** — Client confirmed this matters; query written; Grafana panel exists or is in progress
3. **Baseline set** — Enough historical data exists to set a meaningful target
4. **Retired** — No longer tracked (document reason)

Each active KPI should link to:
- The SQL query (or reference to `01_BusinessAnalytics.md`)
- The Grafana panel name/dashboard
- The refresh frequency
- The person who reviews it

---

## KPI Status Register

### Fleet Utilisation

| KPI | Status | SQL ref | Grafana panel | Reviewed by | Cadence |
|---|---|---|---|---|---|
| Utilisation rate (%) | Proposed | `01_BusinessAnalytics.md §2.1` | — | — | Daily |
| Idle time % of shift | Proposed | `01_BusinessAnalytics.md §2.2` | — | — | Daily |
| Vehicles not moved today | Proposed | `01_BusinessAnalytics.md §2.3` | — | — | Daily |
| Fleet km today | Proposed | `01_BusinessAnalytics.md §5.1` | — | — | Daily |
| Fleet km this week | Proposed | `01_BusinessAnalytics.md §5.2` | — | — | Weekly |

### Technician Productivity *(requires job system integration)*

| KPI | Status | SQL ref | Grafana panel | Reviewed by | Cadence |
|---|---|---|---|---|---|
| Jobs completed per tech per day | Proposed | TBD | — | — | Daily |
| First-time fix rate | Proposed | TBD | — | — | Weekly |
| Mean time to arrive (MTTA) | Proposed | TBD | — | — | Weekly |
| Mean time to repair (MTTR) | Proposed | TBD | — | — | Weekly |
| SLA compliance rate | Proposed | TBD | — | — | Weekly |

### Driver Behaviour

| KPI | Status | SQL ref | Grafana panel | Reviewed by | Cadence |
|---|---|---|---|---|---|
| Speeding events per 100 km | Proposed | `01_BusinessAnalytics.md §3.1` | — | — | Weekly |
| Harsh driving index | Proposed | `01_BusinessAnalytics.md §3.2` | — | — | Weekly |
| Late starts (count per driver) | Proposed | `01_BusinessAnalytics.md §3.3` | — | — | Monthly |
| Early knock-off | Proposed | `01_BusinessAnalytics.md §3.3` | — | — | Monthly |
| After-hours movement | Proposed | `01_BusinessAnalytics.md §3.4` | — | — | Daily |

### Asset Health & Cost

| KPI | Status | SQL ref | Grafana panel | Reviewed by | Cadence |
|---|---|---|---|---|---|
| Estimated idle fuel cost (KES) | Proposed | `01_BusinessAnalytics.md §2.2` | — | — | Monthly |
| Vehicles at service threshold | Proposed | TBD | — | — | Weekly |
| Alarm rate per vehicle/week | Proposed | `01_BusinessAnalytics.md §6` | — | — | Weekly |
| GPS offline rate | Proposed | — | — | — | Daily |

---

## Severity & Threshold Reference

Adjust with client after first month of live data:

| Metric | Green | Amber | Red |
|---|---|---|---|
| Fleet utilisation rate | > 60% | 40–60% | < 40% |
| Idle time % of shift | < 15% | 15–30% | > 30% |
| Speeding per 100 km | < 0.5 | 0.5–2.0 | > 2.0 |
| Harsh driving index | < 0.5 | 0.5–2.0 | > 2.0 |
| Late starts / month | 0–1 | 2–4 | ≥ 5 |
| Alarm rate / vehicle / week | 0–2 | 3–7 | > 7 |
| GPS offline rate | < 5% | 5–15% | > 15% |
| MTTA (minutes) | < 30 | 30–60 | > 60 |
| First-time fix rate | > 85% | 70–85% | < 70% |
| SLA compliance | > 95% | 85–95% | < 85% |

---

## Client Feedback Log

| Date | Session | Feedback | Action |
|---|---|---|---|
| — | Initial framework | Draft created | Awaiting first client review |

---

## Next Review Checklist

- [ ] Confirm shift hours (start, end, lunch, working days)
- [ ] Confirm SLA tiers (home vs business customer)
- [ ] Confirm which KPIs the ops manager wants on a daily digest
- [ ] Confirm reporting format (Grafana link, PDF, WhatsApp summary)
- [ ] Identify job management system / ticketing tool for MTTA/MTTR
- [ ] Confirm vehicle categories (motorcycle, van, 4WD) for per-type benchmarks
131
docs/PROJECT_CONTEXT.md
Normal file

@ -0,0 +1,131 @@
# Project Context — Fireside Communications Fleet Intelligence

## The Client

A first-line technical support operation contracted by a large Kenyan/East African telco. The client manages field technicians who handle the full spectrum of last-mile broadband support:

| Service Type | Description |
|---|---|
| New installations | Fibre/broadband installs at home and business premises |
| Fault resolution | LOS (Loss of Signal) troubleshooting, slow service investigations |
| Outside plant maintenance | Physical cable, cabinet, and pole infrastructure maintenance |
| Migrations | Customer plan or technology upgrades requiring a site visit |
| Business customer support | Prioritised SLA-driven support for commercial accounts |

## Operational Geography

| City | Status | Notes |
|---|---|---|
| Nairobi | Primary — fully operational | Main depot at Kikuyu Rd corridor (~-1.237, 36.727) |
| Mombasa | Deploying | Fleet being onboarded |
| Kampala, Uganda | 1 device confirmed | X3-63282 at 0.196, 32.540 — status under investigation |

All three cities are managed from a single Tracksolid Pro account and a single database instance. Use a `city` field or device-group grouping for per-city analytics rather than separate schemas.

## The Fleet

- ~80 vehicles total (63 currently registered in Tracksolid Pro)
- Mix of motorcycles (courier/light inspection) and vans/4WDs (equipment and crew)
- Device models in use: AT4 (hardwired), JC400P (camera-capable), X3 (compact), GT06E (OBD)
- Vehicle identity (plate numbers, driver assignments) not yet populated in Tracksolid Pro — the primary data quality gap

## Data Quality Gaps (as of April 2026)

| Gap | Impact | Resolution path |
|---|---|---|
| No driver names assigned | Reports show IMEIs instead of people | Assign in Tracksolid Pro UI → DB syncs nightly |
| No vehicle numbers populated | Cannot link vehicle to job/plate | Manual UPDATE or CSV import |
| 44 of 63 devices never reported GPS | Cannot track these vehicles | Verify SIM installation + activation |
| `fuel_100km` null for all devices | Fuel cost calculations inactive | Set by vehicle type via UPDATE |
| No geofences defined | Cannot alert on depot departures or route deviations | Define depot polygons + city zones |
| Webhooks not registered | OBD, fuel, temperature tables empty | Register in Tracksolid Pro account settings |

---

## KPI Framework

> This section is developed iteratively with the client. KPIs are grouped by operational domain. As client feedback arrives, move items from "Proposed" to "Active" and add the Grafana panel reference.

### Domain 1 — Fleet Utilisation

Measures whether vehicles are productively deployed during working hours.

| KPI | Definition | Target | Status |
|---|---|---|---|
| Utilisation rate | Drive time / shift hours × 100 | > 60% | Proposed |
| Idle time % | Engine-on-stationary / total shift | < 15% | Proposed |
| Vehicles not moved today | COUNT where no trip recorded | 0 | Proposed |
| Fleet km per day | SUM(distance_km) across all trips | Baseline TBD | Proposed |

### Domain 2 — Field Technician Productivity

Measures output per technician per day. **Requires job management system integration or manual job log.**

| KPI | Definition | Target | Status |
|---|---|---|---|
| Jobs completed per technician per day | Count of closed jobs | Baseline TBD | Proposed |
| First-time fix rate | Jobs resolved on first visit % | > 80% | Proposed |
| Mean time to arrive (MTTA) | Job assignment → vehicle on-site | < 45 min | Proposed |
| Mean time to repair (MTTR) | Job creation → job closed | < 2 hours | Proposed |
| SLA compliance rate | % jobs closed within SLA window | > 95% | Proposed |

> Note: MTTA and MTTR require job timestamps from the telco's ticketing system. Integration point TBD.

### Domain 3 — Driver Behaviour & Safety

Measures driving quality. Feeds into insurance, safety, and coaching programmes.

| KPI | Definition | Target | Status |
|---|---|---|---|
| Speeding events per 100 km | GPS fixes > 80 km/h / total km × 100 | < 0.5 | Proposed |
| Harsh driving index | Speed delta > 30 km/h in < 60 s per 100 km | < 0.5 | Proposed |
| After-hours movement | Trips starting before 06:00 or after 20:00 EAT | 0 | Proposed |
| Late starts | First ignition after 07:45 EAT | < 2/month | Proposed |
| Early knock-off | Last trip ended before 17:00 EAT | < 2/month | Proposed |
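The per-100 km normalisation used by the speeding KPI is just a rate over distance. A minimal sketch (the function name is illustrative; only the 80 km/h threshold and formula come from the table above):

```python
def speeding_per_100km(speeding_fix_count: int, total_km: float) -> float:
    """Speeding events (GPS fixes > 80 km/h) normalised per 100 km driven."""
    if total_km <= 0:
        return 0.0  # no distance driven — rate is undefined, report zero
    return speeding_fix_count / total_km * 100
```

For example, 4 speeding fixes over 800 km gives a rate of 0.5, exactly on the green/amber boundary.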

### Domain 4 — Route & Dispatch Efficiency

Measures how well vehicles are matched to jobs geographically.

| KPI | Definition | Target | Status |
|---|---|---|---|
| Avg distance per job | Total km / jobs completed | Baseline TBD | Proposed |
| Nearest available vehicle ETA | PostGIS dispatch query | < 30 min | Proposed |
| Return-to-depot rate | % trips ending at primary depot | Baseline TBD | Proposed |

### Domain 5 — Asset Health & Cost

Measures maintenance burden and fuel efficiency.

| KPI | Definition | Target | Status |
|---|---|---|---|
| Estimated idle fuel cost (KES) | Idle hours × 0.8 L/h × KES 180/L | Minimise | Proposed |
| Vehicles approaching service interval | Odometer > threshold | 0 overdue | Proposed |
| Alarm rate per vehicle per week | COUNT(alarms) / 7 | < 2 | Proposed |
| GPS offline rate | Devices with fix age > 10 min / total | < 10% | Proposed |
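The idle fuel cost estimate is a straight multiplication of the assumed constants (0.8 L/h burn rate and KES 180/L are the working assumptions from the table, not measured values):

```python
IDLE_BURN_L_PER_H = 0.8       # assumed idle burn rate
FUEL_PRICE_KES_PER_L = 180    # assumed pump price


def idle_fuel_cost_kes(idle_hours: float) -> float:
    """Estimated cost of engine-on-stationary time, in Kenyan shillings."""
    return idle_hours * IDLE_BURN_L_PER_H * FUEL_PRICE_KES_PER_L
```

So a vehicle idling 10 hours in a month is estimated at KES 1,440 of wasted fuel.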

### Shift Schedule Assumptions

Adjust these as confirmed with client:

| Parameter | Assumed Value |
|---|---|
| Shift start | 07:30 EAT |
| Late threshold | After 07:45 EAT |
| Shift end | 17:00 EAT |
| After-hours | Before 06:00 or after 20:00 EAT |
| Working days | Monday–Saturday (confirm with client) |
| Shift length for utilisation | 10 hours |
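The late-start and after-hours rules above reduce to simple time comparisons. A hypothetical sketch (function names are illustrative; the thresholds are the assumed values from the table):

```python
from datetime import time

LATE_THRESHOLD_EAT = time(7, 45)
AFTER_HOURS_START = time(20, 0)
AFTER_HOURS_END = time(6, 0)


def is_late_start(first_ignition_eat: time) -> bool:
    """True if the day's first ignition is after the 07:45 EAT threshold."""
    return first_ignition_eat > LATE_THRESHOLD_EAT


def is_after_hours(trip_start_eat: time) -> bool:
    """True if a trip starts before 06:00 or after 20:00 EAT."""
    return trip_start_eat < AFTER_HOURS_END or trip_start_eat > AFTER_HOURS_START
```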

---

## Integration Roadmap

| Integration | What it unlocks | Priority |
|---|---|---|
| Telco ticketing system (job timestamps) | MTTA, MTTR, first-time fix rate, jobs/day | HIGH |
| Tracksolid webhook registration | OBD, fuel, temperature, tamper events | HIGH |
| Driver assignment in Tracksolid Pro | All driver-attributed KPIs | HIGH |
| Geofence definition | Depot departure alerts, city zone coverage | MEDIUM |
| Fuel sensor webhook (`/pushoil`) | Actual fuel consumption vs estimated | MEDIUM |
| Temperature sensor (`/pushtem`) | Cold-chain compliance (if applicable) | LOW |
@ -0,0 +1,148 @@
# Bug Reduction Quality Program — Design Spec

**Date:** 2026-04-12
**Project:** Fireside Communications Fleet Telemetry Ingestion Platform
**Repo:** `55_ts_coolify_gemini_prod`
**Status:** Approved — Implementation in Progress

## Problem

The platform has been running in production since late 2025, ingesting GPS and telemetry data from ~63 fleet vehicles. All bugs discovered to date (FIX-M11, FIX-M13, FIX-M16, FIX-E06, BUG-01 through BUG-05) were caught manually in production — via data inspection, Grafana anomalies, or customer reports. There are:

- Zero automated tests
- No linting or type-checking configuration
- No CI/CD pipeline
- No programmatic DB health monitoring

Any code change risks silent regressions. Any API field-mapping change risks data going silently to NULL. Any schema change risks data corruption that may not be noticed for days.

## Goal

A layered quality program that:
1. **Finds existing bugs and data issues** without modifying source code
2. **Prevents future regressions** by locking in known-correct behaviour
3. **Monitors production DB health** on a daily schedule

## Constraints

- Existing source files MUST NOT be modified in Phase 1
- All additions are new files only (config, tests, CI workflows, audit scripts)
- Must run in CI (Forgejo Actions, self-hosted runner) and production (scheduled DB audit)

---

## Architecture: Three Parallel Workstreams

### Workstream 1 — Static Analysis

**Tools:** `ruff` (linting) + `mypy` (type checking)
**Trigger:** Every push / pull request via Forgejo Actions
**Risk:** Zero — read-only analysis of existing source

Surfaces:
- Undefined names, unused imports (ruff/F rules)
- Likely bugs: mutable defaults, string formatting issues (ruff/B rules)
- Type errors: untyped returns, Optional not handled (mypy)
- Modern Python upgrade opportunities (ruff/UP rules)

The first run will be noisy — its output becomes the bug backlog.

### Workstream 2 — Test Suite

**Framework:** pytest + pytest-asyncio
**Trigger:** Every push / pull request via Forgejo Actions
**Isolation:** Integration tests use a Docker TimescaleDB service container

**Unit tests** (pure Python, no DB):
- `test_clean_helpers.py` — `clean()`, `clean_num()`, `clean_ts()`, `is_valid_fix()` — these gate all data into the DB
- `test_api_signing.py` — `build_sign()` MD5 signature correctness
- `test_field_mapping.py` — locks in the three most bug-prone field mappings:
  - FIX-E06: poll alarms use `alertTypeId`/`alarmTypeName`/`alertTime` (not `alarmType`)
  - FIX-M16: trip distance arrives in metres, stored as km (÷ 1000)
  - BUG-03: BCD timestamps `YYMMDDHHmmss` parsed correctly
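A lock-in test for a case like FIX-M16 can be very small. Sketch only — `metres_to_km` stands in for whatever conversion helper the ingest module actually exposes:

```python
def metres_to_km(distance_m: float) -> float:
    """Tracksolid trip distance arrives in metres; the DB stores kilometres."""
    return distance_m / 1000


def test_trip_distance_converted_to_km():
    # FIX-M16: a 12,345 m trip must be stored as 12.345 km, not 12345.
    assert metres_to_km(12_345) == 12.345
```

Once this is green in CI, reintroducing the metres/km confusion fails the build instead of silently corrupting trip data.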

**Integration tests** (real TimescaleDB):
- `test_movement_pipeline.py` — `poll_live_positions()` full round-trip, UPSERT idempotency
- `test_events_pipeline.py` — `poll_alarms()` field mapping, NULL alarm_type rejection
- `test_webhook_endpoints.py` — FastAPI endpoints with mock Jimi payloads, SAVEPOINT isolation

### Workstream 3 — DB Audit

**Runner:** `db_audit/run_audit.py` (Python)
**Trigger:** Daily at 06:00 EAT (03:00 UTC) via scheduled Forgejo workflow + `workflow_dispatch` for manual runs
**Output:** Rows written to `tracksolid.health_checks` table; queryable from Grafana

Six health checks:

| Check | File | Critical | Warning |
|---|---|---|---|
| Stale devices | `stale_devices.sql` | — | Any enabled device with no GPS fix >2h |
| NULL integrity | `null_integrity.sql` | Any NULL imei or gps_time in telemetry tables | — |
| Distance outliers | `distance_outliers.sql` | — | Any trip >500 km or <0 km in last 7 days |
| Duplicate positions | `duplicate_positions.sql` | Any (imei, gps_time) duplicate in position_history | — |
| Data gaps | `data_gaps.sql` | — | Any enabled device with no data in 7 days |
| Enum drift | `enum_drift.sql` | — | Unexpected value in source/severity columns |

Exit code: `1` on any `critical`, `0` on `ok`/`warning`.
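The table implies a simple severity rule: checks whose findings indicate data corruption escalate to critical; any other check with findings is a warning. A hypothetical sketch of how the runner might derive status (the real `_determine_status` lives in `run_audit.py`):

```python
# Data-corruption-class checks, per the table above. Hypothetical sketch.
CRITICAL_CHECKS = {"null_integrity", "duplicate_positions"}


def determine_status(check_name: str, rows: list[dict]) -> str:
    """Map a check's result rows to ok / warning / critical."""
    if not rows:
        return "ok"
    return "critical" if check_name in CRITICAL_CHECKS else "warning"
```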

---

## File Layout

```
55_ts_coolify_gemini_prod/
├── pyproject.toml                  ← ADD: ruff + mypy + pytest config + dev deps
├── .forgejo/
│   └── workflows/
│       ├── ci-static.yml
│       ├── ci-tests.yml
│       └── scheduled-audit.yml
├── tests/
│   ├── conftest.py
│   ├── fixtures/
│   │   ├── api_responses.py
│   │   └── schema.sql
│   ├── unit/
│   │   ├── test_clean_helpers.py
│   │   ├── test_api_signing.py
│   │   └── test_field_mapping.py
│   └── integration/
│       ├── test_movement_pipeline.py
│       ├── test_events_pipeline.py
│       └── test_webhook_endpoints.py
└── db_audit/
    ├── run_audit.py
    ├── checks/
    │   ├── stale_devices.sql
    │   ├── null_integrity.sql
    │   ├── distance_outliers.sql
    │   ├── duplicate_positions.sql
    │   ├── data_gaps.sql
    │   └── enum_drift.sql
    └── schema/
        └── health_checks_table.sql
```

---

## Forgejo Runner Setup

Before CI can run, a self-hosted runner must be registered on the Coolify server:

1. Forgejo → Settings → Actions → Runners → Register Runner → copy token
2. On Coolify server: `docker run -d --name forgejo-runner gitea/act_runner:latest register --instance https://repo.rahamafresh.com --token <TOKEN> --name coolify-runner --labels self-hosted`
3. Verify runner appears as active in Forgejo

Required Forgejo secrets:
- `DATABASE_URL` — production DB connection string (for scheduled audit)
- `TEST_DATABASE_URL` — set automatically by CI service container

---

## Verification

| Workstream | Pass Criteria |
|---|---|
| Static Analysis | Push triggers CI-static; ruff + mypy produce output report; job exits non-zero on violations |
| Test Suite | Push triggers CI-tests; all unit tests pass; integration tests pass against service container DB |
| DB Audit | Manual run populates `health_checks` table; findings match known issues (44 silent devices, etc.); scheduled run fires at 06:00 EAT |
232
import_drivers_csv.py
Normal file

@ -0,0 +1,232 @@
"""
|
||||
import_drivers_csv.py — Fireside Communications · Driver & Vehicle CSV Import
|
||||
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
|
||||
One-shot script: reads 20260414_FS__Logistics - final_fixed.csv, compares
|
||||
each row against the current tracksolid.devices values, and updates the DB.
|
||||
|
||||
Usage:
|
||||
# Dry-run — shows diff, writes nothing
|
||||
python import_drivers_csv.py
|
||||
|
||||
# Filter to a single IMEI (dry-run)
|
||||
python import_drivers_csv.py --imei 862798052707896
|
||||
|
||||
# Apply all changes to DB
|
||||
python import_drivers_csv.py --apply
|
||||
|
||||
# Only fill fields that are currently NULL in the DB (never overwrite)
|
||||
python import_drivers_csv.py --only-null --apply
|
||||
|
||||
Pre-requisite:
|
||||
Migration 06 must be applied first (adds assigned_city / cost_centre columns).
|
||||
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
|
||||
"""
|
||||
|
||||
import argparse
|
||||
import csv
|
||||
import os
|
||||
import sys
|
||||
import time
|
||||
from datetime import date
|
||||
from pathlib import Path
|
||||
|
||||
from ts_shared_rev import clean, clean_num, clean_ts, get_conn, get_logger
|
||||
|
||||
log = get_logger("csv_import")
|
||||
|
||||
CSV_PATH = Path(__file__).parent / "20260414_FS__Logistics - final_fixed.csv"
|
||||
|
||||
# Columns fetched from DB for comparison
|
||||
DB_COLS = [
|
||||
"imei", "driver_name", "driver_phone", "vehicle_number", "vehicle_name",
|
||||
"vehicle_models", "cost_centre", "sim", "iccid", "imsi", "mc_type",
|
||||
"activation_time", "expiration", "device_name", "assigned_city",
|
||||
]
|
||||
|
||||
# Driver Name values that are placeholders — skip writing driver_name for these
|
||||
_DRIVER_SKIP = {"identification", "ug"}
|
||||
|
||||
|
||||
def _infer_city(plate: str) -> str | None:
|
||||
"""Derive assigned_city from license plate prefix."""
|
||||
p = (plate or "").strip().upper()
|
||||
if p.startswith("UMA") or p.startswith("UAG"):
|
||||
return "KLA"
|
||||
if p.startswith("K"):
|
||||
return "NBO"
|
||||
return None
|
||||
|
||||
|
||||
def _clean_date(v: str) -> str | None:
|
||||
"""Accept YYYY-MM-DD and return as ISO string suitable for TIMESTAMPTZ cast."""
|
||||
s = (v or "").strip()
|
||||
if not s:
|
||||
return None
|
||||
try:
|
||||
date.fromisoformat(s)
|
||||
return s
|
||||
except ValueError:
|
||||
return None
|
||||
|
||||
|
||||
def load_csv() -> dict[str, dict]:
|
||||
"""Load CSV into a dict keyed by IMEI."""
|
||||
rows: dict[str, dict] = {}
|
||||
with open(CSV_PATH, encoding="utf-8-sig", newline="") as f:
|
||||
for row in csv.DictReader(f):
|
||||
imei = (row.get("IMEI") or "").strip()
|
||||
if not imei:
|
||||
continue
|
||||
rows[imei] = row
|
||||
log.info("CSV loaded: %d rows from %s", len(rows), CSV_PATH.name)
|
||||
return rows
|
||||
|
||||
|
||||
def load_db_devices() -> dict[str, dict]:
|
||||
"""Fetch current device rows from DB, keyed by IMEI."""
|
||||
devices: dict[str, dict] = {}
|
||||
with get_conn() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(f"SELECT {', '.join(DB_COLS)} FROM tracksolid.devices")
|
||||
col_names = [d[0] for d in cur.description]
|
||||
for row in cur.fetchall():
|
||||
rec = dict(zip(col_names, row))
|
||||
devices[rec["imei"]] = rec
|
||||
log.info("DB loaded: %d devices", len(devices))
|
||||
return devices
|
||||
|
||||
|
||||
def build_update(csv_row: dict, db_row: dict | None, only_null: bool) -> dict[str, object]:
|
||||
"""
|
||||
Return a dict of column→new_value for fields that need updating.
|
||||
When only_null=True, skip any DB column that already has a value.
|
||||
The driver_name column is skipped for placeholder-labelled devices.
|
||||
"""
|
||||
driver_raw = clean(csv_row.get("Driver Name")) or ""
|
||||
plate = clean(csv_row.get("License Plate No.")) or ""
|
||||
is_placeholder = driver_raw.lower() in _DRIVER_SKIP
|
||||
skip_row = driver_raw.lower() == "identification"
|
||||
|
||||
if skip_row:
|
||||
return {}
|
||||
|
||||
proposed: dict[str, object] = {
|
||||
"vehicle_number": clean(plate),
|
||||
"vehicle_name": clean(plate),
|
||||
"vehicle_models": clean(csv_row.get("Vehicle Model")),
|
||||
"cost_centre": clean(csv_row.get("Department")),
|
||||
"sim": clean(csv_row.get("SIM")),
|
||||
"iccid": clean(csv_row.get("ICCID")),
|
||||
"imsi": clean(csv_row.get("IMSI")),
|
||||
"mc_type": clean(csv_row.get("Model")),
|
||||
"activation_time": _clean_date(csv_row.get("Activated Date", "")),
|
||||
"expiration": _clean_date(csv_row.get("Subscription Expiration", "")),
|
||||
"driver_phone": clean(csv_row.get("Telephone")),
|
||||
"assigned_city": _infer_city(plate),
|
||||
}
|
||||
if not is_placeholder:
|
||||
proposed["driver_name"] = driver_raw or None
|
||||
|
||||
# Drop None values — no point sending a NULL to overwrite another NULL
|
||||
proposed = {k: v for k, v in proposed.items() if v is not None}
|
||||
|
||||
if not only_null or db_row is None:
|
||||
return proposed
|
||||
|
||||
# only_null: drop any column that already has a non-null value in the DB
|
||||
return {
|
||||
k: v for k, v in proposed.items()
|
||||
if db_row.get(k) is None
|
||||
}
|
||||
|
||||
|
||||
def print_diff(imei: str, updates: dict[str, object], db_row: dict | None) -> None:
|
||||
"""Pretty-print what will change for one device."""
|
||||
if not updates:
|
||||
return
|
||||
db = db_row or {}
|
||||
print(f"\n IMEI {imei}:")
|
||||
for col, new_val in sorted(updates.items()):
|
||||
old_val = db.get(col)
|
||||
if old_val != new_val:
|
||||
print(f" {col:<20} {str(old_val):<30} → {new_val}")
|
||||
|
||||
|
||||
def run(apply: bool, only_null: bool, filter_imei: str | None) -> None:
|
||||
csv_rows = load_csv()
|
||||
db_rows = load_db_devices()
|
||||
|
||||
if filter_imei:
|
||||
csv_rows = {k: v for k, v in csv_rows.items() if k == filter_imei}
|
||||
if not csv_rows:
|
||||
print(f"IMEI {filter_imei} not found in CSV.")
|
||||
return
|
||||
|
||||
updated = skipped = no_change = not_in_db = 0
|
||||
|
||||
with get_conn() as conn:
|
||||
with conn.cursor() as cur:
|
||||
for imei, csv_row in csv_rows.items():
|
||||
db_row = db_rows.get(imei)
|
||||
|
||||
updates = build_update(csv_row, db_row, only_null)
|
||||
|
||||
if not updates:
|
||||
        # Either an "Identification" placeholder or nothing to change
        driver_raw = (csv_row.get("Driver Name") or "").strip().lower()
        if driver_raw == "identification":
            skipped += 1
        else:
            no_change += 1
        continue

        if db_row is None:
            not_in_db += 1
            log.warning("IMEI %s in CSV but NOT in DB — skipping.", imei)
            continue

        print_diff(imei, updates, db_row)

        if apply:
            set_clauses = []
            params = []
            for col, val in updates.items():
                if col in ("activation_time", "expiration"):
                    set_clauses.append(f"{col} = COALESCE(%s::TIMESTAMPTZ, {col})")
                else:
                    set_clauses.append(
                        f"{col} = COALESCE(NULLIF(%s, ''), {col})"
                    )
                params.append(str(val) if val is not None else None)

            set_clauses.append("updated_at = NOW()")
            params.append(imei)

            cur.execute(
                f"UPDATE tracksolid.devices SET {', '.join(set_clauses)} WHERE imei = %s",
                params,
            )
            updated += 1
        else:
            updated += 1  # count as "would update" in dry-run

    mode = "APPLIED" if apply else "DRY-RUN"
    print(f"\n{'='*60}")
    print(f" {mode} COMPLETE")
    print(f"{'='*60}")
    print(f" Would update / updated  : {updated}")
    print(f" No change needed        : {no_change}")
    print(f" Skipped (Identification): {skipped}")
    print(f" IMEI not in DB          : {not_in_db}")
    if not apply:
        print("\n Run with --apply to commit changes.")


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Import driver/vehicle details from CSV into tracksolid.devices")
    parser.add_argument("--apply", action="store_true", help="Write changes to DB (default: dry-run)")
    parser.add_argument("--only-null", action="store_true", help="Only update fields currently NULL in the DB")
    parser.add_argument("--imei", default=None, help="Limit to a single IMEI")
    args = parser.parse_args()

    run(apply=args.apply, only_null=args.only_null, filter_imei=args.imei)
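The `COALESCE(NULLIF(%s, ''), col)` clause above is what makes the import non-destructive: an empty CSV cell leaves the existing column value in place, while a real value overwrites it. A minimal sketch of that behaviour, run against sqlite3 for portability (the clause evaluates the same way on Postgres; the table and IMEI here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE devices (imei TEXT PRIMARY KEY, driver_name TEXT)")
conn.execute("INSERT INTO devices VALUES ('123456789012345', 'Alice')")

for new_val in ("", "Bob"):
    # NULLIF('', '') -> NULL, so COALESCE falls back to the current column
    conn.execute(
        "UPDATE devices SET driver_name = COALESCE(NULLIF(?, ''), driver_name)"
        " WHERE imei = '123456789012345'",
        (new_val,),
    )
    print(conn.execute("SELECT driver_name FROM devices").fetchone()[0])
# empty value is ignored (still Alice), then 'Bob' overwrites
```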
@@ -52,6 +52,8 @@ def poll_alarms():
    start_ts = end_ts - timedelta(minutes=30)  # Look back 30m to ensure coverage
    inserted = 0

    with get_conn() as conn:
        with conn.cursor() as cur:
            for i in range(0, len(imeis), 50):
                batch = imeis[i:i+50]
                resp = api_post("jimi.device.alarm.list", {

@@ -64,9 +66,9 @@ def poll_alarms():
                alarms = resp.get("result") or []
                if not alarms: continue

                with get_conn() as conn:
                    with conn.cursor() as cur:
                for a in alarms:
                    try:
                        cur.execute("SAVEPOINT sp")
                        lat, lng = clean_num(a.get("lat")), clean_num(a.get("lng"))
                        # [FIX-E06] Poll response uses alertTypeId/alarmTypeName/alertTime,
                        # not alarmType/alarmName/alarmTime (those are webhook push field names).

@@ -90,10 +92,14 @@ def poll_alarms():
                            lng, lat, lng, lat, lat, lng,
                            clean_num(a.get("speed")), clean(a.get("accStatus"))
                        ))
                        inserted += 1
                        inserted += cur.rowcount  # read before RELEASE resets rowcount to -1
                        cur.execute("RELEASE SAVEPOINT sp")
                    except Exception:
                        cur.execute("ROLLBACK TO SAVEPOINT sp")
                        log.warning("Failed to process alarm for %s", a.get("imei"), exc_info=True)

                log_ingestion(cur, "jimi.device.alarm.list", len(batch), 0, inserted, int((time.time()-t0)*1000), True)
            conn.commit()
            log_ingestion(cur, "jimi.device.alarm.list", len(imeis), 0, inserted,
                          int((time.time()-t0)*1000), True)

    log.info("Alarms: %d new events inserted.", inserted)
@@ -34,8 +34,11 @@ REVISIONS (QA-Verified):
"""

import time
from concurrent.futures import ThreadPoolExecutor

import schedule
from datetime import datetime, timezone, timedelta
from psycopg2.extras import execute_values

from ts_shared_rev import (
    TARGET_ACCOUNT,

@@ -70,14 +73,24 @@ def sync_devices():
    devices = resp.get("result") or []
    upserted = 0

    # Fetch per-device detail in parallel — previously an N+1 blocker where
    # 80 devices × ~300 ms/call ≈ 24 s serial. 8 workers brings it to ~3 s.
    # Gated at 8 to stay under API rate-limit (1006) headroom.
    def _fetch_detail(imei: str) -> dict:
        detail_resp = api_post("jimi.track.device.detail", {"imei": imei}, token)
        return detail_resp.get("result") or {} if detail_resp.get("code") == 0 else {}

    imeis = [d.get("imei") for d in devices if d.get("imei")]
    with ThreadPoolExecutor(max_workers=8) as pool:
        details = dict(zip(imeis, pool.map(_fetch_detail, imeis)))

    with get_conn() as conn:
        with conn.cursor() as cur:
            for d in devices:
                imei = d.get("imei")
                if not imei: continue

                detail_resp = api_post("jimi.track.device.detail", {"imei": imei}, token)
                dtl = detail_resp.get("result") or {} if detail_resp.get("code") == 0 else {}
                dtl = details.get(imei, {})

                cur.execute("""
                    INSERT INTO tracksolid.devices (
@@ -94,11 +107,32 @@ def sync_devices():
                    )
                    ON CONFLICT (imei) DO UPDATE SET
                        device_name = EXCLUDED.device_name,
                        mc_type = EXCLUDED.mc_type,
                        mc_type_use_scope = EXCLUDED.mc_type_use_scope,
                        vehicle_name = EXCLUDED.vehicle_name,
                        vehicle_number = EXCLUDED.vehicle_number,
                        vehicle_models = EXCLUDED.vehicle_models,
                        vehicle_icon = EXCLUDED.vehicle_icon,
                        vin = EXCLUDED.vin,
                        engine_number = EXCLUDED.engine_number,
                        vehicle_brand = EXCLUDED.vehicle_brand,
                        fuel_100km = EXCLUDED.fuel_100km,
                        driver_name = EXCLUDED.driver_name,
                        driver_phone = EXCLUDED.driver_phone,
                        sim = EXCLUDED.sim,
                        iccid = EXCLUDED.iccid,
                        imsi = EXCLUDED.imsi,
                        account = EXCLUDED.account,
                        customer_name = EXCLUDED.customer_name,
                        device_group_id = EXCLUDED.device_group_id,
                        device_group = EXCLUDED.device_group,
                        activation_time = EXCLUDED.activation_time,
                        expiration = EXCLUDED.expiration,
                        enabled_flag = EXCLUDED.enabled_flag,
                        status = EXCLUDED.status,
                        current_mileage_km = EXCLUDED.current_mileage_km,
                        last_synced_at = NOW(), updated_at = NOW()
                        last_synced_at = NOW(),
                        updated_at = NOW()
                """, (
                    imei, clean(d.get("deviceName")), clean(d.get("mcType")), clean(d.get("mcTypeUseScope")),
                    clean(d.get("vehicleName")), clean(d.get("vehicleNumber")), clean(d.get("vehicleModels")), clean(d.get("vehicleIcon")),
@@ -129,8 +163,19 @@ def poll_live_positions():
    with get_conn() as conn:
        with conn.cursor() as cur:
            for p in positions:
                try:
                    cur.execute("SAVEPOINT sp")
                    imei, lat, lng = p.get("imei"), clean_num(p.get("lat")), clean_num(p.get("lng"))
                    if not imei or not is_valid_fix(lat, lng): continue
                    if not imei or not is_valid_fix(lat, lng):
                        cur.execute("RELEASE SAVEPOINT sp")
                        continue

                    gps_time = clean_ts(p.get("gpsTime"))
                    speed = clean_num(p.get("speed"))
                    direction = clean_num(p.get("direction"))
                    acc_status = clean(p.get("accStatus"))
                    gps_num = clean_int(p.get("gpsNum"))
                    current_mileage = clean_num(p.get("currentMileage"))

                    cur.execute("""
                        INSERT INTO tracksolid.live_positions (

@@ -149,29 +194,33 @@ def poll_live_positions():
                        updated_at=NOW()
                    """, (
                        imei, lng, lat, lat, lng, clean(p.get("posType")), clean_int(p.get("confidence")),
                        clean_ts(p.get("gpsTime")), clean_ts(p.get("hbTime")), clean_num(p.get("speed")),
                        clean_num(p.get("direction")), clean(p.get("accStatus")), clean_int(p.get("gpsSignal")),
                        clean_int(p.get("gpsNum")), clean_num(p.get("electQuantity")), clean_num(p.get("powerValue")),
                        gps_time, clean_ts(p.get("hbTime")), speed,
                        direction, acc_status, clean_int(p.get("gpsSignal")),
                        gps_num, clean_num(p.get("electQuantity")), clean_num(p.get("powerValue")),
                        clean_num(p.get("batteryPowerVal")), clean(p.get("trackerOil")), clean_num(p.get("temperature")),
                        clean_num(p.get("currentMileage")), clean(p.get("status")), clean(p.get("locDesc"))
                        current_mileage, clean(p.get("status")), clean(p.get("locDesc"))
                    ))
                    upserted += 1
                    upserted += cur.rowcount

                    # History (Hypertable Source)
                    if clean_ts(p.get("gpsTime")):
                    if gps_time:
                        cur.execute("""
                            INSERT INTO tracksolid.position_history (imei, gps_time, geom, lat, lng, speed, direction, acc_status, satellite, current_mileage)
                            VALUES (%s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326), %s, %s, %s, %s, %s, %s, %s)
                            ON CONFLICT (imei, gps_time) DO NOTHING
                        """, (imei, clean_ts(p.get("gpsTime")), lng, lat, lat, lng, clean_num(p.get("speed")), clean_num(p.get("direction")), clean(p.get("accStatus")), clean_int(p.get("gpsNum")), clean_num(p.get("currentMileage"))))
                        inserted += 1
                        """, (imei, gps_time, lng, lat, lat, lng, speed, direction, acc_status, gps_num, current_mileage))
                        inserted += cur.rowcount
                    cur.execute("RELEASE SAVEPOINT sp")
                except Exception:
                    cur.execute("ROLLBACK TO SAVEPOINT sp")
                    log.warning("Failed to process live position for %s", p.get("imei"), exc_info=True)

            log_ingestion(cur, "jimi.user.device.location.list", len(positions), upserted, inserted, int((time.time()-t0)*1000), True)
        conn.commit()

# ── 3. Trip Reports (Every 15m) ───────────────────────────────────────────────

def poll_trips():
    t0 = time.time()
    token, imeis = get_token(), get_active_imeis()
    if not token or not imeis: return
@@ -179,6 +228,8 @@ def poll_trips():
    start_ts = end_ts - timedelta(hours=1)
    inserted = 0

    with get_conn() as conn:
        with conn.cursor() as cur:
            for i in range(0, len(imeis), 50):
                batch = imeis[i:i+50]
                resp = api_post("jimi.device.track.mileage", {

@@ -188,12 +239,15 @@ def poll_trips():
                }, token)

                trips = resp.get("result") or []
                with get_conn() as conn:
                    with conn.cursor() as cur:
                for t in trips:
                    # [FIX-M11] API returns distance in km. Store directly as distance_km.
                    # Previous code multiplied by 1000 (→ mm), which was wrong.
                    dist_km = clean_num(t.get("distance"))
                    try:
                        cur.execute("SAVEPOINT sp")
                        # [FIX-M16] API returns distance in METRES despite documentation saying km.
                        # Confirmed via: avgSpeed(km/h) × runTimeSecond / 3600 == distance/1000.
                        # startMileage/endMileage are cumulative odometer in metres (same unit).
                        # Divide by 1000 to store as distance_km.
                        raw_dist = clean_num(t.get("distance"))
                        dist_km = round(raw_dist / 1000.0, 4) if raw_dist is not None else None
                        cur.execute("""
                            INSERT INTO tracksolid.trips (
                                imei, start_time, end_time, distance_km,

@@ -209,8 +263,14 @@ def poll_trips():
                            dist_km, clean_num(t.get("avgSpeed")),
                            clean_num(t.get("maxSpeed")), clean_int(t.get("runTimeSecond"))
                        ))
                        inserted += 1
                        conn.commit()
                        inserted += cur.rowcount  # read before RELEASE resets rowcount to -1
                        cur.execute("RELEASE SAVEPOINT sp")
                    except Exception:
                        cur.execute("ROLLBACK TO SAVEPOINT sp")
                        log.warning("Failed to process trip for %s", t.get("imei"), exc_info=True)

            log_ingestion(cur, "jimi.device.track.mileage", len(imeis), 0, inserted,
                          int((time.time() - t0) * 1000), True)
    log.info("Trips: %d records processed.", inserted)
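The unit check cited in the [FIX-M16] comment can be reproduced with the numbers from the test fixture (15000 for `distance`, 15 km/h average, 3600 s run time): distance derived from speed agrees with the raw value only when the raw value is read as metres.

```python
# Trip values taken from TRIPS_RESPONSE in tests/fixtures/api_responses.py.
trip = {"distance": 15000, "avgSpeed": 15.0, "runTimeSecond": 3600}

km_from_speed = trip["avgSpeed"] * trip["runTimeSecond"] / 3600  # 15.0 km
km_from_distance = trip["distance"] / 1000.0                     # 15.0 km
assert km_from_speed == km_from_distance  # would be off by 1000x if km

dist_km = round(trip["distance"] / 1000.0, 4)  # value stored as distance_km
print(dist_km)  # → 15.0
```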
# ── 4. Parking Events (Every 15m) ─────────────────────────────────────────────

@@ -224,6 +284,8 @@ def poll_parking():
    start_ts = end_ts - timedelta(hours=1)
    inserted = 0

    with get_conn() as conn:
        with conn.cursor() as cur:
            for i in range(0, len(imeis), 50):
                batch = imeis[i:i+50]
                # [FIX-M13] Added account + acc_type=0 (all stop types). Without these

@@ -237,12 +299,13 @@ def poll_parking():
                }, token)

                events = resp.get("result") or []
                with get_conn() as conn:
                    with conn.cursor() as cur:
                for p in events:
                    try:
                        cur.execute("SAVEPOINT sp")
                        imei = p.get("imei")
                        start_time = clean_ts(p.get("startTime"))
                        if not imei or not start_time:
                            cur.execute("RELEASE SAVEPOINT sp")
                            continue
                        lat, lng = clean_num(p.get("lat")), clean_num(p.get("lng"))
                        cur.execute("""

@@ -262,8 +325,13 @@ def poll_parking():
                            lng, lat, lng, lat,
                            clean(p.get("address"))
                        ))
                        inserted += 1
                        log_ingestion(cur, "jimi.open.platform.report.parking", len(batch), 0, inserted,
                        inserted += cur.rowcount  # read before RELEASE resets rowcount to -1
                        cur.execute("RELEASE SAVEPOINT sp")
                    except Exception:
                        cur.execute("ROLLBACK TO SAVEPOINT sp")
                        log.warning("Failed to process parking for %s", p.get("imei"), exc_info=True)

            log_ingestion(cur, "jimi.open.platform.report.parking", len(imeis), 0, inserted,
                          int((time.time() - t0) * 1000), True)
    log.info("Parking: %d events processed.", inserted)
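The same per-row savepoint shape recurs in every poller above: wrap each row in `SAVEPOINT` / `RELEASE`, and on failure `ROLLBACK TO SAVEPOINT` so one bad row does not abort the rest of the transaction. A minimal, self-contained sketch of the pattern, demonstrated on sqlite3 (which accepts the same savepoint statements; psycopg2 against Postgres behaves the same way, and the `events` table here is illustrative):

```python
import sqlite3

def insert_rows(conn, rows):
    """Insert each row under its own savepoint; skip rows that fail."""
    inserted = 0
    cur = conn.cursor()
    for row in rows:
        try:
            cur.execute("SAVEPOINT sp")
            cur.execute("INSERT INTO events (id) VALUES (?)", (row,))
            n = cur.rowcount                     # capture before RELEASE resets it
            cur.execute("RELEASE SAVEPOINT sp")  # keep this row's work
            inserted += n
        except Exception:
            cur.execute("ROLLBACK TO SAVEPOINT sp")  # discard only this row
    conn.commit()
    return inserted

conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY)")
# the duplicate 2 violates the primary key, is rolled back, the rest land
print(insert_rows(conn, [1, 2, 2, 3]))  # → 3
```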
@@ -292,42 +360,37 @@ def poll_track_list():

    end_ts = datetime.now(timezone.utc)
    start_ts = end_ts - timedelta(minutes=35)  # 5-min overlap avoids boundary gaps
    total_inserted = 0
    devices_with_data = 0
    begin_str = start_ts.strftime("%Y-%m-%d %H:%M:%S")
    end_str = end_ts.strftime("%Y-%m-%d %H:%M:%S")

    with get_conn() as conn:
        with conn.cursor() as cur:
            for imei in imeis:
    # Phase 1: fetch waypoints from API without holding a DB connection.
    # jimi.device.track.list is per-IMEI; parallelise at 4 workers to speed
    # up the 30 min sweep without tripping the 1006 rate limit.
    def _fetch(imei: str):
        resp = api_post("jimi.device.track.list", {
            "imei": imei,
            "begin_time": start_ts.strftime("%Y-%m-%d %H:%M:%S"),
            "end_time": end_ts.strftime("%Y-%m-%d %H:%M:%S"),
            "begin_time": begin_str,
            "end_time": end_str,
            "map_type": "GOOGLE",
        }, token)
        return imei, resp.get("result") or []

    waypoints = resp.get("result") or []
    if not waypoints:
        continue
    with ThreadPoolExecutor(max_workers=4) as pool:
        fetched = list(pool.map(_fetch, imeis))

    inserted = 0
    # Phase 2: write rows in one DB transaction.
    total_inserted = 0
    devices_with_data = 0
    rows = []
    for imei, waypoints in fetched:
        device_rows = 0
        for wp in waypoints:
            lat = clean_num(wp.get("lat"))
            lng = clean_num(wp.get("lng"))
            gps_time = clean_ts(wp.get("gpsTime"))
            if not is_valid_fix(lat, lng) or not gps_time:
                continue

            cur.execute("""
                INSERT INTO tracksolid.position_history (
                    imei, gps_time, geom, lat, lng,
                    speed, direction, acc_status, source
                ) VALUES (
                    %s, %s,
                    ST_SetSRID(ST_MakePoint(%s, %s), 4326),
                    %s, %s, %s, %s, %s, 'track_list'
                )
                ON CONFLICT (imei, gps_time) DO NOTHING
            """, (
            rows.append((
                imei, gps_time,
                lng, lat,  # ST_MakePoint(lng, lat)
                lat, lng,  # lat, lng columns

@@ -335,15 +398,35 @@ def poll_track_list():
                clean_num(wp.get("direction")),
                clean(wp.get("accStatus")),
            ))
            inserted += 1

        if inserted:
            total_inserted += inserted
            device_rows += 1
        if device_rows:
            devices_with_data += 1

    if rows:
        with get_conn() as conn:
            with conn.cursor() as cur:
                execute_values(
                    cur,
                    """
                    INSERT INTO tracksolid.position_history (
                        imei, gps_time, geom, lat, lng,
                        speed, direction, acc_status, source
                    ) VALUES %s
                    ON CONFLICT (imei, gps_time) DO NOTHING
                    """,
                    rows,
                    template="(%s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326),"
                             " %s, %s, %s, %s, %s, 'track_list')",
                    page_size=500,
                )
                # note: rowcount reflects only the final page when len(rows) > page_size
                total_inserted = cur.rowcount
                log_ingestion(cur, "jimi.device.track.list", len(imeis),
                              0, total_inserted, int((time.time() - t0) * 1000), True)
                conn.commit()
    else:
        with get_conn() as conn:
            with conn.cursor() as cur:
                log_ingestion(cur, "jimi.device.track.list", len(imeis),
                              0, 0, int((time.time() - t0) * 1000), True)

    log.info("Track list: %d waypoints inserted across %d/%d devices.",
             total_inserted, devices_with_data, len(imeis))
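The Phase 1 / Phase 2 split above (parallel API fetch with no DB handle held, then one batched write) can be sketched in isolation. The function names here are illustrative stand-ins, not from the repo; `fetch_waypoints` plays the role of the per-IMEI `api_post` call and `writer` the role of the batched `execute_values` insert:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_waypoints(imei):
    # stand-in for api_post("jimi.device.track.list", ...): two fake waypoints
    return imei, [{"imei": imei, "seq": n} for n in range(2)]

def sweep(imeis, writer):
    # Phase 1: network only, fanned out across workers, no DB connection held
    with ThreadPoolExecutor(max_workers=4) as pool:
        fetched = list(pool.map(fetch_waypoints, imeis))
    # Phase 2: flatten and hand everything to a single batched write
    rows = [wp for _, wps in fetched for wp in wps]
    writer(rows)
    return len(rows)

written = []
print(sweep(["imei-A", "imei-B"], written.extend))  # → 4
```

Keeping the connection out of Phase 1 means slow API calls never pin a pooled connection, and Phase 2 becomes one short transaction instead of many long ones.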
@@ -28,3 +28,23 @@ managed = true

[tool.uv.sources]
# Optional: If you ever have custom local modules or git-based private libs

[project.optional-dependencies]
dev = [
    "ruff>=0.4",
    "mypy>=1.10",
    "pytest>=8",
    "pytest-asyncio>=0.23",
    "httpx>=0.27",
]

[tool.ruff]
target-version = "py312"
line-length = 100
select = ["E", "W", "F", "B", "UP", "SIM"]

[tool.mypy]
python_version = "3.12"
warn_return_any = true
warn_unused_ignores = true
ignore_missing_imports = true
214 sync_driver_audit.py (Normal file)

@@ -0,0 +1,214 @@
"""
sync_driver_audit.py — Fireside Communications · Driver & IMEI Audit Sync
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
One-shot script: fetches ALL devices from the Tracksolid API, compares driver
and IMEI details against the DB, reports gaps, and populates missing data.

Run inside the container:
    docker exec -it <ingest_movement_container> python sync_driver_audit.py

Or via the Coolify terminal with env vars loaded.
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
"""

import time
from concurrent.futures import ThreadPoolExecutor

from ts_shared_rev import (
    TARGET_ACCOUNT,
    api_post,
    get_conn,
    get_token,
    clean,
    clean_num,
    clean_int,
    clean_ts,
    get_logger,
)

log = get_logger("driver_audit")


def run_audit():
    log.info("=== Driver & IMEI Audit Sync ===")
    t0 = time.time()
    token = get_token()
    if not token:
        log.error("Could not obtain API token. Check credentials.")
        return

    # 1. Fetch all devices from API
    resp = api_post("jimi.user.device.list", {"target": TARGET_ACCOUNT}, token)
    if resp.get("code") != 0:
        log.error("API error: %s", resp)
        return

    api_devices = resp.get("result") or []
    log.info("API returned %d devices.", len(api_devices))

    # 2. Fetch current DB state
    with get_conn() as conn:
        with conn.cursor() as cur:
            cur.execute("""
                SELECT imei, device_name, driver_name, driver_phone, sim, status
                FROM tracksolid.devices
                ORDER BY imei
            """)
            db_rows = {row[0]: {
                "device_name": row[1],
                "driver_name": row[2],
                "driver_phone": row[3],
                "sim": row[4],
                "status": row[5],
            } for row in cur.fetchall()}

    log.info("DB has %d devices registered.", len(db_rows))

    # 3. Compare and report gaps
    api_imeis = set()
    missing_from_db = []
    driver_gaps = []
    driver_phone_gaps = []

    for d in api_devices:
        imei = d.get("imei")
        if not imei:
            continue
        api_imeis.add(imei)

        if imei not in db_rows:
            missing_from_db.append(imei)
        else:
            db = db_rows[imei]
            if not db["driver_name"] and clean(d.get("driverName")):
                driver_gaps.append((imei, clean(d.get("driverName"))))
            if not db["driver_phone"] and clean(d.get("driverPhone")):
                driver_phone_gaps.append((imei, clean(d.get("driverPhone"))))

    orphaned_in_db = set(db_rows.keys()) - api_imeis

    # 4. Print gap report
    print("\n" + "="*60)
    print("AUDIT REPORT")
    print("="*60)
    print(f" API devices   : {len(api_imeis)}")
    print(f" DB devices    : {len(db_rows)}")
    print(f" New (API only): {len(missing_from_db)}")
    print(f" Orphaned (DB) : {len(orphaned_in_db)}")
    print(f" Missing driver_name  (API has, DB null): {len(driver_gaps)}")
    print(f" Missing driver_phone (API has, DB null): {len(driver_phone_gaps)}")

    if missing_from_db:
        print(f"\nIMEIs NOT in DB ({len(missing_from_db)}):")
        for imei in missing_from_db:
            print(f"  {imei}")

    if driver_gaps:
        print(f"\nDevices missing driver_name in DB ({len(driver_gaps)}):")
        for imei, name in driver_gaps:
            print(f"  {imei} → '{name}'")

    if driver_phone_gaps:
        print(f"\nDevices missing driver_phone in DB ({len(driver_phone_gaps)}):")
        for imei, phone in driver_phone_gaps:
            print(f"  {imei} → '{phone}'")

    if orphaned_in_db:
        print(f"\nIMEIs in DB but NOT in API (orphaned/deactivated) ({len(orphaned_in_db)}):")
        for imei in sorted(orphaned_in_db):
            print(f"  {imei}")

    print("="*60)
    # 5. Upsert ALL devices with full field sync (including driver info)
    log.info("Starting full upsert of %d devices...", len(api_devices))
    upserted = 0

    # Parallelize the per-device detail lookups (see ingest_movement.sync_devices).
    def _fetch_detail(imei: str) -> dict:
        detail_resp = api_post("jimi.track.device.detail", {"imei": imei}, token)
        return detail_resp.get("result") or {} if detail_resp.get("code") == 0 else {}

    imeis_to_fetch = [d.get("imei") for d in api_devices if d.get("imei")]
    with ThreadPoolExecutor(max_workers=8) as pool:
        details = dict(zip(imeis_to_fetch, pool.map(_fetch_detail, imeis_to_fetch)))

    with get_conn() as conn:
        with conn.cursor() as cur:
            for d in api_devices:
                imei = d.get("imei")
                if not imei:
                    continue

                dtl = details.get(imei, {})

                cur.execute("""
                    INSERT INTO tracksolid.devices (
                        imei, device_name, mc_type, mc_type_use_scope,
                        vehicle_name, vehicle_number, vehicle_models, vehicle_icon,
                        vin, engine_number, vehicle_brand, fuel_100km,
                        driver_name, driver_phone, sim, iccid, imsi,
                        account, customer_name, device_group_id, device_group,
                        activation_time, expiration, enabled_flag, status,
                        current_mileage_km, last_synced_at
                    ) VALUES (
                        %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s,
                        %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, NOW()
                    )
                    ON CONFLICT (imei) DO UPDATE SET
                        device_name = EXCLUDED.device_name,
                        mc_type = EXCLUDED.mc_type,
                        mc_type_use_scope = EXCLUDED.mc_type_use_scope,
                        vehicle_name = EXCLUDED.vehicle_name,
                        vehicle_number = EXCLUDED.vehicle_number,
                        vehicle_models = EXCLUDED.vehicle_models,
                        vehicle_icon = EXCLUDED.vehicle_icon,
                        vin = EXCLUDED.vin,
                        engine_number = EXCLUDED.engine_number,
                        vehicle_brand = EXCLUDED.vehicle_brand,
                        fuel_100km = EXCLUDED.fuel_100km,
                        driver_name = EXCLUDED.driver_name,
                        driver_phone = EXCLUDED.driver_phone,
                        sim = EXCLUDED.sim,
                        iccid = EXCLUDED.iccid,
                        imsi = EXCLUDED.imsi,
                        account = EXCLUDED.account,
                        customer_name = EXCLUDED.customer_name,
                        device_group_id = EXCLUDED.device_group_id,
                        device_group = EXCLUDED.device_group,
                        activation_time = EXCLUDED.activation_time,
                        expiration = EXCLUDED.expiration,
                        enabled_flag = EXCLUDED.enabled_flag,
                        status = EXCLUDED.status,
                        current_mileage_km = EXCLUDED.current_mileage_km,
                        last_synced_at = NOW(),
                        updated_at = NOW()
                """, (
                    imei,
                    clean(d.get("deviceName")), clean(d.get("mcType")),
                    clean(d.get("mcTypeUseScope")), clean(d.get("vehicleName")),
                    clean(d.get("vehicleNumber")), clean(d.get("vehicleModels")),
                    clean(d.get("vehicleIcon")),
                    clean(dtl.get("vin")), clean(dtl.get("engineNumber")),
                    clean(dtl.get("vehicleBrand")), clean_num(dtl.get("fuel_100km")),
                    clean(d.get("driverName")), clean(d.get("driverPhone")),
                    clean(d.get("sim")), clean(dtl.get("iccid")),
                    clean(dtl.get("imsi")),
                    clean(dtl.get("account")), clean(dtl.get("customerName")),
                    clean(d.get("deviceGroupId")), clean(d.get("deviceGroup")),
                    clean_ts(d.get("activationTime")), clean_ts(d.get("expiration")),
                    clean_int(d.get("enabledFlag", 1)),
                    clean(dtl.get("status", "active")),
                    clean_num(dtl.get("currentMileage")),
                ))
                upserted += 1

            conn.commit()

    elapsed = int((time.time() - t0) * 1000)
    log.info("Done. Upserted %d devices in %dms.", upserted, elapsed)
    print(f"\nSync complete: {upserted} devices upserted in {elapsed}ms.")


if __name__ == "__main__":
    run_audit()
0 tests/__init__.py (Normal file)
0 tests/fixtures/__init__.py (vendored, Normal file)
109 tests/fixtures/api_responses.py (vendored, Normal file)

@@ -0,0 +1,109 @@
"""Mock Tracksolid Pro API responses for testing."""

# jimi.user.device.location.list response
LIVE_POSITIONS_RESPONSE = {
    "code": 0,
    "result": [
        {
            "imei": "123456789012345",
            "lat": -1.2921,
            "lng": 36.8219,
            "speed": 45.5,
            "direction": 180,
            "gpsTime": "2024-04-12 08:00:00",
            "hbTime": "2024-04-12 08:00:05",
            "accStatus": "1",
            "gpsSignal": 4,
            "gpsNum": 8,
            "currentMileage": 1234.5,
            "posType": "GPS",
            "confidence": 95,
            "status": "1",
            "locDesc": "Nairobi CBD",
        },
        {
            # Zero Island — should be filtered by is_valid_fix
            "imei": "999999999999999",
            "lat": 0.0,
            "lng": 0.0,
            "speed": 0,
            "gpsTime": "2024-04-12 08:00:00",
        },
    ]
}

# jimi.device.track.mileage response (distance in METRES — FIX-M16)
TRIPS_RESPONSE = {
    "code": 0,
    "result": [
        {
            "imei": "123456789012345",
            "startTime": "2024-04-12 07:00:00",
            "endTime": "2024-04-12 08:00:00",
            "distance": 15000,  # 15000 METRES = 15.0 km
            "avgSpeed": 15.0,
            "maxSpeed": 60.0,
            "runTimeSecond": 3600,
        }
    ]
}

# jimi.device.alarm.list response (FIX-E06: uses alertTypeId, not alarmType)
ALARMS_RESPONSE = {
    "code": 0,
    "result": [
        {
            "imei": "123456789012345",
            "alertTypeId": "4",                  # poll field name
            "alarmTypeName": "Speeding",         # poll field name
            "alertTime": "2024-04-12 07:30:00",  # poll field name
            "lat": -1.2921,
            "lng": 36.8219,
            "speed": 95.0,
            "accStatus": "1",
        }
    ]
}

# Webhook /pushalarm payload (uses alarmType, not alertTypeId)
WEBHOOK_ALARM_PAYLOAD = {
    "deviceImei": "123456789012345",
    "alarmType": "4",
    "alarmName": "Speeding",
    "gateTime": "2024-04-12 07:30:00",
    "lat": -1.2921,
    "lng": 36.8219,
    "speed": 95.0,
}

# Webhook /pushtripreport payload (BCD timestamp — BUG-03)
WEBHOOK_TRIP_BCD_PAYLOAD = {
    "deviceImei": "123456789012345",
    "beginTime": "220415103000",  # BCD YYMMDDHHmmss = 2022-04-15 10:30:00
    "endTime": "220415113000",    # BCD YYMMDDHHmmss = 2022-04-15 11:30:00
    "miles": 12.5,
    "beginLat": -1.2921,
    "beginLng": 36.8219,
    "endLat": -1.3000,
    "endLng": 36.8300,
}

WEBHOOK_TRIP_ISO_PAYLOAD = {
    "deviceImei": "123456789012345",
    "beginTime": "2024-04-12 07:00:00",
    "endTime": "2024-04-12 08:00:00",
    "miles": 15.5,
}

# Webhook /pushobd payload
WEBHOOK_OBD_PAYLOAD = {
    "deviceImei": "123456789012345",
    "obdJson": '{"event_time": 1712908800, "AccState": 1, "statusFlags": 0, "lat": -1.2921, "lng": 36.8219}',
}

# Alarm with NULL alarm_type (BUG-02 guard)
WEBHOOK_ALARM_NULL_TYPE = {
    "deviceImei": "123456789012345",
    "alarmType": None,
    "gateTime": "2024-04-12 07:30:00",
}
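The BCD fixture above encodes `beginTime` as `YYMMDDHHmmss` (so `"220415103000"` means 2022-04-15 10:30:00) while other payloads use ISO strings. The real parser lives in `webhook_receiver_rev`; a hedged sketch of the assumed conversion, with a hypothetical helper name:

```python
from datetime import datetime

def parse_trip_time(raw: str) -> datetime:
    """Accept both timestamp shapes seen in the webhook fixtures."""
    if len(raw) == 12 and raw.isdigit():
        # BCD form: YYMMDDHHmmss
        return datetime.strptime(raw, "%y%m%d%H%M%S")
    # ISO form: YYYY-MM-DD HH:MM:SS
    return datetime.strptime(raw, "%Y-%m-%d %H:%M:%S")

print(parse_trip_time("220415103000"))        # → 2022-04-15 10:30:00
print(parse_trip_time("2024-04-12 07:00:00")) # → 2024-04-12 07:00:00
```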
0 tests/integration/__init__.py (Normal file)
133 tests/integration/test_webhook_endpoints.py (Normal file)

@@ -0,0 +1,133 @@
"""Integration tests for FastAPI webhook endpoints."""
import sys
import os
import json
import pytest
from unittest.mock import MagicMock, patch, call
from contextlib import contextmanager

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(__file__))))

os.environ.setdefault("TRACKSOLID_APP_KEY", "test_key")
os.environ.setdefault("TRACKSOLID_APP_SECRET", "test_secret")
os.environ.setdefault("TRACKSOLID_USER_ID", "test_user")
os.environ.setdefault("TRACKSOLID_PWD_MD5", "test_md5")
os.environ.setdefault("DATABASE_URL", "postgresql://test:test@localhost:5432/test")
os.environ.setdefault("JIMI_WEBHOOK_TOKEN", "")

from fastapi.testclient import TestClient
import webhook_receiver_rev
from tests.fixtures.api_responses import (
    WEBHOOK_ALARM_PAYLOAD,
    WEBHOOK_ALARM_NULL_TYPE,
    WEBHOOK_TRIP_BCD_PAYLOAD,
    WEBHOOK_TRIP_ISO_PAYLOAD,
    WEBHOOK_OBD_PAYLOAD,
)


def make_mock_conn():
    """Create a mock DB connection with cursor support."""
    mock_cur = MagicMock()
    mock_conn = MagicMock()
    mock_conn.cursor.return_value.__enter__ = lambda s: mock_cur
    mock_conn.cursor.return_value.__exit__ = MagicMock(return_value=False)
    return mock_conn, mock_cur


@contextmanager
def mock_get_conn_ctx(mock_conn):
    yield mock_conn


@pytest.fixture
def client():
    return TestClient(webhook_receiver_rev.app, raise_server_exceptions=True)


@pytest.fixture
def mock_db():
    mock_conn, mock_cur = make_mock_conn()
    with patch("webhook_receiver_rev.get_conn") as mock_get_conn:
        mock_get_conn.return_value = mock_get_conn_ctx(mock_conn)
        yield mock_conn, mock_cur


class TestHealth:
    def test_health_returns_ok(self, client):
        response = client.get("/health")
        assert response.status_code == 200
        assert response.json() == {"status": "ok"}


class TestPushAlarm:
    def test_valid_alarm_accepted(self, client, mock_db):
        mock_conn, mock_cur = mock_db
        data_list = json.dumps([WEBHOOK_ALARM_PAYLOAD])
        response = client.post("/pushalarm", data={"token": "", "data_list": data_list})
        assert response.status_code == 200
        assert response.json()["code"] == 0

    def test_null_alarm_type_skipped(self, client, mock_db):
        """BUG-02 guard: NULL alarm_type must be rejected, not inserted."""
        mock_conn, mock_cur = mock_db
        data_list = json.dumps([WEBHOOK_ALARM_NULL_TYPE])
        response = client.post("/pushalarm", data={"token": "", "data_list": data_list})
        assert response.status_code == 200
        # Verify no data INSERT was executed. log_ingestion always writes one
        # row to tracksolid.ingestion_log — exclude it from the assertion.
        data_inserts = [
            c for c in mock_cur.execute.call_args_list
            if "INSERT" in str(c) and "ingestion_log" not in str(c)
        ]
        assert len(data_inserts) == 0, "NULL alarm_type must not be inserted"

    def test_empty_data_list_ok(self, client):
        response = client.post("/pushalarm", data={"token": "", "data_list": ""})
        assert response.status_code == 200

    def test_batch_with_bad_item_processes_rest(self, client, mock_db):
        """BUG-04: One bad item must not abort the entire batch."""
        mock_conn, mock_cur = mock_db
        # One valid, one missing alarm_type (will be skipped, not crash)
        items = [WEBHOOK_ALARM_PAYLOAD, WEBHOOK_ALARM_NULL_TYPE]
        data_list = json.dumps(items)
        response = client.post("/pushalarm", data={"token": "", "data_list": data_list})
        assert response.status_code == 200
        assert response.json()["code"] == 0


class TestPushTripReport:
    def test_bcd_timestamp_parsed(self, client, mock_db):
        """BUG-03: BCD timestamp 220415103000 must be parsed correctly."""
        mock_conn, mock_cur = mock_db
        data_list = json.dumps([WEBHOOK_TRIP_BCD_PAYLOAD])
        response = client.post("/pushtripreport", data={"token": "", "data_list": data_list})
        assert response.status_code == 200
        assert response.json()["code"] == 0
        # Verify an INSERT was attempted
        insert_calls = [c for c in mock_cur.execute.call_args_list
                        if "INSERT" in str(c)]
        assert len(insert_calls) > 0, "Trip with BCD timestamp must trigger INSERT"

    def test_iso_timestamp_accepted(self, client, mock_db):
        mock_conn, mock_cur = mock_db
        data_list = json.dumps([WEBHOOK_TRIP_ISO_PAYLOAD])
        response = client.post("/pushtripreport", data={"token": "", "data_list": data_list})
        assert response.status_code == 200

    def test_missing_imei_skipped(self, client, mock_db):
        mock_conn, mock_cur = mock_db
        bad_trip = {"beginTime": "2024-04-12 07:00:00", "miles": 10.0}
        data_list = json.dumps([bad_trip])
        response = client.post("/pushtripreport", data={"token": "", "data_list": data_list})
        assert response.status_code == 200


class TestPushObd:
    def test_valid_obd_accepted(self, client, mock_db):
        mock_conn, mock_cur = mock_db
        data_list = json.dumps([WEBHOOK_OBD_PAYLOAD])
        response = client.post("/pushobd", data={"token": "", "data_list": data_list})
        assert response.status_code == 200
        assert response.json()["code"] == 0
0 tests/unit/__init__.py Normal file

60 tests/unit/test_api_signing.py Normal file
@@ -0,0 +1,60 @@
"""Unit tests for Tracksolid API MD5 signature generation."""
import sys
import os

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(__file__))))

os.environ.setdefault("TRACKSOLID_APP_KEY", "test_key")
os.environ.setdefault("TRACKSOLID_APP_SECRET", "test_secret")
os.environ.setdefault("TRACKSOLID_USER_ID", "test_user")
os.environ.setdefault("TRACKSOLID_PWD_MD5", "test_md5")
os.environ.setdefault("DATABASE_URL", "postgresql://test:test@localhost:5432/test")

from ts_shared_rev import build_sign


class TestBuildSign:
    def test_basic_signature(self):
        """Known input + secret produces a well-formed MD5 digest."""
        params = {"method": "jimi.test", "app_key": "mykey", "v": "1.0"}
        secret = "mysecret"
        result = build_sign(params, secret)
        # Verify it's a 32-char uppercase hex string
        assert len(result) == 32
        assert result == result.upper()
        assert all(c in "0123456789ABCDEF" for c in result)

    def test_sign_key_excluded(self):
        """The 'sign' key itself must be excluded from signing."""
        params_with = {"method": "test", "sign": "old_sign", "v": "1.0"}
        params_without = {"method": "test", "v": "1.0"}
        secret = "secret"
        assert build_sign(params_with, secret) == build_sign(params_without, secret)

    def test_none_values_excluded(self):
        """Keys with None values are excluded from signing."""
        params_with_none = {"method": "test", "optional": None, "v": "1.0"}
        params_without_none = {"method": "test", "v": "1.0"}
        secret = "secret"
        assert build_sign(params_with_none, secret) == build_sign(params_without_none, secret)

    def test_alphabetical_key_ordering(self):
        """Keys are sorted alphabetically for consistent signing."""
        params_abc = {"a": "1", "b": "2", "c": "3"}
        params_cba = {"c": "3", "b": "2", "a": "1"}
        secret = "secret"
        assert build_sign(params_abc, secret) == build_sign(params_cba, secret)

    def test_different_secrets_produce_different_signs(self):
        params = {"method": "test"}
        assert build_sign(params, "secret1") != build_sign(params, "secret2")

    def test_known_hash(self):
        """Verify against a manually computed hash."""
        import hashlib
        params = {"app_key": "ABC", "method": "test", "v": "1.0"}
        secret = "XYZ"
        sorted_keys = sorted(params.keys())
        raw = secret + "".join(f"{k}{params[k]}" for k in sorted_keys) + secret
        expected = hashlib.md5(raw.encode("utf-8")).hexdigest().upper()
        assert build_sign(params, secret) == expected
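The tests above pin the signing scheme down completely: drop the `sign` key and any `None` values, sort the remaining keys, concatenate `secret + k1v1k2v2… + secret`, and take the uppercase MD5 hex digest. A minimal sketch of a `build_sign` that satisfies these tests — the real `ts_shared_rev.build_sign` body is not shown here, so this is an assumption reconstructed from the assertions:

```python
import hashlib


def build_sign(params: dict, secret: str) -> str:
    """Sketch: Tracksolid-style MD5 signature as the unit tests describe it."""
    # Drop the 'sign' key itself and any None values, then sort the rest.
    usable = {k: v for k, v in params.items() if k != "sign" and v is not None}
    # secret + key1value1key2value2... + secret, MD5, uppercase hex.
    raw = secret + "".join(f"{k}{usable[k]}" for k in sorted(usable)) + secret
    return hashlib.md5(raw.encode("utf-8")).hexdigest().upper()
```

Any implementation with these properties passes the six tests above; the ordering and exclusion rules are what keep client and server signatures in agreement.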
125 tests/unit/test_clean_helpers.py Normal file
@@ -0,0 +1,125 @@
"""Unit tests for ts_shared_rev data cleaning helpers."""
import sys
import os
import pytest

# Add parent directory to path so we can import ts_shared_rev
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(__file__))))

# Set required env vars before import
os.environ.setdefault("TRACKSOLID_APP_KEY", "test_key")
os.environ.setdefault("TRACKSOLID_APP_SECRET", "test_secret")
os.environ.setdefault("TRACKSOLID_USER_ID", "test_user")
os.environ.setdefault("TRACKSOLID_PWD_MD5", "test_md5")
os.environ.setdefault("DATABASE_URL", "postgresql://test:test@localhost:5432/test")

from ts_shared_rev import clean, clean_num, clean_int, clean_ts, is_valid_fix


class TestClean:
    def test_none_returns_none(self):
        assert clean(None) is None

    def test_empty_string_returns_none(self):
        assert clean("") is None

    def test_whitespace_only_returns_none(self):
        assert clean("   ") is None

    def test_normal_string_preserved(self):
        assert clean("hello") == "hello"

    def test_strips_whitespace(self):
        assert clean("  hello  ") == "hello"

    def test_non_string_converted(self):
        assert clean(123) == "123"

    def test_zero_preserved(self):
        assert clean(0) == "0"


class TestCleanNum:
    def test_valid_float_string(self):
        assert clean_num("3.14") == pytest.approx(3.14)

    def test_valid_integer_string(self):
        assert clean_num("42") == pytest.approx(42.0)

    def test_non_numeric_returns_none(self):
        assert clean_num("abc") is None

    def test_none_returns_none(self):
        assert clean_num(None) is None

    def test_empty_string_returns_none(self):
        assert clean_num("") is None

    def test_numeric_value_passthrough(self):
        assert clean_num(45.5) == pytest.approx(45.5)

    def test_negative_value(self):
        assert clean_num("-1.5") == pytest.approx(-1.5)


class TestCleanInt:
    def test_integer_string(self):
        assert clean_int("42") == 42

    def test_float_string_truncates(self):
        assert clean_int("3.9") == 3

    def test_non_numeric_returns_none(self):
        assert clean_int("abc") is None

    def test_none_returns_none(self):
        assert clean_int(None) is None


class TestCleanTs:
    def test_valid_iso_timestamp(self):
        result = clean_ts("2024-04-12 08:00:00")
        assert result == "2024-04-12 08:00:00"

    def test_valid_iso_with_timezone(self):
        result = clean_ts("2024-04-12T08:00:00Z")
        assert result is not None

    def test_garbage_returns_none(self):
        assert clean_ts("not-a-date") is None

    def test_none_returns_none(self):
        assert clean_ts(None) is None

    def test_empty_string_returns_none(self):
        assert clean_ts("") is None

    def test_bcd_format_returns_none(self):
        # BCD format YYMMDDHHmmss is NOT handled by clean_ts (only by _parse_trip_ts)
        assert clean_ts("220415103000") is None


class TestIsValidFix:
    def test_zero_island_filtered(self):
        assert is_valid_fix(0.0, 0.0) is False

    def test_valid_nairobi_coords(self):
        assert is_valid_fix(-1.2921, 36.8219) is True

    def test_none_lat_returns_false(self):
        assert is_valid_fix(None, 36.8219) is False

    def test_none_lng_returns_false(self):
        assert is_valid_fix(-1.2921, None) is False

    def test_out_of_range_lat(self):
        assert is_valid_fix(91.0, 36.8219) is False

    def test_out_of_range_lng(self):
        assert is_valid_fix(-1.2921, 181.0) is False

    def test_valid_extreme_coords(self):
        assert is_valid_fix(90.0, 180.0) is True

    def test_string_coords_accepted(self):
        assert is_valid_fix("-1.2921", "36.8219") is True
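The assertions above fully specify the behaviour of `clean`, `clean_num`, `clean_int`, and `is_valid_fix` (only `clean_ts` is left out here, since its timezone handling is implementation-specific). A minimal sketch of helpers that would pass these tests — reconstructed from the assertions, not the actual `ts_shared_rev` source:

```python
def clean(v):
    """None or blank input → None; everything else → its stripped string form."""
    if v is None:
        return None
    s = str(v).strip()
    return s if s else None  # clean(0) == "0", clean("  ") is None


def clean_num(v):
    """Lenient float parse: blank/garbage/None all collapse to None."""
    s = clean(v)
    if s is None:
        return None
    try:
        return float(s)
    except ValueError:
        return None


def clean_int(v):
    """Integer parse via float, so "3.9" truncates to 3."""
    n = clean_num(v)
    return int(n) if n is not None else None


def is_valid_fix(lat, lng):
    """Reject missing, non-numeric, out-of-range, and (0, 0) coordinates."""
    try:
        lat_f, lng_f = float(lat), float(lng)
    except (TypeError, ValueError):
        return False
    if lat_f == 0.0 and lng_f == 0.0:
        return False  # "null island" — trackers emit (0, 0) before a first fix
    return -90.0 <= lat_f <= 90.0 and -180.0 <= lng_f <= 180.0
```

Note the string-coordinate test: `float()` on the raw value is what lets `is_valid_fix("-1.2921", "36.8219")` pass without a separate conversion step.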
150 tests/unit/test_field_mapping.py Normal file
@@ -0,0 +1,150 @@
"""Unit tests locking in known field mapping fixes (FIX-E06, FIX-M16, BUG-03)."""
import sys
import os
import pytest

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(__file__))))

os.environ.setdefault("TRACKSOLID_APP_KEY", "test_key")
os.environ.setdefault("TRACKSOLID_APP_SECRET", "test_secret")
os.environ.setdefault("TRACKSOLID_USER_ID", "test_user")
os.environ.setdefault("TRACKSOLID_PWD_MD5", "test_md5")
os.environ.setdefault("DATABASE_URL", "postgresql://test:test@localhost:5432/test")

from ts_shared_rev import clean, clean_ts, clean_num
from webhook_receiver_rev import _parse_trip_ts, unix_to_ts


class TestFIXE06AlarmFieldMapping:
    """FIX-E06: Poll alarm endpoint uses alertTypeId/alarmTypeName/alertTime."""

    def test_poll_uses_alert_type_id(self):
        """Alarm poll response must use alertTypeId, not alarmType."""
        api_alarm = {
            "imei": "123456789012345",
            "alertTypeId": "4",          # CORRECT poll field
            "alarmType": "WRONG_FIELD",  # webhook field - should NOT be used for polls
            "alarmTypeName": "Speeding",
            "alertTime": "2024-04-12 07:30:00",
        }
        # FIX-E06: extract using alertTypeId (poll field name)
        alarm_type = clean(api_alarm.get("alertTypeId"))
        assert alarm_type == "4", "Must use alertTypeId not alarmType for poll responses"

    def test_poll_uses_alarm_type_name(self):
        """Alarm name must come from alarmTypeName, not alarmName."""
        api_alarm = {
            "alertTypeId": "4",
            "alarmTypeName": "Speeding",  # CORRECT poll field
            "alarmName": "WRONG_FIELD",   # webhook field
            "alertTime": "2024-04-12 07:30:00",
        }
        alarm_name = clean(api_alarm.get("alarmTypeName"))
        assert alarm_name == "Speeding"

    def test_poll_uses_alert_time(self):
        """Alarm time must come from alertTime, not alarmTime."""
        api_alarm = {
            "alertTypeId": "4",
            "alarmTypeName": "Speeding",
            "alertTime": "2024-04-12 07:30:00",  # CORRECT poll field
            "alarmTime": "WRONG_FIELD",          # webhook field
        }
        alarm_time = clean_ts(api_alarm.get("alertTime"))
        assert alarm_time == "2024-04-12 07:30:00"

    def test_wrong_field_names_return_none(self):
        """Using incorrect webhook field names on poll data returns None (the bug)."""
        api_alarm = {
            "alertTypeId": "4",
            "alarmTypeName": "Speeding",
            "alertTime": "2024-04-12 07:30:00",
        }
        # These are webhook fields — should NOT be present in poll responses
        assert clean(api_alarm.get("alarmType")) is None
        assert clean(api_alarm.get("alarmName")) is None
        assert clean_ts(api_alarm.get("alarmTime")) is None


class TestFIXM16DistanceUnits:
    """FIX-M16: Trip distance arrives in METRES from API, must be stored as km."""

    def test_metres_divided_by_1000(self):
        """15000 metres from API → 15.0 km stored."""
        raw_dist_metres = 15000
        dist_km = round(raw_dist_metres / 1000.0, 4)
        assert dist_km == pytest.approx(15.0)

    def test_small_distance(self):
        """500 metres → 0.5 km."""
        assert round(500 / 1000.0, 4) == pytest.approx(0.5)

    def test_none_distance(self):
        """None distance stays None (no division by zero)."""
        raw_dist = clean_num(None)
        dist_km = round(raw_dist / 1000.0, 4) if raw_dist is not None else None
        assert dist_km is None

    def test_zero_distance(self):
        """0 metres → 0.0 km."""
        raw_dist = clean_num(0)
        dist_km = round(raw_dist / 1000.0, 4) if raw_dist is not None else None
        assert dist_km == pytest.approx(0.0)

    def test_non_divided_would_be_wrong(self):
        """Verify that NOT dividing produces obviously wrong km values."""
        raw_dist_metres = 15000
        # Without fix: storing raw value as km
        wrong_km = raw_dist_metres
        # With fix: correct km
        correct_km = raw_dist_metres / 1000.0
        assert wrong_km == 15000  # Would mean a 15,000 km trip — clearly wrong
        assert correct_km == 15.0


class TestBUG03TripTimestamps:
    """BUG-03: Trip timestamps may be BCD format YYMMDDHHmmss or ISO string."""

    def test_bcd_12_char_format(self):
        """220415103000 → 2022-04-15 10:30:00."""
        result = _parse_trip_ts("220415103000")
        assert result == "2022-04-15 10:30:00"

    def test_bcd_14_char_format(self):
        """20220415103000 → 2022-04-15 10:30:00."""
        result = _parse_trip_ts("20220415103000")
        assert result == "2022-04-15 10:30:00"

    def test_iso_string_passthrough(self):
        """ISO string passes through unchanged."""
        result = _parse_trip_ts("2024-04-12 08:00:00")
        assert result == "2024-04-12 08:00:00"

    def test_none_returns_none(self):
        assert _parse_trip_ts(None) is None

    def test_garbage_returns_none(self):
        assert _parse_trip_ts("not-a-timestamp") is None

    def test_bcd_year_20xx(self):
        """24 prefix → 2024-xx-xx."""
        result = _parse_trip_ts("240412080000")
        assert result is not None
        assert result.startswith("2024-04-12")


class TestUnixToTs:
    """BUG-01: OBD event_time may be Unix epoch (seconds or milliseconds)."""

    def test_unix_seconds(self):
        result = unix_to_ts(1712908800)
        assert result is not None
        assert "2024" in result

    def test_unix_milliseconds(self):
        result = unix_to_ts(1712908800000)  # ms — should be divided by 1000
        assert result is not None
        assert "2024" in result

    def test_unix_seconds_matches_milliseconds(self):
        """Seconds and milliseconds of same moment produce same result."""
        assert unix_to_ts(1712908800) == unix_to_ts(1712908800000)

    def test_none_returns_none(self):
        assert unix_to_ts(None) is None
@@ -29,6 +29,7 @@ REVISIONS (QA-Verified):
 
 from __future__ import annotations
 
+import hmac
 import json
 import os
 import time
@@ -36,8 +37,13 @@ from contextlib import asynccontextmanager
 from datetime import datetime, timezone
 from typing import Optional
 
+# Cap on items per webhook POST. Prevents a malformed/malicious push from
+# monopolising a worker or blowing the DB pool. Jimi normally sends ≤ 200.
+MAX_ITEMS_PER_POST = int(os.getenv("WEBHOOK_MAX_ITEMS", "5000"))
+
 from fastapi import FastAPI, Form, HTTPException
 from fastapi.responses import JSONResponse
+from psycopg2.extras import execute_values
 
 from ts_shared_rev import (
     close_pool,
@@ -75,7 +81,7 @@ SUCCESS = {"code": 0, "msg": "success"}
 
 def _validate_token(token: str) -> None:
     """Raise 403 if token is invalid. Skips validation if JIMI_WEBHOOK_TOKEN is empty."""
-    if WEBHOOK_TOKEN and token != WEBHOOK_TOKEN:
+    if WEBHOOK_TOKEN and not hmac.compare_digest(token, WEBHOOK_TOKEN):
         raise HTTPException(status_code=403, detail="Invalid token")
 
 
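The hunk above replaces a plain `!=` token check with `hmac.compare_digest`, which compares in time independent of where the first mismatching character sits, so an attacker cannot recover the token byte-by-byte from response latency. A standalone illustration (the `token_ok` name is hypothetical, not part of the receiver):

```python
import hmac


def token_ok(supplied: str, expected: str) -> bool:
    # A plain `supplied != expected` returns as soon as one character differs,
    # leaking the length of the matching prefix through timing. compare_digest
    # always examines the full inputs before answering.
    return hmac.compare_digest(supplied, expected)
```

`hmac.compare_digest` accepts either two ASCII `str` values or two bytes-like values, which is why the webhook token can be compared without an explicit encode step.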
@@ -83,9 +89,12 @@ def _parse_data_list(raw: str) -> list[dict]:
     """Parse the JSON string from Jimi's data_list form field."""
     try:
         parsed = json.loads(raw)
-        if isinstance(parsed, list):
-            return parsed
-        return [parsed]
+        items = parsed if isinstance(parsed, list) else [parsed]
+        if len(items) > MAX_ITEMS_PER_POST:
+            log.warning("data_list truncated: %d items exceeded cap of %d",
+                        len(items), MAX_ITEMS_PER_POST)
+            items = items[:MAX_ITEMS_PER_POST]
+        return items
     except (json.JSONDecodeError, TypeError):
         log.warning("Failed to parse data_list: %.200s", raw)
         return []
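The capped parser above has three behaviours worth keeping in mind: a single JSON object is wrapped into a one-element list, anything unparseable becomes `[]` rather than an error, and oversized batches are truncated with a warning. A self-contained sketch of the same logic (function name and default cap are illustrative, not the receiver's):

```python
import json
import logging

log = logging.getLogger("webhook-sketch")


def parse_data_list_capped(raw, cap: int = 5000) -> list:
    """Parse a Jimi-style data_list: wrap single objects, cap batch size."""
    try:
        parsed = json.loads(raw)
    except (json.JSONDecodeError, TypeError):
        # None or malformed JSON: swallow and return an empty batch,
        # so one bad POST never 500s the webhook.
        log.warning("Failed to parse data_list: %.200s", raw)
        return []
    items = parsed if isinstance(parsed, list) else [parsed]
    if len(items) > cap:
        log.warning("data_list truncated: %d items exceeded cap of %d", len(items), cap)
        items = items[:cap]
    return items
```

Returning `[]` instead of raising matches the receiver's contract with Jimi: the endpoint must always answer `{"code": 0}` so the platform does not retry the same malformed payload forever.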
@@ -341,34 +350,19 @@ def push_gps(token: str = Form(""), data_list: str = Form("")):
         return JSONResponse(content=SUCCESS)
 
     t0 = time.time()
-    inserted = 0
-
-    with get_conn() as conn:
-        with conn.cursor() as cur:
+    # Validation phase — pre-clean and filter without touching the DB.
+    # Per-row INSERT with SAVEPOINT was ~1 ms/row overhead at this volume;
+    # one batched execute_values is 10-50× faster for the same rows.
+    rows = []
     for item in items:
         try:
-            cur.execute("SAVEPOINT sp")
             imei = clean(item.get("deviceImei"))
             gps_time = clean_ts(item.get("gpsTime"))
             lat = clean_num(item.get("lat"))
             lng = clean_num(item.get("lng"))
 
             if not imei or not gps_time or not is_valid_fix(lat, lng):
-                cur.execute("RELEASE SAVEPOINT sp")
                 continue
 
-            cur.execute("""
-                INSERT INTO tracksolid.position_history (
-                    imei, gps_time, geom, lat, lng, speed, direction,
-                    acc_status, satellite, current_mileage,
-                    altitude, post_type, source
-                ) VALUES (
-                    %s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326),
-                    %s, %s, %s, %s, %s, %s, %s, %s, %s, 'push'
-                ) ON CONFLICT (imei, gps_time) DO NOTHING
-            """, (
-                imei, gps_time, lng, lat,
-                lat, lng,
+            rows.append((
+                imei, gps_time, lng, lat, lat, lng,
                 clean_num(item.get("gpsSpeed")),
                 clean_num(item.get("direction")),
                 str(item.get("acc")) if item.get("acc") is not None else None,
@@ -377,16 +371,37 @@ def push_gps(token: str = Form(""), data_list: str = Form("")):
                 clean_num(item.get("altitude")),
                 clean_int(item.get("postType")),
             ))
-            cur.execute("RELEASE SAVEPOINT sp")
-            inserted += 1
         except Exception:
-            cur.execute("ROLLBACK TO SAVEPOINT sp")
             log.warning("Failed to process GPS for %s", item.get("deviceImei"), exc_info=True)
 
+    inserted = 0
+    if rows:
+        with get_conn() as conn:
+            with conn.cursor() as cur:
+                execute_values(
+                    cur,
+                    """
+                    INSERT INTO tracksolid.position_history (
+                        imei, gps_time, geom, lat, lng, speed, direction,
+                        acc_status, satellite, current_mileage,
+                        altitude, post_type, source
+                    ) VALUES %s
+                    ON CONFLICT (imei, gps_time) DO NOTHING
+                    """,
+                    rows,
+                    template="(%s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326),"
+                             " %s, %s, %s, %s, %s, %s, %s, %s, %s, 'push')",
+                    page_size=len(rows),
+                )
+                inserted = cur.rowcount
+                log_ingestion(cur, "webhook/pushgps", len(items), 0, inserted,
+                              int((time.time() - t0) * 1000), True)
+    else:
+        # No valid rows, still record the call for observability.
+        with get_conn() as conn:
+            with conn.cursor() as cur:
+                log_ingestion(cur, "webhook/pushgps", len(items), 0, 0,
+                              int((time.time() - t0) * 1000), True)
 
-    log.info("pushgps: %d/%d items processed.", inserted, len(items))
+    log.info("pushgps: %d/%d items inserted.", inserted, len(items))
     return JSONResponse(content=SUCCESS)
 
 # ── 5. Device Heartbeats (Priority 2) ────────────────────────────────────────