Schedule Management System

TOCA · Web · Offline · 2022 · Fullstack Developer

Languages

JavaScript

Frontend

Lottie · React · SCSS

Infrastructure

AWS · AWS Lambda · AWS S3 · CloudWatch · GitHub Actions · Google Analytics · Pingdom

State & Data

Axios · Prop Types · Redux · Yup

Build & Tooling

Babel · Webpack · Yarn

Testing

Jest

Quality & Linting

ESLint · Prettier

Summary

The Schedule Management System is an operations tool for TOCA Social venue managers. It lets them design, configure, and deploy scheduling templates that control the availability and pricing of gaming boxes across venues. Through a visual timeline, managers create reusable patterns that apply operating hours, booking durations, and dynamic pricing tiers based on configurable recurrence rules. The system was built for TOCA Social's launch: research into off-the-shelf options such as SevenRooms showed they fell short of our requirements, so we built a bespoke system as part of the suite that also includes the Reception Desk, Playmaker Social, and Fixtures. Venue managers are the only users: they control which slots are available to book in their venue, factoring in maintenance, inaccessible areas, and upgrades.

Role/Contribution

I worked with venue operations and finance teams to understand pricing models and scheduling requirements. I defined the template and schedule domain model, coordinated API contracts with the backend teams, and managed sprint planning. On the development side, I designed the Redux store structure with redux-undo for undo/redo, built the API integration layer for the schedules, box slots, pricing, and recurrence services, and developed the CSS Grid-based timeline visualisation. I'm proud of how well this fit TOCA's aesthetic at the time. Despite its complexity, it was viewed favourably and rarely had issues in production, thanks to extensive testing before deployment.
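The undo/redo part of that store can be sketched as a wrapper reducer keeping past/present/future stacks, which is the shape redux-undo manages internally. This is a simplified illustration of the idea, not the library's actual implementation (the `undoable` name and `slots` reducer here are illustrative):

```javascript
// Simplified undo-history wrapper, redux-undo style (illustrative, not the real API).
function undoable(reducer, { limit = 10 } = {}) {
  const initial = { past: [], present: reducer(undefined, {}), future: [] };
  return function (state = initial, action) {
    const { past, present, future } = state;
    switch (action.type) {
      case 'UNDO': {
        if (past.length === 0) return state;
        return {
          past: past.slice(0, -1),
          present: past[past.length - 1],
          future: [present, ...future],
        };
      }
      case 'REDO': {
        if (future.length === 0) return state;
        const [next, ...rest] = future;
        return { past: [...past, present], present: next, future: rest };
      }
      default: {
        const next = reducer(present, action);
        if (next === present) return state; // nothing changed; no history entry
        // Cap the history at `limit` entries (we used a 10-step budget).
        return { past: [...past, present].slice(-limit), present: next, future: [] };
      }
    }
  };
}

// Hypothetical slice reducer for box slots, just to exercise the wrapper.
const slots = (state = [], action) =>
  action.type === 'ADD_SLOT' ? [...state, action.slot] : state;

const history = undoable(slots, { limit: 10 });
```

In the real store, each synced change pushed a history entry, so a manager could step back before the change propagated.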

Architecture

The core interface displays all boxes as rows and time slots as columns. Drag-and-drop allows multi-select copying with collision detection. Redux was the best fit for the scale and complexity of the data. Redux-undo provides up to 10 steps of history, synchronising with the API on each change. We added it after feedback that managers sometimes saved too early or accidentally published when they shouldn't; the wait period lets them undo an action before it propagates. Linked box slots (spanning two adjacent boxes for larger groups) are handled atomically. The system supports weekly, monthly, and yearly recurrence patterns with intelligent handling of edge cases such as leap years and varying month lengths. Times are stored in UTC; DST conversion is handled elsewhere, at the database layer.
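The month-length and leap-year edge cases amount to clamping the anchored day when rolling a recurrence forward. A minimal sketch of that logic, working purely in UTC (function names are illustrative, not the production code):

```javascript
// Last day of a given month, computed in UTC: day 0 of the following month.
function daysInMonth(year, monthIndex) {
  return new Date(Date.UTC(year, monthIndex + 1, 0)).getUTCDate();
}

// Roll a monthly recurrence forward one month, clamping the day so a slot
// anchored to e.g. the 31st lands on Feb 28 (or Feb 29 in a leap year).
function nextMonthlyOccurrence(anchor) {
  const year = anchor.getUTCFullYear();
  const month = anchor.getUTCMonth() + 1; // may overflow into next year; Date.UTC handles it
  const day = Math.min(anchor.getUTCDate(), daysInMonth(year, month));
  return new Date(Date.UTC(year, month, day,
    anchor.getUTCHours(), anchor.getUTCMinutes()));
}
```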

A key feature is release scheduling: timetables can be created a year or two in advance, but only six months are released to the public at a time. A monthly job releases the new slots for the month six months ahead. Conflict detection runs periodically to catch overlapping bookings. Because the schedule can be altered elsewhere, race conditions can occur, so the UI surfaces visual inconsistencies when a sync reveals a conflict: you can see who booked the conflicting slot and resolve it. The collision detection scripts run in the background, so no one has to watch the screen constantly.
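At its core, the overlap check is an interval intersection per box. A simplified sketch of the kind of pass those background scripts perform (field names are illustrative):

```javascript
// Two slots conflict when they share a box and their time ranges intersect.
// Using half-open ranges [start, end) so back-to-back slots do not collide.
function slotsConflict(a, b) {
  return a.boxId === b.boxId && a.start < b.end && b.start < a.end;
}

// Return index pairs of every conflicting slot pair (O(n²); fine for a
// single day's slots, a real sweep would sort by start time first).
function findConflicts(slots) {
  const conflicts = [];
  for (let i = 0; i < slots.length; i++) {
    for (let j = i + 1; j < slots.length; j++) {
      if (slotsConflict(slots[i], slots[j])) conflicts.push([i, j]);
    }
  }
  return conflicts;
}
```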

Performance

The Schedule Management System faced many of the same performance challenges as the Reception Desk. The grid shows 18+ boxes across a full day in 5-minute intervals. Every cell could change independently, so we couldn't afford to re-render the lot. We used React.memo so cells only re-rendered when their data changed, useCallback for stable function references, and memoisation across key components. We chunked data from the backend to avoid flooding the UI.
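The per-cell memoisation boils down to a shallow props comparison: a cell only recomputes when its own props change. React.memo does this for components; the same idea is shown below as a plain function so the mechanism is visible (names are illustrative, not our component code):

```javascript
// Shallow equality over props, matching React.memo's default comparison.
function shallowEqual(a, b) {
  const keysA = Object.keys(a);
  if (keysA.length !== Object.keys(b).length) return false;
  return keysA.every((k) => a[k] === b[k]);
}

// Memoise a render function: skip re-rendering when props are shallow-equal.
function memoCell(render) {
  let lastProps = null;
  let lastResult = null;
  return (props) => {
    if (lastProps && shallowEqual(lastProps, props)) return lastResult;
    lastProps = props;
    lastResult = render(props);
    return lastResult;
  };
}
```

With 18+ boxes at 5-minute resolution, skipping unchanged cells is what keeps a single edit from re-rendering the whole grid.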

Testing

Jest covered the main functionality: creating new schedules, scheduling their release, and verifying that specific classes were added to elements so animations fired when expected. Tests checked that cells updated correctly when pasting data. A lot of manual QA went into this because we had a strong QA team at the time. No Storybook or visual testing.
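The paste behaviour those tests covered reduces to copying a block of slot values into target cells and rejecting the whole paste on a collision. A simplified, self-contained sketch of that logic (all names hypothetical), with plain assertions standing in for Jest's `expect`:

```javascript
// Paste copied slot values into a grid at (targetRow, targetCol).
// Returns a new grid, or null if any target cell is out of bounds or occupied.
function pasteSlots(grid, copied, targetRow, targetCol) {
  const next = grid.map((row) => row.slice()); // don't mutate the original
  for (const { rowOffset, colOffset, value } of copied) {
    const r = targetRow + rowOffset;
    const c = targetCol + colOffset;
    if (next[r] === undefined || next[r][c] === undefined) return null; // out of bounds
    if (next[r][c] !== null) return null; // collision: reject the whole paste
    next[r][c] = value;
  }
  return next;
}
```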

Observability

CloudWatch and Pingdom monitored uptime. Google Analytics tracked events and funnels: maintenance blocks added, whether they were modified after the fact, whether maintenance took longer than planned, and how that impacted other bookings. Custom dashboards were used internally.

CI/CD

Deployed via GitHub Actions to AWS. The pipeline ran linting and checks before deployment. Like everything else at TOCA at the time, it was serverless and hosted on AWS.

Security

Access required an admin-level account. Accounts were controlled through a Google organisation and linked to IAM roles on AWS. Only a select few people had access. A WAF sat in front of the application.

Accessibility

Accessibility wasn't a major focus: the tool was niche and other areas needed more attention. It did, however, meet the WCAG 2.1 standards of the time.

Outcome/Impact

The platform sped up the creation and management of schedules at TOCA. The biggest win was launching new venues quickly in a white-labelled way. Configuration made it easy to swap out venue ID and location, set who should have access for that venue, and adapt endpoints so the app could fetch data for that location only.
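That white-labelling can be pictured as a small per-venue configuration driving endpoint construction. This is a hypothetical sketch of the shape described above; the IDs, URL, and field names are placeholders, not the real values:

```javascript
// Hypothetical white-label venue configuration (all values are placeholders).
const venueConfig = {
  venueId: 'venue-london-o2',          // illustrative ID
  location: 'London',
  apiBase: 'https://api.example.com',  // placeholder, not the real endpoint
  allowedRoles: ['venue-manager'],     // who gets access for this venue
};

// Scope API calls to a single venue so the app only fetches that location's data.
function scheduleEndpoint(config, date) {
  return `${config.apiBase}/venues/${config.venueId}/schedules?date=${date}`;
}
```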