Reception Desk

TOCA · Web · Offline · 2021 · Fullstack Developer

Languages

JavaScript

Frontend

Lottie, React, SCSS

Infrastructure

AWS, AWS Lambda, AWS S3

State & Data

Axios, PropTypes, Redux, Yup

Build & Tooling

Babel, Webpack, Yarn

Testing

Jest, Testing Library

Quality & Linting

ESLint

Auth & Integrations

Auth0, HubSpot, NetSuite, Snowflake, Stripe

Summary

The Reception Desk solves the problem of managing bookings within a TOCA Social venue. The Schedule Management System defines and releases schedules, which become available to book on the TOCA Social website. The Reception Desk then picks up those bookings alongside the Kiosk System. Staff use it to modify game metadata (phone number for SMS via AWS SNS, player count, notes that sync to Playmaker Social), extend or reduce bookings, delete bookings, issue refunds via Stripe, and check people in. Once a booking starts, it hands over to Playmaker Social. We built it because off-the-shelf options like SevenRooms didn't offer 5-minute intervals (their shortest was 15 minutes) or the level of control we needed around peak times and multiple pricing tiers.

Problem/Context

Before this, there was no unified place for booking information. Staff jumped between spreadsheets, paper, Snowflake, HubSpot, and other systems. Once everything was consolidated into a single screen, performance became the main pain: so much was happening at once that keeping the app responsive was a constant focus.

Role/Contribution

I was the sole developer: I built everything and we launched. After that I was promoted to Information Systems Development Manager, hired a team for future updates, and orchestrated the work from a project management perspective. I kept ownership of pull requests and still contributed to the codebase when it needed attention. I'm proud of the drag-and-drop: it required heavy optimisation because cells animate as you drag over them to show where a booking would drop. requestAnimationFrame helped defer those animations so they didn't clog the event queue. Optimistic updates were another win: we'd update the UI immediately with the expected result, and if anything went wrong, we'd correct it as soon as we knew.
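The requestAnimationFrame technique can be sketched as a small throttle: bursts of dragover events are coalesced so at most one animation update runs per frame, with only the latest target winning. This is a minimal illustration, not the app's actual code; the `schedule` parameter is an assumption added so the helper can be exercised outside a browser.

```javascript
// Coalesce rapid event bursts (e.g. dragover across grid cells) into at
// most one call to `fn` per animation frame. `schedule` defaults to the
// browser's requestAnimationFrame and is injectable for testing.
function rafThrottle(fn, schedule = globalThis.requestAnimationFrame) {
  let queuedArgs = null; // non-null means a frame callback is already pending
  return (...args) => {
    const firstInFrame = queuedArgs === null;
    queuedArgs = args; // later calls in the same frame overwrite earlier ones
    if (firstInFrame) {
      schedule(() => {
        const latest = queuedArgs;
        queuedArgs = null;
        fn(...latest); // run once per frame with the most recent arguments
      });
    }
  };
}
```

Wrapping the drop-target highlight handler this way keeps the event queue clear: however many dragover events fire between frames, only one animation update is performed.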

Performance

Performance was critical. The main dashboard showed a 24-hour day split into 5-minute intervals across 18 boxes. That's 5,184 cells (288 slots × 18 boxes) on the grid alone, plus sidebar queues, new bookings, and maintenance blocks that could be added per box or synced from external services. Every cell could change independently. If one updated, we couldn't afford to re-render the lot. We used React.memo so cells only re-rendered when their data changed, useCallback for stable function references, and memoisation across key components. We chunked data from the backend to avoid flooding the UI. A heartbeat pinged every second to maintain uptime, but we had to keep the event queue clear so incoming updates didn't stall. The app also had to support touchscreens: staff sometimes used a touchscreen instead of keyboard and mouse, so we handled mouse, keyboard, and touch events including drag-and-drop on touch.
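The selective re-rendering approach can be illustrated with the kind of props comparator React.memo accepts as its second argument (as in `React.memo(Cell, cellPropsAreEqual)`). The prop names here are assumptions for illustration, not the real component's API.

```javascript
// Hypothetical comparator for a memoised grid cell: return true to SKIP
// re-rendering. Only the data this cell actually displays is compared;
// callbacks are kept identity-stable via useCallback and not compared here.
function cellPropsAreEqual(prev, next) {
  return (
    prev.bookingId === next.bookingId &&
    prev.status === next.status &&
    prev.isDropTarget === next.isDropTarget
  );
}
```

With 5,184 cells on screen, an update to one booking changes the props of a handful of cells, so only those re-render while the rest bail out in the comparator.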

Accessibility

The interface was built to WCAG 2.1.

Observability

CloudWatch and Pingdom monitored availability. Google Analytics tracked events. A lot of sales and booking data lived elsewhere: NetSuite, HubSpot, Snowflake. The sync system talked to those services.

Architecture

The timeline used CSS Grid for layout. Redux managed state at that scale, with multiple stores for different data types. API contracts connected the frontend to four backend microservices (Booking, Payment, Session, Audit). The system polled every 60 seconds to keep data current across terminals, skipping updates while modal dialogs were open to avoid overwriting in-progress edits. I worked mostly on the frontend and the contracts; I coordinated backend work from a project management angle rather than implementing it directly.
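The polling behaviour can be sketched in a few lines. All names here (`startPolling`, `fetchBookings`, `isModalOpen`, `onData`) are illustrative assumptions, not the real module's API; the point is the modal guard inside the tick.

```javascript
// Minimal sketch of the 60-second refresh loop. A tick is skipped while
// any modal is open so a staff member's in-progress edit isn't clobbered
// by fresh server data; the next tick catches up once the modal closes.
function startPolling({ fetchBookings, isModalOpen, onData, intervalMs = 60_000 }) {
  const tick = async () => {
    if (isModalOpen()) return;      // guard: never overwrite an open edit
    onData(await fetchBookings());  // otherwise refresh from the backend
  };
  const id = setInterval(tick, intervalMs);
  return () => clearInterval(id);   // returned function stops the loop
}
```

Because the guard sits inside the tick rather than pausing the timer, the loop needs no restart logic: the cadence is unchanged and data simply resumes flowing on the first tick after the modal closes.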

Testing

Jest and Testing Library covered the data side: receiving data, verifying it, and displaying it correctly. Tests checked how changes in the data affected the UI. For example, a double-box booking had to appear in both box rows with one offset by the right interval. A normal booking had to span the correct number of interval slots. Maintenance blocks had to span the right cells and show as red blockers. We mocked Snowflake and HubSpot responses and tested malformed data so we had peace of mind against bad input and basic security issues. We also used page speed checks and Redux DevTools.
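A pure helper of the kind those tests exercised can be sketched as follows. The function and field names are assumptions for illustration; it maps a booking to the grid placements it should occupy in the 5-minute timeline, including the double-box case described above.

```javascript
const SLOT_MINUTES = 5; // grid resolution: one cell per 5-minute slot

// Map a booking to the grid cells it should occupy. A double-box booking
// appears in both box rows, the second row offset by a fixed slot count.
function bookingToGridPlacements(booking) {
  const startSlot = Math.floor(booking.startMinutes / SLOT_MINUTES);
  const slotCount = Math.ceil(booking.durationMinutes / SLOT_MINUTES);
  const boxes = booking.doubleBox
    ? [booking.box, booking.box + 1]
    : [booking.box];
  return boxes.map((box, i) => ({
    box,
    startSlot: startSlot + i * (booking.offsetSlots ?? 0),
    slotCount,
  }));
}
```

Keeping this logic pure made the Jest side straightforward: assert on the returned placements, then let Testing Library checks confirm those placements render as the right cells (or red maintenance blockers) in the DOM.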

Security

Auth0 handled authentication with role-based permissions. Admins had access to extra controls and could override certain actions without a password prompt; restricting those overrides to admin roles acted as a safeguard. Reception staff had a read-focused view with limited editing. Every change was logged with the logged-in user and a timestamp, so we could trace issues back and pair logs with CCTV footage. For an internal tool, we followed standard compliance practice.

Outcome/Impact

Processing time dropped because staff no longer had to switch between systems to find or update a booking. Before, extending a booking meant editing the database directly, which wasn't accessible to the people who needed it. The app reduced booking processing from 3–4 minutes to under 60 seconds and eliminated double-booking incidents through real-time conflict detection. Support tickets were often about network connectivity at the venue rather than the app itself. We monitored logs for API issues and crashes so we could act fast.