A football trading system V2 powered by Go.

Back in 2021 I published a side project that I had started during the pandemic, called Stock The Ball. I was playing Fantasy Premier League with friends back then. I had a very good first season, but I could not help thinking how unfair this game can be: it turns a team game into an individualist one. Players supposedly score for their own performance, but that is not entirely true. If your team loses by 10 goals, your striker still gets 2 points for playing 90', whereas the defender probably gets a big negative value. That's not fair; in a team game every player matters. Don't get me wrong, I love the Fantasy! But it made me start a side project. The idea was simple: consume events from every game in near real time and process them, identifying players and event types. Events increase or decrease each player's score, and my friends can buy or sell their assets (football players) within the 90'. The solution was nasty, quick and incomplete, to be honest. But I enjoyed the process.

My very bad performance in this FPL season pushed me to improve that game. But this time I wanted to learn a new technology (I had used Python, Grafana and InfluxDB back then): Golang. The new project is called Anfield, and in this post I am going to share what I did, what I learned and what's next. Regardless of the solution, implementation and architecture, the goal remains, and it is just one: learn Go. That is: coding, dependency integration (databases, external and public APIs, etc.).

Overview


All the code in the project is Go. Anfield consumes near-real-time live commentary from the web (there are several public websites providing this). The Processor connects to the site and scrapes the HTML page using the Rod driver. It then sends the normalised data to a Kafka topic. The Loader consumes from this topic and inserts the asset updates and the match information (the latter once, at the beginning). Users start the Anfield bot, and they can buy or sell assets within the 90' of the game. In that period an asset's score may go up or down based on a set of rules applied to each event. For example, a Goal rule would give X points, and so on.

The list of rules is a work in progress. Right now it is just a basic, hardcoded ruleset (IF conditions) for the sake of practicality. The main elements in this monorepo are: a Processor, a Loader and a Bot.
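As a sketch, a hardcoded ruleset can be as simple as a switch over keywords in the commentary text. The keywords and point values below are illustrative assumptions, not the project's actual rules:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// goalRe tolerates the elongated "Goooal!" spelling used by live commentary.
var goalRe = regexp.MustCompile(`go+al`)

// scoreEvent returns the score delta for one commentary line.
// Keywords and point values here are illustrative, not the real ruleset.
func scoreEvent(commentary string) int {
	c := strings.ToLower(commentary)
	switch {
	case goalRe.MatchString(c):
		return 10
	case strings.Contains(c, "yellow card"):
		return -8
	case strings.Contains(c, "red card"):
		return -15
	default:
		return 0
	}
}

func main() {
	fmt.Println(scoreEvent("Gooooooooal! Mohamed Salah..."))                    // 10
	fmt.Println(scoreEvent("The referee shows Mohamed Salah the yellow card.")) // -8
}
```

A rule-per-struct design (pattern + delta) would be the obvious next step once the ruleset stops being hardcoded.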

Data model

The database is MongoDB, with four collections:

  • Users: to store every user who starts the Anfield bot.
  • Matches: to store event metadata (url, date), lineups, and data (timed commentary) for each match.
  • Assets: to store each player's score and the last updated time. The score is the result of applying the ruleset to every match commentary and summing the partial scores, so the current score can be recalculated at any point in time if needed.
  • Transactions: to store buy/sell operations by users. This is immutable data.

🛠 The Processor

This module acts as a web scraper, collecting the raw data and producing it into a Kafka topic. It can read from multiple matches, pre-defined by configuration. What this component does is:

  1. Get the list of URLs by event or match (e.g. https://some-fpl-commentary.com/liverpool-vs-everton)
  2. Iterate over the list above and proceed only if the match is not finished (a boolean flag in the DB event metadata)
  3. A Producer component publishes into 3 channels: commentary, match date and lineups. This happens in 3 goroutines.
  4. A fourth goroutine consumes those channels. It is in charge of consolidating the information, building the DTO (metadata + data) for every match and sending it to the Kafka topic.
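The producer/consumer fan-in in steps 3–4 can be sketched like this (the channel payloads and DTO field names are illustrative, not the project's real ones):

```go
package main

import "fmt"

// MatchDTO consolidates metadata and data before publishing to Kafka.
// Field names are illustrative.
type MatchDTO struct {
	Date       string
	Lineups    []string
	Commentary []string
}

// buildDTO spawns three producer goroutines (one per scraped section)
// and consumes their channels to consolidate a single DTO, mirroring
// the Processor's fan-in step.
func buildDTO() MatchDTO {
	commentaryCh := make(chan string)
	dateCh := make(chan string)
	lineupsCh := make(chan []string)

	go func() { dateCh <- "2024-03-01" }()
	go func() { lineupsCh <- []string{"Salah", "Van Dijk"} }()
	go func() {
		commentaryCh <- "Kick off!"
		commentaryCh <- "Gooooooooal! Mohamed Salah..."
		close(commentaryCh)
	}()

	dto := MatchDTO{Date: <-dateCh, Lineups: <-lineupsCh}
	for c := range commentaryCh {
		dto.Commentary = append(dto.Commentary, c)
	}
	return dto
}

func main() {
	// In the real pipeline the DTO would now be serialised and sent to Kafka.
	fmt.Printf("%+v\n", buildDTO())
}
```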

A better design would probably use one topic per message type, but that does not matter when it comes to the main goal.

📼 The Loader

This module consumes every message in the Kafka topic produced by the Processor and upserts a document in either the Matches or the Assets collection. In Matches it updates every commentary; in Assets it updates the current score for a given player.
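The upsert semantics on Assets can be mimicked with an in-memory map; the real project does this with a MongoDB upsert, and `applyUpdate` is a hypothetical helper for this sketch:

```go
package main

import "fmt"

// applyUpdate mimics the Loader's upsert on the Assets collection:
// if the player already exists, the delta is added to its score;
// otherwise a new entry is created. The real project uses a MongoDB
// upsert instead of an in-memory map.
func applyUpdate(assets map[string]int, player string, delta int) {
	assets[player] += delta // a missing key reads as 0, so this also inserts
}

func main() {
	assets := map[string]int{}
	applyUpdate(assets, "Mohamed Salah", 6)  // insert
	applyUpdate(assets, "Mohamed Salah", -8) // update
	fmt.Println(assets["Mohamed Salah"])     // -2
}
```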

⏱ Cut-off

When should Processor/Loader stop processing?

Well, conceptually every match lasts 90' plus extra time (~5'), so that is our flag. The Processor applies pattern matching to identify it; once detected, a grace period of X minutes keeps processing a few more events, just in case. After that period the Processor sends a message with an end flag, so the consumer goroutine knows when it should finish. The consumer goroutine then sends an end flag to the Kafka topic to tell the Loader to update the Matches collection with finished=true.
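The full-time detection could be a simple regular expression over the commentary text. The exact wording depends on the site being scraped, so the pattern below is an assumption:

```go
package main

import (
	"fmt"
	"regexp"
)

// fullTimeRe is an illustrative pattern; the real flag depends on the
// wording of the commentary site being scraped.
var fullTimeRe = regexp.MustCompile(`(?i)full[- ]time|match (has )?ended`)

// isFullTime reports whether a commentary line signals the end of the match.
func isFullTime(commentary string) bool {
	return fullTimeRe.MatchString(commentary)
}

func main() {
	fmt.Println(isFullTime("Peep! Peep! Peep! It's full-time at Anfield")) // true
	fmt.Println(isFullTime("Corner for Liverpool"))                       // false
}
```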

🤖 The Bot

This module implements a Telegram bot acting as the interface between users and the trading system. A new user starts with a budget of X (this could represent money, any specific unit or just a plain number).

The available actions are:

  • BUY: users can buy shares of assets within 90' of the match.

Example: the following timeline for the player Salah from Liverpool:

  1. "Mohamed Salah from Liverpool passes the ball in the box, but it's intercepted by an opponent player"
    📈 ===> [ score = 6 ]
  2. "Ball possession: Liverpool: 76%, Manchester United: 24%."
    📈 ===> [ score = 8 ]
  3. "The referee shows Mohamed Salah the yellow card for unsportsmanlike conduct."
    📉 ===> [ score = -8 ]
  4. "Danger! Mohamed Salah from Liverpool successfully directs the ball behind the defence and finds a team mate"
    📈 ===> [ score = 10 ]
  5. "Gooooooooal! Mohamed Salah..."
    📈 ===> [ score = 10 ]

💵 Partial Score = 26
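The partial score is just the sum of the per-event deltas in the timeline above; as a quick check:

```go
package main

import "fmt"

// partialScore sums the per-event score deltas of a timeline.
func partialScore(deltas []int) int {
	total := 0
	for _, d := range deltas {
		total += d
	}
	return total
}

func main() {
	// The five deltas from the Salah timeline above.
	fmt.Println(partialScore([]int{6, 8, -8, 10, 10})) // 26
}
```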

  • SELL: users can sell their shares back to Anfield.

This feature has not been entirely implemented yet.
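A minimal sketch of the budget check a BUY (and, symmetrically, a SELL) would need. Pricing a share at the asset's current score is an assumption made for this sketch, not necessarily how Anfield prices shares:

```go
package main

import (
	"errors"
	"fmt"
)

// buy deducts the cost of the shares from the user's budget and
// returns the updated budget. Using the asset's score as its price
// is an assumption for this sketch.
func buy(budget, price float64, shares int) (float64, error) {
	cost := price * float64(shares)
	if cost > budget {
		return budget, errors.New("insufficient budget")
	}
	return budget - cost, nil
}

func main() {
	b, err := buy(100, 26, 2) // 2 shares at a score/price of 26
	fmt.Println(b, err)       // 48 <nil>
}
```

In the real system the resulting operation would also be appended to the immutable Transactions collection.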

More desired actions:

  • STATS: show general information about the game, such as ranking by user, an asset's peak/highest price, lowest price, etc.
  • WALLET: show information related to transactions, current budget, etc.
  • TRANSFER: allow users to send shares to, or receive shares from, other users.

Why is the main interface a Telegram bot?
Because this is just a simple PoC, and a Telegram bot is easy and quick to implement and share with friends. Ideally, in "real life", this would be an app and/or a website.

Finally, I am not sure whether I will continue with this project, mainly because it was just a motivation to code in a different language (I usually code in Java). But I am sure it gave me some learnings and tools that I expect to apply in real-world projects. This PoC has a lot of bad choices in terms of architecture and data consistency, but those were never its goal. Feel free to make it shine!

Takeaways

I don't want to copy and paste pros/cons from the internet, so just a few points:

  1. Go shines in simplicity, but abusing it might lead us into some bad practices. Example: pointers everywhere. I'm sure I really did it 😄 but I also noticed it 😉.
  2. Compilation speed rocks!
  3. Async workers with goroutines are very easy to implement, and you can save time on time-consuming tasks. In my case the first approach was to scrape every page sequentially, which took a lot of time (>20s). Then I did it in 3 different goroutines, and in a matter of seconds (<5s) I had all the information I needed for every match.
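The sequential-to-concurrent change boils down to the standard `sync.WaitGroup` pattern. Here is a self-contained sketch where a sleep stands in for the network latency of a real scrape:

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// scrape simulates fetching one section of a match page; the sleep
// stands in for network latency.
func scrape(section string) string {
	time.Sleep(100 * time.Millisecond)
	return "data:" + section
}

// scrapeAll runs one goroutine per section and waits for all of them,
// so the total time is roughly one scrape instead of the sum of all.
func scrapeAll(sections []string) []string {
	results := make([]string, len(sections))
	var wg sync.WaitGroup
	for i, s := range sections {
		wg.Add(1)
		go func(i int, s string) {
			defer wg.Done()
			results[i] = scrape(s) // each goroutine writes its own slot
		}(i, s)
	}
	wg.Wait()
	return results
}

func main() {
	start := time.Now()
	fmt.Println(scrapeAll([]string{"commentary", "date", "lineups"}))
	fmt.Println("elapsed:", time.Since(start).Round(10*time.Millisecond))
}
```

Each goroutine writes to its own slice index, so no mutex is needed; the `WaitGroup` is the only synchronisation point.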