⚽ The Use Case: Populating Football Match Data
Almost two years ago, I shared my experience trying to generate SQL with ChatGPT and ran into plenty of hallucinations and incorrect outputs. Back then, writing accurate database queries with LLMs was challenging and unpredictable. I recently revisited the task, this time providing two example SQL commands and the Wikipedia page about the 2026 FIFA World Cup as input.
As background: I wrote a little R/Shiny app called SoccerSuccer and invite my friends and family members to enter their bets for each game of the FIFA World Cup.

The goal (like last time) was to write INSERT statements covering the two opposing teams, the date and time, and the stadium for each of the 72 group-stage games of the tournament. This time I used GitHub Copilot.
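For context, the target output looked roughly like the sketch below. The table name `matches`, its columns, and the values are hypothetical placeholders to illustrate the shape of the data, not my actual schema or the real 2026 fixtures.

```sql
-- Hypothetical table and column names; placeholder values only,
-- not the actual 2026 group-stage fixtures.
INSERT INTO matches (match_no, group_name, home_team, away_team, kickoff, stadium)
VALUES
  (1, 'A', 'Team A1', 'Team A2', '2026-06-11 20:00:00', 'Estadio Azteca, Mexico City'),
  (2, 'A', 'Team A3', 'Team A4', '2026-06-12 18:00:00', 'Estadio Akron, Guadalajara');
```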
The process took several minutes, and Copilot requested my permission to consult additional Wikipedia pages focused on the individual FIFA World Cup groups (e.g. Group A).
The process was not flawless and required additional prompting. The first attempt contained only 12 of the 72 games, and the second attempt introduced several duplicates. However, the final result was complete and correct, which is impressive and encouraging.
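Simple sanity checks like the following (again against the hypothetical `matches` table) are one way to spot missing games or duplicated pairings between attempts:

```sql
-- Expect 72 group-stage rows in total.
SELECT COUNT(*) AS n_games FROM matches;

-- Any pairing that appears more than once is a duplicate.
SELECT home_team, away_team, COUNT(*) AS n
FROM matches
GROUP BY home_team, away_team
HAVING COUNT(*) > 1;
```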
📊 Updating FIFA Rankings
In addition, I tried to update the FIFA ranking, which is also part of my football app. I pointed Copilot to the FIFA website for extraction, but it couldn't accept cookies or handle the JavaScript buttons on that page. The LLM pointed out that there is a FIFA API, which didn't work properly either. I was only successful after pasting the ranking directly into the chat window; from there, the transformation into SQL INSERT commands worked flawlessly.
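The resulting statements looked roughly like this sketch; the `fifa_ranking` table and the teams and point values are illustrative placeholders, not the official figures.

```sql
-- Hypothetical ranking table; values are illustrative only.
INSERT INTO fifa_ranking (rank_no, team, points)
VALUES
  (1, 'Team X', 1870.00),
  (2, 'Team Y', 1860.50),
  (3, 'Team Z', 1850.25);
```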
🧠 My Takeaways
- LLMs have improved significantly since my last SQL experiment
- Copilot is now capable of generating large SQL datasets with relatively few corrections
- Extracting dynamic web content (JavaScript, cookies) remains a weak point
- For static or pasted data, transformations to SQL are very reliable
- Some iteration and human oversight are still needed
There’s still room for improvement, but my overall experience with LLMs this year has been much more encouraging than in the past.
