2019 ECSM: Preparing and hosting a Capture The Flag contest

In the late summer of 2019, some of our contributors were involved in preparing and hosting a public “Capture The Flag” (CTF) contest as part of the 2019 European Cyber Security Month (ECSM). More than 1,000 teams registered; nearly 600 solved at least one challenge. Unlike most CTF contests, this one focused on industrial security.

In this article, we give an insight into the preparations and hosting of the CTF event and discuss lessons learned.


The Syskron Security CTF

The Syskron Security CTF contest was a free online cyber security competition open to everyone, especially school and university students. CTF means “Capture The Flag”: teams had to solve security challenges to retrieve flags (short text strings) and earned points for submitting correct flags.

Preparing the contest

We started preparations about two months before the contest began. The main activities were:

  • Preparing the technical platform: We set up and configured four servers (see Notes on the infrastructure). Since the CTFd framework didn’t meet GDPR requirements, we needed time to customize it. The contest required a domain name, DNS records, an e-mail address, a security concept, a testing environment, and more.
  • Creating challenges: Creating and testing challenges took most of the time. Interesting challenges aren’t only about hiding a string somehow but about writing a story, creating hints that don’t immediately disclose the flag, defining categories and points, and writing a full walkthrough for internal testing. One challenge (MQTT broker) included a technical setup, needing additional monitoring, validation, and scripting.
  • Announcing the event: We announced the event on several platforms, including dedicated ECSM platforms. As we learned, promoting the contest on ctftime.org was crucial.
  • Writing rules and policies: Yes, even a Capture The Flag contest requires boring rules and policies. For instance, our rules included that sharing flags with other teams is forbidden.

The challenges

We provided 26 challenges, split up into six categories:

An image showing the categories of the CTF as a pie chart.
The focus of the CTF was on forensics; however, we tried to create an equal number of challenges per category.


OSINT was all about gathering intelligence from public sources: for example, identifying the location of an industrial site or finding the default password of an industrial VPN device.


Cryptography contained three challenges. They addressed the risks of rolling your own cryptographic scheme and of using easy-to-guess credentials. Additionally, one challenge featured NaCl (the “Networking and Cryptography library”) to showcase this library for network communication, encryption, decryption, and signatures.
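To illustrate why rolling your own cryptographic scheme is risky, here is a minimal Python sketch. It is not the actual challenge, and the key and messages are made up; it merely shows the classic failure mode of a home-rolled XOR cipher with a short repeating key:

```python
# A toy "home-rolled" cipher of the kind such challenges target:
# XOR with a short repeating key. Key and messages are made up.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Encrypts and decrypts (XOR is its own inverse)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def recover_key(ciphertext: bytes, known_plaintext: bytes) -> bytes:
    """A known plaintext prefix leaks the repeating key byte by byte."""
    return bytes(c ^ p for c, p in zip(ciphertext, known_plaintext))

key = b"plc42"
ct = xor_cipher(b"START_PUMP_7;flag{...}", key)
# An attacker who can guess the message prefix recovers the full key:
leaked = recover_key(ct, b"START")
assert leaked == key
assert xor_cipher(ct, leaked) == b"START_PUMP_7;flag{...}"
```

Because industrial protocols often carry predictable message prefixes, a known-plaintext attack like this is usually trivial against such schemes.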

An image showing a challenge description of the CTF.
The challenge 'Enhanced PLC Encryption Standard' addressed the risk of rolling your own cryptographic scheme.


The Forensics category provided eight challenges. One required connecting to an MQTT broker and subscribing to an MQTT topic. Another included data hidden in the Siemens S7 protocol. Some challenges required participants to analyze log files, find leaked data, and conduct a dictionary attack. Steganography (the practice of concealing information within other information) was part of two challenges.
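Steganography in its simplest form fits in a few lines. The following is an illustrative LSB (least-significant-bit) sketch in Python, not the implementation of the actual challenges; the cover bytes stand in for an image's pixel data:

```python
# LSB steganography sketch: hide a message in the lowest bit of each
# cover byte (e.g., pixel values), where small changes are invisible.
def hide(cover: bytearray, secret: bytes) -> bytearray:
    bits = [(byte >> i) & 1 for byte in secret for i in range(7, -1, -1)]
    assert len(bits) <= len(cover), "cover too small"
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the lowest bit
    return out

def extract(stego: bytes, length: int) -> bytes:
    """Read back `length` bytes, MSB first, from the low bits."""
    bits = [b & 1 for b in stego[: length * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[n : n + 8]))
        for n in range(0, len(bits), 8)
    )

cover = bytearray(range(256))            # stand-in for image bytes
stego = hide(cover, b"flag{hidden}")
assert extract(stego, len(b"flag{hidden}")) == b"flag{hidden}"
```

Real-world variants differ mainly in where the bits go (color channels, audio samples, whitespace), but the extraction idea is the same.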


Trivia contained four knowledge-based questions. One addressed the computer worm Stuxnet, another one the WannaCry ransomware attack, and two questions addressed security features of OPC UA, a machine-to-machine communication protocol.


The Secret category was unlocked by solving the most straightforward OSINT challenge. It contained a four-challenge, industrial-themed story about Serra Raaphorst, an employee of a Dutch manufacturing company in Den Bosch. Raaphorst asked the CTF participants to discover the “dirty secrets” of her boss. Her story required people to decode DTMF (dual-tone multi-frequency signaling), solve a word search puzzle, and visit the (digital) Library of Babel.
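Decoding DTMF is a nice exercise in basic signal processing. The sketch below, assuming 8 kHz mono audio, uses the Goertzel algorithm to detect which pair of standard DTMF frequencies a tone contains. It illustrates the technique only and is not the challenge's official solution:

```python
import math

ROWS = [697, 770, 852, 941]        # low-group (row) frequencies in Hz
COLS = [1209, 1336, 1477, 1633]    # high-group (column) frequencies in Hz
KEYS = ["123A", "456B", "789C", "*0#D"]
RATE = 8000                        # assumed sample rate

def goertzel(samples, freq):
    """Signal power at a single frequency (Goertzel algorithm)."""
    k = 2 * math.cos(2 * math.pi * freq / RATE)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + k * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - k * s1 * s2

def decode_digit(samples):
    """Pick the loudest row and column frequency."""
    row = max(range(4), key=lambda i: goertzel(samples, ROWS[i]))
    col = max(range(4), key=lambda i: goertzel(samples, COLS[i]))
    return KEYS[row][col]

def tone(digit, ms=50):
    """Synthesize one DTMF digit as two superimposed sine waves."""
    row = next(i for i, keys in enumerate(KEYS) if digit in keys)
    col = KEYS[row].index(digit)
    return [
        math.sin(2 * math.pi * ROWS[row] * t / RATE)
        + math.sin(2 * math.pi * COLS[col] * t / RATE)
        for t in range(RATE * ms // 1000)
    ]

assert decode_digit(tone("7")) == "7"
```

On a real recording you would additionally split the audio into per-digit frames and reject frames without a clear row/column peak.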


The Fun category featured three challenges, which weren’t related to industrial security.

An image showing the solve count of the CTF as a bar chart.
436 teams solved 'Industrial sightseeing tour 1', the OSINT challenge that unlocked the 'secret' category. However, nobody solved 'My eyes hurt'.

The background

One of the main goals of this contest was raising awareness of industrial security (aka OT security). OT (operational technology) security differs from IT (information technology) security, and good practices of IT security don't necessarily apply to OT security:

  • The crucial security objectives of OT environments are constant availability and integrity, whereas confidentiality has top priority in IT environments. In addition, safety requirements apply to OT environments, since OT can cause significant physical harm to people.
  • Many OT environments consist of special-purpose hardware running special-purpose software. Such hardware must be available around the clock, and it runs for decades. Installing security updates means costly downtime, risks configuration changes that break things (resulting in even more downtime), or is simply impossible because no security updates are available. Updating special-purpose OT may even require hardware upgrades, costing millions of euros.
  • On the network level, machine-to-machine (M2M) communication may have hard real-time requirements, so packet inspection or other active analysis of network traffic often can't be applied.
  • General-purpose IT tools (on the attacker’s and the defender’s side) may not work in OT environments. For instance, vulnerability scanners can’t identify many security vulnerabilities in OT environments, anti-malware software breaks availability, and host-based mitigations are often unavailable since many OT components run special-purpose operating systems.

More differences between the IT and OT worlds exist. Keep this in mind when you read about cyber attacks on industrial environments next time.

Notes on the infrastructure

Our technical setup consisted of four servers:

  • Submission server: This server ran a modified CTFd 2.1.5 framework with Nginx as a proxy server. We moved resources that CTFd loads from third parties by default (like fonts and CSS files) to our own installation and secured the whole setup. We modified several CTFd files to meet the requirements of the European GDPR. One physical server with 8 GB of memory was sufficient to handle 1,500 users for five days.
  • Challenge server: This server ran the MQTT broker that was part of a challenge. Before the contest started, it hosted a landing page.
  • Monitoring server: The monitoring server was only internally accessible. It ran Icinga 2 and monitored system resources as well as user and team registrations, and other security-related properties of the public-facing servers. We used it for alerting in case of anomalies.
  • Mail server: We set up a mail server for sending and receiving e-mails. Nothing special here.
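As an illustration of such monitoring, here is a sketch of an Icinga/Nagios-style plugin that alerts on registration spikes. The thresholds and the check itself are assumptions for illustration, not our actual configuration:

```python
# An Icinga/Nagios-style check: print one status line and return the
# conventional exit code (0 = OK, 1 = WARNING, 2 = CRITICAL).
# The thresholds below are illustrative assumptions.
OK, WARNING, CRITICAL = 0, 1, 2

def check_registration_rate(new_teams_last_hour: int,
                            warn: int = 50, crit: int = 200) -> int:
    if new_teams_last_hour >= crit:
        print(f"CRITICAL - {new_teams_last_hour} new teams in the last hour")
        return CRITICAL
    if new_teams_last_hour >= warn:
        print(f"WARNING - {new_teams_last_hour} new teams in the last hour")
        return WARNING
    print(f"OK - {new_teams_last_hour} new teams in the last hour")
    return OK

# A real plugin would read the count from the CTFd database and call
# sys.exit(check_registration_rate(count)).
```

Icinga 2 treats the exit code as the service state, so such a script plugs directly into the alerting described above.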

We did not track any participants, so nice graphs showing the origins of the CTF participants don't exist. However, we can at least show the top 10 teams:

An image showing the top 10 teams over time.
The French team 0x90r00t was the fastest team reaching about 85% of all points. The second-best team TeamPowerPrinter/Gutenberg from Denmark got the same number of points about three days later. The academic team HgbSec from FH Oberösterreich (Campus Hagenberg) ranked third.

Lessons learned

We want to share the following five lessons learned:

  1. 1,500 users registered and created more than 1,000 teams. However, well-known top CTF teams used only a single user account during the contest: they shared their credentials and accessed the account from many different origins. The actual number of participants was therefore somewhat higher than the user count suggests. Keep this in mind if you monitor your setup.
  2. Most flags were shown directly to participants after solving a challenge. However, some challenges required users to type in (not copy) the flag, for which we provided the flag format. For instance, people had to identify the name of the company operating an industrial structure shown in a picture. The lesson here: many people didn't read (or didn't understand) the flag format and general directions, even though these were shown directly on the challenges page. Such directions must be part of each challenge's description.
  3. Two different types of hints were provided: free general hints, and hints that helped solve the challenge. Teams had to “pay” 10% of a challenge's points to unlock the latter. (For example, if solving a challenge was worth 500 pts, unlocking a hint cost 50 pts.) We think this approach is fair, since teams solving challenges without hints get more points.
  4. We excluded several teams for violating one of the six rules of the CTF. Some teams tried to brute force the flag of one challenge by entering hundreds of random strings; others tried to brute force the flag by systematically iterating over possible answers. One challenge was easily solvable by guessing since only six possible answers existed. Keep this in mind when designing your challenges and writing your rules.
  5. While many participants liked the contest (it is rated 97% on ctftime.org, with ~50 teams voting), others reported that solving some challenges required guessing. At first glance, this looks like something that needs to change next time. However, most people likely hadn't encountered such challenges before, so they didn't know how to solve them quickly. The lesson here is that you shouldn't shy away from creating challenges that differ from common Capture The Flag challenges.
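Several of these lessons meet in the flag-submission logic itself. The following sketch validates manually typed flags against a published format before comparison and caps wrong attempts per team to blunt brute forcing. The flag format syskron{...} and the limit of 20 attempts are made-up examples, not the contest's actual values:

```python
import re
from collections import defaultdict

# Assumed flag format and attempt budget -- illustrative values only.
FLAG_FORMAT = re.compile(r"^syskron\{[a-z0-9_-]+\}$")
MAX_WRONG_ATTEMPTS = 20
wrong_attempts: dict = defaultdict(int)  # (team_id, challenge_id) -> count

def submit(team_id: int, challenge_id: int,
           guess: str, correct_flag: str) -> str:
    if wrong_attempts[(team_id, challenge_id)] >= MAX_WRONG_ATTEMPTS:
        return "locked"            # attempt budget exhausted
    guess = guess.strip().lower()  # tolerate whitespace and case slips
    if not FLAG_FORMAT.match(guess):
        return "bad format"        # format errors don't burn an attempt
    if guess == correct_flag:
        return "correct"
    wrong_attempts[(team_id, challenge_id)] += 1
    return "wrong"
```

Rejecting malformed guesses early gives honest players clear feedback, while the per-team, per-challenge counter makes systematic iteration over possible answers impractical.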


The vast majority of teams rated our first contest as a challenging and enjoyable experience, and appreciated the event’s industrial style. Our infrastructure worked as expected, and there were no disturbances. Hosting Capture The Flag contests is a good way to raise awareness of information security.
