The workshop is organized as part of the Joint Conference on Serious Games 2018, November 7-8, 2018, Darmstadt, Germany. The conference program is here.
The workshop will be held on the second day of the conference (November 8, 2018) from 10:45 AM until 12:30 PM.
The workshop is organized with the support of the RAGE H2020 flagship project on (serious) game technologies. The workshop will be led by:
- Wim van der Vegt (Wim.vanderVegt@ou.nl), Open University of the Netherlands
- Enkhbold Nyamsuren (Enkhbold.Nyamsuren@ou.nl), Open University of the Netherlands
- Wim Westera (Wim.Westera@ou.nl), Open University of the Netherlands
The workshop entails a hands-on technical session on how to enrich your serious game with RAGE software components. Based on concrete examples presented and discussed in the workshop, you will learn how to quickly unpack, install, and integrate software components in your game project.
The workshop primarily targets developers from game studios, as well as researchers, educators, and students involved or interested in game development. Given the technical scope of the workshop, participants should have some basic knowledge of software development and/or game engines.
The workshop will consist of a short introductory presentation and a longer hands-on session involving some coding. For the presentation, you may want to check the RAGE Portal, which currently exposes up to 40 ready-to-use game technology components. If you want to actively participate in the hands-on session, a laptop with Visual Studio is required.
The European Commission has designated (serious) gaming as a top priority for addressing a multitude of societal issues in, e.g., education, training, and health, and in the wider scope of the "digital transformation" of society. Today, however, the serious gaming landscape is still highly fragmented, with a lot of reinventing of the wheel. Component-based approaches and the reuse of software will help developers create better games more easily, faster, and more cost-effectively.
(Fig. 1: General architecture. This project only provides the Real-time analysis box; the rest is part of the RAGE Analytics platform.)
This is the default analysis that gets executed for analytics traces from games. The Analytics Back-end can use any other analysis that can read from the input Kafka queue in the required format. This documentation is therefore useful both to understand the Analytics Realtime code and to build your own analysis from scratch. For detailed information about the data flow, check out Understanding RAGE Analytics Traces Flow.
Your analysis must mimic the signature of the RealTime class by providing methods that return a suitable StormTopology in response to a suitable config. As can be seen in Figure 1, the analysis will run within Apache Storm, using the Trident API.
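As a rough illustration, a minimal sketch of such an analysis class might look like the following. The class and method names (`MyAnalysis`, `buildTopology`) are hypothetical, as is the assumption of Storm 1.x package names; consult the RealTime source for the exact signature to mimic.

```java
import java.util.Map;

import org.apache.storm.generated.StormTopology;
import org.apache.storm.trident.TridentTopology;
import org.apache.storm.trident.operation.builtin.Debug;
import org.apache.storm.trident.testing.FixedBatchSpout;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Values;

// Hypothetical sketch of an analysis class that mimics the RealTime
// signature: given a config, return a StormTopology built with Trident.
public class MyAnalysis {

    public static StormTopology buildTopology(Map<String, Object> config) {
        // Stand-in test spout; the real analysis attaches a Kafka spout here,
        // configured from the config map (broker addresses, topic, etc.).
        FixedBatchSpout spout = new FixedBatchSpout(new Fields("trace"), 1,
                new Values("23,14,preferred,menu_start,tutorial_mode"));

        TridentTopology topology = new TridentTopology();
        topology.newStream("traces", spout)
                .each(new Fields("trace"), new Debug()); // print each trace
        return topology.build();
    }
}
```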
The project can be tested outside this architecture by using the built-in tests (via `mvn test`, assuming you have Maven correctly installed).
Incoming tuples will be of the form `versionId, Map<String, Object>`. `versionId`s track particular instances of games being played; for example, all students in a class could share the same `versionId`, but if the game were to be played later, the teacher would typically generate another `versionId`. Map keys are generally of a form derived by the Analytics Back-end in several steps:
- From xAPI to simplified JSON, which is then sent to a Kafka queue. The queue provides a buffer to prevent the loss of traces if the analysis cannot keep up with a spike in trace activity. For a full understanding of the input format of the data from Kafka, check out Understanding RAGE Analytics Traces Flow - Step 2 - Collector (Backend) to Kibana then Storm Realtime
- From Kafka into your analysis: extraction from Kafka, and conversion into the final format (see the parsing sketch after this list). Note that you could choose to reimplement these differently, as they are both part of this module. For a full understanding of how to connect with Kafka, check out Understanding RAGE Analytics Traces Flow - Step 3 - Realtime, transforming data from Kafka to ElasticSearch
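As a hedged illustration of that second step, the sketch below parses a raw JSON trace string pulled from Kafka into the `Map<String, Object>` form the analysis consumes. The use of Jackson here is an assumption; the module may use a different JSON library.

```java
import java.util.Map;

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

// Illustrative only: converting a raw JSON string from Kafka into the
// Map<String, Object> form described above. Jackson is an assumption.
public class TraceParser {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static Map<String, Object> parse(String rawJson) throws Exception {
        return MAPPER.readValue(rawJson,
                new TypeReference<Map<String, Object>>() {});
    }
}
```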
Sample processed traces (with columns representing `versionId`, `gameplayId`, `event`, `target`, and `response`, respectively) could be the following:

```
23,14,preferred,menu_start,tutorial_mode
23,14,skipped,introvideo1
```
This example would indicate that player 14 has selected `menu_start` (of type `alternative`) and skipped the `introvideo1` cutscene. Also note that in the second trace, `response` is not set.
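To connect this with the `Map<String, Object>` tuple form described earlier, a hedged sketch of the first sample trace as a map might look as follows. The key names mirror the sample columns and are assumptions, not the module's exact schema.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: the first sample trace, keyed by the column names
// shown above. Not a normative rendering of the actual trace schema.
public class SampleTrace {

    public static Map<String, Object> preferredMenuStart() {
        Map<String, Object> trace = new HashMap<>();
        trace.put("gameplayId", "14");
        trace.put("event", "preferred");
        trace.put("target", "menu_start");
        trace.put("response", "tutorial_mode");
        return trace;
    }
}
```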
xAPI traces sent by games should comply with the xAPI for serious games specification.
The analysis configuration includes details on how to connect to the ElasticSearch back-end. The information obtained from the analysis is stored in ElasticSearch for use in visualizations (via Kibana). For a full understanding of the analysis output, check out Storm Trident Computation.
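For a rough idea of what storing a result in ElasticSearch involves, consider the hedged sketch below. The high-level REST client, host, port, and the `results-<sessionId>` index name are assumptions for illustration; the actual module wires persistence through the Trident topology rather than a standalone client.

```java
import java.util.Map;

import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;

// Hedged sketch: write one analysis result document to ElasticSearch so it
// can be visualized in Kibana. Host, port, and index naming are assumptions.
public class ResultWriter {

    public static void writeResult(String sessionId, Map<String, Object> result)
            throws Exception {
        try (RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("localhost", 9200, "http")))) {
            IndexRequest request = new IndexRequest("results-" + sessionId)
                    .source(result);
            client.index(request, RequestOptions.DEFAULT);
        }
    }
}
```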
- `mvn clean install`: run tests, check correct headers, and generate a `realtime-jar-with-dependencies.jar` file inside the `/target` folder
- `mvn clean test`: run tests checking topology output
- `mvn license:check`: verify whether some files are missing the license header. This goal is attached to the verify phase if declared in your pom.xml as above.
- `mvn license:format`: add the license header when missing. If a header exists, it is updated to the new one.
- `mvn license:remove`: remove existing license headers
This is a list of considerations before implementing a new RAGE Analytics Realtime analysis.
- The analysis package must follow the restrictions described here.
- The starting method of our topology class can be found here.
- An example of connecting to Kafka to pull data is here.
- Converting String data from Kafka to JSON objects is shown here.
- ElasticSearch index name definitions here, methods `getTracesIndex(String sessionId)` and `getResultsIndex(String sessionId)`.
- Topology definition (Storm Trident API) here.
  - Sanitizing traces, defining the analysis stream, and persisting it to the `sessionId` ElasticSearch index (traces index).
  - Realtime GamePlay State analysis definition and persisting to the `results-sessionId` ElasticSearch index (results index).
- Topology tests example (a test sketch follows this list).
- Project dependencies assembled here.
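For the topology tests item above, a minimal local-cluster sketch is shown below. It reuses the hypothetical `MyAnalysis` class from the earlier sketch together with Storm's `LocalCluster`; the project's real tests (run via `mvn test`) additionally feed fixed traces through the topology and assert on the output.

```java
import java.util.HashMap;

import org.apache.storm.Config;
import org.apache.storm.LocalCluster;

// Hedged sketch of a local topology test: submit the topology to an
// in-process cluster, let a few batches flow, then shut everything down.
public class TopologyTestSketch {

    public static void main(String[] args) throws Exception {
        Config conf = new Config();
        conf.setDebug(true);

        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("realtime-test", conf,
                MyAnalysis.buildTopology(new HashMap<>()));
        Thread.sleep(10_000); // give the topology time to process batches
        cluster.killTopology("realtime-test");
        cluster.shutdown();
    }
}
```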