Temporal Linked Data® (TLD) and TigerGraph’s labeled-property-graph, third-generation graph database serve different purposes, so their schemas have slightly different structures.  In this TigerGraph schema image you can see that every aggregate vertex has a corresponding change vertex.  The attributes of the debit_credit_change vertex are also shown as an example.


Temporal Linked Data® Agile Gamification example in TigerGraph (Image credit: TigerGraph Graph Studio)


Where TLD keeps changes sparse (only what has changed) as an ordered collection stored with the aggregate (vertex) to which they belong, our TigerGraph schema models a “changes” edge between the aggregate vertex and its change vertices.  Each change published to TigerGraph results in a new edge and change vertex associated with the aggregate, while the aggregate itself simply reflects the current state.
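As a minimal sketch of this projection pattern (in Python, with hypothetical names and an in-memory stand-in for TigerGraph — the real projection is generated from the RDF vocabulary), publishing a change appends a sparse change vertex plus a “changes” edge while the aggregate keeps only current state:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class GraphProjection:
    """Toy in-memory stand-in for the TigerGraph projection."""
    aggregates: dict = field(default_factory=dict)      # vertex id -> current state
    change_vertices: list = field(default_factory=list)
    changes_edges: list = field(default_factory=list)   # (aggregate_id, change_id)

    def publish_change(self, aggregate_id, changed_fields, who):
        """Each published change yields a new change vertex plus a
        'changes' edge; the aggregate simply reflects current state."""
        change_id = f"{aggregate_id}:chg:{len(self.change_vertices)}"
        self.change_vertices.append({
            "id": change_id,
            "who": who,
            "when": datetime.now(timezone.utc).isoformat(),
            "delta": dict(changed_fields),   # sparse: only what changed
        })
        self.changes_edges.append((aggregate_id, change_id))
        current = self.aggregates.setdefault(aggregate_id, {})
        current.update(changed_fields)
        return change_id

g = GraphProjection()
g.publish_change("debit_credit/42", {"balance": 10}, who="alice")
g.publish_change("debit_credit/42", {"balance": 25}, who="bob")
```

Here the aggregate ends holding only the latest balance, while both deltas survive as change vertices linked by edges.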


TigerGraph is built to perform deep analysis across many “hops” (physical edges between vertices).  Writing graph queries against this schema, with its ten aggregate vertices and their corresponding ten change vertices, is light work for TigerGraph.  Example analytic queries might include:


For a game, which individual has the highest score?  This query does not require any temporal data, as the debit_credit aggregate keeps a running tally.  There are five hops: game, gameledger, debit_credit, account, and individual.  


For a game, which individual has received the most positive comments from their colleagues?  This query requires the temporal data in debit_credit_change but is easily satisfied.  There are six hops to answer this query.  


For a game, which individual has been the most helpful to their colleagues?  Given the autogenerated events and specific collegial input, this query is easily satisfied with the same six hops.  
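To make the second query concrete, here is a minimal Python sketch (field names such as `kind` and `source` are hypothetical stand-ins for debit_credit_change attributes, not the actual schema) of counting colleague-sourced credits per individual:

```python
from collections import Counter

# Hypothetical debit_credit_change records: each credit event notes
# whether it came from a colleague or was autogenerated by the platform.
changes = [
    {"individual": "ann", "kind": "credit", "source": "colleague",     "amount": 5},
    {"individual": "bob", "kind": "credit", "source": "colleague",     "amount": 5},
    {"individual": "ann", "kind": "credit", "source": "autogenerated", "amount": 3},
    {"individual": "ann", "kind": "credit", "source": "colleague",     "amount": 5},
    {"individual": "bob", "kind": "debit",  "source": "colleague",     "amount": 2},
]

# Count only positive (credit) events contributed by colleagues.
positive_comments = Counter(
    c["individual"]
    for c in changes
    if c["kind"] == "credit" and c["source"] == "colleague"
)
top, count = positive_comments.most_common(1)[0]
```

In TigerGraph the same question is answered by a multi-hop GSQL query over the change vertices rather than an in-memory scan; the filtering logic is the same.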


There are nearly two dozen analytic queries that could be discussed here, and all but two of them require temporal data.  Something to think about. 


All two dozen analytic queries are available for realtime display on an operational dashboard, as they perform efficiently and execute in a few milliseconds (many-hop analytic queries executed as often as you like, for full participation in your operational system—that’s HTAP by way of TLD).



The Earth’s state as planet Theia collides to form our moon—temporal data tells a very interesting story. (Image credit: NASA)


Next we will look at the value of temporal data, both transactionally and analytically.


The creation of something meaningful requires understanding.  The building block of understanding is a vocabulary.  Rigorous vocabularies can be parsed and “understood” by a computer.  Programming languages represent this kind of rigor.  The W3C’s Resource Description Framework and related specifications also represent this kind of rigor.  


A Hybrid Transactional / Analytic Processing (HTAP) backend by way of Temporal Linked Data® (TLD) is generated from vocabularies defined in RDF, created using a very straightforward, approachable, and discrete technique.


In the same way that Tim Berners-Lee’s Semantic Web is able to leverage daemons and agents to “understand” and modify the public “data web” based on RDF and web architecture, we are able to generate:


  • A REST API with JSON payloads;
  • BEAM modules for high-throughput, low-latency, reliable, concurrent, supervised transactional data processing;
  • Discrete, localized capture of who changed what and when, associated with all TLD aggregates;
  • Automated, configured projections that scale out to third-generation graph databases, such as TigerGraph, for realtime and longitudinal analytics;
  • Projections to RDF-based enterprise knowledge graph platforms, such as ReactiveCore’s rule-based inferencing and analytics.


The purpose of our auto-generated backend is to provide: 1) a domain-specific reference implementation, 2) written in Elixir, 3) packaged in Docker containers for container orchestration, 4) deployable as an elastic, reactive compute grid, 5) executed in clustered BEAM VMs, 6) backed by the BEAM ecosystem for dynamic temporal data structure persistence, and 7) projected to a world-class graph analytics platform.


If you are new to Elixir or Phoenix Framework applications, TLD can serve as your working example from which to extrapolate and grow.  


If you are “the business” and simply interested in time to value, TLD can serve as: 1) your spike to flesh out a business concept, 2) your proof-of-concept to win mindshare among colleagues, 3) your pilot to demonstrate a short, high-quality path to production, and 4) your strategy for coherent, business-driven continuous deployment.


If you are an old hand at Elixir and the established BEAM ecosystem, TLD can be your collaborator that allows you to focus on business concerns and modeling rather than software development.


A perfect solar eclipse. Patterns in nature allow us to study creation, make observations, validate hypotheses, and develop models and the vocabulary that describes them.


The next post will provide an example of Temporal Linked Data® as TigerGraph schema for our Agile Gamification example.

Gamification is well established and is sometimes considered controversial when applied in a corporate setting.  Our objective in providing this example is not to cover topics more appropriately addressed through corporate policy and governance, but rather to describe a type of system designed to fill knowledge gaps for analytics’ sake by being deployed with, or integrated “alongside,” operational systems.  Hybrid Transactional / Analytic Processing solutions by way of Temporal Linked Data® (TLD) lend themselves to these kinds of considerations.


In addition to providing a reference implementation for HTAP by way of TLD, our example of Agile Gamification is provided to extrapolate to any corporate domain where corporate leadership’s line of sight into its business may be untrusted, untimely, incomplete, or opaque, or may simply produce inexplicable results that need clarification.


Agile Gamification recognizes the wealth of information available in Application Lifecycle Management (ALM), Quality Assurance (QA), build-pipeline (CI/CD), and source code management (SCM) systems for assessing activity and identifying communities and their attributes when subjected to analytic processing.  These technologies are invaluable to the day-to-day management of project, product, and portfolio activities.  From an analytic perspective, however, the events they generate likely miss the nuance attributable to your most valuable asset: your people.  They do not tell the whole story and can result in learning the wrong lessons and rewarding the wrong behavior.


The reality of healthy agile organizations is that teams are fluid, forming for release cycles and spikes, then dissolving and reforming according to the forces facing the business as people come and go.  These dynamic environments require aggregate knowledge shared by individuals who know the answer when it is needed and where it is needed.  Often, the individuals most valuable to team success are not the ones making the most frequent pull requests and commits or taking on the most tasks, but those serving as coach, technical advisor, and pair programmer, and doing so in a highly responsive, educational, and respectful manner.  


These intangibles are both invaluable and difficult to quantify objectively.  Gamification is one way to accomplish this, providing a single place to collect both platform-generated events and collegial input for realtime (game score), milestone (sprints and releases), and longitudinal (inter-game, inter-epoch) analysis.  HTAP by way of TLD, with projections into graph database analytics, provides an effective solution across all three dimensions of the problem.


In ordinary optical light, UGC 1382 was believed to be an ordinary elliptical galaxy. When augmented with ultraviolet light and deep optical data, and then incorporating low-density hydrogen gas, 1382 turned out to be a gigantic galaxy where its outside is older than its inside—the Frankenstein galaxy.
(Image credit: NASA/JPL/Caltech/SDSS/NRAO)


The next post will discuss what it means to auto-generate the backend from an RDF data model.


The purpose of this weblog post is to introduce a Temporal Linked Data® (TLD) data model by way of a generic gamification model.  To visualize the model’s vertices (TLD Aggregates) and edges (TLD Links) we will use the Visual Notation for OWL Ontologies (VOWL) plugin within Protégé.  Each TLD Aggregate is composed of OWL Datatype Properties (literal values represented by XSD base types) and OWL Object Properties (URIs that reference another TLD Aggregate or other RDF Resource).  Each TLD Aggregate keeps the current value for each property as well as all changes applied to the aggregate over time, making it a temporal data structure.
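As a minimal sketch of what “current values plus all changes over time” means for a TLD Aggregate (in Python, with hypothetical names and an illustrative example URI — the real aggregates are generated from the RDF vocabulary), consider:

```python
from dataclasses import dataclass, field

@dataclass
class TldAggregate:
    """Current property values plus the ordered changes that produced them."""
    uri: str
    properties: dict = field(default_factory=dict)   # datatype + object properties
    changes: list = field(default_factory=list)      # ordered, sparse deltas

    def apply(self, delta, who, when):
        """Record who changed what and when, then update current state."""
        self.changes.append({"who": who, "when": when, "delta": dict(delta)})
        self.properties.update(delta)

    def state_at(self, when):
        """Replay changes up to a point in time (ISO-8601 strings compare
        correctly as text)."""
        state = {}
        for c in self.changes:
            if c["when"] <= when:
                state.update(c["delta"])
        return state

# Hypothetical example URI, for illustration only.
org = TldAggregate("https://example.org/organization/brsg")
org.apply({"name": "BRSG"}, who="seed", when="2020-01-01")
org.apply({"name": "BRSG, Inc."}, who="admin", when="2020-06-01")
```

Because every delta is kept in order, both the current state and any historical state can be answered from the same structure.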


A well-bounded context. TLD links the aggregates that comprise a solution, trusting that it can link to other solutions as required, along the way, rather than ahead of time. (Image credit: NOAA/NASA GOES Project)


Let us assume that this Agile Gamification capability is a hosted service provided by a third-party Organization, such as BRSG.  The idea is that each team member would be an Individual known to the multi-tenant service provider.  Further, “Agile Gamification” would be an instance of a Game made available through an Organization, such as BRSG, but owned by the Organization that has a Subscription for the Game.  The Game is a BRSG Product made available as a Service at a public URI.  Each Individual is invited to play the Game and does so through their Game Ledger, which has a special journaling data type called a Debit-Credit that keeps a running tally of their score by receiving the “debit” and “credit” events described in the Agile Gamification weblog post.  Let’s look at these TLD aggregate models using VOWL.
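A minimal Python sketch of the Debit-Credit journaling idea (names and event reasons are hypothetical, for illustration only): each received event is journaled in order, and the running tally moves up on credits and down on debits.

```python
class DebitCredit:
    """Hypothetical journaling type: a running tally fed by debit/credit events."""

    def __init__(self):
        self.tally = 0
        self.journal = []   # ordered event history, kept for temporal analytics

    def receive(self, kind, amount, reason):
        """Journal the event, then adjust the running tally."""
        if kind not in ("debit", "credit"):
            raise ValueError(f"unknown event kind: {kind}")
        self.journal.append({"kind": kind, "amount": amount, "reason": reason})
        self.tally += amount if kind == "credit" else -amount

score = DebitCredit()
score.receive("credit", 5, "helped a colleague")
score.receive("credit", 3, "pull request merged")
score.receive("debit", 1, "missed standup")
```

The tally alone answers the realtime “highest score” question, while the journal supplies the temporal data the milestone and longitudinal queries require.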


The Organization depicted here comprises eight OWL Datatype Properties to identify, describe, name, associate, and track each organization instance.  Likewise, Organization contains six OWL Object Properties that link it to its external site, personnel, subscription, products, and services.  These RDF Properties, taken together, comprise the organization aggregate model.


Example “Organization” aggregate comprising eight Datatype Properties and six Object Properties. (Image credit: Protégé VOWL Plugin)


The Individual depicted here comprises 15 OWL Datatype Properties and seven OWL Object Properties that serve the same purposes as they do for Organization.


Example “Individual” aggregate comprising 15 Datatype Properties and seven Object Properties. (Image credit: Protégé VOWL Plugin)


Each TLD aggregate is modeled in this manner, each decoupled from the others while cohesive in overall structure and purpose.  Taken together, this network of TLD aggregate models is represented as a graph and can be directly projected into a third-generation graph database such as TigerGraph, or an RDF-oriented Knowledge Graph such as ReactiveCore.  The following is a non-temporal representation of the TLD data model in TigerGraph (we will provide the temporal version of the TigerGraph schema when we get to the analytics posts).


TigerGraph graph model of gamification sans temporality. (Image credit: TigerGraph Graph Studio)


The next post will discuss how gamification can augment operational data for clarity, validation, timeliness, and additional perspective.