It should not be controversial to observe that every technology added to a portfolio brings its own licensing, training, senior expertise, and supporting infrastructure and process, from CI/CD and build pipelines through production operations.  Integrating disparate or heterogeneous tools, techniques, and technologies compounds this effect.


One way to look at this is from the perspective of architectural simplification, a value that we have long held.


For over 20 years we considered the Java ecosystem to be the best way to effectively accomplish the most while having to know the least.  In spite of the corresponding personal and corporate investment, we have come to appreciate the BEAM ecosystem to an even greater degree from both a business and technological perspective.


While our own reasons for favoring BEAM will emerge in subsequent posts, we leave you with a talk by Saša Jurić that offers as good an overview as we can imagine.  For a terrific example of architectural simplification, pay special attention around minute 36, where Saša lists the technologies supplanted by the BEAM ecosystem for just one realtime messaging platform.



In an effort to focus on The Business rather than technology, BRSG advocates for architectural simplification that leaves no gaps while performing the same work with much less human, infrastructure, and financial resource.  Please reach out if we can be of service, or call 303.309.6240.


The fact that a chosen analytics platform does not span the entire analytic use case spectrum can be cause for concern and reevaluation.  The temptation is to reach for a niche, special-case platform or solution, but this may eliminate valuable and desirable forces.


Special-case efforts tend to result in one-off projects rather than capabilities available in the ordinary course of business.  This tends to reduce the expectation of timeliness and scalability.  It can also unduly sideline valuable concepts and algorithms under the belief that they cannot be used generally.


PageRank is a good example of an algorithm with general applicability as a measure of influence in a community, yet one that can be left out because the current analytic platform does not handle it well.
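To make the point concrete, here is a minimal PageRank sketch via power iteration over a tiny "community" graph.  The node names, damping factor, and iteration count are illustrative assumptions, not tied to any particular platform.

```python
# Minimal PageRank via power iteration.  graph maps each node to the
# list of nodes it links to (links read as "endorsements").
def pagerank(graph, damping=0.85, iterations=50):
    nodes = list(graph)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, targets in graph.items():
            if targets:
                # Split this node's rank evenly among its targets.
                share = damping * rank[node] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # Dangling node: distribute its rank evenly to all.
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

# "alice" is linked to by everyone else, so she scores as most
# influential in this hypothetical community.
influence = pagerank({
    "alice": ["bob"],
    "bob": ["alice"],
    "carol": ["alice", "bob"],
    "dan": ["alice"],
})
```

The same measure of influence applies to any community-shaped data, which is precisely why it is a loss when a platform limitation keeps it on the shelf.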


Full Spectrum Solution


Our own commitment to a graph-throughout architecture with realtime data projection to a third-generation graph analytics platform provides the best chance of being able to use the right algorithm for the job, whether realtime analysis, historical analysis, creating machine learning data sets, or applying explainable AI (as provided by TigerGraph), as well as more RDF-oriented Knowledge Graph solutions (such as our friends at ReactiveCore provide).  


The time for graph-based transactional and analytic solutions is here.  Please do have a read through our weblogs discussing Hybrid Transactional / Analytic Processing by way of Temporal Linked Data®.  


BRSG advocates for business-oriented goals as well as considering lost forces that inhibit the ability to serve The Business well.  Please reach out if we can be of service, or call 303.309.6240.


In a prior post, enterprise data was compared to a “hydraulic data cycle” in which analytics would rain insight from precipitation drawn across “all” operational environments.  Rather than analytics being an afterthought, HTAP raises “business guidance” through analytics to a first-class citizen alongside operational systems with realtime, temporal data projections.


Information Technology should view itself as part of and critical to the success of the businesses that it serves by regularly identifying, prioritizing, and delivering on the needs of The Business as enabled by: 1) collaborative and agile processes, and 2) scalable and reliable microservice deployments by way of container orchestration to hybrid on-premise, virtual-private, and public cloud platforms.  


Detail in the Big Picture


The resulting fine-grained transactional deployments have to be reconciled against the need for a realtime, 360-degree view of current and historic data for analytics and business guidance.


In like fashion, Blue River Systems Group advocates for high-throughput, low-latency, scale-out transactional microservices that project directly to low-barrier, big data graph platforms for effective, realtime analytics.  Both third-generation graph databases and enterprise-grade, inference-engine-oriented knowledge graphs are candidate targets for realtime transactional data projection, resulting in many microservices contributing to a singular view of the enterprise.


It is time to break down the silos that consume human, technological, and financial resources when platforms are cobbled together on a case-by-case basis.  Keep the data—and information—flowing rather than allowing it to collect in a silo-induced lake.


BRSG consultants pursue ideas that scale and that hold together well.  Please reach out if we can be of service, or call 303.309.6240.


What if we used “hydraulic data cycle” rather than “data lake” as a metaphor for big data?  As a thought experiment, let’s consider the continuous and complex process whereby water on the earth’s surface evaporates, rises into the atmosphere, condenses into rain or snow, and falls back to the surface to water the land, run in rivers, pause in lakes and aquifers, perhaps even find its way back to the oceans, but mostly repeating the cycle.


Super Cell Water Cycle


Enterprise data should be this active for its entire lifecycle, not stagnating in an artificial lake.


Consider that data has two primary jobs: 1) operate the business, and 2) guide the business.  These two functions are distinct but should not be separate.  Transactional systems should work hand-in-hand with analytic platforms to provide operational capability, realtime analysis, historical perspective, training sets for machine learning, and explainable artificial intelligence—augmenting human strengths with high tech capabilities.  This is what BRSG seeks to accomplish with Hybrid Transactional/Analytic Processing (HTAP).


Further, consider that business data structures can—and should—reflect real world objects, thereby helping to reduce artificial barriers between The Business and Information Technology.  From an HTAP perspective, it would be useful if transactional data structures that reflect the business translated one-for-one into the analytic structures that inform the business, producing analytics that rain insight from precipitation drawn across “all” operational environments.
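The one-for-one translation above can be sketched in a few lines.  Below, a hypothetical order record is expressed as subject-predicate-object triples (the shape of RDF and linked data); the identifiers and predicates are illustrative assumptions, and plain tuples stand in for a real triple store.

```python
# A hypothetical transactional record expressed as
# subject-predicate-object triples, using plain tuples.
transaction = [
    ("order:1001", "placedBy", "customer:42"),
    ("order:1001", "contains", "sku:ABC"),
    ("order:1001", "placedAt", "2021-03-01T12:00:00Z"),
]

# The same triples project one-for-one into analytic graph edges:
# each (subject, predicate, object) is already a labeled edge from
# subject to object, with no remodeling step in between.
edges = [(s, o, {"label": p}) for s, p, o in transaction]
```

No ETL-style reshaping is needed: the structure that operated the business is the structure that informs it.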


This series of blog posts will consider a number of “silo-spanning” topics that HTAP raises.  From an enterprise data architecture perspective, we will explore the possibility of utilizing RDF data structures—semantic web and linked data—for high-throughput, low-latency, reliable, supervised transactional systems that naturally project into enterprise graph database and knowledge graph big data platforms.


The premise for our discussion is that graphs are powerful for both transactions and analytics.  Because they are self-describing, they lend themselves to low-code/no-code, short time to market enterprise solutions.  
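One sense in which graph data is self-describing: its working schema can be recovered from the data itself rather than from a separate DDL.  A minimal sketch, with hypothetical triples:

```python
# Because each triple names its own predicate, the vocabulary in use
# is discoverable by inspecting the data—no external schema required.
triples = [
    ("customer:42", "name",     "Ada"),
    ("order:1001",  "placedBy", "customer:42"),
    ("order:1002",  "placedBy", "customer:42"),
]

# The set of predicates currently in use, read straight off the data.
predicates = sorted({p for _, p, _ in triples})
```

This is part of why graph-shaped data supports low-code/no-code tooling: the tooling can interrogate the data for its own structure.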


BRSG consultants develop ideas that scale, such as Hybrid Transactional/Analytic Processing strategies by way of semantic web, temporal linked transactional data, and graph analytic data.  Please reach out if we can be of service, or call 303.309.6240.