In search of the ideal data architecture
Competitive advantage and profitable growth don’t come from scale anymore. The rate at which big players in any and every industry beach their supertankers is unprecedented.
Competitive advantage and profitable growth don’t come from efficiency anymore either. What’s the point of making an unwanted product efficiently?
Competitive advantage and profitable growth come from adaptability. Pure and simple. Adapt or die.
A 2011 article in the Harvard Business Review pronounced adaptability the new competitive advantage. It asks how your managers can pick up the right signals to understand and harness change when they’re overwhelmed with changing information. The conclusion – instead of being really good at doing some particular thing, companies must be really good at learning how to do new things.
As Peter Senge points out, organizations only learn through individuals who learn, perhaps aided by machine learning these days. And learning craves meaningful data.
Lack of data was the problem of the 20th Century, yet the opportunity and challenge of the 21st is having too much of the stuff. This is the landscape of digital transformation and, I believe, the very bedrock of the meaning of business: establishing and driving mutual value creation.
Value flows when data flows meaningfully through sociotechnical networks, and I’ve been on a mission to find out how to make this happen.
Too much of not enough
Under-investment in IT isn’t the problem. Time and money have gone in, yet with the benefit of hindsight the resultant infrastructure too frequently strangles as much as liberates. As one CIO put it to me, “It nearly does what we need.” He paused. “But nearly is effectively not.”
On examination, this remark is an indictment of some popular approaches to designing enterprise IT, not least enterprise architecture and service oriented architecture.
In IT terms:
EA is about aligning an enterprise’s IT assets (through strategy, design, and management) to effectively execute the business strategy and various operations using the proper IT capabilities. … it aims to reduce IT costs through technology reuse and eliminating duplicate functionality. 
The process is considered the glue between business and IT.
Unfortunately, EA in IT terms has not kept abreast of EA in non-IT terms. Enterprise IT architecting is mechanistic, a servant to the business strategy, and it too frequently fails to consider IT part of the holistic and dynamic whole. Thinking of it in terms of the agile manifesto, enterprise IT architecture falls short by defining and pursuing a plan rather than striving to support and drive continuous change.
The most evolved form of EA – enterprise ecological adaptation – is more my kinda groove, but it’s too easy to float above the technical foray and merely hope it catches up. It’s the tension between ecological adaptation and IT architecting that informs Gartner’s dedication to bimodal IT, more on which later.
In short, if you find that your business does what its IT allows rather than your IT doing what your business requires, your IT may have been EAd to death. Your business is indeed glued to its IT.
Service oriented architecture
SOA emerged to address this EA problem. While definitions vary considerably, Microsoft describes it as:
a loosely-coupled architecture designed to meet the business needs of the organization.
SOA is a philosophy rather than a methodology or technology. It’s an approach to building systems from autonomous services. A means rather than an ends. Unfortunately, it’s incredibly challenging to wield those means in a meaningful way.
Admittedly, much criticism of SOA focuses on the technology (e.g., XML) rather than the philosophy, but any approach for which the dominant technological instantiations are flawed will rightly be cast in the same shadow.
A major flaw can be framed in terms of a primary ambition of the SOA Manifesto: intrinsic interoperability over custom integration. No matter the technology, services are too frequently reusable and not especially useful, or useful but not especially reusable. Striving for usefulness ties the SOA implementation in cloddish and complex knots, particularly when such services crave consumer context, and reusable stateless services create significant governance and security headaches.
Finally – in what might be a fatal case of SOA thrown under the (enterprise) bus – the SOA community has been grappling for many years with top-down versus bottom-up design and development. No-one argues that SOA is top-down by nature, but with business adaptability and corresponding tech agility in mind, many practitioners have sought to work it bottom-up. And they’ve failed. SOA may have a place in the increasingly rare world of deliberate strategy, but with emergent strategy in the ascendant, SOA’s sun is setting.
But SOA isn’t dead just yet. Some consider microservices the latest SOA incarnation – “fine grained SOA” according to Adrian Cockcroft at Netflix – in which the granularity is married to more rigorous decoupling. There’s no agreed definition, although I like this attempt:
an approach to developing a single application as a suite of small services, each running in its own process and communicating with lightweight mechanisms.
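To make that definition concrete, here’s a minimal sketch of the idea – two tiny “services” talking over a lightweight mechanism (HTTP plus JSON). The service names, ports, and data are entirely invented for illustration; in a real deployment each would run in its own process.

```python
# A hypothetical sketch of the microservice idea: one small service owns
# one narrow capability (pricing), and a second composes it over HTTP.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class PricingService(BaseHTTPRequestHandler):
    """Owns exactly one capability: quoting a price for a SKU."""
    PRICES = {"sku-1": 9.99, "sku-2": 24.50}  # invented example data

    def do_GET(self):
        sku = self.path.lstrip("/")
        body = json.dumps({"sku": sku, "price": self.PRICES.get(sku)})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

    def log_message(self, *args):
        pass  # keep the demo quiet

def start_pricing_service(port=8901):
    """Run the pricing service on a background thread; returns the server."""
    server = HTTPServer(("127.0.0.1", port), PricingService)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def checkout_total(skus, pricing_url="http://127.0.0.1:8901"):
    """A second 'service' that knows nothing of pricing internals –
    only the lightweight contract (GET /<sku> returns JSON)."""
    total = 0.0
    for sku in skus:
        with urlopen(f"{pricing_url}/{sku}") as resp:
            total += json.load(resp)["price"]
    return total
```

The point is the decoupling: the checkout code depends only on a narrow network contract, so the pricing service can be rewritten, rescaled, or redeployed independently.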
Can microservices rise above the flaws of prior SOA manifestations? Things look promising although it’s not all plain sailing:
Almost all the successful microservice stories have started with a monolith that got too big and was broken up. Almost all the cases where I’ve heard of a system that was built as a microservice system from scratch, it has ended up in serious trouble. 
Unfortunately, there’s a further problem. Each and every service may have its own ‘flavour’, so to speak, lacking any standardised semantic context – far from ideal, by definition, when striving for meaningful data flow.
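The ‘flavour’ problem is easy to illustrate with a hypothetical example (field names and payloads invented): two services report the same fact in different shapes, so every consumer must hard-code per-service glue that a shared semantic model would make unnecessary.

```python
# Two hypothetical services describing the same customer balance,
# each with its own 'flavour' of field names and units.
service_a = {"customerId": "C-17", "balance_eur": 120.0}
service_b = {"custRef": "C-17", "balanceCents": 12000, "ccy": "EUR"}

def normalise(payload):
    """Per-service translation glue – exactly the custom integration
    that a standardised semantic context would eliminate."""
    if "balance_eur" in payload:
        return {"customer": payload["customerId"], "balance": payload["balance_eur"]}
    if "balanceCents" in payload:
        return {"customer": payload["custRef"], "balance": payload["balanceCents"] / 100}
    raise ValueError("unknown flavour")
```

Multiply this by hundreds of services and the integration burden dwarfs the services themselves.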
Big data. Big mess. Big opportunity.
During the latter part of the 20th Century, leading manufacturers transitioned from pushing production through the factory to the warehouse, to pulling production through to meet market demand. The emphasis moved from stocks to flows, and this history is instructive for our working with data today.
Our relatively recent facility to deal with previously unimaginable quantities of data – the word big doesn’t really do it justice – invokes ideas of stocks, compounded by phrases such as data warehousing. Even the term data lake conjures up a relatively static body even if there are flows in and out.
It is time, then, to make the data supply chain real. Instead of having deliberate strategy push services onto the business, we need to help the data flow with the emerging needs of the business by allowing it to express and expose itself, guided by and ‘working alongside’ the right people in the organization (i.e. not just those in the IT Dept. or Data Office).
It’s tricky to get IT to be resilient and nimble, rock steady and malleable, and for this reason Gartner developed the concept of bimodal IT in 2014:
the practice of managing two separate, coherent modes of IT delivery, one focused on stability and the other on agility. Mode 1 is traditional and sequential, emphasizing safety and accuracy. Mode 2 is exploratory and nonlinear, emphasizing agility and speed.
The previous year, McKinsey had argued for the development of “greenfield IT” (effectively Mode 2):
… the creation of a new technology that lives and is independently managed outside the CIO. The reason for considering this is that so many companies are hampered by legacy technologies that they are unable to be flexible and simply cannot innovate.
Gartner claims that it’s “IT with traditional efficiency and predictability, yet with the agility and speed to use the digital core with utmost effectiveness.” I add a critical caveat – so long as the two modes (of both the technology and people) co-exist harmoniously.
Bimodal shapes up to be a realistic response to a multi-faceted challenge and has a lot going for it in my opinion, but others aspire to a more ‘perfect’ single-mode world represented by the combination of continuous delivery and lean IT.
Continuous delivery and lean IT
Continuous delivery couldn’t be more evocative of the flow we’re devoted to here. And Lean IT extends the lean principle I briefly alluded to earlier (pull over push).
Lean entails the identification and elimination of waste (muda) and unevenness (mura), with the latter effectively synonymous with flow. Lean IT maps this dedication onto two types of value stream: business services that are directly value adding, and IT services that are wholly necessary and sufficient to deliver the business services.
Now I don’t want to be pedantic, but could it be that lean’s business and IT services and bimodal’s adaptive and systematic IT bear some resemblance to each other?
If you agree, this is how I explain that coincidence – there will always be stocks, and we all want to get those flows a-flowing, so we are merely contemplating the juxtaposition of the two. Whether one considers it a hard boundary or more of a gradation, the incongruity must be resolved to play to the strengths and purpose of both modes and, critically, to do much the same for all the people involved as well as the technology.
I’m excited to have the opportunity to be working alongside the Braintribe team (disclosure: they are a partner and client, not just the conclusion to my search). Without getting down deep into the mathematics, the company applies Gödel numbering in its development environment, Cortex.
Cortex is model driven, guaranteeing type-safe relationships between the formal terms. Gödelization facilitates the modeling of both the operands and the operations, effectively normalizing the entire system and enabling the description and implementation of all varieties of reflective operations (JVM runtime behavior modifications).
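For readers unfamiliar with Gödel numbering, here is the textbook construction (a sketch of the general idea only – not Braintribe’s actual scheme): a sequence of symbol codes becomes the exponents of consecutive primes in a single integer, and factoring that integer recovers the sequence, so both data and the operations on it can live in one uniform encoding.

```python
# Textbook Gödel numbering: encode a sequence of symbol codes as
# 2^c1 * 3^c2 * 5^c3 * ..., and decode by prime factorization.
# Codes must be >= 1 (a 1-based symbol table), so every position's
# prime actually appears in the product and decoding stays positional.

def primes(n):
    """Return the first n prime numbers by trial division."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_encode(codes):
    """Fold a sequence of positive integer codes into one integer."""
    number = 1
    for p, c in zip(primes(len(codes)), codes):
        number *= p ** c
    return number

def godel_decode(number):
    """Recover the code sequence: each prime's exponent, in order."""
    codes = []
    candidate = 2
    while number > 1:
        exponent = 0
        while number % candidate == 0:
            number //= candidate
            exponent += 1
        if exponent:
            codes.append(exponent)
        candidate += 1
    return codes
```

For example, the sequence [3, 1, 2] encodes to 2³ · 3¹ · 5² = 600, and factoring 600 yields the sequence back. The practical appeal is that an entire structured expression collapses to one number that can be stored, compared, and manipulated uniformly.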
The result, I believe, is unrivalled power of expression and control. Cortex has some aspects in common with model driven architecture and object-oriented programming, with a dash of aspect-oriented programming and more than a sprinkling of expert systems concepts.
Braintribe’s data-as-a-service platform, Tribefire, is built on Cortex. It decouples data from the infrastructure and allows data to express itself the same way your people do. It really is a joy to witness non-technical people discover and build and combine visual data models, explore the previously inconceivable with simulated data, and expose APIs automatically for any innovation or incubation purposes before even going into production.
To my mind, this makes Tribefire the ideal platform for digital transformation – if, that is, you agree that digital transformation must be preceded by data transformation. And I’m already wondering about a Mode 3, the logical long-term conclusion to a bimodal reality.
I won’t go into more detail, but I could invite the company’s CTO, Peter Brandner, to write here rather than have me waffle on. Maybe a Q&A? Yes … do add any questions you’d like answered in the comments here.
Your data supply chain is everything you need it to be when the energy previously expended merely finding, extracting and working with data is re-focused on exploration, learning, and knowledge building. When data is instantly available, navigable, intuitive, and meaningful, it finally emerges as the exponential technology it’s made out to be, central to your organization breaking free of the linear world – hierarchical, centralized, closed, top down, and obsessed with scarcity.
You and your colleagues and everyone with a stake in your organization’s success will be shaping and exploring and testing and reshaping decentralised value networks, making your organization permeable to ideas and connections, and catalysing mutual value creation.
Lapalme, J. 2012. ‘Three Schools of Thought on Enterprise Architecture’. IT Professional 14 (6): 37–43. doi:10.1109/MITP.2011.109.
Gartner Executive Programs, Renovate the IT Core: Laying the Foundation for Digital Business, 2014, No. 8.
Image credit: Thomas Schultz, https://en.wikipedia.org/wiki/File:DTI-sagittal-fibers.jpg, CC BY-SA 3.0