In a sense, this chapter is the heart of the book. What we have learned on our journey to arrive here is that facts are not always what they seem. While we know that good decisions rely on sound data, there is a gap in our knowledge that results in sub-optimal decision-making. In other words, much valuable information remains inaccessible to us because we have not found a way to tap into our intuitive knowledge. We have new tools like big data, data analytics, and AI to help us navigate an increasingly complex world. However, none of these address the inaccessibility of intuition and other intangibles, and so they end up producing sub-optimal results. In this chapter, we propose a new way to tap into this intuition by creating a metric that can quantify it. In this way, we create a numerical proxy that transfers intuitive information from the inaccessible domain over to the mathematical realm that is the basis for much of decision-making in the modern world.
A brief glimpse back can help summarize and set the tone for the rest of this chapter. We began by realizing that modernity is confronted with a data problem, both in quantity and, perhaps more importantly, in quality. To understand what data quality means, we took a closer look at how science, primarily since the Enlightenment, has defined what facts are. However, a closer look at science, the discipline we turn to for truth, reveals something uncomfortable. Science is in continual flux and never stands still, and in a strange sense all “facts” can be interpreted as provisional. An idea that fits a peer-accepted model today may be outdated tomorrow in light of new discoveries. Hence, what we consider truth today may be false tomorrow. We can validate this recursive pattern of the scientific process, and build confidence in it, by studying the history of science. We are led to the inescapable conclusion that scientific truth is impermanent and that all such scientific knowledge may have a shelf life. All experimental models may eventually turn out to be false, to be succeeded by more accurate models.
A further way knowledge may have been inadvertently distorted is human civilization’s ongoing romance with psychoactive compounds. It seems plausible that the sheer volume of historical, Euro-centric psychoactive drug consumption altered the course of human knowledge in significant and sophisticated ways, contributing to ideas that are now ingrained and normalized in our cultural institutions. From an epistemological perspective, drugs can both degrade and enhance cognitive function. Under many conditions the result is cognitive impairment, but in some cases, with specific types of compounds taken by particular types of users, they can stimulate the emergence of novel ideas. Indeed, the significant consumption of laudanum before the Enlightenment period may have influenced its outcome, both through cognitive impairment and through the stimulation of new ideas. The success of the Enlightenment has placed a heavy emphasis on rational, analytic thinking over intuitive thinking. As a result, modern scientists frown upon fields of science that cannot quantify their key variables of study and subject them to rigorous mathematical analysis.
However, more and more, researchers are discovering that intuition is only vague because it has been vaguely understood. Indeed, modern research suggests that intuitive thinking is highly evolved for increasing fitness and emerges from a predictive brain model. If appropriately used, intuition is an integral part of reasoning. Researchers like Daniel Kahneman have demonstrated that the limitations of intuition lie in common cognitive biases and in drawing from a sparse set of experiences. The best intuitions are the result of a rich experiential set of data; hence experienced workers have much more accurate intuitions than novice employees. In Kahneman’s theory, intuition makes up the fast reasoning system 1, while slow, analytic reasoning makes up system 2.
Research is also beginning to reveal the mechanics of inner feelings, which are the distinguishing qualia of what we call intuition. Scientists label these feelings interoceptive signals. They emerge from internal organs and send messages which we can sense. They evolved over millions of years to warn us of such things as impending danger and so increase our fitness for survival. All in all, current research is slowly shedding light on the mysteries of intuition and revealing its true predictive nature.
Intuition was also explored from the perspective of the Umwelt, the subjective universe of a living organism. This concept is useful in demonstrating the relative nature of experience, and it questions the long-held positivist notion of an objective reality. It illustrates how intuitions are a natural part of all living organisms. Instinctive feelings are what every living organism, including Homo sapiens, uses to survive. As intuition is defined as a direct knowing that bypasses conscious, analytic reasoning, the subject of animal instinct becomes vital to understand.
Furthermore, we learned how different living organisms sense the world and the signals that are meaningful to them. As far as we know, most other species lack the kind of abstract, analytic reasoning ability that humans have. This makes the gap between intuition/instinct and analytic reasoning quite noticeable. In other species that lack the same power of symbolic logic, instinct and intuition dominate decision-making.
In light of the discoveries of modern science, we can no longer dismiss intuition in our decision-making processes. In the fields of democracy and economics, in particular, everything depends on trusting relationships, and there are significant opportunities to improve decision-making by incorporating useful, intuitive information concerning social capital. We have already discussed the limitations of representative democracy, the most popular form of democracy, and it is clear that this form is not working. It suffers from a lack of both analytic and intuitive expertise, and from a lack of channels for that expertise to affect democratic and voting outcomes. More recently, we have seen the emergence of right-wing authoritarian leaders around the world who do not necessarily serve the best interests of the people. The representative system forces many voters to choose one representative with whom they typically have no genuine trust relationship. The only way to know them is through the media, and that message may be biased and manipulated. As a result of this long-range voting structure, intuitive awareness is at a disadvantage, and politicians can use media manipulation to win elections. Whoever wins the election must decide on policy issues, yet representational democracy allows popular non-experts to prevail. The winner-take-all approach can also leave large segments of the population underserved. The result is poor governance that does not benefit large sections of the public. Finally, there often do not seem to be enough experts to make effective decisions, while some parties favor the smallest government possible. All these vulnerabilities are potential contributing factors to ineffective governance.
A new form of democracy is required to overcome the current system’s structural problems. To this end, we explored the concept of the superorganism as a metaphoric model for human society, with social capital as a useful indicator of the superorganism’s health. To measure this, we must be aware that a healthy superorganism requires active participation from all its cells. Each must be supported to perform its specific function optimally: the heart must be encouraged to beat, the kidney to cleanse the blood, the brain to perform cognition and master supervision, the stomach to process food and transform it into nutrient forms acceptable for the body’s metabolism.
In light of the ideas mentioned above, the schema of proxy voting offers a means to begin to quantify and use social capital, thereby ameliorating other, more problematic methods of building and enhancing social capital, while providing the prospect of improving democracy at large. Proxy voting makes participation easy and rewarding. The concept is simple, and therefore appealing. If an eligible voter chooses to pass up the opportunity to vote, they can give their vote to someone else – someone they deem more qualified or better equipped for the ballot (i.e., with greater knowledge, experience, confidence, and more). This vote transfer is itself an instance of social capital: it exemplifies the very elements of social capital, as the transfer emerges from an assumed relation between the one who transfers and the one to whom the vote is assigned. Simply speaking, it is an outward display of trust, of participation, and the execution of a relationship, or an informal network. Thus, when a vote is transferred it becomes more valuable, for all that is implied about social capital in the transfer itself. The transfer of a vote is the creation of value – an act of added value – and one that may continue to increase as additional transfers ensue. Any such transfer between voters is referred to as a unit of social capital.
A proxy voting system creates an environment where individuals are encouraged to be involved: it is a method of increasing engagement, which is part of building greater social capital. In other words, the means meet the end; the instrument is part of the objective. Individuals thus choose to participate in an election despite their potential lack of knowledge, knowing that someone else’s expertise, experience, education, and so on, can be advantageous to the outcome of the election. Once all of these improvements start working together, proxy voting increases the reliability of the voting process while increasing the impact of every individual’s vote, or voice.
Not only would social capital increase dramatically, but an individual’s intuitive genius can also be better harvested and incorporated into decision-making. Assessing the people around you goes well beyond individual CVs and education history; it draws on their whole story. That story is tied up in the individual’s Umwelt and interoception, which inform all of their “close range” decisions. It cannot be over-emphasized how valuable this intuitive knowledge is to the group, nor how valuable identifying and using it can be.
In proxy voting, this value is reflected primarily when the vote moves. A vote can be transferred multiple times until it reaches a final person in the sequence, who then casts it. When that person at the end of the transfer chain does cast the vote, it is worth much more than an ordinary vote: it is a particular type of summation of all the vote transfers before it. Hence, if 5 people were involved in a string of 4 vote transfers, the value of the fifth person’s vote is much higher than the vote of a single voter.
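To make the summation concrete, here is a minimal Python sketch. It assumes, purely for illustration, that each transfer n contributes a diminishing increment of e^(−αn) to the final vote; the function name and the choice α = ln 2 are our own hypothetical assumptions, not a specification from the system itself.

```python
import math

def transferred_vote_value(base_vote=1.0, transfers=4, alpha=math.log(2)):
    """Illustrative value of a vote after a chain of transfers.

    Each transfer n is assumed to add a diminishing increment
    e^(-alpha * n), so the final vote is worth more than one
    ordinary vote. With alpha = ln 2 the increments halve each
    step: 0.5, 0.25, 0.125, 0.0625, ...
    """
    return base_vote + sum(math.exp(-alpha * n) for n in range(1, transfers + 1))

# Five people, four transfers: the fifth person's vote is worth
# about 1.0 + 0.5 + 0.25 + 0.125 + 0.0625 ≈ 1.9375 ordinary votes.
print(transferred_vote_value(transfers=4))
```

The choice of a convergent decay means additional transfers always add value, but with diminishing returns, matching the intuition that trust weakens with each degree of separation.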
Operating a company, an institute, a democracy, or an economy without social capital is impossible. Yet social capital has so far remained a qualitative variable, intangible and resistant to quantification. As we have demonstrated, there can be enormous benefits in quantifying it. The Democratic Quality Vector (DQV) is a new function we define that captures the essence of trust in a vote transfer sequence of a proxy voting system. We can also think of the DQV as a number that represents the total social capital value of a group of proxy voters.
In most institutions, the relative value of tangible and intangible capital is poorly understood. Because intangibles are rarely quantified, tangible capital, which is quantified, immediately carries more veracity. This results in the typical situation where tangible equity is seen as having more value than intangible capital, when in fact the opposite is likely true. We rely upon and refer to others to get through life intact. Is our food untainted? Will our children be safe at school? These are but two things we have to trust others with.
Indeed, some have argued – particularly in the accounting world – that several characteristics of intangibles disqualify them from being counted as capital. Intangible assets that are not acquired through market transactions lack verifiability, and their lack of visibility after acquisition complicates efforts to track past vintages. In addition, some intangible assets are nonrival (that is, nonscarce – tangibles are rival because once one person uses them, another cannot), and the returns from some intangibles lack appropriability (the owner cannot easily capture them). This is why many are afraid to start measuring intangibles: there is some risk involved.
It seems that economists and politicians face the same challenge in valuing intangibles as social scientists do, and therefore some of the tools economists use can be adapted to design a social metric. The standard mathematical quantification of value that our culture has accepted serves as a normative validation: if we can’t verify it, we can’t trust it is our current default, and this standard is unlikely to change rapidly. So we may have higher confidence in the predictive power of intangibles if they are significantly vetted. Qualitative motivations capture attention, but only quantitative results inspire investment – which is why social metrics are so often ignored. A consistent metric for social capital would help determine which inputs or changes lead to positive results.
One refined approach to valuing intangibles in economics that we may look to as a model is the ratio of a change in national income to the change in government spending that causes it. More generally, the exogenous spending multiplier is the ratio of a change in national income to an autonomous change in spending (private investment spending, consumer spending, government spending, or spending by foreigners on the country’s exports) that causes it. When this multiplier exceeds one, the enhanced effect on national income is called the multiplier effect. The mechanism that gives rise to it is that an initial incremental amount of spending leads to increased consumer spending, which increases income further and hence further increases consumption, resulting in an overall increase in national income greater than the initial incremental amount of spending. In other words, an initial change in demand may cause multiple shifts in output and income.
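The rounds of re-spending form a geometric series whose sum is the multiplier times the initial spending. A small sketch, assuming a marginal propensity to consume of 0.8 (the value and the function name are illustrative):

```python
def multiplier_effect(initial_spending, mpc, rounds=100):
    """Simulate rounds of re-spending: each round, a fraction `mpc`
    (the marginal propensity to consume) of the new income is spent
    again, and the totals accumulate as a geometric series."""
    total = 0.0
    spend = initial_spending
    for _ in range(rounds):
        total += spend
        spend *= mpc
    return total

# With mpc = 0.8, the multiplier is 1 / (1 - 0.8) = 5, so $100 of
# initial spending ultimately raises national income by about $500.
print(multiplier_effect(100.0, 0.8))
print(100.0 / (1 - 0.8))  # closed-form limit of the geometric series
```

The simulation and the closed form agree to within floating-point tolerance, which is the point: the cumulative effect of many small rounds of spending collapses to a single, quantifiable ratio.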
We can leverage this ratio approach, along with its underlying mathematics, to design a social capital measurement. By investing in activities that strengthen social ties, we could look for the resultant revenue generation from the group and then use a weighted multiplier to connect the two. This also applies to political systems.
In economics, delay discounting is another accepted mathematical tool that represents the intangible value of time, and it too can be referenced when designing a social metric. Given two similar rewards, humans show a preference for the one that arrives sooner rather than later: humans are said to discount the value of the later compensation by a factor that increases with the length of the delay. This process was traditionally modeled as exponential discounting, a time-consistent model. However, a large number of studies have since demonstrated that the constant discount rate assumed in exponential discounting is systematically violated. Hyperbolic discounting is a time-inconsistent model devised as an improvement over exponential discounting, in the sense that it better fits the experimental data about actual behavior. Note, however, that the time inconsistency of this behavior has some quite perverse consequences. Delay discounting has been observed in both human and non-human animals.
Another context we can look towards for inspiration is networks in the natural sciences, where intangibles are measured in ways similar to what we are proposing. In biology and in the mathematical modeling of biological phenomena, the symmetrical and algorithmic properties of organic shapes have been extensively studied.
As we already know, knowledge and science change over time with new discoveries. Advances in science uncover certain patterns and invariant laws across all forms of life. The area of “social physics” – first coined by Auguste Comte and recently picked up by Alex Pentland – explores the parameters for human behavior and decision-making in groups. Pentland’s work, for example, engages precisely “how social networks can make us smarter.” Whether examining things from the cellular or molecular level or upwards from the broader social and civil perspective, there are emergent laws and constraints operating that shape – and in some cases determine – the interactions and dynamics of human individual and collective behavior, on the basis of evolutionary and cultural developments.
It seems, through analysis of already existing systems, that the best way to emulate a social network is to focus on the “big picture” mathematical structure and not the individual variables. In multiple disciplines, beyond what we have talked about here, research into networks has unveiled a similar mathematical pattern. Exponential, logarithmic, or hyperbolic models are identified and subsequently justified as the best fit for the application. We have seen this same investigative pattern in the development of the mathematics of human vision. In this field, many analytical formulas describe the same data set within the error band. Essentially, what they all have in common is that they are a form of curve that best matches the data they have on hand. They have some great ideas we will use to develop our own formula. Their weakness is that no one knows for sure which law is the best, because they are all derived from curve fitting.
To develop the algorithm for the DQV, we apply new 6-dimensional spacetime mathematics that the authors have developed; it may even become a theory of everything. Its predictive power has been verified over a vast swath of physical, biological, neurological, chemical, and social science data. We use it to derive a law describing social capital that emulates aspects of Kleiber’s empirically derived power law relating an animal’s metabolic rate to its mass. We prototype a relationship similar to Kleiber’s law because social capital is a biological system indicator that seems to function in a similarly delay-discounted way.
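For readers unfamiliar with Kleiber’s law, a quick numerical illustration may help; the normalization constant a ≈ 70 (kcal/day per kg^0.75) is the commonly cited rough value for mammals and should be treated as approximate:

```python
def kleiber_metabolic_rate(mass_kg, a=70.0):
    """Kleiber's law: metabolic rate scales as mass to the 3/4 power.
    `a` is an approximate normalization constant for mammals."""
    return a * mass_kg ** 0.75

# Sub-linear scaling: a 10x heavier animal needs only about
# 10**0.75 ≈ 5.6x the metabolic energy, not 10x.
ratio = kleiber_metabolic_rate(70.0) / kleiber_metabolic_rate(7.0)
print(round(ratio, 2))  # ≈ 5.62
```

The sub-linear exponent is the feature worth emulating: value (metabolic or social) grows with scale, but each additional unit contributes less than the last.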
The DQV itself is a vector function built using 6D symmetry mathematics; it measures the total degree of social capital represented by the entire chain of voters involved in the vote transfer. The 6D mathematics applied in this context creates 4D projections, constraining the particular types of power laws that fit the data points. This theoretical basis gives the DQV a strong, testable predictive power that existing curve-fitted laws lack. We have already shown there is a universal basis for the symmetry of the world, and especially a human biological foundation for it, so we are confident that our system of social capital rests on a fundamental basis.
This is relevant for our purposes insofar as the logic of vote transference converges with the discipline of science and its drive toward quantification. We begin to quantify trust: the sum of all steps, or transfers of votes, can be formulated into an equation that produces a trust value, with trust diminishing along a convergent series.
(Trust = A + ∑ e^(−αn), where A is the original vote’s numeric value, n indexes each step or transfer, and the sum ∑ runs over the total number of transfers.)
The form of this metric is supported by logic and research. It makes sense that if I trust a person, I would trust that person’s friend somewhat less than I trust that person directly. This is the beginning of a convergent series as the degree of separation from the source of the trust increases.
However, trust is also relative. Some individuals and groups will appreciate “the cold hard facts” more than trust and, as such, diminish trust’s value relative to the quantities they deem more important. So, although the comparable total value is less in some situations, the “shape” of the trust still follows a decaying exponential pattern (a convergent series) – or, alternatively, a hyperbolic pattern, for those interested in the math. For example, because GE has a strong value proposition in its brand and momentum, it may place less weight on its social capital (its personnel; trust) and care less about employee turnover. This is in contrast to a new, unproven start-up, where the majority of the value is in people, social capital, and trust. Incorporating this relative perspective, the total value of an organization would be T = x·(conventional value) + y·(A + ∑ e^(−αn)), where x and y can be tuned to the organization or the beliefs of the group.
As the source voter transfers the vote in a chain, the value added by each successive transfer decrements as the vote propagates further from the original voter. With α = ln 2, for instance, the increment is worth 0.5 at the first vote transfer and 0.25 at the second. It decrements this way until roughly four vote transfers; after this point, further increments are worth almost nothing. The parameter α determines the steepness of the curve and is a function of many variables of the particular social context, such as group size, age, gender, education, and more.
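Putting the pieces together, here is a minimal Python sketch of the trust value and the weighted organizational total. The function names, the default α = ln 2, and the example weights are all illustrative assumptions rather than a published implementation:

```python
import math

def trust_value(A, transfers, alpha=math.log(2)):
    """Trust = A + sum of e^(-alpha * n) over the n transfer steps.
    alpha sets the steepness of the decay; with alpha = ln 2 the
    increments halve each step: 0.5, 0.25, 0.125, ..."""
    return A + sum(math.exp(-alpha * n) for n in range(1, transfers + 1))

def total_org_value(conventional, A, transfers, x=1.0, y=1.0):
    """T = x * (conventional value) + y * (trust term); x and y are
    tuned to the group -- a start-up might weight y far more heavily."""
    return x * conventional + y * trust_value(A, transfers)

print(trust_value(1.0, 4))                     # about 1.94
print(total_org_value(100.0, 1.0, 4, y=10.0))  # about 119.4
```

Because the series converges, adding a fifth or sixth transfer changes the result only marginally, matching the observation that increments beyond about four transfers are worth almost nothing.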
This is but one formula for quantitatively capturing trust and offering a metric for calculating social physics. Further developing this new science of social equations will open new vistas for analyzing groups as diverse as political parties and companies – for approaching the relational aspect of human life with a degree of mathematical accuracy. Such developments in creative and technical thinking may soon offer ways of predicting erratic and undesirable behavior, as well as preventing catastrophic or destructive patterns from unfolding.
Beyond the rationale of cold hard facts, we give intuition a say in our revised voting system. With the new DQV, two measurable sources are combined when we make a final decision. The first is simply the common facts; the second is the output of the DQV function, which measures the qualitative and intuitive knowledge of the decision-makers in our voting chain.
This allows us to quantify intangibles and convert social capital into economic value. This conversion enables us to transfer intangibles over to the logical domain that is the mainstay of our modern society. The DQV can be used along with cold hard facts to create a total solution that integrates the best of the intuitive and rational worlds. Users can dial in as much or as little of the DQV and see how the final answer changes with its addition. Such a technique can add a quantified intuitive dimension to practically any problem we intend to solve or decision we need to make.
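As a sketch of how such a “dial” might work, a simple linear blend between the two sources can be used; everything here – the function name, the linear form, and the [0, 1] dial – is our illustrative assumption:

```python
def blended_decision_score(facts_score, dqv_score, dial=0.5):
    """Linearly blend a hard-facts score with the DQV output.
    `dial` in [0, 1] sets how much weight the intuitive (DQV)
    component receives: 0 = facts only, 1 = DQV only."""
    if not 0.0 <= dial <= 1.0:
        raise ValueError("dial must be between 0 and 1")
    return (1 - dial) * facts_score + dial * dqv_score

# Sweep the dial to see how much the intuitive component shifts the answer.
for d in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(d, round(blended_decision_score(0.7, 0.9, dial=d), 3))
```

Sweeping the dial lets decision-makers observe the sensitivity of the outcome to the intuitive component before committing to a weighting.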
A shift in thinking is required to amalgamate the quantitative and qualitative aspects at hand. In the case of social capital, we propose something complex at the foundation yet elegant in its simplicity: a metric that is participatory, fluid, and intelligible. The equation we recommend for the measurement of social capital is one to be incorporated within a proxy voting system. Such an equation would provide a quantitative basis for social capital to become another data system.
The Democratic Quality Vector is a new way to put a value on social capital. As we have learned, the debate over which is better, intuitive or analytic reasoning, is misleading because both play an essential role in effective decision-making. Daniel Kahneman’s categorization of system 1 (intuitive, fast decision-making) and system 2 (slow, analytical reasoning) is useful here: system 1 provides rapid access to our storehouse of accumulated knowledge, while system 2 performs analytical reasoning. Kahneman’s theory also points out the nuances of system 1 reasoning, such as the large number of cognitive biases that come with it, and the fact that its quality is a function of our experience. Once again, both systems are essential, but currently only the second is measurable.
Our continuing focus is to develop a new way to measure intuitive or intangible sources of knowledge and to convert what has traditionally been considered intangible information into tangible, hard numbers. Thereby we provide a means to quantify aspects of intuition for more effective decision-making, or to measure an organization’s intangible value for a more accurate economic valuation. Trust is the defining quality of an effectively collaborating social group, whether a family, organization, community, or entire society. While social capital has many components, such as trust, civic norms, civic engagement, and political engagement, all of them fundamentally depend on trust. Without trust existing in the group at some fundamental level, there can be no basis for any collective activity. When was the last time two active enemies had a cup of coffee together? A measure of trust is, therefore, a measure of a healthy organization.