The H I Mind Theory of Conscious Awareness
John Cochrane – 13 May 2025
Abstract
This paper presents the third of four theories within the H I Mind Model of consciousness, each of which addresses a distinct aspect of conscious function. The present theory examines conscious awareness as an emergent property of layered evolutionary adaptations in sensory processing, attentional mechanisms, and internal simulations. Tracing a progression from rudimentary sensory detection to reflective self-awareness, the theory outlines a plausible framework for the development of conscious experience rooted in adaptive information processing and behavioral control.
1. Introduction
Despite extensive inquiry, consciousness remains a challenging phenomenon to define operationally and mechanistically (Dehaene, 2014; Tononi et al., 2016). While subjective awareness is a central feature of conscious experience, its neural and evolutionary underpinnings are still not fully understood. The H I Mind Theory of Conscious Awareness posits that conscious awareness arises as an emergent property from evolutionary layers of sensory representation, attentional regulation, and executive simulation.
This framework proposes that the functions underlying awareness are not reducible to a single mechanism, but instead emerged through increasingly complex adaptations for perception, prediction, and behavioral coordination. Each cognitive layer remains active, contributing to a dynamic, integrated experience of reality.
2. From Sensory Detection to Representation: The Evolutionary Roots
2.1 Subconscious Sensory Processing
The earliest cognitive capabilities in evolutionary history likely involved basic sensory discrimination—for example, the detection of light and dark. Such rudimentary photoreception enabled primitive organisms to regulate circadian rhythms or navigate spatial environments based on simple environmental gradients (Nilsson, 2009).
These base-level sensory processes were likely subconscious in nature and functioned primarily to guide behavior through automatic, evolutionarily selected responses. This aligns with views in sensorimotor theory, which suggests that perception originates in embodied action and sensory feedback loops (O'Regan and Noë, 2001).
2.2 Pattern Recognition and Simulation
With the emergence of more complex nervous systems, pattern-recognition mechanisms evolved, supporting representations of shapes, motion, and spatial layout. Coupled with memory and innate or learned preferences, organisms could use sensory input to identify opportunities and threats in the environment—such as food, predators, or social partners (Gallistel, 2017).
This capacity for sensory simulation—the internal modeling of the external world—enabled anticipatory behavior. From an evolutionary perspective, such modeling provided a foundation for adaptive success, consistent with predictive coding theories of perception (Friston, 2010).
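As an illustrative aside (not part of the original model), the following minimal Python sketch captures the core of the predictive-coding idea: an internal estimate of a hidden sensory cause is repeatedly corrected by precision-weighted prediction error until the simulation tracks the world. The learning rate and precision values are arbitrary assumptions.

```python
# Toy predictive-coding loop: the internal model's estimate of a hidden
# stimulus is nudged toward each observation by precision-weighted
# prediction error. Illustrative only; parameters are assumptions.
import random

def predictive_coding(observations, learning_rate=0.1, precision=1.0):
    estimate = 0.0  # the model's current belief about the stimulus
    for obs in observations:
        prediction_error = obs - estimate            # "surprise" signal
        estimate += learning_rate * precision * prediction_error
    return estimate

random.seed(42)
true_stimulus = 5.0
noisy_input = [true_stimulus + random.gauss(0, 0.5) for _ in range(200)]
print(f"converged estimate: {predictive_coding(noisy_input):.2f}")  # ~5.0
```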
3. Attention as Sensory Prioritization
3.1 The Function of Attention
As sensory fidelity and complexity increased, the need for selective prioritization arose. Attention emerged as a mechanism for allocating cognitive resources to salient stimuli, thereby optimizing information processing (Posner and Petersen, 1990). Attention is not merely a filter but a dynamic system that evaluates sensory input for behavioral relevance and modulates perceptual simulations in light of current goals (Corbetta and Shulman, 2002).
3.2 Prediction and Simulation Integration
Attention also plays a role in forward modeling—using existing simulations to anticipate future states of the environment and the organism. This capability is vital for adaptive decision-making and underlies what might be described phenomenologically as “noticing.” Attention enriches raw sensory data by embedding it within contextually relevant interpretations and simulations of possible outcomes.
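To make this prioritization concrete, the hypothetical sketch below scores competing stimuli by combining bottom-up salience with top-down goal relevance and allocates processing to the highest-priority items. The weighting scheme and stimulus encoding are assumptions for illustration, not claims of the theory.

```python
# Toy model of attention as sensory prioritization: each stimulus gets
# a priority score mixing bottom-up salience with top-down goal
# relevance; processing resources go to the top-scoring items.

def prioritize(stimuli, goal, top_k=2, w_salience=0.4, w_relevance=0.6):
    def priority(s):
        relevance = 1.0 if goal in s["tags"] else 0.0
        return w_salience * s["salience"] + w_relevance * relevance
    return sorted(stimuli, key=priority, reverse=True)[:top_k]

stimuli = [
    {"name": "rustling bush", "salience": 0.9, "tags": {"threat"}},
    {"name": "ripe fruit",    "salience": 0.4, "tags": {"food"}},
    {"name": "distant cloud", "salience": 0.2, "tags": set()},
]
attended = prioritize(stimuli, goal="food")
print([s["name"] for s in attended])  # salience and goals both matter
```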
4. Concentrated Attention and Executive Function
4.1 Specialization of Focus
As attentional mechanisms became more refined, a more structured form of focused attention likely evolved. This “concentrated attention” allowed organisms to compare multiple interpretations of environmental data and to direct internal simulation resources toward task-relevant goals. Such focused processing is closely tied to the function of the prefrontal cortex in humans (Miller and Cohen, 2001).
The H I Mind Theory suggests a functional bifurcation within concentrated-attention systems: one stream focused on opportunity-seeking and another on threat-avoidance. This dual-processing model mirrors the division seen in mammalian affective and motivational systems, such as approach/avoidance behaviors and positive/negative valence networks (LeDoux, 2012).
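A minimal sketch of this proposed bifurcation, under assumed scoring functions, is shown below: one stream appraises each stimulus for opportunity, the other for threat, and an executive stage arbitrates toward approach or avoidance, with threat weighted more heavily to reflect its asymmetric cost.

```python
# Dual-stream sketch: parallel opportunity and threat appraisals feed
# an executive arbitration stage. Scores and bias are hypothetical.

def opportunity_score(stimulus):
    return stimulus.get("food_value", 0.0)

def threat_score(stimulus):
    return stimulus.get("danger", 0.0)

def decide(stimulus, threat_bias=1.5):
    # Threats are weighted more heavily: missing a predator costs more
    # than missing a meal.
    approach = opportunity_score(stimulus)
    avoid = threat_bias * threat_score(stimulus)
    return "avoid" if avoid > approach else "approach"

print(decide({"food_value": 0.6, "danger": 0.5}))  # avoid
print(decide({"food_value": 0.8, "danger": 0.1}))  # approach
```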
4.2 Emergence of Behavioral Agency
Concentrated-attention systems are proposed to mediate between perceptual recognition and behavioral output, functioning as executive managers that link perception to action. These systems enable organisms to select among possible behavioral strategies and to modulate perception based on expected outcomes—a hallmark of goal-directed, conscious behavior.
Phenomenologically, this may be experienced as curiosity—a cognitive-emotional state driven by prediction errors and the desire to resolve informational gaps (Gottlieb and Oudeyer, 2018).
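One way to render this curiosity mechanism computationally (a sketch under assumed dynamics, not the theory's formal account) is an agent that tracks a running prediction error for each option and preferentially samples where its model is currently worst, so that informational gaps drive exploration.

```python
# Sketch of curiosity as active sampling (cf. Gottlieb and Oudeyer,
# 2018): the agent probes whichever option carries the largest recent
# prediction error; curiosity about an option fades as its model of
# that option improves. Environment and update rule are assumed.
import random

random.seed(0)
true_reward = {"familiar path": 0.2, "new clearing": 0.9}
predicted = {k: 0.0 for k in true_reward}
recent_error = {k: 1.0 for k in true_reward}  # unknowns start maximally curious

for _ in range(30):
    choice = max(recent_error, key=recent_error.get)  # probe the biggest gap
    outcome = true_reward[choice] + random.gauss(0, 0.05)
    error = outcome - predicted[choice]
    predicted[choice] += 0.3 * error       # resolve the informational gap
    recent_error[choice] = abs(error)      # curiosity decays as the model fits

print({k: round(v, 2) for k, v in predicted.items()})  # both near true values
```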
5. The Development of Self-Awareness
5.1 Self-Simulation and Metacognition
The next major evolutionary advance proposed here is the emergence of self-awareness—the capacity to model not only the external world but also the internal cognitive and affective states of the self. Self-awareness involves a recursive layer of simulation, enabling monitoring and regulation of ongoing perceptual, evaluative, and decision-making processes.
This capacity likely evolved through the refinement of interoceptive systems, executive control, and mental state inference, supported by networks such as the default mode network and frontoparietal systems (Craig, 2009; Christoff et al., 2016).
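The recursive structure can be sketched as a two-level process, with all thresholds and evidence models assumed for illustration: a first-order layer makes a noisy perceptual judgement, while a second-order layer monitors that judgement's reliability and decides whether to gather further evidence.

```python
# Two-level sketch of metacognition: a first-order process accumulates
# noisy evidence toward a binary judgement; a second-order layer
# monitors a crude confidence signal about that process and keeps
# sampling until confidence suffices. All parameters are assumptions.
import random

random.seed(1)

def first_order_judgement(samples):
    """Noisy evidence accumulation for a binary perceptual decision."""
    evidence = sum(samples)
    confidence = abs(evidence) / len(samples)   # crude reliability proxy
    return ("A" if evidence > 0 else "B"), confidence

def metacognitive_loop(true_bias=0.15, confidence_threshold=0.3):
    samples = []
    while True:
        samples.append(random.gauss(true_bias, 1.0))   # take another look
        decision, confidence = first_order_judgement(samples)
        # Second-order layer: act on the state of the first-order
        # process itself (low confidence -> gather more evidence).
        if confidence >= confidence_threshold or len(samples) >= 50:
            return decision, confidence, len(samples)

print(metacognitive_loop())  # (decision, confidence, samples taken)
```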
5.2 Experiential Markers of Self
Self-awareness enables an appreciation of bodily states (interoception) and cognitive positioning of the self in relation to the environment. It also supports agency—the sense of being the author of one’s actions—and perspectival organization of experience, both of which are essential components of reflective consciousness (Gallagher, 2005; Seth, 2013).
6. Consciousness as Emergent Integration
6.1 Layered Processing and Emergent Experience
The culmination of this evolutionary trajectory is conscious awareness, defined here as an emergent property of recursive, layered information-processing systems. Consciousness does not reside in a specific brain structure or process but arises from the coordination and integration of perception, attention, simulation, memory, language, and action (Baars, 1988; Tononi et al., 2016).
The H I Mind Theory conceptualizes consciousness as a time-structured narrative constructed from internal simulations and working memory. The “storyline” of consciousness enables coherent representation of past, present, and predicted future events, providing an ongoing interpretation of reality as it relates to the self.
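As a loose illustration of this storyline construction (the data structure is a hypothetical stand-in, not the paper's formal model), a bounded working-memory buffer can hold recent events, the current interpretation, and a predicted next event, yielding a time-structured narrative.

```python
# Toy "storyline": a bounded working-memory buffer holding recent
# events, the present interpretation, and a predicted next event.
from collections import deque

class Storyline:
    def __init__(self, capacity=5):
        self.events = deque(maxlen=capacity)   # limited working-memory span

    def experience(self, event):
        self.events.append(event)              # oldest events fall away

    def narrative(self, predict):
        past, present = list(self.events)[:-1], self.events[-1]
        return {"past": past, "present": present,
                "predicted": predict(self.events)}

story = Storyline()
for e in ["heard rustle", "turned head", "saw movement"]:
    story.experience(e)
print(story.narrative(lambda events: "expect an animal to appear"))
```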
6.2 Consciousness as a Feedback Loop
In this model, consciousness is a high-level feedback system in which the brain continuously monitors its own simulations, sensory interpretations, and behavioral predictions. This recursive monitoring enables both flexible behavior and subjective continuity. It is through this feedback loop that consciousness enables reflective, intentional action in a complex and dynamic world.
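The sketch below composes the layers discussed above into a single feedback cycle: sense, attend, predict via the internal simulation, observe the outcome, and feed the monitored prediction error back into the model. Every function and constant here is a hypothetical stand-in rather than the architecture itself.

```python
# Composite feedback cycle: attend to the most salient feature, predict
# it from the internal simulation, observe the actual outcome, and feed
# the monitored error back into the model while logging a narrative
# trace. Functions and constants are hypothetical stand-ins.

def conscious_cycle(world_events, model, rate=0.5):
    trace = []
    for event in world_events:                      # each moment in time
        attended = max(event, key=event.get)        # attention: most salient
        predicted = model.get(attended, 0.0)        # simulation: forecast
        actual = event[attended]                    # sensed outcome
        error = actual - predicted                  # self-monitoring
        model[attended] = predicted + rate * error  # feedback into the model
        trace.append((attended, round(error, 2)))   # storyline entry
    return trace

events = [{"rustle": 0.9, "fruit": 0.3}, {"rustle": 0.8, "fruit": 0.4}]
print(conscious_cycle(events, model={}))  # errors shrink as the model adapts
```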
7. Conclusion
The H I Mind Theory of Conscious Awareness proposes that consciousness evolved through the integration of increasingly complex sensory, attentional, and evaluative systems. Conscious awareness emerges from a dynamic interaction among these layers—culminating in a recursive, story-based simulation process that enables flexible, goal-directed behavior. This theory is compatible with contemporary models of predictive processing, metacognition, and emergent consciousness, and provides a biologically plausible foundation for understanding subjective awareness.
References
• Baars, B. J. (1988). A Cognitive Theory of Consciousness. Cambridge University Press.
• Christoff, K., Irving, Z. C., Fox, K. C., Spreng, R. N., and Andrews-Hanna, J. R. (2016). Mind-wandering as spontaneous thought: A dynamic framework. Nature Reviews Neuroscience, 17(11), 718–731.
• Corbetta, M., and Shulman, G. L. (2002). Control of goal-directed and stimulus-driven attention in the brain. Nature Reviews Neuroscience, 3(3), 201–215.
• Craig, A. D. (2009). How do you feel—now? The anterior insula and human awareness. Nature Reviews Neuroscience, 10(1), 59–70.
• Dehaene, S. (2014). Consciousness and the Brain: Deciphering How the Brain Codes Our Thoughts. Viking.
• Friston, K. (2010). The free-energy principle: A unified brain theory? Nature Reviews Neuroscience, 11(2), 127–138.
• Gallagher, S. (2005). How the Body Shapes the Mind. Oxford University Press.
• Gallistel, C. R. (2017). The coding question. Trends in Cognitive Sciences, 21(7), 498–508.
• Gottlieb, J., and Oudeyer, P.-Y. (2018). Towards a neuroscience of active sampling and curiosity. Nature Reviews Neuroscience, 19(12), 758–770.
• LeDoux, J. E. (2012). Rethinking the emotional brain. Neuron, 73(4), 653–676.
• Miller, E. K., and Cohen, J. D. (2001). An integrative theory of prefrontal cortex function. Annual Review of Neuroscience, 24(1), 167–202.
• Nilsson, D.-E. (2009). The evolution of eyes and visually guided behavior. Philosophical Transactions of the Royal Society B, 364(1531), 2833–2847.
• O'Regan, J. K., and Noë, A. (2001). A sensorimotor account of vision and visual consciousness. Behavioral and Brain Sciences, 24(5), 939–973.
• Posner, M. I., and Petersen, S. E. (1990). The attention system of the human brain. Annual Review of Neuroscience, 13(1), 25–42.
• Seth, A. K. (2013). Interoceptive inference, emotion, and the embodied self. Trends in Cognitive Sciences, 17(11), 565–573.
• Tononi, G., Boly, M., Massimini, M., and Koch, C. (2016). Integrated information theory: From consciousness to its physical substrate. Nature Reviews Neuroscience, 17(7), 450–461.