The attempt to define consciousness is one of the most challenging problems in the study of the mind. Traditional definitions of consciousness often include higher-order aspects such as self-awareness, which are not necessarily definitive of consciousness itself. A more productive approach may be to define consciousness in terms of its opposite, unconsciousness, and to draw on empirical observations to determine the minimum necessary and distinctive characteristics for identifying consciousness. To this end, we have explored the complexity of different nervous systems, from the decentralized nerve net of the lion's mane jellyfish to the more centralized nervous system of the Etruscan shrew. We have also discussed how the presence of a true cerebral cortex appears to mark a threshold of complexity for being considered "conscious" while awake. Empirically, everything with a true cerebral cortex undergoes true "sleep," which suggests that sleep is an essential state for restorative and cognitive functions.
Daniel Dennett's multiple drafts model of consciousness (see The Fool's Reading List for further reading) provides a useful theoretical framework for understanding the parallel systems that cooperate in the cerebral cortex to create the experience of consciousness. This model describes how the brain processes multiple sensory inputs simultaneously, allowing for the integration of parallel channels of information.
In light of these empirical observations and theoretical frameworks, we can propose a new minimum objective definition of consciousness: consciousness is an information-processing system's ability to perceive, integrate, and respond to internal and external stimuli in a dynamic and flexible manner by combining parallel channels of sensory information.
By emphasizing the integration of parallel channels of sensory information and the dynamic, flexible processing of internal and external stimuli, this definition also allows for varying degrees of consciousness in different entities. It could help guide our ethical considerations by enabling us to identify varying levels of consciousness in artificial intelligences and non-human animals. Consequently, we could develop more nuanced ethical frameworks that take into account the unique needs and experiences of these diverse beings, promoting a more compassionate and just approach in our interactions with them and addressing the pressing ethical questions of our time.
In conclusion, finding an adequate definition of consciousness is a complex and challenging problem; however, this information- and systems-based approach proves more useful for grappling with the timely questions we will soon need to address. By defining consciousness as the opposite of unconsciousness and drawing on empirical observations and theoretical frameworks, we can propose a definition that better captures the minimum necessary and distinctive characteristics of consciousness. Such a definition may help us decide the hard cases, and it outlines a spectrum of lucidity that could inform further ethical considerations, such as determining the minimum criteria for assessing whether AIs are conscious and guiding the ethical treatment of non-human animals.